The three new iPhones unveiled next to Apple’s glassy circular headquarters Wednesday look much like last year’s iPhone X. Inside, however, the devices’ computational guts received a less visible but more significant upgrade.
Apple’s new phones include chip technology focused on helping the devices understand the world around them using artificial intelligence algorithms. The company says the improvements allow the new devices to offer slicker camera effects and augmented reality experiences.
For the first time, non-Apple developers will be allowed to run their own algorithms on Apple’s AI-specific hardware. That could enliven the App Store with rich new experiences for socializing, creating art, or getting things done. Machine learning algorithms can help apps understand and respond to what’s happening in photos and video, for example. Combined with Apple’s support for augmented reality, more AI oomph could help your iPhone transform the world around you.
All three iPhones announced Wednesday include a new chip called the A12, designed in-house by Apple. It has a unit called a neural engine, dedicated to running the neural network software behind recent advances in the ability of machines to understand speech and images.
Apple introduced the iPhone’s first neural engine last year inside the iPhone X, 8, and 8 Plus, making those the first major smartphones with a dedicated chip core for neural networks.