While AI has become the universal abbreviation for “artificial intelligence,” serendipity has allowed one company to conveniently hijack the letters for its own marketing strategy: “Apple Intelligence.” When ChatGPT stormed onto the scene in late 2022, however, many observers wondered whether the AI revolution had caught Apple off guard. True, Apple had been incorporating “neural engines” into its processors and devices for years, but the company was criticized for failing to take full advantage of its position in a marketplace where it appeared to be the one company with the resources and infrastructure to deliver AI tools to the masses.
Others say that Apple is simply being cautious, careful, and deliberate in its AI journey, mainly because of the company’s overarching commitment to customer privacy. There are many unknowns swirling around AI tools today when it comes to tracking user habits, using personal information, and training on submitted prompts and uploaded data. Apple continues to assure customers that it considers privacy and security to be of the utmost importance. And that apparently means the company is willing to appear behind the AI curve . . . for now.
In summer 2024, at its Worldwide Developers Conference (WWDC24), Apple revealed the foundation of its AI vision and boldly dubbed it “Apple Intelligence.” But rather than release a new app, tool, or device, CEO Tim Cook described Apple Intelligence as a “new era” for the devices Apple already produces, including the iPhone, iPad, and Mac. Apple Intelligence wasn’t available immediately; we had to wait for new versions of iOS and macOS with artificial intelligence baked into the programming guts.
The Three Echelons of Apple Intelligence
Most of the excitement surrounding publicly available AI tools today is focused on large language models (LLMs), which are trained on vast data sets. This was the significant breakthrough heralded by ChatGPT and others. In the past, we had computers trained to play and win at chess, but those computers knew only the narrow “model” of playing chess and nothing else. LLMs, by contrast, are trained on billions of documents: whole libraries’ worth of books, articles, and a vast swath of the public Internet. That is how generative AI (GenAI) tools can craft documents and conversations that appear so human-like. Training and running models at that scale requires massive data centers packed with servers, accessible only through the cloud.
For LLMs to keep pace with human knowledge, they need continuous access to new information and content, which often means they learn from the prompts, documents, and files users submit to their systems. That’s a potential privacy hole, and Apple did not want to contribute to that exposure at the expense of its customers. That appears to be why Apple opted for a three-tiered, hybrid approach to AI.
The first tier of Apple Intelligence runs locally on your iPhone or Mac, handling “common” tasks that don’t require a large cloud-based model. Apple designed a much smaller model that fits within the processing power of modern iPhones and Macs. This tier is free of charge and available to anyone with a compatible iPhone or Mac that can run the latest operating systems.
When a request requires a larger AI model, it is handed off to Private Cloud Compute, Apple’s own cloud-based system. The only details we know about it are those Apple has chosen to share with the public, and users have no control over which tasks are handled locally and which go to Private Cloud Compute. The only way to know (at this point) is when something doesn’t work while you’re offline and, therefore, can’t reach Private Cloud Compute.
And when you really need maximum AI horsepower, Apple will hand you off to ChatGPT (and potentially other assistants in the future). You get a prompt asking whether you’re comfortable with potentially compromising your privacy for the answer you seek, and if you say yes, Apple can at least say it warned you.
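It may help to see the three tiers as a single decision flow. The Swift sketch below is purely illustrative: Apple does not expose this routing to users or developers, and every name in it (ProcessingTier, AIRequest, chooseTier, and the flags that stand in for Apple’s own heuristics) is hypothetical. It simply models the behavior Apple has described, with the third-party handoff gated behind the user’s consent.

```swift
import Foundation

// Hypothetical sketch only: Apple does not expose this routing to users or
// developers. The names below are invented to model the decision flow
// described above, not Apple's actual implementation.

enum ProcessingTier {
    case onDevice             // the small model running locally on the iPhone or Mac
    case privateCloudCompute  // Apple's larger models on Apple-operated servers
    case thirdPartyAssistant  // e.g., ChatGPT, used only with the user's consent
}

struct AIRequest {
    let prompt: String
    let needsLargeModel: Bool        // too complex for the on-device model
    let needsOutsideAssistant: Bool  // needs a third-party model such as ChatGPT
}

// Returns nil when the request would require a third-party assistant
// but the user has declined the handoff.
func chooseTier(for request: AIRequest, userAllowsThirdParty: Bool) -> ProcessingTier? {
    if request.needsOutsideAssistant {
        // Nothing leaves Apple's ecosystem without an explicit "yes."
        return userAllowsThirdParty ? .thirdPartyAssistant : nil
    }
    if request.needsLargeModel {
        return .privateCloudCompute
    }
    return .onDevice
}

// Example: a simple rewrite request stays on the device.
let rewrite = AIRequest(prompt: "Make this email more professional.",
                        needsLargeModel: false,
                        needsOutsideAssistant: false)

if let tier = chooseTier(for: rewrite, userAllowsThirdParty: false) {
    print("Handled by: \(tier)")   // prints "Handled by: onDevice"
} else {
    print("Declined: the user did not allow the ChatGPT handoff.")
}
```

The point the sketch captures is that escalation is one-way and consent-gated: a request only moves up a tier when the lower tier can’t handle it, and it never reaches ChatGPT unless you have said yes.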
None of this happens if you don’t turn on Apple Intelligence in the first place. At this point in the AI timeline, Apple Intelligence is not turned on by default, so on your Mac you’ll have to go to System Settings and enable the option (which Apple still labels as “Beta”). At the bottom of that Settings window, you’ll also choose whether to allow the ChatGPT handoff.