Apple Goes Third Party to Jumpstart Siri

In December, Apple moved to consolidate its AI leadership under Federighi, completing a transition that had begun earlier in the year when responsibility for Siri was removed from the AI group and brought under Federighi's software division. In January, Apple announced plans to use Google's Gemini AI models to power future AI upgrades, including an improved version of Siri. In Federighi's view, integrating a third-party model would allow Apple to finally ship a revamped Siri later this year after controversially postponing the update in 2025.

However, the report also outlines internal concerns about the implications of placing AI under Federighi's control. People who have worked closely with him describe him as highly cost-conscious and skeptical of investments with uncertain returns. That approach stands in notable contrast to rivals such as OpenAI, Meta Platforms, and Google, which are pouring tens of billions of dollars into data centers, chips, and AI researchers.

Given how AI and LLMs are developing, I don't think this is the wrong choice in the short-to-mid term, and maybe not even in the long run. We're heading toward a world with four or five basically-identical-for-general-use LLMs. Every few months there's a new leader for general use, and then everybody catches up. It's a market rapidly heading toward commoditization (though on a time scale likely stretched by the insane compute and power costs). Spending billions of dollars to be at par (or worse) isn't terribly prudent, and it likely wouldn't be underwritten by investors anyway (meaning Apple would be funding it out of cash).

Instead, Apple gets to use a good (or great) LLM without investing billions to build one. And given how easy it is to switch LLMs (even setting aside the privacy nits, which any provider would agree to in exchange for Apple paying them a large sum of money each year), Apple isn't locked into any one provider. It can also be aggressive on the M&A front when the market inevitably shakes out and some (many?) of these companies that are valued at 10 to 100 times what they're actually worth are forced to reckon with an unclear path forward once the money tap dries up.

What Apple needs to focus on is getting ahead of what comes next for consumers, which is probably running bespoke, targeted models (rather than generalized LLMs) on devices with far less power and memory than an army of servers. That's a game Google is certain to be attacking, and one the larger players in the market (like OpenAI and Anthropic) are less likely to play. Being boxed out by your #1 frenemy when models go small is a much bigger risk.