Apple's AI-Powered Siri Is Finally Here (Maybe): The $1B Gemini Gamble Arrives with iOS 26.4

The Longest "Coming Soon" in Tech History
If you've been following the saga of Apple's Siri overhaul, you've earned a medal for patience. Originally teased as part of Apple Intelligence in mid-2024, the "new Siri" was supposed to ship in spring 2025. Then it slipped to fall 2025. Then to early 2026. Now, with iOS 26.4 in late-stage beta and a public release targeted for late March, the AI-powered Siri is finally approaching the finish line. And in what might be the most surprising partnership in Silicon Valley this decade, the whole thing is running on Google's Gemini.
The irony is thick. Apple, the company that has built its brand on doing everything in-house, is paying Google approximately $1 billion per year to power the most important upgrade to its voice assistant in a decade. The same Apple that once ran an ad campaign mocking virtual assistants is now betting its AI future on the company behind Android, its biggest competitor in the smartphone market.
What Actually Changes
So what does the new Siri actually do? After years of incremental updates that mostly made Siri better at setting timers and telling you the weather, the iOS 26.4 version promises something fundamentally different: a Siri that can actually understand context, maintain multi-turn conversations, and interact with your apps in meaningful ways.
The headline feature is on-screen awareness. The new Siri can see what's on your screen and respond to questions about it. If you're looking at a restaurant listing in Safari and ask "what are the hours?", Siri can read the page and answer without you having to copy and paste anything. If you're reading an email and say "add this to my calendar," Siri can pull the relevant details from the message and create the event. It's the kind of capability that ChatGPT and Google Assistant have offered in various forms, but Apple is integrating it at the operating system level, which should make it more seamless.
The other major upgrade is app intents integration, which lets Siri take actions across third-party apps. In theory, you'll be able to say things like "send the photo I took yesterday to Mom on WhatsApp" and Siri will string together the actions: find the photo, open WhatsApp, select the contact, send it. Apple has been building the App Intents framework since iOS 16, but this is the first time Siri will be smart enough to actually chain multiple intents together into complex workflows.
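The chaining idea can be sketched in a few lines: each intent takes the previous step's output and produces its own. This is an invented illustration of the concept, not Apple's App Intents API; the step names and the `run_workflow` helper are hypothetical.

```python
from typing import Callable

def run_workflow(steps: list[Callable[[str], str]], start: str = "") -> str:
    """Feed each step's output into the next, as in
    'find the photo -> select the contact -> send it'."""
    result = start
    for step in steps:
        result = step(result)
    return result

# Hypothetical stand-ins for chained intents in the WhatsApp example.
send_photo_to_mom = [
    lambda _: "IMG_2041.jpg",                       # FindPhoto
    lambda photo: f"{photo} -> Mom",                # SelectContact
    lambda payload: f"sent {payload} via WhatsApp", # Send
]
```

The hard part Apple is solving isn't the plumbing shown here; it's having a model smart enough to pick the right steps and parameters from a single spoken sentence.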
The Gemini Architecture
The technical architecture is what makes this partnership so interesting. Apple isn't just slapping a Google logo on Siri and calling it a day. Instead, the company has built a three-tiered system that routes queries based on complexity and privacy sensitivity.
Tier 1 handles simple tasks entirely on-device using Apple's own models. Setting alarms, controlling HomeKit devices, basic calculations: none of this ever leaves your iPhone. Apple's on-device models have been optimized for the A18 and M-series chips and can handle a surprising range of tasks without any cloud connection.
Tier 2 sends more complex requests to Apple's Private Cloud Compute servers. This is Apple's custom silicon running in data centers, designed so that even Apple engineers can't access your data during processing. Natural language understanding, contextual reasoning, and most multi-step tasks are handled here.
Tier 3 is where Gemini comes in. Only the most demanding queries, things that require the full power of Google's 1.2-trillion-parameter Gemini model, get routed to Google's infrastructure. Critically, Apple inserts a privacy buffer layer before any data reaches Google, stripping personally identifiable information and routing queries through anonymous channels. Apple claims that Google cannot associate queries with specific users or devices.
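The three-tier routing described above can be sketched as a simple dispatcher. Everything here is illustrative: the complexity thresholds are invented, and the email-masking function is a toy stand-in for what is surely a far broader PII-stripping pipeline in the real privacy buffer.

```python
import re
from enum import Enum

class Tier(Enum):
    ON_DEVICE = 1      # Tier 1: simple tasks, never leave the iPhone
    PRIVATE_CLOUD = 2  # Tier 2: Apple's Private Cloud Compute
    GEMINI = 3         # Tier 3: Google's Gemini, behind the privacy buffer

def route(complexity: int) -> Tier:
    """Pick a tier from a 0-10 complexity estimate (thresholds invented)."""
    if complexity < 3:
        return Tier.ON_DEVICE      # alarms, HomeKit, basic calculations
    if complexity < 7:
        return Tier.PRIVATE_CLOUD  # contextual, multi-step tasks
    return Tier.GEMINI             # open-ended reasoning

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scrub_for_tier3(text: str) -> str:
    """Toy stand-in for the privacy buffer: mask email-like tokens
    before a query leaves Apple's infrastructure."""
    return EMAIL.sub("[redacted]", text)
```

The design point is that escalation is one-directional: a query only climbs to a higher tier when the lower one can't handle it, so the most data-hungry path is also the least traveled.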
The financial arrangement reflects how much Apple is relying on Tier 3. That $1 billion annual payment makes Google one of the biggest AI infrastructure providers in the world, and it locks Apple into a dependency that will be very hard to unwind. Apple continues to develop its own foundation models in parallel, but internal reports suggest they're at least 18 months behind Gemini in capability.
The Delay Problem
Here's where the story gets complicated. While iOS 26.4 is on track for late March, multiple reports suggest that not all of the promised Siri features will ship with the initial release. The most advanced capabilities, particularly the deep app integration and complex multi-step task handling, may be pushed to iOS 26.5 in May or even iOS 27 in September.
The engineering challenges have been well-documented. Apple's original approach tried to bolt new AI capabilities onto Siri's legacy architecture, which was built in 2011 and has accumulated 15 years of technical debt. Internal testing reportedly showed that this hybrid approach failed in about a third of test cases, with Siri either misunderstanding the request, taking the wrong action, or falling back to a basic web search.
The pivot to Gemini solved the intelligence problem but created new integration challenges. Getting Google's cloud models to work seamlessly with Apple's on-device processing and privacy constraints has required extensive engineering that's taking longer than anticipated. Apple even pulled a series of TV advertisements that promoted Siri features which had not yet shipped, which tells you everything about how far behind schedule the project has fallen.
The likely outcome is that iOS 26.4 ships with a genuinely improved Siri that's noticeably smarter in basic conversations and on-screen awareness, but the more ambitious features roll out gradually over subsequent updates. Apple has done this before with Apple Intelligence itself, which launched with iOS 18 in a barebones state and added features incrementally.
What This Means for the AI Race
The Apple-Gemini deal reshapes the competitive landscape of the AI industry in ways that extend well beyond Siri. Google effectively becomes the default AI provider for over 2 billion active Apple devices worldwide. That's a distribution advantage that no amount of OpenAI marketing or Anthropic enterprise sales can match.
For Google, the deal validates its strategy of making Gemini the "AI infrastructure layer" rather than just a consumer-facing product. Samsung has already committed to doubling its Gemini-equipped devices to 800 million by end of 2026. Between Apple and Samsung alone, Gemini could be running on over 2.5 billion devices by the end of this year. That's the kind of scale that makes Google's AI investment look far more rational than the "are they spending too much?" narrative that dominated in 2025.
For OpenAI and Anthropic, the deal is a strategic blow. Both companies had reportedly pitched Apple on powering the Siri upgrade, and losing that contract to Google means losing the single largest distribution channel for consumer AI. OpenAI still has its ChatGPT integration in Apple devices as a separate feature, but being the engine behind Siri would have been a fundamentally different level of integration.
For consumers, the most important question is whether a Gemini-powered Siri can close the gap with ChatGPT and Google Assistant in real-world usage. The benchmark numbers suggest it should. GPT-5.4 scored 83% on the GDPVal benchmark for economically valuable tasks, but Gemini 3.1 Pro's performance is competitive, and Apple's system-level integration gives it advantages that standalone apps can't match. The best AI isn't always the smartest; sometimes it's the one that's most deeply woven into how you actually use your phone.
The Privacy Trade-Off
The elephant in the room is privacy. Apple has spent decades building a brand around the idea that your data stays on your device. "What happens on your iPhone, stays on your iPhone" was one of its most effective marketing campaigns. Now a significant portion of Siri queries will be processed by Google, a company whose entire business model is built on collecting and monetizing user data.
Apple's privacy buffer layer is real technology, not just marketing. The three-tiered architecture genuinely minimizes how much data reaches Google, and the anonymous routing is a meaningful privacy measure. But "minimizes" is not "eliminates," and for Apple's most privacy-conscious users, the idea that any of their Siri queries touch Google's servers will be uncomfortable.
The counterargument is pragmatic: Apple tried to build its own models and fell 18 months behind the competition. Shipping an inferior product to maintain ideological purity on privacy would arguably have been worse for users than partnering with Google under strict privacy controls. Sometimes the perfect is the enemy of the good, and Apple apparently decided that a genuinely smart Siri with strong privacy protections was better than a proudly private Siri that couldn't answer your questions.
What to Watch
The iOS 26.4 public release is expected in the last week of March or possibly the first week of April. When it drops, pay attention to the real-world Siri experience rather than the feature checklist. The question isn't whether Siri can do more things on paper; it's whether the experience feels natural enough that people actually start using voice commands for complex tasks instead of just timers and weather.
Watch the developer community's reaction to the App Intents framework. If third-party developers build rich Siri integrations quickly, it could create a flywheel effect where Siri becomes genuinely useful across the app ecosystem. If developers are slow to adopt, Siri's new capabilities will feel limited to Apple's own apps.
And watch for the inevitable privacy backlash. The moment someone demonstrates that Siri queries are reaching Google's servers in a way that feels like surveillance, Apple will face a communications crisis. The company's entire defense rests on the privacy buffer layer working as advertised. If it doesn't, the Apple-Gemini partnership becomes the biggest brand risk Apple has taken since the iPhone antenna controversy, except this time the stakes are much, much higher.
References
- Apple picks Google's Gemini to run AI-powered Siri coming this year - CNBC
- Apple Confirms Revamped Siri is Still Coming in 2026 - MacRumors
- Apple Explains How Gemini-Powered Siri Will Work - MacRumors
- iOS 26.4 Beta Launch: Siri Upgrades and New Emojis Set for March Release - ITP.net
- Why Apple's iOS 26.4 Siri Upgrade Will Be Bigger Than Originally Promised - MacRumors