Apple plans to spend $1 billion a year on Google AI services, using a 1.2 trillion parameter model to greatly upgrade Siri

Wallstreetcn
2025.11.05 20:04

Reports indicate that the new version of Siri is expected to launch in the spring of next year, with Google's Gemini model handling Siri's core functions such as information synthesis and task execution. However, Apple emphasizes that this is only a transitional solution and that it is still developing its own trillion-parameter model. Following the announcement, the stock prices of both companies briefly surged to intraday highs on Wednesday.

Apple Inc. is planning to adopt a 1.2 trillion parameter artificial intelligence model developed by Google to provide technical support for the long-promised upgrade of its Siri voice assistant.

On November 5th, media reports citing informed sources revealed that the two parties are finalizing an agreement in which Apple will pay approximately $1 billion annually for the use of Google's technology.

The new version of Siri is expected to be launched in the spring of next year, with Google's Gemini model responsible for handling Siri's core functions such as information synthesis and task execution.

After the news was announced, the stock prices of both companies briefly surged to intraday highs on Wednesday. Apple's stock rose less than 1% to $271.70, while Alphabet climbed as much as 3.2% to $286.42, before both gains retreated.

Although Apple emphasizes that this is only a transitional solution and is still developing its own 1 trillion parameter model, Google's Gemini 2.5 Pro currently leads most large language model rankings, making it difficult for Apple to catch up.

Significant Improvement in AI Processing Capability

Google's customized Gemini system represents a major technological leap.

Compared to the 150 billion parameter model currently used by the cloud-based version of Apple Intelligence, the Gemini model's 1.2 trillion parameter scale will significantly expand the system's processing capability and its ability to understand complex data and context.

Wallstreetcn previously mentioned that Apple had considered using other third-party models for this task. After testing Gemini, OpenAI's ChatGPT, and Anthropic's Claude, Apple settled on Google earlier this year. Apple intends to use the technology as a temporary solution until its own model is powerful enough.

The project is internally codenamed Glenwood at Apple and is led by Mike Rockwell, the creator of the Vision Pro headset, and software engineering chief Craig Federighi. The new voice assistant, codenamed Linwood, is planned to ship as part of iOS 26.4.

Clearly Defined Technical Architecture

According to reports, under the agreement, Google's Gemini model will handle Siri's information summarization and task planning functions, which help the voice assistant synthesize information and determine how to execute complex tasks. Some Siri functions will continue to use Apple's internal model.

The model will run on Apple's own private cloud computing servers, ensuring that user data is isolated from Google's infrastructure. Apple has allocated AI server hardware to support the operation of this model.

Reports indicate that despite the substantial scale of the cooperation, it is unlikely to be publicly promoted; Apple will treat Google as a behind-the-scenes technology supplier. This makes the agreement different from the two companies' Safari deal, which made Google the browser's default search engine.

The agreement is also independent of earlier discussions about integrating Gemini directly as a chatbot into Siri.

Those discussions came close to producing an agreement in 2024 and again earlier this year, but ultimately fell through. The collaboration also does not involve embedding Google's AI search into Apple's operating system.

During Apple's recent earnings call, CEO Tim Cook stated that Siri may eventually offer more chatbot options, not just the current ChatGPT.

Apple is not the only company adopting Gemini-driven AI capabilities; several large companies, including Snap, are building applications based on Google's Vertex AI platform. However, for Apple, this move signifies its acknowledgment of having fallen behind in the AI field and its willingness to rely on external technology to catch up.

The Path of Independent Development Continues

Apple still does not intend to rely on Gemini as a long-term solution.

According to media reports citing informed sources, although the company is losing AI talent, including the head of the model team, management still intends to continue developing new AI technologies and hopes to eventually replace Gemini with an internal solution.

To this end, Apple's model team is developing a cloud model with one trillion parameters, which they hope to use for consumer applications as early as next year.

Apple executives believe they can achieve a quality level similar to that of the customized Gemini system. However, Google is also continuously enhancing Gemini, which will make the gap difficult to close. Gemini 2.5 Pro currently ranks among the top models in most large language model comparisons.