Apple is reportedly working on an enhanced version of Siri that incorporates large language models (LLMs) to create a more conversational and intuitive assistant.
Bloomberg reports, citing sources, that Apple is developing a new version of its voice assistant, Siri, powered by advanced large language models (LLMs).
Apple is attempting to keep pace with its competitors in AI, which have introduced features such as Google’s Gemini Live that are more natural to converse with than Siri. In response, the company is building a more conversational Siri experience.
The new assistant is expected to completely replace the Siri interface that users currently rely on, and Apple intends to release the feature in the spring of 2026.
The feature appears comparable to OpenAI’s Advanced Voice Mode, but with the same level of access to personal information and apps that Siri currently has.
Until then, Apple is relying on third-party providers to power the iPhone’s more sophisticated AI capabilities.
OpenAI’s ChatGPT will be integrated into Apple Intelligence in December, and Apple has reportedly held talks with other AI providers, including Google and Anthropic, about similar arrangements.