If you upgraded to the latest iPhone recently, you may have noticed Apple Intelligence popping up in some of your most-used apps, such as Messages, Mail, and Notes. Apple Intelligence (yes, it also abbreviates to AI) arrived in the Apple ecosystem in October 2024, and it's how Apple is competing with Google, OpenAI, Anthropic, and others to build the best artificial intelligence tools.
What is Apple Intelligence?

Cupertino executives have pitched the platform as "AI for the rest of us." It's designed to leverage the things generative AI already does well, like generating text and images, to improve existing features. Like other platforms, including ChatGPT and Google Gemini, Apple Intelligence was trained on large language models. These systems use deep learning to form connections, whether in text, images, video, or music.
The LLM-powered text capabilities present themselves as Writing Tools. The feature is available across various Apple apps, including Mail, Messages, Pages, and Notifications. It can be used to summarize long text, proofread, and even write messages for you, using prompts for content and tone.
Image generation has been integrated as well, in similar fashion, albeit a bit less seamlessly. Users can prompt Apple Intelligence to generate custom emojis (Genmojis) in an Apple house style. Image Playground, meanwhile, is a standalone image generation app that uses prompts to create visual content that can be used in Messages and Keynote or shared via social media.
Apple Intelligence also marks a long-awaited facelift for Siri. The smart assistant was early to the game but has been mostly neglected over the past several years. Siri is now integrated much more deeply into Apple's operating systems; for example, instead of the familiar icon, users see a glowing light around the edge of their iPhone screen when the assistant is at work.
More important, the new Siri works across apps. That means, for example, you can ask Siri to edit a photo and then insert it directly into a text message. It's a frictionless experience the assistant previously lacked. Onscreen awareness means Siri uses the context of the content you're currently engaged with to provide an appropriate answer.
Leading up to WWDC 2025, many expected Apple to introduce us to a souped-up version of Siri, but it turns out we'll have to wait a little longer.
"As we've shared, we're continuing our work to deliver the features that make Siri even more personal," said Apple SVP of Software Engineering Craig Federighi.
This more personalized version of Siri is supposed to be able to understand "personal context," like your relationships, communication routines, and more. But according to a Bloomberg report, the in-development version was too error-ridden to ship, hence the delay.
At WWDC 2025, Apple also unveiled a new AI feature called Visual Intelligence, which helps you search for the things you see as you browse, along with a Live Translation feature that can translate conversations in real time in the Messages, FaceTime, and Phone apps.
Visual Intelligence and Live Translation are expected to become available later in 2025, when iOS 26 launches to the public.
When was Apple Intelligence unveiled?
After months of speculation, Apple Intelligence took center stage at WWDC 2024. The platform was announced in the wake of a torrent of generative AI news from companies like Google and OpenAI, stoking concern that the famously tight-lipped tech giant had missed the boat on the latest tech craze.
Contrary to that speculation, however, Apple had a team in place, working on what proved to be a very Apple approach to artificial intelligence. There was still plenty of pizzazz amid the demos (Apple always loves to put on a show), but Apple Intelligence is ultimately a very pragmatic take on the category.
Apple Intelligence isn't a standalone feature. Rather, it's integrated into existing offerings. While it's a branding exercise in a very real sense, the large language model technology works behind the scenes. As far as consumers are concerned, the technology mostly presents itself as new features in existing apps.
We learned more during Apple's iPhone 16 event in September 2024, where the company touted a number of AI-powered features coming to its devices, from translation on the Apple Watch Series 10 and visual search on iPhones to a number of tweaks to Siri's capabilities. The first wave of Apple Intelligence arrived at the end of October, as part of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.
The features launched first in U.S. English. Apple later added localized English support for Australia, Canada, New Zealand, South Africa, and the U.K. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese arrives in 2025.
Who gets Apple Intelligence?

The first wave of Apple Intelligence arrived in October 2024 via iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1. These updates included integrated writing tools, image cleanup, article summaries, and typing input for the redesigned Siri experience. A second wave of features became available as part of iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2. That list includes Genmoji, Image Playground, Visual Intelligence, Image Wand, and ChatGPT integration.
These offerings are free to use, so long as you have one of the following pieces of hardware:
- All iPhone 16 models
- iPhone 15 Pro Max (A17 Pro)
- iPhone 15 Pro (A17 Pro)
- iPad Pro (M1 and later)
- iPad Air (M1 and later)
- iPad mini (A17 Pro)
- MacBook Air (M1 and later)
- MacBook Pro (M1 and later)
- iMac (M1 and later)
- Mac mini (M1 and later)
- Mac Studio (M1 Max and later)
- Mac Pro (M2 Ultra)
It's worth noting that only the Pro versions of the iPhone 15 have access, owing to shortcomings of the standard model's chipset. Presumably, though, the entire iPhone 16 line is able to run Apple Intelligence.
How does Apple's AI work without an internet connection?

When you ask GPT or Gemini a question, your query is sent to external servers to generate a response, which requires an internet connection. Apple, by contrast, has taken a small-model, bespoke approach to training.
The biggest benefit of this approach is that many tasks become far less resource-intensive and can be performed on-device. That's because, rather than relying on the kitchen-sink approach that fuels platforms like GPT and Gemini, the company has compiled in-house datasets for specific tasks such as, say, composing an email.
That doesn't apply to everything, though. More complex queries make use of the new Private Cloud Compute offering. The company now operates remote servers running on Apple Silicon, which it claims allows it to offer the same level of privacy as its consumer devices. Whether an action is performed locally or via the cloud is invisible to the user, unless the device is offline, at which point remote queries will throw an error.
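For developers, that local-versus-cloud split surfaces as an availability check. Below is a minimal sketch, using the Foundation Models framework Apple introduced at WWDC 2025 (covered later in this article), of how an app might confirm the on-device model is usable before sending it a request; the specific unavailability reasons are assumptions based on Apple's published API and may differ in practice.

```swift
import FoundationModels

// A minimal sketch: check whether the on-device Apple Intelligence model
// is usable before sending it a request. The unavailability reasons are
// assumptions based on Apple's published Foundation Models API.
func canUseOnDeviceModel() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        return true  // Safe to run the request locally.
    case .unavailable(let reason):
        // e.g., Apple Intelligence disabled, model still downloading,
        // or hardware not eligible.
        print("On-device model unavailable: \(reason)")
        return false
    }
}
```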
Apple Intelligence with third-party applications

A lot of noise was made about Apple's pending partnership with OpenAI ahead of Apple Intelligence's launch. Ultimately, though, the deal turned out to be less about powering Apple Intelligence and more about offering an alternative platform for the things it isn't really built for. It's a tacit acknowledgment that building a small-model system has its limitations.
Apple Intelligence is free. So, too, is access to ChatGPT. However, those with paid ChatGPT accounts get access to premium features that free users don't, including unlimited queries.
The ChatGPT integration, which debuted with iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, serves two primary roles: supplementing Siri's knowledge base and adding to the existing Writing Tools options.
With the service enabled, certain questions will prompt the new Siri to ask the user to approve its accessing ChatGPT. Recipes and travel planning are examples of questions that may surface the option. Users can also directly prompt Siri to "ask ChatGPT."
Compose is the other core ChatGPT feature available through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt, joining existing writing tools like Style and Summary.
We know with certainty that Apple plans to partner with additional generative AI services. The company has all but said that Google Gemini is next on that list.
Can developers build on Apple's AI models?
At WWDC 2025, Apple announced what it calls the Foundation Models framework, which lets developers tap into its AI models while offline.
This makes it possible for developers to build AI features into their third-party apps that piggyback on Apple's existing systems.
"For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging," Federighi said at WWDC. "And because it happens using on-device models, this happens without cloud API costs (…) We couldn't be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart, available when you're offline, and that protect your privacy."
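In practice, the entry point is a small Swift API. The sketch below, assuming the LanguageModelSession interface Apple demonstrated at WWDC 2025, shows how a third-party app might generate quiz questions from notes, in the spirit of the Kahoot example above; treat the exact names and signatures as illustrative rather than definitive.

```swift
import FoundationModels

// A hedged sketch of the Foundation Models framework: ask the on-device
// model to turn study notes into quiz questions, echoing the Kahoot
// example above. Runs offline, with no cloud API costs, per Apple's pitch.
// Names follow Apple's WWDC 2025 demos but should be treated as illustrative.
func makeQuiz(from notes: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You turn study notes into short quiz questions."
    )
    let response = try await session.respond(
        to: "Write three quiz questions based on these notes:\n\(notes)"
    )
    return response.content
}
```

Because the session runs against the system model, the app ships no weights of its own and pays no per-token API fees, which is the trade Apple is pitching to developers.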