Many of you probably stayed up late to watch yesterday's Apple launch event. If you follow AI, you may have gone to bed still wondering: where has Apple's AI gone?
So today, let's take a look at some of the "hidden" AI features at this launch event.
First up is the AirPods Pro 3. Setting aside the better sound quality, spatial audio, and the new heart rate sensor, one feature stands out: real-time translation. The live translation demo for the AirPods Pro 3 at the event was genuinely impressive. What the other person says can be translated into speech in real time and played in your ear, or what you say can be translated into text in real time and displayed on your phone for the other person to read. If both people are wearing AirPods Pro 3, the conversation can be translated in both directions at once. This demands not only capable headphone hardware but also a good enough AI algorithm, so for now we can reasonably say Apple's AI has made some progress in on-device applications on the earbud side.
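Just to make the flow concrete, here is a minimal Swift sketch of what a listen-translate-speak loop could look like using public frameworks (Speech and AVFoundation). The `translate(_:)` step is a placeholder, and this is an illustration of the concept only, not Apple's actual implementation, which runs across the earbuds and the phone together.

```swift
import Speech
import AVFoundation

// Illustrative only: stream microphone audio into on-device speech recognition,
// pass the text through a (placeholder) translate step, then speak the result.
// Permission requests and error handling are omitted for brevity.
final class LiveTranslatorSketch {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let audioEngine = AVAudioEngine()
    private let synthesizer = AVSpeechSynthesizer()

    func start() throws {
        // Feed microphone buffers to the recognition request.
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        _ = recognizer?.recognitionTask(with: request) { result, _ in
            guard let result, result.isFinal else { return }
            // translate(_:) stands in for whatever translation model is used.
            let translated = self.translate(result.bestTranscription.formattedString)
            let utterance = AVSpeechUtterance(string: translated)
            utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")
            self.synthesizer.speak(utterance)
        }
    }

    private func translate(_ text: String) -> String {
        // Placeholder: a real app might call Apple's Translation framework
        // or an on-device model here.
        return text
    }
}
```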
The second is the AI built into the phone itself. This time, the entire iPhone 17 series supports Center Stage. Beyond the new square sensor, Apple's AI is hidden here too: it automatically detects where a face is in the frame and keeps it centered, and in multi-person scenes it automatically switches to a wider view. It sounds ordinary, but it can meaningfully improve the selfie experience.
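For a sense of the underlying idea, here is a rough Swift sketch using the public Vision framework: detect faces, then choose a crop that keeps them centered and widens when more than one face appears. Again, this illustrates the concept, not how Apple actually implements Center Stage.

```swift
import Vision
import CoreGraphics

// A rough illustration (not Apple's implementation) of the idea behind
// Center Stage: detect faces, then pick a crop that keeps them centered,
// widening the crop when more than one face is in frame.
func centeredCrop(for image: CGImage) throws -> CGRect {
    let request = VNDetectFaceRectanglesRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])

    let faces = request.results ?? []
    guard !faces.isEmpty else {
        // No face found: keep the full frame (normalized coordinates).
        return CGRect(x: 0, y: 0, width: 1, height: 1)
    }

    // Union of all face bounding boxes (normalized 0...1 coordinates).
    let union = faces.map(\.boundingBox)
        .reduce(faces[0].boundingBox) { $0.union($1) }

    // Single face: fairly tight crop around it; multiple faces: zoom out more,
    // loosely mimicking the switch to a wider field of view.
    let padding: CGFloat = faces.count > 1 ? 0.35 : 0.15
    return union.insetBy(dx: -padding, dy: -padding)
        .intersection(CGRect(x: 0, y: 0, width: 1, height: 1))
}
```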
In addition, Apple's computational photography has also been upgraded this time. The new Photonic Engine leans on more machine learning models, bringing noticeable improvements in noise reduction, natural detail retention, and color accuracy. Combined with the Visual Intelligence recognition feature, users get a better imaging experience. Isn't that exactly what AI is for?
Most important of all is the chip that carries Apple's AI: the A19 Pro. Its 6-core CPU, 6-core GPU, and 16-core Neural Engine deliver up to 4x the peak compute of the A18 Pro, providing plenty of headroom for deploying Apple's AI on-device. Once Apple Intelligence makes real progress, the iPhone could quickly become one of the world's leading on-device AI devices.
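For developers, the usual route to that Neural Engine is Core ML. The snippet below is a generic, hedged example of loading any compiled model with the compute units left open to the Neural Engine; the model name is a placeholder, and this says nothing about how Apple Intelligence itself is deployed.

```swift
import CoreML

// Sketch of steering an on-device model toward the Neural Engine via Core ML.
// The URL points at any compiled .mlmodelc bundle; nothing here is specific
// to Apple Intelligence.
func loadOnDeviceModel(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // .all lets Core ML schedule work across the CPU, GPU, and Neural Engine;
    // .cpuAndNeuralEngine would keep inference off the GPU entirely.
    config.computeUnits = .all
    return try MLModel(contentsOf: url, configuration: config)
}
```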
Of course, all of the above is just optimistic expectation. Apple Intelligence is still struggling: the mainland China version is nowhere in sight, and the overseas version is mediocre. Besides the AI-powered Siri it is still polishing, Apple is actively turning to external AI models such as ChatGPT and Gemini; for China, the candidates are Alibaba's Tongyi Qianwen (Qwen) and Baidu's Wenxin Yiyan (ERNIE Bot). Which will ultimately be used, and how, remains unknown.
In the short term, the "absence" of AI features makes Apple look somewhat conservative in public opinion, and some users even see it as lagging behind its competitors. But from a longer-term strategic perspective, Apple may be laying the groundwork for a larger-scale Siri upgrade, smarter iterations of iOS, and even the expansion of the Vision Pro ecosystem.
An industry observer put it this way: "This has always been Apple's style. It won't rush something out before the technology is polished. When the AI features do arrive, they are likely to be a tightly integrated closed loop rather than a scattered pile of features."