Apple's efforts in AI could pay off in its WWDC announcements, but the company is also keen to protect user data at the same time. Here's how it plans to do that.
Apple is expected to make a number of big plays in AI at WWDC. Major changes are anticipated for iOS 18 and its other operating systems, with app features such as audio transcription apparently on the way.
But, with privacy a core tenet of Apple's work, it's doing what it can to protect its users.
According to sources speaking to The Information, Apple intends to process data from AI applications inside a virtual black box. The concept, known internally as "Apple Chips in Data Centers" (ACDC), would see only Apple's own hardware used to perform AI processing in the cloud.
The idea is that it will control both the hardware and software on its servers, enabling it to design more secure systems.
While on-device AI processing is highly private, the initiative could make cloud processing for Apple customers similarly secure.
On-device processing is inherently private, since data never leaves the device for the cloud. The problem is that it can be a lot slower than cloud processing.
Cloud processing, by contrast, can be far more powerful, albeit with a privacy tradeoff. It's that tradeoff Apple is trying to avoid.
Avoiding use and abuse
Part of the problem is the potential for uploaded data to be misused or exposed by hackers. By relying on cloud servers, AI services pose a risk of user data getting out.
By taking control of how data is processed in the cloud, Apple could more easily implement safeguards that make a breach far harder to pull off.
Furthermore, the black box approach would prevent Apple itself from being able to see the data. As a byproduct, this would also make it difficult for Apple to hand over personal data in response to government or law enforcement requests.
The ACDC initiative could prove even more beneficial to Apple in terms of future device designs. By offloading AI features to the cloud, Apple could reduce the hardware requirements of its future products, enabling lighter wearables and other devices.
Secure Enclaves
Core to the ACDC initiative, which was detailed earlier in May, is the Secure Enclave. Used on the iPhone to store biometric data, the Secure Enclave is a protected element that holds sensitive data like passwords and encryption keys, preventing hackers from accessing it even if they compromise iOS or the hardware.
Under the plan, the Secure Enclave would be used to isolate data processed on the servers, former Apple employees told the report. Doing so means the data can't be seen by other elements of the system, nor Apple itself.