Almost every company in tech has jumped on the artificial intelligence bandwagon by now, and Apple is no exception. What makes it a little different is how it plans to handle the data security and privacy issues that come with AI.
At its annual WWDC event earlier this month, the company unveiled Apple Intelligence, its flavor of AI, which Apple promises will set a new standard for AI privacy and security. That's despite its plans to "seamlessly integrate" OpenAI's ChatGPT into its products and software.

But some security experts say that while they don’t doubt Apple’s intentions, which they say remain uniquely altruistic for the industry, the company has its work cut out for it and, potentially, a new target on its back.
Apple's AI security and privacy promises, along with its intention to be transparent about how it plans to use AI technology, are a "step in the right direction," said Ran Senderovitz, chief operating officer of Wing Security, which specializes in helping companies secure the third-party software they use on their systems.
Those pledges follow Apple's long-standing focus on minimizing data collection and making a point of not using it for profit, Senderovitz said. That makes the company stand out in a "jungle" of an industry that not only remains unregulated but has also so far failed to establish its own codes and standards.
Unlike Apple, companies like Meta and Google have business models that predate the popularization of AI, built on collecting, sharing and selling user data to middlemen, advertisers and others.
But the introduction of AI tools like large language models and machine learning, which have the potential to drive great progress and innovation, comes with significant privacy and confidentiality issues, Senderovitz said.
Putting data into an LLM like ChatGPT "is like telling a friend a secret that you hope they forget, but they don't," Senderovitz said. It's difficult to know or control where that data goes next. And even if the entered data is immediately destroyed, what the LLM learned from it lives on.
And OpenAI's widely popular LLM will be a big part of Apple Intelligence. Starting later this year, it will appear in features like Siri and writing tools, but Apple promises that its users will have control over when ChatGPT is used and will be asked for permission before any of their information is shared.
Traditionally, Apple has kept customer data secure and private by limiting what it collects to the minimum necessary to operate the software or device in question. In addition, the company built its phones, computers and other devices with enough horsepower to keep processing sensitive data on the device, rather than sending it to a cloud server somewhere.
After all, data that is never collected can never be lost, stolen or sold. But AI, by its very design, changes that. AI needs data in order to train and become more powerful, and some AI operations simply can't be performed on standard phones and laptops.
Craig Federighi, Apple’s senior vice president of software engineering, said during the company’s WWDC keynote that an understanding of personal context such as a user’s daily routine and relationships is essential for AI to be truly useful, but it must be done in the right way.
"You don't have to hand over all the details of your life to be stored and analyzed in someone's artificial intelligence cloud," Federighi said.
To ensure that, Apple says it will continue to keep as much AI processing as possible on the device itself. What can't be done on a phone or computer will be sent to its Private Cloud Compute system, which allows for greater processing power as well as access to larger AI models.
The data sent is never stored or made accessible to Apple, Federighi said, adding that as with Apple devices, independent experts can inspect the code running on the servers to ensure Apple is fulfilling that promise.
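For readers who want a concrete picture of that split, here is a minimal, purely hypothetical Swift sketch of the kind of routing decision Federighi describes. The AIRequest type, the size threshold and the routeRequest function are invented for illustration only; they are not Apple's actual APIs, and the real system weighs far more than model size.

// Illustrative sketch only -- not Apple's actual API or logic.
// Assumes a made-up AIRequest type and a fixed on-device size limit
// to show the idea of preferring local processing and escalating to a
// stateless cloud tier only when a task is too large to run locally.
import Foundation

struct AIRequest {
    let prompt: String
    let estimatedModelSizeMB: Int   // rough proxy for how heavy the task is
}

enum ExecutionTarget {
    case onDevice                   // handled entirely on the user's hardware
    case privateCloud               // sent to a hardened, stateless server tier
}

func routeRequest(_ request: AIRequest,
                  onDeviceLimitMB: Int = 3_000) -> ExecutionTarget {
    // Prefer local processing whenever the workload fits on the device;
    // only larger workloads are escalated to the cloud tier.
    return request.estimatedModelSizeMB <= onDeviceLimitMB ? .onDevice : .privateCloud
}

// Example: a short summarization prompt stays local, a heavyweight task doesn't.
let summary = AIRequest(prompt: "Summarize today's calendar", estimatedModelSizeMB: 1_500)
let heavy = AIRequest(prompt: "Draft a long report from these documents", estimatedModelSizeMB: 8_000)
print(routeRequest(summary))   // onDevice
print(routeRequest(heavy))     // privateCloud

The only point of the sketch is the order of preference: try the user's own hardware first, and send work to the cloud tier only when it can't be done locally.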
Keeping the private cloud private
Josiah Hagen, a senior security researcher for Trend Micro with more than 20 years of AI system experience, has no doubt that Apple will try its best to fulfill these promises. And he said the cloud offers several security advantages — specifically, that its larger size makes it easier to spot anomalies and stop potential security threats before they become problems.
What will be key, he said, is whether Apple will be able to build controls that can prevent attackers from using AI to do more than intended with the apps it’s connected to.
"I think we're going to start seeing the takeover of AI models for nefarious purposes," Hagen said, adding that although cybercriminals can use ChatGPT to dig through piles of stolen data, an army of free, AI-powered bots can do that work faster and cheaper.
Hagen also worries that the tech giant doesn’t use outside companies to help secure its cloud. It can be hard to see the cracks in your security armor when you’ve built it yourself, and an outside perspective can be crucial to finding them before cyberattackers do, he said.
"Apple is not a security company," Hagen said. "Securing your ecosystem is hard. You're going to have gaps in your armor, whoever you are."
Additionally, after years of focusing on traditional PC and Windows systems, cybercriminals are now increasingly attacking iOS systems with malware, and there’s no guarantee that Apple’s closed system will keep them out. It’s that closed system model that worries Hagen more than Apple’s connection to ChatGPT.
Freelance security professionals, who look for flaws in computer systems and then hand them over to companies in exchange for payments known as bug bounties, will become an even more important part of Apple’s defense, he said.
Regarding privacy, Hagen said it's possible that legal or cost concerns could prompt Apple to begin modifying its privacy practices, sending the company down a slippery slope that eventually ends with it changing its terms of service to allow customer data to be used to train the next version of its AI.
That's also a concern for Senderovitz, who said he and his researchers are keeping a close eye on any changes to Apple's terms and conditions, particularly regarding its data-sharing practices with third-party collaborators like OpenAI. Although Apple has been big on promises, he said, it has so far been short on specifics.
"We will need to look at the fine print," he said.
Editors' note: CNET used an AI engine to help create several dozen stories, which have been tagged accordingly. The note you're reading is attached to articles that deal substantively with the topic of AI but are created entirely by our expert editors and writers. For more, see our AI policy.