Apple offers $1M to hackers to secure private AI cloud

Apple is offering $1 million to researchers who find security flaws in its new Private Cloud Compute service, emphasising the importance of privacy and data security in its latest AI tools.


Apple is raising the stakes in its commitment to data security by offering up to $1M to researchers who can identify vulnerabilities in its new Private Cloud Compute service, set to debut next week. The service will support Apple's on-device AI model, Apple Intelligence, enabling more powerful AI tasks while prioritising user privacy. The bug bounty programme targets serious flaws, with the top rewards reserved for exploits that could allow remote code execution on Private Cloud Compute servers.

Apple's updated bug bounty programme also includes rewards of up to $250,000 for any vulnerability that could expose sensitive customer information or user prompts processed by the private cloud. Security issues affecting sensitive user data in less critical ways can still earn researchers substantial rewards, signalling Apple's broad commitment to protecting its users' AI data.

With this move, Apple builds on past security initiatives, including its specialised research iPhones designed to help researchers probe device security. The new Private Cloud Compute bug bounty is part of Apple's approach to ensuring that as its AI capabilities grow, so does the infrastructure that keeps user data secure.