Apple has announced that developers can now access its on-device artificial intelligence system, Apple Intelligence, to build AI features into their own apps. The company made the announcement at its Worldwide Developers Conference (WWDC), marking a shift in how Apple shares its AI capabilities.
The new Foundation Models framework lets developers integrate Apple’s on-device AI model directly into their applications with as few as three lines of code. Apps can offer intelligent features that work offline and protect user privacy, because the AI processing happens entirely on the user’s device rather than in the cloud.
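A minimal sketch of what that looks like in Swift, based on Apple’s description of the framework: the session and request calls follow the pattern Apple has shown for Foundation Models, while the prompt itself is illustrative.

```swift
import FoundationModels

// Open a session with the on-device foundation model and send
// it a prompt. All inference runs locally; no data leaves the
// device. Requires hardware that supports Apple Intelligence.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Suggest a title for a photo journal about hiking."
)
print(response.content)
```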
Apple also introduced major updates to its development tools. Xcode 26, the company’s integrated development environment, now includes built-in support for ChatGPT and other large language models, allowing developers to write code, generate tests, and fix errors with AI assistance directly within their workflow.
Apple Intelligence has expanded to support nine languages: English, French, German, Italian, Portuguese, Spanish, Japanese, Korean, and Chinese. The system now includes live translation capabilities and works across iPhone, iPad, Mac, and Apple Watch.
The features are available for testing through Apple’s developer program, with a public beta planned for next month. The AI capabilities require newer Apple hardware: iPhone 15 Pro and later models, and iPads and Macs with an M1 chip or newer.
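Because the model only runs on that newer hardware, apps are expected to check availability before exposing AI features. A hedged sketch of such a check, assuming the availability API follows the pattern Apple has documented (the exact case names may differ):

```swift
import FoundationModels

// Ask the system whether the on-device model can be used.
// Older hardware, or Apple Intelligence being turned off,
// surfaces here as an unavailable state with a reason.
let model = SystemLanguageModel.default
switch model.availability {
case .available:
    print("Foundation model is ready for requests.")
case .unavailable(let reason):
    print("Model unavailable: \(reason)")
}
```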