Apple Intelligence hasn’t landed in the way Apple likely hoped it would, but that’s not stopping the company from continuing to iterate on its suite of AI tools. During its WWDC 2025 conference on Monday, Apple announced a collection of new features for Apple Intelligence, starting with upgrades to Genmoji and Image Playground that will arrive alongside iOS 26 and the company’s other updated operating systems.
In Messages, you’ll be able to use Image Playground to generate colorful backgrounds for your group chats. Apple has also integrated ChatGPT into the tool, so it can produce images in entirely new styles. As before, if you use ChatGPT through your iPhone this way, your information will only be shared with OpenAI if you grant permission.
Separately, Genmoji will allow users to combine two emoji from the Unicode library to create new characters. For example, you might merge the sloth and light bulb emoji if you want to poke fun at yourself for being slow to understand a joke.
Across Messages, FaceTime and the Phone app, Apple is bringing live translation to the mix. In Messages, the company’s on-device AI models will translate a message into your recipient’s preferred language as you type. When they respond, each message will be instantly translated into your language. In FaceTime, you’ll see live captions as the person you’re chatting with speaks, and on a phone call, Apple Intelligence will generate a spoken translation.

Visual Intelligence is also in line for an upgrade. Now, in addition to working with your iPhone’s camera, the tool can scan what’s on your screen. Like Genmoji, Visual Intelligence will benefit from deeper integration with ChatGPT, allowing you to ask the chatbot questions about what you see. Alternatively, you can search Google, Etsy and other supported apps to find images or products that might be a visual match. And if the tool detects that you’re looking at an event, iOS 26 will suggest adding it to your calendar. Nifty, that. To access Visual Intelligence, just press the same buttons you would use to take a screenshot on your iPhone.
As expected, Apple is also making it possible for developers to use its on-device foundation model in their own apps. “With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they’re…
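Apple hasn’t spelled out the framework’s full API in this announcement, but based on its description, a basic prompt-and-response call might look something like the minimal Swift sketch below. The `LanguageModelSession` type and `respond(to:)` method are assumptions drawn from Apple’s framing of the framework, not confirmed signatures.

```swift
import FoundationModels

// Minimal sketch: prompting Apple's on-device foundation model (iOS 26+).
// LanguageModelSession and respond(to:) are assumed names based on Apple's
// description of the Foundation Models framework; exact details may differ.
func suggestTitle() async throws -> String {
    // A session represents an exchange with the on-device model.
    let session = LanguageModelSession()

    // Inference runs locally on the device, so no prompt data
    // leaves the phone for this request.
    let response = try await session.respond(
        to: "Suggest a short title for a journal entry about a coastal hike."
    )
    return response.content
}
```

If the shipping API resembles this sketch, the draw for developers is that inference happens locally on the device rather than on a server, since the framework is built on Apple’s on-device foundation model.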