With Apple Intelligence in iOS 26, your device can now understand the content of your photos, enabling powerful new automations. This deep dive explains how to use a new action in Shortcuts to analyze images and automatically sort them based on what they contain, such as whether people are present. It turns the tedious task of organizing your photo library into an automated workflow, saving you time and effort.
The core of this feature is the enhanced Use Model action in Shortcuts. Previously limited to text, this action can now accept images as input. You can ask a question about the image in natural language and receive a structured response. A key addition is the ability to request a Boolean output (a simple 'yes' or 'no'), which allows you to build powerful conditional logic directly into your shortcuts. For instance, you can now create a workflow that checks if people are in a photo and then performs a specific action based on the answer.
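Shortcuts actions aren't exposed as code, but if you want a feel for that Boolean check in Swift, here is a minimal sketch that approximates the same yes/no "are people present?" question using Apple's Vision framework. This is an illustrative stand-in, not how the Use Model action works internally, and the `containsPeople` name is ours:

```swift
import CoreGraphics
import Vision

// Illustrative stand-in for the Use Model action's Boolean output:
// "Does this photo contain people?" answered with Vision's human detector.
func containsPeople(in image: CGImage) throws -> Bool {
    let request = VNDetectHumanRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // A non-empty results array means at least one person was detected,
    // which maps to the "yes" branch of an If action in Shortcuts.
    return !(request.results ?? []).isEmpty
}
```

In the shortcut itself, the equivalent is feeding the Use Model action's Boolean result into an If action and branching on the answer.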
For the best results, be specific with your prompts in the Use Model action. Instead of just "people," using "people, human beings" can improve accuracy. You can also chain multiple analysis shortcuts together. For example, first sort photos with people, then run a second shortcut on the remaining photos to separate landscapes from photos of pets.
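To make that chaining concrete, here is a hedged Swift sketch of the two-pass sort. The `askModel` helper and `sortLibrary` function are hypothetical, standing in for the Use Model action's yes/no prompt; the prompts mirror the specific wording suggested above:

```swift
import Foundation

// Hypothetical stand-in for the Use Model action: ask a yes/no question
// about an image and get a Boolean back. Stubbed for illustration.
func askModel(_ question: String, about image: URL) -> Bool {
    // In a real shortcut this is the Use Model action with a Boolean output.
    return false
}

// Two-pass sort: people first, then landscapes vs. pets on the remainder.
func sortLibrary(_ photos: [URL]) -> (people: [URL], landscapes: [URL], pets: [URL], other: [URL]) {
    var people: [URL] = [], landscapes: [URL] = [], pets: [URL] = [], other: [URL] = []
    for photo in photos {
        // Pass 1: a specific prompt ("people, human beings") improves accuracy.
        if askModel("Does this photo contain people, human beings?", about: photo) {
            people.append(photo)
        // Pass 2 runs only on the remainder, mirroring a second chained shortcut.
        } else if askModel("Is this photo a landscape?", about: photo) {
            landscapes.append(photo)
        } else if askModel("Does this photo show a pet?", about: photo) {
            pets.append(photo)
        } else {
            other.append(photo)
        }
    }
    return (people, landscapes, pets, other)
}
```

Running the landscape and pet questions only on photos that failed the people check keeps each pass focused, which is exactly what chaining two shortcuts accomplishes.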
Image analysis in Shortcuts, powered by Apple Intelligence, fundamentally changes how you can interact with your photo library. By enabling your device to understand visual content, you can automate sorting, filtering, and metadata generation in ways that were previously impossible. This capability allows you to build custom, intelligent workflows that keep your photos perfectly organized with minimal manual effort.
I love making simple Apple Shortcuts that save time and make your devices feel smarter. Want more? Join my Shortcuts community for the Database, Shortcut of the Week, live streams, and help from other members. Sign up here.