Adobe’s Firefly AI, the text-to-image tool behind features like Photoshop’s generative fill, is coming to the Apple Vision Pro as a native app, joining the company’s popular Lightroom photo editing software, which was demonstrated during the headset’s announcement.
The creative software giant announced in a press release that the new Firefly experience had been “purpose-built” for the headset’s visionOS system, allowing users to move and place images generated by the app onto real-world spaces like walls and desks.
The interface of the Firefly visionOS app should be familiar to anyone who’s used the web-based version of the tool — users just enter a text description in the prompt box at the bottom and hit “generate.” The app then spits out four different images that can be dragged out of the main app window and placed around the home like virtual posters or prints.
Meanwhile, we also now have a better look at the native Adobe Lightroom photo editing app that was mentioned back when the Apple Vision Pro was announced last June. The visionOS Lightroom experience is similar to that of the iPad version, with a cleaner, simplified interface that should be easier to navigate with hand gestures than the more feature-laden desktop software.
There’s no shortage of creative VR applications on other platforms. Google’s Tilt Brush let folks paint in virtual reality environments as far back as 2016, for example, when it was released for the HTC Vive. But Apple just launched the most ambitious VR headset yet. Its historical focus on creative professionals, coupled with Adobe’s strong embrace of Apple Silicon, could make the Vision Pro’s eye-watering $3,500 price tag worth the investment for some creatives.