Luma AI’s latest video model, Ray3, is now available exclusively in the Firefly app, where users can generate high-quality 10-second clips and explore creative concepts in Firefly Boards. The premiere follows Google’s introduction of Gemini 2.5 Flash Image last month.

Ray3 brings significant advancements, including native HDR support for richer contrast and improved scene coherence through its new multimodal reasoning system. Filmmakers and producers can use Ray3 for pre-visualization, generating environments and shot compositions during planning, and its quick prototyping makes it well suited to adding depth to product tutorials, Instagram Reels, or TikTok transitions.

Adobe says all AI-generated content in Firefly is transparent, with Content Credentials identifying the AI model used, and it pledges not to train on user-generated outputs. Ray3 joins models from OpenAI and Runway in Firefly, with further integrations from Moonvalley and Topaz Labs on the horizon.