Meta’s Movie Gen model puts out realistic video with sound, so we can finally have infinite Moo Deng
No one really knows what generative video models are useful for just yet, but that hasn't stopped companies like Runway, OpenAI, and Meta from pouring resources into developing them.
Meta claims it outperforms the likes of Runway's Gen-3, LumaLabs' latest model, and Kling 1.5, though as always, comparisons like this are more about showing that Meta is playing the same game than that Movie Gen wins outright. What Meta is clearly aiming for here, however, is not simply capturing the "state of the art" crown for a month or two, but a practical, soup-to-nuts approach where a solid final product can be produced from a very simple, natural-language prompt.

Image Credits: Meta

Camera movements are also generally understood, with things like "tracking shot" and "pan left" taken into account when generating the video.