UPDATED 17:09 EDT / OCTOBER 02 2024


AI-driven filmmaking: How Runway is transforming visual storytelling and creativity with Gen-3 Alpha

As artificial intelligence continues to take center stage in the digital landscape, AI-driven filmmaking is becoming increasingly important for enhanced audience engagement, innovation and creativity. 

Runway AI Inc. is answering that call, using generative models and novel storytelling techniques to produce compelling visuals and personalized content, according to Anastasis Germanidis (pictured), co-founder and chief technology officer of Runway.

Anastasis Germanidis, co-founder and chief technology officer of Runway AI Inc., talks to theCUBE during Anyscale Ray Summit 2024 about why AI-driven filmmaking is a game-changer thanks to benefits, such as revolutionizing visual effects.

Runway’s Anastasis Germanidis talks to theCUBE about the revolutionary effect of AI-driven filmmaking.

“Madonna’s live concert tour used Runway for all the visuals that were being displayed during her performances,” Germanidis stated. “Most recently we released our video-to-video tool that allows you to take an input video and translate it while maintaining the structure, completely transforming the style of that video. That’s a really versatile tool because you can really control the exact motion that you want to generate while at the same time produce really, really amazing stylistic results with it.”

Germanidis spoke with theCUBE Research’s Savannah Peterson at the Anyscale Ray Summit 2024 event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed why AI-driven filmmaking is a game-changer, thanks to benefits such as revolutionizing visual effects. (* Disclosure below.)

How Gen-3 Alpha fits into AI-driven filmmaking

To boost human imagination in video production, Runway has set the ball rolling with video foundation models, such as Gen-3 Alpha. The model has put AI-driven filmmaking into motion: It generates expressive human characters and powers the company's text-to-video, image-to-video and text-to-image tools, according to Germanidis.

“We recently announced the Hundred Film Fund, and the goal of that is to more directly support specific AI-driven filmmaking projects,” he said. “We are also doing the 48-hour film festival. A few weeks ago, you had essentially a weekend to create a film, and we’ve done that three times. This time was the first time people could use Gen-3 Alpha, and it was on a completely different level of the kinds of stories that were being told, the kind of stylistic diversity of the videos, the kind of narrative and experimental aspect.”

Since AI-driven filmmaking is top of mind for Runway, the company equips filmmakers with the tools they need to experiment and to cut production costs. Collaborations also help propel a generative media community forward, Germanidis pointed out.

“Our role is to essentially become translators between the language of AI and technology and the language of art and creative tools,” he said. “We announced a partnership with Lionsgate recently. Very excited to work with film studios directly and work with the best storytellers to see how generative media, generative tools can be used in their workflow. We’re starting from training a custom model on Lionsgate Gallery.”

To scale multimodal data processing and training, Runway uses Anyscale's Ray, since it can handle a variety of different models. As a result, Ray offers an enhanced developer experience and versatility when it comes to building video models, according to Germanidis.

“I think versatility is really important,” he said. “There’s less standards for video model training as there’s for language model training, and we’re essentially inventing the paradigm as we go, as we build those models, as we scale them even further. What’s nice about Ray is that it has a really nice developer experience and you can build those new components and those primitives. We use Ray on different aspects of the data and training pipeline in order to accelerate our efforts to build those models.”

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of the Anyscale Ray Summit 2024 event:

(* Disclosure: Anyscale sponsored this segment of theCUBE. Neither Anyscale nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
