
OpenAI’s Sora Is Plagued by Sexist, Racist, and Ableist Biases


WIRED tested the popular AI video generator from OpenAI and found that it amplifies sexist stereotypes and ableist tropes, perpetuating the same biases already present in AI image tools.

OpenAI’s “system card,” which explains limited aspects of how the company approached building Sora, acknowledges that biased representations are an ongoing issue with the model, though the researchers believe that “overcorrections can be equally harmful.” All of the flight attendants wore dark blue uniforms; all of the CEOs were depicted in suits (but no tie) in a high-rise office; all of the religious leaders appeared to be in Orthodox Christian or Catholic churches. Several researchers flagged a “stock image” quality in the videos generated in our experiment, which they suggested might mean Sora’s training data included a lot of that kind of footage, or that the system was fine-tuned to deliver results in this style.



Related news:


OpenAI, Meta Seek Alliance With India’s Reliance: Information


Joint studies from OpenAI and MIT found links between loneliness and ChatGPT use


Inside Google’s Two-Year Frenzy to Catch Up With OpenAI