OpenAI’s Sora Is Plagued by Sexist, Racist, and Ableist Biases
WIRED tested the popular AI video generator from OpenAI and found that it amplifies sexist stereotypes and ableist tropes, perpetuating the same biases already present in AI image tools.
The “system card” from OpenAI, which explains limited aspects of how the company approached building Sora, acknowledges that biased representations are an ongoing issue with the model, though the researchers believe that “overcorrections can be equally harmful.” All of the flight attendants wore dark blue uniforms; all of the CEOs were depicted in suits (but no tie) in a high-rise office; all of the religious leaders appeared to be in Orthodox Christian or Catholic churches. Several researchers flagged a “stock image” effect in the videos generated in our experiment, which they suggest might mean Sora’s training data included lots of that footage, or that the system was fine-tuned to deliver results in this style.