Microsoft's Copilot image tool generates ugly Jewish stereotypes, anti-Semitic tropes
Neutral prompts such as "Jewish boss" output offensive images.
Google’s Gemini generated controversy when, in an attempt to improve representation, it went too far: creating images that were racially and gender diverse but historically inaccurate (a female pope, non-White Nazi soldiers).

Microsoft engineer Shane Jones, who complained to the FTC about harmful content from Copilot Designer, noted that, in his tests, it had created sexualized images of women in lingerie when asked for “car crash” and demons with sharp teeth eating infants when prompted with the term “pro choice.”

I shared some of the offensive Jewish boss images with Microsoft’s PR agency a month ago and received the following response: “We are investigating this report and are taking appropriate action to further strengthen our safety filters and mitigate misuse of the system.”