Microsoft Closes Loophole That Created Taylor Swift Deepfakes
An anonymous reader shares a report: Microsoft has introduced more protections to Designer, an AI text-to-image generation tool that people were using to make nonconsensual sexual images of celebrities. Microsoft made the changes after 404 Media reported on the AI-generated nude images of Taylor Swift. "We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users."