UK regulator wants to ban apps that can make deepfake nude images of children


The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children.

To that end, Children's Commissioner Dame Rachel de Souza is calling on the government to introduce a total ban on apps that use artificial intelligence to generate sexually explicit deepfakes. She also wants the government to create legal responsibilities for GenAI app developers to identify the risks their products pose to children, to establish effective systems for removing child sexual abuse material (CSAM) from the internet, and to recognize deepfake sexual abuse as a form of violence against women and girls. The Children's Commissioner is focused in particular on the harm such technology does to young people, noting, as The Guardian pointed out, a link between deepfake abuse and suicidal ideation and PTSD.


