Korean women remove pictures, videos from social media amid deepfake porn crisis
South Korea's National Assembly on Thursday passed a bill to punish people who possess, purchase, save or view deepfake sexual materials and other fabricated videos.
Lucas Lee, director of startup Deepbrain AI, said his company launched a deepfake detection system in March, developed in partnership with the Korean National Police Agency to help combat such crimes in the country.

Aside from law enforcement, companies, including entertainment agencies, are also turning to similar technologies to detect fake videos and images, after cases in which actresses and singers fell victim to such crimes.

Human Rights Watch said online gender-based violence is a widespread problem in South Korea, where judges, prosecutors, police and lawmakers, the vast majority of whom are men, do not take these crimes seriously enough.