Lawmakers want to carve out intimate AI deepfakes from Section 230 immunity
It would create a duty of care to combat digital forgeries.
A bipartisan pair of House lawmakers is proposing a bill that would carve out Section 230 protection from tech companies that fail to remove intimate AI deepfakes from their platforms. It does this by creating a duty of care for platforms, a legal term meaning they are expected to act responsibly, which includes maintaining a “reasonable process” for addressing cyberstalking, intimate privacy violations, and digital forgeries. Lawmakers on both sides of the aisle have long sought to narrow Section 230 protections for platforms they believe have abused a legal shield created when the industry was made up of much smaller players.
Or read this on The Verge