Minnesota's legislature passed a bill that bans AI tools designed to generate non-consensual fake nude images. The measure now heads to Governor Tim Walz for signature and is expected to become law.
The bill does two things. First, it outright prohibits the creation and distribution of deepfake pornography using AI. Second, it gives victims a direct right to sue the creators of these tools, not just the people who use them. That's the enforcement mechanism that matters here.
This reflects a broader regulatory trend: states and regional bodies are moving faster than the U.S. federal government on AI regulation. The EU's AI Act already includes provisions addressing synthetic media, and several U.S. states have passed their own bans on non-consensual deepfakes. Minnesota's twist is holding tool creators liable alongside users.
For crypto and tech builders, the precedent stings. It establishes that platforms and protocol developers can face legal liability for what users do with their technology. That's not new to crypto, but it's expanding into AI. If Walz signs this, expect other states to follow with similar language.
The tech industry will argue enforcement is impossible. The crypto industry knows better. You can always find someone to sue when incentives align.
