The DOJ makes its first known arrest for AI-generated CSAM

Engadget


Will Shanklin has been writing about gadgets, tech and their impact on humanity since 2011. Before joining Engadget, he spent five years creating and leading the mobile technology section for New Atlas. His work has also appeared on SlashGear, TechRadar, Digital Trends, AppleInsider, Android Central, HuffPost and others.

The US Department of Justice arrested a Wisconsin man last week for generating and distributing AI-generated child sexual abuse material. As far as we know, this is the first case of its kind as the DOJ looks to establish a judicial precedent that exploitative materials are still illegal even when no children were used to create them.

The DOJ says 42-year-old software engineer Steven Anderegg of Holmen, WI, used a fork of an open-source AI image generator to make the images, which he then used to try to lure an underage boy into sexual situations. The latter will likely play a central role in the eventual trial on the four counts of “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.”

The government says Anderegg’s images showed “nude or partially clothed minors lasciviously displaying or touching their genitals or engaging in sexual intercourse with men.” The DOJ claims he used specific prompts, including negative prompts, to generate the images. Even though no real children were used to create the material, the government argues it could still normalize and encourage abuse, or be used to lure children into predatory situations.

“Technology may change, but our commitment to protecting children will not,” Deputy AG Monaco wrote. “The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material—or CSAM—no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children.”

 
