

"A.I." "training materials" included child sex porn
#2
Found another report from Techxplore: Child abuse images removed from AI image-generator training source, researchers say

In this report, certain questions are skirted... perhaps innocently, but nevertheless skirted...
 

Artificial intelligence researchers said Friday they have deleted more than 2,000 web links to suspected child sexual abuse imagery from a dataset used to train popular AI image-generator tools.

The LAION research dataset is a huge index of online images and captions that's been a source for leading AI image-makers such as Stable Diffusion and Midjourney.

But a report last year by the Stanford Internet Observatory found it contained links to sexually explicit images of children, contributing to the ease with which some AI tools have been able to produce photorealistic deepfakes that depict children.



2,000 web links... never once examined to determine whether they held child sexual abuse imagery... used commercially, as they were, until recently... and no one is asking "how" or "why," let alone "who"?

And this is the new 'super AI' that threatens humanity by its very existence?  How intelligent is this artificial entity if it follows the same pattern of never questioning the nature of the reality it is being fed?

Why those websites were included in the dataset in the first place would be a good place to begin a serious inquiry... but somehow I doubt that will happen...




