11-24-2024, 12:09 PM
So, this is ironic. There's a court case in Minnesota, a state that recently passed a law ostensibly aimed at countering "AI-generated deepfake misinformation". Defending the law, the government introduced an "expert declaration" written by a scholar at the Stanford Internet Observatory.
The declaration references several non-existent papers to support its argument. The declaration itself appears to be AI-generated, hallucinated misinformation.
Quote: Defendant Ellison filed two expert declarations he contends demonstrate Minn. Stat. §609.771 “actually necessary” and counterspeech “insufficient” to address AI-generated deepfakes.
But the Declaration of Prof. Jeff Hancock cites a study that does not exist. No article by the title exists. The publication exists, but the cited pages belong to unrelated articles. Likely, the study was a “hallucination” generated by an AI large language model like ChatGPT. A part-fabricated declaration is unreliable.
https://storage.courtlistener.com/recap/...8.30.0.pdf
Hahahahaha! More coverage of this here and here.
"I cannot give you what you deny yourself. Look for solutions from within." - Kai Opaka