03-21-2024, 08:58 PM
This post was last modified 03-22-2024, 11:02 AM by Maxmars.
Edit Reason: grammar
 
I was so tempted to make this into another thread, but it's so relevant here to the "marketing" angle...
From ReadWrite.com: This AI realized it was being tested.
Here we have the very same "AI" "product" (Opus) in the media... The one that uttered the sentence proclaiming itself as something more than algorithms.
But now, in an attempt to bolster the noise...
Claude 3 Opus, Anthropic’s new AI chatbot, has caused shockwaves once again as a prompt engineer from the company claims that it has seen evidence that the bot detected it was being subject to testing, which would make it self-aware.
I'm sure the shockwaves were in the marketing department. This was a recall test: bury a piece of information within a large data set and measure whether the system can retrieve it in its output.
The system report noted that the data found was inconsistent with the provided data set....
Their narrative paints the AI as "deducing, supposing, attributing motives." Why? Because large language models are designed to elaborate. It's not thought, no matter how much you may want it to be... "Generative language" is no more "artificial intelligence" than an engine is a car.