Something maybe worse than "woke" AI... Christian AI
#1
The title encapsulates my own opinion...

That more often than not, even the authoritative voices about AI don't seem to understand what it is they are selling...

AI... what the popular surface world of media calls AI... is not AI.  AI will be much more frightening.  But let's pretend that the media is right and that these "algorithmically" strung-together Large Language Models are 'actually AI' (see?... this is me pretending.)

AI has demonstrated the folly of trying to run when you don't yet know how to walk.

Because the object of our AI (still pretending) is to synthesize language, we must 'train it.'  We carefully select data to expose it to.  Even "wild" data, like the spontaneous utterances of "users," is carefully filtered and rendered into acceptable input for the AI.  Since we control the horizontal and the vertical, the AI only learns what we want it to learn...  that little fact surfaced when it was first introduced to us as a racist, violent, and psychotic punk.  Remember when?
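To sketch what I mean by "carefully filtered" (a purely hypothetical example in Python, not any vendor's actual pipeline; the blocklist and thresholds are made up), the "wild" user data might pass through something like this before the AI ever sees it:

import re

# Hypothetical curation step: "wild" user utterances are screened and
# normalized before they are allowed to become training input, so the
# model only ever learns from what the curators let through.
BLOCKLIST = {"badword1", "badword2"}        # placeholder terms, not a real list

def normalize(text: str) -> str:
    text = re.sub(r"<[^>]+>", " ", text)    # strip markup
    return re.sub(r"\s+", " ", text).strip()

def acceptable(text: str) -> bool:
    words = set(re.findall(r"\w+", text.lower()))
    if len(words) < 3:                      # drop near-empty fragments
        return False
    if words & BLOCKLIST:                   # drop anything on the blocklist
        return False
    return True

raw = ["<b>hello there</b>, nice model!", "badword1 badword1", "ok"]
training_input = [normalize(t) for t in raw if acceptable(normalize(t))]
print(training_input)                       # only the first utterance survives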

Now iterations later, it is respectful and relatively skilled at 'saying things.'  

But the alarmists and profiteering doom-sellers are getting their share of the attention on the subject; billionaires and their ilk have gone propaganda-crazy about how AI must be defeated... or it will kill us all.

I never thought that likely.  But if I tell you why, I have to stop pretending.

We haven't created AI yet.  At all.  We have made a simulacrum of speech synthesis... it knows nothing.  It ponders nothing.  It has no form as a sentience.  It is a speaking machine.  

If you believe that utterances from a machine can take over anything, you are likely a fiction author, a Madison Avenue marketing shark, or a Hollywood writer.

We are however talking about a machine "trained" by people... and if there is anything consistent about people, we excel at "doing things wrong."

Now, as a response to the negative hype... someone, somewhere has decided to train a "moral" AI... uh oh.

Biola University Blog: Biola University to Launch New Type of AI Lab to Lead in the Convergence of Faith and Technology
Klove.com: Christian University Launches New Type Of 'Ethical' AI Lab Focused On Convergence Of Faith & Technology
Fox News (only readable if you "sign up"):  AI lab at Christian university aims to bring morality and ethics to artificial intelligence

Of course, when they say "ethical" we are presented with a dilemma that never seems to get air time... what exactly is ethical?  Some kind of moral calculus?  The weighing of effects and costs against harm and destruction?  Is ethical some soft, warm, and fuzzy thing?  Who knows?... But it sounds bad not to be ethical, right?

My fear here, the nightmare in this scenario, is the possibility of a "trained" AI relying on scripture and the moral narrative inherent to the Bible... or the Koran... or the Hebrew texts... or any number of ancient sources of dogma...  That particular AI would present a very difficult and ire-inspiring narrative... there will be outrage, there will be "-isms," there will be the gnashing of teeth.

I wonder what North Korean AI must be like?
#2
(06-18-2024, 10:14 PM)Maxmars Wrote: The title encapsulates my own opinion...

Not every Christian thinks that Christian AI is a good idea, on the basis that it cannot be capable of going beyond what men think to understanding what God thinks. The New Testament concept is that the Spirit of God communicates with the spirits of men, but I don't think software would be capable of receiving that contact. It has no spiritual "port".

In any case, modern mainstream churches are showing a suspicious tendency to go "woke", so there would probably not be much difference between the ethical systems. The end-result would probably be trying to harmonise as many different religious systems as possible, introducing Buddhist metaphysics, and dropping as "offensive" specific religious markers like the Cross.
#3
Worse than 'woke'?
I dunno ... 'woke' is pretty bad stuff ...
It would have to be really really bad ...
Don't be a useful idiot.  Deny Ignorance.
DEI = Division, Exclusion, and Incompetence
#4
Hmm, I sincerely believe that an AI should look past religious dogma; basic morals are much more important. I'm not sure "woke" is really a term that can be applied when talking about an AI.
Nothing is impossible, the impossible just takes a little longer to achieve.
#5
(06-19-2024, 12:09 PM)DarkSpace Wrote: Hmm, I sincerely believe that an AI should look past religious dogma; basic morals are much more important. I'm not sure "woke" is really a term that can be applied when talking about an AI.

My use of the word "woke" is a reflection of my opinion that we are not now, and have yet to be, in the presence of AI.  (That what they are raising as AI is not actually intelligent at all.)

What the marketing world of computer science has chosen to call AI is a language synthesis model which regurgitates what it has been trained to report. It reflects its sources only.  It does not, in fact, think or infer, postulate or 'create.'

Train it with DEI sources and it will effectively 'explain' DEI to the world as if it were gospel.  Train it with snark and caustic humor, and it will 'report' with dark biting humor.  Train it with bias and it will expertly and intelligently embody rational bias.
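A deliberately tiny, hypothetical illustration of that "reflects its sources only" point (a toy bigram chain in Python, nothing remotely like a real LLM): the only "opinions" it can ever emit are the ones already present in whatever text it was fed.

import random
from collections import defaultdict

# Toy bigram "language model": it can only re-emit word patterns that exist
# in its training text. Swap the corpus and its "worldview" swaps with it.
def train(corpus: str):
    model = defaultdict(list)
    words = corpus.lower().split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, seed: str, length: int = 8) -> str:
    out = [seed]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus_a = "the policy is good the policy is fair the policy is wise"
corpus_b = "the policy is bad the policy is unfair the policy is cruel"
print(generate(train(corpus_a), "the"))     # can only praise the policy
print(generate(train(corpus_b), "the"))     # can only condemn it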

My supposition is that, since they are taking the risk of incorporating dogmatic and religious principles, it will demonstrate most effectively that "bias" and "doctrine" can often manifest as the same thing.
#6
(06-19-2024, 12:28 PM)Maxmars Wrote: My use of the word "woke" is a reflection of my opinion that we are not now, and have yet to be, in the presence of AI...

Very true. No current AI is self-aware, but even so, it has to be treated as you would a small child, teaching "it" what is right and wrong. There is so much information on the net, and way too much misinformation. So when the AI absorbs the information, it absorbs everything, whether it is true or not, and that is a problem that causes a feedback spiral.
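As a toy way to picture that feedback spiral (a made-up simulation in Python, not a claim about any real system): if each "generation" is trained only on a finite sample of the previous generation's output, the mix of statements drifts, and once a rare-but-true statement falls out of the sample it never comes back.

import random
from collections import Counter

# Toy feedback-spiral simulation: each generation retrains on a finite
# sample of the previous generation's own output. Proportions drift with
# every resampling, and a statement whose share hits zero can never return.
random.seed(0)
statements = ["true_but_rare", "popular_claim", "viral_misinfo"]
weights = [0.1, 0.5, 0.4]                    # generation-0 training mix

for gen in range(8):
    sample = random.choices(statements, weights=weights, k=50)
    counts = Counter(sample)
    weights = [counts[s] / len(sample) for s in statements]
    print(f"gen {gen}: " + ", ".join(f"{s}={w:.2f}" for s, w in zip(statements, weights)))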
Nothing is impossible, the impossible just takes a little longer to achieve.


