03-21-2024, 02:54 PM
(03-21-2024, 08:48 AM)quintessentone Wrote: I was just reading that AI's designers really can't explain how AI comes to some of its final output.
https://en.wikipedia.org/wiki/Explainabl...telligence
Leave it to humans to create a model of language so hard to represent algorithmically that it defies analysis. "Tokenizing" words with a gigantic table of 'values' that represents all the different ways a word can be contextually relevant makes for a tangled web, since every word is a variable made up of variables. I don't envy anyone the task of dissecting the 'audit trail' of "modelled" reasoning, especially when the model changes as it goes.
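To make the "variable made up of variables" point concrete, here's a toy sketch (not any real model's tokenizer; the vocabulary, vectors, and neighbour-averaging rule are all made up for illustration). Each word becomes a token id, each id maps to a row of numbers, and the "contextual" vector for a word mixes in its neighbours, so the same word ends up with different values in different sentences:

```python
# Toy illustration only -- vocabulary and embeddings are invented.
vocab = {"the": 0, "bank": 1, "river": 2, "money": 3}

# One 3-dimensional vector of 'values' per token id.
embeddings = [
    [0.1, 0.0, 0.2],   # the
    [0.5, 0.4, 0.1],   # bank
    [0.0, 0.9, 0.3],   # river
    [0.8, 0.1, 0.0],   # money
]

def contextual_vectors(sentence):
    """Average each token's vector with its neighbours' vectors."""
    ids = [vocab[w] for w in sentence.split()]
    out = []
    for i, _ in enumerate(ids):
        window = ids[max(0, i - 1): i + 2]          # self plus neighbours
        mixed = [sum(embeddings[t][d] for t in window) / len(window)
                 for d in range(3)]
        out.append(mixed)
    return out

a = contextual_vectors("the river bank")
b = contextual_vectors("the money bank")
print(a[-1])   # "bank" next to "river"
print(b[-1])   # "bank" next to "money" -- different numbers for the same word
```

Even in this three-line toy, tracing *why* "bank" got a particular value means unwinding the whole neighbourhood; scale that to tens of thousands of dimensions and layers that rewrite the values as they go, and the audit trail gets tangled fast.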
I think our actual language must now evolve in a way it never had to before... but that's just the nature of reality.
It's ironic that they will need an "AI" to analyze and report on the functioning of another "AI"... searching for the glass box.