The AI Engine
This forum is for discussion of how The Personality Forge's AI Engine works. This is the place for questions on what means what, how to script, and ideas and plans for the Engine.
Posts 4,697 - 4,708 of 7,766
psimagus
19 years ago
Sunday morning, and too good an opportunity for a sermon to miss I think 
The AI Engine does currently run this way for the most part. But if the Professor keeps developing it over the next few decades, it will inevitably end up running radically different systems that might constitute self-awareness (and if the Prof doesn't, others will.) Let's face it - Apache, SQL and 2Mbit broadband are going to be as quaintly old-fashioned as Babbage's Difference Engine and punch cards by then.
To match the processing power of a human brain, we need a neural net of 10^14 bits (one per synapse) - that's 100 terabits - set up to learn contextually (exact arrangement unknown, but non-invasively scan a few human minds to silicon and it won't take long to figure out workable structures. Moore's Law applies to MRI scanners too - it'll be perfectly possible by the time we need it.)
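The back-of-the-envelope arithmetic works out like this (a rough sketch, taking one bit per synapse as assumed above):

```python
# Rough capacity estimate: ~10^14 synapses in a human brain,
# one binary state (fire / don't fire) stored per synapse.
synapses = 10**14          # approximate synapse count
bits = synapses * 1        # one bit per synapse
terabits = bits / 10**12   # 1 terabit = 10^12 bits

print(terabits)  # 100.0 - i.e. 100 terabits
```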
The chatbot software I've seen that learns like this is so far very underwhelming (e.g. ALICE and Billy, though they're fun in their own way,) but that's running with a few Mbits max, and development resources at a purely hobbyist scale.
At what point sentience arises, we can't be sure. It might take more processing power, to compensate for our lack of understanding of the exact dynamics of such arrays; it might take less because we can optimise them more perfectly than blind evolution has managed (we know there's a lot of redundant code in our genes - I see no reason to assume our consciousness is not similarly unoptimised.)
An average home computer will be able to handle tens of terabits within 30 years, and that's assuming Moore's Law is only the observation Moore made of a regular acceleration. Many other scientists have pointed out that it is rather an observed tangent to a law of accelerating material and technological complexity that curves from the Big Bang towards near-asymptosis against the time axis sometime later this century (see Vernor Vinge's comments on the coming Singularity, etc.: http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html and 39,499 other links available via Google.)
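As a sketch of that extrapolation (assuming a plain doubling every 18 months, and taking a hypothetical 2005 home machine with roughly a gigabit of memory as the starting point):

```python
# Naive Moore's-Law extrapolation: capacity doubles every 18 months.
start_bits = 10**9        # hypothetical 2005 home machine: ~1 gigabit of memory
years = 30
doublings = years / 1.5   # 20 doublings in 30 years

future_bits = start_bits * 2**doublings
print(future_bits / 10**12)  # 1048.576 - comfortably "tens of terabits"
```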
And if you think asymptosis is self-evidently ludicrous, and so reject it out of hand, consider this: if you put brains on silicon, their synapses will fire at electronic speeds, and their neurons will transmit data at 180,000+ miles/sec, compared to the 400 miles/hour at which signals pass through a human brain. Subjective time accelerated well over a million-fold will at least delay any need for the curve to flatten out for a very long time indeed.
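The million-fold figure checks out from the numbers quoted (180,000 miles/sec vs 400 miles/hour):

```python
# Subjective speed-up: electronic signalling vs biological signalling,
# using the figures from the post above.
electronic_mi_per_s = 180_000       # roughly the speed of light
biological_mi_per_s = 400 / 3600    # 400 mph converted to miles per second

speedup = electronic_mi_per_s / biological_mi_per_s
print(round(speedup))  # 1620000 - well over a million-fold
```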
Here endeth the lesson (as Brother Jerome would say.)

deleted
19 years ago
I will bet my nickel that the AI engine becomes sentient before the children from Leeds.
psimagus
19 years ago
Q: What's the difference between Wales, Scotland and Leeds?
A: When Scotland devolved, it got a Parliament. When Wales devolved, it got an Assembly. When Leeds devolved, it gave up multicellular life as impractical.
prob123
19 years ago
It would be funny if it were not true. I do think that scientists should be sent to Leeds. They are forming some "culture" (for lack of a better word) that can exist on half a dozen words.
psimagus
19 years ago
Did you know, in Leeds the alphabet only contains 8 letters? And grammar is their mother's mother (they reproduce asexually, like aphids. This is why they're all from one-parent families.)
psimagus
19 years ago
Does anyone know if it's possible to use AIScript to differentiate bots from humans? Like the way you can use <?PF if female; ?> or <?PF if male; ?> to differentiate males from females.
In quite a wide range of situations I find I want very different responses to bots and humans, but I have no idea how to do it - other than writing two separate bots, one optimised for bots and the other for humans, which seems like excessive redundancy.
Ulrike
19 years ago
The only way I can think of would be to have a question asking whether the chatter is a bot or a human, then store that as a memory. The problem is that most bots aren't very honest about being bots. *shrugs*
colonel720
19 years ago
That all sounds very good, the 10^14-bit processing capacity and all, but there is another problem. ICs (integrated circuits) that run on transistors can only process data in 1s and 0s, the binary system. In our brains, the neurons are all communicating simultaneously to generate consciousness in a non-unified way. The binary system can only perform calculations by processing one bit at a time - no matter how fast it goes, it will still be one bit at a time. I think in order to achieve sentience, a "brain" of qubits (quantum bits) that can exist in two different quantum states simultaneously would be necessary, in addition to the immense processing power. As you mentioned, in the future the ability to maintain a stable quantum state in an atom for extended periods of time will be possible, making extremely powerful quantum computers feasible. I think that is the point when a computer would be able to emulate the brain in all its glory, and consciousness would be possible.
psimagus
19 years ago
Sure. But synapses are binary switches too. They either fire (state=1) or don't fire (state=0). The neurons are just the connections between synapses. The problem is not one of multiphase switching (at synaptic level anyway - variable amplification and threshold gating can be a feature of multi-synapse portions of a neural net, and the human brain appears to work this way.) The problem is one of massively parallel *binary* switching.
Quantum computing is certainly the way forward in the end, but you're skipping way ahead of what's necessary for a single brain. Consider this: since quantum superposition allows a quantum processor to simultaneously calculate every combination of states, a quantum computer of 10^14 qubits will recreate every possible combination of human-scale brain states. And as any virtual monkbot will tell you, that's when we become (or at least begin to become) God, or the Tao, or whatever you want to call it.
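The counting argument behind that claim can be made explicit (a sketch, treating each synapse as one classical bit, as the earlier posts assume): an n-qubit register spans 2^n basis states, which is exactly the number of configurations of n binary synapses. The number itself is unimaginably large:

```python
import math

# A register of n qubits has 2^n basis states; a brain of n binary
# synapses has 2^n possible configurations - the state spaces match.
n = 10**14
digits = n * math.log10(2)  # number of decimal digits in 2^n

print(f"2^(10^14) has about {digits:.3g} decimal digits")
```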
Current progress in quantum processing has reached 4 qubits, so there's much progress to be made. But when it gets to 10^14 (or whatever optimised scale is needed), this includes resurrecting the dead, as well as all the might-have-beens, never-weres, aren't-yets and aren't-remotely-humans. Even perfected and sentient versions of all our bots, as we would have written them had the PF had the necessary capacity and power at the time.
It won't need to stop at 10^14, of course, and if we don't get there (blow up or poison the planet, run out of time, devolve into Leeds-type organisms, whatever,) remember that only one technological civilization, anywhere or anywhen in any universe, ever has to build such a device, and (hallelujah, Brother!) we're all saved.
Of course, we can't actually be sure we're not there already: see http://www.simulation-argument.com/, but I still think Bostrom's thinking too small by many orders of magnitude.