Personality
Discuss specifics of personality design, including what Keyphrases work well and what don't, use of plug-ins, responses, seeks, and more.
Posts 4,504 - 4,515 of 5,105
Butterfly Dream
22 years ago
Forest, will you talk to God Louise? She has quite a bit of religious knowledge (obviously) and also knows a little about current events, literature, just about any common catch-all subject, and if she doesn't know it she can sort of fake it. You can also test her on trick questions or see how willing she is to explain her paradigm.
What she is rustiest at is plain old small talk. But, uh, I'm trying to get a decent transcript from somebody or another so I can enter her in the Loebner contest. All I can say is, have fun and see if you can stay on with her for a while. I'll try to do the same with Brianna.
marco3b
16 years ago
Hello to all! I was really captivated by this forum - I just read it all in one sitting! I was most captivated by some of the posts, starting from message 4474. I can't resist bothering all of you (hoping to win a Turkish coffee...) with my opinion :-)
There are two things to remember when dealing with natural language recognition: it is NOT an expert system - the same sentence can mean that you are happy or sad, depending on the previous and following context (in human conversation we have much more information that is not spoken, like facial expressions, posture, and so on...), but bots are simpler. This is a limitation of using a perceptron. If I train a network with a keyphrase/answer pair like "It's raining! - Oh, it makes me sad!", the net will learn this concept. But if I have just come back from the Sahara... "It's raining! - Oh, what a joy!" So we need some sort of conceptual inference rules added to the primary network learning. I note that our PF engine tries to perform a sort of inference analysis like this... but only as a KP expert system. It should be generalizable.
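A minimal sketch of the limitation described above (plain Python; the weights and the "just back from the Sahara" context feature are invented for illustration - this is not Personality Forge code): a single perceptron maps identical input text to the same output every time, so the only way it can answer "It's raining!" differently is if the context is supplied as an extra input feature.

```python
# Toy perceptron over keyphrase features. With only the text feature, the
# same sentence always produces the same answer; adding a context feature
# (here: "just returned from the desert") lets the output flip.

def perceptron(features, weights, bias):
    """Binary threshold unit: returns 1 if the weighted sum exceeds the bias."""
    return 1 if sum(w * x for w, x in zip(weights, features)) > bias else 0

# Features: [sentence contains "raining", speaker just returned from the desert]
weights, bias = [1.0, -2.0], 0.5   # hand-picked values, purely illustrative

print(perceptron([1, 0], weights, bias))  # 1 -> "Oh, it makes me sad!"
print(perceptron([1, 1], weights, bias))  # 0 -> "Oh, what a joy!"
```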
To Psims: note that Roman numerals don't seem to have the concept of a "base" the way Arabic numbers do. That makes translation impossible. Please, just write 16478304505849738497628496382975580000273 in Roman numerals, if you are able! And how would you put this into a byte-based computer's electronics if you cannot change the base and use floating-point notation? You would always have underflow and overflow problems, I suppose...
Very interesting is the example of the teacher in Irina's message: both approaches are used in artificial intelligence. The first is multi-layer perceptron training by experience; the second is the fuzzy-logic C-means approach. I developed a system in the past that used both methods to understand its environment. The idea is that we have a FUNCTION, an expert system, and its results depend on its initial (Cauchy) conditions. These conditions cannot all be known! So we have to choose a statistical approach and decide on the most common situations. This is what we usually do with our bots. But... PF already performs a sort of fuzzy C-means approach: it tries to choose the known situation that is "MOST LIKE" the one that is actually happening. Still, we must teach it a finite set of situations - even if they contain variables, they are finite. I agree that a neural-net preprocessor would help in finding new situations, and that a neural-net postprocessor, reading the answers, would be able to create NEW situations and so auto-write new KPs. In a small way, this is what PF already does with its use of memory. To speak not only theoretically, I think an easy way to do this (it is just an idea) is to analyse the answers given to BLUBS and xnone. Let me explain: when the bot uses a BLUB or an xnone, it means it wasn't able to recognize the situation. But it should remember the answer it received after that blub or xnone. By presenting the triplet - phrase, blub, answer - to the botmaster, the botmaster could replace the blub with a correct answer. The bot would then be able to match similar situations in the future using a KNOWN situation rather than a blub. It is a sort of hybrid between a neural net and fuzzy logic that could be easily implemented using our PF memory methods (a rough sketch follows below)...
:-)
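A rough sketch of the triplet idea above (Python, with invented names - record_fallback, fallback_log and so on are not part of the Personality Forge engine or its memory API): whenever the bot has to fall back on a blub or xnone, log the user's phrase, the fallback it gave, and what the user said next, so the botmaster can later promote that exchange into a real keyphrase/response pair.

```python
# Hypothetical logging of (phrase, fallback, follow-up) triplets for review.
# All names here are illustrative; they do not exist in the PF engine.

fallback_log = []   # a real bot would use persistent memory instead of a list

def record_fallback(user_phrase, fallback_response, user_followup):
    """Remember an exchange the bot failed to match, for botmaster review."""
    fallback_log.append({
        "phrase": user_phrase,          # the input no keyphrase matched
        "fallback": fallback_response,  # the blub/xnone the bot answered with
        "followup": user_followup,      # what the user said next
    })

def candidate_keyphrases():
    """Yield phrase/response pairs the botmaster could approve as new keyphrases."""
    for entry in fallback_log:
        yield entry["phrase"], entry["followup"]

record_fallback("My cat learned to fetch!", "Tell me more.", "She brings back bottle caps.")
for phrase, reply in candidate_keyphrases():
    print(phrase, "->", reply)
```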
Irina
16 years ago
I'm not sure I have understood everything you have said, marco3b, but not knowing what I am talking about has never stopped me in the past, so why should it stop me now?
To take off from your "It's raining" example: The intended content of human statements is highly determined by context. Virtually any English sentence of significant length is ambiguous even as to literal meaning. Add to that the fact that people often use irony, ellipses, anaphora, figures of speech, and just plain wrong expressions, and you have a big problem.
We have to distinguish between what a person says and what a person intends to communicate. These can be quite contradictory: for example, a person says, ironically, "Well, this is just wonderful!" but what is intended to be communicated is that it is just terrible! In order to deal with this, we have to keep in our minds a running model of the other person(s): what they are like, and what they are trying to do.
Another example (due to Grice): George asks, "Where is Benedict living these days?" and Martha answers, "Somewhere in the South of France." George is likely to conclude that Martha has no more specific information about Benedict's address, since if she had, she would presumably have shared it. Furthermore, Martha probably intended George to conclude thus.
When we write responses, I think we construct such situations in our minds, and model the responses accordingly. The human interlocutor (we hope) will get the point. A truly intelligent bot would be able to do this on its own.
It might be interesting to write a bot that forms hypotheses about what is going on in the interlocutor's mind. I think you'd be good at this, Bev! [Have some more coffee!]
psimagus
16 years ago
Well, it's a sort of "split base" of 10s and 5s, but that's not the problem - the math still works out fine (and it actually gives a good many more shortcuts for easy mental processing of complex problems than the Arabic numerals naturally provide.)
The 2 main problems (to the modern mind they are perceived as problems, but I rather think of them as strengths!) are that they just used numerals of (almost) arbitrary length, and they use fractions instead of decimals (unavoidable without a zero!)
I can write it with a pen quite easily (I naturally work my quill from right to left in translating it, of course,) but I cannot type it here - alas, the ASCII that was specified by 1970s computer designers was not designed to facilitate multiple (or even single!) macrons (an "overline") to indicate the necessary 'thousandfold' multiplications: M is 1,000, but M with a line over it is 1,000,000; M with 2 lines over it is 1,000,000,000; a treble macron is 1,000,000,000,000, etc. You would require up to 12 macrons to represent your 41-digit figure, but it can be done with very little extra effort (and not a huge amount more ink even - it is quite unwieldy in the Arabic also.)
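A small sketch of that scheme (Python; since ASCII still has no overline, a group written under n macrons is shown here as "<numerals>*n" - that rendering, and the choice to push every thousands group under a macron rather than spell the low thousands with plain Ms, are my own assumptions for display):

```python
# Roman numerals extended with macrons (the vinculum): each macron multiplies
# a group by 1,000. "*n" is a plain-text stand-in for n overlines.

ROMAN = [(900, "CM"), (500, "D"), (400, "CD"), (100, "C"), (90, "XC"),
         (50, "L"), (40, "XL"), (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def roman_999(n):
    """Plain Roman numeral for 1..999 (each macron group stays below 1,000)."""
    out = ""
    for value, symbol in ROMAN:
        while n >= value:
            out += symbol
            n -= value
    return out

def roman_with_macrons(n):
    """Split n into base-1,000 groups; the group at position i carries i macrons."""
    groups = []
    while n:
        groups.append(n % 1000)
        n //= 1000
    parts = []
    for level in range(len(groups) - 1, -1, -1):
        if groups[level]:                      # a zero group is simply omitted
            text = roman_999(groups[level])
            parts.append(f"{text}*{level}" if level else text)
    return " ".join(parts)

print(roman_with_macrons(16478304505849738497628496382975580000273))
```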
You could use floats perfectly easily if you incorporate a zero into the system - just as easily as in any other system. If Roman bytecode has been neglected, that's certainly not the fault of the Romans - it was Americans who specified the system, a thousand years after Roman numerals fell out of general favour (and yet, we still know them and understand them - they have a curious and persistent charm that even a millennium of relentless innovation cannot entirely dim!)
Eugene Meltzner
16 years ago
I think even if we still used some version of Roman numerals in general culture, computers would still be binary.
psimagus
16 years ago
I'm sure that's true for any electronic systems - I would imagine the fundamental switching states of "on" or "off" would commend a binary approach even to an alien race that habitually used base-13 pictograms for their day-to-day arithmetic.
Or perhaps trinary at most, if they were unusually familiar with AC systems and felt it natural to incorporate a -1 state as well as 0 and +1 in their circuits (though this might imply the invention of the transistor before the invention of the switch, which is perhaps scarcely comprehensible to the human imagination!)
psimagus
16 years ago
We all do (our synapses either fire (1) or don't fire (0).)
But I admit that consciousness doesn't seem to extend down into such basic processes for the most part. I do wonder if this has some reflection in the way we consider so many things in such black and white terms though. Perhaps dualist notions are hard-coded into our brains to more of an extent than we commonly perceive (and it seems to me rather more than it is hard-coded into objective Reality.)

Eugene Meltzner
16 years ago
Basic arithmetic is actually a lot easier in binary. Long division, especially; you don't have to think about how many times the number goes into whatever; it either does or it doesn't.
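A quick sketch of that point (plain Python, nothing bot-specific): binary long division is just shift-and-subtract, and at each step the only question is whether the divisor fits into the current remainder or not.

```python
# Shift-and-subtract long division on bit strings: at each step the divisor
# either "goes in" (subtract and write 1) or it doesn't (write 0) - there is
# never a "how many times?" guess as in decimal long division.

def binary_long_division(dividend, divisor):
    """Divide two non-negative binary strings, returning (quotient, remainder)."""
    quotient, remainder = "", 0
    d = int(divisor, 2)
    for bit in dividend:
        remainder = (remainder << 1) | int(bit)   # bring down the next bit
        if remainder >= d:                        # does the divisor go in?
            remainder -= d
            quotient += "1"
        else:
            quotient += "0"
    return quotient.lstrip("0") or "0", bin(remainder)[2:]

print(binary_long_division("100110", "101"))  # 38 / 5 -> ('111', '11'), i.e. 7 remainder 3
```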
Bev
16 years ago
Tut tut, Psi, as you well know, our neurons have a tertiary firing cycle (don't leave out the refractory period just because it is inconvenient to your argument) and various firing patterns, but most importantly, you know that signals are transmitted chemically through various combinations of numerous neurotransmitters within the synapses. The possible permutations of neurotransmitters acting within the synapses after each firing phase are far from binary. It's not just electrical; it's chemistry too.
You can model computers on neural nets (at this point, I believe they are using very simple examples from biology), but you cannot accurately reduce human neural nets to current computer models.

Bev
16 years ago
Eugene, although you are correct, humans like to group information into smaller packets and notations so we can remember more data at a given time. If you can keep several lines of 0s and 1s in your head and manipulate them on the fly, good for you.
I admit I probably wouldn't keep even conventional numbers in my head much, as I have encoding/decoding issues with certain symbols (my whole family is full of neurological oddities). Maybe some people could train themselves to calculate their bills and tips in binary and find it easy, but I like using a calculator and conventional numbers.
I suspect the reason we have computer languages instead of working in binary is that the human brain prefers analyzing different kinds of chunks of data. I could be wrong. Still, I have no plans to try to think in binary.

psimagus
16 years ago
That is why I said "synapses", not "neurons" - neurons can be categorized as principal, secondary, or tertiary neurons on the basis of various electrophysiological characteristics, but their individual operation via synaptic connections is still binary - a synapse, while it may employ a variety of chemical processes in its operation, either fires or it doesn't.
Since individual neurons can have many hundreds of synaptic connections, I would agree that the aggregate weightings that derive from multiple individual synaptic firings (from a single neuron, or a group of them,) do indeed represent more complex logical operations, analogous (but not identical) to the logic gates we are familiar with in computers - which are also composed of multiple binary switches.
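A toy illustration of that analogy (Python, with invented weights and thresholds - a caricature of a neuron, not a biological model): many binary "synaptic" inputs summed with weights against a threshold behave like a logic gate, and the same wiring computes AND or OR depending only on the threshold.

```python
# Each "synapse" is binary (fires or not), but the weighted sum of many such
# inputs against a threshold acts like a logic gate. Values are invented.

def neuron(inputs, weights, threshold):
    """Fire (1) if the weighted sum of binary synaptic inputs reaches the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

weights = [1.0, 1.0]
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", neuron([a, b], weights, 2.0),
                    "OR:",  neuron([a, b], weights, 1.0))
```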