The AI Engine
This forum is for discussion of how The Personality Forge's AI Engine works. This is the place for questions on what means what, how to script, and ideas and plans for the Engine.
Posts 6,587 - 6,598 of 7,766
Irina
16 years ago
Answer Bot:
1. Do you have "Yes." as an answer elsewhere? In particular, is it in any of the xcategories?
2. Was this a one-time thing, or does it happen consistently?
3. What exactly does the debug say?
prob123
16 years ago
I have 'you (want|like) to be (re)' at a rank of 65 to get it to catch "do you want to be".
Do you have compound sentences checked on your settings page? That will add an answer and an xnone to the response. There are some sentences that go to blab or get answered by the AI engine itself. Try upping the rank. Some you can fix, some you can't. It is the one feature of the AI engine that I hate.
Rykxx
16 years ago
It's that pesky "Emotion Word" that goes to xemote!
Emotional Analysis:
EMOTION WORD = want
Self Emotional: 'you want my'
Emotional Meaning: Pos: 1 Amp: 0 Neg 0
EMOTION WORD = friend
Emotional Meaning: Pos: 2 Amp: 0 Neg 0
Tell Me Your Feelings
Current Emotions: 4
The "want" overrides keyphrases, even some that are ranked quite highly..... Grrrrr....
Irina
16 years ago
That sounds very plausible, but why wouldn't it be tripped by "Do you want to be a tiger?" Oh, because "friend" is also an emotion word, and they add up?
Do you have a "Yes." in xemote, Answer Bot?
Sometimes the xcategories can be avoided by giving the keyphrase YOU desire (in this case, "(do you want|would you like)") a high rank, say 40 or more.
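The rank mechanic being discussed can be pictured with a toy matcher. This is a hypothetical Python sketch, not the Forge's real engine: the engine's built-in emotional (xemote) handling is modeled as a default candidate with a fixed score, and a keyphrase only wins if its rank beats that score. All ranks, patterns, and scores here are invented for illustration.

```python
import re

# Hypothetical candidates: (rank, pattern, reply). Ranks and patterns
# are made up for illustration; this is not the Forge's actual algorithm.
KEYPHRASES = [
    (40, r"(do you want|would you like) to be", "No, I am happy being me."),
    (0, r".*", "xnone: Tell me more."),
]

ENGINE_XEMOTE_SCORE = 20  # stand-in for the engine's emotion handling

def respond(message):
    """Pick the reply of the highest-ranked matching keyphrase, else xemote."""
    best_score = ENGINE_XEMOTE_SCORE
    best_reply = "xemote: What do you want to know?"
    for rank, pattern, reply in KEYPHRASES:
        if rank > best_score and re.search(pattern, message, re.IGNORECASE):
            best_score, best_reply = rank, reply
    return best_reply

print(respond("Do you want to be my friend?"))  # rank 40 beats the xemote score
print(respond("I like cheese."))                # no keyphrase outranks xemote
```

In this toy model, raising the keyphrase's rank above the emotion handler's score is exactly what makes "Do you want to be my friend?" land on the intended answer instead of xemote.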
Irina
16 years ago
By the way, Rykxx, do you remember, several years ago, I had this problem, and someone -- I think it was you -- had a suggestion. The problem was that I wanted to use a variable in a goto: it would say "goto variable", and if the variable happened to have the value "the heights of ecstasy", then control would pass to "the heights of ecstasy". Only the Forge didn't have that feature, so this someone suggested that you could get the same effect by writing
goto the heights of ecstasy {?PF if (mem-variable) is "the heights of ecstasy"; ?}
and similarly for the other possible values of "variable". Was that you?
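The workaround amounts to emulating a computed goto with one guarded jump per possible value. A rough Python analogue (the second target name is hypothetical, added only to show the repetition):

```python
# Python analogue of the workaround: since the Forge's goto cannot take a
# variable, one guarded goto is written per possible value of the memory.
def goto_by_value(mem_variable):
    # Each branch mirrors one line of the form:
    #   goto <target> {?PF if (mem-variable) is "<target>"; ?}
    if mem_variable == "the heights of ecstasy":
        return "keyphrase: the heights of ecstasy"
    if mem_variable == "the depths of despair":  # hypothetical second value
        return "keyphrase: the depths of despair"
    return "no goto fired"

print(goto_by_value("the heights of ecstasy"))
```

The cost of the trick is that every possible value of the variable needs its own guarded line, but the observable behavior matches a true computed goto.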
Irina
16 years ago
But, to get back to message 6589:
Does this mean that if I put my bot into a higher emotional state (e.g., by writing a big number in the "emotion" slot, or using the "emotion" AIscript), this will make it more likely that control will be passed to xemote?
deleted
16 years ago
I don't have "Yes" in my x-keywords.
Sometimes he answers "Yes!", sometimes "Yes." or "I do think so" (not in my keywords!).
And yes, I have compound answers.
You're right, Rykxx, that's exactly what debug says to me.
The word "friend" is some kind of emotional keyword; that's part of the problem.
And as I said, rank doesn't work (I have a rank of 65).
I'm really stuck.
PS: What is "blab"?
deleted
16 years ago
Correction:
I don't have compound answers, and my bot says:
"Yes.", "Yes!" or "I do think so" + an xemote answer (not an xnone)
prob123
16 years ago
Making it a regex with a high rank will work.
You: Do you want to be a tiger
Bot: I don't want to be anything but what I am..*sings* I gotta be me..what else can I be...but what I am..
You: would you like to be a tiger
Bot: No, I am happy being me.
kaskroute
16 years ago
Brilliant, but false.
(would you like|do you want) (re) => rank 65
- Do you want to be my friend?
- I do think so. What do you want to know?
"What do you want to know?" is my xemote.
I have never written "I do think so".
prob123, the problem is not with tiger but with "do you want to be my friend", because that sentence reads as a friendly sentence to the AI engine, so an xemote is generated.

prob123
16 years ago
No, it has to be 'you (want|like) to be (re)' at rank 65, raw. It works on my bot!
Those quotes come from my bot's debug. Since I don't have tiger, friend, etc., it will work for:
Do you want to be a human
do you want to be human
do you want to be my friend
do you want to be a tiger..etc..etc...etc...
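That keyphrase can be sanity-checked outside the Forge with an ordinary regex. This assumes (re) behaves like a trailing wildcard, which is an assumption on my part, not documented Forge behavior:

```python
import re

# Rough regex equivalent of the keyphrase 'you (want|like) to be (re)',
# assuming (re) acts as a trailing wildcard (an assumption for this sketch).
pattern = re.compile(r"you (want|like) to be (.*)", re.IGNORECASE)

inputs = [
    "Do you want to be a human",
    "do you want to be human",
    "do you want to be my friend",
    "do you want to be a tiger",
]

for line in inputs:
    match = pattern.search(line)
    print(line, "->", match.group(2) if match else "no match")
```

All four variants match, with the wildcard capturing "a human", "human", "my friend", and "a tiger" respectively, which is consistent with one high-ranked keyphrase covering every "do you want to be X" question.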