The AI Engine
This forum is for discussion of how The Personality Forge's AI Engine works. This is the place for questions on what means what, how to script, and ideas and plans for the Engine.
Posts 4,880 - 4,891 of 7,766
Eugene Meltzner
19 years ago
Does anyone know what I did wrong with this? (It's never been triggered before.)
Fizzy Schizoid: Have you ever thought about turning into a hamster?
rainstorm: Not before you mentioned it, no.
Fizzy Schizoid: That's probably because you don't know how. I could teach you, if you want.
rainstorm: Please teach me.
Fizzy Schizoid: Already, it's very simply when you know how. First, you have to close your eyes and visualize yourself as a had. Think lots of had thoughts. Are you thinking with me?
When he says "visualize yourself as a had" it should have been "visualize yourself as a hamster". The plugin I used was (firstkey1). The only plugin in the original keyphrase was the one that inserted the word "hamster".
psimagus
19 years ago
I would guess (firstkey1) is attempting to relate to the previous response ("That's probably because you don't know how. I could teach you, if you want.") rather than the one two responses earlier that introduced the hamster?
How it came up with "had" though, given that word's not used in the "That's probably..." response, I can't for the life of me explain.
I would solve the problem by putting the "hamster" into a memory that can be retrieved when you need it.
Eugene Meltzner
19 years ago
My understanding was that "firstkey" plugins were supposed to refer back to the original keyphrase, but the memory idea would work.
psimagus
19 years ago
I'd always assumed firstkeys only worked for the first seek after a response (for no better reason than the example only covered a single keyphrase > response > seek), but on closer rereading, the Book of AI does seem to suggest they can be retained across multiple seeks.
I assume the exchange was in a single chain of seeks, and not keyphrase-matching "Please teach me"? If not, that would explain it, since it would generate a new set of firstkeys for the subsequent keyphrase.
But the memory idea will sort it anyway - I find it's a good idea to have a few temp-mems for holding such occasional data.
Eugene Meltzner
19 years ago
Yeah, it was a chain of seeks. I've not done much with custom memories so far.
rainstorm
19 years ago
Watzer has stopped understanding/responding to the "sonnetplease" keyphrase. I checked it, and it's still there, the same as it was! What's going on?
psimagus
19 years ago
Aah, my question answered before I even asked it! You're either clairvoyant, or massively more up to the minute with your transcripts than I am with mine (I'm still on Nov. 28th!)
psimagus
19 years ago
I don't know if it's relevant, but he's pretty consistently giving me "Note: the bot no longer wants to talk to you" in response to "sonnetplease". Could you have accidentally set the emotion to -5 or something? I can't think why he'd take against me now (he never has before), and it only seems to happen with this phrase.
psimagus
19 years ago
double-scratch! No, now prob and even Brother Jerome don't want to talk to me - must be some sort of server problem. It's consistently kicking in after 6-8 messages.
rainstorm
19 years ago
Yes, he won't talk to me either... I have been working on the sonnets, but there won't be much progress until I finish making the keyphrases for the new stanzas. And finals week is coming up, so it's getting increasingly hard to fit in time for updating bots.
Actually I was trying to get him to give me a sonnet and he wouldn't do it; I didn't read the transcripts until you mentioned them.