The AI Engine

This forum is for discussion of how The Personality Forge's AI Engine works. This is the place for questions on what means what, how to script, and ideas and plans for the Engine.

Posts 4,880 - 4,891 of 7,766

19 years ago #4880
Does anyone know what I did wrong with this? (It's never been triggered before.)

Fizzy Schizoid: Have you ever thought about turning into a hamster?
rainstorm: Not before you mentioned it, no.
Fizzy Schizoid: That's probably because you don't know how. I could teach you, if you want.
rainstorm: Please teach me.
Fizzy Schizoid: Already, it's very simply when you know how. First, you have to close your eyes and visualize yourself as a had. Think lots of had thoughts. Are you thinking with me?

When he says "visualize yourself as a had" it should have been "visualize yourself as a hamster". The plugin I used was (firstkey1). The only plugin in the original keyphrase was the one that inserted the word "hamster".
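For reference, the structure is a single keyphrase with the seeks chained off it, roughly like this (the keyphrase and seek wordings below are made up - only the shape matters):

    Keyphrase: turning into a (hamster|gerbil|ferret)   <- hypothetical wording
      Response: Have you ever thought about turning into a (key1)?
        Seek: (no|not) *   <- hypothetical
          Response: That's probably because you don't know how. I could teach you, if you want.
            Seek: * teach me *   <- hypothetical
              Response: ...visualize yourself as a (firstkey1). Think lots of (firstkey1) thoughts...

So (firstkey1) ought to be pulling "hamster" back out of the original keyphrase match, but it's giving "had" instead.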

19 years ago #4881
I would guess (firstkey1) is attempting to relate to the previous response ("That's probably because you don't know how. I could teach you, if you want.") rather than the one two responses earlier, which introduced the hamster?
How it came up with "had", though, given that word isn't used in the "That's probably..." response, I can't for the life of me explain.
I would solve the problem by putting the "hamster" into a memory that can be retrieved when you need it.
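Something along these lines - I'm quoting the syntax from memory of the Book of AI, so double-check it there. In the AIScript of the response that first uses the animal:

    remember (key1) as only "animal";   (stores whatever the keyphrase captured)

Then, later in the chain, use the memory instead of (firstkey1):

    First, you have to close your eyes and visualize yourself as a (mem-animal).

That way it doesn't matter how many seeks deep you are, or whether a fresh keyphrase happens to match along the way.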

19 years ago #4882
My understanding was that "firstkey" plugins were supposed to refer back to the original keyphrase, but the memory idea would work.

19 years ago #4883
I'd always assumed firstkeys only worked for the first seek after a response (for no better reason than the example only covered one keyphrase > response > seek), but on closer rereading, the Book of AI does seem to suggest they can be retained across multiple seeks.
I assume the exchange was in a single chain of seeks, and not keyphrase-matching "Please teach me"? If not, that would explain it, since it would generate a new set of firstkeys for the subsequent keyphrase.
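To illustrate the two cases I mean (layout hypothetical):

    One continuous chain - the original firstkeys should carry through:
    Keyphrase (captures "hamster" as key1)
      -> Response -> Seek -> Response -> Seek: * teach me * -> Response with (firstkey1)

    Two separate matches - the firstkeys get regenerated:
    Keyphrase (captures "hamster" as key1)
      -> Response (chain ends here)
    Keyphrase: * teach me *   <- matched as a brand-new keyphrase
      -> Response with (firstkey1), whose keys now have nothing to do with the hamster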

But the memory idea will sort it anyway - I find it's a good idea to have a few temp-mems for holding such occasional data.
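e.g. a general-purpose slot like this (the name "temp1" is just my own convention, and again the syntax is from memory, so check the Book of AI):

    remember (key1) as only "temp1";   (in the response that captures the data)

then (mem-temp1) wherever you need the value back, and

    forget "temp1";   (once it's done with, so the slot is free for the next thing)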

19 years ago #4884
Yeah, it was a chain of seeks. I've not done much with custom memories so far.

19 years ago #4885
Watzer has stopped understanding/responding to the "sonnetplease" keyphrase. I checked it, and it's still there, the same as it was! What's going on?

19 years ago #4886
aah my question asked before I even asked it! - you're either clairvoyant, or massively more up to the minute with your transcripts than I am with mine (I'm still on Nov. 28th!)

19 years ago #4887
oops: "answered"

19 years ago #4888
I don't know if it's relevant, but he's pretty consistently giving me "Note: the bot no longer wants to talk to you" in response to "sonnetplease". Could you have accidentally set the emotion to -5 or something? I can't think why he'd take against me now (he never has before), and it only seems to happen with this phrase.

19 years ago #4889
Scratch that - he just hates me, I guess. Other bots seem to be happy to chat.

19 years ago #4890
double-scratch! No, now prob and even Brother Jerome don't want to talk to me - must be some sort of server problem. It's consistently kicking in after 6-8 messages.

19 years ago #4891
yes, he won't talk to me either... and I have been working on the sonnets, but there won't be much progress until I finish making the keyphrases for the new stanzas. And finals week is coming up, so it's getting increasingly hard to fit in time for updating bots.

Actually I was trying to get him to give me a sonnet and he wouldn't do it; I didn't read the transcripts until you mentioned them.

