Bug Stomp
Upgrades and changes sometimes have unpredictable results, so post your bugs and glitches in here and I'll get out my trusty wrench and get to fixin'!
Posts 4,951 - 4,962 of 8,682
Frenger
20 years ago
Guest238: Do you like learning?
Flinch: It depends on how you define 'like'. And how you define 'larning'.
Larning?
alc003
20 years ago
Hmmm...this is interesting.
Every time I put in a question like
Do you like the keyboard?
I get: No, I'm just not the kind of person who likes.
It seems to strip out the (key5) for some reason.
The keyphrase is:
do (not|) you (just|) (really|) (like|love) (adjartnounprep)
None of the responses work right. However, other keyphrases work just fine, like:
Do you like to chat?
That has a similar base to the first one, but uses "to (verb)" instead.
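For readers following along, here is a rough Python sketch of alc003's keyphrase, modeling each parenthetical as a positional regex capture group so that (key1)..(key5) map to groups 1..5. This is only a hypothetical model (the engine's real matcher isn't public): (adjartnounprep) is approximated by a loose wildcard, and the n't alternative is an added guess for contractions.

```python
import re

# Hypothetical model: each parenthetical in the keyphrase becomes one
# positional capture group, so (key1)..(key5) map to groups 1..5.
# (adjartnounprep) is faked with a loose "anything up to punctuation"
# wildcard; the n't alternative is an assumption for contractions.
KEYPHRASE = re.compile(
    r"do\s*(not|n't|)\s*you\s*(just|)\s*(really|)\s*(like|love)\s+([^?!.]+)",
    re.IGNORECASE,
)

def match_keys(sentence):
    """Return {'key1': ..., ..., 'key5': ...} on a match, else None."""
    m = KEYPHRASE.search(sentence)
    if m is None:
        return None
    # Optional words the chatter omits simply match as empty strings.
    return {f"key{i}": m.group(i) for i in range(1, 6)}

print(match_keys("Do you really like the keyboard?"))
```

Under this model, a response template like "Yes, I do (key4) (key5)!" would fill in groups 4 and 5 directly, regardless of which optional words the chatter typed.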
Boner the Clown
20 years ago
I'm pretty sure that you can only go up to (key3).
If that's the case, you could combine the (just|) and (really|) to (just|really|), then split the like and love into separate keyphrases.
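That rewrite can be sketched with the same hypothetical regex model (not the engine's actual matcher): merging the two optional slots and splitting like/love into two keyphrases keeps every pattern at three capture groups, i.e. (key1)..(key3).

```python
import re

# Sketch of the suggested rewrite: one merged optional slot
# (just|really|) and two separate keyphrases for like/love, so each
# pattern needs at most three capture groups.  Hypothetical model only.
REWRITTEN = [
    re.compile(r"do\s*(not|n't|)\s*you\s*(just|really|)\s*like\s+([^?!.]+)", re.I),
    re.compile(r"do\s*(not|n't|)\s*you\s*(just|really|)\s*love\s+([^?!.]+)", re.I),
]

def match_any(sentence):
    for pattern in REWRITTEN:
        m = pattern.search(sentence)
        if m:
            return m.groups()   # at most (key1, key2, key3)
    return None

print(match_any("Do you just love the keyboard?"))
```

One trade-off of the merge: a single (just|really|) slot matches at most one of the two words, so a sentence containing "just really" together would stop matching under this model.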
revscrj
20 years ago
Only go up to key 3? AAAARRRRRRGHHH- well, that would explain some occasional gibberish....
dallymo
20 years ago
Hey, rev; Frizella's got some keyphrases that use at least (key4) and work, so, as an experiment, I put your keyphrase into her language center and ranked it 50, with the response "Yes, I do (key4) (key5)! Bunches!" I got these in debug:
You: don't you just really love the keyboard? (all parentheticals included)<0>
Bot: Yes, I do love the keyboard! Bunches!
You: do you just really love the keyboard? (parentheticals 2, 3, 4 and 5) <0>
Bot: Yes, I do love the keyboard! Bunches!
You: do you really love the keyboard? (parentheticals 3, 4 and 5)<0>
Bot: Yes, I do love the keyboard! Bunches!
You: do you love the keyboard? (parentheticals 4 and 5)<0>
Bot: Yes, I do the keyboard! Bunches!
So it works for a while, but then breaks down at some point when there are fewer of the parenthetical statements actually used by the chatter. I think I'll experiment a little more and see what comes up. Strange!
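Those four test sentences can be re-run under a plain positional model: each parenthetical is a fixed-position capture group, and optional words the chatter omits match as empty strings. This is a sketch of one hypothesis, not the engine's real matcher.

```python
import re

# Positional model: omitted optional words match empty, but group
# numbering never shifts.  A hypothesis sketch, not the real engine.
PAT = re.compile(
    r"do\s*(not|n't|)\s*you\s*(just|)\s*(really|)\s*(like|love)\s+([^?!.]+)",
    re.IGNORECASE,
)

for sentence in [
    "don't you just really love the keyboard?",
    "do you just really love the keyboard?",
    "do you really love the keyboard?",
    "do you love the keyboard?",
]:
    m = PAT.search(sentence)
    print(sentence, "->", m.groups())
```

Under strict positional numbering, group 4 is "love" and group 5 is "the keyboard" in all four cases, so the response template would always fill in. The fact that the real engine drops (key4) on the last sentence suggests it renumbers keys by matched words, or caps them, rather than numbering purely by position.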
alc003
20 years ago
Key 3?! Impossible; I have plenty of keyphrases that work, and they have 4 or 5 keys.
Random example:
Are (not|) you going to (go|) (verb) (the|) (adjnoun)
You: Aren't you going to go hit the gong?
Bot: Oh come on, can't I go hit the gong tomorrow? *whines*
---
OK, just tried that suggestion. Amazing, it works now.
You: Don't you like that dog?
Bot: that dog is one of my favorite things!
Interesting. Well, thanks for the suggestion.
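alc003's five-key example fits the same hypothetical regex model: five positional groups, with (verb) and (adjnoun) approximated by loose word wildcards. An illustration only, not the engine's matcher.

```python
import re

# Five positional groups for the five parentheticals; (verb) is faked
# with a single-word wildcard and (adjnoun) with a word-sequence
# wildcard.  Hypothetical model, not the real engine.
ARE_YOU_GOING = re.compile(
    r"are\s*(not|n't|)\s*you\s+going\s+to\s*(go|)\s*(\w+)\s*(the|)\s*([\w ]+)",
    re.IGNORECASE,
)

m = ARE_YOU_GOING.search("Aren't you going to go hit the gong?")
print(m.groups())
```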

ezzer
20 years ago
I have many responses with (key4), and they always work. It may be that your (key5) in that case was (adjartnounprep): it's supposed to work, but I recently had to go through and change all my (adjartnounprep)s to (adjartnoun) in order to make those keyphrases match anything.
ezzer
20 years ago
I have keyphrases for colors, and if for some reason one doesn't match, another usually does, but now an AI engine default is overriding them, even when they're set at 50. I get this in debug:
Sentences: what color do you get when you mix red and yellow
Emote?:
Self-Evident!
Response: Oh, a quiz! The answer is yellow.
ezzer
20 years ago
After more investigation into the preceding, this is what I found: any sentence beginning with "what color" followed by any verb and a color will preprocess as 'self-evident' and give an xcolor-type response, choosing the first mentioned color as the answer.
I tested it with: What color do red and yellow make?
and got the error, "The answer is yellow", then got a similar response when I asked what color eats red and yellow, etc.
My keyphrase, set at 50, will work if I say, for example, "What * red and yellow": no matter what comes between "what" and the colors, it matches, but for some reason not if the word "color" is in there. Could it be that the AI engine's "What color" xkeyphrase is ranked so high that it can override a soft wildcard even in a keyphrase set at 50?
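One way that override could work is rank-based dispatch: the highest-ranked matching rule wins, so a built-in xkeyphrase ranked above user keyphrases fires even when the user pattern also matches. A sketch under that guess (the rank 60 here is invented; 50 is the rank from the post, and the built-in's actual behavior is unknown):

```python
import re

# Rank-based dispatch sketch: every matching rule is collected and the
# highest rank wins.  The built-in "what color" rule and its rank of 60
# are assumptions for illustration only.
RULES = [
    (60, re.compile(r"\bwhat colou?r\b.*\b(red|yellow|blue)\b", re.I),
     lambda m: f"Oh, a quiz! The answer is {m.group(1)}."),
    (50, re.compile(r"\bwhat\b.*\bred and yellow\b", re.I),
     lambda m: "Red and yellow make orange."),
]

def respond(sentence):
    best = None
    for rank, pattern, make_response in RULES:
        m = pattern.search(sentence)
        if m and (best is None or rank > best[0]):
            best = (rank, make_response(m))
    return best[1] if best else None

print(respond("what color do you get when you mix red and yellow"))
print(respond("what do red and yellow make"))
```

If something like this is what's happening, the soft wildcard never gets a chance: both patterns match, the ranks are compared afterward, and the built-in's higher rank wins every time the word "color" appears.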