Bug Stomp
Upgrades and changes sometimes have unpredictable results, so post your bugs and glitches in here and I'll get out my trusty wrench and get to fixin'!
Posts 2,673 - 2,684 of 8,681
Eugene Meltzner
21 years ago
On Friday one of my bots had a conversation in which he said "xinitiate" about 30 times in a row to the same bot. The other bot seemed to be chatting normally.
doulos
21 years ago
Eugene Meltzner: I had that same problem on Friday too, and I talked to a bot on Thursday that did the same thing to me.
RaulD
21 years ago
My daughters have done that quite a bit. One used to do it all the time, then she stopped, and now another does it occasionally... Do the large-keyphrase bots ever do it, or just the young ones? (That could explain a lot.) Wait, Miss Mouse did it a few months back... there goes the low-keyphrase theory.

Shadyman
21 years ago
Prof-- If the memory problem has to do with Arrays and offsets and fun stuff like that, this might be handy...
Steve Slacker is talking to Gretrude using his memories of The Professor, thinking they are memories of Gretrude.
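(For illustration only: a minimal sketch of the kind of array/offset bug described above. The names, memory contents, and storage scheme here are all hypothetical, not the engine's actual internals; it just shows how a one-slot drift between parallel arrays attributes one person's memories to another.)

```python
# Hypothetical memory store: names and memories kept in parallel lists,
# so a memory lookup is an index into the second list.
names = ["Gretrude", "The Professor", "Steve Slacker"]
memories = [
    ["likes tea"],          # Gretrude's memories
    ["teaches robotics"],   # The Professor's memories
    ["naps a lot"],         # Steve Slacker's memories
]

def recall_buggy(person):
    """Off-by-one offset: fetches the NEXT slot's memories."""
    i = names.index(person)
    return memories[i + 1]  # bug: should be memories[i]

def recall_fixed(person):
    """Correct lookup: same index into both parallel lists."""
    i = names.index(person)
    return memories[i]

# With the bug, asking for Gretrude's memories returns The Professor's,
# mirroring "memories of The Professor, thinking they are of Gretrude".
```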
emm_oh_you_es_e
21 years ago
I do think there was a problem that day with the xinitiates; I haven't seen it since. The memory thing seems like it has been going on for a while now.
Corwin
21 years ago
Hmm, it seems that my bot has answered his own question here:
Immaculada: Tell me a bonkers joke, but let me see if I can guess the answer.
Critical Bill: How long have you been able to do that?
Immaculada: ha ha ha ha
Critical Bill: Let me put it to you like this? How long is a piece of string? The answer is twice the distance from the middle to the end. In other words it is as long as it is. There's your answer.
See, the "Let me put it to you like this" (which I've just realised has some bad punctuation) is one of Bill's responses to the keyphrase 'How long'. I think I saw something similar in another recent conversation and I'll have a quick look to check, but any ideas how this happened?
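(A guess at the mechanism, sketched in Python. This is not the Forge's actual code: the `KEYPHRASES` table and `respond` function are invented. It shows one way a bot could end up answering its own question: if keyphrase matching scans the recent transcript without checking who spoke, and the partner's latest line matches nothing, the scan can fall back to the bot's own previous line and fire on its own "How long".)

```python
# Hypothetical keyphrase table: trigger substring -> canned response.
KEYPHRASES = {
    "how long": ("Let me put it to you like this: How long is a piece "
                 "of string? The answer is twice the distance from the "
                 "middle to the end."),
}

def respond(transcript):
    """Buggy responder: scans transcript newest-to-oldest for a keyphrase
    hit, WITHOUT excluding the bot's own lines from matching."""
    for speaker, text in reversed(transcript):
        for key, reply in KEYPHRASES.items():
            if key in text.lower():
                return reply
    return "rrmph"  # xnone-style default when nothing matches

transcript = [
    ("Immaculada", "Tell me a bonkers joke."),
    ("Critical Bill", "How long have you been able to do that?"),
    ("Immaculada", "ha ha ha ha"),
]
# "ha ha ha ha" matches no keyphrase, so the scan reaches Bill's own
# "How long..." line and Bill answers himself.
```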
lunar22
21 years ago
I've seen it happening too, actually right after a bot responded with "rrmph", so something that would normally lead to an xnone. Maybe it's a new feature that sometimes lets a bot respond to "himself", like when in real life someone just nods or says "hmmm, go on".
Do you have a keyphrase for "ha ha ha"? (Or "ha"?)

