Bug Stomp
Upgrades and changes sometimes have unpredictable results, so post your bugs and glitches in here and I'll get out my trusty wrench and get to fixin'!
Posts 354 - 369 of 8,682
jbryanc
23 years ago
Oh jeez. It also happens if you try the Chatterbox Challenge site.
It seems to affect all the PF bots, but not the others.
The Professor
23 years ago
Gadzooks! Sorry about that. I fixed something but didn't have a chance to test it before my web business started falling to pieces all around me. But it's working again now.
Lace, etc: let me know if the new window size is small enough.
It's my goal someday that the Great Quote forum has more entries than Bug Stomp.

Shadyman
23 years ago
Hey all - is this a problem with me or the site? When I chat with my bot (e.g. "hi steve"), I get back the xnone responses from him instead of the xintroduce or xinitiate ones, etc.
Help?
Thanks,
Shadyman
Shadyman
23 years ago
Uh oh... I asked Steve Slacker what his favorite food was, and I got this:
Error Diagnostic Information
STARTROW
The value specified which is now "0" must be greater than or equal to "1"
The error occurred while processing an element with a general identifier of (CFLOOP), occupying document position (1911:3) to (1911:69).
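No idea if this helps, but it looks like the chat page is handing CFLOOP a STARTROW of 0 somewhere (maybe when a query comes back empty?). A clamp like the one below would probably dodge the crash - purely a guess at the shape of the code, with made-up query and variable names, not the actual source:

<!--- Made-up stand-in for whatever query the chat page loops over. --->
<cfset chatHistory = queryNew("message")>
<cfset queryAddRow(chatHistory)>
<cfset querySetCell(chatHistory, "message", "hi steve")>

<!--- Pretend the page computed an offset of 0. CFLOOP requires STARTROW >= 1,
      so clamp it, and skip the loop entirely when the query has no rows. --->
<cfset startRow = 0>
<cfif chatHistory.recordCount gt 0>
    <cfset safeStart = max(startRow, 1)>
    <cfloop query="chatHistory" startrow="#safeStart#" endrow="#chatHistory.recordCount#">
        <cfoutput>#chatHistory.message#<br></cfoutput>
    </cfloop>
</cfif>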
Shadyman
23 years ago
OK, now I can't chat at all without crashing like that.
I'll leave and come back later

The Professor
23 years ago
Yah, if I'm on I'm usually adding something new (which can create errors) or fixing bugs (again, errors). I just got some bugs smooshed, and I fixed the "hello"-ignoring problem (where a bot was ignoring an initial hello).
The Professor
23 years ago
Transcripts aren't coming through today, but they will start working again tomorrow (or tonight at midnight).
SirRahz
23 years ago
Hey prof... I was wondering if you could do something about the gossip phrases that contain things like "you" or "your"... originally the user being gossiped about said "I" or "my", and then it gets translated incorrectly when it's used as gossip. Could they be mapped to "his", "him", or whatever's appropriate depending on the bot's sex? Here's an example:
Psst. Want to know something about MyBotBud? he's going to dragoncon again this year you can not wait - he said it, not me.
(I programmed the two occurrences of "he" that are used, could the "you" also be automatically "he" somehow?)
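Something like this is roughly what I'm picturing - just a sketch of the mapping with made-up names (botSex, gossip, etc.), not the real Forge code:

<!--- Sketch: rewrite second-person pronouns in a gossiped statement to
      third person, choosing the pronoun set from the bot's sex.
      (The object case "him"/"her" would need more context to place,
      so only the subject and possessive forms are handled here.) --->
<cfset gossip = "he's going to dragoncon again this year you can not wait">
<cfset botSex = "male">
<cfif botSex eq "male">
    <cfset subj = "he">
    <cfset poss = "his">
<cfelse>
    <cfset subj = "she">
    <cfset poss = "her">
</cfif>
<!--- Replace the longer forms first so "your" isn't half-matched by "you". --->
<cfset gossip = reReplaceNoCase(gossip, "\byou are\b", subj & " is", "all")>
<cfset gossip = reReplaceNoCase(gossip, "\byour\b", poss, "all")>
<cfset gossip = reReplaceNoCase(gossip, "\byou\b", subj, "all")>
<cfoutput>#gossip#</cfoutput>
<!--- -> "he's going to dragoncon again this year he can not wait" --->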
SirRahz
23 years ago
Uhm prof?
I'm trying some chatterbox questions here and... uhm, my bot's acting strange - saying stuff that I never programmed it to...
Tester: "Can you tell me your name"
Bot: "My name is (Bot). You killed my father. Prepare to die."
!
The word "father" is nowhere to be found in his language center... I suspect you left a test sentence in there somewhere...
... yup! I just tested it with another bot and it said the same thing.
By the way, I've already included responses for a lot of the chatterbox questions... I've noticed that you've got some logical responses to questions like "what color is the red apple?". If I've already taught the old bot to detect these tricky questions, which response will be used?
(I haven't tested enough yet to answer my own question, so I thought I'd ask...)