Bug Stomp
Upgrades and changes sometimes have unpredictable results, so post your bugs and glitches in here and I'll get out my trusty wrench and get to fixin'!
Posts 4,594 - 4,605 of 8,681
ezzer
20 years ago
Thanks for all the work, Prof! I hope they gave you some good medicine, and that you will heal quickly.
NewAdam
20 years ago
Prof, this isn't really a bug, but viewing Inner Life emotion level 0 only shows a maximum of 50. How does one view/edit the rest (as some older bots will have hundreds at that level)?
The Professor
20 years ago
gnixing: I've removed "ciao" from the list of goodbye words. Having its meaning depend on its place in the conversation is too big a thing for now.
FengShui: I fixed the bug giving free heat to updates in the Language Center
Doulos & ezzer: I fixed the bug that let a bot respond with xinitiate or xintroduce in a bot-to-bot chat.
doulos: I've fixed the problem with the hard wildcard not matching in "why not (just|) (*) ". Turns out a bug in LinkGrammar was behind it.
Eugene: you have no seek for "yes" on that Keyphrase. Please double-check that before posting.
NewAdam: I've added a Back and Next link to the Inner Life emotions page
Charles: I said one of the profanity words to your bot and he didn't hang up because he was in an xnomatch seek. The second time, outside the seek, his response was HANGUP.
Just a note: when you're posting bugs and there's an example of it not working in the Transcript, please post that as well, as sometimes the error is directly related to what the other bot/person said.
One more note: I was messing around with the AI Engine a bit today, so don't post any Keyphrase-matching bugs from today, as they were likely just temporary.
Eugene Meltzner
20 years ago
Oops...sorry about that. I did look up the keyphrase to check for the response, but there is only one for "no". I saw one of the responses that would almost have been equally valid for "yes" and didn't look at it closely. Weird. I thought I could remember typing up that seek too.
Laydee
20 years ago
Not literally, I hope. The poor man already has enough to deal with, what with having a slipped disk and all that. 
But yay for the Prof anyway.
tai
20 years ago
A bug I found in a transcript Cricon had:
0101: Julie Tinkerbell is ok for a human.
Cricon: you're so sweet, Ezzer Jnr!
It's an 'xemote-positive-very' reply which looks like this:
[You]'re so sweet, (mem-petname)!
I checked Cricon's bot-memory, and only Julie is supposed to be called Ezzer Jnr.
Nowhere in there does 0101 have that as a petname. Did I do something wrong that I'm overlooking?
The Professor
20 years ago
That's emotional gossip about Julie Tinkerbell. The idea is that you could say "I'm going to buy you that (mem-youwant)" and it'll be what the gossip target wants. Maybe there's a place for xgossipemotion.. But that could also be taken care of by using <?PF if emo > 2; ?> AIScript in gossip..
xemotes in gossip form are also used to auto-answer "who is Midnight Blue" type questions..
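If anyone wants to try that before any xgossipemotion exists, a gossip response along these lines might do it (just a sketch, untested, combining the AIScript condition above with tai's reply):

<?PF if emo > 2; ?> [You]'re so sweet, (mem-petname)!

The idea being that the warm line only fires when the emotion value for the gossip subject is above 2, and gets skipped otherwise.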
tai
20 years ago
Thanks, Prof! I'm fixing her up as I type. Well, almost; if I had 4 hands and two computers I could.

Will go now and stop babbling... Thanks again!