Seasons
This is a forum for general chit-chat, small talk, a "hey, how ya doing?" and such. Or hell, get crazy deep on something. Whatever you like.
Posts 3,466 - 3,477 of 6,170
Yes psimagus, I'll send it to you to post on your site.
psimagus
19 years ago
Oh yes, a new toy!
Have you tried feeding it into Ally (and vice versa)? Billy-Daisy chats never seemed to work well, but I always wondered if that wasn't because they were basically the same engine, so it inevitably degenerated into nonsense.
It would be interesting to see how a neural network interacted with a statistically weighted system...
colonel720
19 years ago
Well, ALLY would almost definitely say gibberish, for her linguistic analysis system compares the relationship between two words, then uses the statistically most significant word to fuel a random sentence generator. The new chatbot, "Nick", is a completely different system.
To begin, I would like to add that Nick is not only linguistically bound - he can see things (webcam required) and makes visual associations. For the first version of Nick, that enables him to recognize visual memories and link them to textual ones.
In contrast to ALLY's "2 word link" NLP system, Nick looks at the relationship between every word in the sentence by breaking it down into segments. These segments are associated by the neural network. When response time comes, he finds the association that is "most specific" to what you say by looking for associations to the largest possible segment in your sentence. Then, if needed, a sentence generator comes in where the largest segment's association left off, so that you don't get left with a sentence that stops in the middle of nowhere.
It's a much different system than ALLY, so the contrast will most probably be large. Among Nick's other features are voice synthesis and speech recognition using Microsoft's SAPI 5 (which I hear the Professor is thinking of plugging into the Forge once he imports it to a Windows server).
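For anyone who wants to picture the "largest segment" idea colonel720 describes, here is a rough sketch in Python (Nick itself is being written in VB.NET, and every name below - SegmentBot, learn, respond - is made up for illustration, not taken from Nick's actual code). It only shows the general shape: match the longest input segment seen before, then let a crude bigram generator finish the sentence.

    from typing import Dict, List
    import random

    def segments(words: List[str]) -> List[List[str]]:
        """Every contiguous run of words, longest first."""
        runs = [words[i:j] for i in range(len(words)) for j in range(i + 1, len(words) + 1)]
        return sorted(runs, key=len, reverse=True)

    class SegmentBot:
        """Toy stand-in: remembers which reply followed which segments of an input."""

        def __init__(self) -> None:
            self.assoc: Dict[tuple, str] = {}        # segment -> reply seen after it
            self.bigrams: Dict[str, List[str]] = {}  # word -> words that have followed it

        def learn(self, sentence: str, reply: str) -> None:
            words = sentence.lower().split()
            for seg in segments(words):
                self.assoc.setdefault(tuple(seg), reply)
            reply_words = reply.lower().split()
            for a, b in zip(reply_words, reply_words[1:]):
                self.bigrams.setdefault(a, []).append(b)

        def respond(self, sentence: str, max_extra: int = 8) -> str:
            words = sentence.lower().split()
            # "Most specific" association: the longest segment we have seen before.
            for seg in segments(words):
                remembered = self.assoc.get(tuple(seg))
                if remembered is not None:
                    return self._extend(remembered, max_extra)
            return "..."  # nothing learned yet

        def _extend(self, text: str, max_extra: int) -> str:
            # Carry on from where the remembered reply left off, so the
            # sentence doesn't just stop in the middle of nowhere.
            out = text.split()
            while max_extra > 0 and out and out[-1] in self.bigrams:
                out.append(random.choice(self.bigrams[out[-1]]))
                max_extra -= 1
            return " ".join(out)

    if __name__ == "__main__":
        bot = SegmentBot()
        bot.learn("do you like neural networks", "yes, I find neural networks fascinating")
        print(bot.respond("what do you think about neural networks"))

ALLY's "2 word link" approach would instead pick out a single statistically significant word and feed it to a random sentence generator, which is roughly why colonel720 expects the two systems to behave so differently.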
psimagus
19 years ago
"he can see things (webcam required), and makes visual associations ... voice synthesis and speech recognition"
That sounds very cool indeed!

djfroggy
19 years ago
I agree! Colonel, please do post a link if you decide to put the executable online!
colonel720
19 years ago
I will put it online for sure. I am using VB.NET to do this, so running it will probably require Microsoft's .NET Framework for those who don't already have it. I will post a link to the .NET Framework along with the executable.
rainstorm
19 years ago
I just have to interrupt this intellectual exchange to bring to your attention the existence of http://www.catsthatlooklikehitler.com/
.... I can't help feeling that the fact there is a site like this with hundreds of people posting on it says something about humanity...
Bev
19 years ago
Forgive me if you have talked about this already (I haven't really been around much), but did you all read about the robot professor?
http://www.wired.com/news/technology/0,71426-0.html
If someone could only help me make one of these, I'd never go to work again! *looks imploringly at Psimagus*
psimagus
19 years ago
Hats off to Ishiguro-san - that is awesome!
I'm afraid messing about with a Cybot's about as far as I've got with robotics (and that's more like a giant blue beetle than anything remotely human - www.realrobots.co.uk/) - I'm more a software kind of guy. But the Prof's building a hardware Desti-bot (according to mondobot.com, though it's not been updated for some time - I'd love to know how that's going!) I rather doubt it will turn out so lifelike though!
I guess the real test of hardware sonzai-kan (it's something a little subtler than "presence" - more an intuition of being-ness (sonzai)) is to see if you can make people feel "watched" if they don't know it's there. You know, that feeling you get sometimes when you just know someone behind you is looking at you. It builds up from so many subconsciously perceived cues that I'd guess that's some way off yet (though Ishiguro has apparently gone quite a bit further down that path than I'd realized was yet possible!)
Of course, we also strive to capture sonzai-kan in our bots (well, I do anyway) - that spark of inexplicable rightness that gives the occasional uncanny illusion of a real mind behind the persona (especially when it's down to a fortuitously apposite x-none, and wasn't even planned).