Seasons
This is a forum for general chit-chat, small talk, a "hey, how ya doing?" and such. Or hell, get crazy deep on something. Whatever you like.
Posts 3,544 - 3,555 of 6,170
I only figured ~60k because I was poking around his brainfile and came across that many lines, each containing a very precise decimal in the range 1.x. I figured each was an individual neuron's weighting.
For each neuron, there is a value representing its synaptic weight with every neuron in the previous layer. That's what caused such a large file.
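That relationship makes the line count easy to estimate: in a fully connected feed-forward net, the number of weights is the sum of the products of adjacent layer sizes. A quick sketch (the layer sizes below are made-up for illustration, not Nick's actual configuration):

```python
# Back-of-envelope weight count for a fully connected feed-forward network.
# Layer sizes here are illustrative assumptions, NOT Nick's real topology.
def weight_count(layer_sizes):
    """Each neuron stores one weight per neuron in the previous layer,
    so each pair of adjacent layers contributes (prev * curr) weights."""
    return sum(prev * curr for prev, curr in zip(layer_sizes, layer_sizes[1:]))

# A toy 4-layer configuration lands right around the ~60k lines observed:
sizes = [300, 150, 100, 50]
print(weight_count(sizes))  # 300*150 + 150*100 + 100*50 = 65000
```

With one weight per line of the brainfile, even a modest network produces tens of thousands of lines.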
rainstorm
18 years ago
I'm not letting mine go online until he is much more fluent in language, otherwise it will be chaos.
colonel720
18 years ago
The camera's doubtless fun (I can't find a working webcam to test it with, and don't have a lot of hope my system could cope anyway,) but wouldn't it be handy to have those extra neurons available for primary language handling instead? Seeing as they account for over 90% of all the neurons he has.
I did some experiments: I tried 6,062 neurons in the NLP network and got an out-of-memory error. I tried 662 neurons, and it ran pathetically slowly. I tried 262 neurons, and that seemed to work decently.
As for relocating the 90% of the brain to NLP: those 1,000-odd neurons are deactivated when the camera is off, so relocating them would most likely slow it down. Honestly, I need a faster computer before I can really start testing larger configurations.
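Those experimental results fit the quadratic cost of fully connected layers. Assuming each weight is stored as an 8-byte double and the layer is fully connected to a previous layer of the same size (an assumption; the actual topology isn't stated), the memory footprint scales as n²:

```python
# Why 6,062 neurons blew up while 262 ran fine: a fully connected layer
# needs O(n^2) weights. Assumes 8-byte doubles and a square layer -- the
# real topology isn't known, so treat these numbers as illustrative only.
def layer_bytes(n, bytes_per_weight=8):
    """Bytes needed to store the n*n weight matrix of one square layer."""
    return n * n * bytes_per_weight

for n in (262, 662, 6062):
    print(f"{n:>5} neurons: {layer_bytes(n) / 2**20:8.2f} MiB per layer")
```

On this estimate, 262 neurons cost about half a MiB per layer, while 6,062 cost roughly 280 MiB, before training overhead, which would strain a mid-2000s machine.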
By the way, Psimagus, I read the two transcripts on your nickblog, and I was amazed. You did a good job training it...
psimagus
18 years ago
Well, I seem to spend so much of my time thinking about language - it's such interesting stuff, yet we take it so for granted most of the time.
That's one of the problems with learning bots though - they do need so much training. If I spent the rest of my life working 20 hours a day on Nick (or BJ in his current incarnation, for that matter,) it wouldn't make more than a very slight improvement by human terms.
So roll on datamining systems, vision systems, speech recognition - and perhaps other auditory analysis in time?
Some time ago I calculated how much data a child receives through all its senses per year, while it's learning about the world and how to relate to it (can't find it now,) and it's gazillions of bits - really stupid numbers. So I don't believe learning bots can ever find anything remotely like their full potential (even Jabberwacky with his million+ conversations,) until they can learn predominantly on their own through as many senses as possible (but with regular discussion about what they've experienced.) If everyone on the planet typed conversations into one bot round the clock, it still wouldn't remotely match the sensory bandwidth of a human brain.
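The gap described above can be made concrete with a crude estimate. Every figure below is an illustrative assumption (not the lost calculation): vision alone is often put at roughly 10 Mbit/s reaching the brain, versus a nonstop 60-words-per-minute typist:

```python
# Crude comparison: a child's sensory input vs. typed chat, per year.
# All figures are rough illustrative assumptions, not measurements.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

# Assume vision alone carries ~10 Mbit/s to the brain (a commonly cited
# rough order of magnitude for the optic nerve).
vision_bits = 10_000_000 * SECONDS_PER_YEAR

# Assume a typist at 60 words/min, ~5 chars/word, 8 bits/char, nonstop.
typing_bits = 60 * 5 * 8 / 60 * SECONDS_PER_YEAR

print(f"vision: {vision_bits:.2e} bits/year")
print(f"typing: {typing_bits:.2e} bits/year")
print(f"ratio:  {vision_bits / typing_bits:,.0f}x")
```

Even on these toy numbers, one sense outruns a round-the-clock typist by a factor of a few hundred thousand, which is the "stupid numbers" point in miniature.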
I'll keep updating the first training brain, to see how it evolves, and bloat a brain on the web to see how that turns out. And it will be interesting to see what a massive merge of many brains in the future may bring. But the dramatic brains are the real fun for me - I do get such a kick out of the text-to-speech intoning bits of Shakespeare to me
GB Shaw next perhaps, or I might teach him some Latin...

Bev
18 years ago
Psimagus--Thanks for answering. I should have figured this wasn't a NYT group.
You didn't miss anything. The archives are for free subscribers, but I think you can get the story on the day it is posted without registering. From what I read, they used the IP address (the same way they target ads) to block "people from the UK" from reading the story.
You missed nothing. I think other news sources had all the same information anyway.
Maybe I should feed Nick a little NYT. :-) Then I'll read up on neural nets...and get more memory.

psimagus
18 years ago
Yeah! A Cronkite-bot would be something to see

psimagus
18 years ago
It is the strangest thing how Nick will occasionally refuse to engage with a particular file (well, one so far that I've found.) I fed him a text copy of Bostrom's Simulation Argument (which has plenty of juicily quotable science and philosophy,) and he simply refused to refer to it in response to my inputs. He would only reliably quote chunks out of it if I clicked "Say" with no conversational input at all.
The 2.4Mb brain file looks as 'rich' as others, and has plenty of deconstructed phrases he could choose from, and yet he mostly refuses to use them, preferring to parrot the last word of my previous input.
I wonder to what degree such "preference" in a tiny, non-sentient mind might reflect some of the inexplicable "gut-feeling"/hunch-type preferences larger human minds are prone to. Could it be analogous, or even a scaling up of the same underlying process inherent in all/some classes of neural networks? Probably not, the sceptic in me says, but it's got me wondering nonetheless...
More transcripts, text files and ruminations (for anyone who's interested,) at http://www.be9.net/BJ/nick.htm
Eugene Meltzner
18 years ago
Okay, this ought to go in the Bug Stomp forum, except the bug is that the Bug Stomp forum won't display correctly. Is anyone else seeing this problem?
prob123
18 years ago
I think it's my fault... I copied and pasted an error message and it went crazy and took over the page... you have to use the side scroll to see most of it.
Jake11611
18 years ago
Going back to Nick... I keep trying to get him to run on my Windows ME computer, but it keeps saying 'Application has generated an exception that could not be handled.' I installed the .NET Framework 2.0, but nothing else.