Date: 2006-05-16 04:26 pm (UTC)

I can't think why the singularity wouldn't happen within days if not minutes of the emergence of machines surpassing human general intelligence. I believe that's Yudkowsky's opinion as well.

You might be right about Yudkowsky. I asked him at the conference if he thought Kurzweil's predictions might be a bit optimistic, and he said "Actually, if anything I'm worried they're pessimistic... I'm worried it will happen a lot sooner and we won't be ready for it." The title of Kurzweil's talk was "A Hard or a Soft Takeoff?" and he was advocating a soft takeoff.

I think I'd have to agree with Kurzweil here, but like you I haven't read much, so it's mostly a knee-jerk opinion. The reason I think it might take a while after they surpass human intelligence is that humans aren't smart enough to understand how we think... in other words, even if we could self-improve our own code, we wouldn't do a good job of it, because we just don't know enough about how we think. I'd expect it would be the same for the machines for a while, although they would slowly improve over time. Once they really figure out how their own brains work, things could take off very quickly. All this is of course wild speculation on my part, but since I've been doing plenty of that lately anyway... what's a bit more :)