Domino Valdano ([personal profile] spoonless) wrote 2006-05-14 08:58 pm

Singularity Summit, part two

Succinct Stanford Singularity Summit Summary
===================================
http://sss.stanford.edu

See also [livejournal.com profile] troyworks's entry for his summary of it.

Part one of this was friends-only because it's a little more personal. The Stanford Singularity Summit held yesterday was the largest conference ever held on the topic of the possibly impending technological singularity (about 1200 people). Personally, I prefer Eliezer Yudkowsky's term "intelligence explosion" to the more popular "singularity," but so many people call it the singularity that I've pretty much resigned myself to using that term; it's too far ingrained in people's vocabularies to change now. It's important to realize that it has no relation whatsoever to the term as used in physics or math (aside from Vernor Vinge's original confusion between event horizons and singularities).

The following is a list of the people who gave talks yesterday, and what I thought of them.

Ray Kurzweil:

Ray kicked off the conference with quite a bang. His talk was nothing less than phenomenal. He's a very good speaker and (somewhat to my surprise) everything he said sounded pretty darn reasonable, and he had some stunning demos that completely blew me away. In the past, I've expressed doubts as to whether some of the claims he makes might be a bit unfounded or even downright "nutty". But I've said all along that I won't make any final judgments until I've read at least one of his books, and hearing his talk yesterday did a lot to quell my doubts about him. That's not to say I think everything he says must be right. It's just that I think he's going about this in a pretty reasonable way, that he's done quite a bit of research on it, and that like anyone else with a theory that makes testable predictions, he might be right and he might be wrong... only time will tell. He projects that machines will surpass human intelligence by approximately 2029 (referring to "general" intelligence, as they've already surpassed humans in narrow domains such as chess), and that we will enter the singularity era around 2045. After that, according to him, the entire world will be transformed so much by continually self-improving AI that there's no way to predict anything. Whatever happens, I think it's a safe bet that posthumans will be the dominant species on the planet by then (if the rest of the timeframe is correct), and humans who haven't been uploaded or merged with machines will play only a minor role, if they survive at all. The first figure is surprisingly close to my own estimate, even though I pulled mine out of a hat and have very little confidence in it.

The first demo Kurzweil gave was of a camera his company recently designed that photographs documents, signs, etc., converts the image to text in real time, and reads the text aloud in a computer voice that sounds very close to naturally human. This is the latest in a series of inventions he's made for the blind. Later he demoed a language translator: a person speaks in one language (say, English) and the computer translates what they're saying into another language such as French, German, or Spanish. The speaker had to talk very slowly and carefully, but it was still very impressive, and again the computer voice sounded close enough to human that it was hard to tell the difference.
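
To make the shape of that reading-machine pipeline concrete, here's a minimal sketch using modern off-the-shelf libraries (pytesseract for OCR and pyttsx3 for speech are my stand-ins, not anything Kurzweil's company actually used; his system was proprietary and far more capable):

    # A minimal photograph-to-speech sketch -- stand-in libraries,
    # NOT Kurzweil's actual system.
    from PIL import Image   # pip install pillow
    import pytesseract      # pip install pytesseract (requires the Tesseract OCR binary)
    import pyttsx3          # pip install pyttsx3 (text-to-speech)

    def read_aloud(image_path: str) -> str:
        """OCR the photographed document, then speak the recognized text."""
        text = pytesseract.image_to_string(Image.open(image_path))
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()  # blocks until the speech finishes
        return text

    if __name__ == "__main__":
        read_aloud("street_sign.jpg")  # hypothetical input photo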

I think my gut feeling on Kurzweil at this point is that he has made the best estimate anyone can make for when this stuff is going to happen. His problem is that he's a bit too sure of himself: there are a lot of unknowns here, and there may be factors that neither he nor anyone else has taken into account yet. Related to this over-confidence, his personality is what I'd describe as a bit "arrogant". That said, it's very easy to argue he has a right to be arrogant. He has 12 honorary doctorates, has been honored by 3 US Presidents for his inventions, and has been described by some as the Thomas Edison of our age. He originally got into projecting future trends as a practical means of deciding what he should start designing now so he could build it as soon as the enabling technology arrived. In doing this, he was naturally led to the conclusion that AI will surpass humans in another few decades, and that a couple of decades after that the singularity will begin whether we're ready or not.

Well, I find myself wanting a break again even though I've only described the very first speaker... but he was the one I had the most to say about, so the rest should go more quickly. I'll save the others for "part three".

The Singularity and Peak Oil

[identity profile] eurisko97.isa-geek.net (from livejournal.com) 2006-05-15 05:29 pm (UTC)
Something that bothers me about Singularity/Transhuman enthusiasm (not that I don't think the idea is neat myself) is that it takes for granted that the Industrial Revolution will continue as scheduled, and that the current energy output from fossil fuels will still be there, or will increase to the necessary levels.

This can't be the case. We're also quickly approaching Peak Oil; some say we have already hit it. By all estimates, Peak Oil will arrive before the projected date of the Singularity. If so, we simply will not have the power to drive the awesome technological revolution that the Singularity promises.

Worse, we may be in for some very hard times as energy resources become scarce. The Industrial Revolution may come to a complete and total standstill, and future attempts to jumpstart it will be doomed to failure, as there will no longer be any proper fuels left to consume.

Perhaps you could bring up the concept of Peak Oil to someone at the Summit?

Re: The Singularity and Peak Oil

[identity profile] spoonless.livejournal.com 2006-05-15 11:23 pm (UTC)
Nobody brought that up at the Summit, but I hosted a meeting on alternative energy and peak oil a few months ago for our Santa Cruz Future Salon.

The speaker we invited for that gave a pretty strong case that oil has either already peaked or, if it hasn't, that it will very soon. This is why people should be getting off their asses and switching over to better renewable energy sources today, rather than sitting around and waiting. Either way, I don't think it will affect the timescale for the singularity, but perhaps it's worth debating.

There are several issues I'd mention here. First, oil is a pretty inefficient energy source, as well as being dirty and bad for the environment, so I hope we'll switch to better ones (nuclear, solar, wind, fusion, hydrogen, etc.) sooner rather than later. Even if we just used solar, the sun will burn for billions more years, many millions of times longer than we need in order to reach the singularity. Another important issue is to separate technological progress from consumption and population growth. Information technology has always grown exponentially, whereas things like population and GDP do not. So while there may be some relation between the two, they're quite different things.
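
To put toy numbers on that contrast (the parameters below are my own illustration, not figures from the Summit): doubling every two years compounds to roughly a million-fold gain over forty years, while 3%-per-year growth, typical of GDP, compounds to only about 3x over the same span:

    # Toy compound-growth comparison -- illustrative parameters only.
    def grow(initial: float, annual_rate: float, years: int) -> float:
        """Value after compounding at a fixed annual rate."""
        return initial * (1 + annual_rate) ** years

    years = 40                            # roughly the horizon being discussed
    it_factor = 2 ** (years / 2)          # Moore's-law-style doubling every 2 years
    gdp_factor = grow(1.0, 0.03, years)   # assumed ~3% annual growth

    print(f"info tech: ~{it_factor:,.0f}x")   # ~1,048,576x
    print(f"GDP:       ~{gdp_factor:.1f}x")   # ~3.3x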

Let's assume that people are stupid and stubborn, and don't switch away from oil during the rest of this decade. Then oil prices will rise, putting much higher selection pressure on them to switch during the next decade. Because of the scarcity, the profits to be made from developing renewable energy will be even greater, so even more money will be put into developing new technology, which could have spillover benefits to information technology that drive the exponential even faster. So any temporary lull due to a slowdown would be compensated for in the long run by increased funding.

Getting back to the distinction between the growth of knowledge/technology and the growth of population and consumption: let's say that for some bizarre reason, even after oil prices skyrocket, people still refuse to switch to cleaner renewable energy sources. In this case, much of the population will start dying off. However, there will still be a core segment of the rich population who can afford the incredibly high oil prices. While some of their resources may go to aiding the people who are dying off, surely a lot will still go into developing new technology. So the situation here is a little debatable, but I think technological progress would still continue exponentially. The exponent could be smaller in this case, but it's a pretty artificial scenario to begin with, since it's highly unlikely people would stick with oil to the point of killing themselves off.

One thing I'd also mention, just as a side comment, is how far ahead of the US France is. While the US gets most of its energy from coal and oil, the majority of homes in France are powered by nuclear. In the US, because of one incident at Three Mile Island, there is a lot of irrational fear of nuclear technology. I think the situation is exactly analogous to September 11th: Americans, for some reason, react out of fear to isolated random incidents and are willing to give up huge amounts of comfort (whether human rights or efficient energy) in order to feel safe against another catastrophe, even if such a catastrophe is extremely unlikely.