Monday, January 24, 2011

These guys may save us

Did you hear this NPR story about The Singularity Institute? The organization is trying to understand what may happen if computers get to be so smart they take over.

The story was broadcast about two weeks ago - sorry I only just now got around to posting it. Oops!



The entire piece is 7 minutes. A bit long, but it's fun to listen to. And includes this wonderful quote:

If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.

(By the way, hi people of POT! Sorry I've been absent for a while!)

5 comments:

Galspanic said...

FUCKING HUBRIS. Both in people thinking that an event of this level would happen in their lifetime, and that they would even have anything remotely to do with interacting with such an event. This sort of thing annoys me almost as much as the Rapture/2012 types.

Galspanic said...

That soundbite from 6:38 to 6:52 REEEEELLY annoys me.

Ruby Tenneco said...

Hi Odori. I'm worried that that video was made by the computers as some kind of triple-bluff cyberespionage.

odori said...

Galspanic - on the soundbite, yes! I can't believe the guy being interviewed was so overwhelmed by the issue he's spending all of his time studying that he started yelping.

Ruby - I love it! Computers infiltrate NPR to tell us a story about humans preparing to fight back against computers. We would so deserve that.

Mr. Pony said...

I, for one, am glad that someone is working on this problem. While it may be hubris to say that this will happen in our lifetime, I don't think it's patently ridiculous to say that it will happen in someone's lifetime. Never mind Skynet; have we learned nothing from the real-life lessons of V-ger, Brainiac of Krypton, and Johnny 5?

The approaches in this piece seem really stupid, though. "Friendly AIs?" Emergency off switches? Surely the machine singularity will evolve a counter for an off switch.

I also think that we have nothing to worry about from the machine singularity, not because it won't happen, but because fast-evolving AI will quickly move through self-awareness, past wanting to kill us, beyond godhood, and into machines designed only to enjoy reality television.