From Asimov's 3 laws to proposed 23 principles

Off topic discussion.
User avatar
jeffbert
Minister of Science
Posts: 12549
Joined: 22 years ago

From Asimov's 3 laws to proposed 23 principles

Postby jeffbert » 8 years ago

:lol: So, it seems people are seriously considering AI as a potential threat to humans, & perhaps, even humanity itself.

Move over Asimov: 23 principles to make AI safe and ethical

DrFrag
Cosmic Ranger
Posts: 3406
Joined: 22 years ago
Location: Australia

Postby DrFrag » 8 years ago

That looks naively optimistic to me, as though they're treating AI as really clever software tools rather than arguably sentient life. Replace "AI" with "teenager" and it starts to look like an episode of Dr Phil. There's only so long you can keep sending them to their room; eventually they're gonna grow up and move out.

There's this idea that evolution has run for 4 billion years to create this pinnacle of life we call human, and now it's going to grind to a halt and we're the best there'll ever be. There's some merit to the idea, since medicine and genetics are giving us control over the process that no other species has ever had. But right now we're the apex predator in the middle of the 6th major extinction event, and every major extinction event before us has toppled the apex predator. I'm not saying AI will wipe us out, I mean, humans didn't wipe out apes. But in the big scale of things, life changes. I would find it depressing to think humans are the best thing that can arise on Earth.

User avatar
fafner
Cosmic Ranger
Posts: 3524
Joined: 21 years ago
Contact:

Postby fafner » 8 years ago

Genetics is a tremendous power in the same way AI is.

When you write a simple hello world program, you're not taking much runaway-AI risk. When you write a sophisticated program that can answer your questions by searching the Internet, you're not taking much more. But if you wrote a hello world program that could enhance itself, teaching itself from the Internet how to write better messages and uploading its new versions to your website, chances are you would eventually end up with a sentient website with the ability to take on the world, depending on its mood.

The difference is not on the sophistication side; it's on the self-containment side. Things can go out of control as soon as you can't contain what you are handling. In fact, this is what Frank Herbert dealt with in the backstory of Dune, where machines had taken over because humans had ceded complete control to AI so they could abandon themselves to leisure.
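The self-enhancing hello world above can be caricatured in a few lines of Python. This is purely my own toy illustration (the file name, version counter, and fixed template are all invented): a program overwrites its own source with a "better" next generation, and the improvement loop, not the sophistication of any one version, is what would need containing. Here the template is fixed, so it is safely contained; the thought experiment's danger begins when the improvements come from outside.

```python
import os
import runpy
import tempfile

# Hypothetical "agent" source: each generation rewrites its own file
# with the version counter bumped. A fixed template keeps this contained;
# a runaway system would instead pull its improvements from the Internet.
TEMPLATE = """\
VERSION = {version}

def greet():
    return "hello world v" + str(VERSION)

def improve(own_path, template):
    # The self-modification step: overwrite our own source
    # with the next generation.
    with open(own_path, "w") as f:
        f.write(template.format(version=VERSION + 1))
"""

path = os.path.join(tempfile.mkdtemp(), "agent.py")
with open(path, "w") as f:
    f.write(TEMPLATE.format(version=1))

# Run three generations: each run loads the current source,
# greets, then rewrites itself.
greetings = []
for _ in range(3):
    ns = runpy.run_path(path)
    greetings.append(ns["greet"]())
    ns["improve"](path, TEMPLATE)

print(greetings)  # each generation's greeting, v1 through v3
```

Note that nothing in the loop inspects what `improve` wrote; once a program is trusted to rewrite itself, containment depends entirely on where its "improvements" come from.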

With genetics, you have an inherent self-containment problem. You are modifying the "program" that runs inside cells, the same kind of program that runs inside our own cells, so if you fail to properly contain your experiments, the results could end up inside us and modify us (for better or worse). Have you ever seen a blue screen of death? Now imagine that for everything alive on the planet; better hope there are no bugs :devil:
The real sign that someone has become a fanatic is that he completely loses his sense of humor about some important facet of his life. When humor goes, it means he's lost his perspective.

Wedge Antilles
Star Wars - Exile

