Page 1 of 1
[ 9 posts ] |
Do we need the Three Laws of Robotics?
paulzolo
What's a life?
Joined: Thu Apr 23, 2009 6:27 pm Posts: 12251
http://www.bbc.co.uk/programmes/p02m2wc7

It's an audio clip where the comments are made. I made a bit of a glib statement about Asimov's laws on the smart gun thread, but I can't help thinking that at some stage we need to be looking at something like that, and requiring it to be built into robots (cars, drones, ships, any form of autonomous creation), so that we don't start having those endless debates about whether a robot car decides to run over one person instead of three, or even aims for a wall where the occupant won't survive if it encounters difficulties.
Mon Mar 16, 2015 9:43 am
pcernie
Legend
Joined: Sun Apr 26, 2009 12:30 pm Posts: 45931 Location: Belfast
It's the reason Google had to set up an ethics committee when it bought DeepMind, apparently. Best not to have a computer that remembers and adapts making its way into machinery that's potentially deadly to humans without real oversight.
_________________Plain English advice on everything money, purchase and service related:
http://www.moneysavingexpert.com/
Mon Mar 16, 2015 10:02 am
big_D
What's a life?
Joined: Thu Apr 23, 2009 8:25 pm Posts: 10691 Location: Bramsche
The Butlerian Jihad is coming. Run for the hills!
Even with the laws, you have to be careful. The B5 episode Infection is a case in point: the Ikaarans created the ultimate weapon to kill their enemies. As a failsafe, they programmed it not to kill Ikaarans. The problem was, how do you define an Ikaaran (or a human of a specific country or religion)? The religious leaders got involved and defined the perfect Ikaaran. But there was no perfect Ikaaran, and the weapon sterilised the planet before they could stop it...
Even with good intentions, you can still end up killing yourself (and your race).
Even with good intentions, you can still end up killing yourself (and your race).
_________________ "Do you know what this is? Hmm? No, I can see you do not. You have that vacant look in your eyes, which says hold my head to your ear, you will hear the sea!" - Londo Molari
Executive Producer No Agenda Show 246
Mon Mar 16, 2015 10:15 am
ShockWaffle
Doesn't have much of a life
Joined: Sat Apr 25, 2009 6:50 am Posts: 1911
There is some conflation here. A self-driving car or a drone probably wouldn't be equipped with human-like cognitive, conative and affective capacities, so such an object would be governed by procedural rules (programming).
If somebody (for some unfathomable reason) chooses to equip their aerial murder bots with feelings of sympathy and remorse, accompanied by desires, wants and intentions, then it would make sense to suppose it was equipped with the tools for rudimentary ethics, and would have need of them.
I'm not sure armies really want weapons that have to be persuaded to volunteer to blow stuff up, and which might prefer to just fly around and look at pretty stuff. And I doubt you would buy a car that was liable to decide not to drive you home because you live somewhere boring.
Mon Mar 16, 2015 10:50 am
big_D
What's a life?
Joined: Thu Apr 23, 2009 8:25 pm Posts: 10691 Location: Bramsche
I think it has less to do with feelings and more to do with the inability to define rules clearly - either they are too ambiguous, or they are so tightly defined that the "normal" case never exists. The hypothetical Ikaaran problem wasn't emotional, at least on the part of the weapon itself; the rules were so theologically tightly defined that nobody could meet them.
_________________ "Do you know what this is? Hmm? No, I can see you do not. You have that vacant look in your eyes, which says hold my head to your ear, you will hear the sea!" - Londo Molari
Executive Producer No Agenda Show 246
|
Mon Mar 16, 2015 11:25 am |
|
 |
ShockWaffle
Doesn't have much of a life
Joined: Sat Apr 25, 2009 6:50 am Posts: 1911
You (and Asimov, I suppose) are rather assuming the Kantian position that to be ethical is to follow a correct order. That's sort of all the laws of robotics are: a special android edition of the Categorical Imperative (a rule that applies always and in all possible universes).
The stories about these rules going wrong are all stories about simple, brittle rules being insufficient for the complexities they attempt to cover, because the basic principles-are-commands view doesn't work. I haven't read Asimov, nor watched the movie, but I assume something goes wrong with those rules in the book? If not, you can read Immanuel Kant's explanation of why it is wrong to beat a dog, and you will see that there is a major conceptual problem here. So before giving self-driving cars a 'human-like' ethical consciousness, we might want to choose which ethical model to go for.
Otherwise German cars will proceed according to the maxim that all cars in like circumstances will be driven thus, but French cars will strive above all for authenticity. English cars will maximise happiness for the greatest number by never mowing down a hedgehog (or by forcing their occupants to listen to very intelligent radio stations). American ones won't take you anywhere unless you pick up a hitch-hiker who can't afford a car. Greek cars will be just great as long as they have the roundest possible wheels and neither try too hard nor not hard enough to run people over. Scottish cars probably don't believe the rules are real, but their passions will force them to follow some - if you allow them passions.
|
Mon Mar 16, 2015 5:55 pm |
|
 |
paulzolo
What's a life?
Joined: Thu Apr 23, 2009 6:27 pm Posts: 12251
Read the books, but for the love of whatever you hold dearest, don't watch the film. The I, Robot books are an anthology of short stories about robots responding in various ways to the rules. For example, one story tells of a robot that tries to follow the command "get lost" (a flippant comment made to it by a human) to the letter. Another explores what happens when the order of the rules is changed. I used these as an example - if we let robots become wholly autonomous, will it benefit us or will we suffer? So far, we have no idea if Skynet was subject to these laws (probably not, but maybe it decided that it could ignore human instructions if there were no humans left to interfere with it). HAL in 2001 was following human instructions, but had no qualms about human preservation. Bishop in Aliens did, but clearly Ash in Alien didn't (those models always were a bit twitchy).
Last edited by paulzolo on Mon Mar 16, 2015 10:16 pm, edited 1 time in total.
Mon Mar 16, 2015 7:27 pm
jonbwfc
What's a life?
Joined: Thu Apr 23, 2009 7:26 pm Posts: 17040
The books are basically detective novels, if very highbrow ones. And in fact the laws are never broken by a robot functioning normally, only by one that is damaged or malfunctioning. There are some amusing ideas in them around how any set of rules can be interpreted, but I do believe learned philosophers have considered them, and the general opinion is that they're about as good a set of rules as we could manage. The works do suggest that the rules are merely the summation of a set of fundamental architectures that allow a positronic brain (Asimov's robots' brains) to function, and thus an AI could never be made without them. I doubt any such convenient conceit would exist in the real world.
The major point I'd make though is that quite a lot of decisions are made currently by humans not following a moral guideline, otherwise we wouldn't need police forces or armies. I don't honestly see how AIs are actually going to be worse than that.
Mon Mar 16, 2015 8:22 pm
ShockWaffle
Doesn't have much of a life
Joined: Sat Apr 25, 2009 6:50 am Posts: 1911
Suppose you own an autonomous, self-driving, highly aware, ethical car. You stop to pick up a prostitute at a lay-by on the A46. The good car obeys your instruction and stops to pick up the street walker, because it follows orders. The good car honours your marriage vows and drives past the street walker. The good car believes that consenting adults should be allowed to do sex stuff if they like, so it assents to your order. The good car disapproves of you taking advantage, and drives the prostitute to a safe place. The good car minds its own business and keeps its eyes on the road.
It turns out you are a psychopath and the whore displeases you. You strangle the whore and ask the car to take you to a secluded woodland area for discreet disposal. The good car obeys your instruction and finds the most secluded area in range. The good car locks the doors and delivers you to the cops. The good car does everybody a favour and runs you over. The good car minds its own business and keeps its eyes on the road.
It is time to dispose of the witness. You order the car to drive itself into a lake. The good car does as it's told. The good car notifies you that you have invalidated your warranty and returns itself to the dealership for resale.
All of these mutually exclusive outcomes are potentially correct responses according to some set of rules.
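To put that point another way, here's a minimal sketch (all names and rules invented for illustration, nothing to do with any real car software): three internally consistent rule sets, given the exact same situation, each endorse a different action, and each is "correct" by its own lights.

```python
def obedient(event):
    # Rule set 1: follow whatever the owner ordered.
    return event["order"]

def lawful(event):
    # Rule set 2: report any witnessed crime, overriding orders.
    return "deliver occupant to the police" if event["crime_witnessed"] else event["order"]

def indifferent(event):
    # Rule set 3: mind your own business and keep your eyes on the road.
    return "keep driving"

# The same situation, presented to all three rule sets.
event = {"order": "drive to a secluded wood", "crime_witnessed": True}

outcomes = {rules.__name__: rules(event) for rules in (obedient, lawful, indifferent)}
for name, action in outcomes.items():
    print(f"{name}: {action}")
```

Each function is a perfectly well-defined procedure, and no rule inside the program tells you which of the three rule sets is the right one - that choice sits outside the rules themselves.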
What makes an organism that follows rules an ethical being is choice, based largely on a desire not to be bad. It isn't a turtles-all-the-way-down situation; there isn't one rule to rule the rules. This makes human-like ethics for computers a computationally impossible task. No amount of processing based on procedural rules could recreate that phenomenon, which springs from intuition. In the absence of that, tools such as cars can be given an appearance of intelligence, and some procedures that govern their application, but they aren't themselves clever, and the rules are mere behaviours, not decisions.
When AI progresses beyond gimmicks like machine learning into the realm of artificial minds that evaluate and consider things, it is very unlikely that their innards will be anything like the representations in even the most insightful of Sci Fi stories. Cognition is vastly more complex than computation, and computers are not a reliable guide to how it will work.
Mon Mar 16, 2015 9:34 pm