The Age of Moral Machines
Here is the relevant information from the syllabus: Case Studies. You will analyze nine case studies in the form of news commentaries, extra readings, or current ethical issues. Your lowest single score will be dropped. I do not accept late assignments for any reason, including computer problems. Case studies are posted on the course website each week on a Monday or Wednesday and are due one week later, before class. Case studies are submitted online from any computer connected to the Internet.
Note that I am particularly picky about how you use sources in your answers to case studies. I want your words. You may quote to illustrate a point, but you must make that point in your own words, and any quotes must be marked. Comments and your grade will be attached to your online submission, and you can view them on the website when you are logged in. Only you can see the grade and the feedback.
That covers the case study assignments.
Case Study 1: The Age of Moral Machines
Resources:
- Moral theory
- Robotic timeline
- Hans Moravec’s homepage
THE THREE LAWS OF ROBOTICS
From I, Robot by Isaac Asimov, 1941–1950.
ORIGINAL VERSION
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
SECOND VERSION
- No robot may harm a human being.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
THIRD VERSION
- No machine may harm humanity or, through inaction, allow humanity to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The “Three Laws of Robotics” are Asimov’s underlying moral system for the robots in his science fiction work I, Robot. Humans program the robots with three inviolate laws. Over the course of the book, the Three Laws evolve from the original version to the final (third) version: humans make the first alteration, and robots make the final one. Notice that the only difference among the three versions is the First Law. Answer the following questions about Asimov’s moral system and submit your answers online. You may cut and paste into the answer field.
- Categorize the original version by one of the moral theories discussed in class (deontology, utilitarianism, rights theory, social contract theory, etc.). Justify your choice.
- Categorize the third version by one of the moral theories discussed in class. Justify your choice.
- Under which version could a robot directly harm one human to save another?
- What would a robot do in a situation where two people are at risk and only one can be saved? State the version you are appealing to.
- Read “Robot Dreams” by Asimov. In this short story a rogue robot “dreams” a new moral system for robots. Which ethical theory best categorizes this new moral system?