Defend Deontological Constraint against Intending …

Think of yourself as an engineer trying to determine the fundamental
moral principles that will govern all robots in our civilization (Robots, for our purposes, can be androids (look like humans), but they can also be sophisticated computing devices making decisions on our behalf.) Imagine that you are locked in a dispute with rival engineers who are fighting to ensure that the powers that be reject your ideas. Your rivals favor either different fundamental principles or place them in a different order of priority, believing that their alternative principles governing robotics will best serve humanity. The powers that be are split between your proposal and that of your opponent, but many are leaning against you. Your task is to write a convincing case that can sway many of these undecided leaders who sympathize with your opponent to support your proposal in the end. In order to do this, you should first build your own moral theory using the “Pre-Writing” questions below. These questions will help you determine what kind of moral theory you support: i.e. consequentialism or a version of deontology. After you do so, you should go back and answer these questions for your opponent,ensuring that there is at least one difference between you (if you are a consequentialist, they are a deontologist, etc.). Your next task is to formulate yourmoral theory (as well as your opponent’s) in the form of principles. It may be easier for you to see how this is done by using a metaphor derived from The Three Laws of Robotics created by Isaac Asimov.

In your paper, you should first explain the choices you made and make rational arguments in favor of them (i.e., give logical reasons, based on critical analysis of the concepts, to show why they are best). Next, consider the perspective of your opponent: how would they attack your position to demonstrate that your view is wrong? In other words, how would they try to identify a fatal or embarrassing flaw in your principles? Make the strongest and most plausible case against your view that you can (i.e., give logical reasons, based on critical analysis of the concepts, to show why the opposing view is best). Throughout these steps, draw on any helpful examples you create; aim for "reflective equilibrium" by trying to find a theory that is right about all the cases you choose (i.e., if your theory seems wrong in a particular case, is the theory wrong, or is your intuition about the case wrong?).

Please include multiple cases and focus the essay on cases that favor deontology, demonstrating that it is the best principle and ideology for our robots.

The Trolley case:
The trolley problem is a thought experiment in ethics. The general form of the problem is this: There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options: (1) Do nothing, and the trolley kills the five people on the main track. (2) Pull the lever, diverting the trolley onto the side track where it will kill one person. Which is the correct choice?

Sophie’s Choice case:
In the film Sophie’s Choice, on the night Sophie arrived at Auschwitz, a camp doctor made her choose which of her two children would die immediately by gassing and which would continue to live, albeit in the camp. Sophie chose to sacrifice her seven-year-old daughter, Eva, a heart-rending decision that left her in mourning and filled with a guilt she cannot overcome. You might acknowledge that your peers have argued deontology may not be the best ideology for this particular case, but also demonstrate that such cases are extremely rare: the fact that one case shows deontology is not always right does not mean it is completely wrong.

Transplant case:
A brilliant transplant surgeon has five patients, each in need of a different organ, and each of whom will die without it. Unfortunately, no organs are available to perform any of these five transplant operations. A healthy young traveler, just passing through the city where the doctor works, comes in for a routine checkup. In the course of the checkup, the doctor discovers that the traveler's organs are compatible with all five of his dying patients. Suppose further that if the young man were to disappear, no one would suspect the doctor. Would it be moral for the doctor to kill the traveler and use his healthy organs to save the lives of the five dying patients?
