If you're familiar with Star Trek, you know that a young Star Fleet cadet named James T. Kirk had an innovative approach to a training exercise that no one had ever beaten. (I'm going to go out on a limb and assume that most Dilbert Blog readers are familiar with Star Trek.)

That Star Fleet training exercise essentially asked young Kirk, "What would you do if this happened to you?" In my post from earlier this week, I asked readers if it was moral to kill a guy who was 99% likely to kill you in a year. The most common response was something along the lines of "You can't calculate the odds of that sort of thing."

This is a fascinating response, and it's the sort of response I often get when asking a hypothetical question on any topic. It leaves me wondering if the person is unclear on the concept of hypothetical questions, or if he's pulling a James T. Kirk maneuver to avoid exposing some flaw in his reasoning.

Do any of you James T. Kirks want to try answering the hypothetical question again, this time without cheating?

If it makes it easier, I will stipulate that in the real world, people are notoriously bad at predicting the future. You could never have 99% certainty that some guy was going to kill you within a year. But in a hypothetical world where you COULD know that the odds were 99%, is it moral to kill that guy in order to probably save yourself?

 

Comments

Apr 12, 2010
Dear Jason Drake:

Let's pretend I am Thugman. You have multiple good points, but I am going to kill you now, before you comment on another post and make my brain hurt again.
 
 
Jun 23, 2009
No, I wouldn't, and you shouldn't. Here's why: your premise gives no information on why he kills us or any of the circumstances surrounding the incident.
 
 
Jun 18, 2009
No, it is not moral to do so. However, it is WISE to kill him, as no one in their right mind would bet on the 1% chance.

@JasonDrake: Dude, stop overthinking things. It's a simple question. TMI
 
 
Jun 17, 2009
I would absolutely kill that person. There's no question. When is anything 99% sure? We can't even be 99% sure a convicted serial killer is the actual killer. Those kinds of odds would compel me. Is it moral? Sure! Why wouldn't it be? Is it moral to let someone kill you? Nope. In our society, if you kill someone, you may get killed back, and everyone calls it justice. It's little more than vengeance, but we call it moral. That kind of 'logic' applies to this hypothetical question, and in a world where you CAN have 99% certainty, you would be a seriously retarded induhvidual to not kill that person post-haste.
 
 
Jun 13, 2009
Yes, it is moral. Give me his name and address. Does he live alone? Is he a light sleeper?

If you replace the human with a chicken egg, does your answer change?
If yes, doesn't that indicate a flaw in your logic?
You have a 99% chance of being dead either way.
If no, I would venture to guess that you believe death in any form is wrong.
To which I answer, "Quit trying to override 120,000 years of human evolution. Many have tried; none have succeeded."

Nice to see everyone stayed off the "Causality" argument.
 
 
Jun 12, 2009


"I asked readers if it was moral to kill a guy who was 99% likely to kill you in a year.... in a hypothetical world where you COULD know that the odds were 99%, is it moral to kill that guy in order to probably save yourself?"

Scott, I have great respect for your intelligence, even in the realm of philosophy (I enjoyed 'God's Debris'), but this question is semantically sloppy. Because it's a blog, I forgive you.

SCENARIO AS WRITTEN:

Premise #1 (declarative): Thugman is 99% likely to kill me in one year.

Premise #2 (conditional): If I kill Thugman, I will probably save myself.

The conditional is a simple p --> q format:

p : I kill Thugman
q : I will probably save myself

Let's examine q:

'I will probably save myself'... this implies:
'I will probably not be killed' (by Thugman) ... which implies:
'Thugman is <50% likely to kill me.'

The last is false, because it contradicts Premise #1 (Thugman is 99% likely to kill me in one year). Therefore the antecedent p (I kill Thugman) is false (modus tollens). Killing Thugman is impossible; the morality of it is therefore irrelevant.
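
Or, for the programmers in the audience, the same argument spelled out as a few lines of Python (just a sketch; the 50% cutoff for "probably" is my own assumption):

    # Sketch of the modus tollens above. Premise #1 fixes the probability;
    # "probably save myself" is read as P(killed) < 0.5 (my assumption).
    P_KILLS_ME = 0.99  # Premise #1: P(Thugman kills me within a year)

    def q_holds(p_kills_me: float) -> bool:
        """q: 'I will probably save myself', i.e. P(killed) < 0.5."""
        return p_kills_me < 0.5

    q = q_holds(P_KILLS_ME)  # False, since 0.99 >= 0.5
    # Premise #2 is the conditional p -> q. With q false, modus tollens
    # forces the antecedent p ('I kill Thugman') to be false as well.
    p_can_hold = q
    print("q (probably saved):", q)                    # False
    print("p (I kill Thugman) can hold:", p_can_hold)  # False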


VARIANT A (first premise only):

Premise #1: Thugman is 99% likely to kill me in one year.

There is no apparent justification for killing Thugman. It has no deterrent or preventative value, because the probability of Thugman killing me is already fixed. The punitive value is dubious at best; given the forewarning, Thugman will almost certainly be caught and punished by others. On the other hand, if I DO kill Thugman, it's nearly certain that he will kill me at the exact same time (he cannot kill me after he is dead, and the 99% probability is fixed), which leaves a substantial probability (exact value unknown) that he will be killing me in an attempt at self-defense, in which case I will be the murderer.



VARIANT B (modified premise):

Premise #1: It is 99% likely that ALL of the following are true:

Thugman is currently planning to kill me.
Thugman will succeed within a year unless I kill him first (no less-than-lethal restraint is available).

In this case, it IS morally justifiable. Or as shandrew so eloquently wrote, "It would be 99% moral to kill that person." There is nothing wrong with multi-valued logic (google "fuzzy logic"). All human actions, including moral decisions, are limited by uncertainty. I expect that a great majority of murder convictions do not exceed 99% certainty, but if we failed to prosecute at that level we would have almost no deterrent for violent crime. Reciprocally, I am willing to accept the small chance of being falsely prosecuted in exchange for deterrence and restraint of others in high-probability situations (thus it meets the Categorical Imperative... google "Immanuel Kant").
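
As a toy illustration of that multi-valued reading (my own framing, not a formal fuzzy-logic calculus; the endpoint values are illustrative assumptions):

    # Toy sketch: grade the act by the probability that the premise holds.
    # 1.0 = fully justified self defense, 0.0 = killing an innocent man.
    p_premise_true = 0.99
    value_if_true = 1.0
    value_if_false = 0.0
    moral_value = (p_premise_true * value_if_true
                   + (1 - p_premise_true) * value_if_false)
    print("Degree of moral justification: %.2f" % moral_value)  # 0.99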




VARIANT C (unreal premise):

Premise #1: It is 99% likely that Thugman will kill me in EXACTLY one year if I do not kill him first.

In order for this calculation to be accurate, we must be in some hypothetical deterministic universe where the laws of quantum physics and the Uncertainty Principle do not apply. Also, the prediction would almost certainly have to come from an external source (using myself as the god-like prognosticator presents too many variables to be dealt with here).

For the morality dilemma to work, it's also important that I BELIEVE what comes out of this supernatural informant/hyper-intelligent alien/Multivac (google "Asimov" "Multivac"). I would have to believe in Multivac's accuracy and honesty, which suggests that its capabilities have been demonstrated to me on multiple occasions.

If all of society is privy to such demonstrations, then we all understand that the universe is purely and demonstrably deterministic. Thus, there can be no concept of morality, because normative statements (i.e., "One SHOULD do this, one SHOULD NOT do that") are nonsense. Choices and possibilities do not exist. Killing Thugman would be neither moral nor immoral.

On the other hand, if Multivac speaks only to me (or to me and very few others), I would probably consider myself to be part of a privileged class and not subject to the same moral standards as everyone else (and with some justification, given the awesome power of Multivac).

In either case, Multivac has tremendous influence (possibly unlimited influence) on the course of history, right down to the most minute detail. Even if constrained to provide accurate predictions, it is still capable of withholding information, giving selective data, and/or choosing from several options (whichever one it predicts comes true, because the predictions are self-fulfilling). Hence Multivac (or its creators) would bear moral responsibility for nearly everything that happens.


Cheers,
Jason
 
 
Jun 12, 2009
Someone may have already posted this, but I don't feel like reading all the replies, so ....

There is simply not enough information here to decide. Even if the person has a 100% chance of killing you, there is not enough information. The HOW, the WHY, and the MOTIVE of said killing play into the moral picture.

Here are three extreme scenarios:

1. If this guy is a plain psycho murderer who targets you for no good reason, then I say it's moral to kill him even if he only has a 1% chance of killing you.

2. If YOU are a psycho murderer and the guy would end up killing you in self defense, then it's not moral to kill him.

3. If the guy is meant to kill you by accident, through no fault of his own, then it's not moral to kill him.

There are really tons of possible scenarios, but those three extremes make the point.
 
 
Jun 12, 2009
I would rather lock him up in the basement for two years. Why the extra year? Just to make doubly sure.
 
 
Jun 10, 2009
As I see it, what separates killing someone in self defense at the time of the act from killing someone in self defense preemptively is options. If you are being attacked by a mugger, you might not have any option but to kill that person to protect yourself. If you know someone is going to kill you within a year, you have time, and thus you may have the opportunity to find other ways to protect yourself.

Similarly, say you are being mugged but know that if you ask the mugger nicely to stop, they will. You could kill them and claim self defense, but would that be moral? No, because you had another option: you could have asked them to stop.

Of course, the next logical step is: what if you know (99% sure) that someone will kill you within a year, and you also know for certain that there is no way to protect yourself other than to kill them? Is it moral to kill them? In that case, yes, it is moral.
 
 
Jun 9, 2009
I would kill myself first just to piss him off
 
 
Jun 9, 2009
Well, if it was, that would mean I was 100% likely to kill the other guy, in which case I'd already be dead, rightfully killed by the other guy. In other words, by your reasoning, if he's 5% likely to kill me, then I'm 100% likely to kill him, which means he is 100% likely to try to kill me first to avoid that. I don't see how any of that is an intriguing thought.

Maybe what's intriguing is that if that is my reasoning, I am the far more dangerous individual, because I am 100% likely to kill someone (over a mere 5% likelihood of being killed), while the other guy is only 5% likely to kill me. Jails are full of people like that. They're the type that draw guns in arguments over a parking space.
 
 
Jun 8, 2009
No. It is never moral to kill anyone. People do not do evil things out of evil. They do them because they falsely believe those things to be in their own self interest. If you know the person is likely to kill you within the next year, you have a year to convince him that it is a bad idea.

Besides, why is the person so destined to kill me? It could be that I am an awful person and my death would be good for everyone. It would therefore be, from a societal perspective, immoral to stop the person from killing me.
 
 
Jun 8, 2009
I think it depends on the people involved: who would be a greater benefit to society? But I don't think it's immoral to kill someone. It's illegal, but I think it takes more than law or society's view to determine morality.
 
 
Jun 8, 2009
Smart? Yes.

Moral? No.
 
 
Jun 8, 2009
There is nothing in the hypothetical to suggest that you could change the odds with your conduct. The 99% probability of his killing you within the next year presumably takes into account what you may do in response to the knowledge. In those circumstances it would not be moral to try to kill him, since it wouldn't even change the odds of his killing you.
 
 
Jun 8, 2009
I note with interest that the Christian fundamentalist type mentioned here, Neil Horsley, has severely threatened the life of the abortionist Hern.

http://www.guardian.co.uk/world/2009/jun/05/abortion-america-george-tiller

No-one has killed Horsley in a pre-emptive reprisal, or even arrested him. It's not really cricket, what ho.
 
 
Jun 7, 2009
Yes. Self defense is a basic right of all people.
 
 
Jun 7, 2009
In this hypothetical world, I'd like to think that after the first few "I'm going to kill you before you kill me" scenarios, we would start to catch on and find a better way of dealing with things. Killing someone isn't the only way to stop someone from killing you... for example, maybe you could move somewhere far, far away.
 
 
Jun 6, 2009
Depends on your idea of morals, I suppose. I would do it out of a sense of self-preservation and not lose any sleep, so I guess to me it would be moral.
 
 
Jun 6, 2009
A 1% chance he won't kill still means I could be killing an innocent man. Morality won't permit that. That's not an acceptable risk. I don't base my objection to this strategy of yours (hypothetical as it is) on an inability to calculate the odds, although that inability is a pragmatic fact in any implementation plan that follows from it. I base it on the fact that people have to be allowed to act before they are punished for transgressing a law. And if there is a 1% chance he won't kill me in the next year and no obvious 'clear and present' danger to myself, then I'm obligated to give that 1% chance an opportunity to occur.
 
 
 