Responses to my post from yesterday seemed nearly unified in agreeing that it is immoral to kill someone who has a 5% likelihood of someday killing you in the future. Most people agreed that in order for self-defense to be a moral act, the danger needs to be more immediate. And if you use the 5% threshold of danger, you can justify killing just about anyone now so they won't accidentally drive over you in their car later.

Most of you agree that if someone pulls a gun and says, "I'm going to shoot you after I rob you so there is no witness," you would be morally justified in killing him first if you can manage it, even if he hasn't started to pull the trigger yet. But you wouldn't be morally justified in killing people who own guns just because they might someday use them on you. That's what the consensus seems to be.

But what if your odds of being killed by one individual, one year from now, are 99%? Is it moral to kill that person now if any delay in doing so makes it less likely you could get him before he got you next year?

If you say it is moral to kill that person to save yourself, because a 99% chance is very different from a 5% chance, then I would argue that morality isn't part of your calculation. You're simply making a judgment of what is practical, both for you and for society.

If you say it's not okay to kill someone who will almost certainly kill you sometime next year unless you get him first, you are highly moral indeed.
 

Comments

Aug 12, 2010
It's a trap: when you go to kill the man, his kung fu is stronger than yours and you die. There was a 99% chance that you would go to kill that man; the 1% was the odds that you would not.

Why do we think accurate fortune telling could exist if you could change the future?



 
 
Jun 7, 2009
Surely morality is defined by what is reasonable for society to exist in a harmonious way; otherwise, how could we ever decide what is moral and what is not?
 
 
Jun 4, 2009
I stand by what I said in the previous article.

You can't go around making assumptions. You have to let people act, or at least start to act, in a discernibly and immediately threatening manner before taking *any* action against them.

Also, it would be a ridiculous example of a bifurcation fallacy if you tried to say the only option, if a guy had a high % chance of killing you in the next year, is to kill him first. The assumption here is that you could not convince him otherwise, that you could not have him arrested, etc. Other options could well be possible, and any example that says otherwise is artificial, without much attempt to be reasonable.

If I knew (and how could I, which is an important part of the morality) that someone was very likely to try to kill me in the next year (because I couldn't know they would succeed), then I would probably take some steps to better protect myself, but one of them would not be killing the other person predictively. He could be lying, incompetent, or felled by a heart attack in the next year; he could experience a change of heart; or my own precautions and other options might make it unnecessary for me to kill. Therefore, morality dictates that I not kill needlessly or prematurely, and only in an imminent situation of gravest danger. A year away is not by any measure imminent, nor certain.

And thank you for noting that I'm quite moral.

As a final note: If someone has a weapon and is actively pointing it at me and telling me he's going to kill me, if the best judgement I can make looking him in the eyes is that he means it, I may do something to disable him, possibly killing him, before he can execute a plan to kill me. My first inclination would be to disable, not kill. I have never taken a life and would rather not discover what that is like to live with even if it is justified. Failing that, if I do kill him, I could still be wrong - I could have misread him, he could be lying if he said he was going to kill me, etc. Ultimately, that killing might *NOT* be moral. It might be pragmatic, however.

Morality sometimes requires we do not do the immediately pragmatic thing. Relatedly, sometimes we do the pragmatic thing despite morality. But we should not try to distort pragmatism in an attempt to transmogrify it INTO moral conduct. The two are different. We may give in to pragmatism, but that won't ever make it the moral choice.
 
 
Jun 4, 2009
At the end of this line of reasoning lies a very dark and scary world where social and psychological profiling allows the government to "eliminate" possible criminals and "deviants" before and after they are born. A child of an alcoholic has a high chance of becoming an alcoholic themselves, costing the government money and risking the life of everyone else, so let's just get rid of them in advance. When I drive my motorcycle there's a relatively high chance that someone driving a car will run me over through no fault of my own, so I should just kill everyone I see driving a car.
There is never a morally acceptable reason to kill someone in advance based on a statistically calculated risk. In the case of Israel, according to your reasoning, it would be just as morally correct for the Palestinian people to kill every Israeli they can get their hands on, since Israel is responsible for killing so many Palestinians...
There is always (BOCTAOE) some other way to solve a conflict than killing, though the American people seem unable to understand this...
 
 
Jun 4, 2009
As always, very interesting post Scott. :)

However, I don't think we would ever be able to actually predict the chances of a SPECIFIC individual's behavior, even if we can somewhat predict the behavior of a nation or a culture (which has been done in the past). So I know it's supposed to be hypothetical, but we still have this problem: namely, that we can't make a probability judgment because too many of the variables required for doing so are not accessible to us.


note: I would also say, as I'm sure you know, that anytime we try to get into talk of morality the conversation quickly gets murky, because every time we morally assess an action we are doing so based upon other moral assessments. For example, abortion is wrong because it's wrong to take innocent life. So a particular action (abortion) is seen to be wrong because of a previous commitment to another moral position: that taking innocent life is wrong. So it ends up becoming: A is wrong because of B, and yet B remains unjustified because it itself depends upon other moral assessments, which depend upon other moral assessments.

Anyways, thanks for letting us participate on your blog. Love reading it.

- Andrew
 
 
Jun 3, 2009
Mr. Wampus:

Thanks for the links. The second one was especially apt. Teens are really big on copyright issues - because they inhabit a world of ripped music and videos with a few finger-wagging adults visible through the haze on the periphery.

I want Mr. Adams to write a textbook on logic for teens. Unfortunately he'd have to partner with a subject matter expert and a curriculum-development specialist. Since the likelihood of either of those individuals having a sense of humor is rather low, Scott would need to retain editorial control. This arrangement would have a roughly 90% chance of ending in violence, so the ultimate impact on society would all come down to whether the heirs could agree to release the book.

Worth thinking about at least....
 
 
Jun 3, 2009
I say kill them all and let god sort them out. 100% chance that they won't be killing you anytime soon. Sure it will smell for a while, but think of the parking!
 
 
Jun 3, 2009
Stomper - more for your amusement.

A hypothetical moral situation: a runaway train is hurtling down the tracks, heading right for five kids playing on them. They're sure to be killed. But you have a switch in front of you that allows you to divert the train onto a side track. There is only one kid on that side track, who will be killed. Will you throw the switch? What if that one kid is your kid?

What Scott is doing is applying numbers to human behaviour. The assertion I am making is that this is a flawed argument.
 
 
Jun 3, 2009
Morals are society's code of conduct. So, it's perfectly reasonable to distinguish between dangers of high likelihood and low.
 
 
Jun 3, 2009
Actually, you just maneuver him into marrying your sister-in-law. That way, the punishment will fit the "almost crime" that he's too nagged out to commit anymore!
 
 
Jun 3, 2009
It would be immoral and illegal if the probability was 100%.
 
 
Jun 3, 2009
Doesn't you wanting to kill someone make it perfectly alright for him to now kill you for wanting to kill him for wanting to kill you?
 
 
Jun 3, 2009
I have a different scenario for this morality test, which I think is interesting:

If you knew there were a 100% chance that a person would murder you sometime in the future, would it be moral to kill him first? Judging by the comments, many people would say "yes." But what if you knew there was a 100% chance that a person would unintentionally cause your death (such as by a car accident) sometime in the future? Would killing him be moral then? What about if you knew that, through chaos theory or something, some stranger would unknowingly cause your death (they do something which sets off a very long chain of reactions which at some point leads to your demise)? Would it be moral to kill them?

This is, of course, disregarding incarceration for bludgeoning a guy to death on the basis that "his getting a latte would have killed me."
 
 
Jun 3, 2009
I believe the punchline you're looking for is "We've already established what you are, now we're just quibbling over price."

That seems about appropriate for your argument here.

 
 
Jun 3, 2009
What if the 5% were assigned to something else?

What if I'm your neighbour, and my reckless driving means that I have a 5% chance of killing you in a (completely unintentional) driving-related incident over (say) the next ten years?

Do you have the right to kill me then? Because fundamentally the chances of you being killed are equal - the only difference is my intent.

And if I find out that you plan on killing me as a consequence of this threat on your life, am I justified in killing you (because you "plan to kill me")?

If you still think so, then I would suggest that your revised law will create a lot more gun-crimes...
 
 
Jun 3, 2009
As soon as you decide to kill the other party due to their (whatever %) chance of killing you, you have assigned a high percentage chance of you killing that person, which in this thought experiment means that they are then morally justified in getting rid of you. Something about that strikes me as wrong, thus I think the moral justification has to be rejected.

Say that when Iraq was invaded, Iraqis knew there was a very large chance that many Iraqis would die due to the invasion of their country. Had Iraq been in possession of nuclear weaponry, would they then have been justified in wiping out as much of the US as possible?
 
 
Jun 3, 2009
I don't see anyone pointing out that if we're moist robots with no free will (a position Scott has espoused in the past), then the question of morality doesn't come up. Morality presumes free will. Without free will, there's no reason for imputing moral responsibility to anyone for anything since they could not act other than how they did -- they didn't have a choice.

If I believed that there was a likelihood someone would attack and possibly kill me in the next year, I would do what I could to turn that person into a friend rather than an enemy.

If there's a likelihood he'll kill me, there's also a likelihood that I'll kill him. From that perspective, it seems like he would be as justified in trying to kill me as I would be in trying to kill him.
 
 
Jun 3, 2009
Hi everybody,

While everybody's arguing about whether it's right to kill when the odds of the other guy killing you are 5% or 99% or whatever, I would like to ask: is killing the other guy the only means of saving oneself from getting killed? Surely there are other ways to protect oneself?
 
 
Jun 2, 2009
I don't know if this has been addressed already (there are, after all, dozens of comments on this and the previous post) but I'd like to sharpen the metaphor a bit:

A guy repeatedly tells you you're going to die. He makes it clear that he finds this state of affairs desirable, but he doesn't specify whether it's going to be by his hand or not. Different people have different opinions as to what he means by this, obviously, and there are many laws that govern whether or not this can or should legally be construed as a threat on your life.

But then this guy goes out and buys a gun. He claims it's a replica gun that he's going to hang on the wall because it looks nice, but you have no way of verifying this.

What does your "5 percent chance" look like now?
 
 
Jun 2, 2009
You are ignoring a critical element in your logical thread - intent.
The statistical chance of being killed is one thing, but you cannot disregard intent (yes I know we're moist robots with no free will...) when considering morality.
 
 
 