Poll: A Russian Roulette Puzzle


Off-Topic Discussions


To Vote, simply *favorite* one post below:

( select only One )

.

POLL: A Russian Roulette Puzzle

In Russian roulette, a certain number of bullets are placed in a revolver.
A turn involves spinning the cylinder to randomize the location of the
bullets, after which a player puts the gun to his head and pulls the trigger.
If the player is lucky enough to survive, the game continues with the next player.

Consider two different versions of the game with identical six-shooter guns.

o Situation 1: You are playing the game with one bullet.

o Situation 2: You are playing the game with four bullets.

In each game, you are given the option to pay money to remove a single bullet.

VOTE: Would you:
A. pay more money to remove the bullet in situation 1?
B. pay more money to remove a bullet in situation 2?

To Vote, simply *favorite* one post below:

( select only One )

.


2 people marked this as a favorite.

.

A. Pay more in Situation 1.

.


2 people marked this as a favorite.

.

B. Pay more in Situation 2.

.


.

** END OF CHOICES **

.



Answer to Russian Roulette:

Almost everyone would pay more money in situation 1 to remove the
single bullet. The gut feeling is that it’s worth more money to survive
with certainty than to reduce the odds of death in situation 2.

But this is wrong according to rational choice theory! It is logically
sensible to pay more to remove a bullet in situation 2 if you prefer
being alive and having more money. The problem is known as Zeckhauser’s
Paradox.

Let’s see why. Consider the outcomes D = Dead and A = Alive (without
paying anything). Also write Lx for being alive after paying x dollars,
where x is the most you would pay in situation 1, and Ly for being alive
after paying y dollars, where y is the most you would pay in situation 2.

If you are willing to pay x dollars in situation 1 to remove the single
bullet, then you are saying that the utility of being alive after paying
x dollars equals the utility of the lottery of playing the game, in
which there is a 1/6 chance of death and a 5/6 chance of being alive.
(In von Neumann and Morgenstern utility theory, a rational agent is
indifferent between two lotteries with the same expected utility. So the
value x you are willing to pay is the one that makes you indifferent:
any more and you are overpaying, any less and you would gladly pay to
remove the risk.)

Therefore, with a utility function u, we have:

u(Lx) = (1/6) u(D) + (5/6) u(A)

Similarly, if you are willing to pay y dollars in situation 2 to remove
a bullet and go from four bullets to three, then you are indifferent
between the lottery where you play the game with four bullets and the
lottery where you pay and play the game with three bullets. This gives
the following equation:

(3/6) u(Ly) + (3/6) u(D) = (4/6) u(D) + (2/6) u(A)

Subtracting (3/6) u(D) from both sides and multiplying by 2, we can
simplify the above equation to get:

u(Ly) = (2/6) u(D) + (4/6) u(A)

If we take u(D) = 0, then since we prefer being alive to being dead, we have u(A) > 0. So we get

u(Lx) = (5/6) u(A)

u(Ly) = (4/6) u(A)

u(Lx) - u(Ly) = (1/6) u(A) > 0

In other words, being alive after paying x dollars is worth more to you
than being alive after paying y dollars. But since having more money is
better, that can only mean x is a smaller amount of money than y!

Therefore, under von Neumann and Morgenstern utility theory, you should
be willing to pay more in situation 2, where you remove one bullet from
four.
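
To make this concrete, here is a minimal numerical sketch (my own
illustration, not from Binmore's book). It assumes a starting wealth of
$1,000, a square-root utility for money while alive, and zero utility
when dead no matter how much you paid; all three are illustrative
assumptions, not part of the original argument. Solving the two
indifference conditions above then gives a larger payment in situation 2
than in situation 1.

from math import sqrt

WEALTH = 1_000.0   # assumed starting wealth (illustrative)
U_DEAD = 0.0       # money is worth nothing to you if you are dead

def u_alive(money):
    """Assumed utility of being alive with `money` dollars left."""
    return sqrt(money)

def indifference_price(target_utility):
    """Payment p with u_alive(WEALTH - p) == target_utility (bisection)."""
    lo, hi = 0.0, WEALTH
    for _ in range(200):
        mid = (lo + hi) / 2
        if u_alive(WEALTH - mid) > target_utility:
            lo = mid   # still better off than indifference, could pay more
        else:
            hi = mid
    return (lo + hi) / 2

# Situation 1: pay x for certain survival, versus the one-bullet lottery.
#   u(Lx) = (1/6) u(D) + (5/6) u(A)
x = indifference_price((1/6) * U_DEAD + (5/6) * u_alive(WEALTH))

# Situation 2: pay y to go from four bullets to three.
#   u(Ly) = (2/6) u(D) + (4/6) u(A)   (the simplified equation above)
y = indifference_price((2/6) * U_DEAD + (4/6) * u_alive(WEALTH))

print(f"Situation 1: willing to pay up to x = ${x:,.2f}")   # about $305.56
print(f"Situation 2: willing to pay up to y = ${y:,.2f}")   # about $555.56
assert y > x

The square root is just one convenient choice: any increasing utility of
money, combined with u(D) = 0, gives u(Lx) = (5/6) u(A) > (4/6) u(A) =
u(Ly), and therefore y > x.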

This goes against intuition, so let us offer a few justifications for the logic.

The main mental block is that people prefer certain outcomes to risk
reductions that could have a greater impact. One of my friends was
eating a bacon cheeseburger with fries and having a beer. I asked if he
wanted ketchup, and he told me he didn't eat high fructose corn syrup
because it was unhealthy. It was evidently easier for him to remove one
source of risk entirely than to trim his calories across several habits.

The tendency to favor certainty is related to the zero-risk bias. In
one study, people were asked how much they would pay to reduce the risk
from a pesticide that caused 15 adverse reactions per 10,000 containers.
People were willing to pay $1.04 to reduce the risk from 15 reactions to
10, but more than twice as much ($2.41) to reduce the risk from 5
reactions to 0. The absolute reduction is the same in both cases (five
reactions), but the idea of “zero risk” led people to pay more.
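
As a quick check on those numbers (the dollar figures are the ones
quoted above), both reductions avoid the same five reactions per 10,000
containers, so the implied willingness to pay per reaction avoided
roughly doubles when the last bit of risk is removed:

cases_avoided = 5                # 15 -> 10 and 5 -> 0 each remove 5 cases
print(1.04 / cases_avoided)      # about $0.21 per reaction for the partial cut
print(2.41 / cases_avoided)      # about $0.48 per reaction to reach zero risk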

Source: This problem appears in Ken Binmore’s Playing for Real: A Text on Game Theory.


1 person marked this as a favorite.

Can I pay money to add bullets on your turn?


I don't think so. And you go next regardless of the other player's outcome, so would it matter?

.

RPG Superstar 2014 Top 16, RPG Superstar 2013 Top 16

1 person marked this as a favorite.

Why am I being forced to play Russian Roulette? If I have a gun with 4 bullets, I prefer my odds trying to shoot my way out. At least that way I die with my boots on.


RainyDayNinja wrote:
Why am I being forced to play Russian Roulette? If I have a gun with 4 bullets, I prefer my odds trying to shoot my way out. At least that way I die with my boots on.

.

Aliens.

.

Dark Archive

How come there isn't an option to pay more to add two more bullets?


Auxmaulous wrote:
How come there isn't an option to pay more to add two more bullets?

.

If you believe in trickle down economics, then lack of supply.

If you're an undergrad, then lack of demand.

[ linky = fun stuff ]

.

But what if I'm a graduate++ ?:

Then, you know it is whatever you can make other people believe.


Real men play Russian Roulette with a semi-automatic.

Oh wait, I meant dead men.


1 person marked this as a favorite.

Did you see that movie "The Deer Hunter"? That guy risked shooting
himself until he had enough bullets in the gun to kill the guards.

.


It turns out I'm actually employing this theory when using my "Sorcerous Origins" abilities in our 5e game.

Huh ...



.


I don't play Russian roulette anymore since I lost all my money against a guy named McLeod. He took 6 bullets and managed to come out of it!


Hahahaha


Interesting math, only problem is...

You pay more to remove 1 out of 4 bullets...and you have a 50% chance to be dead...dead....dead.

You remove 1 bullet out of 1 bullet...and you have a 100% chance to be alive...alive...alive.

The question is more of one which says...which would you rather be...dead...or alive.

With half the people on these boards, seeing how they roll...I can guarantee you in that game, even after paying to remove 1 out of 4 bullets...they would be deader than a doornail.

In gambling, if you can do it, you do...meaning, if you can make yourself a 100% winner instead of a 50% winner, you do it, because the 100% winner wins every time.

So here's a better question.

You are in line to be executed.

I tell you...you have two choices.

You can pay me for the 100% chance to get you to safety and out of the country...but you'll be broke but alive...I get all your money...

OR...there's another way....and I can get you out with some money...but there's a 66% chance you'll die....give me half your money and I'll be able to reduce your chances to 50%...

Which one are you going to take?

For the percentages....per percentage point...I'll make more off of you with option 2 (16% for half your money...at that rate of charge I could get 2X the amount of money...IF you'd actually pay me for those odds)...

But only an idiot would choose option 2 if they had option 1 available.


GreyWolfLord wrote:
But only an idiot would choose option 2 if they had option 1 available.

.

This observation is known as Zeckhauser’s Paradox.

.
