
Swap or not?

Posted by Chris on October 28, 2011 – 6:02 am

1. I give you an envelope containing some money (you may open it if you want). I then offer you another envelope and tell you that it is equally likely to contain either half as much or twice as much money as in the first envelope and ask if you’d like to swap them. Should you swap (and why)?

2. I place two sealed and indistinguishable envelopes on a table. I tell you that one contains twice as much money as the other. You may pick one of the envelopes (you may open it if you want). You may then swap envelopes (but only once). Should you swap, and why?


This post is under “Logic” and has 20 responses so far.
If you enjoy this article, make sure you subscribe to my RSS Feed.

20 Responses so far

  1. 1. mishu Said:

    Isn’t that the Monty Hall paradox?

  2. 2. DP Said:

    I’ve seen a problem like this before, only it looks like you artistically changed the application. I think I remember (at least for problem #2) that you will want to swap them.
    I first want to walk through it myself to find out the logical difference between you handing me the envelope and me picking one up randomly; then I’ll try to post my results if no one has answered to my (and your) satisfaction. I may still post if it will help clear it up for us simple-minded folk.

  3. 3. cazayoux Said:

    a bird in the hand …. I’ll not swap.

    Between the two bad options
    …1 – swap and get less
    …2 – don’t swap and get less

    I’d feel much worse swapping and getting less.

  4. 4. Chris Said:

    Hi DP. I’d like to have said, “only fools rush in”, but I can’t insert this comment before cazayoux’s, so I won’t (LOL).

    Hi cazayoux. I’m not completely sure what you’re saying. Also, you haven’t explained why you have given those answers.

  5. 5. JB Said:

    1. You are getting 2 to 1 odds if you swap. If the first envelope contains $100, the second envelope contains either $50 or $200. You give up $50 to get an extra $100.

    2. On the surface at least, it looks like you are getting the same odds as in the first question. So I would again swap.

  6. 6. JB Said:

    Oh yes, I did leave out the significant fact that in both situations, the odds are 50/50 of the second envelope holding the greater amount. So, getting 2 to 1 odds on an even odds event seems like a good deal to me.

  7. 7. DP Said:

    I now see the difference between the two. By the way, mishu, this isn’t quite the Monty Hall problem (at least the classic version), as that has 3 doors, and your odds change from 1/3 to 2/3 if one (bad) door is opened.

    ———–

    1) There are 3 possible outcomes.
    a: You gave me $100. I keep it and have $100.
    b: You gave me $100. I switch and get $50.
    c: You gave me $100. I switch and get $200.
    We can now assume I’ve got $100 no matter what, and if I end up with anything less, I’ve lost money.
    So for a, I have a 100% chance of gaining $0.
    Since b&c are equally likely, I have a 50% chance of losing $50, and a 50% chance of gaining $100. So if I switch, on average I will earn (0.5)(-50)+(0.5)(100) = $25 more than the $100 you gave me.
    So I switch.

    ———-

    2) There are 4 possible outcomes.
    a: I take the envelope with $100 in it, keep it and have $100. (stay even)
    b: I take the envelope with $200 in it, keep it and have $200. (stay even)
    c: I take the envelope with $100 in it, switch and have $200. (gain $100)
    d: I take the envelope with $200 in it, switch and have $100. (lose $100)

    a&b are obviously equally likely.
    If I play several games of this and never switch, I will end up with (on average) $150.
    c&d are also equally likely.
    If I play several games of this and always switch, I will end up with (on average) $150.
    There is no benefit to switching in this game.
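
    A quick way to check both of those averages is a short simulation. The Python sketch below is just my own illustration (the $100/$200 figures are the same example amounts as above, not part of the puzzle); it compares always swapping with never swapping in each game:

    import random

    def game1(swap, trials=100_000):
        # Problem 1: you are handed $100; the other envelope is stated to be
        # equally likely to hold $50 or $200.
        total = 0
        for _ in range(trials):
            first = 100
            other = random.choice([50, 200])
            total += other if swap else first
        return total / trials

    def game2(swap, trials=100_000):
        # Problem 2: a fixed pair ($100, $200) sits on the table; you pick
        # one of the two envelopes at random.
        total = 0
        for _ in range(trials):
            pair = [100, 200]
            random.shuffle(pair)
            first, other = pair
            total += other if swap else first
        return total / trials

    print(game1(swap=False), game1(swap=True))   # about 100 vs 125
    print(game2(swap=False), game2(swap=True))   # about 150 vs 150

    Game 1 pays about $25 more per play if you swap; game 2 averages $150 whether you swap or not.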

  8. 8. JB Said:

    I do not see the difference between being handed an envelope and picking one myself. The odds should be the same that you initially hold the lesser (or greater) amount before the swap either way.

  9. 9. Chris Said:

    Hi DP. I won’t hold you in suspense. You didn’t just do well, you did very well indeed. Would you believe that this is a notoriously “difficult” problem to solve – particularly case 2?

    That was very well analysed. For 2, you quite correctly avoided suggesting that it was equally likely that the second envelope contained twice or a half of what the first envelope contained. By that, I mean that if the first envelope contained $100, then it is not equally likely that the other contained $50 or $200. Of course the second envelope must contain $50 or $200. Without more information (or by making an assumption), you cannot assign a probability to the contents of the second envelope.

    A big difference between the two cases (other than that you didn’t choose the envelope in case 1) is that in case 1, e.g. $50, $100 and $200 were possible amounts of money; whereas in case 2 only either $50 and $100, or $100 and $200, were possibilities.

    I’ll try that another way. Case 1 – if you opened the envelope and found $100 in it, then the second envelope will contain $50 half the time, and $200 the other half of the time – the question clearly states that that is the case. Case 2 – you open the first envelope and find $100 in it. The other envelope either contains $50 or $200. It will not sometimes be $50 and sometimes be $200 (unless I sometimes put different amounts of money in the envelopes, but then there is nothing to force me to do that with equal likelihood); it’s reasonable to say that there isn’t really a probability for the second envelope containing one amount or the other.

    Obviously, a nod to JB for case 1. But I’m still not quite sure what cazayoux’s reply means; I think he may have got one of them right!

  10. 10. dannt Said:

    I don’t care which envelope I get; they both have money.

  11. 11. Chris Said:

    Hi JB, you posted while I was writing my last comment. You’re right, it is equally likely that you have started with the higher (or lower) amount. DP hasn’t suggested otherwise. That’s why this is a good “paradox”.

  12. 12. Chris Said:

    I could have been mean when writing the problem, and said that, in both cases, you actually open the first envelope and find $100 in it.

    In case 1, I would on average be putting up $100 + ($50 + $200)/2 = $225 at each trial, and you’d be taking $125 on average if you swap, or $100 if you don’t swap. Note that in case 1, I can guarantee that the first envelope contains $100 (because I chose it, not you).

    In case 2, I would be putting up either $100 + $50 = $150 (case 2a) or $100 + $200 = $300 (case 2b) at each trial. But I decide what I actually put up; the laws of probability don’t force me to put both possible amounts into the envelopes (at all, let alone with equal likelihood). If I had actually put $100 and $200 in the envelopes every time, then you would never find $50 or $400 in any envelope. For that reason, you cannot assign a probability to the contents of the other envelope (without further information) – doing so would be tantamount to saying that you control what I put into the envelopes, and so reversing the roles of cause and effect.

    The only way that I can guarantee $100 in the first envelope, is by putting $100 into both of them.

    (Still in case 2,) I could randomly (but not necessarily with equal likelihood) be putting up $50+$100 or $100+$200, but that only obfuscates the problem – your decision to swap, or not, is still irrelevant.
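
    To see that last point numerically, here is a small sketch (the 70/30 mix of pairs is purely my own illustrative assumption). Whichever pair I happen to put up, swapping just hands you the other member of that same pair, so always swapping and never swapping average out the same:

    import random

    def expected_take(swap, p_low=0.7, trials=300_000):
        # The host uses the ($50, $100) pair with probability p_low and the
        # ($100, $200) pair otherwise; the player picks an envelope at random.
        total = 0
        for _ in range(trials):
            pair = [50, 100] if random.random() < p_low else [100, 200]
            random.shuffle(pair)
            first, other = pair
            total += other if swap else first
        return total / trials

    print(expected_take(swap=False), expected_take(swap=True))   # equal, to within noise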

  13. 13. Blake Said:

    With x being the value of money in envelope one, and y being the value in envelope 2, you have either (for both problems):

    y = 2x or 2y = x

    Set the amount of money in the first envelope at any number, say 100.

    y = 100(2) or 2y = 100
    so
    y = 200 or y = 50

    You will be gaining or losing the same -percentage-, but the actual amounts are: lose 0.5x or gain 1x. While it may be true that over an extended period of time the odds would average out, we are talking about -one- time, -one- envelope. Therefore you should always swap.

  14. 14. Chris Said:

    Hi Blake. For problem 2, if I (always) offered you two envelopes, one containing $50 and one containing $100, and you happened to pick the one containing $100, the other would (always) contain $50 and it would never contain $200.

    Your maths cannot dictate what I put into the envelopes.

    ——
    Here are the calculations for the case where the quantities in the envelopes are independent random variables:

    Problem 1: Let the first envelope contain x. Let the second envelope contain an amount z, where z = x/2 with a probability of 1/2 and z = 2x with a probability of 1/2. If you don’t swap, the expected value of your takings is E(x). If you do swap, the expected value of your takings is E(z) = E(x/2)/2 + E(2x)/2 = E(x)/4 + E(x) = 5E(x)/4.

    Problem 2: Let the envelopes contain y and 2y, where y is a random variable, independent of the selection of the first envelope. NB the selection process defines the first envelope. There is a probability of 1/2 that you will first select the envelope that contains y and a probability of 1/2 that you will first select the envelope that contains 2y. Let E(x) be the expected value of the first envelope. Then E(x) = E(y)/2 + E(2y)/2 = 3E(y)/2. If you swap, then let E(z) be the expected value of the second envelope. Then E(z) = E(y + 2y) - E(x) = 3E(y) - E(x) = 3(2/3)E(x) - E(x) = E(x). So swapping doesn’t change your expected take-away value.
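
    As a sanity check on those two expectations, here is a small numerical sketch. The particular distribution used for x (and y) is an arbitrary choice of mine; the argument above holds for any distribution:

    import random

    AMOUNTS = [20, 40, 80, 160]   # arbitrary illustrative distribution

    def problem1(trials=200_000):
        # First envelope holds x; second holds x/2 or 2x with probability 1/2 each.
        keep = swap = 0.0
        for _ in range(trials):
            x = random.choice(AMOUNTS)
            z = random.choice([x / 2, 2 * x])
            keep += x
            swap += z
        return keep / trials, swap / trials

    def problem2(trials=200_000):
        # Envelopes hold y and 2y; the first pick is one of the two at random.
        keep = swap = 0.0
        for _ in range(trials):
            y = random.choice(AMOUNTS)
            pair = [y, 2 * y]
            random.shuffle(pair)
            x, z = pair
            keep += x
            swap += z
        return keep / trials, swap / trials

    print(problem1())   # roughly (75, 93.75), i.e. E(z) = 5E(x)/4
    print(problem2())   # roughly (112.5, 112.5), i.e. E(z) = E(x)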

  15. 15. Chris Said:

    I forgot to mention that the posed problem is a Harvard grade problem.

    The saddest thing about problems like these is that once you’ve understood them, they’re boring :( But at least they’re fun while working them out. So I will continue to post them.

    See http://en.wikipedia.org/wiki/Two_envelopes_problem for more info/insanity about the posted problem (mainly for the second case).

  16. 16. Blake Said:

    Chris, you’re wrong on two points. The second problem is in fact identical to the first, the only difference being the artificial comfort that you get to choose the first envelope. The actual amount of the money does not matter at all, since it’s still an identical loss or gain, in direct proportion.

    Also true is the fact that the probability is a false comfort. If the problem were “You are given 2, or 5, or 10, etc. envelopes”, then you’d be absolutely right to take into account probability. As it is though, that’s -not- the question as written. You’re only getting one envelope, one choice.

  17. 17. Chris Said:

    Hi again Blake. Point 1: In problem 1, you could walk off with one of three amounts: x/2, x or 2x. In problem 2 you could only walk off with one of two amounts: y or 2y. Therefore the problems cannot be identical.

    Point 2: Taking your paragraph as a whole, you are saying that probability is completely irrelevant for one-offs, and therefore you should … ??? You don’t clearly say what to do, but I guess you meant always swap. Having discarded probability, you haven’t shown what “alternative math” you used to arrive at your conclusion.

    Except perhaps for a very small minority, your view is certainly not that held by the mathematical or scientific community. It is perfectly proper to consider (imagine) what would happen if you were to try many times. Even if you really do many trials (let alone only 2), you would usually only get near to the average.

    I completely reject the notion that you can’t use probability to describe one-off trials. On the contrary, it is the only objective tool available.


    For either problem, the decision might be based on other considerations. For example, it may be that you have an urgent need that would be satisfied by whatever you get in the first envelope, but not if you ended up with the lower amount. The consequence of getting less may actually reduce your overall wealth by too much. Then you’d stick.

  18. 18. Blake Said:

    There are several variables to this that are not addressed in the actual question as asked. Do I know you? Do I know if you’re more likely to give me the larger amount or the smaller amount first? Do you have some specific strategy to which you will offer first? These are things that -could- be addressed in the question, but are not, as it is written.

  19. 19. Chris Said:

    It is a fallacy that probability is only applicable when many trials are performed.

    There are several ways that probability can be defined. The one that I find quite satisfactory (at least as a starting point) is essentially the frequentist version. Crudely stated: If, on average, an event occurs once in N trials, then the probability of the event occurring is 1/N. The immediate interpretation is that the event has a (relative) likelihood of 1/N of occurring (and 1-1/N of not occurring) in any single trial. As long as you use a little common sense, that definition is probably good enough for engineering purposes. It’s easily good enough for the posted problem.

    I guess that the fallacy arises because, to define probability (in the frequentist way), it is necessary to have introduced the idea of many trials. But the conclusion of that consideration is about being able to say what the likelihood (aka relative frequency) of something happening in one trial is. So the “you only get one go, so probability is irrelevant” assertion is precisely the opposite of the correct reasoning.

  20. 20. Chris Said:

    Blake raised a very good point for problem 2. Let’s see what happens if you try to guess at what I’d give you. There is no really sensible way to precisely decide the values that you could expect in the game. Rather than getting into laborious calculations, make a minimum/maximum guess, take the average, and call it T (for threshold); then, when you inspect the envelope and find an amount X in it, swap if X ≤ T, else keep. Regardless of how you actually determine T, in the case of $100 and $200 being available, if 100 ≤ T ≤ 199, then you’d max out.

      Swap       First pick   Average
    threshold    100    200    take
    
       1- 99    keep   keep    150
     100-199    swap   keep    200
     200-  ∞    swap   swap    150
    

    Making a bad guess won’t penalise you (i.e. you won’t do any worse than by always swapping or always keeping). But if you guessed right, then lucky you. However, if I varied the quantities, the advantage of conditional swapping would be reduced.

    If you wanted, you could choose to try to do badly. If you made a good guess for the threshold, and then swapped when you should have kept and vice-versa, then you could bring your average down to 100.

    PS It may be that if you found the first envelope contained far more or far less than your guess, you might decide to re-think the value of T.
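
    Here is a minimal sketch of that threshold rule, with the $100/$200 pair fixed as in the table above (the trial thresholds are just examples):

    import random

    def average_take(threshold, pair=(100, 200), trials=200_000):
        # Swap whenever the opened envelope shows an amount <= threshold.
        total = 0
        for _ in range(trials):
            envelopes = list(pair)
            random.shuffle(envelopes)
            first, other = envelopes
            total += other if first <= threshold else first
        return total / trials

    for t in (50, 150, 250):
        print(t, round(average_take(t)))   # about 150, 200, 150, as in the table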
