Math 104


Prob 2.6: Same question as Problem 5, but with TALLAHASSEE?

TALLAHASSEE has 11 letters, with A repeated 3 times and L, S, E each repeated twice, so the number of distinct arrangements is 11!/(3!·2!·2!·2!) = 831,600.
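This kind of count (permutations of a word with repeated letters) is easy to check numerically; a minimal sketch, where count_arrangements is just an illustrative helper name:

    from collections import Counter
    from math import factorial, prod

    def count_arrangements(word):
        # n! divided by the product of the factorials of each letter's multiplicity
        repeats = prod(factorial(r) for r in Counter(word).values())
        return factorial(len(word)) // repeats

    print(count_arrangements("TALLAHASSEE"))   # 831600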

The average number of patients admitted to Mary Greeley on a weekday is: Mon 82, Tues 61, Wed 56, Thur 58, Fri 70. What are the odds for the number of patients being 60 or higher?

60 or higher happens on 3 of the 5 weekdays, and below 60 on 2 of the 5. (3/5)/(2/5) = 3/2. Write this as odds of 3:2.

In problem 2 ("A bucket contains ten balls numbered 1, . . . , 10. If you reach in and pull out a ball without looking, what is the probability that its number is a square?"), what are the odds against the number being a square?

7:3

Under the circumstances of the previous problem, find the probability that you will wind up with any kind of flush (all cards of the same suit). a. 1/47, b. 2/47, c. 3/47, d. 9/47

A flush would require getting any Club. There are 13 − 4 = 9 Clubs left in the deck and 47 cards remaining, so the answer is 9/47, answer d.

What key fact did OJ Simpson's defense lawyer omit in his reasoning 'Only 1 in 1000 abusive husbands will eventually murder their wife'? A: The ratio of abusive husbands is actually much higher. B: The rate of wives murdered is actually much lower. C: In OJ Simpson's case, Nicole Brown and her boyfriend were murdered. D: He did not properly use conditional probability. E: He was biased in favor of OJ Simpson.

A and B are up for debate and not relevant. D: his statement really is about a conditional probability, and as far as I know it was even backed up by some evidence, but this has little to do with the question. E: true, but that is a defense attorney's job! You can't blame him for that; the blame does not fall on a defense attorney for saying anything, however wrong and outrageous (except for knowingly concealing a crime, I believe). It's the judge and jury who fell for it and the prosecutor who did not counter it. So that was neither here nor there. C is right! This fact was undisputed, and it changes the odds: the article we discussed showed how it produces rather strong odds in favor of guilt.

Mike has read that vandalism in some areas of a major Iowa city is so rampant that there is a 0.2% chance of having your car keyed on any given night. Mike uses the Multiplication Rule, and keys his own car, concluding that the chance of having his car keyed twice is P(car keyed twice) = P(car keyed)·P(car keyed) = (0.002)^2 = 0.000004. "Ha! I sure showed 'em!" says Mike, thinking that he has now made it almost impossible that other people would key his car tonight (of course he just made a tiny scratch himself, in a spot where it's hard to see - he is not dumb!). What did Mike get wrong? A: He needs to use the Addition Rule. B: The square of 0.002 is 0.04. C: Of course it's near impossible that he would do it again - but someone else could! D: He needs to consider the conditional probability that it will happen twice, given that he has already done it once.

A, B, C are all wrong but in different ways. C sounds most appealing (except the premise is silly - if he's done this dumb thing once, who knows why he would not do it again). D is right. The probability that your car gets scratched twice really would be computed with the multiplication rule if the two events are independent. (This problem has a distant echo of a mistake committed by the expert witness for the prosecution in Sally Clark's trial). But what Mike is really thinking about is the probability of getting a second scratch given that he already has one scratch on his car. With that information, the conditional probability for getting a second scratch is exactly the same as the probability of getting one scratch (it does not matter that it was Mike himself who did the scratch).

You are playing Poker. Suppose you are dealt 5 cards from a standard deck of 52 cards. Calculate the number of ways of getting two pairs (e.g. two 7's and two 10's) and an Ace (the two pairs cannot be Aces).

First, there are C(12,2) = 66 ways to pick two ranks which are not Aces. Next, for each of the two chosen ranks, we can pick a pair in C(4,2) = 6 ways. Finally, we have 4 ways to pick one lonely Ace. So the total number of ways is C(12,2)·C(4,2)·C(4,2)·4 = 9504.
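As a sanity check, the same product of binomial coefficients can be computed directly, for example with Python's math.comb:

    from math import comb

    # choose 2 non-Ace ranks, one pair from each of those ranks, and one of the 4 Aces
    ways = comb(12, 2) * comb(4, 2) * comb(4, 2) * 4
    print(ways)   # 9504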

You are playing Farkle. On the initial roll of the six dice, what is the probability of getting at least one 5? Round your answer to the nearest hundredth. (A) .33 (B) .27 (C) .45 (D) .67 (E) none of the above.

Getting at least one 5 is the complement of getting no 5, so P(k ≥ 1 | p = 1/6, n = 6) = 1 − P(k = 0 | p = 1/6, n = 6) = 1 − (5/6)^6 ≈ 0.67, answer (D).

Prob 2.7: If a band of monkeys randomly shake up the letters in DREAMED, what is the probability that they will form the word EDDAMER, a misspelled brand of Dutch cheese?

It is best to use the sample space of all possible rearrangements of the 7 letters, which has 7! elements. To come out with EDDAMER, there is only one choice for the positions of A, M, and R, and the 2 E's and 2 D's can each be switched, which gives 2·2 = 4 different rearrangements that spell this 'word'. The probability is 4/7! = 1/1260.

Prob 2.3: You see three boxes. One contains two gold coins (GG), one contains one gold, one silver coin (GS), and one contains two silver coins (SS), but you do not know which is which. You pick one box at random and pull out a coin. It turns out to be a gold coin. Given that, what is the probability that the other coin from that box will also be a gold coin?

Let B be the event that the first coin is a gold coin and A be the event that the other coin will also be a gold coin. The boxes are each equally likely to be picked, and the coins within them as well, so every coin is equally likely to be picked, and P(B) = 1/2 since there are 3 gold coins out of 6. A∩B is the event that both coins are gold, which happens exactly when we picked the GG box. This has probability P(A∩B) = 1/3. So P(A|B) = P(A∩B)/P(B) = (1/3)/(1/2) = 2/3.

Problem 6. In "pick 10" Keno, the probability that you "catch 2" is a. C(20,2)/C(60,2) b. C(20,2)/C(80,2) c. C(20,2)/C(80,10) d. C(20,2)C(60,8)/C(80,10)

We pick 2 from the 20 winning numbers, which can be done in C(20,2) ways. This leaves 8 to pick from the 60 losing numbers, which can be done in C(60,8) ways. So the probability is the one in answer d.

You pick 3 balls out of the bucket in problem 2 with replacement. What is the expected value of the sum of the numbers?

One ball has an expected value of (1/10)(1+2+3+4+5+6+7+8+9+10) = 5.5. The second and third balls have the same expected value, so the expected sum is 3·5.5 = 16.5.

Prob 2.9. You roll 10 dice. What is the probability to get exactly 7 Sixes?

P(k = 7) = C(10,7) p^7 (1−p)^3 = 625/2519424 ≈ 0.000248 = 0.0248% with p = 1/6 (which you could also have looked up in the table for p = 1/6, last row, n = 10, column k = 7).

You are playing Farkle. If you roll 1 1 1 4 5 6, decide to bank 1 1 1 5, and reroll the 4 and 6, what is the expected value of your total points after that roll? (You end your turn after that roll, no matter what you get - do not consider the points you might still earn with 'Hot Dice'.)

With 1 1 1 5 banked for 1050 points, the outcomes of rerolling two dice are (x stands for any non-scoring face 2, 3, 4, or 6):

Roll:    1 1    1 5    1 x    5 5    5 x    x x
Prob:    1/36   2/36   8/36   1/36   8/36   16/36
Points:  1250   1200   1150   1150   1100   0 (Farkle)

The expected value is then E = (1/36)(1250 + 2·1200 + 8·1150 + 1·1150 + 8·1100 + 16·0) ≈ 633.33. Note that this is much less than the 1050 points you could have gotten by not re-rolling - so re-rolling is a bad choice (even though it's going to be a bit better in real life, because you would use the Hot Dice if you get them).
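The same expected value can be found by brute force over all 36 outcomes of the two rerolled dice, under the scoring described above (1050 banked points, +100 per extra 1, +50 per extra 5, and a Farkle wipes out the turn). A small sketch:

    # Enumerate the two rerolled dice; 1050 points (1 1 1 5) are already banked.
    total = 0.0
    for d1 in range(1, 7):
        for d2 in range(1, 7):
            ones = (d1 == 1) + (d2 == 1)
            fives = (d1 == 5) + (d2 == 5)
            if ones == 0 and fives == 0:
                points = 0                        # Farkle: lose everything this turn
            else:
                points = 1050 + 100 * ones + 50 * fives
            total += points / 36
    print(round(total, 2))   # 633.33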

You are playing Poker. You are dealt A♣, K♣, Q♣, J♣, 4♦. You discard and replace the 4♦. What is the probability that you will wind up with a Royal Flush? a. 1/47, b. 2/47 c. 3/47 d. 9/47

A Royal Flush would require getting the 10♣; this is the only way. There is only one such card, and 52 − 5 = 47 cards remaining, so the answer is 1/47.

Prob 2.1: You roll a fair die. Given that you roll an even number, what is the probability that you roll a 3 or less? Are these two events independent? Check using conditional probability.

So we roll 2, 4, or 6. Only one in 3 of these is "3 or less" (namely, rolling a 2). That is the probability answering the first question: 1/3. For the second question, let A be the event that we roll 3 or less, and let B be the event that we roll an even number. So A = {1,2,3}, B = {2,4,6}, and A∩B is just the event {2}. We just saw P(A|B) = P(A∩B)/P(B) = P({2})/P({2,4,6}) = 1/3. But P(A) = 3/6 = 1/2, different from P(A|B), so the events A and B are not independent.

In a class of 40, 15 students are in Band and 8 students are in Soccer; 4 students are in both Band and Soccer. What is the probability that a student chosen at random is in Band or in Soccer (the inclusive or, NOT either-or here!)?

Solution. The number of students in Band or in Soccer is 15 + 8 − 4 = 19 (when we add Band and Soccer, we are double-counting students who are in both). So the probability is 19/40.

A bucket contains ten balls numbered 1, . . . , 10. If you reach in and pull out a ball without looking, what is the probability that its number is a square?

The "without looking" is a clue that this means every ball is equally likely to be drawn. There are 3 squares in the numbers 1, 2, . . . , 10 (1,4,9). So the probability for drawing a square is 3/10

You conduct a test on 1500 subjects about the efficacy of a new drug. The scientists tell you that they used the null hypothesis of the drug being effective in 90% of patients, and obtained a p-value of 0.06. What is your best reaction? A: This drug is useless. B: This drug is useful for 6% of patients. C: This drug is very likely useful. D: Scientists, go back and use a null hypothesis with a slightly lower success rate so that we can then reject it at the 5% significance level. E: The test results are inconclusive.

The best reaction is D. A and B are nonsense. C is true: the given information says that IF the drug were effective in only 90% of patients, there would still be a 6% chance of seeing such test results or better, and that is still very good evidence. However, answer D is even better. You can re-do the calculations for a lower success rate, which will give a lower p-value (even if you just use trial and error, you can quickly get the p-value just under 5%). E would only be true if you insist on the question of whether the null hypothesis can be rejected at the 5% significance level, since 6% misses that (just barely). But E is not a good answer because it completely tosses out the evidence.

You roll a die 30 times and only roll a Six once. a) What is the p-value if your null hypothesis is that this is a fair die? b) Can you reject the null hypothesis at the significance level of 5%? At the level of 1%?

The claim would be that the die is biased against rolling a Six (the clue in the problem is 'only', but more importantly, with a fair die you would expect a Six 5 times out of 30). So if p is the probability for a Six in one roll, the claim is p < 1/6 and the null hypothesis is p ≥ 1/6. With the benefit of the doubt, we have to work with p = 1/6. (The letter p is not the same as the "p-value"!) The p-value is the probability that the observed result, or something even more extreme in favor of the claim, happened. So if k is the number of Sixes in 30 rolls, the p-value here is P(k ≤ 1 | p = 1/6) = P(k = 0 | p = 1/6) + P(k = 1 | p = 1/6). We cannot look this up in our binomial probability tables because they do not have n = 30. So we go by hand: P(k ≤ 1 | p = 1/6) = C(30,0)(1−p)^30 + C(30,1) p (1−p)^29 = (5/6)^30 + 30·(1/6)·(5/6)^29 = (5^30 + 30·5^29)/6^30 ≈ 0.029. b) Yes at the 5% level, because the p-value is less than 0.05. No at the 1% level, because it's bigger than 0.01.
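The p-value P(k ≤ 1 | p = 1/6, n = 30) is easy to verify numerically, for example:

    from math import comb

    n, p = 30, 1 / 6
    # P(k <= 1) = P(k = 0) + P(k = 1) for a Binomial(30, 1/6) count of Sixes
    p_value = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2))
    print(round(p_value, 3))   # about 0.029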

In the class of problem 9, what is the probability that a student chosen at random is EITHER in Band OR in Soccer?

The number of students either in Band or in Soccer is 19−4 = 15 (take students who are in one or the other, and subtract the number who are in both). The probability for a student to be either in Band or in Soccer is then 15/40 = 3/8.

You roll two dice. What is the probability that the first one shows an even number and the second one a 4?

The outcomes in this event are (2,4),(4,4),(6,4), so the probability for this is 3/36 = 1/12

Refer to the Rules for the Powerball Lottery in the textbook. Find the probability of matching exactly 3 numbers AND NOT matching the "powerball". Pick the closest answer (if your answer is M/N, then take N/M and round to the nearest integer - that will give you the denominator you are looking for. This technique is always useful for expressing tiny probabilities as common fractions with a numerator of 1, 'One in a . . . '). (A) 1/515,363 (B) 1/19,088 (C) 1/12,245 (D) 1/5,003 (E) 1/360

The sample space has C(59,5)·35 possible combinations. Out of these, there are C(5,3) = 10 ways to pick which 3 of the 5 winning numbers you match, C(54,2) = 1431 ways to pick your other 2 numbers from the 54 losing numbers, and 34 choices for a powerball that does not match. So the probability is 10·1431·34/(C(59,5)·35) = 486,540/175,223,510. Then we take the reciprocal and round it to the nearest integer: 175,223,510/486,540 ≈ 360, which means answer (E) is right.
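A quick numeric check of that count, using the 59-white-ball / 35-powerball format described in the problem (a sketch with Python's math.comb):

    from math import comb

    # 3 of the 5 winning numbers, 2 of the 54 losing numbers, and a non-matching powerball
    favorable = comb(5, 3) * comb(54, 2) * 34
    total = comb(59, 5) * 35
    print(round(total / favorable))   # about 360, i.e. 'one in 360'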

You are playing Poker. Suppose you are dealt 5 cards from a standard deck of 52 cards. Calculate the probability of being dealt a full house with 3 Sevens.

The sample space of all Poker hands has C(52,5) different hands. How many of these give a Full House with 3 Sevens? First, there are C(4,3) = 4 ways to pick the 3 Sevens. Next, there are 12 ranks besides the Seven, and C(4,2) = 6 ways to pick a pair of this other rank. So the probability is 4·12·6/C(52,5)=288/C(52,5). We leave the fraction in this form, because usually, we just want to compare this probability to others with the same denominator.

In "pick 10" Keno, the probability that you "catch 0" is a.C(20,10)/C(80,10) b.C(60,10)/C(80,10) c.C(20,10)/C(60,10) d.C(20,0)/C(80,10)

The total number of combinations is C(80,10). The Casino has picked 20 winning numbers. You pick all of your 10 numbers from the 60 losing numbers, so the number of ways to do that is C(60,10), and we get the probability in b.

Make a tree diagram for the situation in problem 7. Are the events of 'first one even' and 'second one is 4' independent? Verify in two ways.

The tree diagram could include all 6 possibilities for each die, and then you highlight only the ones listed in problem 7. Or you simplify matters and just give two choices for the first die (even/odd) and two for the second (4 and not-4). For the first die, both choices happen with probability 1/2. The 4 on the second die comes up with probability 1/6, and the multiplication rule gives the answer 1/12. As to independence, the simplest way is to observe that the two events are determined by rolling different dice, which are independent of each other, so the two events have to be independent as well. An alternative way is to follow the simple tree diagram: the event "even / 4" has probability 1/12, and this equals the product of the probabilities 1/2 and 1/6 of the events "first die even" and "second die 4", respectively. Therefore, the two events "even" and "4" are independent.
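The independence check can also be done by listing all 36 equally likely outcomes and comparing P(A∩B) with P(A)·P(B), for example:

    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))     # all 36 (first die, second die) rolls
    A = {o for o in outcomes if o[0] % 2 == 0}           # first die even
    B = {o for o in outcomes if o[1] == 4}               # second die shows a 4
    p_A, p_B, p_AB = len(A) / 36, len(B) / 36, len(A & B) / 36
    print(p_AB, p_A * p_B)   # both are 1/12 = 0.0833..., so the events are independent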

The average number of patients admitted to Mary Greeley on a weekday is: Mon 82, Tues 61, Wed 56, Thur 58, Fri 70. If you pick a weekday at random, what is the probability that the average number of patients admitted is 60 or higher?

There are 3 out of 5 weekdays (M, T, F) when the average number of patients is 60 or higher, so that is a probability of 3/5

Prob 2.11: What is the probability to get 3 Aces in Poker?

There are C(52,5) different hands in Poker. The number of ways to get 3 Aces is best calculated as the C(4,3) = 4 ways to pick the 3 Aces out of the 4 in the deck, times the C(48,2) = 1128 ways to pick the two remaining cards out of the 48 remaining cards which are not Aces. So the probability is P(3 Aces) = 4·1128/C(52,5) = 4512/C(52,5), and it's best left in this form (on an exam, I would give this as a multiple-choice option).

Prob 2.5: How many different "words" can you form out of HELLO? Use all the letters, but the "words" don't have to be actual words in any human language.

This is similar to Problem 4, but in any rearrangement of the letters, we can switch the two L's and get the same 'word'. This means the number of different words equals the number of rearrangements divided by 2, so it is 5!/2! = 120/2 = 60.

Prob 2.10: You toss a coin 10 times. What is the probability to get Tails at least 8 times?

This is the BPF with n = 10, p = 0.5, and k = 8, 9, 10. We could fire up the calculator, but it is much easier to look this up in the table for p = 0.5: P(k ≥ 8) = P(k = 8) + P(k = 9) + P(k = 10) ≈ 4.3945% + 0.9766% + 0.0977% = 5.4688%.

Prob 2.2: Weather forecast for Ames: Sunday: PC, Monday: PC, Tuesday: Th, Wednesday: C, Thursday: S, Friday: PC, Saturday: PC, Sunday: PC, Monday: C. Here, PC = partly cloudy, Th = Thunderstorms, C = cloudy, S = sunny. If the forecast for one day is partly cloudy, what is the most likely weather the next day according to this forecast?

We can apply the definition of conditional probability: if B is the event that the weather is partly cloudy on one day, then the weather the next day is (in order) PC, Th, PC, PC, C. There are five days in the event B, and on three of them the next day is also partly cloudy. So the conditional probabilities are P(PC|B) = 3/5, P(Th|B) = 1/5, P(C|B) = 1/5, and the weather is most likely partly cloudy again.

You are playing Blackjack. Suppose that the dealer has a 5 showing, you have a 9 and a 7, and the only other player at the table has an Ace and a 5. Calculate the probability that you go bust on the next card if you decide to get hit. Round your answer to the nearest hundredth. (A) .64 (B) .57 (C) .36 (D) .43 (E) none of the above.

We have 16 points, so we go bust if we get 6 points or more. The number of cards that make us go bust, listed by rank (X stands for the face cards J, Q, K):

Rank:      6   7   8   9   10   X    Total
How many:  4   3   4   3   4    12   30

There are 52 − 5 = 47 unseen cards, so the probability of going bust is 30/47 ≈ 0.64, answer (A).
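The same count can be reproduced by removing the five visible cards from a full deck and counting everything worth 6 or more (a small sketch; face cards are grouped with the tens, and the Ace counts as 1 here since it cannot bust a 16):

    # One entry per card, by blackjack value; J, Q, K all count as 10.
    deck = ['A', 2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10] * 4
    for seen in [5, 9, 7, 'A', 5]:        # dealer's 5, our 9 and 7, other player's Ace and 5
        deck.remove(seen)
    bust = [c for c in deck if c != 'A' and c >= 6]   # any 6 or higher busts our 16
    print(len(bust), len(deck), round(len(bust) / len(deck), 2))   # 30 47 0.64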

Prob 2.8: A franchise of MangoWasp's offers a deal on Dinner for Two where you get to pick 3 appetizers, 2 main courses, and 2 desserts. Their menu has 6 appetizers, 10 main courses, and 8 desserts. How many different choices does this deal give you? (You can use a calculator for this - do not try to make a list.)

We have C(6,3) = 20 choices for the appetizers, C(10,2) = 45 choices for the main courses, and C(8,2) = 28 choices for the desserts. Multiply them and you get the total number of different choices for the deal, 20·45·28 = 25,200.

A murder was committed by someone who wore a green feeder cap and had six toes on each foot. a) Police are looking for a suspect who wore a green feeder cap in the same area on that day. b) Police have found another suspect who has six toes on each foot and was in the area on that day. Use Bayes' formula, P(G|E)/P(G^c|E) = [P(E|G)/P(E|G^c)] × [P(G)/P(G^c)] = likelihood ratio × prior odds, to explain which of a) or b) is stronger evidence of guilt (to be clear, both are far from strong enough to convict on that basis alone).

Write F for the event that someone wears a green feeder cap, and S for the event that someone has six toes on each foot. Use these in place of the evidence E in Bayes' formula. The prior odds (in favor of guilt) are the same in both cases, obviously. As in our examples with Sally Clark and OJ Simpson, P(E|G) = 100% (the actual murderer had the green feeder cap and six toes). But P(E|G^c) is the probability that these traits are true for a randomly chosen innocent person. Obviously, P(F|G^c) is much higher than P(S|G^c): green feeder caps are more common than six-toed people, even though this condition exists (I even know someone who has six toes on each foot). Since P(E|G^c), the rate of false positives, is in the denominator of Bayes' formula, substituting the bigger probability (E = F) makes the whole likelihood ratio smaller. So the evidence of a suspect with a green feeder cap is weaker than that of a suspect with six toes. By Bayes' formula, this means the odds in favor of guilt are lower (of course, neither piece of evidence should be enough for conviction without corroborating evidence).
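To see the size of the effect, here is a small numeric sketch; the prior odds and the two base rates below are made-up values purely for illustration and are not part of the problem:

    # Hypothetical numbers, for illustration only.
    prior_odds = 1 / 1000                  # assumed prior odds of guilt
    p_cap_given_innocent = 0.05            # assumed fraction of innocent people with a green feeder cap
    p_toes_given_innocent = 0.0001         # assumed fraction of innocent people with six toes on each foot

    # P(E|G) = 1 in both cases, so the likelihood ratio is 1 / P(E|G^c).
    posterior_odds_cap = (1 / p_cap_given_innocent) * prior_odds     # 0.02, i.e. odds of 1:50
    posterior_odds_toes = (1 / p_toes_given_innocent) * prior_odds   # 10.0, i.e. odds of 10:1
    print(posterior_odds_cap, posterior_odds_toes)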

Prob 2.4: How many ways to rearrange the cards in a poker hand?

You need to know that there are 5 cards in a poker hand. The number of ways to rearrange 5 objects is 5! =120.

You are playing Blackjack. Suppose that the dealer is showing an Ace, you have a 4 and a 10 in your hand, and the only other player at the table has a 7 and a King. Calculate the expected value of a $1 insurance bet under these circumstances. Round your answer to the nearest hundredth. (Recall that the insurance bet is taken out to insure against the possibility that the dealer gets a Blackjack; that is, a 21 on her first two cards. It can only be taken out when the dealer is showing an Ace initially, and the payoff odds are 2:1.) (A) $.17 (B) −$0.11 (C) $0.14 (D) −$0.08 (E) none of the above.

You win the insurance bet if the dealer gets a 10, J, Q, or K as her face-down card. The number of cards still in the deck is 52 − 5 = 47. The number of cards that let the dealer get a Blackjack:

Rank:      10   J   Q   K   Total
How many:  3    4   4   3   14

So the probability to win is 14/47, and to lose is 1 − 14/47 = 33/47. The payout is 2 for a win and −1 for a loss, so E = 2·14/47 − 1·33/47 ≈ −0.11, and you lose 11 cents per dollar on average, answer (B).
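The final expected-value line is a one-line check:

    # Insurance bet: payout 2 with probability 14/47, payout -1 with probability 33/47
    ev = 2 * (14 / 47) - 1 * (33 / 47)
    print(round(ev, 2))   # -0.11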

Albino squirrels have two copies of a recessive gene (genotype aa). Squirrels which have one or two copies of the dominant allele (A) all look reddish-brown; you cannot tell the genotype (AA or Aa) from the way they look. But suppose we know that Squirrel Theo looks brown and has had 6 brown-looking baby squirrels with brown Squirrel Alice. And suppose Alice has had an albino child with another male. Consider this as a test of whether Theo has genotype AA or Aa. What does the evidence tell us? To elaborate: a) If the claim is that Theo has genotype AA, what is the null hypothesis? b) Given the null hypothesis, what is the probability p that a child of Theo and Alice will be an albino? c) What is the p-value of this test? d) Can we say that this test is strong evidence for Theo being of genotype AA, and reject the null hypothesis at the 5% significance level? (BTW, if Theo EVER has an albino child, then there is no further testing needed; he has to be Aa. The same is true for Alice: she must be Aa.)

a) Null hypothesis: Theo does not have genotype AA, so he has genotype Aa. b) p = 1/4 (albinos are the ones with genotype aa, and under the null hypothesis both parents are Aa). c) P(k = 0 | p = 1/4, n = 6) = (3/4)^6 ≈ 0.18. d) No, because 18% > 5%.

A microchip contains 85 tiny transistors. If even a single transistor fails, the microchip fails. Call p the probability that one single transistor fails. a) Write the probability that the microchip fails in terms of p. b) Use your calculator to check which of the following values of p are small enough so the microchip fails with probability of less than 2%. A 0.02, B 0.01, C 0.005, D 0.0002, E 0.0001.

a) P(chip fails) = 1 − (1−p)^85. b) Only D and E are small enough (check by using part a). The lesson here is that the individual components need to be much more reliable than the whole in order to make the failure probability small enough. (In real-world problems, there would be even more components and even tinier failure rates.)
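Part b) can be checked quickly for all five candidate values, for example:

    # Failure probability of the whole chip is 1 - (1 - p)^85; compare with the 2% target.
    for label, p in [("A", 0.02), ("B", 0.01), ("C", 0.005), ("D", 0.0002), ("E", 0.0001)]:
        chip_fails = 1 - (1 - p) ** 85
        print(label, round(chip_fails, 4), chip_fails < 0.02)   # only D and E stay under 0.02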

You are playing Blackjack. You have 9, 10 and the dealer shows 6, K. a) What is your probability to win? (Assume you stand and the dealer hits herself one more time.) b) Compare the answer in part a) to the answer you would get ignoring all actual cards, only using that you have 19 points. You just need to look up numbers in the table from Olofsson's book, reproduced in our textbook on page 126 and in the Blackjack script.

a) You win if the dealer gets 17 or 18 points, or goes bust with more than 21. Let us instead count the cards that result in you losing or pushing (tie): that happens if the dealer gets 19, 20, or 21 points, so if she draws a 3, 4, or 5. There are 12 such cards remaining out of 52 − 4 = 48 total, so the probability of not winning is 12/48 = 1/4. The probability of winning is 1 − 1/4 = 3/4 = 0.75. b) The probability that the dealer ends up with 17 or 18 points, or goes bust, with no other information, just from Olofsson's table on p. 126 of the book, is P(17) + P(18) + P(bust) = 0.15 + 0.15 + 0.28 = 0.58, which is very different! Of course, in real life, you should not ignore information if you have it.

You are playing Farkle, and the initial roll of six dice is 1 2 4 5 6 6. Of the following possible decisions that you can make, the one that makes no logical sense is a. bank the 1 and roll five dice b. bank the 5 and roll five dice c. bank the 1 and 5 and roll four dice d. they all make sense - it depends on how much risk you want to take

b) does not make sense, because whatever happens with the remaining five dice could happen as well, with the same probability, if you keep the 1 instead, and the 1 is worth 100 points instead of 50. It's only 50 extra points, but it is free. a) and c) are the genuine choices between more or less risk, while b. is just always worse than a.

