I had this math problem on a test today:

The average GPA at a school is 3.4, with a standard deviation of 0.2. What is the probability that your GPA is lower than 3.7, given that your GPA is higher than 3.3? (Assume GPA is normally distributed and can take any value from minus infinity to plus infinity.)

I found the answer like this:

P(3.3 < GPA < 3.7) / P(GPA > 3.3)

which comes out to about 0.9033.

My friend did the problem a different way, which takes a little explaining. The probability that you score higher than 3.3 is 1, because it is given that you score above that. So, logically, the probability that you score between 3.3 and 3.7 should be 1 minus the probability of a GPA above 3.7. But when you calculate this, you get about 0.933, which is the wrong answer.

My friend and I can't figure out why his way doesn't work. Any ideas?
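In case it helps to see the two calculations side by side, here is a quick sketch in Python (assuming the normal model stated in the problem; the variable names are just mine):

```python
from scipy.stats import norm

mean, sd = 3.4, 0.2

# Unconditional probabilities under the N(3.4, 0.2) model
p_below_37 = norm.cdf(3.7, loc=mean, scale=sd)       # P(GPA < 3.7) ~ 0.9332
p_above_33 = 1 - norm.cdf(3.3, loc=mean, scale=sd)   # P(GPA > 3.3) ~ 0.6915

# My way: conditional probability by the definition
p_between = p_below_37 - norm.cdf(3.3, loc=mean, scale=sd)  # P(3.3 < GPA < 3.7)
my_answer = p_between / p_above_33                   # ~ 0.9033

# My friend's way: 1 - P(GPA > 3.7), using the unconditional probability,
# which algebraically is just P(GPA < 3.7) again
friend_answer = 1 - (1 - p_below_37)                 # ~ 0.9332

print(my_answer, friend_answer)
```

Running this reproduces both numbers we got (about 0.9033 and 0.9332), so the difference between the two answers isn't an arithmetic slip on either side.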