Last semester I wondered whether this was a rational decision by the students: not to learn much. Many professors are not clear about what will appear on the test. Consequently, if a student learns anything, the likelihood it will appear on the test is small, so why put forth the effort?

To test this, last semester I gave students a copy of the test I would give. Now they knew that if they studied a question, there was a 100% chance of a positive return. Some of the numbers in the math questions would change, and the ordering of the multiple-choice questions would change, but that was all. The result was pleasing, as shown by the following breakdown of grades.

- 51% made an A
- 20% made a B
- 14% made a C
- 16% flunked (I don't give D's)

This means that when the students took the exam, they had really learned the material. They did not just memorize answers; the test was not like that. They knew exactly what the questions were *like*, not exactly what the questions *would be*. So they studied *how to answer the questions*, and for the first time, half of my students made an A.

Yet when I repeated this on the next exam, only 16% made A's and 35% flunked. Another feature of my class is that I allow any student to retake an exam to improve their grade by one letter. After talking with the students, it turned out that they had done so well on the first test, and were so busy that week, that many decided not to study for the test at all and flunk it, planning instead to study the following week for the retest and make a C.

*There is no denouement to this story. It is interesting, though, I think. I still pursue this strategy of telling the students exactly what will be on the test. In my current class, I gave them the test in advance and let them spend two whole class periods taking the exam while I helped them.*

**I don't like that I do this, but it is the only way I can get a large majority of the class to learn most of the material.**