Congratulations to Alex Davied, the 2008 AGEC 4213 Nerd of the Year!
Each year in my regression class we hold a forecasting contest. On the second day of class we go outside and hit softballs, measuring how far each person hits the ball. We then develop a regression model to explain why some people hit farther than others. See this earlier blog entry for a more detailed discussion of the activity.
To test whether they can really put together a good regression model, we hold the contest. Using data from their class, and from previous classes if they wish, students construct a regression model. Then we return to the field and hit balls again, measuring each distance. This time, before each person hits, students must use their regression equation to predict how far the batter will hit. Note that these are out-of-sample forecasts. The student whose regression model yields the lowest sum of squared errors across all the other batters wins. This year's winner was Alex.
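For readers curious about the mechanics of the scoring, here is a minimal sketch in Python; the names and distances are made up for illustration and are not the actual class data.

```python
# Minimal sketch of the contest scoring: each student submits out-of-sample
# predictions for the other batters, and the entry with the lowest sum of
# squared errors wins. All numbers below are hypothetical.

actual = {"Pat": 145.0, "Jordan": 170.0, "Sam": 120.0}      # measured distances (feet)
predicted = {"Pat": 150.0, "Jordan": 160.0, "Sam": 130.0}   # one student's forecasts

sse = sum((actual[name] - predicted[name]) ** 2 for name in actual)
print(f"Sum of squared errors: {sse:.1f}")
```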
She wins $30 in cash and an obnoxiously large and tacky trophy that, yes, reads "2008 AGEC 4213 Nerd of the Year."
One lesson that always emerges from the contest is the importance of parsimony: the winning student always has a simple regression model. Alex's model was distance = a0 + a1(male) + a2(experience) + a3(experience squared), where male is a dummy variable equal to one for males and experience is the number of years the student has played on a softball or baseball team. Inevitably, some students construct a horribly complex regression with all kinds of quadratic and interaction terms, much like our "locally flexible functional forms," and those regressions perform poorly. Usually the winning model is simply: distance = a0 + a1(male) + a2(experience).
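For the curious, here is a rough sketch of fitting Alex's kind of specification in Python with statsmodels; the data frame is invented for illustration and is not the class data.

```python
# Sketch of a parsimonious specification like Alex's; data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "distance":   [90, 150, 110, 175, 95, 160, 120, 140],   # feet
    "male":       [0, 1, 0, 1, 0, 1, 0, 1],                  # dummy variable
    "experience": [0, 4, 2, 8, 1, 6, 3, 2],                  # years on a team
})

# distance = a0 + a1*male + a2*experience + a3*experience^2
model = smf.ols("distance ~ male + experience + I(experience**2)", data=df).fit()
print(model.params)

# Out-of-sample forecast for a new batter, as in the contest
new_batter = pd.DataFrame({"male": [1], "experience": [5]})
print(model.predict(new_batter))
```

The point of the exercise survives the toy data: the fewer the terms, the better the model tends to forecast batters it has never seen.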
The contest is fun and educational. Moreover, it gets students outside, which is always popular with them and always good for evaluations.