Tuesday, February 25, 2014

Standardized Tests as a Predictor of Success in College

The SAT and ACT are conventionally thought to be among the most important components of a high school student's college application. This is likely because colleges use standardized tests as something of a "great equalizer" among masses of high school students with very similar grades. The tests provide a means to differentiate among these students and also give admissions officers some idea of what a 4.0 GPA at a random school in Idaho means compared to one at an elite college preparatory school in Massachusetts. In other words, when admissions officers have such limited information about students and their schools, the SAT and ACT serve as extra sources of information on which to base their evaluation of applicants. And, as the former admissions officers at inGenius Prep have confirmed, the mantra "the more information, the better" holds true for standardized tests.

Although statistics about the strength of the SAT and ACT as predictors of college performance have been available for a while, the din of criticism of standardized tests has grown louder over the past few years. The studies citing the tests' predictive ability are all fairly flawed methodologically (a subject for another blog post) because they cannot control for myriad variables; nevertheless, most people accept (not necessarily on the basis of any empirical evidence) that higher standardized test scores reflect higher ability and predict better college performance than lower scores do.

A recent NPR article about "test-optional" schools calls this conventional wisdom into question, citing statistical evidence that, at schools with "test-optional" admissions policies, applicants who choose to submit test scores outrank those who choose not to by only 0.05 GPA points. The article treats this as evidence that the tests poorly predict college GPA. Yet the tiny sample size, self-selection bias, and a variety of other factors make this data almost meaningless aside from its rhetorical appeal. The article goes on to discuss how high school GPA is a much better predictor of a student's academic performance in college (again measured by GPA), which is certainly an intuitively appealing argument (and one that I happen to agree with). Nonetheless, the purported statistical evidence for the claim is too flimsy to hold up such a weighty proposition.