Mikkel Gandil and Edwin Leuven
How can colleges find successful applicants? Criteria such as GPA, interviews, essays, and tests provide information about candidates, but which of them work, and why? We shed light on these questions using unique data on the universe of objective and subjective rankings of all college applicants in Denmark, their applications, admissions, and college outcomes. We implement a regression discontinuity design across multiple admission quotas to estimate how admission affects program and college completion, and investigate how this depends on the evaluative criteria used. We find that admission based on alternative criteria outperforms standard admission based on GPA. Alternative criteria are more effective at identifying good program matches, which ultimately leads to higher college completion rates because alternative evaluation is more likely to admit students who tend to struggle elsewhere. Most of the impact of alternative evaluation is due to its effect on the applicant pool (sorting). This suggests that application costs play an important role in selecting likely-successful applicants. Our analysis of the evaluation technology shows that the use of individual grades leads to the admission of applicants who are less likely to succeed. The use of tests and CVs does, however, have robust positive effects, which are explained by their impact on sorting rather than by allowing programs to select more successful students from a given pool of applicants. Essays are the only criterion that is intrinsically better at identifying applicants who will do well in the program or in college more broadly. The use of tests, interviews, and CVs does not outperform GPA in screening once we keep the application pool fixed. There is no evidence that interviews are an effective admission tool.