Comparing candidates for technical roles is no easy matter. Many companies lack standard metrics for quantifying candidates’ skills, leading recruiters to rely too heavily on GPA and technical buzzwords on resumes. This post examines GPA specifically, since many recruiters and hiring managers still treat it as a useful, objective measure for comparing candidates. In our recently published Future of Data Talent 2019 Annual Report, we measured the relevance of GPA to data talent acquisition.
GPA is often argued to be an important signal of talent, and for good reason. Conventional wisdom holds that to earn a high GPA, students must be hard-working and results-oriented. GPA may not fully capture how well a student understands a specific concept, but it does measure how a student applies conceptual understanding to in-class performance and turns knowledge into concrete results across a variety of subjects.
On the other hand, identifying ideal candidates for technical roles requires a more extensive understanding of their proficiencies around specific skills. While GPA can serve as a decent signal for general talent and work ethic, it can be misleading when comparing candidates for roles that often require more specialized technical skills.
The main problem is that GPA is not actually a standardized measure across students, departments, and universities. In our analysis, we observed significant variance in average GPA across schools.
The chart below shows the average student GPA at schools where we tested at least 30 students:
Based on our sample, the average school-level GPA was 3.66, with a standard deviation of 0.15. Two of the hardest-grading schools were Harvey Mudd and Oxford, whose students had average GPAs of 3.36 and 3.43, respectively.
This variance does not mean that a student with an average GPA from one of these two schools is less promising than students at schools with higher average GPAs. While some recruiters are aware of this difference and take each candidate’s background into account holistically, it nonetheless prevents GPA from serving as a standardized measure and, at the very least, adds another layer of nuance.
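One simple way to see the effect of school-level variance is to shift each raw GPA by its school’s deviation from the overall sample mean. The sketch below uses the Harvey Mudd and Oxford averages reported above; `School X` and its 3.80 average are hypothetical, and this adjustment is only illustrative, not a method from the report.

```python
# Hypothetical school-level GPA averages. Harvey Mudd and Oxford
# figures come from the report; "School X" is an illustrative
# grade-inflating school, not a real data point.
school_means = {"Harvey Mudd": 3.36, "Oxford": 3.43, "School X": 3.80}

OVERALL_MEAN = 3.66  # average school-level GPA in the sample


def adjusted_gpa(gpa, school):
    """Shift a raw GPA by its school's deviation from the overall mean."""
    return gpa - (school_means[school] - OVERALL_MEAN)


# The same raw 3.5 looks quite different once school context is applied:
print(round(adjusted_gpa(3.5, "Harvey Mudd"), 2))  # 3.8
print(round(adjusted_gpa(3.5, "School X"), 2))     # 3.36
```

Under this adjustment, a 3.5 at a hard-grading school outranks the identical 3.5 at a grade-inflating one, which is exactly the nuance a raw GPA comparison hides.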
Explaining the GPA Variance
The difference in average GPAs is further exacerbated by grade inflation at some schools, so much so that “on average, grade-point averages are rising at a rate of about 0.15 points every decade,” argues Stuart Rojstaczer, a professor from Duke University. The upward trend is not surprising: advocates of grade inflation claim that deflating grades hurts students’ chances of getting ahead in the job market, especially for entry-level jobs.
Admissions officers also tend to favor students from grade-inflated schools, creating an attribution error in graduate school admissions. Professors, meanwhile, are pressured by student evaluations and by the risk of putting their students at a disadvantage. Some professors try to circumvent the trend with their own methods: the Harvard professor Harvey Mansfield, for example, gives students two grades, one for their transcript and one he believes the student actually deserves. “I didn’t want my students to be punished by being the only ones to suffer for getting an accurate grade,” said Mansfield.
Since GPA still influences recruiters and hiring managers, students are also punished for challenging themselves. A student interested in data science, for instance, might take on difficult elective courses or get more involved in research projects, which could lower their GPA and hurt their job prospects even though their goal is self-improvement. The emphasis employers put on GPA thus risks creating an environment in which students are discouraged from challenging themselves whenever doing so comes at the expense of lower grades.
Another important distinction is that some schools measure GPA on a different scale. A 3.92 from MIT, where GPA is measured on a 5.0 scale, cannot be compared directly with a 3.33 from Oxford. Some recruiters may not be aware of this difference, furthering the need for standardization. Without a consistent approach to standardizing GPA across schools, the result is a zero-sum game in which grade-deflating or grade-neutral schools are disadvantaged vis-à-vis grade-inflating schools.
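To make the scale problem concrete, here is a naive linear rescaling between GPA scales. This is an assumption for illustration only: real schools publish their own conversion rules (MIT’s 5.0 scale, for instance, is not a simple multiple of a 4.0 scale), so treat the numbers as a sketch of why raw comparison misleads, not as an official conversion.

```python
def rescale(gpa, from_max, to_max=4.0):
    """Naively rescale a GPA from one maximum to another.

    Real school-specific conversions differ; this linear mapping is
    only meant to show that cross-scale comparisons need *some*
    normalization step before the numbers are comparable at all.
    """
    return gpa / from_max * to_max


# A 3.92 on a 5.0 scale maps to roughly 3.14 on a 4.0 scale --
# numerically *below* the 3.33 it might be compared against.
print(round(rescale(3.92, 5.0), 2))  # 3.14
```

The point is not that 3.14 is the “true” equivalent, but that the unconverted 3.92 looks far stronger than the 3.33 when the underlying scales make the comparison meaningless.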
What’s the Solution?
Just as standardized tests such as the GMAT serve to eliminate bias and put students on a level playing field, recruiters for technical roles have turned to third-party technical assessment providers to evaluate candidates’ data science and computing skills. Technical assessments can significantly help recruiters compare candidates for technical roles, regardless of formal educational background.
Correlation One’s Assessments are tailored to specifically map to job functions, meaning recruiters get an accurate read on a candidate’s ability to perform on-the-job. C1 Assessments also provide helpful benchmarks that, unlike GPA, are normalized across all candidates.
Technical assessments can also effectively capture and quantify deep skills acquired through research, extracurricular activities, and internship experience, which GPA fails to account for. Standardized, performance-based assessments level the playing field, democratize opportunity, and provide far more accurate signals than GPA, giving students a platform to truly showcase their most in-demand skills.
If you’re interested in learning more about our research, or the future of the data talent market, download our 2019 Annual Report here.