Many of us have read Malcolm Gladwell's Blink, in which he postulates that chance and "gut feeling" may play a more significant role in our lives than we imagine. Nassim Nicholas Taleb's Fooled by Randomness, an older and more rigorously researched book, takes a similar position. These books make me question whether interviews are as valid as we think or even needed anymore.
I remember a recruiter I worked with who was convinced that our time spent interviewing and screening candidates was a waste. He insisted that any candidate who met a minimum set of essential criteria could potentially perform a job equally well. The only remaining need was determining how well the candidate fit with the hiring manager and, to a lesser degree, with the organization. His idea was to ask a few basic questions, conduct a short interview or two, and make a decision. He believed it would be more cost-effective to make rapid hires and allow on-the-job performance to determine who should stay.
At the time, I found this notion unsettling. I believed recruiters were responsible for ensuring quality and presenting only the best candidates to hiring managers. I also thought we could reliably choose the top performers with assessments, structured interviews, and behavioral questioning. However, the data suggests otherwise: many hires leave within the first year, and performance predictions are often famously wrong.
The Historical Context of Hiring Practices
In the early 20th century, before World War II, jobs were often filled quickly without extensive screening. Clerks ensured paperwork was completed, but managers typically selected employees based on minimal criteria. Some jobs required no interviews.
Henry Ford, for example, famously selected assembly-line workers from those waiting at the factory gates. Higher-level positions were filled after brief interviews that focused on a candidate's skills but relied heavily on soft factors such as potential, eagerness, appearance, and family background. Poor performers were dismissed.
This approach was deeply flawed, particularly in its blatant discrimination. The rise of Scientific Management, pioneered by Frederick Taylor, along with union movements and legislation, eventually led to modern HR and recruiting practices.
Nevertheless, the old system had one virtue: simplicity. It emphasized potential, attitude, and performance over rigid credentials. Many professionals, including engineers, doctors, and lawyers, were trained through apprenticeship models rather than formal education.
The Shift Toward Credentialism
After World War II, hiring practices changed significantly. The 1950s and 1960s saw increased educational requirements, years of experience thresholds, and standardized test scores as employment filters. College enrollments surged, and hiring managers used degrees as proxies for skill and intelligence. While this was seen as fairer and more defensible than previous discriminatory practices, it did not necessarily correlate with job performance.
Over the past 40 years, we have learned that hiring based solely on education and experience is ineffective. According to research from Schmidt & Hunter (1998), traditional hiring criteria like years of experience and unstructured interviews are weak predictors of job success. Recently, AI and automated assessments have revealed additional biases in conventional hiring practices.
New Work, New Skills
Today, job roles and required skills are evolving at an unprecedented rate. With the rapid emergence of new technologies, many traditional hiring methods struggle to keep pace. Recruiters and hiring managers find it increasingly difficult to determine the necessary competencies for a position.
As a result, traditional interviews may now be obsolete. AI-driven assessments can evaluate and rank candidates faster and with less bias. Interviews then serve more as a way to engage candidates than as a means to assess ability or fit.
Additionally, AI-based hiring tools, such as Harver (Pymetrics) and HireVue, use machine learning to evaluate candidates' cognitive abilities, soft skills, and potential. These tools can analyze video interviews for verbal and non-verbal cues, assess problem-solving skills through gamified testing, and compare candidates against top performers in the company.
Three Emerging Best Practices for Hiring and Selection
#1: Use AI and Automated Assessments to Evaluate Candidates Efficiently
AI-driven tools can match skills to job requirements and assess a candidate’s capability by analyzing their past work and digital footprint. Platforms such as LinkedIn Talent Insights and Eightfold AI use deep learning to predict potential based on historical data. Automated assessments help identify candidates' competencies beyond resumes, reducing reliance on educational credentials and subjective evaluations.
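To make the idea of skills-based matching concrete, here is a minimal, purely illustrative sketch in Python. It is not how LinkedIn Talent Insights or Eightfold AI work internally; those platforms rely on deep learning over rich historical data. The function name, the sample skills, and the importance weights below are all hypothetical.

```python
# Illustrative sketch only: a toy weighted skill-match score.
# Real AI hiring platforms use far richer signals and models.

def match_score(candidate_skills: set[str], job_requirements: dict[str, float]) -> float:
    """Return a 0-1 score for how much of the weighted requirement list the candidate covers."""
    total_weight = sum(job_requirements.values())
    if total_weight == 0:
        return 0.0
    candidate = {s.lower() for s in candidate_skills}
    covered = sum(
        weight
        for skill, weight in job_requirements.items()
        if skill.lower() in candidate
    )
    return covered / total_weight


if __name__ == "__main__":
    # Hypothetical job posting: required skills weighted by importance.
    requirements = {"python": 3.0, "sql": 2.0, "data visualization": 1.0}
    candidate = {"Python", "SQL", "Excel"}
    print(f"Match score: {match_score(candidate, requirements):.2f}")  # -> 0.83
```

Even this toy version shows the design choice that matters: ranking is driven by evidence of competencies rather than by degrees or years of experience, which is what makes such tools attractive as a complement to, or replacement for, resume screening.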