Technical interviews are expensive. Candidates invest hours preparing and participating. Companies invest engineering time in conducting them. Yet most interview processes are poor predictors of job performance.
We’ve hired over 50 engineers in the past three years, iterating on our process continuously. Here’s what we’ve learned about what actually works.
Problems with Common Approaches
Whiteboard Algorithm Puzzles
The classic approach: candidates solve algorithm problems on a whiteboard.
Problems:
- Tests recall, not engineering ability
- Artificial environment unlike actual work
- Favors candidates with recent CS coursework or dedicated interview prep
- Doesn’t reveal collaboration or communication skills
- Stressful in ways that don’t reflect job stress
Algorithmic thinking matters, but whiteboard puzzles are a poor way to assess it.
Brain Teasers
“How many golf balls fit in a school bus?”
Problems:
- No correlation with job performance
- Tests puzzle familiarity, not problem-solving
- Creates anxiety without useful signal
- Wastes limited interview time
Google famously abandoned brain teasers after analyzing their hiring data and finding the questions had no predictive value.
Trivia Questions
“What’s the time complexity of HashMap.get()?”
Problems:
- Tests memorization, not understanding
- Knowledge easily looked up
- Doesn’t reveal how candidates approach problems
- Creates false negatives for capable engineers
Take-Home Projects
Candidates complete projects at home, then discuss.
Problems:
- Time burden excludes candidates with responsibilities
- Unclear time expectations
- Quality varies with available time, not ability
- Easy to get external help
Take-homes can work, but only with careful design: a clear time box, tightly limited scope, and evaluation criteria shared up front.
What Actually Works
Work Sample Tests
The strongest predictor of job performance is a sample of the actual work.
Approach:
- Design problems similar to real work
- Use realistic environment (laptop, IDE, internet)
- Focus on problems you’d actually encounter
- Evaluate the process, not just the result
Example: For a backend role, a realistic problem might be:
- Given a partially implemented service, add a feature
- Debug an issue in existing code
- Review a pull request and provide feedback
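As a concrete illustration, a debugging-style work sample can be as small as handing the candidate a short function with a subtle bug and asking them to find and fix it. The exercise below is hypothetical (the function and its bug are invented for this post); shown here in its fixed form, with the original bug noted in comments:

```python
# Hypothetical work-sample prompt: "this pagination helper sometimes
# drops the last page of results - find and fix the bug."
#
# The buggy original used floor division (total_items // page_size),
# so 101 items at 10 per page reported 10 pages instead of 11.

def total_pages(total_items: int, page_size: int) -> int:
    """Return how many pages are needed to show all items."""
    if page_size <= 0:
        raise ValueError("page_size must be positive")
    # Ceiling division via negation: -(-101 // 10) == 11.
    return -(-total_items // page_size)

print(total_pages(101, 10))  # 11
print(total_pages(100, 10))  # 10
```

A problem this small still surfaces useful signal: does the candidate reproduce the failure first, reason about edge cases (zero items, invalid page size), and explain the fix clearly?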
Structured Interviews
Consistent questions across candidates, with scoring rubrics.
Benefits:
- Reduces bias
- Enables comparison across candidates
- Ensures coverage of important areas
- Provides data for process improvement
Structure:
Question: “Tell me about a time you had to debug a production issue under pressure.”
Scoring rubric:
1 - Unable to provide example or poor approach
2 - Basic example, reasonable approach
3 - Good example, methodical approach, learned from it
4 - Exceptional example, systematic debugging, improved systems after
Pair Programming
Collaborate with the candidate on a real problem.
Benefits:
- See how they work, not just the result
- Assess communication and collaboration
- Create realistic interaction
- Evaluate teaching and learning
Tips:
- Choose problems from your actual codebase
- Let candidates use their preferred tools
- Act as a collaborative partner, not examiner
- Focus on problems solvable in the time available
System Design Discussions
For senior roles, discuss how they’d design systems.
Effective approach:
- Start with open-ended problem
- Let candidate drive the discussion
- Dig into tradeoffs
- Follow their expertise
Good discussion: “Design a URL shortening service”
- Starts simple, allows depth
- Reveals how they think about scale
- Surfaces tradeoffs (consistency, availability, latency)
- Can probe specific areas based on role
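The “starts simple” quality is what makes this question work: the core of a URL shortener can be sketched in a few lines (for instance, encoding a sequential database row id in base62), and everything interesting in the discussion builds on top of that. A minimal sketch of that core, with illustrative names:

```python
import string

# Base62 alphabet: 0-9, a-z, A-Z. Encoding a sequential database id
# yields a short, unique code; the scale questions worth probing
# (id generation across shards, caching hot links, redirect latency)
# all layer on top of this simple kernel.
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def encode(n: int) -> str:
    """Encode a non-negative integer as a base62 short code."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

def decode(code: str) -> int:
    """Invert encode(): map a short code back to the row id."""
    n = 0
    for ch in code:
        n = n * 62 + ALPHABET.index(ch)
    return n

print(encode(125))  # "21"
```

A strong candidate gets past this kernel quickly and spends the time on tradeoffs: what happens when one id generator becomes a bottleneck, or when a single link gets most of the traffic.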
Technical Experience Deep-Dive
Discuss their previous work in depth.
Questions:
- “Walk me through the architecture of the system you built”
- “What would you do differently now?”
- “What was the hardest technical challenge?”
- “How did you make the key technical decisions?”
What you learn:
- Their actual role vs. credit borrowed from the team
- Depth of understanding
- Learning and reflection
- Technical decision-making
Process Design
Define What You’re Looking For
Before interviewing, define requirements:
Technical skills:
- Languages and frameworks
- System design ability
- Debugging and problem-solving
- Specific domain knowledge
Working style:
- Collaboration and communication
- Handling ambiguity
- Learning new things
- Giving and receiving feedback
Values fit:
- Alignment with team values
- Quality orientation
- Ownership mentality
Design for Coverage
Ensure interviews cover needed areas:
Phone Screen (45 min):
- Basic technical screening
- Communication ability
- Mutual interest check
On-site Interview 1 (60 min):
- Pair programming session
- Real codebase problem
On-site Interview 2 (60 min):
- System design discussion
- Architecture and tradeoffs
On-site Interview 3 (45 min):
- Technical deep-dive
- Past experience discussion
On-site Interview 4 (45 min):
- Values and working style
- Team collaboration assessment
Train Interviewers
Interviewing is a skill:
Training covers:
- Structured interviewing techniques
- Bias awareness and mitigation
- Scoring rubric usage
- Legal compliance
- Candidate experience
Shadow and reverse shadow:
- New interviewers observe experienced ones
- Experienced interviewers observe new ones
- Calibrate scoring together
Calibrate Decisions
After interviews, calibrate as a group:
Debrief process:
- Each interviewer shares their assessment independently (to prevent anchoring)
- Discuss areas of agreement and disagreement
- Identify missing information
- Make collective decision
Red flags for recalibration:
- Strong disagreement between interviewers
- Assessment based on unstated criteria
- Bias patterns emerging in data
Candidate Experience
Respect Their Time
- Clear communication about process and timeline
- Efficient scheduling
- Prompt decisions and feedback
- Reasonable time investment expectations
Set Them Up for Success
- Share what to expect before interviews
- Provide context about the role and team
- Allow questions throughout
- Create comfortable environment
Give Feedback
Many companies avoid feedback for legal reasons. But:
- Candidates appreciate it
- Builds reputation
- Helps rejected candidates improve
At minimum, provide meaningful closure.
Measuring Interview Effectiveness
Track Hiring Outcomes
Correlate interview scores with job performance:
- Do high-scoring candidates perform well?
- Do low-scoring candidates struggle?
- Which interview components predict success?
Track Process Metrics
- Time to fill positions
- Offer acceptance rate
- Candidate satisfaction
- Interviewer time investment
Iterate Based on Data
Regular review:
- Which questions differentiate candidates?
- Which interviewers provide the best signal?
- Where do candidates drop out?
- What feedback do candidates give?
Common Pitfalls
Hiring People Like Us
Homogeneous teams:
- Miss diverse perspectives
- Create groupthink
- Limit talent pool
Combat with:
- Diverse interview panels
- Structured interviews (reduce bias)
- Evaluation for complementary skills
Pattern Matching on Credentials
Credentials (school, companies) are weak predictors:
- Talented people come from everywhere
- Credentials correlate with privilege
- Past environment != future performance
Evaluate ability, not pedigree.
Optimizing for False Negatives
“Better to miss a good candidate than hire a bad one” taken too far:
- Creates impossible bar
- Rejects qualified candidates
- Slows hiring indefinitely
Find balance between quality bar and hiring needs.
One Bad Interview = Rejection
Single data points are noisy:
- Candidates have off moments
- Interviewers have off moments
- Single interviews miss complete picture
Weight aggregate signal, not single interviews.
Key Takeaways
- Work samples predict performance better than algorithm puzzles
- Structured interviews reduce bias and enable comparison
- Pair programming reveals collaboration and real working style
- System design discussions assess architecture thinking
- Define what you’re looking for before interviewing
- Train interviewers; interviewing is a skill
- Calibrate as a group to reduce individual bias
- Respect candidates’ time and experience
- Measure effectiveness and iterate based on data
- Avoid hiring for similarity; seek complementary skills
Interviewing well is hard. But it’s learnable, and the investment pays dividends in team quality.