The best interview feedback rates each candidate against specific role requirements, cites evidence for every score, and ends with a clear hire or no-hire recommendation. Below you'll find 15+ copy-ready templates covering strong hires, rejections, on-hold decisions, panel debriefs, skills assessments, and cultural fit evaluations.
Why templates? Because most teams don't have a consistent feedback process. According to PwC's Future of Recruiting research, 78% of candidates want specific feedback when they're rejected. Yet Greenhouse's 2024 State of Job Hunting Report found 61% of job seekers are ghosted after interviews. That gap hurts your employer brand and your pipeline. Candidates who don't hear back share negative experiences with their networks, and nearly half turn down future offers from companies that handled the process poorly.
This guide walks through the research on why feedback format matters, then gives you every template you need to cover any interview outcome.
TL;DR: Interview feedback templates reduce bias, speed up hiring decisions, and protect your employer brand. Structured interviews are 34% more predictive of job performance than unstructured ones (Schmidt & Hunter). This guide includes 15+ copy-ready templates for every outcome - strong hires, rejections, on-hold decisions, panel debriefs, and skills scorecards.
Why Does Interview Feedback Quality Determine Hiring Outcomes?
Candidates asked for feedback after an interview are 126% more likely to refer other candidates to your company, according to ERE's Talent Board CandE 2024 Benchmark Research, based on 230,000+ candidate responses. That single number explains why interview feedback isn't administrative busywork - it's a pipeline multiplier.
The data gets worse when you look at what happens without feedback. Candidate resentment hit an all-time high in 2024, with 15% of North American candidates reporting strong negative feelings toward employers, according to the same Talent Board research. In tech, that number jumped to 28% - the highest in 13 years of tracking.
What does resentment look like in practice? It hits your bottom line. According to Starred's 2024 analysis, 72% of candidates share negative hiring experiences with their professional network. Between 41% and 50% of dissatisfied candidates refuse to do business with the company afterward. And 25% actively discourage others from buying the company's products.
Companies that invest in feedback see the opposite effect. CandE Award-winning companies gave 13% more feedback to finalist candidates than average, per Talent Board data. Their willingness-to-refer NPS scored 23 versus 13 for other companies - a 10-point advantage. That matters because employers hire 20-40% of their workforce from referrals.
Here's the question every recruiter should ask: if your interview process generates candidates who feel resentful, how many qualified applicants are you losing before they ever apply? The fix doesn't require overhauling your entire hiring process. It starts with having a consistent, structured way to document and communicate interview outcomes. When you pair structured feedback with AI candidate screening tools that evaluate candidates consistently, the result is a hiring pipeline where every interaction builds your employer brand instead of eroding it.
What Separates Good Interview Feedback from Bad?
The U.S. Department of Labor estimates a bad hire costs up to 30% of the employee's first-year earnings. For an $80,000 role, that's a $24,000 loss. CareerBuilder's employer survey found that nearly 75% of employers admit to having hired the wrong person. Many of those bad hires trace back to vague feedback that doesn't give hiring managers enough information to make sound decisions.
Here's what separates useful feedback from the kind that leads to costly mistakes:
| Element | Bad Feedback | Good Feedback |
|---|---|---|
| Specificity | "Seemed smart" | "Solved the system design problem using a distributed caching approach and explained tradeoffs clearly" |
| Evidence | "Good culture fit" | "Referenced three examples of cross-team collaboration, aligned with our transparency value" |
| Rating | "Thumbs up" | "Technical: 4/5, Communication: 3/5, Problem-solving: 5/5" |
| Concerns | "Not sure about this one" | "Limited Kubernetes experience, but strong Docker skills and expressed eagerness to learn" |
| Recommendation | "I liked them" | "Strong hire for senior backend. Weakest area is front-end, but the role is 90% backend" |
Bad feedback is vague, unstructured, and impossible to compare across candidates. It leaves hiring managers guessing and opens the door to unconscious bias. Good feedback is specific, evidence-based, and tied to the role's actual requirements. It gives the hiring committee concrete data to work with.
When every interviewer uses the same framework, you can compare candidates fairly across multiple conversations. This matters for legal defensibility too - structured interview feedback creates a paper trail showing each candidate was evaluated on the same criteria. Organizations using a skills-based hiring approach consistently report better outcomes because their feedback focuses on demonstrated abilities rather than gut feelings.
How Do You Write Structured Interview Feedback?
Structured interviews have a predictive validity of .51 for job performance, compared to .38 for unstructured interviews, according to Schmidt and Hunter's landmark meta-analysis of 85 years of personnel research. That means structured interviews are roughly 34% more effective at predicting whether someone will actually succeed in the role. The difference comes down to consistency.
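The "roughly 34%" figure is just the relative improvement between the two validity coefficients. A quick sanity check in Python:

```python
# Predictive validity coefficients from Schmidt & Hunter's meta-analysis
structured = 0.51
unstructured = 0.38

# Relative improvement of structured over unstructured interviews
improvement = structured / unstructured - 1
print(f"{improvement:.0%}")  # prints "34%"
```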
In a structured interview, every candidate answers the same questions and gets scored on the same criteria. The feedback follows the same format. This eliminates the "I just had a feeling" problem that derails unstructured processes.
The legal stakes are real. Williamson, Campion, and colleagues analyzed 99 employment litigation outcomes in the International Journal of Selection and Assessment. Nearly 60% of discrimination lawsuits involving interviews were based on unstructured formats. Structured interviews accounted for just 6% of cases.
Here's a five-step framework for writing feedback that's specific, defensible, and useful:
- Start with the role's requirements, not your impressions. Pull up the job requirements and score the candidate against each one. Don't start with "I liked this person." Start with "Here's how they performed against what the role demands."
- Use a consistent rating scale. A 1-5 scale works for most teams. Define what each number means: 1 = does not meet requirements, 3 = meets expectations, 5 = significantly exceeds.
- Cite specific examples. Every rating needs evidence. "Communication: 4/5" means nothing alone. "Communication: 4/5 - explained database migration clearly, asked three clarifying questions about scale" tells the hiring manager something useful.
- Note concerns with context. Don't just flag weaknesses. "No React experience" is less helpful than "No React experience, but has 4 years of Vue.js and transitioned between frameworks before. Risk is moderate."
- Make a clear recommendation. End every feedback form with one of four options: Strong Hire, Hire, No Hire, or Strong No Hire. If you can't decide, default to No Hire and explain why.
This framework works whether you're using AI recruiting tools to manage your pipeline or running a fully manual process. The templates below follow this exact structure.
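If your team stores feedback in code rather than a form, the five-step framework can be enforced programmatically. Here's a minimal sketch in Python; the class names, field names, and validation rules are illustrative assumptions, not the schema of any particular ATS:

```python
from dataclasses import dataclass

# The four allowed recommendations - no "maybe" option, per step 5
RECOMMENDATIONS = {"Strong Hire", "Hire", "No Hire", "Strong No Hire"}

@dataclass
class CompetencyRating:
    name: str       # e.g. "Communication"
    score: int      # 1-5 scale: 1 = does not meet, 3 = meets, 5 = exceeds
    evidence: str   # specific example from the interview (step 3)

    def __post_init__(self):
        if not 1 <= self.score <= 5:
            raise ValueError(f"score must be 1-5, got {self.score}")
        if not self.evidence.strip():
            raise ValueError("every rating needs supporting evidence")

@dataclass
class InterviewFeedback:
    candidate: str
    role: str
    ratings: list[CompetencyRating]   # one per role requirement (step 1)
    concerns: list[str]               # weaknesses with context (step 4)
    recommendation: str               # one of the four options (step 5)

    def __post_init__(self):
        if self.recommendation not in RECOMMENDATIONS:
            raise ValueError(f"recommendation must be one of {sorted(RECOMMENDATIONS)}")
```

The validation mirrors the framework: a rating with no evidence is rejected outright, and there is no way to submit an ambiguous recommendation.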
Templates for Positive Hiring Decisions
Only 25% of organizations feel highly confident in their ability to measure quality of hire, according to LinkedIn's 2025 Future of Recruiting report. Structured feedback for positive decisions is where that confidence starts. These templates cover the two most common positive outcomes: a strong hire recommendation and a cultural fit assessment. Copy them, customize the bracketed fields, and submit within 24 hours.
Template 1: Strong Hire Recommendation
Candidate: [Name] | Position: [Role Title] | Interview Date: [Date]
Interviewer: [Your Name] | Interview Round: [Phone Screen / Technical / Final]
Overall Recommendation: Strong Hire
Technical/Role-Specific Skills (Rating: _/5): [Candidate] demonstrated [specific competency] during [specific exercise or question]. For example, [concrete example of what they did, said, or produced]. This maps directly to the role's requirement for [specific job function].
Problem-Solving and Critical Thinking (Rating: _/5): When presented with [challenge or scenario], [Candidate] [specific approach they took]. They [positive behavior - asked clarifying questions, broke the problem into steps, identified edge cases, proposed multiple solutions].
Communication and Collaboration (Rating: _/5): [Candidate] [specific communication strength]. During [portion of interview], they [example - explained technical concepts to a non-technical interviewer, asked insightful questions about team dynamics, articulated their thought process clearly].
Concerns and Risks: [List specific concerns with context. Example: "Limited experience with [technology/skill], but demonstrated rapid learning ability based on [evidence]. Estimate 2-3 months to reach full proficiency."]
Summary: I recommend advancing [Candidate] to [next stage / offer]. Their strongest qualifications are [top 2 strengths]. Primary risk is [concern], which is mitigable because [reason].
Template 2: Cultural Fit Assessment
Candidate: [Name] | Position: [Role Title] | Interview Date: [Date]
Interviewer: [Your Name] | Focus Area: Values and Team Alignment
Company Values Alignment (Rating: _/5):
- [Value 1]: [Candidate] demonstrated this through [specific example from interview - a story they told, a decision they described, how they responded to a scenario question].
- [Value 2]: [Specific evidence or lack thereof. Be concrete - "Described choosing transparency over convenience when reporting a missed deadline to a client."]
Work Style and Preferences:
- Collaboration approach: [What did the candidate say about how they work with others? Cite specific answers.]
- Conflict resolution: [How did they describe handling disagreements? What example did they give?]
- Autonomy vs. structure: [Did they express a preference for independence or clear direction? How does this match the team's operating style?]
Team Dynamics Fit: Based on the team's current composition and working style, [Candidate] would [enhance/complement/potentially conflict with] the team because [specific reason]. [Note any diversity of thought or perspective they'd bring.]
Red Flags: [List any concerns about values misalignment. Be specific - not "bad attitude" but "expressed frustration about collaborative code reviews, which are central to our engineering process."]
Summary: Cultural alignment is [strong/moderate/weak]. [One-sentence recommendation with evidence.]
What Should Rejection Feedback Look Like?
Rejections are where most interview processes fall apart. PwC's research shows that 39% of rejected candidates specifically want to hear from someone they interviewed with - not a form email from an ATS. These templates help you document rejections with enough detail for internal records and provide candidates with useful feedback when you follow up.
Template 3: Soft Reject (Promising but Not the Right Fit Now)
Candidate: [Name] | Position: [Role Title] | Interview Date: [Date]
Interviewer: [Your Name]
Overall Recommendation: No Hire - Consider for Future Roles
Strengths Observed: [Candidate] showed strong [specific skill or quality]. Their experience with [relevant background] is genuinely impressive, particularly [specific example from interview].
Gap Analysis: The primary gap is [specific skill or experience missing]. This role requires [specific requirement], and [Candidate]'s experience in this area is [description of current level]. This isn't a reflection of their talent - it's a timing issue. With [timeframe] of [specific development], they'd be a strong contender.
Candidate-Facing Feedback (if sharing): "Thank you for your time interviewing for [Role]. We were impressed by your [specific strength]. We've decided to move forward with a candidate whose [specific qualification] more closely matches our current needs. We'd like to stay in touch for future opportunities in [area] - would you be open to that?"
Internal Notes: Add to talent pipeline for [role type/department]. Flag for re-outreach in [timeframe]. [Any other context for future recruiters.]
Template 4: Clear No-Hire
Candidate: [Name] | Position: [Role Title] | Interview Date: [Date]
Interviewer: [Your Name]
Overall Recommendation: No Hire
Assessment Against Requirements:
- [Requirement 1] (Rating: _/5): [Specific observation. What did the candidate demonstrate or fail to demonstrate?]
- [Requirement 2] (Rating: _/5): [Specific observation.]
- [Requirement 3] (Rating: _/5): [Specific observation.]
Key Gaps: [Candidate] did not meet the minimum bar for [specific requirements]. During [specific moment], [describe what happened - struggled with a foundational concept, couldn't articulate relevant experience, gave answers that contradicted their resume].
Candidate-Facing Feedback (if sharing): "Thank you for interviewing for [Role]. After careful evaluation, we've decided not to move forward. We encourage you to [specific, constructive suggestion - e.g., 'build more experience with distributed systems' or 'practice explaining your technical decisions to non-technical stakeholders']."
Internal Notes: [Any context about the rejection that would help future interviewers - was the candidate misrepresented by their resume? Was there a cultural concern?]
Pin's AI scans 850M+ profiles to find candidates who match your role requirements before they ever reach the interview stage - try it free.
Templates for On-Hold, Panel Debrief, and Skills Scorecard
According to PwC's research, 67% of candidates gave up pursuing a role because the process took too long. On-hold decisions, panel debriefs, and skills assessments are the scenarios where delays pile up fastest. These templates keep those gray-area decisions moving.
Template 5: On-Hold Decision
Candidate: [Name] | Position: [Role Title] | Interview Date: [Date]
Interviewer: [Your Name]
Overall Recommendation: Hold - Pending [Reason]
Current Assessment: [Candidate] meets [X of Y] core requirements. Their [specific strengths] are competitive with other candidates in the pipeline. However, [specific reason for hold - waiting on another finalist, budget approval pending, team restructure, need to validate a specific skill].
Hold Conditions:
- What needs to happen before a decision: [Specific trigger - "Complete final-round interviews with 2 remaining candidates" or "Confirm headcount approval from VP Engineering"]
- Timeline: Decision by [date]. If no decision by [date + 1 week], default to [action].
- Candidate communication: [Who will update the candidate, and when? What will they say?]
Risk of Waiting: [Candidate] is actively interviewing at [companies if known]. Likelihood of losing them: [low/medium/high]. If high, consider [accelerated timeline or interim offer].
Template 6: Panel Debrief Summary
Candidate: [Name] | Position: [Role Title] | Panel Date: [Date]
Panel Members: [Names and roles of each interviewer]
Debrief Facilitator: [Name]
Individual Ratings (collected before group discussion):
| Interviewer | Technical | Communication | Problem-Solving | Culture | Overall |
|---|---|---|---|---|---|
| [Name 1] | _/5 | _/5 | _/5 | _/5 | [Hire/No Hire] |
| [Name 2] | _/5 | _/5 | _/5 | _/5 | [Hire/No Hire] |
| [Name 3] | _/5 | _/5 | _/5 | _/5 | [Hire/No Hire] |
Areas of Agreement: [Where did all panelists align? What strengths or concerns did everyone identify?]
Areas of Disagreement: [Where did panelists differ? Who rated what differently and why? How was the disagreement resolved?]
Consensus Decision: [Strong Hire / Hire / No Hire / Strong No Hire]
Dissenting Opinions: [If any panelist disagrees with the consensus, document their reasoning here.]
Next Steps: [Specific actions - extend offer by [date], schedule additional interview, send rejection]
Template 7: Skills Assessment Scorecard
Candidate: [Name] | Position: [Role Title] | Assessment Type: [Take-home / Live coding / Case study / Presentation]
Evaluator: [Your Name] | Date: [Date]
| Skill Area | Weight | Rating (1-5) | Weighted Score | Evidence |
|---|---|---|---|---|
| [Core Skill 1] | 30% | _/5 | _ | [Specific example from assessment] |
| [Core Skill 2] | 25% | _/5 | _ | [Specific example] |
| [Core Skill 3] | 20% | _/5 | _ | [Specific example] |
| [Soft Skill 1] | 15% | _/5 | _ | [Specific example] |
| [Soft Skill 2] | 10% | _/5 | _ | [Specific example] |
Total Weighted Score: _/5.0
Minimum Threshold for Hire: 3.5/5.0
Recommendation: [Meets threshold / Below threshold]
Notable Observations: [Anything that the scorecard doesn't capture - how the candidate handled time pressure, whether they asked good questions, how they responded to hints or feedback during the assessment.]
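The total weighted score in this scorecard is a simple dot product of weights and ratings. Here's a sketch of the calculation, using the weights from the table above with hypothetical ratings filled in:

```python
def weighted_score(entries):
    """entries: list of (weight, rating) pairs. Weights should sum to 1.0."""
    total_weight = sum(w for w, _ in entries)
    if abs(total_weight - 1.0) > 1e-9:
        raise ValueError(f"weights must sum to 1.0, got {total_weight}")
    return sum(w * r for w, r in entries)

# Weights from the scorecard; the ratings are placeholder examples
entries = [
    (0.30, 4),  # Core Skill 1
    (0.25, 5),  # Core Skill 2
    (0.20, 3),  # Core Skill 3
    (0.15, 4),  # Soft Skill 1
    (0.10, 3),  # Soft Skill 2
]
score = weighted_score(entries)
print(f"Total weighted score: {score:.2f}/5.0")  # prints "Total weighted score: 3.95/5.0"
print("Meets threshold" if score >= 3.5 else "Below threshold")
```

A candidate can ace the heavily weighted core skills and still clear the 3.5 bar with middling soft-skill ratings, which is exactly the point of weighting: the scorecard encodes what the role actually demands.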
How Should You Deliver Interview Feedback to Candidates?
According to Greenhouse's 2024 State of Job Hunting Report, 42% of candidates said stronger recruiter communication was their top priority. Delivering feedback well is as important as writing it well. Here's how to do it without burning bridges.
Timing matters more than you think. Send internal feedback within 24 hours of the interview while details are fresh. Communicate the decision to candidates within 3-5 business days. The longer you wait, the more resentment builds - and the more likely you are to lose a strong candidate to a faster-moving competitor.
Be specific but not granular. Candidates want to know why they weren't selected. They don't need a line-by-line scorecard breakdown. Focus on the 1-2 primary reasons and frame them constructively: "We were looking for deeper experience in X" rather than "You failed the X portion."
Never compare candidates directly. Don't say "We found someone better." Say "We moved forward with a candidate whose background more closely matched our specific needs for this role." The difference is subtle but significant - one feels personal, the other feels procedural.
Offer to stay connected. For soft rejects, always ask if the candidate is open to being contacted for future roles. This keeps your talent pipeline warm. If you're using tools like Pin's AI sourcing to build candidate pipelines of 850M+ profiles, you already know how valuable it is to maintain relationships with qualified people who weren't the right fit this time. Pin's automated outreach hits a 48% response rate - but that number only matters if the candidates reaching your interview stage are the right ones.
As Miles Randle, Head of People and Talent at Flip CX, put it: "As a small people and talent team, we don't have a ton of time to spend hours sourcing and messaging. Pin has made it possible for us to focus on the people side of things!" That "people side" is exactly what good feedback delivery is about - treating candidates as humans, not tickets.
If you're spending more time sourcing and scheduling than actually talking to candidates, AI interview scheduling tools can take the administrative work off your plate so you can focus on the conversations that matter.
Frequently Asked Questions
What should interview feedback include?
Effective interview feedback includes a rating on each core competency the role requires, specific examples from the interview supporting each rating, a clear hire/no-hire recommendation, and documented concerns with context. According to Schmidt and Hunter's research, structured feedback tied to job requirements is 34% more predictive of actual performance than unstructured impressions.
How quickly should recruiters send interview feedback?
Internal feedback should be submitted within 24 hours of the interview. Candidate-facing decisions should be communicated within 3-5 business days. Greenhouse's 2024 data shows 61% of candidates are ghosted after interviews, so timely communication alone puts you ahead of most employers.
Do candidates have a legal right to interview feedback?
In the US, employers aren't legally required to provide feedback to rejected candidates. However, structured feedback protects employers - unstructured interviews account for 60% of interview-related discrimination lawsuits, compared to just 6% for structured formats (Williamson et al., 1997). Documenting consistent, criteria-based feedback is your strongest defense.
How does structured interview feedback reduce hiring bias?
Structured feedback forces interviewers to evaluate every candidate against the same criteria with specific evidence for each rating. This reduces halo effects, affinity bias, and gut-feel decisions. The result is more consistent, defensible hiring. Companies using AI recruiting tools alongside structured scorecards see additional bias reduction because the initial candidate screening happens without exposure to names, gender, or demographic data.
What's the difference between interview feedback and interview notes?
Interview notes are raw observations recorded during or immediately after the conversation - what the candidate said, how they responded. Interview feedback is a structured evaluation that scores the candidate against role requirements and makes a recommendation. Notes feed into feedback, but feedback is the document that drives hiring decisions. Only 25% of organizations feel highly confident measuring quality of hire (LinkedIn, 2025) - structured feedback closes that gap.
Better Interviews Start Before the Interview
Structured interview feedback isn't just about documentation. It's about building a hiring process where every decision is specific, evidence-based, and defensible. The templates in this guide give you a starting point for strong hires, soft rejects, on-hold decisions, panel debriefs, skills assessments, and cultural fit evaluations.
But the best feedback in the world can't fix a bad pipeline. If the candidates reaching your interview stage aren't qualified, your interviewers are wasting time writing detailed feedback about people who were never going to work out. The real efficiency gain comes from pairing structured interviews with AI-powered sourcing that surfaces candidates who actually match your requirements from the start.