Technical Leadership

The Conversation-Based Interview: How I Evaluate 7-8 Year Experienced Engineers

After 1,000+ technical interviews, I've learned traditional coding tests don't reveal what matters. Six or seven out of ten candidates tell me my interview felt like a professional discussion, not an interrogation. Here's the scenario-based framework: career journey evaluation, technical depth through projects, collaborative problem-solving, SOLID principles integration, leadership assessment, and feedback delivery that turns interviews into conversations.

Ruchit Suthar
November 22, 2025 · 20 min read

TL;DR

LeetCode and trivia don't reveal how experienced engineers actually work. Use scenario-based conversations to evaluate systems thinking, communication clarity, handling of ambiguity, and technical trade-off decisions. The best engineers explain complex problems simply and admit what they don't know.


After conducting over 1,000 technical interviews, I've learned something critical: traditional coding tests and trivia questions don't reveal what you actually need to know about experienced engineers. Six or seven out of every ten candidates tell me afterward that my interview felt less like an interrogation and more like a professional discussion. That's deliberate.

Here's why and how I've structured my interview approach for mid-senior engineers with 7-8 years of experience—and more importantly, what I'm actually evaluating beneath the surface of those conversations.

The Problem with Traditional Technical Interviews

Most technical interviews for experienced engineers are broken. They either:

  • Reduce architects to algorithm performers – LeetCode problems that test nothing about real system design
  • Turn discussions into trivia contests – "What's the difference between an abstract class and an interface?" as if they don't have Google
  • Miss the critical signals – How they think, communicate, handle ambiguity, and make trade-offs under real constraints

When you're hiring someone with 7-8 years of experience, you're not hiring a coder. You're hiring someone who should be:

  • Leading feature development end-to-end
  • Mentoring junior developers
  • Making architectural decisions that affect velocity
  • Communicating technical concepts to non-technical stakeholders
  • Owning outcomes, not just delivering tasks

Standard interviews don't evaluate any of this. Scenario-based conversations do.

My Interview Framework: The Architecture of a Conversation

The interview runs 60-90 minutes and follows this structure:

Phase 1: Setting the Foundation (10 minutes)

What I Do:

  • Introduce myself: background, current role, what I'm passionate about
  • Share details about the role we're hiring for
  • Explain what we expect from this position (technical scope, leadership responsibilities, team dynamics)
  • Make it clear: "This won't be a typical Q&A. I want to understand how you think and work."

What I'm Evaluating:

  • Their listening skills – Are they absorbing context or just waiting to talk?
  • Their questions – Do they ask clarifying questions about the role?
  • Body language/tone – Are they relaxed enough to have a real conversation?

Why It Matters: Setting the right tone transforms the power dynamic. I'm not the gatekeeper; I'm a potential future colleague trying to understand if we'd work well together.


Phase 2: Understanding Their Journey (10-15 minutes)

What I Ask:

  • "Walk me through your career journey so far."
  • Not just roles and dates—I want to understand the why behind their moves
  • What attracted them to each role?
  • What did they learn?
  • Why did they leave?

What I'm Evaluating:

Career Intentionality:

  • Are they growing deliberately or just jumping for money/titles?
  • Do they show progression in responsibility and scope?
  • Do their moves make strategic sense?

Self-Awareness:

  • Can they articulate what they learned from each experience?
  • Do they own mistakes or blame external factors?
  • Are they reflective about their growth areas?

Ambition & Fit:

  • What trajectory are they on? (IC depth vs leadership track)
  • Does their next logical step align with what we're offering?

Red Flags:

  • Job-hopping without learning
  • Blaming previous teams/managers for everything
  • No clear pattern of increasing responsibility

Green Flags:

  • "I joined X because I wanted to learn Y, and after two years I achieved that, so I moved to Z to work on..."
  • Honest about mistakes: "I took that role thinking I'd build a team, but I realized I wasn't ready yet."
  • Clear growth trajectory toward senior IC or tech lead roles

Phase 3: Technical Depth Through Projects (15-20 minutes)

What I Ask:

  • "Tell me about the most complex project you've worked on in the last year or two."
  • Then I dig deeper with follow-ups:
    • What was your role specifically?
    • What were the key technical challenges?
    • What solutions did you consider?
    • Why did you choose the approach you did?
    • What would you do differently now?

What I'm Actually Evaluating:

Technical Ownership:

  • Can they explain complex systems clearly?
  • Do they say "I" or always "we"? (Both are important, but I want to know their contribution)
  • Do they understand the full stack or just their narrow piece?

Decision-Making Under Constraints:

  • Were there trade-offs? How did they evaluate them?
  • Did they consider non-technical factors (time, team skill, cost, maintainability)?
  • Can they defend their decisions with reasoning, not just "because that's what we decided"?

Problem-Solving Approach:

  • Do they jump to solutions or first clarify the problem?
  • Did they research multiple approaches?
  • How do they handle unknowns?

Learning & Growth:

  • What did they learn from this project?
  • What would they do differently? (Self-awareness)
  • Did they share learnings with the team?

How I Cross-Question: As they explain, I ask pointed follow-ups to verify depth:

  • "You mentioned you used microservices—how did you handle data consistency across services?"
  • "That caching strategy is interesting—what was your cache invalidation approach?"
  • "You said performance was an issue—what metrics were you tracking? What was the bottleneck?"

I'm not trying to catch them in mistakes. I'm trying to understand: Do they truly understand the decisions they made, or were they just implementing someone else's design?
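The cache-invalidation follow-up above is easy to ground in code. Here's a minimal cache-aside sketch (hypothetical names; an in-memory dict stands in for Redis) showing the decisions a strong candidate should be able to defend: what to key on, when an entry goes stale, and what happens on a write.

```python
import time

class ArticleCache:
    """Minimal cache-aside sketch: read-through with a TTL, plus
    explicit invalidation on write. The dict stands in for Redis."""

    def __init__(self, load_fn, ttl_seconds=60):
        self._load = load_fn          # falls back to the source of truth
        self._ttl = ttl_seconds
        self._store = {}              # key -> (value, expires_at)

    def get(self, article_id):
        entry = self._store.get(article_id)
        if entry and entry[1] > time.monotonic():
            return entry[0]           # cache hit, still fresh
        value = self._load(article_id)  # miss or stale: reload
        self._store[article_id] = (value, time.monotonic() + self._ttl)
        return value

    def invalidate(self, article_id):
        # Called on update/delete so readers never see stale content
        # longer than one round-trip, regardless of the TTL.
        self._store.pop(article_id, None)

db = {"a1": "v1"}
cache = ArticleCache(lambda k: db[k], ttl_seconds=60)
print(cache.get("a1"))   # loaded from the source: v1
db["a1"] = "v2"
print(cache.get("a1"))   # still v1: cached copy is within its TTL
cache.invalidate("a1")
print(cache.get("a1"))   # v2 after explicit invalidation
```

A candidate who can walk through the trade-off embedded here (TTL alone versus explicit invalidation, and what each costs in staleness and write-path complexity) is demonstrating exactly the ownership I'm probing for.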


Phase 4: Scenario-Based Technical Discussion (20-25 minutes)

This is the heart of the interview. I present a real-world problem (often from our actual system) and have a collaborative discussion about how to solve it.

Example Scenario:

"We're building a content management system where authors write articles, editors review them, and readers consume them. We expect 100,000 daily active users initially, growing 10x over two years. Articles can have images, videos, and need to support multiple languages.

Walk me through how you'd architect this system."

What Happens Next:

I don't expect a perfect answer. I expect a conversation.

The Process:

  1. They ask clarifying questions (functional requirements, scale, constraints)
  2. They think out loud about different approaches
  3. I introduce complications: "What if editors need real-time collaboration?" or "What if we need full-text search?"
  4. We discuss trade-offs together
  5. I challenge their assumptions: "You mentioned using Redis for caching—what's your eviction strategy?"

What I'm Evaluating:

Requirements Gathering:

  • Do they clarify ambiguity before jumping to solutions?
  • Do they ask about non-functional requirements (scale, latency, budget)?
  • Do they understand the business context?

System Thinking:

  • Do they think in components and boundaries?
  • Do they consider data flow, not just static architecture?
  • Are they thinking about the full lifecycle (development, deployment, monitoring)?

Technical Depth:

  • Do they understand the technologies they propose?
  • Can they articulate when to use SQL vs NoSQL?
  • Do they think about caching, queuing, consistency, scalability?

Communication:

  • Can they explain complex ideas simply?
  • Do they use diagrams/examples?
  • Can they defend their choices under scrutiny?

Pragmatism vs Over-Engineering:

  • Are they building for 100K users or prematurely optimizing for 10M?
  • Do they suggest simple solutions first, then discuss how to scale?
  • Do they consider team skill and velocity, not just technical elegance?

SOLID and OOP Principles (Natural Integration):

As we discuss component design, I naturally steer toward object-oriented thinking:

  • "How would you structure the Article entity? What if we need different types of articles—blog posts, news, tutorials?" (reveals Single Responsibility, Open/Closed, Liskov Substitution)
  • "If we add a new notification channel (SMS, push, email), how would your design accommodate that?" (reveals Interface Segregation, Dependency Inversion)

I'm not asking them to recite principles. I'm seeing if they naturally apply them in design discussions.
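To make the notification-channel probe concrete, here's a minimal sketch (the names are mine, and it's one of many reasonable answers, not a prescribed one) of the shape I hope emerges from the discussion: new channels are added by implementing a small interface, not by editing existing code.

```python
from abc import ABC, abstractmethod

class NotificationChannel(ABC):
    """A small, focused interface each channel implements
    (Interface Segregation)."""

    @abstractmethod
    def send(self, recipient: str, message: str) -> str: ...

class EmailChannel(NotificationChannel):
    def send(self, recipient, message):
        return f"email to {recipient}: {message}"

class SmsChannel(NotificationChannel):
    def send(self, recipient, message):
        return f"sms to {recipient}: {message}"

class Notifier:
    """Depends on the abstraction, not on concrete channels
    (Dependency Inversion). Supporting push notifications means
    adding a class, not editing this one (Open/Closed)."""

    def __init__(self, channels: list[NotificationChannel]):
        self._channels = channels

    def notify(self, recipient: str, message: str) -> list[str]:
        return [c.send(recipient, message) for c in self._channels]

notifier = Notifier([EmailChannel(), SmsChannel()])
print(notifier.notify("alice", "article published"))
```

A candidate doesn't need to name the principles; if their design naturally lands in this shape when I add the third channel, that tells me what I need to know.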


Phase 5: Extension & Evolution (10 minutes)

What I Ask:

  • "Let's say six months later, the product team wants to add a recommendation engine. How does your architecture accommodate this?"
  • "What if we need to support 10M daily users? What breaks first? How do you scale?"

What I'm Evaluating:

Architectural Foresight:

  • Did they design with extension points?
  • Can they identify bottlenecks before they happen?
  • Do they understand the difference between scaling up and scaling out?

Refactoring Mindset:

  • Are they willing to say "my initial design wouldn't handle this—I'd need to refactor"?
  • Or do they pretend their first design handles everything? (Red flag)

Pragmatic Evolution:

  • Do they understand that architecture evolves with product needs?
  • Can they balance "build for today" with "design for tomorrow"?

Phase 6: Leadership & Collaboration (10-15 minutes)

What I Ask:

  • "Have you mentored junior developers? Tell me about that experience."
  • "Walk me through a recent code review you did. What did you focus on?"
  • "Tell me about a time you had a technical disagreement with a teammate. How did you handle it?"
  • "Have you led a feature end-to-end? What non-coding responsibilities did that involve?"

What I'm Evaluating:

Mentorship Capability:

  • Can they explain technical concepts to less experienced developers?
  • Do they have patience and empathy in teaching?
  • Have they created documentation, runbooks, or learning resources?

Code Review Philosophy:

  • Do they focus on nitpicks (formatting) or architecture (design flaws)?
  • Do they treat reviews as teaching moments?
  • Can they give constructive feedback without ego?

Conflict Resolution:

  • How do they handle technical disagreements?
  • Do they debate ideas or defend ego?
  • Can they find common ground and make decisions?

Cross-Functional Collaboration:

  • Have they worked with PMs, designers, QA?
  • Do they understand user impact, not just technical correctness?
  • Can they communicate technical constraints to non-technical stakeholders?

Leadership Potential: For someone with 7-8 years of experience, I'm evaluating: Are they ready to step into a tech lead role?

  • Have they led small initiatives or features?
  • Do they think beyond their own tasks to team outcomes?
  • Can they unblock others, not just do their own work?

Cross-Questioning:

I ask follow-ups to verify authenticity:

  • "You mentioned mentoring—what's a specific example of something you taught a junior developer recently?"
  • "In that code review, what specific feedback did you give? Why did you prioritize that?"
  • "When you disagreed with your teammate, what was the technical reason behind your stance? What was theirs?"

Vague answers like "I always try to help" or "I review code for best practices" don't tell me much. I want specific stories.


Phase 7: Closing & Feedback (5-10 minutes)

What I Do:

  • "Do you have any questions for me?"
  • I answer honestly—about the team, challenges, tech stack, culture
  • I give immediate feedback:
    • What I think they did well
    • Where I'd encourage them to grow
    • Whether I think they're a fit for this specific role and why (or why not)

Why I Do This: Candidates appreciate honesty. Even if they don't get the job, they leave with something valuable: clarity on where they stand and how to grow.

This is rare in interviews, and it's why candidates often say my interviews feel different.


What I'm NOT Doing (And Why)

I Don't Ask Trivia Questions

No:

  • "What are the SOLID principles?"
  • "Explain the difference between == and === in JavaScript."
  • "How does garbage collection work in Java?"

Why Not: These test memorization, not ability. Any senior engineer can Google this in 10 seconds. I care if they apply these concepts naturally in their work, not if they can recite definitions.

I Don't Give Algorithm Challenges

No:

  • "Reverse a linked list on the whiteboard."
  • "Find the longest palindromic substring."

Why Not: For 7-8 year experienced engineers, these tests are insulting and irrelevant. They're not joining to write LeetCode-style code. They're joining to architect features, mentor teams, and make real-world engineering decisions.

Exception: If the role involves heavy algorithmic work (search engines, compilers, data processing pipelines), I'll ask relevant algorithmic questions—but framed as real problems they'd face, not abstract puzzles.

I Don't Put Candidates on the Spot

No:

  • "Code a solution to X right now while I watch."

Why Not: Performance anxiety doesn't reveal true skill. Some of the best engineers I know think slowly and deliberately. Watching them code under pressure tells me nothing about how they work in a real team environment.


The Evaluation Framework: What I'm Scoring

After every interview, I score candidates across these dimensions:

1. Technical Depth (0-5)

  • 0-2: Surface-level knowledge, can't explain decisions deeply
  • 3: Solid fundamentals, understands trade-offs, but limited breadth
  • 4: Deep expertise in their domain, can debate technical choices confidently
  • 5: Exceptional depth, teaches me something in the interview

2. System Thinking (0-5)

  • 0-2: Thinks in isolated components, no end-to-end view
  • 3: Understands systems, but doesn't naturally think about scale/reliability
  • 4: Strong architectural thinking, considers non-functional requirements
  • 5: Thinks in systems naturally, identifies bottlenecks before they happen

3. Communication (0-5)

  • 0-2: Struggles to articulate ideas, unclear explanations
  • 3: Can explain technical concepts but needs prompting
  • 4: Clear communicator, good at simplifying complexity
  • 5: Exceptional communicator, adjusts explanation to audience

4. Problem-Solving Approach (0-5)

  • 0-2: Jumps to solutions without understanding problem
  • 3: Methodical approach but needs guidance on trade-offs
  • 4: Clarifies requirements, evaluates multiple solutions, makes informed decisions
  • 5: Structured problem-solver, thinks through edge cases and future scenarios naturally

5. Learning & Growth Mindset (0-5)

  • 0-2: Defensive, doesn't admit gaps, blames external factors
  • 3: Aware of growth areas but not actively learning
  • 4: Self-aware, actively learning, open to feedback
  • 5: Continuous learner, shares knowledge, seeks growth opportunities

6. Leadership Potential (0-5)

  • 0-2: Pure IC mindset, no interest in mentoring/leading
  • 3: Has done some mentoring, open to leadership
  • 4: Natural mentor, has led small initiatives, thinks about team outcomes
  • 5: Clear leadership skills, unblocks teams, scales impact through others

7. Cultural Fit (0-5)

  • 0-2: Misaligned values (ego-driven, blames others, not collaborative)
  • 3: Decent fit but missing some cultural elements
  • 4: Strong fit, shares our values, would thrive in our environment
  • 5: Perfect fit, would elevate our culture

Scoring Rubric

  • Total Score < 20: Not a fit
  • Total Score 20-25: Possible fit with growth areas
  • Total Score 26-30: Strong candidate, likely hire
  • Total Score > 30: Exceptional, move fast to hire

Important: A candidate needs at least a 3 in Technical Depth and System Thinking to be considered. Everything else can be coached, but foundational technical skills are harder to build.
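The rubric is simple enough to express directly. This sketch uses my own function and key names, but the bands and the hard floor on Technical Depth and System Thinking come straight from the rubric above.

```python
HARD_FLOOR = {"technical_depth", "system_thinking"}  # each must score >= 3

def evaluate(scores: dict[str, int]) -> str:
    """Apply the rubric: seven 0-5 dimensions, a hard floor on the
    two technical dimensions, then total-score bands (max 35)."""
    if any(scores[dim] < 3 for dim in HARD_FLOOR):
        return "not a fit (below technical floor)"
    total = sum(scores.values())
    if total < 20:
        return "not a fit"
    if total <= 25:
        return "possible fit with growth areas"
    if total <= 30:
        return "strong candidate, likely hire"
    return "exceptional, move fast to hire"

candidate = {
    "technical_depth": 4, "system_thinking": 4, "communication": 5,
    "problem_solving": 4, "learning_mindset": 5,
    "leadership_potential": 4, "cultural_fit": 4,
}
print(evaluate(candidate))  # total 30 -> strong candidate, likely hire
```

The floor check runs before the total, which is the point: a 5 in every soft dimension can't compensate for a 2 in System Thinking.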


Red Flags I Watch For

These are signals that make me cautious, even if the candidate seems technically competent:

Technical Red Flags

  • Can't explain their own project decisions – Suggests they weren't driving, just following
  • Over-relies on frameworks without understanding fundamentals – "I used React because everyone uses React"
  • No awareness of trade-offs – Every decision was "the best" with no downsides
  • Can't identify what they'd do differently – Lack of self-reflection

Behavioral Red Flags

  • Blames previous teams/managers consistently – Never their fault
  • Defensive when questioned – Treats follow-ups as attacks, not curiosity
  • Talks down to interviewer – Condescending or dismissive tone
  • No questions about the role – Either not interested or lacks curiosity

Collaboration Red Flags

  • Uses "I" exclusively, never "we" – Might struggle with teamwork
  • Dismissive of non-technical stakeholders – "PMs don't understand tech"
  • Has never mentored anyone – Despite 7-8 years of experience
  • No examples of code reviews or collaboration – Works in a silo

Green Flags I Look For

These signals tell me someone will thrive:

Technical Green Flags

  • Asks clarifying questions before solving – Understands requirements matter
  • Discusses trade-offs unprompted – "We chose X over Y because..."
  • Identifies what they'd do differently – Self-aware and learning-oriented
  • Teaches me something in the interview – Deep expertise in their domain

Behavioral Green Flags

  • Owns mistakes from past projects – "In hindsight, I should have..."
  • Excited about learning – Asks about our tech stack, processes, challenges
  • Gives credit to team – Balances "I" and "we" naturally
  • Thoughtful about their next move – Not just chasing titles/money

Collaboration Green Flags

  • Has mentored juniors – Thinks about growing others
  • Strong code review philosophy – Treats it as teaching, not gatekeeping
  • Handled conflicts maturely – Can disagree without being disagreeable
  • Thinks about user/business impact – Not just technical correctness

Why This Approach Works

For the Candidate

  1. Less Anxiety: Conversations feel natural, not adversarial
  2. True Expression: They show their real thinking, not test-taking ability
  3. Valuable Feedback: They leave with growth insights, even if rejected
  4. Respectful Experience: They feel valued as professionals, not commodities

For Me (the Interviewer)

  1. Better Signal: I see how they actually work, not how they perform under artificial pressure
  2. Differentiation: 7-8 year experienced candidates vary wildly; this reveals the differences
  3. Cultural Fit: I can assess if we'd enjoy working together
  4. Scalability: The framework adapts to different seniorities and specializations

For the Company

  1. Higher Quality Hires: We select for real-world competence, not test performance
  2. Better Retention: Candidates know what they're getting into; fewer surprises
  3. Positive Employer Brand: Rejected candidates still recommend us because the process was respectful
  4. Diverse Talent: We don't exclude great engineers who struggle with traditional formats

Common Scenarios & How I Handle Them

Scenario 1: Candidate Struggles to Articulate

What Happens: They seem nervous, can't explain their thoughts clearly.

How I Respond:

  • Slow down, rephrase questions
  • Use analogies or examples to help them organize thoughts
  • Ask simpler, more directed questions
  • Give them time to think

What I'm Evaluating: Is this nervousness or a fundamental communication gap? In a relaxed conversation, can they express ideas?

Decision: If they improve as they relax, that's fine. If they can't explain technical concepts even when calm, that's a concern for a role requiring collaboration.


Scenario 2: Candidate Knows More Than Me in a Specific Area

What Happens: They're an expert in something I'm less familiar with (e.g., specific ML framework, niche domain).

How I Respond:

  • I tell them: "I'm not an expert in [X], so explain it to me like I'm a smart generalist."
  • I evaluate their teaching ability instead of the technical depth I can't verify
  • I ask about trade-offs and decisions, which are universal

What I'm Evaluating: Can they simplify complexity? Do they explain or condescend? This reveals collaboration style.


Scenario 3: Candidate Is Overqualified

What Happens: They're significantly more experienced or senior than the role requires.

How I Respond:

  • I'm honest: "You seem more senior than this role. Why are you interested?"
  • I explore their motivations—are they okay with the scope?

What I'm Evaluating: Will they be fulfilled? Will they leave quickly? Are they running from something?

Decision: Sometimes overqualified candidates are great hires (seeking work-life balance, pivoting domains). But I need to understand why.


Scenario 4: Candidate Has Gaps in Expected Skills

What Happens: They lack experience in something I expected (e.g., no cloud experience, no microservices).

How I Respond:

  • I ask about their learning approach: "How would you get up to speed on [X]?"
  • I assess learning agility over current knowledge

What I'm Evaluating: Are they a fast learner? Do they have transferable skills?

Decision: For 7-8 year experienced engineers, I expect strong fundamentals. Specific technologies can be learned. But if they lack fundamental system thinking, that's harder to teach.


How to Give Feedback (The Part Most Interviewers Skip)

At the end of every interview, I spend 5 minutes giving feedback:

If They're a Strong Fit

What I Say: "I really enjoyed our conversation. Here's what stood out to me:

  • [Specific strength 1]
  • [Specific strength 2]
  • [Specific strength 3]

One area I'd encourage you to keep growing is [specific area]. Overall, I think you'd be a strong fit for this role. Next steps are [X]."

If They're Not a Fit

What I Say: "Thank you for your time. I want to be honest—I don't think this role is the right fit right now. Here's why:

  • [Specific gap 1]
  • [Specific gap 2]

Here's what I'd suggest for your growth:

  • [Actionable advice 1]
  • [Actionable advice 2]

I appreciate you taking the time to talk with me."

Why I Do This

  • Respect: They invested time; they deserve honesty
  • Growth: Even rejected candidates leave with something valuable
  • Reputation: People remember how you treat them; this builds goodwill
  • Karma: I've been on the other side; I know how frustrating silent rejections are

Adapting This Framework for Different Roles

This framework works for 7-8 year experienced engineers aiming for senior IC or early tech lead roles. Here's how I adapt it:

For More Junior Candidates (3-5 years)

  • Less emphasis on system design
  • More focus on coding fundamentals and problem-solving
  • Shorter scenarios with more guidance
  • Evaluate learning potential over current expertise

For Staff+ Engineers (10+ years)

  • Deeper architectural discussions
  • More focus on organizational impact, not just technical depth
  • Discuss how they've influenced teams, not just built systems
  • Evaluate strategic thinking and technical leadership

For Manager Roles

  • Less technical depth, more people management
  • Discuss performance management, conflict resolution, team building
  • Evaluate empathy, decision-making under ambiguity, influence

Final Thoughts: The Philosophy Behind the Process

Traditional interviews accept false negatives (rejecting people who might succeed) in order to avoid false positives, because hiring the wrong person feels worse than missing a good candidate.

My approach optimizes for true positives—finding people who will genuinely thrive—because I believe:

  1. Talent is underestimated by traditional interviews. Many great engineers freeze under pressure, struggle with whiteboard coding, or don't fit the "textbook" mold.

  2. Real work is collaborative. I want to see how someone thinks with me, not against me.

  3. Communication matters as much as code. A brilliant engineer who can't explain their decisions creates bottlenecks.

  4. Culture fit is bidirectional. They're evaluating us as much as we're evaluating them. Transparency builds trust.

After 1,000+ interviews, I've learned: The best engineers don't always have the best interview skills. But they all have curiosity, clarity of thought, and a bias toward learning.

That's what I'm looking for in every conversation.


How to Use This Guide

If you're an interviewer:

  • Use this as a template, but adapt to your style
  • The key is creating space for authentic conversation
  • Remember: you're evaluating a potential colleague, not administering an exam

If you're a candidate:

  • Understand that good interviewers want you to succeed
  • Ask clarifying questions—it shows senior thinking
  • Be honest about what you know and don't know
  • Treat it as a professional discussion, not a test

If you're a hiring manager:

  • Train your interviewers on this approach
  • Standardize evaluation criteria, not questions
  • Debrief after every interview to calibrate scoring

The goal isn't to find perfect candidates. It's to find people who will grow with your team, contribute meaningfully, and enjoy the work.

That's what conversation-based interviews reveal—and what traditional interviews miss.

Topics

technical-interviews, hiring, engineering-management, conversation-based-interviews, senior-engineers, interview-framework, technical-leadership, scenario-based-interviews

About Ruchit Suthar

Technical Leader with 15+ years of experience scaling teams and systems