How to "Peel the Onion" in Behavioral Interviews
There are layers to every question – and every answer! I’m not sure why an onion, of all things, was chosen for this analogy, but "peeling the onion" is a must-know technique for any interviewer. Active listening and digging deep are the critical components of this approach, and you’ll need both to be effective.
I’ll assume you’ve got the interviewing basics figured out and that you’re looking to level up your skills some more. By now, you’ve noticed that some candidates are well-prepared for their interviews and that some answers will be rehearsed. (Maybe even "too" rehearsed.) If you just took them at face value, you’d miss out on a lot of other interesting bits, such as:
- What was really at stake?
- What was the thought process?
- What other options were explored? What was missed?
- How did the candidate deliver their solution?
Similarly, some candidates (especially at junior levels) won’t have well-refined answers and they might not be great at storytelling. In these situations, your job will be to collect all the details and help them share more of the good bits.
Don’t Blindly Follow a Script
The gist of this technique is that you will practice active listening and ask follow-up questions. But there is no script to follow here – just your fine-tuned intuition.
This might be counterintuitive for some folks. Interviewers across companies, both big and small, are given various scripts and question banks that they are expected to follow. These questions usually come with a list of suggested follow-ups, and most interviewers just deliver them in a robotic way.
Similarly, candidates are taught to follow a formula when they give their answers. This is usually some variation of the STAR (situation, task, action, and result) method. When they prepare an answer in this format, it will usually satisfy an unskilled interviewer, because they will have heard all they need to tick off a bunch of boxes.
When taken to its extreme, an interview becomes robotic: an interviewer asks a question from their script, and a candidate answers from theirs. Repeat for 60 minutes. This is bad for both parties. (But more so for the company.)
Let’s see how to fix that!
"Peeling the Onion" Technique Example
This example is a composite of a couple of real interviews where I saw this theme.
Me: Tell me about a time you learnt something new and brought it to your work.
Candidate: I worked at Company XYZ in my previous role and we delivered software that operates powerful drones for agriculture data collection. Stability of our software was really important. To make sure we did a proper job here, I leveled up my testing skills and studied a lot about property-based testing. I used library ABC because we use C++, and I added these tests to our codebase. I got the team in on it and shared my knowledge with them, so now we have these tests in our codebase, ensuring the software is more stable.
So far so good – the candidate is clearly well-prepared! But, as I’m listening to them, some natural questions are popping up in my mind:
- Stability sounds important to this company, but is it even a problem? They did not mention the current state of the system was unstable.
- And if stability is lacking, is this really the best way to improve it?
- What does "more stable" even mean?
- Property-based testing is a very interesting methodology – did they really learn how and when to use it?
If I don’t follow up with anything, I will never learn the answers to these questions. But I also need to be aware of what’s behind my intuition here and what I’m really trying to uncover. In this case, it was to:
- See if they really spent time deeply learning about this methodology. After all, that was my question – I wanted an example of them learning something new.
- Check if they really understand the business problem and the benefits they delivered. Are they business-oriented or just chasing cool tech?
But you could go in many other directions from here as well. For example, I could have asked how they introduced this to the team and whether there were any concerns. But I had already gotten some good information on their teamwork from other questions, so this was not valuable enough.
The crucial part of this technique is not to identify and ask every single follow-up. Rather, it is to identify what kind of information you are missing and pick the question most likely to fill that gap.
So here’s how the follow-up looked:
Me: That’s a great example! Can you tell me more about what kind of modules or services benefited from this? What’s the code coverage you achieved?
Candidate: The code coverage with property-based tests is not that important – not everything needs to be tested. What I did spend a lot of time testing were our sensor modules and connectivity drivers. These need to adhere to a lot of different APIs and specs, supporting a huge variety of use-cases. Property-based testing really helped us expose the surface area for bugs and uncover a couple of edge-cases. I was pleasantly surprised that running over a million simulations in total took less than 2 seconds, not adding any noticeable time to our release process.
Me: Nice – can you share more about one of the bugs you discovered?
Candidate: Yes, for example there was a function that accepts several input strings coming from the camera and enum values coming from the moisture and sunlight meters. For some of these combinations, the function would call other functions that couldn’t handle those values. It wasn’t really a real-world scenario, but in case the sensors do misbehave, we can now handle it gracefully.
Me: So did this help make the system more stable? Did it fix some major bugs?
Candidate: No, we don’t have any major bugs, but this is another safety measure we now have.
Me: Do you get major bugs often? Will this help you sleep easier during on-call shifts?
Candidate: I don’t think so, things are generally stable.
Notice how I kept all my follow-ups guided toward the two things I wanted to learn about. It paid off, and we were able to confirm that:
- The candidate definitely dove deep into this testing methodology and acquired a new skill. They showed a good bit of both theory and practice. This is a good signal that they have the competencies I’m looking for and an ability to learn and adopt new skills.
- The candidate lacks a bit of business-oriented thinking. It is a sign that they need to learn more before they can transition to a senior role. (Since this was not a senior role, it was not a major issue.)
In this instance, my intuition ended up being correct about both things. It might sound like I wasted time confirming it, but that’s wrong! Collecting data like this ensures we really understand the candidate’s strengths and weaknesses and evaluate them fairly.
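As a quick aside, if you haven’t come across property-based testing before, here is a minimal sketch of the kind of test the candidate described. It is purely illustrative: the candidate worked in C++ with a library I’m not naming, so this sketch uses Python’s Hypothesis instead, and every name in it (SensorStatus, handle_reading) is invented.

```python
# Illustrative sketch only -- all names are invented; the candidate's real system was C++.
from enum import Enum

from hypothesis import given, strategies as st


class SensorStatus(Enum):
    OK = 0
    DEGRADED = 1
    OFFLINE = 2


def handle_reading(camera_frame: str, moisture: SensorStatus, sunlight: SensorStatus) -> str:
    """Toy stand-in for the real handler: it must cope with any input combination."""
    if moisture is SensorStatus.OFFLINE and sunlight is SensorStatus.OFFLINE:
        return "fallback"  # degrade gracefully instead of crashing downstream code
    return f"processed:{len(camera_frame)}"


@given(
    camera_frame=st.text(max_size=64),       # arbitrary strings from the camera
    moisture=st.sampled_from(SensorStatus),  # every moisture-meter state
    sunlight=st.sampled_from(SensorStatus),  # every sunlight-meter state
)
def test_handler_never_crashes(camera_frame, moisture, sunlight):
    # The property: for *any* combination of inputs, the handler returns a
    # non-empty result and never raises -- the "impossible combination" class
    # of bug the candidate uncovered.
    assert handle_reading(camera_frame, moisture, sunlight) != ""
```

The framework generates hundreds of input combinations per run, which is exactly what lets a test like this stumble onto edge cases nobody thought to write by hand.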
Finding Red Flags
The purpose of this technique is not to find "red flags" – it is to find a full answer (and data) to our original question. But sometimes you will come across situations where your alarm bells start ringing. While you need to follow through on that intuition, you must also ensure it is not coming from a place of bias. Here’s another example from a past interview. (Heavily modified as well.)
Me: Can you tell me about a time you went above and beyond for a customer?
Candidate: At my current employer, ABC TECH, an ERP and finance software provider, my team owns an important tax compliance module that customers use to prepare accurate tax statements each month. We had an important customer who used our software primarily because it handled all regulatory and tax requirements. Due to a major bug, they were at risk of failing to meet their obligations on time. I took it upon myself to address these problems when our support team reached out. I opened a line of communication with them via our support tool and started working on it. I worked through the night and barely managed to find a fix at 2 a.m., just in time for them to pull their tax statement in the morning, verify it, and submit it on time. The problem was due to some transactions having incorrect dates, like June 31st, which would crash the whole processing pipeline. I received praise from them, which I shared with my manager.
So what do we dig into now? Give it some thought – there are multiple threads to pull on, but we should stick with our original question and its purpose. In my case, the question was about how the candidate handles customer interactions and prioritizes their requests.
As I listened to their response, the priority sounded very clear. They nicely framed the whole story: the customer’s main reason to use this product was the tax module and if it failed for them, it would have big consequences.
You wouldn’t be wrong for trying to clarify the impact a bit more or for deciding to dive into how the communication went. But here’s where my intuition went as I listened to this: if this is so important, why did nobody else help?
Here’s the follow-up:
Me: That sounds like a long night! Did anyone help?
Candidate: No, it was just me. I know this module and this customer the best, so I took it on myself to solve it. Involving my team would have only slowed things down or made my manager panic. I knew I could fix this issue on time and that there was no need to tell him.
That’s not a great thing to hear – but, remember, our job is to "peel the onion" further and collect more information:
Me: What kind of panic do you mean?
Candidate: He starts to panic about the negative impact the problem could have and how we could lose a customer over it. He’ll then put more people on the problem, slowing us down, because nobody else has had time to become an expert on these modules.
Me: What would you do differently?
Candidate: We need a few more people for the number of modules we own; that way, more people can learn to be experts on modules other than their own.
Me: Did you discuss this with your manager?
Candidate: He agrees with me, but we do not have the budget. He appreciates the way I handle customer problems and keep them happy with the product, but it is becoming a bit too much… actually, that’s one of the reasons I am looking for another job.
This is definitely a good place to stop digging further. Let’s recap all the extra things we learned through these follow-ups:
- We got good signals that the candidate can manage customers. We’ve also learned that they’ve been successfully shouldering a lot of responsibility.
- We now know there are some areas of concern around teamwork, but we also understand the unusual circumstances. We can now ask a different question to probe deeper into this.
- We now clearly understand the candidate’s motivation.