There is no less productive form of interview than the "resume validation interview". Resumes are practically useless in the best case. Here, you seem to propose paying a premium for candidates who can properly estimate their facility with VCS systems and then cogently discuss that estimation in an interview. That can't possibly be a skill relevant to building software on your team!
>Here, you seem to propose paying a premium for candidates who can properly estimate their facility with VCS systems and then cogently discuss that estimation in an interview. That can't possibly be a skill relevant to building software on your team!
I think that's an unfair interpretation of what StevePerkins is saying.
I don't get the impression from him that VCS knowledge in particular will make or break a candidate. He's saying that IF the candidate put it on his resume, then the candidate himself is the one who opened that door for discussion.
Out of the infinite list of programming topics to discuss, what are some options to narrow it down? Well, the candidate (of his own volition) put <Topic X> on his resume... so... let's talk about Topic X! It doesn't matter what Topic X happens to be (whether it's "svn", "parallel algorithms", "Ruby", "TDD", "cloud scaling", whatever). What matters is that the candidate is the one who thought it important enough to highlight. From there, it's reasonable to think it's something the candidate is already comfortable discussing in depth. If not, he shouldn't have put it on the resume.
Your response makes it seem like StevePerkins is playing Alex Trebek, asking "gotcha" questions on random Jeopardy topics. It's not random -- the source is the candidate's resume. It certainly seems fair and reasonable to discuss any topic the candidate put on his resume. Imo, it's also fair to add questions about things not represented on the resume (but that's a different discussion from StevePerkins's example).
Personally I think it's an excellent question. He's probing into a candidate's understanding of why you use version control systems and why different approaches might be used.
If you have a significant amount of experience then hopefully you have seen enough different situations to be aware of the advantages and disadvantages of each approach.
It's really a question to figure out whether or not you can reason about high level concepts.
Personally, I think the SVN question is a pretty good example of "can you talk intelligently about any aspect of any complex system you've ever worked with?". If a candidate can explain one, I'll recommend hiring them. The resume provides a good list of possible subjects to discuss; otherwise I just cast about randomly, chasing threads in the conversation until I find something with some depth.
I completely agree with this - asking about VCS is a cross-cutting question that nearly all candidates have experience with, unlike, say, graphics, desktop, or cloud work. So, asking the candidate to explain their VCS workflow is a way to generate a practical and relevant data point to use when evaluating multiple candidates.
For example, the one-in-a-million candidate who has used git with a CI/CD configuration (note: I'm also outside SV) versus the candidate who uses TFS ("git? You mean GitHub?" Yes, really).
I've carefully read every comment on this subthread, and here's what I think:
This question is a trifecta of ineffective candidate screening tactics:
(a) It's a technical screening question, one a strong candidate could get wrong, based on a technical aptitude that is trivial to teach on the job and thus rarely worth paying a premium for.
(b) It's a subjective technical question, one on which reasonable engineers can hold differing opinions, which means it's an outlet for the interviewer's subconscious bias. Did the interviewer just eat lunch? Candidates will do better on this question if the answer is yes.
(c) It's a tea-leaf-reading question about engineering/team management: it's superficially and overtly about technology, but subtextually about a bunch of other things. Let's hope the candidate realizes that.
A typical interview lasts about 60 minutes. Let's say it takes 10 minutes to pursue this particular line of inquiry. That's a sixth of your interview spent on a question that greatly rewards people who are good at talking about technology. Worse, if you ask that question early to a quiet but excellent candidate, you can psych them out, which means you pay for that version control question in every other question you ask.
It is totally reasonable to assess soft skills and team compatibility. But you have to design an interview that does it. You can't improv it based on candidate resumes.
Avoid questions like this.
This thread started with someone saying they're pretty good at interviews. It turns out that they try to assess soft skills with technology questions based on the luck-of-the-draw of candidate resumes. Candidates with effective resumes will have an easy time passing these interviews. From the comments on this thread, that obviously sounds reasonable to some people.
I submit: those people are not competing for talent. They may think they are, but the real contenders in this market won't be OK with letting good candidates slip past because their job-hunting skills aren't finely tuned. In fact, they'll do the opposite: those candidates are steals in this market.
The original commenter acknowledged that when he said that his experience was that the market was full of poor candidates. If that's the case, you especially can't afford to filter out effective devs because they fail to impress you when they explain how they use version control, or when their resume overstates their facility with version control.
Thanks for being open about your perspective on the interviewing process, and for taking the time to follow up on these threads.
Do you believe that technical phone pre-screens are ineffective in general? From what I've read Matasano doesn't pre-screen candidates, but provides complementary study materials instead. Is that because the subject matter is specialized? Would you approach hiring web dev roles differently?
From your remarks, it sounds like you would reject the practice of asking open-ended interview questions (e.g. describe your workflow, describe a typical day, describe a recent project) due to interference from the interviewer's bias. What, if any, value do you place on open-ended questions?
In general, my feeling is that the Matasano process (which I currently manage) works outstandingly well where there isn't a flood of qualified candidates. If you have a glut of folks who are ready to start working, you can get away with a terrible process.
We do free-form technical interviews, but only to try to detect candidates who really aren't ready for the work-sample challenges. Our in-person interviews are standardized and try to evaluate consulting/architecture skills that are hard (impossible?) to measure without people. These involve open-ended intermediary questions, but the final answers are structured.
The bottom line is this: hiring is comparative, and you cannot compare candidates using free-form interviews, because a free-form interview puts every candidate through a different exercise. To compare candidates, you must put them through similar work. Thus, free-form interviews have no evaluative value.
It's highly relevant to getting a job on a software team, but only because most teams are assembled via interviews, and verbally relating stories from your past in a face-to-face meeting is the most effective interviewing technique.
And it's relevant to a job like sales. (That's unsurprising because a job interview is actually a sales meeting.) And many management jobs do have a sales aspect -- you have to justify budgets, sell work inside the company, et cetera.
But if explaining our work to outsiders were a particularly important or routine skill for programmers, we wouldn't be so bad at it. And, on average, we are bad at it. Our actual on-the-job communication, which we practice all the time, is largely written and asynchronous, taking place on media like Slack or GitHub. It relies on plenty of job-specific shared knowledge, domain experience, and jargon, and it all happens in the shadow of a job-specific shared codebase that is supposed to speak for itself -- the whole point of software is to build something that works by itself -- but is also perpetually unfinished.
There are social skills that are important to have on a software team, but it's difficult to judge them in an interview. Interviews are staged events.
Challenging a candidate to defend a resume in an interview is like asking them to do improv comedy, and selects for many of the same factors: Verbal gracefulness, comfort in the spotlight, the ability to seamlessly change the subject, and the amount of time spent in rehearsal. Good candidates rehearse their resumes. We get to write them ourselves, after all, and with practice we learn to design them with hooks that lead into our best material.
I would say the whole point of software is to build something that people can use. If someone builds a module, but can't tell me how to use it, it's not of much use to me.
Oh, but surely they'll write documentation? Programmers, by and large, seem to suck at that too, which is why we have tech writers. But somebody still needs to explain to the tech writer what's happening! And only a few companies seem to carry tech writers for internal-only products.
To put it simply, if I ask somebody how their code works and they say "Go away for an hour while I write documents" I think I'd rather not work with them. Or worse, they ask me for help debugging but can't tell me what they're trying to do. No thank you.
Your questions are good and relevant ones, I agree. Let me rephrase them a little:
"Hello, coworker! Did you enjoy the cake we both got to eat the other day in the company cafeteria?"
"By the way, I'd like to ask you a question, and don't worry: This isn't an interview or anything, so if you can't answer me right away, or if your answer lacks grace, it's not as if you'll lose your job."
"Anyway, coworker: I found this code, which you wrote while working for my company, under the direction of my company's management, and which solves a problem that my company actually has, and which builds upon my team's platforms, languages, and coding standards, and which might even link directly to my code, and which both of us have had a moment to read and think about and which is right in front of us on this monitor. How does this code work?"
"Also, can you help me debug this code I have here? It builds atop the code I showed you last week, and is written in the same language that we all use, and attempts to solve a problem you've seen before – which is not a coincidence, because you were the person who asked me to solve the problem."
These questions are incredibly relevant to our work, but interviews can't cover them. Candidates are not our coworkers and they share none of our context. Instead, interviews are, at best, an exercise in prediction. In practice, they are often an exercise in magical thinking.
During the workday, people aren't being constantly judged. They don't implement functions on whiteboards without unit tests, solve brainteasers out loud during stand-up meetings, or write quicksort from memory. They do have to explain code to coworkers, but not to people who don't understand the problem space, the language, the background, or the constraints. These rarely exercised feats of skill are valuable -- sales is valuable -- and our gut feeling is that such feats are somehow related to relevant job skills. But gut feelings are often wrong. And not every job is in sales.