Hiring Engineers with Ammon Bartram

by Y Combinator · 5/17/2017




Craig Cannon [00:00] – Hey, this is Craig Cannon and you’re listening to Y Combinator’s podcast. In case you don’t know about YC, here’s what we do. Twice a year, we invest 120K in a large number of startups. The companies move to Silicon Valley for three months to work with us and build their company. At the end of the three months, they demo to a room full of investors. Many of the companies have gone on to be very successful. Some of the ones you might know are Dropbox, Reddit and Airbnb. I’m excited to get our podcast rolling again, and want to thank Aaron Harris and Kat Manalac for starting the original YC podcast, Startup School Radio. Our first episode is with Ammon Bartram. Ammon has co-founded two YC companies, SocialCam and Triplebyte. Triplebyte connects software engineers with companies that are hiring. And some of the most frequently asked questions at YC are around hiring, so I thought it’d be good to have Ammon in as he thinks about it all day. Our discussion mostly focuses on hiring engineers, though much of the advice he shares could be applied to hiring for other roles. Okay, here we go. Hey, guys, today we have Ammon Bartram, co-founder of SocialCam and Triplebyte, and he’s here to talk to us about hiring. So can you just give us a quick intro about what you’ve worked on?

Ammon Bartram [01:06] – Cool, so I joined Justin.tv fresh out of school in 2009. It was just 25 folks, and I kind of went through the roller coaster of the early days of Justin.tv. There I worked mostly on the video system. And I think that was where I had my first sort of taste of hiring. At the end of that we were hiring pretty aggressively. And that’s when I first realized how much noise there is in the hiring process.

Craig Cannon [01:35] – Okay.

Ammon Bartram [01:37] – And then I was part of the spin-off of SocialCam. That was a video sharing app. I did that for about three and a half years. We were acquired by Autodesk in 2012, and I worked there until 2014. Then I took a bit of time off and started Triplebyte.

Craig Cannon [01:56] – Cool, and Triplebyte just for context, for people, can you explain?

Ammon Bartram [02:01] – Sure, yeah, so we’re a recruiting startup. So we help startups hire engineers. And so engineers apply to us and then we do sort of a full interview with them, pass the engineers who are good and match them with companies where they have a high probability of doing well.

Craig Cannon [02:18] – Cool. So people ask us a million questions about hiring, recruiting, all of it. In general, let’s assume that you’re an early-stage startup. What should companies be looking for in an engineer?

Ammon Bartram [02:34] – There’s not a crisp answer to that. The pithiest answer I can give is that you have to decide what it is you want to look for, and then you have to effectively look for that. There’s an important truth here. Most companies think that they are trying to hire good engineers. That’s what they say to themselves. What they don’t realize is that Company A’s definition of a great engineer is significantly different from Company B’s. And so what you have is a situation where everyone has this definition in mind and they’re all different. And this is the big source of noise. So, for example, one company thinks that engineers need to be very fast and productive, and be able to show that in an interview. And a different company thinks that engineers need to be very intellectual, able to deeply understand computer science topics and talk crisply about them. What happens is that all of the awesomely productive engineers, who are very practical and not necessarily strong in academics, apply to the second company and fail. And all of the very academic engineers, who could totally solve all your hard problems but maybe aren’t quite as flashy in a web environment, apply to the first company and also fail. For companies at a bit of a larger stage, I think the obvious answer is you wanna hire both of those people. And so it’s about building a process that can identify a broader range of skills. For smaller companies, you’re more in a situation where you may well actually only want one of those folks. The thing that’s holding your company back may well be productivity, and you need someone who’s gonna come in, be productive, and sort of bang out code. If that’s the case, you need to realize that it’s not important that everyone you hire be strong in computer science. And if you are a company where you’re facing security issues, where really clean, really precise code and solving hard problems are important to you even at an early stage, then it probably makes sense to have a process that skews more in that direction.

Craig Cannon [04:46] – And so do you have general advice for people that come to you guys at Triplebyte, or just you as a friend, advisor, for diagnosing what kind of engineer is a good engineer for your company. What do you tell people?

Ammon Bartram [05:01] – We don’t. So it’s funny. We’re rarely in that situation, actually. I think most people have strong preconceptions, so we’re more often in the situation of broadening people’s vision of what a skilled engineer can be. But I’m gonna circle back to the question. I think what happens a lot, and this is a mistake that’s easy to fall into, is that when people are interviewing an engineer, they tend to ask about the things that they themselves are best at. There’s this overlap between the things that you’re best at and the things that you think are most important. Every engineer thinks the things that they know are kind of the core of the discipline. So if you ask everyone you interview those questions, you bias yourself towards hiring engineers who have those skills. They join your team, they ask the people they interview the same type of questions, and so the whole organization can grow in a direction that might not make sense.

Craig Cannon [06:04] – Mm, okay.

Ammon Bartram [06:07] – It’s very complicated, ’cause there are plenty of examples of companies with defined engineering cultures that have worked out very well. So, for example, Google has, intentionally, or unintentionally grown very much in a computer science-y direction. And that’s obviously worked out very well for them.

Craig Cannon [06:20] – Yeah.

Ammon Bartram [06:24] – There are other companies in YC that… I just don’t wanna say names. But there are companies that are the complete opposite, that take a very human, productivity-friendly approach. They’re also completely successful, excellent companies. So basically there are success cases on both sides. But I think when you’re hiring your first employees, you need to just try to decide what’s holding you back.

Craig Cannon [06:52] – Yeah, and so say I’m trying to vet this pool of engineers and they all fit the certain rubric that I’ve created. But maybe one of them did a bootcamp and has some projects, and one of them went to a great school and has a CS degree. How should I think about credentials and experience?

Ammon Bartram [07:15] – Bootcamps versus CS degrees I don’t think are all that different. Now, that’s obviously a forceful statement. I think experience matters more than where you got your education. So someone coming fresh out of a CS program who doesn’t have internships is still essentially a junior engineer. Perhaps they have a more academic slant to what they’ve studied than someone out of a bootcamp, but both of those people lack real-world experience. I’d categorize both of them differently than someone who has worked in the industry for five years and can own projects. The skill that you can most easily measure in an interview is the ability to think through relatively small problems quickly. That’s really what interviews can evaluate.

Craig Cannon [08:10] – Okay.

Ammon Bartram [08:11] – The skill that you need in an employee is quite different: the ability to solve large, sprawling problems well over a long period of time. And there’s obviously a correlation there. We use interviews as a proxy for evaluating actual skill, because there is a correlation. But the correlation is not perfect. An interesting observation is that people who are fresh out of university and bootcamps, in many cases, because they’ve been practicing, are actually better at the kind of problem that gets asked in an interview than your very senior, eight-to-ten-years-of-experience, large-company engineer. What the senior engineer typically has is experience making good decisions: owning projects, gathering requirements, carrying that whole process through.

Craig Cannon [08:58] – And so, okay, how do we evaluate for that?

Ammon Bartram [09:01] – It’s super hard. Basically, what ends up happening, and this is honestly unfair, is that experience is used as a proxy for that. This is something we’re focusing on a lot at Triplebyte. And it’s very strange. If you have five years of experience, it’s just flat out easier to pass an interview. You will get a job offer after a worse performance. People assume that senior engineers, because they’re senior, should perform better in an interview. That’s actually not generally true. The bar for a job offer goes down as you have a more impressive-looking resume. And it’s not irrational on the part of the companies, it’s just the reality.

Craig Cannon [09:38] – Sure, okay, and so then when you guys are pre-screening these people for Triplebyte, for example, what are you looking at? What are you having them do?

Ammon Bartram [09:46] – The approach that we take is to evaluate as much as we can in isolation, and to be aware of what we’re evaluating. So we explicitly evaluate programming, just programming productivity. Given a relatively spec’d-out problem, for example a description of an algorithm to solve a problem, it’s not super math-y, it’s just the set of steps they have to do. Can the candidate take that and render it into working, well-structured code? And, interestingly, junior folks actually often do better than senior folks at that sort of problem. We then separately do an evaluation of academic computer science skills. Is the engineer knowledgeable about computer science and about that approach to problem solving? Then, one thing that we took from Stripe, actually, is we do a debugging section. We give the candidate a large code base that has some bugs, and we ask them to take some time, dig into the code base, and try to find and fix those bugs. I think that does a great job of solving some of these problems, basically because it’s a skill that comes from experience and is often missed by more traditional interviews. And then, finally, we do a system design section. Here are some requirements, design a production web system to satisfy them. Okay, now I’m gonna change the requirements. How do you adapt your design? How do you talk about trade-offs?

Craig Cannon [11:22] – And all of that’s done remotely because the person’s at home, right?

Ammon Bartram [11:25] – Yes, we do this all over Google Hangouts.

Craig Cannon [11:27] – Okay, and so what’s a reasonable amount of time for someone to just like go through one of these exercises? Are they all widely different or–

Ammon Bartram [11:35] – To go through our exercises?

Craig Cannon [11:36] – Mm-hm.

Ammon Bartram [11:37] – Our interviews are about two hours in length. And so we spend about 30 minutes on each section.

Craig Cannon [11:42] – Okay, cool. And you find that’s a very strong dataset in terms of correlating how successful they are?

Ammon Bartram [11:48] – Yeah, well, we’ve done about 2,000 of these interviews over the last year and a half. And so we’ve been able to drill in on the parts that are most predictive, and cut time off and shorten it. I think if you’re starting from scratch, you would probably need about twice that amount of time to get through all that stuff.

Craig Cannon [12:01] – So having gone through all these interviews at this point, was there anything that you thought was really important in the beginning? Or something that’s very common in the valley that many people think is important that isn’t really important?

Ammon Bartram [12:13] – Too much reliance on a single question. So the classic interview format in tech companies is a number of 45-minute to one-hour sessions. And often engineers pick the questions themselves. And they’re usually little nuggets of… they’re sometimes pejoratively called brain teasers. Almost no one actually asks brain teasers. They’re more like legitimate programming problems, but they are little nuggets of difficulty. Like, given a string of parentheses, how do you determine if they all match? Okay, given multiple types of brackets, how do you determine if they all match? That’s a classic interview question. And it ends up there’s just a huge amount of noise. If you take a bunch of candidates and, in a controlled setting, have them all answer three or four of these questions, you’ll see there’s some correlation, but there’s way less correlation than you would think. And companies, and I believed this in my previous roles too, you have this type of question, you ask someone the question, you see this variation, and you assume, oh, the people who answer this question well must be smarter and better programmers, and the people who get totally tripped up must not be. Then when you actually inspect that, you see that there’s this huge amount of noise. And we have this pretty incredible lens on this, because we evaluate engineers pretty rigorously, then send them out to multiple companies, see what happens, and get detailed feedback.
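As an aside, the bracket-matching question described here is classically solved with a stack. Here’s a minimal sketch in Python (the function name and sample inputs are illustrative, not from the conversation):

```python
def is_balanced(s):
    """Return True if every bracket in s is matched, handling multiple bracket types."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)  # remember the open bracket
        elif ch in pairs:
            # a closing bracket must match the most recent unclosed opener
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack  # leftover openers mean an unmatched bracket
```

For example, `is_balanced("([]{})")` holds while `is_balanced("([)]")` does not, since the closer order matters.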

Craig Cannon [13:44] – And so, yeah, is that feedback from the actual interview process, or then once they’re placed you actually know as well how they’re doing.

Ammon Bartram [13:50] – Both.

Craig Cannon [13:51] – Okay.

Ammon Bartram [13:52] – But I’m kind of talking about the interview process. So we screen engineers, we send them to companies, they do the interview there, and we get feedback. And it’s just pretty incredible how much disagreement there is. A candidate who does really well at one company, who gets told, “Oh, this is one of the best people we’ve interviewed in months, this is our rock star,” goes on to fail an interview somewhere else. And a pretty interesting stat we had: I compared the rate of agreement between interviewers at companies with a dataset of users reviewing movies online. Right? And the inter-rater agreement was basically equivalent. So knowing that an engineer did well at one company gives you about as much information about whether that engineer is skilled as knowing whether The New York Times film critic rated 12 Years a Slave as excellent or terrible.

Craig Cannon [14:51] – So, okay, maybe you don’t have an answer to this, but say I’m really good at brain teasers, where should I interview?

Ammon Bartram [15:00] – Larger companies. Probably larger companies. And this makes a certain amount of… okay, this is all very complicated, but it makes a certain amount of sense. Brain teasers always introduce noise, but we found that bigger companies rely more on that style of interview. And they do that partly for some rational reasons. Bigger companies care more about measuring your innate ability, and less about measuring whether you can jump into their particular code base and be productive on day one. It’s way more likely that a smaller company’s gonna say, “We’re using Ruby on Rails, we need a very productive Rails developer. Come take this interview and we’re gonna observe how well you can work on our actual code base.” Whereas the big companies, Facebook, Dropbox, Apple, Google, are more likely to say, “We care about smart people.” Within the confines of the noise of the interview process, they’re trying to identify intelligence rather than specific experience.

Craig Cannon [16:07] – I mean, they also have capacity, time, to train people.

Ammon Bartram [16:08] – Yeah, precisely.

Craig Cannon [16:10] – Whereas a small company, no way. Okay, and then one of the questions: what about skills that people don’t think are correlated, but that are strongly correlated with being a successful engineer?

Ammon Bartram [16:20] – Relatively easy problem solving. So we have found that asking pretty easy interview questions is often more predictive than asking harder interview questions. To break this down, there are two sources of signal from asking a question. You can get signal on whether the candidate comes up with the right answer, and you can get signal on whether they struggle, how easy or hard it is for them to solve the problem. And we’ve scored these things for a bunch of questions. We’ve done this for, again, thousands of candidates, and we looked at how much the score on one individual question correlates with how the candidate does on the job. And what we found is that, as you’d expect, getting a question right is correlated with–

Craig Cannon [17:12] – Is good.

Ammon Bartram [17:14] – And as you would expect, being able to answer a question easily is correlated with being a good engineer. But there are also, of course, false negatives. There are great engineers who fail interview questions, and there are great engineers who struggle with questions. If you look at the actual predictive ability, rather than just the correlation of getting the question right, the sweet spot is actually far lower on the difficulty scale than most people intuitively think.

Craig Cannon [17:43] – And so, yeah, can you give a couple examples of what those easier questions might be?

Ammon Bartram [17:47] – Yeah, sure. Just saying, “We want you to create a checkers game.” No logic or anything complicated, just a class that has a board, has a grid, it’s got pieces, pieces move around. This is really a pretty mundane, straightforward task. How well candidates do at that ends up being a more stable predictor of engineering skill than, “I’m gonna give you a sentence that consists of a list of words, all glommed together, and you have to find the optimal way, given a dictionary, to break this apart into words.” The second problem ends up being a graph search problem that can be optimized with memoization or dynamic programming. Getting the second problem right carries more information than getting the first problem right, but with a really high false negative rate. And so the first problem ends up actually being a better general predictor of engineering skill.
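To make the second problem concrete: segmenting a run-together string against a dictionary is naturally recursive, and memoization collapses the exponential search. A hedged sketch in Python (function names and the sample dictionary are illustrative; the “optimal” breaking mentioned above could also weight segmentations, but this version just finds one valid split):

```python
from functools import lru_cache

def word_break(s, dictionary):
    """Return one way to split s into dictionary words (as a tuple), or None."""
    words = frozenset(dictionary)

    @lru_cache(maxsize=None)  # memoize: each suffix position is solved once
    def solve(i):
        # solve(i) = a valid segmentation of s[i:], or None if impossible
        if i == len(s):
            return ()
        for j in range(i + 1, len(s) + 1):
            if s[i:j] in words:
                rest = solve(j)
                if rest is not None:
                    return (s[i:j],) + rest
        return None

    return solve(0)
```

For instance, `word_break("penpineapple", {"pen", "pine", "apple", "pineapple"})` yields `("pen", "pine", "apple")` with this left-to-right, shortest-word-first search.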

Craig Cannon [18:44] – And so is there a way, if I’m getting ready for a new… I’m gonna go to another company, I’m gonna get ready to interview. Do you recommend people train in any particular way? Or it’s because you’re going for that sweet spot of easy questions, you just have to be smart enough to do it. What do you tell people?

Ammon Bartram [19:04] – Sort of in general? Across the board.

Craig Cannon [19:05] – If I’m gonna prep to do some interviews, what would I do?

Ammon Bartram [19:05] – So, I mean, I guess there are two questions there: what I’d say for companies that I think are doing a good job of interviewing, and then what I’d say given the status quo. In general, actually, it depends where you’re coming from. So very different advice for new grads and for experienced folks.

Craig Cannon [19:25] – Okay, so let’s break ’em apart, yeah, new grads.

Ammon Bartram [19:27] – Okay, for new grads, I would say make sure you’re solid on the classic stuff: breadth-first search, hash tables, heaps, just classic core computer science. A surprising percentage of interview questions end up being slightly obscured applications of those, especially hash tables and breadth-first search. Those two things by themselves represent probably 40% of the questions that are asked at most companies. And so you need to know those. But, actually, many new grads are already pretty solid on that, because they’ve been drilled on it throughout school. The second thing is practicing writing code under stress. Working out a big problem over time is very different than, “You have 10 minutes or 30 minutes. Here’s a marker and a whiteboard,” or even, “Here’s a laptop, program on it.” The skills correlate, but you can improve your performance by practicing. So totally put in 30 minutes a day, find some interview questions online, give yourself a time limit, and try to solve them under stressful conditions.
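For reference, the two staples named here compose naturally: breadth-first search over a graph, with a hash table tracking visited nodes. A minimal sketch in Python (the graph representation and names are illustrative):

```python
from collections import deque

def shortest_path(graph, start, goal):
    """BFS: return a fewest-edges path from start to goal, or None.

    graph is a dict mapping each node to a list of neighbor nodes.
    """
    parents = {start: None}  # hash table: doubles as the visited set
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            # walk parent pointers back to start, then reverse
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for neighbor in graph.get(node, ()):
            if neighbor not in parents:
                parents[neighbor] = node
                queue.append(neighbor)
    return None
```

Because nodes are expanded level by level, the first time the goal is dequeued the recovered path has the fewest possible edges.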

Craig Cannon [20:38] – And are there good resources that people can look for? Like anything in particular?

Ammon Bartram [20:42] – Yeah, I mean, the classic ones. So Cracking the Coding Interview has a pretty good list of questions. The advice in that book I don’t think really applies to startups very much, but the questions are good. And there are a bunch of sites online that have lists; Interview Cake is one that I’ve seen that I think is high quality. An interesting aside, though, is that most companies actually want you to do these things. We try to design our interviews in such a way that cramming has no impact; we don’t necessarily want to be measuring whether you’ve been cramming on algorithms. But what companies want to measure in an interview is max skill, max potential. They would actually much rather see you in a state where you’re well prepared with the material, rather than a state where you had the potential to understand it, but forgot it.

Craig Cannon [21:30] – Yeah.

Ammon Bartram [21:31] – And an interesting trend that’s happening in the industry is companies being more up front about what they’re asking. Facebook, for example, has started providing sort of an interview prep class to everyone who applies, going over the material. Part of me finds it encouraging, ’cause it is moving things in a better direction, but then it’s also discouraging, ’cause it’s really sucky that you have to take a class to get through it.

Craig Cannon [21:57] – I just wonder if it’s just filtering for those types of people who are going, “I don’t know what to do. I don’t know what to do.” It’s like, “Well, if you apply here, you can take the class.” And then, “Let me hold your hand the whole way through your life.”

Ammon Bartram [22:08] – Yeah.

Craig Cannon [22:11] – Okay, so say I am going to interview at a bigger company, is there a way to prep to do well with the brain teaser stuff?

Ammon Bartram [22:23] – Yeah, practice. So, again, there are some words that are thrown around describing interview questions. Brain teasers are pretty rare. Some companies have probably asked things like how many golf balls fit in a 747, but that’s really very rare. Much more common is the application of a computer science idea to a practical problem.

Craig Cannon [22:47] – Okay.

Ammon Bartram [22:48] – And there still is this leap of insight required. In many cases I think those are bad interview questions. Companies should try very hard to ask questions where there’s not one single thing that has to be grasped, where until it’s grasped, the problem feels impossible. But practical application of a computer science topic represents the significant majority of questions at big companies.

Craig Cannon [23:10] – So when I was in college, I interviewed at one of those big management consulting firms. And they did have all those questions. We spent like two months prepping for it. And I didn’t get the job. And I did okay on the stupid ping pong ball questions.

Ammon Bartram [23:26] – You don’t have to feel bad. One number we have that’s interesting is that the engineers who do the best at companies go on to pass about 80% of their interviews, but not 100%. Almost no one passes more than 80% of their interviews.

Craig Cannon [23:42] – Wow.

Ammon Bartram [23:43] – And, yeah, so one big bit of advice to everyone is just don’t feel bad if you fail the interview. It is really not a referendum on your skill.

Craig Cannon [23:53] – I’m very happy to have not gone in that direction. So what about the role of projects? Someone’s portfolio of side projects. Are there certain types of side projects that, across the board, are attractive to companies? Or say I’m applying for a job at Stripe, and I did X payment type project, and that would be more attractive to them. So across the board, are there things that are interesting?

Ammon Bartram [24:19] – Let me answer that question, then we’ll talk about what I think the right thing for companies to do is. So companies don’t actually pay very much attention to side projects.

Craig Cannon [24:29] – Okay.

Ammon Bartram [24:30] – Except for at the screening stage. So resume screen: a candidate applies to a company, and the company decides if they’re gonna interview the person at all. And there’s some adverse selection bias in who applies to companies, and there’s this big stream of candidates. At any company, there’s this huge stream coming in and they have to decide somehow. They do that based on resume screens. And it comes down to pretty dumb credential stuff: you’ve worked at a top company, you’ve gone to a top school, or in some cases you have a project that catches their eye, that’s impressive. And so side projects can help a lot there. But they are very rarely given weight in the actual interview. And I think that’s actually probably the right decision. People who have side projects sometimes feel bad about this.

Craig Cannon [25:13] – Yeah.

Ammon Bartram [25:16] – The reason it’s the right decision is that most engineers don’t have side projects. Most engineers have been working at a company, and it’s all proprietary code and there’s very little they can show. Eight out of 10 candidates are in that situation. And having a consistent, fair process… Consistency is the first goal of the interview process. The big problem is that the process is usually not consistent. If you’re being consistent, then you can optimize it. And having this sort of side process where we look at projects introduces noise. It’s also just really hard to do. You can’t tell if someone spent a weekend on a project, or if they’ve been working on it for the last 10 years. We literally see both pretty regularly when talking to people about their projects. Somebody did it over a weekend, or this has been their abiding passion for the last 10 years.

Craig Cannon [26:06] – Let alone, who actually contributed.

Ammon Bartram [26:08] – Yeah, and things like code quality. It’s startlingly hard to look at a big chunk of code and decide if you think the person who wrote it is skilled. Again, there’s so much context; you can’t tell what bugs they spent hours over. None of us are good enough to look at code and immediately find the bugs. And so for all those reasons, side projects are mostly useful at the screening stage. So this is advice to engineers: if you’re applying for jobs and you’re being screened out a lot at the resume stage, doing these projects probably helps. Doing projects is also a great way to increase your skills, obviously, and that will make you perform better in interviews. But I don’t think projects have a very big role in the actual interview.

Craig Cannon [26:58] – And so what other things should I think about if I am being screened out? Like say I’m getting a call back from one out of 10. What should I do?

Ammon Bartram [27:08] – Apply to Triplebyte. That’s the short answer. Otherwise, yeah, side projects help. I mean, it just sucks.

Craig Cannon [27:17] – Oh, totally.

Ammon Bartram [27:19] – It’s not malice on the part of the companies. They’re overrun by applicants. And so they use these sort of crude filters. And that’s I think the big thing that we’re focused on is trying to figure out how to directly measure the skill. And so we don’t have to rely on filters like where someone’s worked or what school they went to.

Craig Cannon [27:38] – And what about things like, for example, location. So say I live in Salt Lake City. And I’m interested in getting a job possibly at Facebook. Should I put San Francisco on my resume and just fly out for an interview? Do you have general advice in that area?

Ammon Bartram [27:53] – Big companies don’t care at all where you’re based. They fly people in by the hundreds every week. Smaller companies do show a slight preference for local candidates. And so if your goal is to work at a smaller, let’s say, sub-20-person startup, you probably have, I don’t know, a 10 to 20% advantage if you’re based in the Bay Area.

Craig Cannon [28:14] – Okay, cool. So from the company side, there’s a million different interview methods that people go for. Say they go through Triplebyte, they get screened, they’re gonna do an interview. Whiteboarding, pair programming, all that stuff, how do you feel about it?

Ammon Bartram [28:36] – All those methods can work. Let me give a bit of an overview here. As I mentioned earlier, the core problem is there’s this tension between the skill that can be measured in an interview, solving small problems quickly, and the skill that matters as a programmer, solving big projects over a long period of time. And so the first approach you can take to interviewing is to just say, okay, we’re not gonna do it. We’re gonna do trial employment, or something like that. And that totally works. If you’ve worked with someone for a week, you have a far better read on their skill than I think anyone can get during a three-to-four-hour interview. The problem is that there’s a pretty strong bias in who’s willing to do trial employment. And it’s an adverse bias. Many of the best programmers have lots of options, and if your company requires that everyone do this trial employment period, most of them are gonna say no. And, obviously, anyone who currently has a job–

Craig Cannon [29:36] – They can’t leave for a week.

Ammon Bartram [29:37] – Can’t do it, can’t leave for a week, yeah. And, of course, you’re committing a week of time, so obviously you need some read on the person before the trial employment. And so I think in the end we’re left with a thing kind of like the famous, “Democracy is the worst form of government, except for all the others.” I think that interviews are the worst way to evaluate engineers, except for all the other options.

Craig Cannon [29:56] – Yeah.

Ammon Bartram [29:58] – And so you have to do it. It’s fundamentally inaccurate, but you still have to do it. The goal is to make it as accurate as possible. Once you’re on that page, we see two sources of noise. We see noise that comes from companies being inconsistent. I talked about that a bit earlier. It is still too often the process that engineers are responsible for coming up with their own questions. If you’re asking every candidate different questions and coming to a gut call, there’s this far larger source of noise than anyone really realizes. Pick any company that has that process: if you asked them to somehow re-interview their own colleagues in a blind fashion, they would likely have a 50 to 60% pass rate. So some of their own colleagues would be screened out.

Craig Cannon [30:48] – Yeah, yeah.

Ammon Bartram [30:50] – And so the solution there is just to be really consistent, to make sure that you’re asking everyone the same question, and make sure that you’re evaluating them in the same way. And I think that’s more important than what you’re actually asking. The first step is be consistent, second step is tweak that over time based on what you’ve done, and the results you see.

Craig Cannon [31:09] – Okay.

Ammon Bartram [31:11] – Once you’re doing that, I think the other source of noise we see is companies looking for different things. So, as an example, you have companies that look for super academic engineers.

Craig Cannon [31:22] – Yeah.

Ammon Bartram [31:23] – You have companies that look for very practical engineers. You have companies that think that all skilled engineers know about how operating systems work. You have companies who think that they only wanna talk to people who have experience in compiled languages. You have companies who hate compiled languages and think they’re old and stodgy. You have companies who want people to use enterprise languages. It’s just a mess. And so I think the important thing is to untangle which of those are conscious decisions you’re making about who you wanna hire. So if you’re a banking company and you wanna be focused on QA process and safe code, it probably makes sense to reject someone for being too much of a cowboy. If you’re a social media company and your goal is to move really fast, maybe you decide to have a culture where you wanna move fast and break things, and you wanna hire cowboys. Those are all logical decisions. But very often companies are making these kinds of decisions almost by accident. And so it’s worth introspecting, deciding, “Okay, we wanna hire these people,” and then designing the process to look for that. And so in your examples, whiteboard coding tends to skew towards the academic. It tends to give preference to people who are really good at breaking their thoughts down in a sort of structured academic way, and writing with a small amount of code. So you often have people who are actually really productive, excellent programmers, who look really stupid and bad in a whiteboard interview. And so if you’re not looking for academic skills, it probably makes more sense to put people on an actual computer and see how they actually work in their environment.

Craig Cannon [32:49] – Okay, and so what does the… The underlying question for me is could you engineer a perfect interview? But I wonder what does the interview for a job at Triplebyte look like? I mean, I imagine you made it, right?

Ammon Bartram [33:02] – Yep, so first of all, all our candidates go through our regular process. So we hire people out of our regular stream. And then we compete head-to-head with the companies, so candidates interview with us just as they do with other companies, which is kinda fun. So they first go through our regular process, and we already have a pretty strong sense of how they are in those areas. And my advice generally is to decide which skills you preference. And so I think we preference a couple of things. Data and data analysis is pretty key to our business. And so we preference people being comfortable and familiar talking and thinking about data. That skews a bit more academic, I think, than what many companies hire for. And then, because we’re in the business of evaluating knowledge really broadly, we preference breadth of knowledge, I think to a greater degree than most companies need to.

Craig Cannon [33:58] – And so what does that mean in practice? What questions would I be looking at?

Ammon Bartram [34:03] – Yeah, so, again, everyone goes through our standard process first. And so that gives us a pretty good read on just practical programming output, general knowledge of computer science, general knowledge of system design. And then we do additional followup onsite with the candidates, and that goes much more in depth into data. Or if we’re hiring them for a different role, we sometimes hire folks who are not working on data, hiring, say, a front-end developer. That would go in depth into front-end development. And so: here’s a spec for a front-end project, you have two hours, build that. Or if they are going to be a backend specialist, here’s a backend spec.

Craig Cannon [34:44] – Gotcha, okay, and so as an engineer, should I be paying attention to every new thing that’s coming out? Is that gonna be of importance when I’m doing an interview? Or should I be paying attention to, whatever, a medium amount of it?

Ammon Bartram [34:59] – Well, there’s an interesting longer term answer. Interestingly, there’s a class of people we see who thought about that same question 10 years ago.

Craig Cannon [35:10] – Right.

Ammon Bartram [35:11] – Made the decision to not keep up. Now the industry has changed, and these folks are maybe still using, let’s say, CGI, and don’t understand the modern web stack, and are indeed in a weak situation in interviews. So I think the answer to the question is that day one it’s not so important. Very few companies, generally only smaller ones, are directly evaluating flashy new tech. However, if you make the decision to focus only on today, and you don’t keep up to date, and you end up being totally behind 10 years from now, then you probably are gonna pay a price.

Craig Cannon [35:44] – Yeah, I mean, especially if you’re actually interested in starting your own thing at some point. Being on the edge really, really matters. Okay, I mean, maybe this is kinda difficult to answer, but I wonder about employee retention, engineer retention, are there any qualities that you found? Like you can vet someone and say, like, okay, I think the average is like 18 months or something for someone to stay around. Are there qualities that correlate to longer term employment?

Ammon Bartram [36:10] – I haven’t looked in on this recently, so this is gonna be a little bit off the cuff. I mean, just the obvious things: candidates who are excited about the mission and the actual company have a higher probability of staying than candidates who are chasing the highest paycheck. Of course there are counterexamples. Sometimes there are awesome engineers who are looking for a place to really commit, but also wanna make a fair wage. These things are all very complicated. But, yeah, the number one thing I would say is look for engineers who are excited about the company and the job.

Craig Cannon [36:42] – Okay, cool. So kinda just wrapping up, are there any books or things that, if I’m kind of like an engineering manager, I’m gonna be running a bunch of interviews, that I should really dig into and I can get a lot out of?

Ammon Bartram [36:57] – I have not actually found any books that I think are very useful. I think, this is gonna sound arrogant maybe, but I think engineering is this field where it’s so easy to say things that sound profound that are not true.

Craig Cannon [37:10] – Okay.

Ammon Bartram [37:10] – I truly believe that like 80% of what’s written out there about interviewing doesn’t actually hold up. So, for example, an idea a lot of engineers really love is the statement that interviews don’t make any sense at all and you should just look at work someone has done in the past. And we tested this a bunch. We tried having engineers talk about past projects, and scoring them. Even with a full hour, going into depth on the project, getting technical details, scoring it, talking skill and the ability to spin a tale ended up dominating the actual engineering rigor. And this was far less predictive of job performance than giving them a relatively simple programming assignment. And that kinda sucks. I don’t really like that that’s the case. And you can find so many articles out there along these lines: it’s stupid that we’re asking engineers to do these interviews, why don’t we just have them talk about their past experience? And if you test it, it doesn’t hold up.

Craig Cannon [38:08] – And just for the sake of like keeping everything standard, what do you tell people to do in that way, when they’re conducting an interview?

Ammon Bartram [38:15] – Well, yeah, I mean, standardize. And be very careful about helping people, interestingly. Certain candidates are a lot better at eliciting help, without you necessarily realizing that you’re helping them. It’s something we’ve had to battle with a bunch, actually. And, again, we’re doing thousands of interviews, so it’s easier for us to do this, but we had to come up with a decision tree of all the different ways the interview can go and what help we’re allowed to give and what help we’re not allowed to give. It’s just a big source of noise. Outside of doing a thousand interviews and standardizing it, I’m not sure I have a really good fix for it. But be aware that some really charismatic candidates will…

Craig Cannon [38:48] – Okay, so what’s a common area where I might ask for help without you even realizing that I’m getting help?

Ammon Bartram [38:58] – One is just being brave enough to ask. Another is being really friendly, and then saying something with confidence that’s only sort of right, where there’s this natural instinct to jump in and correct the error. And as the interviewer, it’s really easy to do that and not realize that you’re steering the person through the problem.

Craig Cannon [39:15] – So if you’re going out for interviews, you should do exactly that.

Ammon Bartram [39:20] – I have a blog post on how to prepare for interviews, and I do recommend trying to do that, actually.

Craig Cannon [39:23] – Really? Oh, that’s awesome.

Ammon Bartram [39:27] – Yeah, I wanna add a caveat to that, though, which is the negative side: interviewing can turn into hazing. Interviewing is not just evaluation, it’s also this shared rite of entry into a company. And some companies develop this culture around the interviewers being hard and unpleasant. And as the interviewer, it’s really easy to forget how much harder it is if you’re the one answering the question. It’s so much easier to feel smart when you’re asking the question. And sometimes candidates get really flustered and can’t answer a question. And it can be really frustrating as the interviewer when this thing that seems obvious is right in front of them, and they’re just missing it, and they’re just wasting your time, and you can get a little bit angry inside. And it’s just really important to stay away from the hazing, from taking out anger on them. I’m generally against cutting interviews short, except in the case where the candidate is in real pain. I think you save some time, but you damage your reputation; candidates dislike it, it’s embarrassing.

Craig Cannon [40:35] – Okay.

Ammon Bartram [40:36] – But definitely staying away from the hazing, staying away from the–

Craig Cannon [40:40] – So does that just mean like crazy brain teasers? Does that mean like cutting them off in conversation? What does that mean?

Ammon Bartram [40:46] – It means all those things. So it means crazy brain teasers, being mean, sort of getting slightly angry and aggressive in how you answer their questions, because you’re frustrated by how poorly they’re doing.

Craig Cannon [41:00] – Okay.

Ammon Bartram [41:01] – And a trick that we use that I think helps in that case is, when a candidate is totally failing the interview, to flip a switch in your brain and go from evaluation mode to teaching mode. Your whole goal now is just to explain the answer in as friendly a way as possible. Generally this happens when the person has already essentially failed, at least on that problem, if not the interview. So you already have it in your brain: okay, this person is not passing, and so I’m gonna spend the remaining 15 minutes being friendly and explaining the answer to this question, rather than continuing to try to elicit responses from them.

Craig Cannon [41:33] – And what about just the dynamic? Like, do you advise one-on-one interviews? Or how many people per interviewee?

Ammon Bartram [41:40] – Yeah, an interview panel definitely increases the stress, so we max out at two-to-one. Training is important. So if you’re trying to keep it consistent, you need continual cross-training. People need to watch each other’s interviews. But two people, one interviewer and one shadower, is enough to do that. Going beyond that increases the stress and I don’t think it really helps.

Craig Cannon [42:03] – Cool. So if people wanna follow up with you and ask you questions, how can they reach out to you?

Ammon Bartram [42:08] – Sure, my email is ammon at triplebyte.com, that’s A-M-M-O-N.

Craig Cannon [42:14] – Cool, thanks, man.

Ammon Bartram – Thank you, Craig.

Craig Cannon – All right, thanks for listening. Please remember to subscribe to the show and leave a review on iTunes. After doing that, you can skip this section forever. And if you’d like to learn more about YC, or read the show notes, you can check out blog.ycombinator.com. See you next week.

