Episode 34

Published on:

20th Aug 2025

Beyond Cracking the Coding Interview with Mike Mroczka

Ever wondered how many “perfect” candidates simply learned the test—or how many great engineers get filtered out by bad interview design? Mike Mroczka, interview coach and ex-Googler, shares what really goes on behind technical hiring and how to navigate it to your advantage.

What you’ll learn:

  • How leaked question banks and standardized puzzles can distort hiring signals - and where they still help
  • Practical ways companies can make interviews fairer and harder to game, both on-site and remote
  • A balanced take on data structures and algorithms: when they’re useful and when they’re noise
  • Tactics to spot and reduce cheating without turning interviews into surveillance
  • How to structure interviews for different seniority levels so you measure the right skills
  • Salary negotiation playbook: timing, leverage, and common pitfalls that cost candidates real money
  • Getting past the application black hole: skipping recruiters, networking that works, and coordinating offers

Who this helps:

  • Engineers tired of grinding puzzles who want a smarter prep plan
  • Hiring managers looking to improve signal and reduce false negatives
  • Anyone preparing to negotiate an offer with confidence

Guest: Mike Mroczka, primary author of Beyond Cracking the Coding Interview, ex-Google

Mike Mroczka, a former senior SWE (Google, Salesforce, GE), is now a tech consultant with a decade of experience helping engineers land their dream jobs. He’s a top-rated mentor (interviewing.io, Karat, Pathrise, Skilledinc) and the author of viral technical content on system design and technical interview strategies featured on HackerNews, Business Insider, and Wired.

Mike Mroczka, website

Beyond Cracking the Coding Interview

Links to interesting things from this episode:

Transcript
Intro:

You're listening to the Platform Engineering Podcast, your expert guide to the fascinating world of platform engineering.

Each episode brings you in-depth interviews with industry experts and professionals who break down the intricacies of platform architecture, cloud operations, and DevOps practices.

From tool reviews to valuable lessons from real world projects, to insights about the best approaches and strategies, you can count on this show to provide you with expert knowledge that will truly elevate your own journey in the world of platform engineering.

Cory:

Welcome back to the Platform Engineering Podcast. I'm your host, Cory O'Daniel.

And today's guest, Mike Mroczka, is an interview coach, ex-Googler, and author of "Beyond Cracking the Coding Interview". He's spent years helping engineers navigate the often bizarre maze of tech hiring that we go through.

He's also done some real world pen testing of interview systems. Like, not of the code, but like of the systems themselves. Think of it as trying to rob the hiring bank to find the weak spots. But with permission. With permission, right?

Mike:

With permission.

Cory:

I'm excited to have you on the show. I have thought interviewing has been broken in this industry for a long time.

I'm one of those people that, like, if I get put into an interview situation and it's hardcore algorithmic, I just tap out. I'm like, "If this is the way that you are judging the people coming into this team... like, not how we work as engineers... I'm not interested in being a part of this team." So there are whole sectors of companies where I'm just like, I'll never apply there, because I know I'm not interested in that process.

So I'm stoked to have you here today. Mike, can you tell us a little bit about your background? Tell us a little bit more and like how you kind of got into this space specifically?

Mike:

Absolutely. For one, just thanks for having me on, Cory. I appreciate the chance to talk about it. And I think there's maybe two things to know.

The first thing is that I actually went three and a half, almost four years into a degree in physical therapy, of all things. Before I saw the Iron Man movie and that movie, like that totally changed my life.

I was like, there's Jarvis and you got like the Avengers going together. And I'm like, why is this not a thing? Why can't we have Jarvis in like the real world? I feel like computers, like can kind of talk to you. Like, why can't they be more intelligent? And it was at that point I switched majors entirely, went into computer science, and I was like, "Okay, I'm going to go for it. I want to learn to do this. I want to get into AI and I want to really understand the field."

And in doing that, naively at the time, I was like, "Well, then, okay, my North Star is going to be Google, because that apparently is the big company to go to if you kind of care about learning about technology." And so I realized very early on, immediately when I switched, like, "Okay, how do you get into Google?"

And it turns out it's these algorithmic challenges, as you said. And so I started very early on studying for them, learning about them and preparing for them.

I made YouTube videos that you can still find on YouTube from decades ago of me, like, explaining concepts that my professor was talking about. And then from there people are like, you're actually pretty good at teaching this. And that transitioned into a couple of roles, and now I'm here.

I worked at Google, I understand AI, and it's a fun kind of rabbit hole to go down into for sure but also... well, as much as you can understand AI, I'd say... that's a little bit about my background.

Cory:

Are you still at Google or you're ex-Google?

Mike:

I'm actually ex-Google now. I left because you can't write a book about a company's interview process and how to pass it and then have them be okay with you still being there. So, it turns out, you sort of have to leave in order to write it.

Cory:

Very cool, very cool. And so this is your first book?

Mike:

Yes, it's my first book. It's actually a follow-up to Cracking the Coding Interview, written by Gayle McDowell.

Cory:

Very cool, very cool. Okay, so before we jump into the details of everything, I've just got to know, I want to start off with a silly question.

Like, interviewing is so weird. It is such a weird thing, in general, like even outside of software, it's always uncomfortable.

But then you get like, the quirks of people, like, trying to figure out how do we integrate other people into this business and like this business culture. And I feel like a lot of times you have people that are running the interviews that have like zero experience putting together or running interviews, right? Like, "Hey, let's just try to figure out what's going to make people happy coming in here and make us happy with them."

So I have got to start with, what is the worst... like, what is just the most fall-apart interview process that you have gone in and tried to understand and hack, that was just like, "Wow, what have you..." You don't have to mention the company's name. I would love to know what the most, like, "phew" one you've ever had was.

Mike:

So this is fun. I just want to actually set the stage by saying interviews suck. And like the elephant in the room is interviews suck. And I agree with that. And we can maybe circle back to why they suck, but one of the reasons they can suck is because the company itself and the interview process itself just is set up in such a way to where it's not really testing anything productive.

So there is a company, and this is a current company that still has this process that is very well known. It's actually a very big tech company. And the way it works is they have a question bank that has about 15 questions in it. And unfortunately this question bank is leaked. Like it exists online if you go to the right places and know where to find it.

So if you do well in the interview, it's very hard for them to know did you do well because you really understood it or because, you know, you had a little bit of help. But they don't necessarily realize their question bank is leaked. And at the time, at least they didn't - when I was doing some initial work for them.

And the big problem is that they assume that we just have people that are really, really good at answering this stuff. And those are the people clearly we want to hire. And then the people that struggle or take a little bit of time thinking about the problem, they must clearly just not be as bright.

So it sort of totally derails the whole interview process because now you have this thing where what you think is a good signal is actually just people cheating. And then you have the bad signals, which is actually just people actually struggling on a question that they hadn't seen and prepared ahead of time.

So that's probably one of the most egregious examples. And the crazy thing is this is like a Fortune 100 company. It's a huge company and they now know it's a problem and they are actively working to rectify it. But it's like a massive thing because they just didn't invest any time in their question database. Which is just crazy to me.

Cory:

That is wild. And that's one of the things I've always thought was a bit goofy.

I've got a good friend, he is an amazing interviewer, but he's one of these people, he just spends his spare time doing HackerRank stuff. Like, the amount of job offers he gets when he's on the market... he'll go and interview at like 10 places and he'll get 10 offers.

That is great. But it's just like, damn, the amount of effort. And then he's just like, "You know, I don't really... I'm reacting. All the stuff I just showed them that I know how to do, like I'm never going to do it in my job. I'm going to make a pretty interface that is engaging and fast and I'm not doing any of this like deep algorithmic work. It's not like I'm building like how React renders stuff, I'm just building HTML and CSS."

And it's just like, "Man!" and at the same time like I'm not putting that effort in. I'm going to show up, I'm going to put the effort into like, "Hey, I want to work here." I'm going to put that effort in. Like research the team, research the company a bit. But I'm like, I'm not going to try to go and pass your test.

I'm going to show up with what I know, and I want to see if I am a good fit and if you are a good fit for me. Like, that's what I'm here for. And that's why I said, if this is the gate - you have to have read the big red algorithm book - it's like, I read it 20 years ago and I didn't see a lot of value in it in my day-to-day work, typically in web frameworks and whatnot. It just didn't hit. If I was building a database, hell yeah, like, makes sense, right? And I think that scoping is missing a lot of times.

I feel like many people that you see running these interviews... I worked on a team where somebody did this... they sent me some stuff that they had Googled for the interview and I'm like, "Dude, if you have to Google it, what is the point of hiring any of these people?"

And they're googling stuff all day, in and out. Like, so what's wrong with Googling something? I would love to see people Google. I think that's one of the things... like when I'm interviewing people, you know, at Massdriver, we let people Google stuff - I want to see how you find information. Like, that's more important to me than, do you know the answer to the problem? Do you know how to find the answer to the problem? And I feel like that's not really reflected in a lot of, like, technical job interviews.

I would love to know, how do these hire-a-bank-robber-style gigs work? How did you learn how to pass them and kind of prod them for their weak points?

Mike:

Sure, yeah. Let me kind of just step back and set the stage for it for a second.

So I think as a coach, if you go and you teach people how to pass interviews, the natural thing that always comes up is like, "If I don't know the documentation, can I Google the answer?" I know when anyone asks that question, what they're really trying to kind of probe at is like, "What are the things I can ask? Or what can I Google? Or can I use an LLM?" And this naturally sort of devolves into this conversation on cheating.

Like, "Can I cheat? What is considered cheating? If I was going to use the LLM in my day job, do you count it as cheating?" And just to be clear, I have a very, very clear line as to what's cheating - if you're using an LLM without your interviewer knowing you are, that's cheating. Like, if you can tell them and you don't think it's a problem, and they don't think it's a problem, that's not cheating. Very straightforward.

But because of this conversation, it naturally sort of dovetails into me realizing that it's really easy to cheat in an interview. And that's not something maybe you want to advertise on a podcast, but it's like, it's actually not as hard as people think it would be.

I did a study for a big mock interview company called Interviewing.io. A study is maybe too strong of a word for it, but basically we did experiments where we paired people up and we asked them to cheat. And we had professional interviewers from Google, Facebook, Netflix basically do these interviews.

And we asked, "Hey, did you notice anything wrong with this person? Did you see anything wrong?" And some people would be like, "Oh, they were a little slow at coding. You know, maybe it took them a while to get the solution," but not one person realized that they actually were cheating in the process.

And this was a huge kind of thing in the news. It made Hacker News, it made Wired. And when the release of the post kind of came out, it's like, "Wow, it's actually much easier to cheat than we think it is." And from there, kind of companies started messaging me saying, "How do we prevent it? Like, what are the strategies?" And stuff like that.

So it started off as just a couple of consulting jobs. And then one person pitched me the idea, like, "Hey, I don't know if this is a problem, but I'd love for you to prove to me that it is. Let's take somebody that we recently rejected from the interview process, that we said, 'No, we don't want them.' We'll set them up with a different group of interviewers that don't know this is going on, and I want to see if you can help them cheat through the interview." And we did, and it was a success. We got them through the process. And this kind of showcases how easy it was to cheat.

It's like this whole thing is not set up to be able to handle cheating. And that's kind of what I've been doing ever since. It's a fun thing.

So companies will basically hire me to see, can I get through the process myself, or can I get somebody else through the process? And most people think, okay, this is just online interviews. But there are actually cool ways to cheat in in-person interviews too, which we can potentially talk about as well. So, yeah, that's a little bit of the story.

Cory:

Can we talk about how people cheat a little bit?

Mike:

Sure, yeah, absolutely. Let's dive into it. What do you want to know?

Cory:

So I'll tell you, I'm going to admit something here.

Mike:

Okay.

Cory:

I've cheated at a couple of things in my life.

So one, when I was in elementary school, I was poor, and my elementary school started doing this thing called Book It, where if you read like five books you'd get a free pizza from Pizza Hut. I just grabbed a bunch of the sheets and went to the library and wrote down books. I was eating pizza. My mom was like, "How do you keep getting all these coupons for pizza?" And I was like, "Yeah, man, you write five books down on this thing and you give it to your teacher and you get a coupon for a free pizza or something like that."

Mike:

That's awesome.

Cory:

And then, you know, my other big cheating moment was actually on a test in college. It was managerial accounting, and we were allowed to use calculators, and I had a TI-85 or 87 that you could program. So what I did is I programmed all my notes into my calculator and just pretended to be doing math.

I'm just like, I got a ton of notes here and I crushed that test. And guess what? I do my own taxes today, and it's fine.

Mike:

Yeah, yeah.

Cory:

So, okay. But cheating in real life, like, those are my only two real cheating in real life stories.

Like, how do people cheat in an interview where somebody's just like, hawking you and standing there, like, while you're in the room with them?

Mike:

Well, before I answer that, I just want to actually point out the fact that, like, you know, for a Pizza Hut kind of pizza, you're like, "Okay, maybe I'll cheat". Now multiply that to this is a $400,000 a year job. It's like, for one, understandable that people cheat.

I think a lot of people want to just be like, "How could you ever do that?" And I think that's maybe not the best way. Like, it's very clear how you could do that. And some people just kind of need a job, so I'm not saying it's right or anything like that, but it is kind of worth just noting, like, yeah, if you do it for Pizza Hut, you're certainly going to do it for a $500,000 job.

As far as, like, how people cheat, the interesting part about in-person interviews is a lot of people don't realize just how small microphones get. And all you really need is one other person and one of these very small microphones and one of these very small kind of earbuds in order to kind of make it happen. And you can just literally have somebody go through... and it's like the old spy tech gear. I've got some of it down here.

But anyway, it's like, you know, you have a little cam that captures what's on the whiteboard, what am I hearing, or what is the person hearing, and then, like, do I have somebody that can go Google answers for me? Really, you just need one accomplice and, like, a hundred dollars to make this happen.

This is not an endorsement, just to be clear, but, like, this is actually a lot easier than you think it is.

There's a couple of, like, very specific things you have to sort of check for or very specific things you need to not do. So one of the guys I was working with would always just sort of like, do this with his ear. It's like you got to stop doing that. Like there's something in your ear very clearly if you keep like bringing your hand up to your ear, like trying to hear what's going on. So ditch the sort of spy moment where you're trying to do that kind of stuff.

Then you've got to stop moving as a candidate because if I'm trying to see what's on the whiteboard and you're doing this or you're like giving me the nervous energy where you're kind of moving back and forth, I can't see what's on the screen. So I can't help you, you know, if I'm in another room kind of doing stuff.

But yeah, it's actually a lot easier than people maybe would think it is. And that's the in-person.

Cory:

Jeez. I didn't even think about the ear thing.

I mean, you can't even say something to somebody with an ear thing. You don't know if it's a hearing aid or not, right? If they have one of those small, flesh-colored doodads kind of tucked in.

That's, I mean... honestly, I'd be a little offended if I caught somebody doing that. But also, at the same time, like, this person's pretty damn clever, right?

Mike:

You're smart. It's smart in some ways.

Cory:

Might want to have him on my team.

Mike:

I'm not sure it's necessarily a strong hiring signal, but there's something there for sure.

And I think, keep in mind that so many places aren't doing in-person interviews these days. It's like, no, that's the hard version. That's hard mode. But now imagine that you've just got somebody helping you out on another screen right in front of your screen. It's just so much easier if it's a virtual interview.

So I think that's kind of one of the biggest problems that's actually plaguing interviews at the moment - it's really hard to know over a virtual interview if you're cheating or not.

Cory:

Yeah.

And then with, like, the Cluely stuff too, right, where they built the thing that just kind of looks at your page and tells you what to do. I saw that and I actually thought it was hilarious. I was like, "I love this, dude," because he's addressing this pain. Everybody got so mad about it. But I'm just like, dude, that would also kind of be a dope tool, just in general, as an engineer - just having a HUD kind of follow you around, like, "Yo, this is what this means." I'm like, "Dude, I want that just for navigating my own code base."

But yeah, it's like... and you know, with Google Glass and some of the wearables that we're getting, like the Ray-Bans, that's going to be just on somebody's lenses soon. I feel like it's going to be very difficult to detect.

So, like, as an interviewer, how are they spotting these people? Do you put a little EMP in the room to just disrupt?

Mike:

That would probably be the spy way of doing it, maybe.

Cory:

An EMP.

Mike:

So it's probably easiest to tackle in sort of what strategies work for virtual interviews and which ones work for on-site. So which ones do you want to kind of dive into first?

Cory:

I'm blown away by this idea of, like, people cheating on on-sites. Like, I mean, I just assume everybody's cheating on the remote ones, and I'm fine with it. But in person, I'm like, these are... these are daring folk.

Mike:

Yeah.

Cory:

Yeah. I would love to dig into that a little bit, and then we can go into the remote stuff from there, because I know that's definitely on people's minds.

Mike:

Sure. So, I mean, you know, when you think about it, you need somebody that's going to be able to help you Google answers or have an LLM.

But most questions that you ask, I think, if you're in-person, you'll find that you might even be more lenient with somebody than if you're over a screen. Because again, in an interview where you're over a computer or something like that, as you said, you're assuming they're cheating. So anything they do, you're immediately discounting. So I think people tend to ask easier questions during on-sites. And that's true even, believe it or not, at like big tech companies - so the Googles, the Facebooks.

The easier questions tend to come once you've passed the online assessment and you get to the actual on-site. Sometimes you'll find that the questions are much easier.

So for one, just kind of keep in mind, like, what are you rating here? Are you keeping your questions consistent throughout? If you're not, that's maybe an indication that you're doing something wrong if you're making it easier for the on-sites.

The next thing I'd say is when it comes to cheating, if you have a question there's basically two things that somebody needs in order to get through the process successfully. They need either somebody to feed them information or a way to access a screen.

Now, I've tried a couple of different ways of getting a screen in front of the candidate, so they don't need an accomplice and it's straightforward. Like, it is not an easy task. There have been some things where we've tried building it into, like, a belt buckle. There's a couple of times where I've tried having it be on a sleeve or something like that, sort of showing through the sleeve transparently. There's not a whole lot of good options to do this yourself, where you have an LLM that's listening in and then trying to do it.

But you could, in theory, at some point soon have something to do that. Really and truly, though, if you go into an interview with Ray-Ban sunglasses, I think that's a red flag for most interviewers.

Cory:

Yeah.

Mike:

If it's not for you, then like now you know, that's a thing that you have to watch out for.

But more importantly is like - okay, so if you're going to have somebody that actually is on the outside giving you information, you need a way to get that information. And I think the kind of two biggest issues are: Do I need to see something? Or can I just hear it?

And most people would think, "Okay, let's just have audio and video." But that also can really screw things up because now I have to communicate with you and I have to be like, "Move a little to the left. I can't see what's on the whiteboard." I have to then start communicating to this person that I can't see what's going on the whiteboard.

So oftentimes it's actually better to just have the audio. Whenever there's something visual that needs to be described, you'll have somebody like... you know, if there's a whiteboard just off to the screen here... but basically they're looking at the whiteboard and they're saying, "Oh, this is this visual thing." And they start describing what this thing is to me. So I'm like, "Okay, so I have a database on the left, and then I'm going to feed that from the database..."

If you're in, like, a system design interview and you start describing what's on the whiteboard, that ends up being a huge red flag that maybe they're feeding information to somebody else. So if they start visually describing what you and they can both see on the thing, it's kind of one of those immediate red flags, I'd say.

There's a couple of more things to kind of mention. I want to pause there. Does that kind of make sense? It's kind of a weird thing to talk about, but yeah.

Cory:

No, that... that crushes my soul a little bit, because I'm one of those people that talks out loud while I'm doing things a lot. Like, because I'm trying to over-explain.

Mike:

Yeah, yeah, yeah.

Cory:

So now I sound like... I'm just like, "Okay, I'm drawing a box. This box is a database." You're like, "This guy's fucking cheating for sure." I'm just like, "No, I'm just... I'm just... my handwriting's not good. Like, it doesn't even look like a square. I'm trying to tell you this amorphous blob is in fact a square. I just don't square very good."

Damn it. I sound like a cheater. I am a cheater. Everybody knows that. I got free pizzas.

Mike:

Well, and it's the opposite of what you're usually told, right? You're usually told in interviews to over-explain your thought process. And now, all of a sudden, this thing that we've been telling people to do for decades could be a problem. It's like, wait, why are you doing that? What's the reason? It's at least something to be suspicious about as an interviewer.

Cory:

Yeah. Wow. Okay, that's wild.

Okay, so now let's talk about the remote one. Because that's... everybody's hiring this way now. It's where most people are seeing their first interviews, at least, right? Even if there's an on-site, you're probably going through a few remote ones before they bring you on.

So, like what are... I mean besides like the Cluelys of the world... it's like, I assume tons of googling and you know, off to the side. Like what are strategies for kind of minimizing that?

Are people jumping to, like, "Hey, we've got a portal that you log into to use a laptop, so you can't see it"? Like, what are people doing to kind of combat cheating there?

Mike:

It's a good question and I'd say there's maybe a couple ways to address it. One of the most important things to talk about is we have sort of client side solutions and server side solutions to the problem and neither are perfect.

You could have somebody install a proctoring software. And this actually exists in other industries. So if you have... let's say somebody who's trying to take the bar, who's a lawyer but has a disability and can't go into a testing center, like there are proctoring services that sort of monitor processes in the background and stuff like that. So you download some sort of client software, you install it and then it sort of monitors what's going on on your computer and it sort of tracks your eye movement on a webcam and it's just kind of making sure that you are kind of answering the questions normally.

Now, that sounds really good. That would work for a lot of places, but most places aren't doing something like that. I'd say that it would eliminate, if I had to guess, probably 90% of the cheaters. You're going to get some people that just really want that $400,000 or $500,000 salary, and they're going to be able to cheat through this anyway. But that's sort of one of the better options - the client-side software.

You then have server-side software. If you've ever heard of HackerRank, or CodeSignal is another one, if you were to buy the enterprise-level version of those, you'll find that they now come with anti-cheating software, and unfortunately those don't tend to work very well.

Now, I've had a couple of people reach out to me about it and be like, "Don't say that, this is the main line of defense," but transparently, they just don't work very well. What they can do is say, "Hey, here's your report. We see that the person changed tabs, so they're now on a different tab," "We can see that they resized their screen," "We can see that this person's looking off the screen." There's lots of things it can do. It can say, here are some red flags, but we can't know for sure that this person was cheating.

Whereas on a client-side version of this, if somebody launches Google and starts googling things, it's very easy to tell this person's cheating. The server-side versions, at best, give red flags. So they're not perfect.

And both of these suffer from the fact that if you just have a second laptop, you know, you can kind of circumvent both of these problems. You just have one screen in front of you.

And that's how I do it with most of my clients, where I'm like, "Look, I'm just gonna literally sit right over on the other side and mirror the screen. I'm gonna know what's on your screen." And if you want to be really, really careful, I can even have a camera pointing at the screen, so there's no way of detecting it through Bluetooth or through any sort of third screen being enabled, which client-side software might be able to tell. And then it's very, very easy to feed people information like that. So it's still not foolproof.

Host read ad:

Ops teams, you're probably used to doing all the heavy lifting when it comes to infrastructure as code: wrangling root modules, CI/CD scripts, and Terraform just to keep things moving along. What if your developers could just diagram what they want, and you still got all the control and visibility you need?

That's exactly what Massdriver does. Ops teams upload your trusted infrastructure as code modules to our registry.

Your developers, they don't have to touch Terraform, build root modules, or even copy a single line of CI/CD scripts. They just diagram their cloud infrastructure. Massdriver pulls the modules and deploys exactly what's on their canvas.

The result? It's still managed as code, but with complete audit trails, rollbacks, preview environments, and cost controls. You'll see exactly who's using what, where, and what resources they're producing - all without the chaos. Stop doing twice the work. Start making infrastructure as code simpler with Massdriver. Learn more at Massdriver.cloud.

Cory:

There's obviously tons of ways that people will cheat, right?

Mike:

Absolutely.

Cory:

I feel like, again, going back to the root cause of it all is our interview processes. What is it about our interview process?

What are good takes on interviews that you're seeing companies do where maybe cheating isn't the thing that people reach for anymore. Instead it's the, "Let me show you I can do the job."

Is that the ultimate solution? Getting a better interview process? And if so, what do those start to look like?

Mike:

Yeah. Okay, buckle up. This is a hot take. I've got a couple of thoughts on this. I'm going to say a couple of things I think are maybe controversial and go against what you'll see people talk about online.

So most people say the interview process is broken. And I do actually agree with that. But I think why it's broken - I disagree with most people on.

So let's talk about just the data structures and algorithms, LeetCode-style coding interviews for a second. What's good about them is imagine the world before that. You could be asked anything about anything and whether or not it was a good question or a bad question really kind of just depended on who was doing it.

You might have, let's say 10 years or 20 years of web development experience and I maybe have 20 years of web development experience. But the things we did during that web development experience, there's going to be some overlap, but there might be some things that I just, I don't know. And that doesn't mean that if you ask a question about something I don't know, I'm not experienced.

So I think there's sort of that natural problem where it's like, if we actually standardize the test with data structures and algorithms and we say this is what is going to be on the test... no fooling, I'm not going to ask you about C++ memory management... it's going to be very much something that you can study for. In some ways that actually just democratizes it, and I think it provides transparency when there was none. Remember, Google, not 15 years ago, was asking brain-teaser kind of questions.

So I think that'd be kind of the first thing I'd say. I don't necessarily think these coding puzzles suck as much as some people think they do, because the way you would normally pass interviews before that is you actually just knew somebody. You knew somebody at the company who would tell you what types of questions to prepare for. Like, "Oh, this person really likes, you know, the DOM, and likes to know a lot about what a shadow DOM is," or something of that nature. And then it was really, really hard, unless you knew somebody, to actually get a job.

So I'm gonna maybe pause there for a second. I want to see is there any pushback on that or how do you feel about that as sort of an initial thought?

Cory:

I mean, I kind of agree. I mean, I think it makes sense to include those questions when it pertains to the work, right? Again, like, it's like hard algorithmic work, right? Like, I mean, it's again, like if you're making databases or you're literally designing Kubernetes, it's like, okay - distributed systems is going to be really important to the job, right?

But then where like most engineers sit, where it's like - we work for a SaaS, we're taking some stuff from a form, we're sticking it in the database and then doing some authorization rules to show it to somebody else. It's like we're mutating hashes. Like, that's what we're doing.

To me, it becomes more about like, "Okay, well, you know, we have a monolith. Like, what are the pains of, like, working on a monolith? It happens to be in Ruby on Rails. Like, how deep is your experience of going from like a Rails Two to a Three project?"

I feel like the context really matters. And I feel like many times it's so easy as an interviewer to be like, "Oh, let's find something tricky. Let's see if they can answer something tricky." And it's like, I'm not personally interested in whether or not you can answer something tricky.

I'm interested in, will you fit in on this team and can you add value to the business?

Which I feel like is... for many engineers that aren't experienced in interviewing or haven't held a leadership role... like, that's just not where many engineers' heads are. It's not, "Is this valuable to the business?", it's, "Hey, I'm an engineer, you're an engineer. Let's do some cool engineer things."

It's like, "Yo, this person's trying to get a fucking job. They need money, right? Like, let's not do some cool engineer things. Like, let's not waste their time, and see if they're going to fit in here."

So, I mean, I think that those, like, harder algorithm questions make sense. Like, if you're building satellites, hell yeah, this should be very hard algorithm questions. You want to work at... not Boeing, they'll let anybody work there apparently.

Mike:

Exactly.

Cory:

Sorry Boeing.

You're working at Lockheed, hit him with hard interview questions.

But I feel like it's one of those things where it's like, what do we do here? And it's just like, let's ask questions that pertain to what we do and what our day-in-day... what is troublesome for us. Let's ask about that. Let's ask about the pains that we have or ask about like the thing that we don't know.

Oh, we're doing this new initiative with some data stuff. We have no idea how to interview a data team. Right now my interview questions for like day one, if we don't have that experience, is like, "How do you incorporate with a team that has no idea whether or not your work is good?" Like I need to know that you're going to come in and be a good leader that comes in and incorporates and is able to find people that are going to work well with you. And hopefully you know how to interview them, right? I'm not going to ask you some airflow questions. I have no idea what the answer is, right?

Mike:

Right.

Well, actually, I think you said it - you stumbled on what I think is the critical idea here, which has to do with, like, most engineers. It really depends on what we're talking about here, because Google is one of the biggest employers - like, they have quite a large number of engineers - but relative to other companies, it just sort of depends.

So if you're a startup, it doesn't make sense for you to ask algorithmic style questions, for the most part. If you want an easy way to screen people, it doesn't make sense.

But if you're hiring 10,000 people a year, you can't ask the same question. Whatever questions are currently in your interview process for your company, imagine taking that and asking it to 10,000 people. It doesn't scale well, in part because it's going to get leaked. There are problems with that from that standpoint. So you need something where you can quickly come up with new versions of the questions without a whole lot of extra work on an engineer's part.

So from a Google perspective, scale is the big difference. It's not necessarily that I think you have to work on algorithmic questions. It's just an easy way to be able to start filtering out people at scale.

Does that kind of make sense?

Cory:

Especially for like doing 10,000 applications a day. It's like, we want to weed out some people.

But, like, going back to your original point, like not weeding out the person that's struggling a little bit, that might be a really good fit for the job - how do you not miss them in that scenario?

Mike:

Yeah. So I guess maybe the hot take is - I don't think data structure and algorithm questions are totally out of place, I just don't think they should be the only factor in an interview. And I don't even think they should be the deciding factor.

Again, somebody who's, let's say, an architect. Very, very technical. Why should they be coding Three Sum? Or, like, you know, edit distance, or some graph thing. Like, it doesn't make sense to see if they can do those sorts of things, but it makes sense to ask them system design questions. And certainly maybe a part of the process could be, "Hey, let's do a coding question. Let's make sure you can do something with code." And maybe it's actually an easier version of the data structure and algorithm stuff. And it's done in person to be like, "Okay, let's just make sure you can, like, make a variable, write a for loop, write an if statement." Like, are you just talking totally out of your butt, so to speak? Or can you actually do the thing that we're talking about at a high level?
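For anyone unfamiliar, the "Three Sum" Mike mentions is the classic 3Sum coding question: find all unique triplets in an array that sum to zero. A minimal Python sketch of the standard sort-plus-two-pointers approach (the function name and duplicate-handling details are illustrative, not from the episode):

```python
def three_sum(nums):
    """Return all unique triplets in nums that sum to zero.

    Standard approach: sort, then for each anchor element run a
    two-pointer scan over the remainder. O(n^2) time overall.
    """
    nums = sorted(nums)
    result = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate anchor values
        lo, hi = i + 1, len(nums) - 1
        while lo < hi:
            s = nums[i] + nums[lo] + nums[hi]
            if s < 0:
                lo += 1      # total too small: move left pointer right
            elif s > 0:
                hi -= 1      # total too big: move right pointer left
            else:
                result.append([nums[i], nums[lo], nums[hi]])
                lo += 1
                hi -= 1
                while lo < hi and nums[lo] == nums[lo - 1]:
                    lo += 1  # skip duplicate middle values
    return result
```

This is roughly the level of question Mike is arguing an architect shouldn't have to grind: a dozen lines once you know the trick, and easy to study for.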

Then, from there, that's just one piece of the overall process. I think where things went wrong is we said, "You can't pass LeetCode hard? Okay, well then obviously you're not good for the job." That's kind of the place where this all falls down.

Cory:

Yeah, yeah, yeah. I agree there big time.

What are you seeing in organizations where you've gone in... well, first off, I gotta know, have you ever just gone in and tried to like cheat your way through a system and just flat out fallen on your face? Like, "Damn, these people are tough to beat." Or like the questions are like... Have you been in that scenario before?

Mike:

Yeah, well, what I'd say is I've only unfortunately been able to do this a handful of times. So it's not something that I've done hundreds of times yet.

There's been one or two times where we've fallen down a bit. Where it's like the camera, for instance, just goes off and I'm like telling the guy, "Like, your camera fell off. I can't get it." And so he starts clearly trying to fix it. I'm like, "Stop doing that. Don't do that. Like, we don't want to alert them that we have it." So then he starts trying to communicate what's on the board to me.

So there have been times where it fails. But usually it's for a round of the interviews. So you get one person with poor feedback, like, "This person didn't do well" or "This person was acting strange." But again, most people aren't thinking, especially during the on-sites, that somebody is actively cheating in that interview. It's just not on people's radar. So it's amazing what you can get away with.

I used to be... fun fact... Before I wanted to do a physical therapy degree, I wanted to be a magician - which is just, like, one of the most nerdy things you can possibly say. But one of the cool parts about magic is a lot of people think it involves the hand being quicker than the eye. And really, it's not at all about that. You can be incredibly slow and steal somebody's watch if you have their attention directed elsewhere. So it's all about misdirection. And I think that applies a lot in these interviews as well.

You can get away with crazy stuff if people just aren't expecting and their attention is not focused on the right things. So I think that's just something to kind of keep in mind. Just making us a little bit more aware that it could happen is a huge deterrent for it actually happening in your interview process.

Cory:

That slow motion thing, it works. I do a lot of magic in my house.

I've got a sweet tooth and so do my children, but I don't want to feed them candy all day.

So I'll be like, "Man, there's a cookie and I want it. I can see it." And I'll be like, "Hey, guys, do you see that bird in the yard? Like, that crazy bird?" And then I'm just munching a cookie. If they turn around, the evidence is right here. I just got to keep their attention over there.

Mike:

No, over there. It's just behind the bushes.

Cory:

That's funny.

Okay, so, like, companies that are going about what you would consider good interview processes, what do these start to look like? How open or, like, looking forward to better interview processes, are most companies? Are companies aware that their interview process may be defunct or dropping good candidates, and they're trying to make it better? Or is it something that kind of has to be, like, brought to their attention by podcasts like this or your book?

Mike:

Yeah, well, so, you know, we can kind of separate this into two things. One is just like, how bad is the average company's interview process? And then how resistant is the average company to, you know, having somebody cheat through their interview process?

Let's talk just about the interview process in general, because I think what most people don't realize is I think if you were to take anyone and have them do an interview... and there's probably lots of people that are listening at the moment that have done a technical interview... Now, if you just sort of stop and ask yourself, like, would you consider yourself a good interviewer?

What's going to happen is you're going to find that 90% of the audience is like, "Yeah, I think I am a good interviewer." But this is like driving. You can't have 100% of people think they're good drivers. It's not a thing. There are bad drivers on the road. We know it's a true thing. So the hard thing about this is people don't realize that they're a bad interviewer.

And so you have everybody thinking they're a good interviewer, but many of them are bad. And you might be a bad interviewer because you're doing things wrong as an interviewer, but you could be a bad interviewer just because the process is set up against you. So if you're told to ask a specific question that's leaked online, like, it's not really so much that you're a bad interviewer, it's like the process itself is broken. Like, we have to make sure the process is right.

Cory:

Yeah.

Mike:

So if we fix the process, and we assume that you aren't asking questions that are leaked or questions that you know easily can just have an answer sort of figured out, then I think the next biggest thing to talk about is, as an interviewer, what are you doing?

A lot of people think that when you ask a question, you should change the question. You should be asking different questions all the time.

The problem is, it's really hard to know how to compare asking the same question to 10 different people versus 10 different questions to 10 people... Like, what you're going to end up doing is just a vibe check on people. If it's 10 different questions to 10 different people, it's going to wind up being, who did you like the most? Or who was lucky enough to get a question that was easy enough for them to answer?

Whereas if you ask the same question to 10 different people, you can develop a baseline and be like, relative to other people, how did this person do? So it's not really how did this person do? They could have bombed the question, but if they did much better than the average person, that's kind of going to result in, you know, a better score.

So from an interviewer perspective, I think 9 out of 10 places that I work at and do this sort of coaching for are asking different questions for every candidate. And it's crazy to me.

It's like, how can you possibly judge somebody properly if you're always asking, you know, different questions? It's hard to judge. So that's kind of one thing I'd say - bad interview process, bad interviewer.

And then there's sort of the AI-resistance piece, which is: are you doing basic checks in your process somewhere to see if somebody is using AI to cheat? Because honestly, if they cheated through the online coding assessment that you had, that's not a problem, provided you catch them in the on-site, where you're actually walking through more important things - like, how would you design a system? If you know for sure that they're not cheating in that scenario, or you're reasonably confident, I guess we'll say, it's less of an issue, because if they cheated through the LeetCode question you asked, it's actually just not that big a deal.

So, I think that'd be maybe how I'd phrase that. I want to pause there. Does that make sense?

Cory:

It does, and it's interesting because I feel like you get different things out of that. And I want to pop quiz you and get your feedback on our interview process.

Mike:

Okay, let's do it.

Cory:

A cornerstone of it... we'll get to it in a second... a cornerstone of it is I've never liked the idea of we want an apples to apples comparison between candidates, which is exactly what the 10 questions is. Right? And I'm like, people aren't apples. Like, we're all apples to oranges all day long. And to me accepting that is a part of trying to find the person that I want to be on this team.

But what's interesting about the way that you phrased it is, like, it's not necessarily that nine out of the ten people that got asked that question passed it and one person did not. It's like, what was that experience like? You might have nine people that just cheated, and you're like, "Damn, I've never seen somebody, you know, do a Three Sum so fast. Nine people just crushed it. And then this one person, they didn't really get it. They only got about halfway through it, but I could see how they learned. I could see how they thought about the problem. I could see how they refactored it. I could see how they thought about, like, the testing that was going into it." And it's like, "Oh, they didn't get the answer right," but I might be fine with that. I might be like, "You know what? I liked learning about how they got there more. So I want that person to move on to the next round just as much as these four of the nine people that got it right." These five people just... they got it right, but something about them was off, or something about the experience was off, or maybe it was just too quick, where I'm like, "Ah, sure, right." And so that is interesting.

Like kind of take into account not just the answer but like what that experience was like getting there. So I think that's really cool.

Do you want a pop quiz and let me just talk you through our interview process really quick? I would love... man, I might get my... well, I'm about to get schooled everybody.

Mike:

I'm happy to chat but actually one thing I want to just jump in and say before we start is, you know, I think there's lots of ways to skin a cat. I think... that's a terrible phrase by the way.

Cory:

There is.

Mike:

But yeah, there's ways to do it.

So I'm not trying to say there's the One True™ right way of doing things, but what I would say, just about what you actually mentioned there... like, I think there are great questions you can ask where somebody doesn't solve it optimally and you still pass them through. Like, I totally agree with that. There's good merit to that.

Now the only thing I again would want to just be careful of is, how are we discerning between you just liked the person and passed them because you liked them versus like some other external kind of metric. So that's the only thing I guess I'd maybe guard against in kind of what you said there.

But anyways, let's jump in and let's hear what the thought process is.

Cory:

Oh man, this is going to be rough.

So the first one, we just do that little 10-minute call that everybody does.

Mike:

Recruiter call or a technical call?

Cory:

Yes, the recruiter call. It's just kind of like trying to figure out like what they're interested in, yada, yada. Really, are they interested in the problem that we're solving?

And like that's one of the key things, right? And that's what we try to put into our interview process. Our interview process, but the way that I've interviewed for years, is like I want to know that this person is excited about what we're doing and that they have an interest in the product. Because I feel like when you don't, what you see three years later with that engineer is they're the engineer that's chasing the shiny baubles, right? They're the ones that are like, "Oh, I'm doing this because the engineering work is interesting, not because of the bottom line of the business." Right?

And so one of the key things for us is like, "You know, we're a dev tool. Are you interested in working on dev tools? Are you interested in solving the infrastructure problems?"

It's like trying to get at like the pain that they felt there and the interest there versus like do you want a well paying job that gets to solve hard technical things? Every engineer wants that, right? So that's kind of like our goal.

And then, after that, our interview process is pretty simple. It's two hours and it's not a take home per se.

We ask you to find an open source project that is in the realm of what we're doing, preferably something that you've used before to show that like you know the space, and we want you to find either something that's bugged the shit out of you about that or an open issue that you can work on.

And you don't tell us what it is, we have no idea what it is. We find out what it is at the beginning of your interview process. And so the only rule is you should have the repo set up ready to go. You should be familiar with the problem statement. We prefer that you don't work on the code ahead of time, but we actually don't know that. And solving the issue is not the goal of our test.

And then what we do is we ping-pong... pair program with the person. And everything we do is about making that person comfortable, right? Because the first thing in an interview is everybody's nervous. And so we're like, "Hey, we're going to pair program. But guess what? You get to pick how we do it. Are we doing ping-pong? Driver-navigator? Like, do you want us to code?" Like, I've been on interviews where I write the code, but the person's telling me what to solve, right? And I'm fine with that.

The whole thing is I want to come in and see what it's like for me being the new employee working with you that knows everything about the problem that we're trying to solve. And like, that is our interview process. Did you pick something interesting? Did you figure out, like, what the problem was, and can you eloquently communicate that to us?

Now we're a small company. This probably doesn't work, scaling up to the Google sizes of the world. But like, that's our interview and it is very, I'd say, "vibey", right? Like, only a few people end up interacting with the person, but the goal is to see how do you communicate a problem. Like, did you prepare for this interview?

When people come in and be like, "Oh, I have no idea," it's just like, "Okay, so you get to pick." And we tell people - like, they know what it is - we're going to pair program, you get to pick how we do it, and we're not going to play dumb; we genuinely have no idea what we're working on. I'm the new employee, you're the expert, right? And like, that's it.

So, like, we feel it works pretty well. We see people, we see how they work, we see what they're interested in. We're asking questions, but, like, off the cuff. It's like, "Oh, what is that? Like, I'm not as familiar with the Golang syntax." "Yeah, so you picked something from Kubernetes or Terraform - this is great. Like, what is that?" And then you'll see them explain interfaces or talk about it. It's like, "Okay, they understand how the language works. They understand interesting projects and problems to solve in the space. This person gets what we're trying to do."

I know you can Google and LLM shit. What I want to know is that you're going to fit in with the team and that you can communicate with other people that don't know what's going on, which is going to be the case for the rest of your time here.

Mike:

Right.

Cory:

So that's our process. I would love... I know you've got to go through it. I'll go through it with you one day, maybe once we reach our Series A.

But like off the cuff, what am I doing wrong?

Mike:

Okay, well, before I critique, let me ask just a series of questions because I think depending on what you say would actually dictate kind of what matters.

So the first one is - Does language matter? Can they use any language? Go, you know, whatever.

Cory:

We generally ask that they use a programming language that either we use or a tool in the space uses. And then what happens is somebody on the team that is familiar with that language will end up doing that session with them.

Mike:

Okay, cool. So we're rotating the people that do it as well.

Cory:

Yeah.

Mike:

Okay. Okay, cool.

And then, just when it comes to the actual process, do you care if they, you know, consult with an LLM during the time or Google while working with you? I assume that's actually totally okay with you, right?

Cory:

Yep. Yeah.

Mike:

Okay, cool. I like that.

And then, just the biggie here for me and the biggest sort of sticking point that I'd have with your process has to do with what position is this for?

Because, you know, how you're going to evaluate a junior engineer versus how you're going to evaluate the most senior... like, the tech lead on your team? Like, I think it should be different. And there could be certain mechanics that are the same, but how we evaluate them, I think, needs to be different.

Do you universally apply this template for every role, or is this for a very specific role and only that one role?

Cory:

It's usually for what we would consider like a... I don't want to say tenured engineer, but like an established engineer. Right? Not a Distinguished Engineer, like at Google.

But like juniors we have a lot more leniency towards. I actually love juniors. I love apprenticeships. I love bootcamps. I think they're amazing. One of my favorite engineers I've ever hired was from a bootcamp.

This interview process we developed... I co-developed it with somebody like ten years ago at a different company... I interviewed this person. They came in and it was the most ballsy interview I think I've ever seen. They sat down and they were like, "I don't want to write any code." And I'm like, "Okay." And they're like, "I use this JavaScript library and its documentation sucks ass. I want to go through and build a bunch of examples of how to use it and work through their docs." And I was like, "You're fucking hired." Like, I didn't say that, but I was just like, "Dude, that's a 10x engineer." He's like, "This is hard for me to use, hard for everybody to use. And my whole goal is to make this easier and more well documented." And I'm just like, "Okay, let's do it." And it was fun to work on.

But I think there's real value in bootcamps. We need more software engineers and those people don't have a traditional computer science background. I think they're some of the ones that suffer the most in this algorithmic nightmare of getting a job. Right? And so those we tend to be a lot more like chill with and the interview process is a very similar interview process.

We still want to see like what their melding into the machine looks like with us, but it's not as... I mean, it's never really technical for us. It is very... sociotechnical is kind of what we're aiming for. So it's like, I don't mind if you're getting the code... now, if you just can't fucking type the language that you said you know, it's like, "Oh, this person might not know TypeScript."

Yeah, so there's a line, but it's just like, I'm not there looking to see, did you nail this on the first go? Did your test pass on the first try? I'm looking for, what does that style look like? And in the scenario where it's a junior, I'm looking for them to actually be open with what they don't know. And so seeing what they LLM or Google is helpful to me. And seeing what they ask back, like, "Hey, I don't know - what patterns do you think would actually work well here?" It's like, "Oh, this is a person who's looking to learn. They're not just sitting there cornered, saying, 'I'm just going to guess through this by changing this 200 times in a row.'"

And I think like that's the stuff that makes good teams, right? It's like I know that we can all communicate well together. Doesn't really matter to me what your technical prowess is. Like if you have technical chops, you can grow those and you can learn. It's hard to learn how to be an effective communicator that works well with a team.

And I think that's one of the things where it's like people have criticized us before. They're like, "Oh, it's not... it's very apples to oranges." And I'm like, "I'm fine with that." I'm trying to build people that fit well in this business and that's always going to be about communication, especially in a remote world. And if we struggle with that part, doesn't matter how good of a developer you are, if you can't communicate it, you can't work with people remotely, you can't work with your team.

So yeah, sorry, that was a very long answer for the junior folks out there.

Mike:

No, no, actually. So for one, I totally agree. I think some of the best engineers I've ever worked with are self-taught and sometimes bootcamp grads as well. Because the thing is they're curious, you know. So of course that's going to win. If you're interested and you just want to... like because you find it annoying or something like that... like of course that's a huge thing to go for.

So as far as like back to the interview process as a whole. Again, I think the smaller the company, the more you can tailor the process and the more you can deviate from the data structures and algorithms sort of thing. So despite what many people maybe would assume I'd say - like where's the LeetCode? - that's not what I'm going to say.

I think I do really like the process. It sounds like the two things that would maybe be the biggest pain points of your process would be, again, partially the per-position piece. It's really hard to tell if somebody's senior if they're not doing some amount of both navigating and driving. So I think maybe enforcing, for certain levels, some amount of split - like, you have to do some amount of writing code, and you have to do some amount of driving as well. Maybe something like that could help. And just sort of leveling people and doing slightly different things based on the position - it seems like maybe there could be some use there.

The other big thing that seems... and I don't know if you would agree with this or not, so you tell me... but if you've ever seen a debate online, you see somebody that's giving bad points but has won the audience. Not because the points are good, but because like they said something witty or something like that. This is the problem with debates. It's like you don't win if you are the smartest person. You win if you, you entertain the crowd the most.

And I think the problem with the way you've structured it, the biggest risk you have, is somebody that just comes in with a kick-ass idea... pardon my French, I know I'm swearing... and it's like, okay, so they just went the whole time iterating on what's the best, most kick-ass idea for getting through.

And then maybe technically they really kind of suck. But they're like, "I've got a good idea," and you want to love the person because the idea was so great.

So I think maybe that's kind of one of the biggest potential problems with that.

Do you think that's actually a weakness on your end or do you think you sort of suss out within two hours how technical they are?

Cory:

I think they get a good idea of it.

And so, like, even when we're doing the driver-nav... so for people who aren't familiar with pair programming, driver-nav is like one person's typing but the other person's the brain, right? And I've been in that scenario where I'll type all the stuff. We do like to do ping-pong and go back and forth.

So we do want to see their hands on keyboard. That's usually what I prefer. Some people don't want to do it at all.

And they're like, "I'll just do the coding and you can just, like, shoulder surf." And that's fine too. Like, my goal is, I don't want you typing nervous about the debate. Like, I want the debate, right? It's like, I want them in the coolest, calmest position possible so I can see who they are, right?

And I know that I'm never going to get 100%, but that's the goal: get it as close to that as possible. I think one of the things that could be hard with this as we scale is that only, you know, two or three people are doing the interviewing today.

I am not a person that gets attracted to shiny baubles. So just because you show up with an interesting problem, I'm not going to be like, "Oh, cool, this is an interesting problem." Like, I don't care. My interesting problem is, how are you going to fit into this team? But I could see this going down the line.

We've got two, three, four, five principals that are now hiring. And, you know, one of them is just like, oh, this is a really cool idea. And it's just like, I can see that falling apart. Right.

And I feel like it's one of those things that's hard: in any interview process, how do you have benchmarks across interviewers? Like, what is the rate at which their hires work out?

And I don't even know how far you go with this. Yeah, right.

Like, I feel like that's something we don't see a lot, is like, hey, this person's been doing all of our hiring, and a hundred percent of the people that they hire are gone within six months. It's like, oh, maybe you've got a bad interviewer or a bad interview process there.

And it's hard to get to that point until you've reached scale. Right.

Mike:

Absolutely. If I can just interrupt to say: you mentioned, like, you don't get distracted by shiny things.

But I suppose the reason I made that point was actually to illustrate that I thought you did, given your example. Because your example was, this person came up with a great idea to improve the documentation, and they had the job at that point.

And I understand you're being a little tongue-in-cheek with that, but I guess that's, again, the point. If I can come up with just an idea that gets me the job, that's inherently resistant to, like, any engineering probing in the interview.

So that's maybe, I guess, what my point was there.

Cory:

Oh, I meant more like, I was amazed at how brave this person was, coming into the interview being like, we're going to write docs today.

And I mean, we still had to go through and do it. So I got to see them use the library, I got to see them build out the examples, explain how it worked.

But it was one of those things in my head where I'm like, I've never had anybody show up and be like, we're not going to do code today, we're going to do documentation.

One person out of probably 200 or 300 people, you know, that I've hired over the years. I was like, wow, that was awesome. And the person ended up being fantastic. Their entire tenure was like that, working with them.

But yeah, as far as, like, will the problem be super interesting? I think it matters, because, you know, people will come in and they'll be like, hey, the front end's in JavaScript.

Like, let's do D3 and make a bunch of crazy diagrams. Or not diagrams, but, like, graphs and weird stuff.

And it's like, okay, that was you wanting to play with that library. Versus, well, we'll tell people, here's a bunch of open source libraries that we use. If you want to pick one of these, great. Don't feel like you have to, as long as it's within the realm of what we're doing. And I think that's important to us.

Again, we're at a later stage, because if they can't grasp that from, like, the docs on the site and the conversation, and they just show up with something that's just way out there, it's just like, maybe you don't get it.

Mike:

If you don't mind me just interrupting again, just to mention that one of the great things about the way your process is set up is that it's highly resistant to cheating.

You know, it has very little to do with knowing the right answer or the optimal solution to something, or needing to reference something.

Like, if you want to use an AI, you can. And just a little while ago, actually I think this morning, Meta announced that they were going to change parts of their process to now include AI within the interview. So you can use AI within some interviews.

Cory:

I've seen like some Fortune 500s that are allowing people to use it during interviews.

Mike:

Yeah, yeah. So there are some big places doing it too. And I think at scale, this again used to be a problem that was very hard to solve.

How do you have a pair programming experience in some sort of controlled environment, maybe with some other engineer, but at scale, where the whole thing doesn't get leaked real quick and become pointless?

But with AI, you can probably interject a lot of things. Like, hey, have an AI go into your repo and introduce a bug, and the interviewer comes in knowing what the bug is ahead of time, or maybe doesn't even. You have some sort of fork of it, and then you and one other person need to go pair program your way through the bug.

That's another process very similar to yours that I've seen some places use that I just love. So I definitely think, again, depending on the scale of the company, these more custom things work really well.

And I think if you were to take the same process and maybe have it on your own system, and then gave them maybe just a little bit tighter parameters so you could compare apples to apples, I think you would have an even stronger signal. But transparently, I think your process is pretty solid.

Cory:

I passed. Or he's being nice. As soon as we turn the video off, he's like, dude, you guys suck.

Mike:

I want the 20 bucks, please.

Cory:

Okay, let me venmo you real quick.

Awesome. I know we're over on time. This has been so much fun. I feel like I could talk to you for like two or three hours.

But let's get back to the book. So, you wrote Beyond Cracking the Coding Interview. It's out now, you can grab it now. And it is the official sequel to the OG, right?

Okay, so it's the official sequel. So people can grab that on Amazon, all the different book sites. Are you self-publishing it? How are you distributing it?

Mike:

Yeah, it's on Amazon. It's about how to crack these traditional coding-style interviews. If you suck at them, I can teach you to suck less at them.

And honestly, that's part of the book, but the other part of the book is, you know, all about salary negotiation, the soft stuff. Like, how do you get noticed? How do you even get the interview in the first place?

How do you manage the black hole that is LinkedIn, you know, for trying to get the applications in, and all that kind of good stuff as well. So there's tons of other stuff in the book, even if you don't really care about...

Cory:

...the data structures and algorithms. Salary negotiation is key. I've met so many engineers that just accept the first offer.

Mike:

Yes.

Cory:

Like, man. Hey, do you actually have a couple more minutes?

There's one other question that kind of ties into this that I would love to know the answer to. I feel like it is hard, like it has always been hard, to get a job in this space, and I feel like it's getting harder to get a job in this space.

And you know, I've seen people that are like, oh, I've been on 30 interviews, 40 interviews, and I can't imagine the toll of that stress. That's one of the things that we probably over-index for, making ours as stress-free as possible.

But like these people are coming into these jobs extremely stressed, right? Especially if they've been out of work for six months, right?

And they're like, fuck, I've got this hard thing I've gotta get past, you know, to get in here. And then you get the salary and you're like, they offered me the job, I'll just take it. But there's flex. I mean, even in the Googles, right?

I remember I applied at Google and they were like, we offer everybody the same shit. And I was like, this doesn't work. It was just way below my baseline and I was like, ah, it doesn't actually work for me.

And then all of a sudden the offer changed. They were legitimately like, we don't negotiate. And I was like, okay, well then I pass. And then a negotiation came back.

Mike:

Funny how.

Cory:

Still not enough. It still wasn't enough. And so, you know, I didn't go. But you can negotiate, and I feel like you should be able to. So I love that.

So how much do you get into negotiation? Like, do you have any metrics on what is a valid amount to push back?

Or is it just based off of what you assume your worth is and what your requirements are?

Mike:

Sure, there's so many things to unpack there.

So for one, just as far as depth goes, we have two separate chapters on it, and one of my co-authors, who's the head of interviewing.io, wrote them. She's amazing, and she goes into a lot of depth. You know, in JavaScript we have those footguns, where it's so easy to screw yourself up.

This is the salary negotiation version of that: what are all the footguns you're going to run into, where you shoot yourself in the foot before you even get to the point where you think you're negotiating? There are so many things people do right off the bat that really set themselves up for failure in the negotiation.

Just one kind of quick example of that would be something like, you know, a lot of people are like, oh, I'm interviewing at Facebook. Should I tell Google that I'm interviewing at Facebook? And they give too much information away.

The problem is, as soon as you tell them... maybe that's a good thing, and maybe that is a data point you want to share, but when you share really matters. Because they can say, okay, great, but we can't interview you for four weeks. So in four weeks, come back to us, let us know if you passed the Meta thing, and then go through.

So you'll find recruiters start gatekeeping people based on the information they've shared. And I think that's just one of the examples that tend to really mess people up.

Cory:

And also the concern of, like, am I going to have to go into a negotiation battle with Facebook?

Mike:

Exactly.

Cory:

Yeah, yeah, yeah.

I mean, because they know we're all interviewing for two or three jobs at the same time, but, like, bringing that in at the negotiation phase versus when I just met you all.

Mike:

A hundred percent. And that's even more compounded by the fact that a lot of people take at face value exactly what's said. So in your example, if a company says we can't negotiate, it's like, well, it kind of depends. How well did you do in the interview?

Have you done interviews yet?

Because before interviews, I find most companies tell me you can't negotiate. Then after an interview, they're like, okay, we want to make this person an offer. And it's like, are you really telling me you're not going to give me an extra $5,000 just because, like, that's the line?

It's like, that's never the case. So...

Cory:

Yeah, yeah, especially once you get to that offer phase, right? They've knocked a bunch of people out. They've spent a bunch of time getting there.

Hell, CarMax will negotiate if you walk off the lot.

Mike:

Absolutely. Well, Google did.

There's some data that says that for every employee Google hires, they spend about a million dollars on recruiting costs and, like, costs of engineer time and all this stuff to filter people down. I don't know if that's actually an accurate number, but the concept is still true.

They spend so much money per employee that by the time they want to give you the offer, you have a lot more leverage. And then if you have multiple offers and you've sort of timed that right...

And there's this whole other process of how do you time your interviews so that all of the offers land in a specific place? We have a whole chapter on that. So there's. There's just a lot more to it than I think a lot of people think.

Cory:

Yeah, yeah, that's a chapter to read if you're interviewing.

I feel like the other thing that's hard with interviewing... I remember when I was interviewing for Google, I was interviewing for another job as well at the same time. And I did exactly that. I mentioned it very early on because I wanted to look smart. Like, holy shit, this guy is in, like, second, third rounds at Google.

It's like, yeah, I'm in second, third round at Google, you guys better move quickly. And some of them were like, oh, we'll wait that out, right?

You're already so far with Google, we're not going to be able to compete with Google.

Mike:

And it's like sharing too much information too soon.

Cory:

Baseline salary. You definitely can now. Total comp. No baseline salary, you can kick their ass.

Mike:

Right?

Cory:

But, yeah, tied into that, I feel like people can feel forced to accept an offer, especially if they've been in a bad place.

And I feel like there are so many people I see on different subreddits talking about how painful the interview process has been. Some people are applying for 100 jobs, have been on 30 different interviews, and they're still looking for work. What can you do if you're in that spot?

And these are good engineers, people that can do the job. It's just there are so many people looking for work right now. How can you stand out?

If you're 15 interviews in and you're just not getting anywhere, is there any... I don't want to say tricks, and I definitely don't want to say cheat codes, because we're not teaching people how to cheat.

But how can you get onto people's radars to get into some of these interview processes you want to be in, when you can't make it through the first, you know, gatekeeper?

Mike:

A few thoughts on that. The first and most important one to mention is that I think level-setting expectations really matters.

So again, somebody that interviewed, let's say, six years ago: different job market, wildly different. You barely try and you get 20 places that want to interview you. You send out 25 applications, 22 want to interview you. Of the 22, you get six offers.

the average tech employee in:

Now, again, I do sort of private interview coaching. And with the average person that I work with, we try and get through about 50 to 100 applications per week.

That is just a wildly different thing. And then at the end of about two months, most people have interviewed at about 10 companies.

Now what I find is, you know, when you start doing the math on that, that's a lot of companies to have done applications for. There are certainly ways to improve that chance.

So some things are like, okay, you know, I went to University of Michigan, and I have people at Salesforce that also went to University of Michigan. So let's go contact, not a recruiter, but an actual person that worked there, and be like, hey, you work at the place that I want to work.

Like, let's actually make that connection and go that way. I feel like people want to just click a button to get through this, but really, this is more than ever about who you know.

And I don't mean that in the sense of, you know, somebody that went to an Ivy League school. I just mean finding and leveraging the best parts of your network. And people always say that the time you most need your network is when you're least able to build it.

And a lot of people are in the case where they need to build their network over the course of a couple of months before they start consistently getting leads coming through.

Cory:

Yeah, not to be that old cheesy guy that went through an MBA, but, like, your network is your net worth, right? And I feel like that's hard to hear, but it's also hard for a handful, maybe a bigger handful, of engineers that aren't necessarily as social as their business counterparts, right? But it is important. And I actually love that.

I mean, I feel like most of my jobs have been like, hey, I know we went to USF together. Do you know somebody? Who's hiring? I saw this role and I'm very interested in it. Who can I talk to there?

This is one of my tricks: who can I talk to there about the role, to make sure it's a good fit before I apply, right?

And now I've jumped over the recruiter and I'm talking to the hiring manager, maybe their boss, about the role, trying to figure out, is coming here a good fit for me, should I even bother applying. And now I'm on their radar. But it's hard to do cold emailing.

Mike:

And I don't even think it's a trick. I think that's actually a useful tactic. It's one of the ones that, again, we talk about in the book. You've got to skip over the recruiter.

The recruiter gets, you know, a thousand applicants for that job. The best thing for you to do is to find the hiring manager, the person that's actually looking to do the hiring, and get on their radar. And there's a couple of ways to do that. Again, cold emails is one. LinkedIn's another.

Like, there are other ways to do that, but you've got to get through the barricade.

Like, there are other ways to do that, but, like, you got to get through the barricade. And just trying to, like, stand out by tweaking your resume, I think is one of the worst uses of your time.

A lot of people are like, well, what do you think of resume reviews? I mean, resumes, in my opinion, are dead at this point. It just doesn't matter if you have Google or Harvard on your resume.

It just doesn't matter. Your resume doesn't matter.

Like, you've got to get past the point where you're trying to just send your resume into a black hole, because that's where it's going at this point.

Cory:

Dead Internet theory is just bots sending resumes to recruiters. Like, you're in the wash right now, right? Absolutely. Oh, my gosh. All right, Mike.

I'm sorry I took so much extra time. This has been super fun. This is something I'm very passionate about. I feel like the interview process has been so hard for such a long time. Definitely check out the book Beyond Cracking the Coding Interview. We'll put a link in the show notes. Where else can people find you online?

Where can people learn about your coaching?

Mike:

Beyond Cracking the Coding Interview is itself a website as well, or beyondctci, which is a shorter version of that. And then my name, Mike Mroczka. That's Polish: M-R-O-C-Z-K-A.

And honestly it's probably easiest to find my name on the book and then go from there because there's no way you're going to get that on the first listen.

Cory:

So you all missed the first eight minutes of the call where I just botched his name over and over again trying to get it right. That's the throwback there. Mike, awesome. So great to have you on today.

Thanks for coming on the show and yeah, look forward to seeing more of your work out there.


About the Podcast

Platform Engineering Podcast
The Platform Engineering Podcast is a show about the real work of building and running internal platforms — hosted by Cory O’Daniel, longtime infrastructure and software engineer, and CEO/cofounder of Massdriver.

Each episode features candid conversations with the engineers, leads, and builders shaping platform engineering today. Topics range from org structure and team ownership to infrastructure design, developer experience, and the tradeoffs behind every “it depends.”

Cory brings two decades of experience building platforms — and now spends his time thinking about how teams scale infrastructure without creating bottlenecks or burning out ops. This podcast isn’t about trends. It’s about how platform engineering actually works inside real companies.

Whether you're deep into Terraform/OpenTofu modules, building golden paths, or just trying to keep your platform from becoming a dumpster fire — you’ll probably find something useful here.