
Yes. I've interviewed "developers" who failed FizzBuzz, and one who took 20 minutes to come up with a solution. I've never even specified a required language for it, and generally accepted pseudo-code.
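For reference, this is roughly the level being screened for; a minimal sketch of the canonical FizzBuzz (any pseudo-code along these lines would have been accepted):

```python
# Classic FizzBuzz: print 1..100, "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, "FizzBuzz" for multiples of both.
for n in range(1, 101):
    if n % 15 == 0:
        print("FizzBuzz")
    elif n % 3 == 0:
        print("Fizz")
    elif n % 5 == 0:
        print("Buzz")
    else:
        print(n)
```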


Not to mention that discrimination in hiring decreases opportunities for minorities. Which in turn increases recidivism.


If you want a job at a bank (perhaps because they work on hard problems or because they pay a shit-ton), maybe not the best idea unless you intend never to roll up your sleeves or wear short sleeves. (Maybe short sleeves aren't an option at a bank anyway?) Happy to work at a startup? Go for it.


The problem is that if just two or three people like the lively, chatty open office, it ruins it for everyone else whose productivity suffers by an order of magnitude in a noisy environment.

It's basically impossible these days to find an office where someone who needs quiet to concentrate can get work done effectively.

Whereas in an office where quiet is the norm, you can always go to the kitchen if you really want to chat.


Well, you can always buy noise-cancelling headphones, though I personally hate open office space too. I would rather work for less money in a normal office than for more in an open office.


There are several problems with this, namely: they don't work as well for loud conversation (they're more for things like the drone of an airplane engine). They're also not necessarily comfortable to wear for long periods of time. But most importantly, why is it so hard for people to just, respectfully, shut the hell up? I work (and have worked) with people who practically yell during Google Hangouts and don't go to conference rooms, or who joke around all day. I've worked in three separate open environments, and a handful of people have always had volume issues which ruin it for everyone else. Luckily I can work remote now, but why is it so hard to be considerate of others?


I agree. Unfortunately, loud people want to be loud, and you have a problem if you want quiet.

Also I hate loud music in every cafe/store.

Why are people so scared of a bit of quiet reflection/concentration?


There will always be inconsiderate people. Even if it's only one out of 20, that's more than enough in a big open space, and you can't really do anything about it. It's just human nature, and it's foolish to expect there won't be a person like that on a big team.

It was the same in every open space I worked in over those many years. Luckily, at my last job we had an internal messenger and in-person communication (at least within our team) was discouraged, so I had no problem living in my own environment with headphones.


I see these challenges as a great way for excellent experienced developers to weed out incompetent companies.

I'm a kick-ass get-things-done full-stack web engineer. I've never had to deal with one of these sorts of problems in my day to day work; and if I did, I'd just find an existing, tested, stable library that already handled them.

A company that needs someone to solve these sorts of problems doesn't want me on their team in the first place, nor would I thrive there. A company that just needs to build damn good web apps is losing out by using these sorts of questions in their interviews.

The best interview challenge I've had (actually, it was a take-home, with discussion in the interview proper) was about designing code for re-use and extension. It was a great indicator of the company's practical and mature approach to engineering, and of what they really wanted this hire to accomplish.


> I'm a kick-ass get-things-done full-stack web engineer.

And modest, too. If an engineer gave me your answer ("I never learned the principle because I never had to") I would know they aren't a fit for my team.


>If an engineer gave me your answer ("I never learned the principle because I never had to") I would know they aren't a fit for my team

So we should learn all the things, ahead of time, just in case we get an interview question at some point in life?


No, it's the attitude. Saying "I don't know depth first search" is fine, saying "I'll never need this and by asking it you've revealed what a terrible company you are" is sour grapes.


Not revealed as a terrible company, perhaps, but as a terrible interviewer.

If any company were to quiz me on algorithmic basics, it had better explain to me beforehand why it is among the x% of all hiring companies that actually need to roll their own new solutions in the face of so many well-established libraries.

That is, before you ask me to demonstrate a depth-first search, you had better explain to me why I'm going to need to be doing that instead of just writing an SQL query and tweaking an index, which is likely what I would be doing at most companies.

Part of development is figuring out not just the answers to the questions, but also figuring out "Of all the questions I could have been asked, why was I asked that question?"

A disappointingly large fraction of the time, the answer to "why did you ask that question?" is "we noticed a correlation, confused it for causation, and built an entire strategy around it".


> need to roll their own new solutions in the face of so many well-established libraries

I'm not defending interview quiz-time, but it isn't (or shouldn't be) about rolling your own solution. It's about understanding concepts. Understanding the basics of time/space complexity is pretty fundamental when designing systems.

Developers frequently encounter hashing and (probably less often) trees, so I don't see an issue with asking something like "Why/how/when is a hash-based lookup faster <in some situation> than a tree-based lookup?" (as one question during an interview). If a person can give a good answer to this, they'll pretty easily be able to understand the performance tradeoffs of hash vs btree indexes when using MySQL (and this same tradeoff in a variety of other situations).
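To make that tradeoff concrete, here's a rough Python sketch (my own toy example, not anything from a real system): a dict stands in for the hash case, and the standard bisect module over a sorted list stands in for a tree-like ordered structure. Point lookups favor the hash; ordered/range queries favor the tree, which is essentially the hash-vs-btree index tradeoff.

```python
import bisect

users_by_id = {42: "alice", 7: "bob", 99: "carol"}   # hash-based: O(1) average point lookup
sorted_ids = sorted(users_by_id)                      # ordered structure: O(log n) to locate a key

# Point lookup: the hash wins.
name = users_by_id.get(42)

# Range query ("ids between 10 and 100"): the ordered structure wins,
# because a hash table has no notion of "next largest key".
lo = bisect.bisect_left(sorted_ids, 10)
hi = bisect.bisect_right(sorted_ids, 100)
ids_in_range = sorted_ids[lo:hi]                      # [42, 99]
```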


If you're measuring my design ability in the interview, you had better give me an opportunity to use that ability after being hired.

Too many times have I been asked questions that shaped my expectations of the job, only to be disappointed later on. The worst offender in this respect put me through a technical screen that could only reasonably be passed by someone with a bachelor's degree in CompSci or equivalent work experience, only to later tell me that the code I write "should be understandable by a kid fresh out of high school" (actual quote). I applied to 12 different job postings that night.

The interview led me to believe that I was being hired for my expertise, and the job expectation was actually to be a warm, brainless body in a formerly empty seat. That's why I don't like questions that act as proxies for some other metric. The questions you ask me are telling me about you as much as my answers tell you about me. When you cargo-cult interview procedures from another company, you are actually misrepresenting the nature of your own company as being like the company you stole your interviews from.


> the code I write "should be understandable by a kid fresh out of high school"

tl;dr: "you're going to be replaced by a kid fresh out of high school so you might as well make it easier on us to do it"


Not quite. I was probably actually replaced by someone fresh out of college, who failed to learn anything during those 5 years. I am 99% certain the contract requirement mandated bachelor's degrees (and in a relevant field) for all software developers.


> "before you ask me to demonstrate a depth-first search, you had better explain to me why I'm going to need to be doing that instead of just writing an SQL query and tweaking an index"

How am I going to ask you to tweak an index if you don't know how to traverse a tree? Although I would have asked you about B+ and B* trees, the ones used to index a database. There are differences between those trees, and you need to know them in order to decide which one is the better option for improving query performance. Obviously this improvement means nothing for a small startup, but imagine the impact it has at a company like Google.

I think the point is that a lot of interviewers ask those questions because big companies ask them. But there is a reason why they ask them. Obviously I would prefer the candidate who knows the answer over another who can also do the job but doesn't know the answer.


How big is the chance that you select the candidate who happens to know all the solutions (e.g. by recently learning them by heart for the dozens of interviews he's planning on doing), but is not a good technical fit? My estimate: pretty high. Let me explain why.

If you indeed need someone who knows about B* vs B+ trees, why not ask him about that separately ("explain to me the difference between ..."), to see if he has the technical background you need? Even if someone understands that difference, that same person might have problems getting DFS right during a live interview, for a number of completely irrelevant reasons (nervousness, a momentary lapse, being a bit "rusty", and so on). Knowing the difference between a B+ and B* index and operating a database correctly doesn't require the ability to implement DFS flawlessly (and most likely will never require taking a shot at it, for that matter...)

I think the real problem is that some people don't seem to understand the goal of these questions. If you do algo whiteboard questions (and you should!), you should measure the candidate's behavior, reactions, and analytical skills while they attempt a solution with you - not whether the provided solution is correct (indeed, it's often more interesting when it isn't and you work with the candidate on locating the issue!). And all the while, the interviewer can be figuring out whether he/she wants to work with that person, based on the interaction.


I was not talking about implementing anything.

I disagree with you on a lot of things. I don't like whiteboards because they are stressful and feel strange if you don't use them regularly. I was a teacher who used whiteboards daily, and they are completely different from using a computer. I would rather ask the candidate to talk about one of his projects and start asking relevant questions related to the job position, applied to a project he knows, and have a discussion that way. The interaction you describe only biases your decision towards his personality rather than his skills, towards whether he does things the way I would do them.

But that is my opinion; the person who is hiring is the one who decides who he wants to hire. I feel more confident in finding a good fit/candidate my way than the way you described.


"Perhaps you won't, but since we don't know yet exactly what you'll be working on, we might know broadly what PA or even project, but we can't know what problems you will encounter or what direction it will take to debug any problems that arise, we want someone with a broad base of skills who can at the very least recognize performance problems and solve them in a simple case. We expect that if you can solve this relatively simple problem in an environment with no resources, that with the aide of documentation, profiling tools, and teammates to lean on, you'll be able to address much more complex issues that arise. On the other hand, if it takes you documentation and teammates to solve this simple case, who knows what kinds of tools it will take you to solve real world problems that arise."

"Correlation does not imply causation" doesn't imply that correlation never implies causation.


I think you mean to say that correlation and causation correlate.


> If any company were to quiz me on algorithmic basics, it had better explain to me beforehand why it is among the x% of all hiring companies that actually need to roll their own new solutions in the face of so many well-established libraries.

Please suggest a reasonable alternative then. I need to interview people and see if they are going to be capable of digging through complicated code, of coding things reasonably quickly, of writing scalable, robust code, of potentially digging into things enough to optimize their performance, etc., as well as I reasonably can in an hour. And I need enough concrete evidence that everyone else in the debrief believes me. I would love to change the way I do things, but I am held accountable for the interview, so I can't just show up to the debriefs saying crap like, "He says he can use a library for anything that comes up." And I need to evaluate some other soft-skills type stuff, like caring about the customer, communicating well, delivering more if you see something extra that you think should be done, etc.

But the obvious answer to your question would be, "because if everything we do could be solved just by using a library, we'd have interns or offshore people do it for a fraction of the pay."


As prirun says, you have a conversation, not a quiz.

Quizzes are a terrible way to assess skill; they're used in academia because they're, sadly, the only quick way to get some sort of consistent read on memory and information retention across a group of dozens of students. No intelligent company should be copying this; companies have the luxury of dedicating significant time to exploring the potential of each candidate.

You say "Hey, here's a real problem that we'd need to solve. How would you go about it?" And let them talk. They'll be able to speak in informed terms about how to approach the problem even if their algorithms memory is rusty or if they're self-taught. If it sounds like they're on the right track, you win, if not, you move on. Repeat until end of interview, evaluate at the end.

I've used a basic code competency test for an interview pre-screen before. This is a take-home project that should take a competent person about 30 minutes, 2 hours if they go all out and add a bunch of bells and whistles.

People say this, that their take-homes are non-intrusive, but don't mean it. I mean it. The quiz is seeing if they can make a single API call to a prominent API and return the results according to a very simple format.
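For illustration only (the real pre-screen uses a different API and a different output format), a passing submission is roughly this small. Here GitHub's public API stands in as the "prominent API", and the names are my own:

```python
# Hypothetical pre-screen sketch: call one well-known public API and
# print the results in a simple fixed format (one line per repo).
import json
import urllib.request

def list_repos(user):
    url = f"https://api.github.com/users/{user}/repos"
    with urllib.request.urlopen(url) as resp:
        repos = json.load(resp)
    return [f"{r['name']}: {r['stargazers_count']} stars" for r in repos]

if __name__ == "__main__":
    for line in list_repos("octocat"):
        print(line)
```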

These pre-screens should be well below the competency requirement, which can only be assessed in the interview. It's essentially a fizzbuzz that they can't easily copy and paste by searching for "solutions to fizzbuzz".

Other than an extremely simple pre-screen like that, your decision should be based on their ability to reason, discuss, and solve problems on the fly, not how well they remember the particulars of compsci curriculum. That selects for the recency of their last class, which, ironically, is usually inversely correlated with their much more valuable real-world experience and what people think they're trying to test for.

This doesn't apply if you actually are doing deeply theoretical work on the cutting edge of compsci that may require frequent cooperation with academics. But that's a tiny minority of positions.


I love this; it's a good insight into what a hiring company wants to know about a potential employee. So how do you get there? Here's my suggestion:

There are 2 very distinct (in my mind) kinds of skills in question. For the "soft skills", a manager-type could and maybe should do that part of the interview. A trusted tech person should do the technical part. I'd go so far as to say that every technical person on the staff should be trained / groomed to help with interviews. So maybe you have 1 tech person do the tech interview, and a 2nd is learning how to interview.

For the interview itself, the process I'd use is to ask a lot of questions. For the soft interview, things like:

- "Tell me about a difficult customer you've had." Let them explain a while, and ask lots of probing questions: "Why didn't you do X instead?", etc.

- "Tell me about a difficult problem you had with a former employer." Same drill.

For the tech part, bring some code to the interview, ask the interviewee to bring some code to discuss. Another option would be to just throw them into a source code tree and say something like "Here's a tree of one of our projects. Talk to me about it." Maybe I'm wrong, but I think I could tell a lot about a tech person just from how they approached being thrown into a mess with no prior information. After all, that's a large part of being a good developer IMO. As they start to figure things out, look at code, etc., ask them questions about what they're seeing. Let them ask too. If you run into something interesting, like a specialized algorithm, ask them about it: "Why do you think it was done this way? What would some other options be?"

Then let them do these things for code they have previously written. For example, I wrote a Prime minicomputer emulator. I'd love an interview where they asked me about why I did this project, what problems I ran into, what were some alternative designs for tricky problems, what were the tricky problems, what did I learn from doing this, what tools did I use to do it, etc.


>Another option would be to just throw them into a source code tree and say something like "Here's a tree of one of our projects. Talk to me about it."

"Its in perl, I don't know perl, and especially not your internally modified version of custom-magic perl that Steve wrote 7 years ago."

Not to mention that you are now either showing source code trees to random potential hires, or you have to audit/create/otherwise use some potential set of source code. Maybe you prescreen by asking them their favorite language, and you come in with an open source project, in their language of choice, but now you have to have one of your devs spend time familiarizing themselves with Redis or the Python interpreter or Hibernate Core or Angular or whatever, and what happens when they ask to do the interview in Haskell?

FWIW, I know some companies that do the interviews you're describing, but they're all relatively small (<100 employees), and they all do that kind of interview only after a technical phone screen with your conventional questions, because the time investment required by the company is so great.


A huge amount of coding is reading code.

These tests don't test digging through code.

I've worked on projects where people roll their own solutions instead of using stuff that's out there, and it's not great either.

The real solution I guess is talking to people and teasing this info out of them, and perhaps showing them some older code you have since fixed and seeing what they find in it.

Edit - one reason I personally try and use a lot of libraries is so that large parts of a project are maintained upstream after I leave my contract; the more I write into the project, the more the next dev will have to maintain, anything I can push back into external libraries is a win.


> Not revealed as a terrible company, perhaps, but as a terrible interviewer.

Some software is HEAVY on data structures and algorithms. But then, hopefully, someone not interested in that would not apply...


In fact, it should be explained in the job posting text. With the added bonus that people can then self-select in or out of the offer without losing any further time.


Ironically enough I was never asked about DFS during my interview but needed it for the first thing I worked on after getting hired.


Saying "I don't know depth first search" is fine,

No, it most definitely is not "fine". Even momentarily hesitating on question like "Would you use BFS or DFS in this case?" is more than enough to merit a quick transition to the "Do you have any questions for me?" phase, in many a modern, "As hire As, Bs hire Cs, you know" interview session, these days.


As an engineer you should be able to see the global picture and know other things, because you won't be able to use something to solve your problem if you don't know about it in advance. I mean, you don't need to know the details, but you need to know how things work.

For example, you might not need an AVL tree in your daily job, but if one day you need to use one, you won't be able to notice if you don't know what an AVL tree is and what its advantages are over other trees.

Another example: you don't need to know the differences between HTTP/1 and HTTP/2 for your daily work. But if you know them, you will change how you build web pages and you will see a leap in terms of performance and scalability. And that knowledge also includes some knowledge about TCP, UDP, caching and other stuff.


The amount of things we need to know is not commensurate with salaries for companies that lazily copy someone else's interview. Most of us are not practitioners that get to dictate a lot of the technical decisions. We're just told we need to know what an AVL tree is before we can work on CRUD app #4272095.

If you work really hard, learn all of this stuff, and do a really good job and save your company a bunch of trouble by knowing all of this, you won't get a dime for it, and that's the problem. You get a pat on the back and you feel slightly less like an imposter for a few days.

People are arguing that we should offer quality for free. This is what open source is for.

I would argue security is more important than performance knowledge, but there is virtually zero focus on this that I've heard from the technical interview rabble-rousing that goes on in these threads.


> If you work really hard, learn all of this stuff, and do a really good job and save your company a bunch of trouble by knowing all of this, you won't get a dime for it, and that's the problem

I think that is your problem. I don't stay in companies where I don't feel valued. And anyway you are receiving a salary every month for your work. If you think you should get more, ask for more or move to another place.

For some people security is more important; for me it is not, because I believe that in today's society companies can fall in a few years (Nokia, Canon, ...). My security lies in my knowledge and my skills, and that is something I take with me wherever I am. I recall something I read that went like this: "A bird is not scared of the branch breaking, because its confidence lies in its ability to fly".


>My security lies in my knowledge and my skills, and that is something I take with me whereever I am.

Sorry, I was being confusing: I meant network and information security. Using a good password hash/library, not trusting user input, knowing some basic attacks or stuff from the OWASP Top 10 and how to reproduce them. That stuff is still a problem. I'll admit I probably know less than I should because it isn't at the forefront of my mind and I don't practice often, but it just seems weird that a company would prefer performance over secure coding.


Oh, I misunderstood, sorry. I think everything is important; the more you know, the better you can do your job. Probably maintenance comes before security, and security before performance. But it depends on the case.


> ... but if one day you need to use one, you won't be able to notice if you don't know what an AVL tree is and what its advantages are over other trees.

No, what would happen is you'd say to yourself "Hmm, looks like I need a self-balancing tree. Haven't thought about those in N years..." and do a few seconds of keyword searching. The idea that people will be utterly helpless on their jobs without instantaneous photographic recall† of AVL trees (and A-star, and all the other crap people are bullied into memorizing these days) just doesn't hold water.

† Which, as we know, is the only level of recall acceptable in the modern interview process. Even momentarily hesitating in your recall of certain definitions will easily get you flushed by some interviewers.


The research you have to do is based on your previous knowledge. The less you know, the more you need to learn when you face a problem you don't know how to solve. But people are lazy; if they already know a way to solve it, they will go that way even if it is the worst way possible.

I am not talking about implementing an AVL tree without any help or resources. That is simply stupid; we have the internet, and we should use it to save time. My point is that you need to know that there are balanced trees, know some of them and their characteristics. That knowledge will save you time when you face problems related to trees. You are not going to know all the trees, because researchers are working on new ones every day, but you should have some knowledge of the area.

If an interviewer drops you because you hesitated on something, maybe it is the interviewer's problem and not yours. We are not perfect; we should expect people to make mistakes and hesitate. I was rejected from an interview when I hesitated on a question about linked lists. I was happy it happened, because I was in shock: I had just been explaining how I did something with an A* algorithm and an octree, and I couldn't believe the next question was about a linked list. I wouldn't have been happy at that company.


> No, what would happen is you'd say to yourself "Hmm, looks like I need a self-balancing tree. Haven't thought about those in N years..."

You're attacking a strawman which has little to do with the original post or what the other commenter is saying. There's a lot of developers out there who lack the knowledge to ask the question you pose in the first place, so wouldn't end up doing that searching. The type of interviewing you're criticizing is intended to identify developers lacking the kind of base knowledge needed as a foundation for further inquiry.


Knowing when to use an AVL tree is orthogonal to being able to write one under pressure on a whiteboard. But as long as someone knows that different trees have different performance characteristics, I care more about how they decided what to optimize and why they recommend a lookup-optimized tree over a cache.


Yes, this is what I meant


No, we should learn the basic algorithms and data structures (and their performance and storage characteristics) because they're the building blocks of everything else we use. You're using some API? OK, cool. Knowing some about its internal workings means that when you've got some wonky performance issue, you've got a basis to start reasoning from, to find and fix the cause of the problem.


Never once has knowing about the performance of linked lists or hashmaps been necessary to fix an API issue. 99% of your API issues will be "this doesn't work for X reason", not "this API response is slow because we weren't using X instead of Y data structure".


You sound like you've worked with significantly different code than I have. Your "never once" is my "almost always", and my "X reason" is often because I'm working with an immature, unreleased library developed by another team at my employer.


What's more likely: that everyone needs to know intimate details about basic data structures or that the world is held together with duct tape and Perl and 95% of code is written by people not in Silicon Valley who need it to just work?


I think you are looking at this wrong. I have zero problems with someone presenting themselves this way and would definitely consider hiring the person.

Here's reality: Someone who knows their stuff is able to dive into details of their work in a way that an impostor can't.

My response to the above would be to have the person bring in some of their work and take an hour or two to take a deep dive into it. I want to see code, documentation, examples of trade-offs and a discussion of the reasoning, challenges, what could be made better, what should not be touched and why, project history, etc.

A conversation with someone who knows what they are doing and is very actively involved in their work is very different from a conversation with someone who might be trying to bullshit you or simply doesn't know enough.

I hate puzzles. All I learn from them as an employer is that someone might have devoted a month to memorizing a whole bunch of them for the interview.

I would imagine that at the scale of a company like Google, resorting to puzzles as a first filter might be an inevitable reality. If you have to interview people en masse you almost have no choice. It's like Stanford having to filter through 40,000 applications a year to accept 2,000 students. You have no choice but to go algorithmic on that problem.


The problem with your approach is that it excludes anyone who spends most of their time writing code for a business. I legally can't provide you with the code that I have written over the last several years. You're limiting your applicant pool to people who have been paid to work on open source, freelance web developers, and people with very little life outside of work.

I agree that puzzles aren't all that great a measure of ability, but at least anyone can do them without worrying about thorny legal issues over IP or spending all of their free time on spare projects.


Not entirely sure how you might have reached that conclusion. Anyone dedicated to their craft will naturally --out of sheer interest-- devote time and effort to getting better at it.

For example, as a young EE I was constantly reading data books (yes, physical data books) and application notes. I had hundreds of data books and probably went through all of them twice and some several times. I could, at the time, talk about almost any chip from any of the major manufacturers and knew where relevant application notes existed for most problems. My employers did not mandate that at all. I was truly interested in what I was doing.

Someone interviewing me at the time would have learned a heck of a lot more about me if they asked me something as simple as "Can you tell me about a few interesting chips and how you would use them?" rather than asking me to design a low pass filter with a given frequency response using a specific op-amp.

The deep dive I am talking about does not require anyone disclosing code done for their existing employer. Nobody wants that.

Frankly, I would also want to know about life outside of work. However, given our laws you have to be very careful about how you might probe for such information. I feel very strongly that our legal system has, to one degree or another, sapped all humanity out of our work life. I've worked in other cultures where it is perfectly normal for people to greet each other with a kiss on the cheek and a hug or pat on the back in the morning. In the US almost any physical contact can land an employer in court and a manager in serious legal trouble. But I digress.


So here's the thing: not everyone is you, not everyone is passionate about technology, and it's not reasonable to expect people to do more of their job outside of work for free. You might like it, and that's great, but people have families and hobbies and interests that sometimes have nothing to do with the 40 hours they churn through for their bosses. I'm so sick of interviewing for companies who want "passionate" developers, which translates to "doing more work for free".

If you're curious about why we have workplace harassment laws, talk to basically any woman with a job and then rethink your sweet nostalgia.


> not everyone is you, not everyone is passionate about technology, and it's not reasonable to expect people to do more of their job outside of work for free

I am not sure how you would twist personal development and remaining up to date with "do more of their job outside of work for free".

Let's get away from technology for a moment.

You need to go to the doctor.

Would you rather go to a doctor who is passionate about their work and constantly devoting personal time to remain up to date on the latest research, drugs, studies, techniques and technology?

Or would you rather go to a doctor who has not devoted a single day in ten years to stay up to date?

I'd pick option #1 every time. Same with engineers.

Look, if you are not interested in remaining current, that's OK. Just don't expect to have access to the same opportunities. It's as simple as that.

> talk to basically any woman with a job and then rethink your sweet nostalgia

Wow. Not sure how you made the jump to harassment there. Gender has nothing to do with it.


> I am not sure how you would twist personal development and remaining up to date with "do more of their job outside of work for free".

Well, unless you're doing personal development at work during the 40 hours (or so) that you're required to be there, then you're probably not getting paid for it. I mean, I'm willing to tinker on my own, but you can be a good developer without doing it (you can also be a bad developer in spite of doing that).

> Would you rather go to a doctor who is passionate about their work and constantly devoting personal time to remain up to date on the latest research, drugs, studies, techniques and technology?

I want to go to a reasonably friendly, competent doctor. I really don't care how passionate they are about being a doctor, because I don't view passion as a proxy for quality (because it isn't).

> I'd pick option #1 every time. Same with engineers.

So my friend is a brilliant chemical engineer, and yet she doesn't sit around designing plants at home on the weekends. No, she hangs out with friends, sings in a choir, watches TV shows, travels, etc. I don't particularly understand the tech industry's fetish with eating, sleeping, and breathing code, and it's a damn shame that we're pushing away really fantastic nine-to-five developers.

> Look, if you are not interested in remaining current, that's OK. Just don't expect to have access to the same opportunities. It's as simple as that.

So it turns out that most software isn't created in Silicon Valley, and more money is made by crufty old Java and Perl enterprise systems than the next sexy JavaScript framework (which is probably just a poor rehash of a computer science concept Alan Kay developed in the '60s).

> Wow. Not sure how you made the jump to harassment there. Gender has nothing to do with it.

Gender has nothing to do with it if you don't experience the kind of issues that women in the workplace face, sure. If you're a man, you're not particularly affected by issues that affect women.


> You're limiting your applicant pool to people...with very little life outside of work.

Imagine that.

If your company exists solely to write CRUD apps in whatever JavaScript framework came out in March, you don't need to spend hours doing a deep dive into someone's side project todo list app. Unless, of course, you're looking for someone who will work 16 hours a day, 6 days a week, for one tenth of one percent.


> My response to the above would be to have the person bring in some of their work and take an hour or two to take a deep dive into it.

So another interview and another day off work? Another set of interview approaches/questions that immediately eliminate people who don't spend their free time doing the same thing they do at work 8+ hours a day?

This is not solving the problem.


Sure it does. A huge problem.

Here's reality: 100% of the technologies I work with today did not exist when I was in school. The only thing that has remained constant, useful and relevant is basic science, math, physics, etc.

What, then, makes an engineer a good engineer in any domain? This applies to all aspects of engineering, from software to manufacturing engineering.

If I had to pick one thing I'd say their ability to remain current, learn and apply new unknown technologies through self study. Their flexibility and willingness to do so. Almost their drive to do so.

What I am interested in is someone who has the right approach and attitude for the job, an ability to solve problems creatively and a significant enough desire to learn. I am not looking for a robot that can memorize a hundred coding solutions in a month.

Part of the discrepancy here might lie in a difference in environment and stated goals. I have never worked in Silicon Valley but I have a feeling it is a dog-eat-dog mercenary environment where people are chewed-up and tossed out as quickly as others are hired.

If you are looking for quick hires and you are not looking at the idea of adding a team member for a long term relationship with the company and the goals of the business, yeah, sure, filter them through some quick "can you code this shit fast" puzzles and move on. Great! You are hired. Here's your desk. Here's your ankle chain. Now code away!

If, on the other hand, your industry and approach are such that you view people as a long-term investment, you really should not care about those skills. Anyone with decent ability can memorize coding problems. And it is worthless. What you are looking for is to bring someone on board who will become a true asset for the business. I see it as falling just short of looking for a partner. I don't care about memorized performance; I care about the ability to problem-solve and the creativity, culture and thinking they can bring into the business.

These are two very different views. One is hiring cattle. The other is hiring people.


> If you are looking for quick hires and you are not looking at the idea of adding a team member

> filter them through some quick "can you code this shit fast" puzzles and move on. Great! You are hired. Here's your desk. Here's your ankle chain.

> One is hiring cattle. The other is hiring people.

You seem to be arguing against several points I never made. I took issue with your insinuation that you can just bring someone in for two hours and talk at length about some of their code that they bring in. That cuts out most of the hiring pool, which might be okay if you're Google, but for most companies isn't a smart move at all.

> What I am interested in is someone who has the right approach and attitude for the job, an ability to solve problems creatively and a significant enough desire to learn.

This has no correlation with programming as a hobby or having a wealth of freely reviewable code. Plenty of people love their job, are great at it, very creative, and fantastic technically and never write one line of code that isn't closed source or behind one or more NDAs. They go to work, do an amazing job, then go home after 8 hours and don't write any code or even touch a computer until the next work day.


If they don't have any code or project whatsoever they are able to discuss I have zero interest in interviewing them.

I have never said this is a universal formula for all to adopt. This is what I do. And it has worked very well for over thirty years across a range of engineering disciplines. Google and others can't take this approach because they need to hire people by the thousands. Not me.

Another interesting thing is to talk about someone's hobbies and passions outside of work. This is where you can learn so much about a person. People are passionate about one or more things and that is a reflection of their personality.

We do a lot of work in aerospace. There are very obvious legal barriers to disclosure there. Yet, I have always found that most engineers who are truly engaged with their craft have enough interesting things to talk about outside of work that you can really get a sense of who they are and how they will approach work.

An interesting example was when I needed someone to work on some Python code. I brought in people who had zero experience with Python. I could not care less about that. I wanted someone with some of the qualities I have already mentioned. I ended up hiring a programmer with lots of C++ experience and no Python chops at all.

The first three months were dedicated to taking a deep dive into Python while getting up to speed on the project. After that the focus shifted to the project itself. This person has been with me for many years and doing an amazing job with various technologies we didn't even know we would touch when I hired him.

I know this person can pick-up any technology we might need to utilize and do an excellent job of it.

The investment is in the person, not the technologies or the ability to memorize coding puzzles.


But I did learn the principle, back in school. I've just never needed to use it.


I was going to agree with you, 98% of my career has been "google for a library, then use or tweak". It's RARE we ever actually do anything "new". However there ARE companies that do, and every once in a while YOU may have to do something new. In those cases it's good to make sure you have a foundation to build on.

I have 4 books in the "The Art of Computer Programming" series on my desk. They'd been more or less decoration for several years, until I ran into a problem I couldn't google. Since I had read them, I had an idea of where to start, and I used it as inspiration to craft a new solution tweaked from one of those foundational algorithms.

That said, not everyone is strong in algorithm development, and you can be a kick ass programmer without that skill. I'd only ask these questions if I needed someone with that skill on my team.


In that 2% of cases where you have to come up with something new, I don't really think that knowing pretty much any of this would help. You are storing a lot of information that will probably never be used. On the other hand, knowing how to come up with the solution (knowing where to ask, what books to read, or what people to ask) seems like a more important skill.


You are OK with not knowing how to invent, innovate, push the envelope, etc because you don't need to in order to collect your paycheck? Seems sad! Where is your passion for the craft?


That's not what he said.


You misunderstood me. Of course I love to invent and innovate. That's why I don't like memorizing solutions to coding challenges. But just because I don't know them by heart doesn't mean I don't know where to find the solution. Which is the more important skill.


I think we should strive to be more than just code monkeys that glue together stuff from Stack Overflow... because that makes you or me replaceable.


If you don't know the difference between O(n) and O(n^2), or how to make a data structure that lets you keep data sorted in a particular way, you can get by using ready-made things up to some point. When that point comes, your "where to ask" would be "take a course in algorithm design and data structures and/or read a fat and quite dense book". Which is a completely OK way to advance your knowledge - except that some employers may prefer to hire people who have already done that, and not wait with the task at hand until they do.
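A toy Python sketch of the kind of distinction I mean (nothing from a real codebase): scanning a list inside a loop is O(n^2), a hash set does the same job in roughly O(n), and bisect keeps data ordered as it arrives instead of re-sorting every time.

```python
import bisect

items = [5, 3, 9, 3, 7, 5]

# O(n^2): for each element, linearly scan everything seen so far.
seen, dups_quadratic = [], []
for x in items:
    if x in seen:                 # linear scan inside a linear loop
        dups_quadratic.append(x)
    seen.append(x)

# Roughly O(n): a hash set makes each membership check O(1) on average.
seen_set, dups_linear = set(), []
for x in items:
    if x in seen_set:
        dups_linear.append(x)
    seen_set.add(x)

# Keeping data sorted as it arrives, instead of sorting from scratch each time.
ordered = []
for x in items:
    bisect.insort(ordered, x)     # binary search for the spot, then insert
```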


Even if something represents 2% of your cases, if no one at a company or on the team has algorithmic problem solving skills it will represent more than 2% of your development time.


I tend to place engineers in one of two buckets-- those who make tools and those who use the tools. Most people fall into the latter bucket and can get by without hardcore CS knowledge.


Googling for a library is all well and good, but if you can't evaluate whether it's well designed and well written then you might as well rephrase it as "Google for someone else's problems to add to my own."

The background knowledge to understand what you're looking for and looking at is really important in these cases. You mention having and having read TAOCP, that already puts you ahead of >99.99% of the people doing programming out there - 95% of whom couldn't tell you who Knuth is, what he's written, or what Drofnats refers to.


I agree that these interviews are often just an annoying rite of passage, and they exclude many very talented programmers, but dismissing companies that use them as 'incompetent' seems like a stretch to me.

Learning basic data structures & algorithms is an immensely useful thing for a programmer, and completely essential in many cases. The best of these puzzles are based on problems which people have had to solve in the real world. For the companies, there are many benefits to conducting algorithms interviews:

- setting a minimum standard to make sure there is a shared language and knowledge that you can expect any engineer in company to know.

- making sure you are able to do more than trivial optimisation and go beyond the abstractions that libraries / frameworks provide

- giving you a simple problem to solve in 30 minutes to see if you can program at all.


It's just laziness incarnate. This pushes all the investment of the first phase of interviewing someone onto an automated process and denies the candidate the opportunity to vet the company which is just as important as the reverse.

Well, actually they do allow the candidate to vet the company: the message they send is we don't care about you at all until you do a bunch of busywork and if you're very lucky we might allow a human to spend some cycles on reviewing your results.

If as a company that is the kind of message you would want your prospective employees to have that's fine with me but it would be good to remember that interviewing a candidate is a two way street.


How are companies supposed to evaluate candidates without giving them some form of busywork? The best way I can think of is to pay them to complete a project but that's not possible when you are interviewing loads of candidates.


Introduce them to the team they will be working with, have them do a code review or take a ticket and find their way through docs and discuss a potential solution.

In general teams are a much better judge of talent and ability than recruiters or your typical interviewer, especially in a normal work setting.


That's really expensive for the team in terms of time investment for possibly no payoff. Someone has to make sure that a candidate that gets to that point has a good chance of being hired. That person is "your typical interviewer."


Your typical interviewer, if he or she does not have relevant knowledge, is just as likely to throw out the baby with the bathwater as they are to select the right candidates. There is no reason not to have the team do the pre-selection. I know this is all terrible news for recruiters and HR people alike, but really there is nobody better qualified to determine who they want to work with on a particular problem than the existing team. The only situation where you would be better off with other people making that decision is when there is no team yet.

One thing I used to totally loathe during my brief stint as a programmer employed at a large organization is that when new people showed up that were already hired it was then up to us on the floor to make the best of a whole series of bad decisions preceding that moment.

So, let's involve the team in the messaging and pre-selection as well as giving them the final say.

Think about it this way: if you believe that the team you employ is the best possible group to do the work, don't you feel they are also the best possible group to determine how to expand the group?


> Your typical interviewer, if he or she does not have relevant knowledge is just as likely to throw out the baby with the bathwater as they are to select the right candidates. There is no reason not to have the team do the pre-selection.

What are you talking about? The interviewers are almost always from the team.

> Think about it this way: if you believe that the team you employ is the best possible group to do the work, don't you feel they are also the best possible group to determine how to expand the group?

Yes, and that's why most companies have their devs do interviewing.


What do you do when tons of people apply for a position?


Find a better way to reach your target audience. Getting people to self-select is the very best way to reduce the load from interviewing.


How would you do that? Also, saying there will be a coding interview up front does cause people to self select.


It would definitely take up some team resources, but it's not more complex than the work they're already doing. So 'how would you do that' is the wrong question. The question should be 'how would they do that?', and the answer is: I don't know, but I'm sure if you ask a team they'll be more than happy to explain. After all, it is their (and not my) future that is at stake, and given that they are involved they'll do the best possible job to make sure they have to go through it the minimum number of times with the largest chance of success.

This is still far less effort than a wrong hire would cause. Anyway, I can see that my methods are not acceptable (yet), maybe in another decade or so?

Trust is hard. Even companies that trust their tech people with the corporate crown jewels still have a hard to impossible time trusting them with such everyday decisions such as who they want to work with. It's counterproductive to say the least but that's how we've been doing it for the last 40 years, so I don't expect any major changes in the near future.


Fair enough. I think without coding interviews networking would become even more important than it is now, and it is already very important. Moving more towards a "networking" world would mean things are less of a meritocracy. It would not be about what you know, but who you know. Personally I find that unappealing because it doesn't seem fair, but I know some people aren't interested in "fair" anyways.

Instead of people complaining about coding interviews on HN, you would have a lot of programmers complaining that they are introverts who are just good at their job, and don't think that they should be punished for not being a people person.


I don't think teams should be penalized for not hiring people that aren't team players. There are good spots for introverts in IT but teams usually (though not always, I've seen some interesting exceptions to this rule) are not too welcoming to that sort of person.

If the world were gamified to the point that your skills are all that matter, then yes, a solely merit-based approach would work. But in the world we live in today, people skills matter (a lot, actually). On a personal note, this was a very hard lesson for me to absorb: in the first years of my career, "introvert" would have been too friendly a description; "anti-social" would probably have been a better one. But over time I got a bit better at working with others.


> It's just laziness incarnate

You could be describing software in general.


Not really. Quite a bit of software is like a powertool and I certainly would not want to label the users of powertools as 'lazy'.

But these companies are not using software as a powertool, they are using software in a way that attempts to deny their counterparty a mutual investment in the relationship. And that to me is lazy.


I'm a lowly c++ software engineer. You might not have to deal with that stuff, but for me it was Tuesday.

Kidding aside, someone has to build those tested, stable libraries that handle those problems (or even untested bleeding-edge ones if you are breaking new ground).


I primarily write python web-based APIs for a web application + 2 mobile apps. Just the other day, I was dealing with an endpoint that had to update hierarchical data (i.e. a collection of trees).

Due to the circumstances, normalization wasn't an efficient option. I ended up throwing together a barebones tree with a 5-line DFS implementation to traverse it. It handled inserts, updates and deletions (for my use-case) in linear time.

The details aren't so important as the fact that adding a dependency would have been overkill for my needs. This isn't to say that efficient graph implementation libraries should not exist or be used, but I was able to produce this code faster by having that basic CS knowledge.
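Not the actual code, but a minimal sketch of the kind of "barebones tree plus a few lines of DFS" I'm describing, assuming (hypothetically) that each node is a dict with an "id" and a list of "children":

```python
# Hypothetical node shape: {"id": ..., "children": [...]}.
def dfs(node, visit):
    """Depth-first traversal; visit() is applied to every node exactly once."""
    visit(node)
    for child in node.get("children", []):
        dfs(child, visit)

tree = {"id": 1, "children": [
    {"id": 2, "children": []},
    {"id": 3, "children": [{"id": 4, "children": []}]},
]}

ids = []
dfs(tree, lambda n: ids.append(n["id"]))   # ids == [1, 2, 3, 4]; linear in the number of nodes
```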


And because your code was implemented in Python (rather than using prebuilt libraries that call back to C), it was 100x slower than it should have been. I'm all for knowing the fundamentals, but there is a strong argument for knowing the right tool for the job.


And because it was attached to a web api, it was likely still io bound, so it didn't matter.

Context and knowing the right tool for the job is important indeed.


re: the debate between Python being slower and C faster, it all depends on context. If the context is "this is going to be called multiple times for every transaction" then yeah, look into recoding it. If the context is "this is going to be called for this particular edge case and may execute 10 times a week and take an extra 3 seconds each time" then there are more productive places to put your energy.

At the level of programming that the grandparent is talking about, I'd accept the judgement of the programmer working on it as to the appropriate solution.


Where does one get a job writing actual algorithms and data structures? I actually enjoy that and am pretty good at it. I'm sick of jobs that are nothing but glueing together poorly documented and tested libraries.


I'm currently working in finance. Before that I was working at a major search engine which doesn't start with G.


> Where does one get a job writing actual algorithms and data structures?

Two big areas that come to mind are simulation/mathematical modelling, where you're often crunching data in ways that aren't just textbook examples, and embedded systems, where you often have resource constraints that make efficiency more important.

This doesn't just mean modelling weather systems on supercomputers or writing the control software for cars, though. For example, consider user interfaces. We are increasingly looking for more intuitive input methods using techniques like natural language processing, speech recognition, handwriting recognition, and gesture-based UIs. We are looking for more intuitive output methods, such as integrating additional data with real world imagery like maps or the view through a 3D head set or camera. We are looking for systems that learn patterns in their users' behaviour and adapt to provide more likely options more quickly next time.

You won't see much of this if you're just writing simple form-based web front-ends for CRUD applications. A lot of real world software is like that, and it gets a lot of useful work done, but it's mostly pretty mundane, join-the-dots work as far as the programming goes. However, there are plenty of interesting problems out there and we could directly improve the user's experience in new and helpful ways if we could solve them, and much of that work involve developing data structures and algorithms far beyond anything you'd find in an introductory textbook.


Data Engineering, SRE, Production Engineering are really good for that kind of thing. Especially at a larger company, but the truth is those opportunities aren't going to come up that often, you don't want to be continuously inventing your own technology unless you're living on the bleeding edge like Google.


Lots of places, but I would guess finance and gaming have it at the highest percentage. They have the strictest performance requirements so algorithms and data structures end up custom built for the job more often.


Game development, and research are the two that come to mind for me.


This is very true, but I don't think that anyone interviewing for a position like yours will refer to this list of things to test their candidates.


Honestly, is it unreasonable to require that people brush up on this stuff every couple of years? In my experience the majority of companies just want you to be able to do fizz buzz level whiteboarding and intelligently speak to your experience. I feel like we all know in advance which companies typically require a month long review of algorithms before the interview. IF you want to work for one of them then do what you need to do to get the job there. We all agree it's annoying but I really don't think it's as big a deal as people make it out to be.


You can conceivably move up to management, never have to deal with algorithm hazing again, and make more than the guy that has to refresh every few years.

The rewards just do not add up for this to remain an industry practice.


But the innovative businesses that develop genuinely new technologies also hire "the guys that have to refresh every few years". Leaving aside the potential financial gains if you're in early enough and they have a big exit, those also tend to be interesting places to work.

Moving into management is essentially changing career, and for the kind of person who actually enjoys programming and wants to do something creative and technical, there's no reason to assume they would either enjoy the new role or be any good at it.


That sounds like a career change that I wouldn't enjoy, despite being able to also avoid some of my least-favorite aspects of staying on the dev side of things.


> I see these challenges as a great way for excellent experienced developers to weed out incompetent companies.

This is far too broad. There are plenty of jobs in certain companies where a good understanding of the theory and practice encapsulated by these challenges is the bare-minimum requirement for doing well. Companies that leverage coding challenges like these for these positions aren't incompetent (at least, not for that reason). Just because your work doesn't require this depth of understanding of CS doesn't mean there is no such work.

I'm skeptical, however, that the number of such jobs is very large, even in the "usual suspects" companies (Google, Amazon, etc.). In most jobs, even at these places, one can get by with the most rudimentary understanding of what 'greater than' and 'less than' mean, plus a chart describing the time/space complexities of the various structures and algorithms in a library.


It's not about actually having to implement these algorithms in your practical, day-to-day work. It's a challenge to test your reasoning and problem-solving ability in the abstract, one that you can administer in 15 minutes. You can't really test a candidate with real-world workloads, can you?


You absolutely can test a candidate with real-world workloads. It takes longer than 15 minutes, though.

I have no idea why anybody cares about the 15-minute thing. Each person you hire adds thousands of hours to your available labor. So even if I spend 100 hours finding the right candidate, I'm still way ahead. And the better my working environment is, the lower my turnover, in which case I can spend even more.


While I fully agree with you in that interviewing is broken, and it's broken because we throw away a lot of great people, you'll be surprised how much time a company spends finding the right candidate.

Imagine we spend an hour per candidate on screening and pass 10% of people, then spend 6 hours on each on-site, with 25% of them passing. Finally, 50% of candidates accept the offer. This is very typical math, and it ends with over 100 hours per developer hired. For each hire, we also had a declined offer, 6 candidates rejected on-site, and 72 failed screenings!

Given those numbers, which are round but not all that far from reality, any increase in screening time will spiral out of control unless some other multipliers change. The most likely culprit is that said company is giving offers to fewer than half of the people who would have been successful.

Those numbers also show why it's so important for a company to make competitive offers and woo candidates, and why they'd not like it when people interview at 6 places at once: Run the calculations just changing the acceptance rate down to 30% and up to 80%. If you are trying to recruit from a big name university, chances are that your candidate really is talking to a dozen serious SV companies and gets 4 serious offers.
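If you want to play with the multipliers yourself, a quick back-of-the-envelope script makes the sensitivity obvious. The rates below are just the illustrative ones from the paragraph above, not data from any real pipeline:

    # Back-of-the-envelope cost of a hiring pipeline, using the
    # illustrative rates from above (not data from a real company).
    def hours_per_hire(screen_hours=1.0, screen_pass=0.10,
                       onsite_hours=6.0, onsite_pass=0.25,
                       offer_accept=0.50):
        # Candidates needed at each stage to land one accepted offer.
        offers_needed = 1 / offer_accept
        onsites_needed = offers_needed / onsite_pass
        screens_needed = onsites_needed / screen_pass
        total = screens_needed * screen_hours + onsites_needed * onsite_hours
        return screens_needed, onsites_needed, total

    for accept in (0.30, 0.50, 0.80):
        screens, onsites, hours = hours_per_hire(offer_accept=accept)
        print(f"accept {accept:.0%}: {screens:.0f} screens, "
              f"{onsites:.0f} on-sites, {hours:.0f} hours/hire")

With a 50% acceptance rate this comes out to roughly 128 hours per hire; dropping acceptance to 30% pushes it past 200, while 80% brings it down to about 80.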

Therefore, while I agree with your sentiment, you can probably see why it's so hard to convince someone running the traditional SV pipeline to make changes, as you are either raising costs or telling them that their current outcomes are a gigantic dumpster fire.


I probably wouldn't be surprised, having set up hiring processes before. E.g.: http://williampietri.com/writing/2015/slightly-less-awful-hi...

Anybody spending 6 hours per candidate on site can afford to test them on some real work. I'll typically do 2 hours for pair programming, and/or 1 hour for reviewing some existing code. 1-2 hours is also a good amount of time for a joint design session on some real problem.

You're definitely right that we're missing good people. I keep coming across people who didn't get the right job until they invested a bunch of time into learning to beat the Mensa-puzzle interview. So one way to make the numbers better is to find more good people.

If we're going to do the math, I think we also need to account for the cost of hiring people who are not so good. If we test something that we merely hope correlates with doing the work, we're asking for large downstream costs. The more our interview tests what people actually need for long-term success, the better off we are.


How do you keep interviews consistent for candidate comparison with pair programming? I'm assuming you mean that y'all are pairing on actual work. That can vary. Yesterday was simple CRUD, "candidate nailed it." Today, there is an obscure concurrency bug, and the candidate would need more than 1-2 hours to understand the landscape of the complex code base we are asking them to pair in; "the candidate asked some ok questions I guess."

I fully agree with "the more our interview tests what people actually need for long-term success, the better off we are."

^^ This. As tokenadult always points out, a work sample test is the way to go.


I do it by pairing on a standard problem, one I'll use over and over. That can be real work in the domain or a toy problem. Both seem to work pretty well to me.

I'm pretty sure having candidates do unpaid work is illegal in California, which is another reason not to have them pair on actual work that you plan to ship.


You are often weeding through a large number of applicants, so spending 100 hours on each candidate is just not feasible, and, generally, not necessary. Plus, high quality, experienced developers are not going to want to spend 100+ hours doing real work for companies just to see if they can get a job.

I'm more than happy to spend an hour here or there for a phone screen or coding test to show that I understand the basics of data structures and algorithms and that I can use that information and my reasoning ability to solve problems I haven't seen before.


Then it's a good thing I didn't suggest spending 100 hours per candidate.


And how many candidates do you have to consider to find the right one? And what proportion of the interviewer's time is spent on the live coding interview vs setting it up and arranging it? I've been in this boat recently. It doesn't really take much to get to double or even triple digit hours if you don't automate something.


Define "right candidate"?

Is right candidate = person who solves algo questions for 5 hours on a whiteboard?


No, it's probably more like 4 1-hour interviews where they do whiteboard coding 2 - 4 times for 30 minutes, with the rest spent discussing previous experience, what they are looking for, what the company's doing now and looking to do in the future, etc.


No. The right candidate is someone who can do the job for which they're being recruited effectively.


> You can't really test a candidate with real-world workloads, can you.

But you can test a candidate with imaginary non-workloads?


"reasoning and problem-solving ability in abstract"

Implementing a syntactically correct tic-tac-toe program on a whiteboard doesn't test that. Plus, many people are more comfortable working behind a screen (the real world) than in front of a whiteboard.


Why not ask me to bring in some work I have and discuss it?


For one, you're excluding all the people who've only done work that they can't bring in.


Nothing stops you from discussing your own work. Typically I ask people about past work...


Most programmers on the market don't have a lot of experience, due to the programmer population increasing in size quickly over the last decades. (https://en.m.wikipedia.org/wiki/Population_pyramid)

So one big question all companies have is how to interview for programming positions by kids fresh out of school or who've had a job for at most a couple of years.

This is less a question of competence and more a reflection of the age structure of the market for programmers. You will be working with people younger and less experienced than you. You will probably be hiring people younger than you -- how will you interview them?

As someone with a couple decades professional experience, I see these challenges as one of many ways for competent companies to attempt to find competent programmers despite a lack of experience by the interviewee.

The last company I interviewed for gave me coding challenges, but that's not all they asked, I got plenty of questions that allowed my experience to shine. If you only got coding challenges as an experienced developer, then yes, that would be a reason to avoid that company.

On the flip side, my willingness to take the coding challenges in my interview allowed me to highlight my practical experience, because I crushed them with little preparation. Other experienced devs who refused the coding challenges or dragged their feet and complained about them lost the opportunity to receive an offer.


That's like an EE saying, I don't really understand capacitors, but I am building a circuit like this one and it has a capacitor, so I'll just borrow the values and tweak them in simulation.


>> That's like an EE saying, I don't really understand capacitors, but I am building a circuit like this one and it has a capacitor, so I'll just borrow the values and tweak them in simulation.

Dude, in EE interviews, we just ask them some basics about capacitors and how to use them. We don't ask them to derive the mathematical equations of electrolytic capacitors.

In fact, in most EE interviews, you just use them to solve ONE problem and you're done. Nobody questions whether you know EE stuff once you've solved ONE problem.

In programming interviews, you have to solve MANY problems and interviewers just want to keep finding ways of docking points.


I've never applied for a strictly dev position, but my friends have told me they've been asked about how different forms of self-balancing trees work. I haven't implemented my own self-balancing tree since Data Structures, and I'm pretty sure they haven't either. If someone asked me to implement one in an interview I'd honestly think they were crazy. Who would implement a data structure or algorithm they haven't used in nearly a decade without looking an implementation up first?

Bearing in mind that I use my programming ability for analysis rather than developing applications, I'd say that programming is like 90% fundamentals and 10% looking stuff up. If a company really wants to test someone's programming ability in an interview, I feel like the best thing to do would be to make up a programming language, give them a reference sheet, and then ask them to program a couple different versions of fizz-buzz.

Obviously it's not a perfect idea, but I think you'd at least be testing the skills people actually use when programming, rather than whether they can remember every bit of syntax from every language they have listed on their resume. I mean, if I lied and said I knew javascript, I'd almost certainly fail a programming test that used a made-up language based on it.
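To be concrete about "a couple different versions": something like the two sketches below (in Python purely for illustration, since a real test would use the made-up language and its reference sheet) is all I have in mind. The point isn't the problem itself; it's watching how someone structures even trivial code.

    # Two versions of fizz-buzz; the interesting part is the structure,
    # not the output.
    def fizzbuzz_plain(n):
        out = []
        for i in range(1, n + 1):
            if i % 15 == 0:
                out.append("FizzBuzz")
            elif i % 3 == 0:
                out.append("Fizz")
            elif i % 5 == 0:
                out.append("Buzz")
            else:
                out.append(str(i))
        return out

    def fizzbuzz_rules(n, rules=((3, "Fizz"), (5, "Buzz"))):
        # Same output, but the divisor/word pairs are data, not branches.
        out = []
        for i in range(1, n + 1):
            word = "".join(w for d, w in rules if i % d == 0)
            out.append(word or str(i))
        return out

    assert fizzbuzz_plain(100) == fizzbuzz_rules(100)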


> If a company really wants to test someone's programming ability in an interview, I feel like the best thing to do would be to make up a programming language, give them a reference sheet, and then ask them to program a couple different versions of fizz-buzz.

Are you joking? Doing fizz-buzz is way too low of a bar. That doesn't even show you can use standard classes and things, like maps and lists.

> but I think you'd at least be testing the skills people actually use when programming

That would test very few of them, unfortunately.

> rather than whether they can remember every bit of syntax from every language they have listed on their resume

No one I see tests for syntax, but for the more important/broader concepts.

> if I lied and said I knew javascript, I'd almost certainly fail a programming test that used a made-up language based on it.

Probably not. Most mainstream languages have pretty similar syntax. And if you expected people to know others without warning, people would raise hell. Anyone who knows C can probably guess the gist of what a small snippet of non-tricky Javascript does.


Well, other than R, I've never programmed in a functional language before. I programmed Javascript in my High School intro to CS class back in 2004, but I can't imagine I could pass a test based on that alone.

Fizz Buzz is stupidly easy, but I honestly don't think that much of programming is hard in the first place. The hardest thing in my mind is designing a coherent program. You could add any requirements you'd like to your made-up language (no automatic garbage collection or reference counting, etc.), but I think you'd be able to tell pretty easily whether they were the real deal.


Sadly, that describes more than a few EEs and MEs I've worked with in the past, where the act of turning a key in a commercial software package starts to displace the practical design considerations they learned in the classroom.


You don't need to be able to build a capacitor from scratch in order to understand how they work. Furthermore, electrical engineers don't have to build capacitors from scratch during job interviews to prove their competence.


No, but there's a level of the EE tech stack where you do need to understand (and have the ability to build) the level below. I don't need to know: how to construct a transistor, build a logic gate, build a look-ahead adder, construct an ALU, CPU, computer hardware, write assembly, or write a compiler to do my job (although the last couple start getting close enough to my bailiwick that I think they're useful).

I'd expect that something basic in an EE job could be "draw the core part of an oscillator circuit, then we'll talk about the principles of its operation". The discussion would end up going into some properties of capacitors, why they chose that exact form of oscillator, expected use-cases, etc. The behavior of the object lower in the "stack" becomes important, and so does a real understanding of how they work. Of course, actually requiring them to build one would be ridiculous.


>>I'd expect that something basic in an EE job could be "draw the core part of an oscillator circuit, then we'll talk about the principles of its operation".

Sure, but that's the equivalent of drawing a diagram that explains how quick sort works, as opposed to implementing it using real code. Most companies demand the latter during interviews.
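To make the analogy concrete: explaining the partition-and-recurse idea with a diagram is one thing; being asked to write something like the following from memory on a whiteboard is another. (A rough Python sketch, purely illustrative; the in-place version with index juggling is what usually trips people up.)

    def quicksort(items):
        # Pick a pivot, partition into smaller/equal/larger, recurse.
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    assert quicksort([3, 1, 4, 1, 5, 9, 2, 6]) == [1, 1, 2, 3, 4, 5, 6, 9]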


The places I've interviewed (a few big names and a few small ones) either wanted something pseudo-code-like (on the whiteboard), wanted an explanation of the algorithm (potentially with some clarifying diagrams or code), or actually provided me a computer with an IDE.


> I've never had to deal with one of these sorts of problems in my day to day work;

You are from a different niche, and actually you are being paid less than those guys who know the stuff you don't.

From [0]:

"[S]killed cloud and backend developers, as well as those who work in emerging technologies including Internet of Things, machine learning and augmented/virtual reality can make more money than frontend web and mobile developers whose skills have become more commoditized..."

[0] https://www.linux.com/news/developer-nation/2017/3/visionmob...


A completely delusional comment, devoid of any roots in reality whatsoever, and an insult to the decency and dignity of software engineering, as well as an invitation to strip software engineering of the respect and compensation it deserves.

A 13-year-old making a website for his dog in PHP can fit your definition of a full-stack web engineer.

The underlying parts of this "full" stack require significant domain knowledge around algorithms, data structures, computer architectures, operating systems, distributed systems, networking and communications, programming languages, etc... and most importantly, critical thinking and engineering rigor beyond trial and error and cargo cult copy-pasting from stackoverflow into your "get-things-done" duct-taped spaghetti code base.

Those underlying parts, created by the people you now call "incompetent", are what it takes to design, implement and maintain the "kick-ass", babyproofed playground you live in, the one that allows you to put food on your table. Have some respect.

If you are so kick-ass and get-things-done, check out the source code for Linux, Chromium, v8, node or libuv, Python, Ruby, or whatever technology you use, try to get something done there at a level of quality where it gets accepted, and see what happens. You and your kick-ass denomination will be stomped on and brought back to reality.


The problem with your reasoning is that you expect the interview to mirror job requirements rather than select for job performance. In many cases the best instrument to measure the latter will resemble the former, but there's nothing intrinsic about the relationship. If giving someone a brain teaser or having them recite trivia provides a strong signal for job performance, then it makes sense to use these instruments. You could argue that they don't provide a meaningful signal (which I think Google may have discovered with the brain teasers), but that is a separate discussion.

Something else to consider is that the interview is optimized to select for true positives and reject false positives at different rates. It's been discussed elsewhere that for a company like Google avoiding false positives is much more important than finding good candidates. So it may be the case that some instruments like trivia recitation provide the right signal at the intersection of the optimization curves. The fact that many (even most) qualified candidates score poorly on these instruments doesn't impugn their utility; their primary goal isn't to identify good candidates but to filter out bad ones.
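A toy calculation shows why an instrument with a high false-negative rate can still make sense when false positives are the expensive error. The numbers below are made up purely for illustration, not taken from any real hiring data:

    # Illustrative numbers only: suppose 20% of applicants would do the
    # job well, the instrument passes 50% of the good candidates, and it
    # lets through only 5% of the bad ones.
    base_rate = 0.20    # P(good applicant)
    sensitivity = 0.50  # P(pass | good) -- half the good people are rejected
    false_pass = 0.05   # P(pass | bad)

    p_pass = base_rate * sensitivity + (1 - base_rate) * false_pass
    p_good_given_pass = base_rate * sensitivity / p_pass

    print(f"share of passers who are good: {p_good_given_pass:.0%}")  # ~71%
    print(f"share of good people rejected: {1 - sensitivity:.0%}")    # 50%

Even though half of the qualified candidates are turned away in this example, the people who pass are a far better pool than the applicants as a whole, which is exactly the trade-off described above.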


I suffer greatly on coding interview questions. I'm not a kick-ass full-stack engineer but rather a research engineer. I code every day, but usually it's proof-of-concept demonstrations, so I lack the formal education and guidance many professional coders have. I'll knock a take-home assignment out of the park, but I do envy people who are great coders.


To me this sounds like you just develop dependency-laden bloatware, i.e. you'll throw in some 5 gigabyte Javascript library just to do one thing.


I agree 95%. However, inevitably there will be a developer who wants to implement a trie for reasons. It's hard to reason with her (or with technical leadership when appealing the decision) while advocating for an off-the-shelf alternative if you can't explain why this isn't a brand-new problem. On the flip side, one occasionally does need to create a new data structure, and there, obviously, we would want to be able to implement the common alternatives.
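For reference, the core of a trie really is small. A sketch like the one below (Python, purely illustrative) is roughly what ends up on the table; the real argument is whether maintaining even this much code beats an off-the-shelf alternative.

    class TrieNode:
        def __init__(self):
            self.children = {}   # char -> TrieNode
            self.terminal = False

    class Trie:
        def __init__(self):
            self.root = TrieNode()

        def insert(self, word):
            node = self.root
            for ch in word:
                node = node.children.setdefault(ch, TrieNode())
            node.terminal = True

        def contains(self, word):
            node = self.root
            for ch in word:
                node = node.children.get(ch)
                if node is None:
                    return False
            return node.terminal

    t = Trie()
    t.insert("car")
    t.insert("cart")
    assert t.contains("car") and not t.contains("ca")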

Anyway, it's complicated. Especially when hiring for technical leadership.


If I need to convince another developer that a particular path is well-worn and that they are drifting towards NIH syndrome, I have the benefit of time to develop an argument and resources with which to do so. I am not limited to 45 minutes, no resources, and a whiteboard.

To pre-counter an expected objection, if your company makes significant decisions like that in a single 45-minute (or any length, really) meeting you need to change your design process, not your hiring process.


> I have the benefit of time to develop an argument and resources with which to do so.

One would assume so. That's not always the case in my experience. Improvisational discussion of design tradeoffs and costs happen a lot. YMMV, I guess.

Though I agree that no-google, closed-book, no-IDE whiteboard development is very unnatural.

> ...if your company makes significant decisions like that in a single 45-minute (or any length, really) meeting you need to change your design process, not your hiring process.

I'd agree with that, but design processes on the whole tend to be more dysfunctional than organizations realize. There's still a lot of stuff running on deprecated OSs, dead languages, and mountains of technical debt.


> One would assume so. That's not always the case in my experience. Improvisational discussion of design tradeoffs and costs happen a lot. YMMV, I guess.

They certainly do, but significant, unchangeable decisions should not be made that way. If I think there is a better alternative to something expressed in one of these discussions, but do not have the details at hand to make the case, I will voice the alternative and compile the details later. Choices like that should not be made without documented rationale anyway.

> I'd agree with that, but design processes on the whole tend to be more dysfunctional than organizations realize. There's still a lot of stuff running on deprecated OSs, dead languages, and mountains of technical debt.

The solution is to fix the organizational dysfunction. Hiring to the dysfunction is a band-aid at best.


I've had a similar experience. I think the organics companies just clean their equipment/tanks better or something. Regular milk with an expiration date 3 weeks in the future often turns sour within 3 days of opening, despite refrigeration. Whereas organic milk with the same expiration date, from the same shelf in the same store, will last a week and a half. And there's really nothing inherent about organic milk that should make it last longer.


Regular milk with an expiration date 3 weeks in the future often turns sour within 3 days of opening, despite refrigeration.

Wait, what? Are you not refrigerating your milk? This is unusual.


I always refrigerate milk, just wanted to be clear about it to explain that it wasn't going sour because I was doing something dumb like leaving it open on the counter.


I find the reverse. Organic smells sour after 2-4 days, but my standard delivered milk lasts at least a week. Both are local and minimally processed, and come in glass bottles.


Honestly, that sounds awful. I already guard my phone number more closely than my email because a) spam calls are more disruptive than spam email, and b) it's easy to make a throwaway or catch-all email; throwaway phone numbers are harder to obtain. To the point that phone numbers are being used by some systems as an identifying datum. Hell no.


It's mature. The community is mature (in multiple senses of the word). The ecosystem is mature. The package management is solid. There are drivers or SDKs for basically every tool or service I would want to interface with.

The syntax is rich but concise. It doesn't require compilation of binaries. It does OOP pretty well. It does procedural pretty well. It does web pretty well. It does throwaway command line scripts pretty well. It does performance well enough for the things I do with it. It's portable.

Python isn't perfect, but 99.99% of the time it gets out of my way. I must, can, and do work with other languages. I constantly find them frustrating me in small ways. Python generally doesn't do that.
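As a small example of the "throwaway command line scripts" case, this is about as much ceremony as it usually takes. (The access-log layout assumed here is just a guess for illustration; the point is the shape of the thing.)

    #!/usr/bin/env python3
    """Throwaway script: count HTTP status codes in a log read from stdin."""
    import sys
    from collections import Counter

    counts = Counter()
    for line in sys.stdin:
        fields = line.split()
        if len(fields) > 8:          # assumes a common access-log layout
            counts[fields[8]] += 1   # field 9 holds the status in that layout

    for status, n in counts.most_common():
        print(f"{status}\t{n}")

Pipe a log file into it, read the answer, and delete it; that's the niche.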


I've never had a hire go wrong for technical reasons. I have hired, participated in hiring, or inherited several developers who had to be let go for reasons related to attitude or soft skills. Some examples:

- a guy with an alcohol problem who would disappear for a week at a time or come in to work sloshed

- a junior developer who had major problems with authority, mixed with bizarre paranoia. He refused to take direction from his team lead and had to be let go after he started accusing anyone and everyone of trying to undermine him.

- a guy so obsessed with doing everything perfectly that it took him a year to produce what other engineers could accomplish in a month. Granted, his work has been running for 3 years now without a single bug, but even taking that into account he still wasn't cost effective to have on the team

- a developer who refused to take ownership of his projects and insisted that everything expected of him be specified down to the pixel (might work at a large corporation, but not a startup - we don't have time to hand-hold like that)

- a guy who was hired as a junior mobile engineer and then began throwing fits when we denied him the authority to change the priorities of the entire web and mobile product team

Takeaways:

It's fairly easy to assess who is and is not capable of developing basic CRUD apps. Getting meaningful information about a person's neuroses, self-management ability, and ability to play well with others is extremely difficult in the space of a handful of hours of interviewing.


The perfect code guy doesn't sound so bad.


Until you ship a product so late that it's declared DOA. Perfect coders are more dangerous than shitty ones. They usually slow down progress in the name of perfection so much that nothing gets done. IMO.


I don't know that I agree that perfect coders are more dangerous than shitty ones. When you're behind on a project because someone is taking too long, it's upfront and obvious and usually they can be encouraged to speed up things as long as they document where work needs to be done afterward.

Bad developers are the gift that keeps on giving. Everything looks great, you ship, it mostly works, and then you spend four years struggling to build anything on top of what was written, squashing data-loss bugs that take weeks to track down and can never have their root causes fixed, etc.

Companies that end up in the latter situation are basically zombies. They're already effectively dead, but nobody knows this is the case for years, pouring time, effort, and money into a bottomless sinkhole.

Perfect coders can also serve as really good mentors on teams, and catch serious issues everyone else would have missed.


For a startup, sure. But the machine doing LASIK surgery on a human's cornea dozens of times per day absolutely should have a perfect coder. There are some industries where sloppy code should get you fired, because it will get someone killed.


https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guid...

People who are Smart but don’t Get Things Done often have PhDs and work in big companies where nobody listens to them because they are completely impractical. They would rather mull over something academic about a problem than ship on time.


If you're building life support software or manned lunar probes, maybe you want him on your team.

If you're a startup trying to find product-market fit before you run out of funding, better to ship a few bugs every week (in features that have an 80% chance of not existing or being rewritten anyway within a few years) than nothing at all for months.


That's one reason we rely heavily on thorough, structured reference checks.


Does that actually work? I've never had anybody ask me for references for a technical job. Few people are going to say anything bad about a former employee for fear of being sued. And that's if the candidate plays fair. For less than $100, you can bribe three people to say you're the next best thing to Steve Jobs himself.


It works great! I agree that we're unusual in the weight that we put in our references, but I've never had it be a blocker with an applicant. We run every reference through the same script of questions, for a call that tends to last about 30 minutes. Since many engineers often talk in interviews about what their team did, versus their own contributions, the third-party viewpoint is very helpful for getting that perspective. We've found references to be helpful in differentiating between average, good, great, and exceptional individual performance, and extremely helpful in understanding a candidate's teamwork. I personally find it vastly more useful than whiteboard coding and work samples for predicting how a candidate will actually perform.

Aside from their usefulness in making hiring decisions, as a manager, it has been great for jump-starting my relationship with my new hires with context on how they have worked in the past. That has been really helpful.

I agree with you that references have some limitations. We do, of course, expect that applicants will cherry-pick their references to make themselves shine. Yet, I typically get very candid feedback from the reference providers. I think most people simply aren't built to straight-up lie for someone else, even if they are a friend. References are typically someone in some level of authority someplace else, and they put their word on the line. Most people seem to take that seriously.

Another limitation is that people often can't use their current boss as a reference, and for people on an upward career trajectory, that may exclude someone who is capable of talking about the applicant's greatest career achievements. This was the case for me when I applied to my current role.

Lastly, we have to keep in mind that any single reference is colored by the biases and personality of its provider. And some applicants simply have access to better referrers than others for reasons out of their control. All we can do is make judgment calls when it comes to these things.


How do you structure your reference checks? And what do you do for new hires who are very early in their careers and don't have real work references?

Also, I'll note that three of the above employees were sourced via referral from either other employees or friends of execs.


See my other response for more context, but we have a behavioral interview-style reference call script, which is universal for all roles at the company. But it still does tend to reveal good knowledge about the candidate's technical ability, even with generic questions. Occasionally, I'll add an additional question for tech context, but I find the basic script to be pretty comprehensive.

I'll echo the sibling post for inexperienced candidates. References from jobs in other careers also work. But to be honest, if someone can't find anyone to speak in-depth about their achievements and teamwork, they are going to have a really tough time passing the bar to get hired. We don't generally do work samples of any kind, but I'd probably have to make an exception in that case, to get some kind of picture for how they work.

We've certainly had a couple people not work out on our engineering team. But I don't think we've ever hired a total dud or a toxic person.

It's interesting that you mention internal referrals, because I think that can be a major source of bias. People feel obligated to put in a good word for their friend, and if I hold the referrer in high personal regard, my natural skepticism of them as a reference diminishes. I feel referrers should say their (small) piece at the beginning, but from that point onward, the hiring process needs to be independent of their influence.


References from internship colleagues/managers, school mates or teachers can work a bit. But be aware that you will be getting very few data points.

Now, it is easier to influence new professionals than to change the habits of seasoned professionals.


It's definitely fair to check this. I also think this works very well when the hire worked in a well-ordered place beforehand. On the other hand, if someone worked at a toxic workplace before, it's probably counterproductive to ask colleagues or even managers; odds are high that's exactly why the person is leaving.


Other than hiring someone as a contractor first and having a trial run for 6 months, what do you think is a good way to minimize issues like this?


Re: trial run. I do not suggest this.

I'm not a world-class developer, but I'm a good one, maybe great depending on the day. I had 3 offers that were contract-to-hire and I refused them.

You may save yourself a bad hire, but more than likely you will do so at the cost of missing out on hiring a handful of good ones.

If I'm interviewing with you and you don't trust me enough to make me an employee from day one then why should I trust you as a company?


I will only take a contract position if that's the only way I can get a job doing something I know academically -- I studied the technology and did a side project -- but don't have any on the job experience.

Basically, a resume builder


I agree. Good ones always have good options. I avoid contract-to-hire, because even a talented developer who chooses to stay a contractor will avoid contract-to-hire positions.

I guess you just have to go with who you think is the best candidate, and if the decision turns out to be incorrect, fire quickly.


I don't get the drive to eliminate bezels. I mean, big bezels are a waste of space. On a laptop they may serve an engineering purpose, but no user purpose. But for handheld devices at least, it's nice to have enough bezel to be able to hold your device without worrying about activating touch response.

