Not revealed as a terrible company, perhaps, but as a terrible interviewer.
If any company were to quiz me on algorithmic basics, it had better explain to me beforehand why it is among the x% of all hiring companies that actually need to roll their own new solutions in the face of so many well-established libraries.
That is, before you ask me to demonstrate a depth-first search, you had better explain to me why I'm going to need to be doing that instead of just writing an SQL query and tweaking an index, which is likely what I would be doing at most companies.
Part of development is figuring out not just the answers to the questions, but also figuring out "Of all the questions I could have been asked, why was I asked that question?"
A disappointingly large fraction of the time, the answer to "why did you ask that question?" is "we noticed a correlation, confused it for causation, and built an entire strategy around it".
> need to roll their own new solutions in the face of so many well-established libraries
I'm not defending interview quiz-time, but it isn't (or shouldn't be) about rolling your own solution. It's about understanding concepts. Understanding the basics of time/space complexity is pretty fundamental when designing systems.
Developers frequently encounter hashing and (probably less often) trees, so I don't see an issue with asking something like "Why/how/when is a hash-based lookup faster <in some situation> than a tree-based lookup?" (as one question during an interview). If a person can give a good answer to this, they'll pretty easily be able to understand the performance tradeoffs of hash vs btree indexes when using MySQL (and this same tradeoff in a variety of other situations).
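A minimal Python sketch of the tradeoff that question probes (my own illustration, not from the thread): a hash table gives average O(1) point lookups, while a sorted/tree-like structure pays O(log n) per lookup but can answer range queries that a hash table cannot answer without a full scan.

```python
import bisect

# Hash-based lookup: average O(1) per key, but no ordering.
inventory = {"bolt": 3, "nut": 7, "washer": 2}
assert inventory["nut"] == 7  # one hash + bucket probe, on average

# Sorted lookup: O(log n) via binary search; here a sorted list
# stands in for a btree's ordered keys.
keys = sorted(inventory)
i = bisect.bisect_left(keys, "bolt")
assert keys[i] == "bolt"

# Range scan "every key between 'b' and 'o'": trivial on the sorted
# structure, but the hash table would have to examine every entry.
lo = bisect.bisect_left(keys, "b")
hi = bisect.bisect_left(keys, "o")
assert keys[lo:hi] == ["bolt", "nut"]
```

This is essentially the same reason a hash index wins for equality predicates while a btree index wins for range predicates.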
If you're measuring my design ability in the interview, you had better give me an opportunity to use that ability after being hired.
Too many times have I been asked questions that shaped my expectations of the job, only to be disappointed later on. The worst offender in this respect put me through a technical screen that could only reasonably be passed by someone with a bachelor's degree in CompSci or equivalent work experience, only to later tell me that the code I write "should be understandable by a kid fresh out of high school" (actual quote). I applied to 12 different job postings that night.
The interview led me to believe that I was being hired for my expertise, and the job expectation was actually to be a warm, brainless body in a formerly empty seat. That's why I don't like questions that act as proxies for some other metric. The questions you ask me are telling me about you as much as my answers tell you about me. When you cargo-cult interview procedures from another company, you are actually misrepresenting the nature of your own company as being like the company you stole your interviews from.
Not quite. I was probably actually replaced by someone fresh out of college, who failed to learn anything during those 5 years. I am 99% certain the contract requirement mandated bachelor's degrees (and in a relevant field) for all software developers.
> "before you ask me to demonstrate a depth-first search, you had better explain to me why I'm going to need to be doing that instead of just writing an SQL query and tweaking an index"
How am I going to ask you to tweak an index if you don't know how to traverse a tree? Although I would have asked you about B+ and B* trees, the ones used to index a database. There are differences between those trees, and you need to know them in order to decide which one is the better option for improving the performance of your queries. Obviously this improvement means nothing for a small startup, but imagine the impact it has at a company like Google.
I think the point is that a lot of interviewers ask those questions because big companies ask them. But there is a reason why the big companies do. Obviously I would prefer the candidate who knows the answer over another one who can also do the job but doesn't know it.
How high is the chance that you select the candidate who happens to know all the solutions (e.g. by learning them by heart recently for the dozens of interviews he's planning on doing), but is not a good technical fit? My estimate is: pretty high. Let me explain why.
If you indeed need someone who knows about B* vs B+ trees, why not ask him about that separately ("explain to me the difference between ..."), to see if he has the technical background you need? Even if someone understands that difference, that same person might have problems getting DFS right during a live interview, for a number of completely irrelevant reasons (nervousness, a momentary lapse, being a bit "rusty", and so forth). Knowing the difference between a B+ and B* index/search and operating a database correctly doesn't require the ability to implement DFS flawlessly (and most likely will never require the person to take a shot at it, for that matter...)
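For context on what the whiteboard exercise under discussion actually asks for, here is a minimal iterative DFS over an adjacency-list graph (my own sketch, offered only to show how small the exercise is relative to the day-to-day skills being debated):

```python
def dfs(graph, start):
    """Iterative depth-first search; returns nodes in visit order."""
    visited, stack, order = set(), [start], []
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        # Push neighbors reversed so they pop in their listed order.
        stack.extend(reversed(graph.get(node, [])))
    return order

# A small diamond-shaped graph for illustration.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
```

A dozen lines when you're calm at a keyboard; the commenter's point is that producing even this much flawlessly on a whiteboard, under pressure, measures something different.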
I think the real problem is that some people don't seem to understand the goal of these questions. If you do algo whiteboard questions (and you should!), you should measure the candidate's behavior, reactions, and analytical skills while they attempt a solution with you - not just find out whether the provided solution is correct (indeed, it's often more interesting when it isn't and you work with the candidate on locating the issue!). And all the while, the interviewer can try to figure out whether he/she wants to work with that person, given the current interaction, too.
I disagree with you on a lot of things. I don't like whiteboards because they are stressful and feel strange if you don't use them regularly. I was a teacher who used whiteboards daily, and they are completely different from using a computer. I would rather ask the candidate to talk about one of his projects, start asking relevant questions related to the job position and applied to the project he knows, and have a discussion that way. Besides, the interaction with the candidate only biases your decision toward his personality rather than his skills - toward whether he does things the way I would do them.
But that is my opinion; the person who is hiring is the one who decides whom he wants to hire. I feel more confident finding a good fit/candidate my way than the way you described.
"Perhaps you won't, but since we don't know yet exactly what you'll be working on - we might know broadly what PA or even project, but we can't know what problems you will encounter or what direction it will take to debug any problems that arise - we want someone with a broad base of skills who can at the very least recognize performance problems and solve them in a simple case. We expect that if you can solve this relatively simple problem in an environment with no resources, then with the aid of documentation, profiling tools, and teammates to lean on, you'll be able to address much more complex issues that arise. On the other hand, if it takes you documentation and teammates to solve this simple case, who knows what kinds of tools it will take you to solve the real-world problems that arise."
"Correlation does not imply causation" doesn't imply that correlation never implies causation.
> If any company were to quiz me on algorithmic basics, it had better explain to me beforehand why it is among the x% of all hiring companies that actually need to roll their own new solutions in the face of so many well-established libraries.
Please suggest a reasonable alternative then. I need to interview people and see if they are going to be capable of digging through complicated code, of coding things reasonably quickly, of writing scalable, robust code, of potentially digging into things enough to optimize their performance, etc., as reasonably well as I can in an hour. And I need enough concrete evidence that everyone else in the debrief believes me. I would love to change the way I do things, but I am held accountable for the interview, so I can't just show up to the debriefs saying crap like, "He says he can use a library for anything that comes up." And I need to evaluate some other soft-skills type stuff, like caring about the customer, communicating well, delivering more if you see something extra that you think should be done, etc.
But the obvious answer to your question would be, "because if everything we do could be solved just by using a library, we'd have interns or offshore people do it for a fraction of the pay."
As prirun says, you have a conversation, not a quiz.
Quizzes are a terrible way to assess skill; they're used in academia because they're, sadly, the only quick way to get a reasonably consistent read on memory and information retention across a group of dozens of students. No intelligent company should be copying this; companies have the luxury of dedicating significant time to exploring the potential of each candidate.
You say "Hey, here's a real problem that we'd need to solve. How would you go about it?" And let them talk. They'll be able to speak in informed terms about how to approach the problem even if their algorithms memory is rusty or if they're self-taught. If it sounds like they're on the right track, you win, if not, you move on. Repeat until end of interview, evaluate at the end.
I've used a basic code competency test for an interview pre-screen before. This is a take-home project that should take a competent person about 30 minutes, 2 hours if they go all out and add a bunch of bells and whistles.
People say this, that their take-homes are non-intrusive, but don't mean it. I mean it. The quiz checks whether they can make a single call to a prominent API and return the results in a very simple format.
These pre-screens should be well below the competency requirement, which can only be assessed in the interview. It's essentially a fizzbuzz that they can't easily copy and paste by searching for "solutions to fizzbuzz".
Other than an extremely simple pre-screen like that, your decision should be based on their ability to reason, discuss, and solve problems on the fly, not how well they remember the particulars of a compsci curriculum. That selects for the recency of their last class, which, ironically, is usually inversely correlated with their much more valuable real-world experience - the thing people think they're actually testing for.
This doesn't apply if you actually are doing deeply theoretical work on the cutting edge of compsci that may require frequent cooperation with academics. But that's a tiny minority of positions.
I love this; it's a good insight into what a hiring company wants to know about a potential employee. So how do you get there? Here's my suggestion:
There are 2 very distinct (in my mind) kinds of skills in question. For the "soft skills", a manager-type could and maybe should do that part of the interview. A trusted tech person should do the technical part. I'd go so far as to say that every technical person on the staff should be trained/groomed to help with interviews. So maybe you have 1 tech person do the tech interview, and a 2nd is learning how to interview.
For the interview itself, the process I'd use is to ask a lot of questions. For the soft interview, things like:
- "Tell me about a difficult customer you've had." Let them explain a while, and ask lots of probing questions: "Why didn't you do X instead?", etc.
- "Tell me about a difficult problem you had with a former employer." Same drill.
For the tech part, bring some code to the interview, ask the interviewee to bring some code to discuss. Another option would be to just throw them into a source code tree and say something like "Here's a tree of one of our projects. Talk to me about it." Maybe I'm wrong, but I think I could tell a lot about a tech person just from how they approached being thrown into a mess with no prior information. After all, that's a large part of being a good developer IMO. As they start to figure things out, look at code, etc., ask them questions about what they're seeing. Let them ask too. If you run into something interesting, like a specialized algorithm, ask them about it: "Why do you think it was done this way? What would some other options be?"
Then let them do these things for code they have previously written. For example, I wrote a Prime minicomputer emulator. I'd love an interview where they asked me about why I did this project, what problems I ran into, what were some alternative designs for tricky problems, what were the tricky problems, what did I learn from doing this, what tools did I use to do it, etc.
>Another option would be to just throw them into a source code tree and say something like "Here's a tree of one of our projects. Talk to me about it."
"It's in Perl, I don't know Perl, and especially not your internally modified version of custom-magic Perl that Steve wrote 7 years ago."
Not to mention that you are now either showing source code trees to random potential hires, or you have to audit/create/otherwise use some potential set of source code. Maybe you prescreen by asking them their favorite language, and you come in with an open source project, in their language of choice, but now you have to have one of your devs spend time familiarizing themselves with Redis or the Python interpreter or Hibernate Core or Angular or whatever, and what happens when they ask to do the interview in Haskell?
FWIW, I know some companies that do the interviews you're describing, but they're all relatively small (<100 employees), and they all do that kind of interview only after a technical phone screen with your conventional questions, because the time investment required by the company is so great.
I've worked on projects where people rolled their own solutions instead of using what's already out there, and that's not great either.
The real solution I guess is talking to people and teasing this info out of them, and perhaps showing them some older code you have since fixed and seeing what they find in it.
Edit - one reason I personally try and use a lot of libraries is so that large parts of a project are maintained upstream after I leave my contract; the more I write into the project, the more the next dev will have to maintain, anything I can push back into external libraries is a win.
In fact, it should be explained in the job posting itself. As an added bonus, people can then self-select into or out of the offer without losing any further time.