Why are we interviewing Developers by asking Architect questions? - Part 2

I was very fortunate with my latest post about programmer interview questions. It drew a large number of readers, and the discussion that started in the comments section was very enlightening.

Let's recap


My previous post pointed out a flaw in the current interview process for programmers.  When interviewing for a Software Developer position, we are asking questions that are designed for Software Architects.

The Comments were spectacular


One commenter described an interview he enjoyed, where the interviewers avoided the problem of asking the wrong questions by simply having him write code.

Another discussed how these bad interview techniques negatively impacted his career.

And a third (we'll call him Nate)...well, the things he said inspired this post. So let's go over them, shall we?

The professional programmer


Being a professional means that you are conversant in the terminology of your industry.

I am a firm believer in Einstein's statement "If you can't explain it simply, you don't understand it well enough."  Since Nate's using some big words here, and I'd like to talk about this in a way non-native speakers can understand, let's restate this without the big words.

"If you don't know the words that people in your line of work use, you're not being professional."

The word "professional" is hard to translate here, so we have to understand that professionalism is partly about showing respect for your co-workers and partly about getting work done. The two go hand in hand: you show respect for your co-workers by getting your work done by the deadline. But it also means not flirting with every opposite-gender employee in the company.

So let's start by saying that most programmers are not professionals.  For Many Reasons.  Note that the last link is an interview with a software architect.

With regard to professionalism, because of that respect component, and of course the general way organizations are structured, we as employees are beholden to the requests of a manager. That extends to communicating at the manager's level--turning off all of the technical terminology so that he can understand. In fact, I would go so far as to say that the ONLY time technical jargon is relevant is in discussions with our peers, other developers.

But what about that terminology? This isn't like biology or physics, where you memorize a base set of terminology and build on it. The world of programming is constantly in flux. Just this year, for example, the term IoT began to pop up on all my blogs and I had to learn its meaning. Know what it means? "Internet of Things." That's right: it's not even a term for something new, just yet another arbitrary label for something we work with every day. I truly believe that we shouldn't be forced to memorize every meme on the internet.

So no, I don't think that a programmer is bad at his job if he doesn't "know the language".  Let's look at more of his comments.

I'm not even speaking the language


Nothing that you have mentioned concretely to this point is relevant to architecture. Indeed, to even say "architected code" is a complete misnomer and implies that you don't understand the term as it is actually used in the industry. Code can be "designed," but it is not architected.

Of course I did say "Architected Code," and precisely for all the reasons I've mentioned previously (Einstein and all that). But I was talking about architecture. Just as a lean-to shack may be the appropriate solution for keeping your firewood out of the rain and a skyscraper the appropriate one for your corporate headquarters, it's important to understand that software also comes in many shapes and sizes.

Let's take a real-world example of an architect in action. At NASA, I was put on a team of ColdFusion developers in order to help ease their transition to .NET. We were tasked with building a small administrative application from scratch. I suggested we build it as an MVC 3 application. Why? An academic architect might say that was crazy: I was taking a team of people and thrusting them into a new platform head first (MVC 3 was brand new at the time). It was my real-world experience that won here, because they already knew how to code. They didn't need months to memorize every nuance of the new language. They would learn by doing and be better for it.

A guy like Nate might find fault with my decision.  Most academic architects would find fault with my choice simply because the technology was new.  NASA loved it.

What am I really talking about?


Maybe your choice of examples have just been too limited, but they don't remotely resemble the sorts of things I would ask when interviewing an architect.

Well, this post should provide Nate with plenty of detail. Just in case anyone is still confused, this is what I've been talking about this whole time. NSFW (a few words).


I do not share her views, but she is absolutely correct in stating that those questions do not tell anything about the quality of a developer.

With regard to the questions we're asking architects--what Nate is really saying is that knowledge of these low-level architecture principles is assumed by the time an architect makes it to an interview. That may also be a mistake, but it will have to wait for another blog post.

Nate continues to repeat that it's important to know this and that arbitrary fact.  There's a bit of sadness that washes over me as I read and realize what he is really advocating.

Programming is not like academia


Nate's examples of things that developers should know for interviews aren't related to current technologies.  In fact, to a casual observer it might look like Nate is way behind in technology.  That's not how I saw it.  He was mentioning old, low-level technologies like "state machines" because he truly believes that Computer Science should be like the other sciences, filled with academic discussions and debates.  But in the real world productivity matters more than quality--in some cases to the extreme.

A bug-fixing maintenance developer is on a time limit. He rarely benefits from discussing his problem with his co-workers (unless it's a very big problem), and for every day he spends working on it, there's usually a manager complaining that it's taking too long. Nate's line of thinking assumes this guy spent ten years in college memorizing a pile of programming theory. That neglects the constant growth of technology and the confusing shifts in its language, not to mention that theory is just that: theory.

In reality, after spending 12 hours at the office, our poor maintenance developer then has to go home and read up on the latest technologies rather than spend time with his family--if Nate's way is the right way. Sadly, many companies agree, and it has resulted in an unfortunate situation: programmers are being treated like scientists and factory workers at the same time. The enormous amount of stress this creates is driving them crazy.

Why Nate's comments are so great


Nate isn't just saying the wrong things. He's saying EXACTLY the wrong things. If you want to see what you're up against--exactly what's wrong with the current line of thinking in our industry--then read his comments. They're intelligent, well-articulated, and phenomenally wrong. At the end of his comments, he goes so far as to say that he won't hire developers who don't know the answers to these arbitrary questions.

I have a secret for those of you reading this post: you don't want that job. Nate is looking for programmers based on skills they don't use on a day-to-day basis. That means his team is going to be composed of politicians, not programmers. Those are people who are masters of being unproductive, and that sort of corporate environment is an ugly, "every man for himself" rat race.

The truth is that work-life balance demands that something give. In Nate's case, he has chosen to give up productivity. Since the demand for our skills is only increasing, it's a pipe dream to think that the factory-worker mentality will go away. Instead, I choose to hone developers' skills down to the ones that actually benefit them on the job. They can work toward a higher rank (read: architect) on their own time, if and when they choose.

A final word about Nate


Nate said he wouldn't hire me, but he should know that I wouldn't hire him either.  His blog is empty.  I encourage all of my programmers to write a blog, even if it's just occasionally.  It shows that you have the passion to have an opinion on something.

I absolutely tend to choose the best for my teams, but I base that choice on their passion for the work. When I pick my teams, I don't waste time with qualifying questions; I ask candidates what code means to them. That tends to get the right answers.

In fact, my complete list of questions can be found in a previous post.

Comments

  1. First, it's Nathan, not Nate. (Since we're talking about being professional and all.) I do have a blog, it just happens not to be on Blogger. Nevertheless, I updated my profile, so my blog should be linked now.

    I still don't think that you've demonstrated that programmers are being asked architecture questions. I asked for specific examples of architecture questions, and you haven't provided any. I actually think your interview advice is pretty decent, and I will probably use some of it in the future (really). That said, you are missing a key question: what is a major mistake you have made in your work in the past two years? I would probably want both a technical and non-technical answer, as we make interpersonal mistakes as well as programming mistakes, and I prefer to work with people who are self-aware enough to recognize their own shortcomings. Second, I wouldn't characterize any personal or solo project as "large-scale" because to me that implies hundreds of thousands to millions of lines of code. The more general concept implied by your question is whether one hones one's skills outside of day-to-day work responsibilities, and that's definitely worth exploring with a candidate.

    Whether MVC3 was an appropriate architectural decision is hard to say, given the limited detail you've included. It really sounds closer to a business decision than an architectural decision, and so would be subject to a variety of non-architectural considerations. MVC3 also likely carries with it a number of implied architectural decisions, such as transaction granularity, persistence strategy, "layering," etc., saving you from having to make architectural decisions at all. If, e.g., the application needs to connect to or provide external services, interoperate with other applications, meet some SLA definition, or adhere to a comprehensive security standard, more architectural work would be required.

    You contradict yourself when you say that the factory worker mentality cannot go away and yet programming is nothing like factory work. Which is it? Designing a factory requires engineers, so maybe programmers are more like engineers after all. Also, I assume factory work today is at least relatively similar to what it was 100 years ago, but people who code like it's 1999 (or 1979) might as well be factory workers. In short, I don't see how your proposal is a solution, or whether it's even accurately pointed at a real problem. The links you posted on why software developers aren't professionals also contradict your main point, because they actually advocate increasing the level of professionalism, which I agree with completely.

    There are two major problems that seem to be in play here. One is programmer competence, which is highly variable and difficult to assess. (Some professional standards would make this much, much easier.) The other is software quality, in part a byproduct of programmer competence. Software quality can be objectively measured according to a number of criteria (there's that Computer Science thing again), though it's not a solved problem by any means. I don't think addressing the former has a huge impact on the latter, but perhaps I take an overly empirical, scientific approach. If the everyday output of programmers is going to improve, it won't be because we hire lots of clever maintenance programmers. After all, if they were really that clever, they'd be writing good code in the first place and maintenance wouldn't be such drudgery. In other words, I think your position is self-refuting. If you've got specific interview questions that you think are appropriate for architects but not for programmers, I'd be happy to know what they are to see whether I really disagree with you on the specifics, but as far as big-picture thinking goes, we seem to be at odds.

  2. To continue: I agree with all the links you posted: programmers ought to be more professional. Obviously we have a very long way to go before that happens. Professionalism is not about "getting work done"; it would be more accurate to characterize professionalism, for the present discussion, as getting work done in a manner appropriate to the circumstance. Most software does not live up to that standard, and I do not have the time to list all the harmful consequences that follow from this state of affairs. I should note that most of what I say assumes that we are talking about experienced programmers, not the novice or relatively inexperienced.

    The point, re terminology, is that there are basic, well-understood terms that I expect a programmer to become very familiar with as his or her career progresses. It's one way of measuring quality, and in my experience this measure results in fewer false positives than something like years of experience. (I'd love to find a way to assess false negatives.) Sadly, we don't even have to get close to the professionalism of doctors or engineers here: the issues I'm bringing up are what I would consider to be on the level of an electrician or HVAC technician.

    Terminology is constantly expanding, but in the case of what was originally discussed (OO design patterns), the major terms have been around for at least 2 decades. In my view, someone who presents him/herself as an OO programmer but can't participate in a discussion of patterns because the terminology is too foreign is more in the category of junior programmer, regardless of years of experience. Marketing jabber like "The Cloud" and "IoT" isn't what I have in mind here. Whether it's relational databases, OO, or functional programming, I expect an experienced programmer to be familiar with the standard terminology of his or her chosen specialty.

    Working with a moderate-size Java application, I would expect to run into at least a half-dozen of the major design patterns (probably many more). To maintain such code "professionally" implies conformance with the established patterns, something that is often difficult to do when one is not familiar with the patterns, what they do, and why they were chosen.
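
    To make that concrete, here is a toy sketch (my own invented example, not from any particular codebase) of the kind of vocabulary I mean -- the Strategy pattern, one of the most common:

        // Strategy pattern: interchangeable algorithms behind a single interface.
        interface PricingStrategy {
            double priceFor(double baseAmount);
        }

        class RegularPricing implements PricingStrategy {
            public double priceFor(double baseAmount) { return baseAmount; }
        }

        class DiscountPricing implements PricingStrategy {
            public double priceFor(double baseAmount) { return baseAmount * 0.9; }
        }

        class Checkout {
            private final PricingStrategy pricing; // the strategy is injected, not hard-coded

            Checkout(PricingStrategy pricing) { this.pricing = pricing; }

            double total(double baseAmount) { return pricing.priceFor(baseAmount); }
        }

    A maintainer who recognizes that shape adds a new pricing rule by adding a class; one who doesn't threads yet another conditional through Checkout and erodes the design.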

    As far as programming being like Computer Science goes, you've mischaracterized my views considerably. Engineers benefit from advances in scientific fields such as physics and chemistry, and consequently we expect engineers to learn and use this knowledge. This doesn't imply that theoretical, novel, or very recent findings are important to engineering, but it does mean that an engineer who stopped learning anything new 20 or 30 years ago is likely either to be unemployed, stuck in a dead-end job, or not very well respected by his or her peers. Likewise, programmers benefit from advances in computer science, and so similarly should be aware of some of the scientific underpinnings of their work. (Concepts like big-O notation, idempotency, referential transparency, covariance, etc.) This does not imply being an expert, or even using the exact term correctly 100% of the time: it does mean that one has awareness of the scientific concepts, their value, etc.
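
    By "aware" I mean being able to spot these properties in everyday code. A deliberately trivial, made-up illustration:

        enum Status { NEW, SHIPPED }

        class Order {
            Status status = Status.NEW;
            double total = 100.0;
        }

        class Properties {
            // Referentially transparent: same input, same output, no side effects.
            static int square(int x) { return x * x; }

            // Idempotent: calling it once or ten times leaves the order in the same state,
            // so retrying after a timeout is harmless.
            static void markShipped(Order order) { order.status = Status.SHIPPED; }

            // Neither: every call changes the outcome, so blind retries silently corrupt the total.
            static void addShippingFee(Order order) { order.total += 5.0; }
        }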

  3. The bulk of your comments about science and whether self-taught programmers are more skilled than those with degrees is entirely non-empirical and I lend it no credibility. It's absurd (to me) to think that someone who goes through the considerable difficulty of getting an MS or PhD somehow lacks "drive." There are lots of exciting developments in cryptography, distributed computation, machine learning, etc. being done by the academics you deride, but I'll avoid getting into details as that would imply architecture... The point is, a great many lasting contributions to the practice of programming come from academics, and programmers do themselves and their field a great disservice by remaining ignorant of these contributions.

    Contra your tangent on work-life balance, what I'm advocating actually involves less effort (in the long run). That is, the closer you get to actually being a professional programmer, the less time you spend fixing your own mistakes, the less time you spend creating bad designs that have to be thrown away, the less time you spend building replacements for things that weren't done right the first time. The approach of heads-down problem solving and "maintenance programming" does not and cannot lead to any solution for this problem. Incremental solutions to exponential growth cannot work.

    So maybe my motivation for advocating professionalism is selfish. I don't want to be up at 2am again fixing an entirely avoidable production bug because a junior programmer got some awful design through a lax code review process (and got a promotion for "getting things done" to boot.) I don't want to spend 2/3 of my day figuring out the convoluted logic of a working system that simply cannot be extended without creating new bugs in order to re-write it in a sensible manner. I don't want to have anything to do with a codebase that, according to SonarQube, requires 9 man-years of work to clean up (even if it did take 100 man-years to build). I'm tired of running profilers to figure out where that clever, problem-solving, git-r-done programmer hid all those n^2 algorithms that made parts of the application slow to a crawl over a 3 year time-span. I'm tired of saying "yes" to bad ideas that exist only by mandate from leadership. (Especially the former programmers who weren't very good but thought they were and played politics well enough to get promoted as "architects" (or worse) somewhere along the way.)

    I'm also prideful, I suppose. I don't want someone cursing me years from now as they try to maintain the awful code that I wrote because I was ignorant. I don't want to write bad code because I have to "follow orders" either. I carry enough of those sins around, and prefer not to add to my burden. Working harder, being more clever, learning new techniques: none of these can save me, they can only slow my production of negative karma.

    I should also clarify that I don't necessarily blame programmers for failing to know what they really ought to know. If programming were truly a profession, then they would be blameworthy. As it stands, the average programmer is overworked and mostly clueless as to the major historical and scientific events of their field. This situation fits the interests of the corporate elite, and so is unlikely to change. Nevertheless, the people I've worked with whose work I admire most all share in common a commitment to professional improvement, something that is not reducible to the acquisition of new skills or improved techniques.

  4. I'd like to mention that I'm glad to read that we perceive the same problems in the industry (politics and a lack of effective screening). It's noteworthy that we see the problem so differently that we've come to nearly opposite conclusions as to how to solve it. I wonder if you live in FL. Perhaps we could hang out sometime and discuss it further. I don't believe that this discussion is anywhere near its end, and we could certainly expedite it by discussing in real time. Also, you're wrong :)

  5. I don't envision anyone spending 10 years in post-secondary education "memorizing theory." Most humans don't do well under those conditions, which is why most universities don't actually operate in that fashion. If you don't think theory matters in programming, you could start with Amdahl's Law. Sure, it's a theoretical concept from some PhD guy back in 1967, but in this case it turns out that the real world is usually worse than the theoretical optimum. Maybe you can go the rest of your career without having to write parallel code, but I don't have that luxury.
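
    For the record, Amdahl's Law says that if only a fraction p of a program can run in parallel, the best possible speedup on n processors is 1 / ((1 - p) + p/n). A back-of-the-envelope sketch (illustrative numbers only):

        class Amdahl {
            // Theoretical upper bound on speedup when a fraction p of the work
            // can be spread across n processors.
            static double speedup(double p, int n) {
                return 1.0 / ((1.0 - p) + p / n);
            }

            public static void main(String[] args) {
                // Even with 95% of the work parallelized, 128 cores top out near 17x,
                // and real systems usually do worse than this ceiling.
                System.out.printf("%.1fx%n", speedup(0.95, 128));
            }
        }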

    I don't think there should be a single path to becoming a professional programmer, but I do think the concept needs to become a reality, given the real-world harms that result from accidental or intentional misbehavior from computer systems. Our field is really bad at doing apprenticeship and mentoring, in my experience. We pretend to have a meritocracy, but often the best way to get promoted is not to do great work, but rather to "job hop." The alternative is to slave away for someone until they are gracious enough to promote you, then you have the privilege of treating other entry-level programmers as slaves and continuing the cycle. There is a dearth of objective standards or qualifications for mid-level, senior, or architect positions. We let HR departments pigeonhole candidates as "SQL" or "Java" or "C#" or "C++" or "Big Data" or "Web" developers, rather than hire for motivation, creativity, or determination. Some sort of objective way to classify a programmer as an apprentice, journeyman, or master, (or some other, superior classification system) might help to clean up some of the mess we are currently in.

    If you want to talk about "low-level constructs," why not talk about files? These crappy things were deprecated by Engelbart in the mid-1960s and we're still using, abusing, corrupting, and losing them today. Anyway, strings are low-level constructs and we still use them all the time. The point about state machines is that they're everywhere and most programmers don't even see them, hence they fail to use them effectively, and so programs are harder to maintain than they otherwise would be. Like dozens of other examples I could list, once you learn the pattern you start to notice it. In this case programming language designers seem to bear some of the blame since most languages fail to make state transitions first-class entities. (Likewise most languages lack a good way to discard partial state changes (rollback) but STM is another story.) At any rate, someone who groks state machines is more likely to write compact, testable, comprehensible code, and be less affected by the (both theoretical and very, very real) problem of state space explosion.
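
    To illustrate (a toy example I just made up, not from any real system): even a trivial order workflow is a state machine, and writing the transitions down in one place is what keeps it small and testable:

        enum OrderState { NEW, PAID, SHIPPED, CANCELLED }

        class OrderWorkflow {
            private OrderState state = OrderState.NEW;

            // Every legal transition lives here, instead of being scattered across
            // boolean flags and nested ifs throughout the codebase.
            void transition(OrderState next) {
                boolean legal =
                    (state == OrderState.NEW && (next == OrderState.PAID || next == OrderState.CANCELLED))
                    || (state == OrderState.PAID && (next == OrderState.SHIPPED || next == OrderState.CANCELLED));
                if (!legal) {
                    throw new IllegalStateException(state + " -> " + next + " is not a legal transition");
                }
                state = next;
            }

            OrderState current() { return state; }
        }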

    Put another way, the value of theory, in a real science at least (which Computer Science mostly is), is that it is always true. Whatever new innovations may appear tomorrow, they cannot violate sound theories. That shiny new NoSQL database may be "cool," but if it fails any of the ACID properties, I'm not entrusting it with my data unless my data is of such low value that I'd never try to recover it if it were lost or corrupted. Optimizing a quadratic algorithm by lowering its memory requirements will always fail to solve the underlying problem. Reusing an initialization vector with the same encryption key is always a violation of semantic security. So, knowing theory is one way of avoiding avoidable mistakes. (So is familiarity with logical fallacies: how many times has your project manager or boss affirmed a consequent in a meeting?) Theory isn't everything, but we would all be better off if, for instance, the PHP inventors had learned more of it before setting out to invent PHP.
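
    As a concrete (and entirely hypothetical) example of the quadratic point above, here is the sort of duplicate check that profilers keep turning up, next to the fix the theory points to:

        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;

        class Duplicates {
            // O(n^2): fine in the demo with 50 records, a crawl three years later with 500,000.
            static boolean hasDuplicateQuadratic(List<String> ids) {
                for (int i = 0; i < ids.size(); i++) {
                    for (int j = i + 1; j < ids.size(); j++) {
                        if (ids.get(i).equals(ids.get(j))) return true;
                    }
                }
                return false;
            }

            // O(n): change the algorithm, not the memory footprint.
            static boolean hasDuplicateLinear(List<String> ids) {
                Set<String> seen = new HashSet<>();
                for (String id : ids) {
                    if (!seen.add(id)) return true;
                }
                return false;
            }
        }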

  6. For a brief but valuable vignette on the value of patterns, I recommend this (patterns as a language). I hope someday to have time to write with such brevity.

