Preparing CS students for programming interviews from day one (etilevich.wordpress.com)
72 points by RiderOfGiraffes on Feb 21, 2011 | hide | past | favorite | 45 comments


I'm convinced this is actually a good idea ...

  > I stopped using PowerPoint slides.
There's a great start.

  > immediately pose a programming puzzle using this concept to the class
Instant reinforcement of the ideas

  > ask students to work on the problem for 5-10 minutes
Enough time to get into the issues concerned, not enough time to get stuck and frustrated

  > volunteers to present their solution by writing it on the whiteboard.
Present your work to others for everyone to learn.

  > discuss and correct the presented solutions from
  > the perspective of a potential interviewer
Immediate feedback for everyone, further analysis of the problem and the ideas involved.

It won't suit everyone, but ...

  > Will I teach all the remaining technical material
  > in the course using this method? Of course not.
I wonder which method will prove more effective.


I have had some classes that employed a similar method. It works in some cases (small class size, professor and students have good rapport and familiarity), but in general this is how it would go in my experience:

- Students group up for 5-10min to work together, everyone sits there and waits for the one person who knows the solution to solve it

- Prof asks for a group to come up, no groups volunteer so he starts picking at random

- Students that don't know the answers will stop showing up for lecture or will cling to the people that do, so they don't learn the material

Like I said, there are cases when it can work -- and when it does it makes for a really good lecture. But I find it is increasingly rare that students are willing to go up in front of the professor and their classmates and work through their mistakes; you are very exposed, and I think the fear of public failure can be overwhelming.


One way to approach this is for the teacher to model getting to an eventual solution, including mistakes and false starts. This gets the students comfortable with the idea that they don't have to know everything when starting, or present a perfect solution, and that the real learning comes from some struggle to get to the solution.


PBL can be very powerful in the right setting, but as you point out, it really boils down to student/professor rapport. In the best case, PBL is a fantastic learning tool, but in the worst case, as you say, it can be incredibly divisive.


What is PBL? I'm guessing "problem-based learning"?


yes, sorry - i spent the better part of a year studying that in depth and got rather used to using the acronym :)


I just graduated from the department Eli teaches in, and I know him personally. Out of curiosity, how did you stumble on this?


I have no idea. I do a lot of searching, some computer assisted, some directed by Bayesian filters, some directed by keyword search, and some just clicking around. I spend about 2 hours a day on searching about math, teaching, hacking, programming, physics, enrichment, and similar. I also employ programmers, and am interested in the problems of interviewing. I guess in that 2 hours I view about 700 or 800 pages, most for just a couple of seconds. I then save around 20 that my systems have marked as potentially interesting.

I found it in the bundle.


I like Eli too, he's pretty good at dry humor. I know him from the georgia tech days.


This is a good way of teaching in general - presenting a problem, having students work at it (alone or in groups), then everyone discussing solutions. This is class time well spent, much more effective than lectures. Students becoming good interviewees is just one side effect.


If there were surgeons applying for jobs who didn't know what a scalpel was, I sure hope they'd be asking that at an interview.

Anyone can write on their CV that they've got five years' programming experience, but unless you can demonstrate the skills you've learnt over those five years, you might as well have spent them watching paint dry.

If you've spent the last five years teaching undergrads data structures, then I certainly expect you to be able to answer questions on data structures.

Medicine is a regulated profession, so you don't need to ask these questions: someone's already asked them and certified the individual.

What's often missed is the fact that software development interviews are an order of magnitude better than those in other non-certified professions. In many industries interviews often come down to the gut instinct of the interviewer, a practice which is known to be highly unreliable.

Programmers are lucky that we're in a profession where it is possible to separate the wheat from the chaff using objective questions.


I disagree with the assertion that solving a coding question on a whiteboard is equivalent to asking a surgeon what a scalpel is. The equivalent for an engineer would be asking what an if statement is. Both are simple tools that are used for your job, and interviews should assume you know what they are.

The real equivalent for a surgeon would be to give him a theoretical case of a patient and then ask him to explain how he would solve it. I'm not a surgeon so I have no clue what their interviews are really like, but asking a question like this would seem like a very reasonable thing to do.

As for the guy in the article who whines about employers stubbornly wanting someone who has written code sometime recently, I'm not quite sure why he'd think otherwise. When you hire someone at a PhD's salary you generally don't expect to have to hold their hand for a few months as they get back into learning how to write code. Not to mention anyone I know who is good at writing code writes code for fun in their free time. Would you hire an artist who has a Fine Art degree but hasn't actually drawn anything for years? I think not.


The goal of CS should be to produce students who build the next Google. What if Google changes its interview game and asks people to show what they actually built?

CS's goal could be greater than being a factory producing employees for big corporations.

The new gospel is a lot better: "Build something useful." Start it on day one.


What if Google changes its interview game and asks people to actually show what they actually built?

This is not likely to ever happen in a big way. Way too easy to show stuff that you built that you either didn't really understand or didn't really contribute to.

I've interviewed people with significant open source contributions who seemed almost incompetent in the interview. I almost suspect that they hired someone to do the checkins as I'd ask why they did X vs Y and it was like they didn't really know what X was, much less Y.


Knowing something and doing something with what you know are two different skill sets. Small companies realize this soon.

Google is driven by numbers. If they see enough people who are well informed and smart, but who actually gamed the system just to get into Google and don't really do much, then companies will be forced to make major changes.

Every idea/process has an expiry time before everyone catches on and tries to game the system.


Promote literate programming!


This professor has done exactly that. Students in his senior level software engineering course made this webapp suitable for smart phones: http://www.bustracker.org/

The course he is teaching right now is a sophomore level data structures course. He's still teaching them the basics of computer science and programming. Walk before run and all that.


"If we are to use a medical analogy, imagine interviewing an experienced surgeon...'What is a scalpel?...show how you'd cut in that case'...such a situation is unimaginable."

I don't think this analogy tells us anything (other than that's not how the medical field works). Why is this situation unimaginable? What exactly would be wrong with this approach?

I do a couple tech interviews a week and I regularly see people with very impressive resumes that can't code fizzbuzz. The job largely consists of writing software; I see no problem with requiring candidates to demonstrate that they can, in fact, write software. I don't want to work with people that think they're too good to demonstrate their skills.
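For readers who haven't seen it, FizzBuzz is the canonical screening question referred to here: print the numbers 1 to 100, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal sketch in Python (one of many reasonable forms an interviewer might accept):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # multiple of both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

if __name__ == "__main__":
    print("\n".join(fizzbuzz(100)))
```

The point of the question is not the solution itself but whether a candidate can produce something this simple fluently.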


I just interviewed someone and literally asked fizzbuzz, and their answer took the better part of an hour. Any tips on cutting an interview short without insulting the interviewee?


Give them X minutes (you don't have to tell them X) and then at X minutes simply say, "OK... that's fine. We don't need to solve it here, but I just wanted to get an idea of your coding process. Before we wrap this up, do you have any questions for me?"


Whenever a candidate can't get it on their own, I give progressively better hints until they come close. This helps you gauge how many hints they need in comparison to other candidates and it moves the process forward.

Then move onto a different topic or skill. No one is good at everything or experienced in all fields. Plenty of candidates will do poorly on one question and then nail the next.


This stupid insistence on ignoring the background and experience of interviewees and instead ranking them based on how fast they can write code on a whiteboard is the reason why 1) Google can't polish a product to save their lives and 2) every time I look at the code they produce, especially small projects, I cringe at the lack of maintainability, header standardization, or design coherence. Keep thinking the best people to hire are the ones who can spout basic CS theory at a moment's notice on a whiteboard, and nothing will change.


Can you imagine asking the surgeon: “What is a scalpel? What kind of scalpels are there? Now, cut this cadaver as if you were to perform an appendectomy. Good. Now imagine that there were some complications–show how you’d cut in that case.” I hope you are having a good laugh, because such a situation is unimaginable.

Not unimaginable at all. In a world where surgery was run like software "engineering" [1], we would interview surgeons this way.

Why don't we interview surgeons this way?

(a) Because surgical training is via apprenticeship. For any given surgeon, you can ask "which group of other surgeons taught you everything you know? Which ones signed sworn statements that you are minimally competent?" Then you can find those senior surgeons and ask them candid questions about the candidate. This is always possible. If a surgeon turns up without such references they don't get as far as an interview. ("I taught myself surgery from blog posts, and I practiced on my cats?" Eeeek. NO HIRE.)

(b) Thanks to the long history of medicine and its very serious life-and-death implications, surgery is a very thoroughly attested event with a very definite chain of responsibility. Look at a hospital's records for a given surgery, and you will get a complete list of everyone in the room. If the surgeon was the only surgeon in the room, you can be fairly confident that they handled the operation themselves. And though I'm not sure there's an actual law that forbids a surgeon from handing the scalpel to a nurse, or an intern, and having them do everything, that would (a) not be something they could do in secret; everyone in the room would know and (b) the surgeon-of-record would still be the responsible party: If something went wrong in surgery guess who would be sued?

What I'm saying is: It is hard to bluff being a surgeon. Which is not to say it doesn't happen: For various all-too-human reasons, doctors and nurses do cover for each other, and scandals occur. But it's not like software, where the level of bluff is really, really large. Software "engineers" are often self-taught by necessity, especially in the tasks they actually do all day; their work has no legally defined standards or practices; there are so many ways to solve a given problem that two engineers with the same job title and official responsibility may have completely different skillsets; they work largely inside their own heads where nobody can see what is going on; they can very easily copy others' work, or sit in the corner while the rest of the team does the heavy lifting and then claim full credit.

---

[1] For this post, I'm going to use these scare quotes to distinguish "software engineers" like myself from actual Professional Engineers, who are like surgeons and are not, in fact, asked to build little bridges out of Lego bricks during their job interviews.


Here (in Quebec) the title of "software engineer" or "computer engineer" is regulated like any other professional engineer, ie. you need a bachelor's degree in computer or software engineering, which is a year longer than CS and includes stuff like engineering economics or analytical and numerical solving of differential equations, and two to three years of experience working under a professional engineer's supervision. Microsoft actually had some trouble with us: http://www.microsoft.com/canada/learning/quebecmcse/default....

I think it's a good thing. We still have regular programmers, and the word "engineer" actually means something.

Still, the level of competence is far from that of surgeons. It's possible to do a bachelor's with fairly little programming: at least where I studied, most courses include a large portion of teamwork, and I'm sure quite a few teammates passed their course thanks to me (or other people) doing all the work. When they graduate or enter internships, they just do the less technical work, like evaluating software packages, testing, or reading bids. I don't think you could find surgeons who can't operate like you can find engineers who have trouble with fizzbuzz.


Computer engineering and software engineering degrees differ from CS by more than just another year. CE degrees in Canada are pretty much dressed-up versions of electrical engineering degrees with a few programming courses thrown in; they go pretty deep on the hardware side of things, but the programming material is covered in a somewhat handwavy way. Software engineering degrees, on the other hand, are something I just don't get: they require too many unrelated courses, like chemistry and business-esque subjects. I think a CS degree is the sweet spot for programmers. I had had enough of computer hardware after taking computer organization courses; building circuits, which CE majors spend a lot of their time on, is of no interest to me.


Here in Melbourne, SE is a 4-year version of CS which requires you to take 2 years of math, 1 year of physics, some EE subjects, and two projects with industry clients. Most of the time little distinction was made between the degrees; they were referred to collectively as "the CSSE department". Computer Engineering, on the other hand, was very close to EE.


So when do they actually do CS?


Bad analogy (on the author's part re: surgeons). That example is precisely like the oral portion of board exams for specialties that have them. When you hire a board-certified surgeon, they have already demonstrated the ability to jump through this (important) hoop.


I didn't want to just guess this - there's enough misinformation driving this particular metaphor - but in fact I was pretty darned sure that, in addition to all the stuff I mentioned, every surgeon travels with the equivalent of a formal certification that he or she has passed the surgical equivalent of fizzbuzz.

Thanks for the confirmation.


Besides all this, a CS degree, from what I hear (I don't have one) often teaches you a lot of concepts but not a lot of practical skills. Which isn't to say it isn't valuable, but it may not have prepared you fully for work. You may not know anything about version control, for example, or agile development.

Plus, it's always possible that you got your CS degree and aren't actually all that smart. I'm not a genius, but when I'd been programming for maybe a year, my ex-Googler friend asked me a Google interview question and said I did better than a PhD in CS that he'd recently interviewed.

So education and credentials don't always prove that you can solve problems or do everyday work in this field. Which is why I'm GLAD when an interviewer wants me to solve real problems in the interview. If I don't have to, chances are my coworkers didn't have to, either...


> Besides all this, a CS degree, from what I hear (I don't have one) often teaches you a lot of concepts but not a lot of practical skills.

If my own background is anything to go by, CS is almost a textbook university subject: it presents a lot of theory and academic analysis, much of which certainly does underpin practical software development, but the emphasis is not on teaching specific applications, nor on teaching the supporting practical skills required to implement those applications. These things are experienced to some extent via examples that accompany the theory, and to a greater extent via reading around the subject and experimentation where the course has provided enough context to direct the student's personal study.

IMHO, one of the major differences between good and bad courses is that those that are taught well will also encourage the student to explore useful applications for what they are learning in this way, and thus to experience not only the practical implications of what they are learning but also the surrounding tools, processes and techniques that go with them in the real world.

Someone with a broad foundation in the theory and experience of varied practical applications should have no difficulty adapting the same theory to other applications, nor to newer theories that have evolved through industrial R&D and not yet made it back into academia. Whether a recruitment process is any good at recognising this important ability to adapt and learn independently is an entirely different question, but any remotely useful process should be able to weed out the people who managed to spend 3-4 years studying theory without experimenting practically and reading around their subject, and avoiding hiring those is one of the most important basic goals of any software developer recruitment exercise IME.


The idea behind an interview is to understand how proficient you are in articulating your thoughts, and how well you can handle difficult problems. No good interview asks only simple questions. The hard ones are there for the interviewer to understand how experienced one is in handling newer situations, and how he/she performs in a "virtual" stressful situation. It's not a litmus test, but probably the best an interviewer can do in 60 minutes. These skills only come with practice, and if school can help accelerate that -- great!

My only concern is that for introductory courses, programming must be a "fun-only" affair -- let students explore, crash and burn the computer rather than worry about small details.


Enjoyed the article a lot; I'm currently a 3rd-year CS student. Although I understand that it's not always practical to follow lessons with immediate or separately set practicals, I find these are the lectures in which I excel.

We also had a lecture this year involving role-playing a RISC-1 processor pipeline. It felt stupid at the time, but I don't think I'll ever forget the purpose of an out 1 internal register.

I call for a new structure to replace sit-down-and-listen lectures.


I'm all for it but please replace 'programming interview' with 'a job as a programmer'.

Edit for clarity: I'm not sure interviews are the ideal test-case for programming performance and thus I'm not sure it makes sense to prepare students for that.

Along the same lines I wouldn't agree with professors preparing their students for the GRE because that has little real-world relevance.


The method isn't about preparing students for a programming job, but about preparing them for real-world interviews.


This is exactly what we'd do in tutorial classes for most of the CS subjects I took during my degree. I can see it being harder to do in lectures, especially in early years, where large class sizes make it more difficult to keep everyone engaged.


Current software engineers are roughly equivalent, in terms of field maturity, to the 'doctors' who put leeches on people in order to draw out bad humors. This isn't an insult; rather, it's just amazing to have been around during this phase. Interviewing reflects this maturity, too.


Current software engineers are roughly equivalent, in terms of field maturity, with the 'doctors' who put leeches on people in order draw out bad humors

Balls. The medical profession killed more people than it cured until some time between the 1930s and 1950s, with the discovery of antibiotics. Walmart's supply chain management software alone has done more good than the entirety of the medical profession up to 19##.


Walmart's supply chain management software

This is what we call cherry-picking an example. Someone who wanted to argue with you might select Windows autorun and its role in the world of illicit botnets and keyboard logging.


Autorun bad and all, but Windows was pretty obviously a huge win as a productivity multiplier. I'd be kind of surprised if Windows hasn't done more for GDP[1] than Walmart.

Practically any commercial software package ever sold probably had a greater positive effect on the world than pre-19## medicine. This will hold more strongly for shrinkwrap software that stays around long enough for the company producing it to build a reputation.

[1]Or total world utility/happiness, if we could measure that


Considering that only 16% of software projects are completed on time and under budget [1], I'd say we kill more people than we cure, too, metaphorically speaking.

[1]: http://www.projectsmart.co.uk/docs/chaos-report.pdf


I wasn't talking about public good, I was talking about relative maturity within a field.


I imagine doctors would gladly trade this maturity for the luxury software engineers have: having built everything themselves.

Alas, a lot in modern medicine is not really much better than applying leeches; hence the articles "X is good for your health" followed two weeks later by "or maybe not".

You can compare medics to a bunch of programmers who have to maintain billions of copies of a self-replicating and self-modifying program with billion-year-old legacy code. No source code is available, only obfuscated fragments, with no clear understanding of which bits are significant and which are just junk. Disassembly is hard, and no debugger is available, so the only way to figure out how the system works is to modify a bit and watch how the output changes.

Some defects manifest quite often and a few bugfixes are ready, but how they work and what side effects they have is not always clear.

Systems are very fragile and must be treated very carefully in order to avoid a fatal crash. Hence the "maturity" of the field: one has to be very careful when dealing with stuff that is not fully understood.


This is a very common meme, but there is some evidence to show that it isn't true.

Most fields (including software engineering) are pretty good at solving problems that have previously been solved. Generally speaking civil engineers are pretty good at building bridges that stay up for example (with occasional exceptions of course!). Similarly, software engineers are pretty decent at building accounting systems etc.

When any field attempts to do something new, software engineering is pretty much as good as any other field. Even fields as mature as mechanical engineering make mistakes.

For example, the space shuttle Challenger disaster was caused by defective O-ring seals, and delusional engineering management who systematically ignored warnings.

Richard Feynman (who knew something about building systems at the edge of what was possible) said the following, when comparing the software engineering process with other processes inside NASA:

The software is checked very carefully in a bottom-up fashion. First, each new line of code is checked, then sections of code or modules with special functions are verified. The scope is increased step by step until the new changes are incorporated into a complete system and checked. This complete output is considered the final product, newly released. But completely independently there is an independent verification group, that takes an adversary attitude to the software development group, and tests and verifies the software as if it were a customer of the delivered product. There is additional verification in using the new programs in simulators, etc. A discovery of an error during verification testing is considered very serious, and its origin studied very carefully to avoid such mistakes in the future. Such unexpected errors have been found only about six times in all the programming and program changing (for new or altered payloads) that has been done. The principle that is followed is that all the verification is not an aspect of program safety, it is merely a test of that safety, in a non-catastrophic verification. Flight safety is to be judged solely on how well the programs do in the verification tests. A failure here generates considerable concern.

To summarize then, the computer software checking system and attitude is of the highest quality. There appears to be no process of gradually fooling oneself while degrading standards so characteristic of the Solid Rocket Booster or Space Shuttle Main Engine safety systems. To be sure, there have been recent suggestions by management to curtail such elaborate and expensive tests as being unnecessary at this late date in Shuttle history. This must be resisted for it does not appreciate the mutual subtle influences, and sources of error generated by even small changes of one part of a program on another. There are perpetual requests for changes as new payloads and new demands and modifications are suggested by the users. Changes are expensive because they require extensive testing. The proper way to save money is to curtail the number of requested changes, not the quality of testing for each.

http://history.nasa.gov/rogersrep/v2appf.htm

Edit, TL;DR: To summarize then, the computer software checking system and attitude is of the highest quality. There appears to be no process of gradually fooling oneself while degrading standards so characteristic of the Solid Rocket Booster or Space Shuttle Main Engine safety systems.


Another insightful text about the software process used to create the Space shuttle software systems: http://www.sinz.org/Michael.Sinz/Software.html



