How often do doctors have to prove they can still recall every obscure medical fact when changing hospitals? When a lawyer moves to a new firm, do they have to retake the bar or answer weird hypothetical cases all over again? Aren't their credentials, past cases, and reputation usually enough? Why do we in tech keep forcing experienced professionals to jump through these puzzle-like hoops instead of trusting their track record?

Doctors and lawyers are licensed, software engineers are not. If a doctor or lawyer screws up they will often face sanctions and lawsuits, whereas a software engineer can just jump ship to a different company and let their resume do the talking. In general it is way easier to get 3 years of experience as a software engineer than it is to go to law school and pass the bar.

I am sympathetic to the argument that software engineers should be licensed, which would reduce the need for dumb technical interviews. But I imagine HN wouldn't like that very much.

> I am sympathetic to the argument that software engineers should be licensed, which would reduce the need for dumb technical interviews. But I imagine HN wouldn't like that very much.

Oh yea, HN hates this idea, but I'm sure it would cut down on all the screening, FizzBuzz, and hazing that happens during interviews. If you had a basic, barebones certification or exam that all software engineers must pass, one that at least says "this candidate can function at a very minimal level," it would at the very least filter out the 50-75% of candidates who literally cannot code or even speak coherently about the basics of programming.
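For context, FizzBuzz is the canonical screening exercise being referred to: print 1 through N, substituting "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both. A minimal Python sketch (the function name and return-a-list shape are my own choices, not part of any standard):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

The point of the exercise is exactly that it is this small: a candidate who cannot produce something equivalent is the population such a certification would filter out.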

I think people here are really in love with the romantic ideal that someone with no college, no credentials, no formal-this or certified-that can (in theory) jump right into a senior FAANG job. Yea, that's great I guess, but in practice does it really work? It seems to me that in practice, it just means every company gets thousands of totally unqualified candidates every time they post a job offer, and this helps nobody.

Isn't there a business opportunity somewhere, providing exams and certificates for software developers? I find it surprising that many rich companies have this problem, and there is no standard solution yet.

How much would it cost, e.g., Google to start an independent certification company with branches in 20 different cities across the USA? The top candidates could also be offered a job at Google, but every candidate would get a certificate confirming their skill level that many other companies would probably be happy to accept.

At first, the testing could be done for a symbolic cost, because Google already spends money interviewing its candidates anyway, and this would serve the same purpose. Later, when many companies start accepting the certificate, you could let the candidates pay the full cost.

The first rounds of testing could be done at a computer (provided by the testing company), you would just make sure the candidates are not cheating. Only those who achieve a sufficiently high score would be later interviewed by humans. This would be way more efficient than interviewing everyone individually.

What's curious is that in the US there are many IT trade certifications (e.g. A+) and many engineering certifications (Professional Engineer) but software engineering falls into a weird gap. I think IEEE offers some certificates but they don't seem especially valuable.
Oracle has Java certs. That's something MS could also do with CSharp, AdaCore with Ada,...

How useful that stuff is or would be in practice, I don't know.

They kind of offer this service for free already. Who's going to pay them to do it? :)
> "How often do doctors have to prove they can still recall every obscure medical fact when changing hospitals?"

Doctors go through 5-7 years of residency with long hours, low pay, and a full-fledged physician watching them who can fail them. And that's after an undergraduate degree and medical school.

If you want to endure a 5 year miserable grind just to avoid interviews, go for it.

Doctors, lawyers, teachers and other licensed professionals do continuing education every two or three years. Software engineers are not licensed professionals, so there is no legal standard of quality that all software engineers are guaranteed to have met (and continue to meet). Hence, the interview is an assessment along with all other parts of the application.
“Continuing education” for doctors could mean someone lectured in the university lecture hall for 45 minutes while you nodded off, then you got your free slice of pizza while signing your name on the continuing-education form circulating around the room. It’s a far cry from school.
Because it's much easier to get by faking it as a software engineer than as a lawyer or doctor. We've all had these colleagues, and I've seen them jump to other companies easily.
I don't know if that's true. I'm not saying software engineers don't fake it, but I think many doctors and lawyers are faking it too, if we're defining faking it as looking up information to assist with your core job responsibilities. I don't consider that faking it, but I think that's what we're talking about, isn't it?

One time I asked my optometrist what he was doing when he left me in the examination room for 15 minutes. He said he was looking up the details of the medicine he was going to prescribe me. I don't consider that incompetence, and it sounds a lot like what I do when I need to know about some technical issue. Everybody in my industry knows that Googling stuff is a huge part of the job, but you're not allowed to acknowledge that in an interview.

I think it is actually an interesting and relevant question: why don't past performance or job references play more of a role in hiring, compared to the technical screens many people hate and think are pointless? Checking references and calling former employers, if it's done at all, is often a pro forma final step rather than the first step, which might plausibly be a more effective way to screen candidates.

I don't think that's faking it at all. Being able to research is part of any knowledge based profession.

Faking it is shirking your duties. Most bad medicine doesn't result in obvious harm/cost to a patient or a board complaint. A poorly managed medical practice that runs off support staff and makes patients wait three hours isn't going to threaten a license, and that's faking practice management. A primary care doctor who punts anything beyond the very, very basics to third party vendors or specialists is faking it.

In my major metropolitan area, general practice veterinarians are often absolute crap at doing their (required) medical notes. If you ask them for copies, they'll send you invoices. If you push for real notes, they'll use their 48 hour response window to make something up. They almost always get away with it because during future visits, the vet (whether it's the same person or not) will ask a bunch of questions to (necessarily) fake knowing what was done previously. It's obviously going to affect patient care and cost, and if you look at board complaints that aren't thrown out, the "punishment" is usually because the board saw that the vet has shit for notes. And don't even get me started on vets using AI, which results in hallucinations in the notes -- which I think are notably worse for patient care than just empty notes.

I think that's a good example of even good doctors faking it.

Being able to do the work with reference materials is fine. Maybe if you spend all day on stack overflow asking questions and waiting for answers, not that great. But it depends on the level of work; for me, a lot of my real work problems either have no search results or lead to only reports of questions with no answers. Formal documentation is typically absent, although sometimes there's insufficient or misleading documentation instead. Someone who can't figure things out in these conditions wouldn't be able to do my job.

In my mind, faking it is learning what to say in interviews to get the job(s) without having the skills and ability to do the work.

That said, when I interview, I'm really looking for whether they can describe the output of a program before they write the program, and then whether they can write a program that works as they said it would. Also, the program should have a loop inside a loop, because nested loops happen all the time and too many candidates can't deal with them.

Candidates who can't manage a nested loop, or can't write a program that does what they said it would immediately before starting to write it, but somehow got into my interview, are faking it in my book. Some candidates are clearly just having a bad day, and some are probably having a bad day but not as clearly.
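The kind of nested-loop exercise described above can be as simple as building a multiplication table; a hypothetical example in Python (the specific task is my illustration, not the commenter's actual question):

```python
def multiplication_table(n):
    """Build an n x n multiplication table using a loop inside a loop."""
    table = []
    for row in range(1, n + 1):       # outer loop: one pass per row
        cells = []
        for col in range(1, n + 1):   # inner loop: one cell per column
            cells.append(row * col)
        table.append(cells)
    return table

print(multiplication_table(3))
```

Describing the output first ("a 3x3 grid where cell (r, c) holds r times c"), then writing code that matches the description, is exactly the two-step the commenter is screening for.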

References are hard because (a) some people fake them, and (b) fear of litigation means many companies prefer to provide only start and end dates and maybe eligibility for rehire. Also, in the modern world nobody answers their phone, so the process becomes very asynchronous.

Doctors and lawyers build up immense amounts of "open-source" work that proves their skill. Almost everything a lawyer writes becomes public at some point. Doctors have patient results and billing records that can be checked on a reference call. Software engineers comparatively work in the dark and rely on softly-defined metrics.
I don't think patient results can legally be shared (in the US) on a reference call with any specifics, and vague statements are no better or worse than a manager describing a dev's performance on projects.

And I'd be kind of surprised if a practice shared billing performance info. That'd be like sharing a tech support person's ticket close rate, but even more sensitive. Maybe at a very high level if HR isn't concerned with defamation claims?

Patients cannot, but other doctors see these outcomes and can comment on them in vague terms. Administrators can also comment in vague terms.

It's all anonymizable.

New doctors, conventional engineers, and lawyers only come with analytical status, though. That hiring process focuses on what you're like to work with as a person. I recommend that formula because those domains know how to lay out a firm that can avoid malpractice. Those firms compete on problem solving, where 10x skill or availability might eventually mean 10x salary.

Firms which associate disruption with profit are altogether different. No such thing as malpractice; and if there was, they’d prefer to hire the “risk taker”.

There’s a lot of doctors in my family. They all carry malpractice insurance. I’m told that malpractice lawsuits have a lot more to do with bedside manner than actual performance.
