Hey, Joe! How did that interview go? Any promising candidates in the pipeline?
Joe: Not that great. They didn’t even know the fundamentals.
Joe: Yeah, I asked them to explain ACID and all I got was a blank stare.
Ah yes, the fundamentals: that endless list of things that everybody working in the field ought to know. Atomicity, something, something, Durability? To avoid any future embarrassment, I made sure to dust off my Wild Hog book 1 and re-read the section on ACID. Here’s an excerpt directly from the book:
The safety guarantees provided by transactions are often described by the well-known acronym ACID, which stands for Atomicity, Consistency, Isolation, and Durability.
…Today when a system claims to be “ACID compliant”, it’s unclear what guarantees you can actually expect. ACID has unfortunately become mostly a marketing term.
After that passage, the book goes on to explain all the potential situations and problems you may run into when dealing with databases. It’s an interesting read for sure (and a necessary one for anyone designing distributed systems), but even if I knew the exact definition, I would be none the wiser.
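To make the “A” in that acronym concrete, here is a minimal sketch of atomicity using Python’s built-in sqlite3 module; the table, the account names, and the simulated crash are my own illustration, not something from the book:

```python
import sqlite3

# Two accounts; we will attempt to move 100 from alice to bob.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # transaction: commits on success, rolls back on exception
        conn.execute(
            "UPDATE accounts SET balance = balance - 100 WHERE name = 'alice'"
        )
        raise RuntimeError("simulated crash")  # the matching credit never runs
except RuntimeError:
    pass

# Atomicity: the half-finished transfer was rolled back, so neither
# the debit nor the credit is visible -- all or nothing.
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # {'alice': 100, 'bob': 0}
```

The point is that the debit, although executed, never becomes visible on its own: the transaction either applies completely or not at all.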
A couple of years ago, I had to design an archiving system for storing lots of data coming from the hardware that powers a modern telescope. It was a small ecosystem integrating various off-the-shelf products, since we couldn’t find a one-size-fits-all solution that covered our requirements. It took a long time of arguing with the customer and it cost me my sanity, but other than that, it wasn’t a particularly groundbreaking piece of software.
Yet, despite coming up with a distributed archiving system that works in production, I wouldn’t be able to explain off the top of my head what the ACID acronym means. Is it really such a fundamental thing, then? It turns out my day-to-day work doesn’t require deep knowledge of database internals, and I can mostly treat databases as a black box with an API 2.
But that’s only because you are a noob and you don’t deal with the things that I have to deal with.
Someone on the internet
Exactly. The field is vast, and that makes it easy to ask questions about things a candidate has never touched before, yet consider them fundamental. I mostly program complicated control systems, where I have to turn vague requirements into working software. The performance requirements are usually known in advance, and they are modest compared to the “always on” social networks used by millions of concurrent users.
As a consequence, figuring out “what” we have to build is usually harder than figuring out “how” we are going to build it. A lot of programming jobs out there are not really about coming up with groundbreaking solutions, and once you understand the “what”, the “how” will often be quite obvious; you can always fill the gaps in your knowledge as you go along. After all, we are supposed to be engineers, not walking encyclopedias.
How many similarly vague “fundamentals” do we have floating around? Plenty: RESTful, MVC, and SOLID are the first that come to mind. Start a discussion about the meaning behind those terms and you will find yourself in the middle of a food fight where everyone has a slightly different opinion and no consensus is ever reached.
For the majority of programmers out there, something being RESTful means that you can feed it chunks of JSON over HTTP. The minority who have spent some time reading about it will complain that it’s an architectural style defined in Roy Fielding’s thesis and that it has nothing to do with JSON or HTTP 3. How many of you have actually read his thesis? All of it? No skipping?
I am quite sure that most programmers haven’t read it and have no intention of ever doing so, therefore the “chunks of JSON over HTTP” interpretation is here to stay. If these terms are so vague that so many interpretations are possible, maybe we should replace them with ones that cannot be understood in 20 different ways. As it stands, I don’t think we can really treat them as fundamentals, because fundamentals are not supposed to be open to interpretation.
Let’s leave the acronyms and programming theories aside for a moment and ask ourselves another question. Is knowing how your computer works essential to your work as a programmer?
As with everything in life, there are multiple answers to this question: yes, no, and maybe. Programmers whose work mainly consists of squeezing performance out of their hardware will naturally gravitate towards: “Yes, every programmer should know how their hardware works, as one cannot write efficient software without that knowledge 4, and I wouldn’t be able to do my job if I didn’t know anything about computer internals, yak, yak, yak…”
On the other hand, not every programmer deals with performance issues as part of their job. Some have specialized in writing GUIs and are more concerned with the user experience and making user interfaces pretty. They may not even have a computer science background, yet they have managed to learn enough about programming to be hired and to do their part of the job. I am glad they did, as I don’t enjoy pushing pixels around the screen all day long.
The first group likes to look down on the second group, because those who don’t know the nitty-gritty of computer science and don’t start the day with a profiler in hand are not real programmers. The second group likes to look down on the first group, because somehow they don’t seem to understand that the Hot Dog Stand color scheme is not a sane choice for a user interface 5.
Each member of the aforementioned groups would benefit from learning a little from the other side, since when you are trying to ship a widely used product, all that knowledge will come in handy at some point. But humans are prone to confirmation bias and like to overestimate the importance of their own work, hence here we are. I guess we would be much happier if we dared to admit that one man’s fundamentals are another man’s trivia 6.
Martin Kleppmann: Designing Data-Intensive Applications ↩︎
I am not saying that knowing the definition of a term is a signal of deep knowledge, but it’s easy to extrapolate and think that the rather obscure field you work in is something everybody needs to know. ↩︎
There are plenty of computer users who use their machines as tools without knowing anything about their internals. Scientists are a great example: during their research they gather data which they then have to analyze somehow. Sometimes they can analyze it manually, sometimes via little Excel messes, and sometimes they have to write a program in whatever programming language they know.
Knowing how their computer works would let them write more efficient scripts that would perform the analysis faster, but unless such an analysis takes days, optimizing the execution speed of a rarely executed script is a waste of time. That time would be better spent elsewhere (e.g., on learning more about the problem domain). ↩︎
There is another fundamental that is ignored by a lot of computer users. I think everyone who spends a lot of time behind a computer should learn to touch type, since typing is still the main way of interacting with a computer. Yet seeing a great touch typist in action is a rare sight, even among professional programmers. ↩︎