
After programming for 20 years, I've noticed that programmers are aware of the potential benefits of any abstraction that they have internalized, and blithely unaware of the costs. Because, having climbed that learning curve, it is now free to them.

The result is that programmers introduce complexity lightly, and when they walk into a new language/organization/etc are inclined to add random abstractions that they are used to from past experience.

And yes, I am not immune.




I wouldn't look at it that negatively. Programming is like speaking a language. I don't mean which programming language you "speak" (i.e. C++, or Java, or whatever), but how you're used to speaking in it.

Like a natural language, you probably prefer certain idioms, you have your own style, and you know certain words better than others. This is similar to how we program: we know certain algorithms yet are oblivious to others, we have our favorite abstractions, and so on.

There are downsides to these habits in both spoken languages and programming languages, no doubt: you can get stuck in a rabbit hole without seeing that you're in one. In the end, a similar approach works for both. Read and write widely if you want to be great at either, meet different groups, work on uncomfortable projects; you know the drill.

Just like how you expand your language, and therefore your world, socially.


> "After programming for 20 years, I've noticed that programmers are aware of the potential benefits of any abstraction that they have internalized, and blithely unaware of the costs."

'Lisp programmers know the value of everything and the cost of nothing.'


Underrated comment!

Here’s the footnote:

“Perlisisms: EPIGRAMS IN PROGRAMMING by Alan J. Perlis”

http://www.cs.yale.edu/homes/perlis-alan/quotes.html


Your criticism of abstractions is very much in the abstract.

What abstractions?

What are their costs? In human comprehension, in runtime, in reliability, in memory/CPU?

Can you give examples which I can usefully learn from, so as to avoid them?

TIA


Any and all abstractions. The cost is usually in all of the above.

Examples that I commonly encounter include OO, closures, dependency injection frameworks, complex configuration systems, various code generation systems, and on and on and on.

In general the tradeoff is this. Those who have internalized the abstraction can think about more complex things. Those who have not internalized it find it hard to figure out how the system works at all until they do. So when you work on code that has a lot of abstractions under the hood, it becomes either a black box (that you occasionally dig into) or (very often) a requirement that you understand X before you can even start to work on the code.

A rule of thumb that I use is how long the stack backtrace is. If every bug creates a stack backtrace that is dozens of frames deep, there are a lot of abstraction layers in place. And when all of the layers are actively being worked on, it adds up - quickly.
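To make that rule of thumb concrete, here is a tiny hypothetical JavaScript sketch (all names invented): each wrapper layer contributes one more frame to every backtrace, whether or not it helps the diagnosis.

    // Each "layer" is one more frame between the bug and the person reading the trace.
    function getUser(id)            { return userService(id); }          // controller layer
    function userService(id)        { return userRepository(id); }       // service layer
    function userRepository(id)     { return ormFindById('users', id); } // data-access layer
    function ormFindById(table, id) { throw new Error(table + ': connection lost'); }

    try {
      getUser(42);
    } catch (e) {
      console.log(e.stack); // three wrapper frames above the line that actually failed
    }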

Deciding whether given abstractions are worthwhile for a given problem involves a judgment call. Unfortunately the people who are in the best position to make those calls tend to be the most senior, and tend to be the least aware of the costs of the abstractions that they add.


A nice reply, upvoted.

I would note that if you start your list with OO then you're including abstractions that almost all professional programmers would consider not abstractions but basic tools. It's impossible to work without such abstractions, except by working in purely procedural code, and even then...

I would add that if "a requirement that you understand X before you can even start to work on the code" is the case, then your abstraction has - arguably, I may be wrong! - failed. My DOM iterator abstraction would take some effort to understand for whoever maintained it, which was my concern, but it was a very simple black box to use.


A simple example of an abstraction for which this is not true is the MVC design. If you don't understand how it works, you don't know where to begin looking to make a change to the system.

And yes, some abstractions truly are basic tools. And the more experience you have, the more basic they seem, and the more such tools you have.

I am not arguing against abstraction per se. What I am arguing is that abstractions bring a cost, and that cost adds up. A project needs to find an appropriate balance, and there is a tendency for experienced programmers to draw that balance at a point that may or may not be appropriate for the people who have to maintain the system after them.


Which is why what really matters in programming is "communities of shared abstractions."

This is part of what you get with a language, but it depends on the language. (We've seen JavaScript users split into several such communities, I'd say.)

Or it can be what you get with a framework -- one of the main values of Rails is that people who have learned its abstractions can look at each other's code and understand it. This applies to third-party extensions to Rails that get especially popular too, like say Devise. (The downside is when those abstractions aren't good and you want to use something else instead... now everybody finds your code difficult to understand.)

When we talk about a language having a good stdlib, a lot of what we're talking about is providing a good set of abstractions that everyone will learn, making their code understandable to, as well as interoperable with, other developers'. JS's lack of much of a stdlib may be not unrelated to its schism into several communities of shared abstractions...

I don't think it's really about "minimizing abstractions" -- it's not even possible to do so. It's about which abstractions, and how. The important point you make is that abstractions understood by a community of programmers -- one that the people who work on your code are likely to come from -- in practice have a lower cost than abstractions that will be unfamiliar to them.

I remember when OO was super confusing to me...


Replying to myself as I thought of an example from my own past which is relevant.

After discovering the eye-opening expressivity of functional programming years ago with Dylan and Haskell, I had a substantial project in javascript. I found JS supported first class closures and lambdas, which allowed me to make iterators.

The novelty was that they were iterators not over lists but over trees (DOM trees in this case, but of course any tree could be made iterable).

I understood there would be a learning cost for anyone who took over from me, so I left plenty of docs and pointers. It would not be a small cost either: my successors would have to learn to think differently, at a higher level and in a possibly (to them) rather alien way.

But was it worth the cost? Bloody hell, yes! Not having that iterable and rather declarative abstraction over DOM trees would have greatly bloated the code and consequently brought in bugs by the bagful. I could do in a couple of lines what would otherwise have taken half a page, in many locations.
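Roughly along these lines (an illustrative sketch only, using a modern generator where the original relied on closures and lambdas):

    // Depth-first iterator over any DOM subtree, so ordinary array-style
    // filtering and mapping work on trees too.
    function* walk(node) {
      yield node;
      for (const child of node.children) {
        yield* walk(child); // recurse into each subtree
      }
    }

    // A couple of declarative lines instead of half a page of nested loops:
    const links = [...walk(document.body)].filter(n => n.tagName === 'A');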

So it had a human cost, but if you could overcome that, a seriously huge human benefit.


haha did your successors think it was worth the cost?


What happened was interesting and unexpected. I was building on top of an application (call it H) which had the DOM trees to manipulate, and JS to manipulate them with. I was reporting new bugs in H continuously, 1 or 2 every day.

Because of the abstractions I'd built (it wasn't only tree iterators), I could largely work around the bugs invisibly - I pushed the special-case bug-handling code down into the abstractions, so it was invisible to anyone using them.
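Roughly like this, as a purely illustrative sketch (the real checks depended on H's specific bugs): the workaround lives inside the iterator, so callers never see it.

    // Hypothetical guard; the real predicate depended on H's specific bugs.
    const isWellFormed = node => node && node.nodeType === Node.ELEMENT_NODE;

    function* walkSafe(node) {
      if (!isWellFormed(node)) return; // quietly skip the host app's broken nodes
      yield node;
      for (const child of node.children) {
        yield* walkSafe(child);
      }
    }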

That made it all too viable to continue using a crappy, flaky product far longer than would have been possible - or sensible - without those abstractions. I'd turned the abstractions' value into a liability!

I finally told them it wasn't worth continuing with H, they junked it AFAIK and I walked away.


More importantly, I suppose, did the company?


The absence of abstraction being what, exactly?

Also beware of that kind of pseudo-wise thinking. Is there a point in climbing up a learning curve anymore? Maybe you have a constant and massive stream of fresh bodies to throw at your "simple" solutions, before laying them aside once those abstractions have clogged up their heads... And maybe if you can use unskilled workers at such a scale, it's because you have massive funding as well...

More a question of financial optimization than of software engineering, I think.


Before dismissing it out of hand, take a look at the Go language. It was designed to make specific kinds of common abstractions hard exactly because, when working at scale, programmers routinely create disasters by layering abstractions in a way that nobody can understand the consequences of.


This is exactly the kind of pseudo-wisdom that I read the GP as referring to, though.

In the case of Go, the core team saw the pain of indirection-masquerading-as-abstraction in complex Java/C++ codebases and considered the whole thing to be a boondoggle. As a result, we've been saddled with a popular language in which two massive projects (gvisor and kubernetes) have had to hack their own expressivity into the language just to build complex software (i.e. codegen'd generics).

I worry about the cyclic nature of progress in our industry, where wonderful advancements can be made and then walked back or under-utilized because we aren’t patient enough to learn them thoroughly.


Go is Java 1.0 all over again.

If its enterprise adoption ever goes beyond Kubernetes and Docker, expect GoEE and Go Design Patterns to make their appearance.

Worse, since its plugin support is really limited, expect any enterprise-grade CMS to be built on hundreds of processes.

This happens all the time with simple languages: tons of library boilerplate code.


That means you have probably not been programming C++.

In C++, the cost of abstraction is a fundamental consideration. While you can still ignore it, most resources about using C++, and gradually even the language itself, tend to nudge you towards avoiding abstraction costs in various ways (sometimes ugly, sometimes elegant). Plus, core language designers and compiler architects bend over backwards to reduce the cost of various abstractions to nothing or very little.

Plus, it sometimes happens that you can use stronger abstractions to _reduce_ cost rather than increase it.



