On Design Patterns in C++ (fluentcpp.com)
79 points by mnouquet on Dec 18, 2020 | 67 comments



I wish we had a new Design Patterns book in 2020.

The GoF book is well and truly obsolete: since then OO has faded, JS & Python are the most common tools, and even Java is unrecognizable with lambdas and functional features everywhere. Meanwhile vast amounts of memory and CPU power mean flyweights and object pools aren't so useful and immutability is now a valid design goal. The GoF needs an update.


I'd rather just see the idea of Proper Noun Design Patterns die; in my experience they've done more harm than good by not emphasizing their tradeoffs. My struggles with over-eager developers turning everything into a smorgasbord of over-abstract design patterns outweigh any benefits I ever got from them (there's a similar tendency with some functional programming enthusiasts). I think we're better off tucking these into language documentation and calling them "idioms", to keep the zealots and dogmatists from turning them into thought-terminating cliches.

> Meanwhile vast amounts of memory and CPU power mean flyweights and object pools aren't so useful and immutability is now a valid design goal.

I don't agree, they are still widely used in most languages, but are often hidden behind abstractions so that developers don't need to know about them.


This is a problem with bad developers - the general idea of patterns makes a ton of sense because pattern recognition is important and it makes it easier to reason about the properties of various parts of code.


Blaming bad developers and hoping for them to try harder isn't going to make anything better.

I totally agree that patterns are important and useful, I just think that the way they're presented and over-emphasized is net-harmful, and that by changing that we can make the situation better.


>>Blaming bad developers and hoping for them to try harder isn't going to make anything better.

Well, but it does, doesn't it?

I mean, the first step to solve a problem is to identify it. Clearly, the problem with design patterns is that clueless developers have no idea what they are talking about, thus they mount incomprehensible complaints that make no sense if you understand the basics.

Developing software is a complex endeavor that is much more than declaring variables and nesting control statements. There is way more to developing software than what is covered by an "intro to language X" or "learn framework Y" blog post.

If a programming language gives you brick and mortar, you start to talk about retaining walls and windows and stairs. If a programming language gives you gears and screws and bearings and springs, you start to talk about pumps and differentials and dampers.

So what's hard to understand about needing design patterns to talk about components and solutions to common problems?


I've heard design patterns described as "missing language features" and when you look at the GoF book, many don't make sense in languages other than C++ because they're already built-in or unnecessary.

If you're looking for a newer book on design patterns, I've heard Head First Design Patterns (2004) suggested as much more approachable. The cover image is a bit goofy and screams mid-2000s. I've only seen excerpts, but GoF, if I remember correctly, is very dry and doesn't explain some things very well. Head First Design Patterns looks to be written for Java, but I don't remember having trouble understanding and implementing them in other languages.

Head First Design Patterns: Building Extensible and Maintainable Object-Oriented Software 2nd Edition will be released on December 29. I guess this is the newer edition? It shares 2 of the authors.


> I've heard design patterns described as "missing language features" and when you look at the GoF book, many don't make sense in languages other than C++ because they're already built-in or unnecessary.

I've heard that statement before and it really never made any sense to begin with.

Design patterns describe higher-level concepts. Just because a higher-level concept made its way into a standard library, that doesn't make it go away, nor does it eliminate the need to reason about it and have higher-level discussions involving it.

Take for example C#. Does it make sense to talk about event loops in C#? Or observers? Or callbacks? Because those are all design patterns, aren't they? Great, you don't need to add a lot of boilerplate code to get to use one, but you're still using the concept and reasoning about the system in terms of these patterns and their features and properties, right?

Just because I can buy a ready-made roof instead of having to cut down trees or buy the wood somewhere, that doesn't mean that a roof is no longer a roof.
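To make that concrete in the thread's own language, here's a minimal observer sketch in C++ (the names, like TemperatureSensor, are made up for illustration): std::function does the plumbing, but the design decision to let interested parties subscribe to changes is still "the observer pattern".

    #include <functional>
    #include <iostream>
    #include <vector>

    // Subscribers register callbacks; the subject notifies them on change.
    class TemperatureSensor {
    public:
        void subscribe(std::function<void(double)> cb) { observers_.push_back(std::move(cb)); }
        void set(double celsius) {
            value_ = celsius;
            for (auto& cb : observers_) cb(value_);  // notify every observer
        }
    private:
        double value_ = 0.0;
        std::vector<std::function<void(double)>> observers_;
    };

    int main() {
        TemperatureSensor sensor;
        sensor.subscribe([](double t) { std::cout << "log: " << t << "\n"; });
        sensor.subscribe([](double t) { if (t > 30.0) std::cout << "alarm!\n"; });
        sensor.set(31.5);  // both observers fire
    }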


I agree it isn't about the language features, but designs have changed a lot. 2nd ed could be great - thanks, I'll try to get a copy when the price settles down a bit.


A lot of that stuff isn’t necessary anymore in Modern C++ either.


> Meanwhile vast amounts of memory and CPU power mean flyweights and object pools aren't so useful

I'd argue that they still are relevant. The only thing that's seen more impressive growth than memory volume is the relative cost of a cache miss. So the great irony is, one of the most efficient ways to sabotage your CPU power is to make use of all that RAM.


> The only thing that's seen more impressive growth than memory volume is the relative cost of a cache miss.

Has the cost of cache misses truly increased? (Also: relative to what?)

Today's CPUs have far larger caches than in the past (and not to mention a lot faster too), so the number of cache misses and the time cost of each cache miss for the same program on a modern CPU should be much lower—right?


It depends a lot on how you frame it. The absolute time spent waiting on memory has improved, although not at nearly the rate of other things[1]. When I was young and rosy-cheeked, the performance gap was only one order of magnitude, but it now spans more than three. So the cost of a cache miss, in terms of the number of CPU cycles lost (which is what I meant by the term "relative"), has grown by quite a bit.

Worse, while we keep adding CPU cores, there's still only the one memory bus. Even if one core is waiting on a memory access, the others can keep working - but only for as long as they don't need to access memory, either. Which may not be for very long if, for example, you're allocating lots of short-lived objects, or using persistent data structures. In memory intensive applications, things can quickly back up so that all the cores are blocked up, waiting in line behind each other for data.

If you look at the cost of this in terms of per-core CPU time rather than wall clock time, things start to look pretty icky. And this can be a devious thing, because it's an effect that isn't typically shown in the output of a performance profiler.

On the cache side, it's true that caches are bigger. But then we run into that old saw about software people's greatest achievement being to negate the efforts of hardware people. Caches get twice as big, and programmers decide the best thing to do with all that extra space is using 64-bit numbers as a matter of habit, or perhaps even switching to programming languages that don't have 32-bit numbers. Or switching to dynamic languages that cram the cache full of pointers and object headers. Things like that.

Which isn't to say that all of these practices are objectively bad - spending computing resources on human productivity is typically a very good trade-off. Just that there's no such thing as a free lunch. Memory usage still has a performance cost, and, in stark contrast to how things worked a quarter century ago, it now kicks in long before the system starts experiencing actual memory pressure.

1: See, for example: https://assets.bitbashing.io/images/mem_gap.png
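For a feel of the effect, a toy sketch (not a rigorous benchmark; exact numbers vary wildly with hardware, allocator and compiler): summing the same values from a contiguous vector versus from a linked list whose nodes are scattered across the heap.

    #include <chrono>
    #include <iostream>
    #include <list>
    #include <numeric>
    #include <vector>

    int main() {
        const int N = 10'000'000;
        std::vector<int> vec(N, 1);                  // contiguous, cache-friendly
        std::list<int> lst(vec.begin(), vec.end());  // one heap node per element

        auto time_ms = [](auto&& f) {
            auto t0 = std::chrono::steady_clock::now();
            f();
            auto t1 = std::chrono::steady_clock::now();
            return std::chrono::duration<double, std::milli>(t1 - t0).count();
        };

        long long sum = 0;
        std::cout << "vector: " << time_ms([&] { sum += std::accumulate(vec.begin(), vec.end(), 0LL); }) << " ms\n";
        std::cout << "list:   " << time_ms([&] { sum += std::accumulate(lst.begin(), lst.end(), 0LL); }) << " ms\n";
        std::cout << "(sum = " << sum << ")\n";  // use the result so the work isn't optimized away
    }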


Nowadays there are increasingly many memory buses. I think the top-end Epyc uses 8 of them, that may each simultaneously have an operation in flight, each with its own queue. Lesser Zen2s have four memory buses.


Perhaps I over-simplified, but, from what I've seen, performance benefits from multi-channel memory architectures are pretty variable and not something a programmer should typically assume.

Partially this is due to configuration variability. A lot of laptops are still configured in single channel mode. Even high-end laptops tend to be dual channel setups, even if the CPU itself supports more. I'm inclined to say that high-end CPUs configured for 4 (or more) channels are rare enough that you probably shouldn't expect to be able to enjoy one unless you're developing in-house software where you can know exactly what kind of hardware you'll be running on.

And partially it's due to contention. All the memory channels in the world won't help you when memory accesses start to pile up on the same bank. Which is, in practice, going to be something that will happen all the time if you're not taking steps to keep it from happening, because that's the kind of unhelpful jerk that stochastic processes are.


Agreed. I did not mean to detract from your point, and awareness of bus architecture is so limited that vendors usually get away with cheaping out on memory channels, so we often have even fewer than our CPU module could exercise.

In practice if you have more memory buses they are more likely to be useful to help more processes or threads make simultaneous progress than to speed up one process. For a single thread to make good use of more memory buses usually requires using memory pre-fetch intrinsics and memory sequestration, and even then it is hard to keep more than two or three usefully engaged for that thread.

So, your point stands.


I think the parent talks about the relation between miss/hit, not the actual cost of a miss. In other words, if a piece of code took 100ms with 100% cache hit and 200ms with 0% hit but now it takes 20ms with 100% hit and 100ms with 0% hit, your code is 5 times slower with cache misses, when before it was only 2 times slower. Yeah, it's faster than before, but the relative cost in terms of your available performance has increased.

And I'd agree with that statement. I have been doing some heavy profiling lately and it's fun to see how good modern processors are at executing even non-optimal code if it's cache friendly. However, the moment it needs to hit L3 cache or further, the pipeline stalls and performance goes down the drain.


I feel like the focus now should be on architecture rather than patterns.

We know how to write good widgets now, but designing your program such that it scales well and doesn't rot is still difficult (e.g. lots of mutable state is fast but is much harder to scale when you introduce concurrency)
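A rough illustration of that parenthetical, assuming a shared config read by many threads (the class names are invented for the example): the mutable version serializes every access on a lock, while handing out immutable snapshots lets readers proceed freely once they hold one.

    #include <map>
    #include <memory>
    #include <mutex>
    #include <string>

    struct Config { std::map<std::string, std::string> values; };

    // Shared mutable state: every reader and writer contends on the same mutex.
    class MutableConfig {
        std::mutex m_;
        Config cfg_;
    public:
        std::string get(const std::string& k) { std::lock_guard<std::mutex> l(m_); return cfg_.values[k]; }
        void set(const std::string& k, std::string v) { std::lock_guard<std::mutex> l(m_); cfg_.values[k] = std::move(v); }
    };

    // Immutable snapshots: readers take the lock only to copy the pointer,
    // then read lock-free; writers build a whole new Config and swap it in.
    class SnapshotConfig {
        std::mutex m_;
        std::shared_ptr<const Config> cfg_ = std::make_shared<const Config>();
    public:
        std::shared_ptr<const Config> snapshot() { std::lock_guard<std::mutex> l(m_); return cfg_; }
        void replace(Config next) {
            auto fresh = std::make_shared<const Config>(std::move(next));
            std::lock_guard<std::mutex> l(m_);
            cfg_ = std::move(fresh);
        }
    };

    int main() {
        SnapshotConfig sc;
        sc.replace(Config{{{"mode", "fast"}}});
        auto snap = sc.snapshot();
        return snap->values.count("mode") ? 0 : 1;
    }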


> I feel like the focus now should be on architecture rather than patterns.

It's really the same issue, isn't it? Especially if you follow a component-based design. The difference is only in the level where the discussion is set.

Having said that, personally I've seen a tight correlation between those who criticize design patterns and those who criticize software architecture.

In fact, the higher the abstraction, the higher the criticism. It seems the further the subject is from the critic's comprehension and experience, the more they criticize.


I didn't like this article because it picked two examples out of two 400+ page books just to gatekeep C++. I don't know the author, but I don't see the purpose of this article. There is so much more to C++, and so much more to the computer book publishing industry. The former is hard to capture in one text because it is so radically diverse, the latter floods the market with crap. I really have a problem with articles like this one: paper thin and disposable.


FWIW from what I have read of GoF, I think this criticism about runtime polymorphism is spot on.


Any Design Pattern identifies a failing in the core language or its standard library.

Were the language up to the job, the pattern would have been captured in a named library component, and no longer be a Pattern at all. Use of a well-designed and -tested library component is always better than re-coding the pattern again at each new use.

A classical example is the hash table. In C, and similarly limited languages, you cannot write a useful hash table library. Instead, custom hash tables are re-coded over and over again in thousands of individual programs and libraries wherever they are needed.

In other, similarly limited languages, the pattern is pulled into the core language, as a "dictionary" or "map" type, eliminating most need to re-code it, at the expense of flexibility in its design.

In more expressive languages like C++ and Rust, hash table libraries arise, and the most commonly useful forms appear in the standard library.
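For instance, what a C program has to hand-roll (or pull in as a macro-heavy, type-unsafe library) is just a generic standard component in C++ - a minimal sketch:

    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main() {
        // The "hash table pattern" captured as a reusable library component:
        // works for any hashable key type, no re-coding of the table itself.
        std::unordered_map<std::string, int> counts;
        for (const char* w : {"flyweight", "observer", "flyweight"}) ++counts[w];
        for (const auto& [word, n] : counts) std::cout << word << ": " << n << "\n";
    }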

We have continually added to C++ and its library as we have identified patterns. New patterns arise in composing existing features and library components, so adding to them creates new opportunities for patterns to arise. As each new pattern is identified, it may be captured. Sometimes this demands new core language support for abstraction, as is seen in the new Concepts and Coroutines core features. So, we naturally expect to see new library components that use the new features in commonly seen ways.

The expected addition of pattern matching to core C++, similar to what appears in Rust and certain other languages, represents a lapse: a failure to discover and specify core language features that would enable implementing usefully general pattern matching as a family of library components. Conflict over, e.g., simplicity vs. generality in the design of the core feature is unavoidable, whereas, were it implementable as a library, different choices could be captured as different library components.

Anytime we lapse and fail to discover core language primitives needed to implement a high-level construct, we should remember and see each use of it as a rebuke, and motivation to invent core features that would have made it redundant.
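On the pattern-matching point: the closest thing current C++ offers as a library construct is std::variant plus std::visit with the well-known "overloaded" helper (C++17). A minimal sketch, just to show the shape of it:

    #include <iostream>
    #include <variant>

    // Build a visitor out of several lambdas (the usual "overloaded" idiom).
    template <class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
    template <class... Ts> overloaded(Ts...) -> overloaded<Ts...>;  // deduction guide, pre-C++20

    struct Circle { double r; };
    struct Rect   { double w, h; };
    using Shape = std::variant<Circle, Rect>;

    double area(const Shape& s) {
        return std::visit(overloaded{
            [](const Circle& c) { return 3.14159 * c.r * c.r; },
            [](const Rect& r)   { return r.w * r.h; }
        }, s);
    }

    int main() {
        std::cout << area(Circle{1.0}) << " " << area(Rect{2.0, 3.0}) << "\n";
    }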


Design patterns aren't data structures though; I agree with the rest of your statement.

OO design patterns are usually about implementing functional concepts in rigid OOP languages such as C++ or Java (though both languages have gotten better).

With first-class functions, there is no need for factories or strategies, or commands, or a lot of other stuff that implies single-method classes.

With pattern matching, there is no need for chains of responsibility, and so on and so forth...
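To make the factory point concrete, a minimal C++ sketch (Shape/Circle/Square are placeholder names): the "factory" is just a callable, and a registry of factories is just a map of callables, instead of an AbstractFactory hierarchy of single-method classes.

    #include <functional>
    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    struct Shape  { virtual ~Shape() = default; virtual const char* name() const = 0; };
    struct Circle : Shape { const char* name() const override { return "circle"; } };
    struct Square : Shape { const char* name() const override { return "square"; } };

    // A factory is just a function that makes a Shape.
    using ShapeFactory = std::function<std::unique_ptr<Shape>()>;

    int main() {
        std::map<std::string, ShapeFactory> registry{
            {"circle", [] { return std::make_unique<Circle>(); }},
            {"square", [] { return std::make_unique<Square>(); }},
        };
        auto shape = registry.at("circle")();  // look up and invoke the factory
        std::cout << shape->name() << "\n";
    }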


A design pattern is anything you see hand-coded in multiple programs.

We don't often see data-structure patterns nowadays just because such patterns were recognized early as in need of core language features to capture them into libraries. Hash tables were a tough nut, so the core features that are needed to capture them in libraries, or even direct core-language implementations, didn't make it into some of the languages that are still in common use.

The set of patterns that can be captured into a library is a moving target for a language that is still evolving rapidly, and for the landscape of new and upcoming languages; but it's a fixed target for a mature language like C. In C, pattern X can be captured and pattern Y can't, and that will probably always be true of C.

As time goes on, the set of patterns that can be captured into libraries of new and evolving languages grows, and with it our expectations of new languages. A language, new or old, that fails to capture what other languages are seen to do (for reasons) is rightly seen as less powerful.

There is a place for less powerful languages, for use in limited domains, that are therefore quicker to learn and maybe safer to use, for less experienced or more distracted users. But we should always be clear about why the language is being made or kept less powerful, and what is lost by it.

When starting a new project with open-ended goals, we don't know what challenges the project will face and whether a deliberately limited language will be enough for the job. By the time we learn, it is generally far too late to change horses. Thus, aging OS kernels and databases we use are often still coded in C largely because that was all that was practically available when they started out.

Go is a recent deliberately-limited language that, because it was chosen to implement projects with open-ended goals, is getting painful extensions, and getting harder to learn. Rust was deliberately limited to reduce the burden of implementing its toolchain and to make it easier to pick up, but is rapidly expanding (according to plan) to satisfy the growing needs of its increasingly sophisticated user base.


> A design pattern is anything you see hand-coded in multiple programs.

It's like claiming an algorithm is a design pattern. No. In the context of GoF, as discussed in the linked article, a design pattern isn't just anything you see hand-coded in multiple programs.


We have been trying to make our languages able to put algorithms in libraries for a long, long time. So, usually they are.


> We have been trying to make our languages able to put algorithms in libraries for a long, long time. So, usually they are.

No they are not, not in the context of the GoF design patterns which is what the thread is about, nothing else.


Restricting pattern awareness to the sort favored in GoF has become a pointless exercise. Most of the originals are no longer relevant at all.

An expanded awareness of unavoidably repeated constructs remains useful.


> An expanded awareness of unavoidably repeated constructs remains useful.

Yet it's irrelevant to the topic at hand. You can't just make up your own definition to suit your argument when the related article talks about something completely different; it's dishonest.


Claiming that you couldn't pass first-class functions in C/C++ is a weird claim that I see an awful lot, that is completely wrong.

You can pass function pointers around all you want, it's just fucking difficult to get right without practice and the syntax is abhorrent.


I remember when those books came out and puzzling over them, especially the Modern C++ Design. It's hard to describe what it felt like back then, the promises that these techniques were making, how exciting it was and how differently software development practices have evolved. That's what I love about writing software, even after 30 years in the trenches. There's always something new to discover and then draw conclusions about how it's different from what you thought 5 or 10 years before.

I don't know if I would recommend these books today but they are undeniable classics.


Design Patterns are a massive code smell. I always prefer simple code over extensible over-engineered frameworks.


For the nth time, design patterns are descriptive, not prescriptive. They're a guide for when you're facing a type of problem which has been encountered thousands of times before, not a blunt tool to be jammed mindlessly into every possible scenario.


The problem is that you can preach this over and over but some people will still use them mindlessly in every possible scenario, because it's easy and feels good.

I think we've done a poor job as an industry of beating new developers over the head with "know the tradeoffs!" Design patterns are taught without teaching their tradeoffs, and so new developers think that it's best practice to apply every possible design pattern as eagerly as possible.

I agree that naming patterns is useful, but IME they've been a net negative on the code bases I've worked on, since the way they're taught encourages using them mindlessly without considering their costs.


...and the elaborate ones aren't needed very often, in my experience. So if you have many of them, it's probably overuse.


I think the languages with multi-page exception dumps are a great example of design patterns applied to the point of absurdity.


Funny, I prefer Design Patterns over extensible over-engineered frameworks.

Design patterns can reduce a lot of unnecessary cognitive overhead. For example, a builder pattern that takes a config and returns an evaluation pipeline is really simple and eliminates a lot of runtime if..else conditions.
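A rough sketch of what I mean (buildPipeline and the Config fields are invented for the example): the decisions are made once while building, so the returned pipeline just runs straight through its stages, no per-call if..else.

    #include <cctype>
    #include <functional>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Config { bool trim = true; bool lowercase = false; };

    using Stage    = std::function<std::string(std::string)>;
    using Pipeline = std::function<std::string(std::string)>;

    Pipeline buildPipeline(const Config& cfg) {
        std::vector<Stage> stages;  // conditions evaluated once, at build time
        if (cfg.trim)
            stages.push_back([](std::string s) {
                s.erase(0, s.find_first_not_of(" \t"));
                s.erase(s.find_last_not_of(" \t") + 1);
                return s;
            });
        if (cfg.lowercase)
            stages.push_back([](std::string s) {
                for (char& c : s) c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
                return s;
            });
        return [stages](std::string s) {  // the built pipeline: just run the stages in order
            for (const auto& st : stages) s = st(std::move(s));
            return s;
        };
    }

    int main() {
        auto pipeline = buildPipeline(Config{true, true});
        std::cout << "'" << pipeline("  Hello World  ") << "'\n";  // 'hello world'
    }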


> Design Patterns are a massive code smell.

Are you really trying to argue that a collection of terminology aimed at describing high level concepts, or standard solutions to commonly repeating problems and requirements, is a code smell?

What other good engineering practices are also code smells?


It's very closed-minded to believe that having a common phrasebook for solving common problems can be described as a "code smell".

Design Patterns are tools in a box, aiming to create scalable solutions, but the way a solution is implemented is up to the implementer. Design patterns just try to establish the language.


Yes, but this approach doesn't scale.


Torvalds and a few other people would disagree with you.


The reason it's just a few other people is an indicator that it doesn't scale.


"Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp." - Greenspun's Tenth Rule

In the same way, any sufficiently complicated C++ program contains an ad hoc, informally specified, bug ridden implementation of half of the Gang of Four design patterns. They would be better off just using the patterns, instead of trying to roll their own half-baked solutions to the same problems.

If you've got one of those problems, use the standard, tested solution. If you don't have one of those problems, don't shoehorn in a solution that doesn't fit.

It's not that hard to grasp (except for junior devs who have recently read a pattern book...)


"Oh, it's just a singleton" - me after half an hour of reading yet another implementation calling it something else.


OK, let me extend my statement. If you've got one of those problems, use the standard solution, and call it by the standard name so that everybody understands what you're doing and how you're doing it.


Patterns are not frameworks. Patterns are not libraries, either.


But they could be ...


A library, maybe, but not a framework. The framework controls the program flow.


So you would rather have a “sortListByName(list)” where the sort algorithm is hard-coded to what field we should compare on rather than a factored approach using the Strategy pattern? “sortList(list, [](const item& i) {return i.name();})”? I would say the factored solution using “Design Patterns” is the simple one here.


I'm not sure I agree with the author of this book that that counts as an example of "The Strategy Pattern" (vs. simply "passing a function as a parameter").

Is having parameters at all "The Parameter Pattern"? Should we call passing a boolean flag "The Flag Pattern"?

To me, "Design Pattern" implies at least some higher order structure than that (and which is not natively abstractable over in the language in question).


This is the crux of the matter. Design patterns are just common structures that have been found useful at some point in some context. Some programming languages have adopted these patterns, adding "native" support for them. If your programming language didn't have parameters, then yes, I guess someone would add the "Parameter Pattern". It's all rather arbitrary what ends up in a book on "design patterns".

Regarding the strategy pattern, when does it start becoming the pattern and when is it just "passing a function as a parameter"?

   // 1. A plain lambda ("just passing a function"):
   sortList(list, [](const item& i) {return i.name();});

   // 2. A hand-written function object: more ceremony, same idea:
   struct MySorter
   {
      bool operator ()(const item&) {...}
   };

   sortList(list, MySorter{});

   // 3. A class implementing an interface: the textbook "Strategy":
   class MySorter : public ISorter
   {
   public:
      virtual bool execute(const item&) override {...}
   };

   sortList(list, MySorter{});


Design patterns aren't common _structures_. I suppose this is why everyone gets so caught up in the language specifics and thinks they're trivial. Design patterns are common _design decisions_. The strategy pattern isn't saying "you can have callbacks", it's saying "sometimes you should allow a policy to be supplied by the user". Every day, in every language, even the ones with first-class functions and people posting on the internet that design patterns aren't needed, people still use libraries with massive 'options' arguments to functions, instead of internalising the actual lesson of the strategy pattern, which is to allow executable specifications of behaviour.

So yes, there are trivial things which you can argue the toss on, but in the last month I've used HTTP libraries that don't allow you to supply a strategy for retry logic and machine learning libraries that don't allow a strategy for early stopping.
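As a sketch of the difference (withRetries and the policy signature are invented for illustration, not any particular library's API): the caller supplies the retry behaviour itself, rather than toggling a fixed set of options.

    #include <chrono>
    #include <functional>
    #include <iostream>
    #include <stdexcept>
    #include <thread>

    // The policy decides, per failure, whether to try again.
    using RetryPolicy = std::function<bool(int attempt, const std::exception&)>;

    template <class F>
    auto withRetries(F&& op, const RetryPolicy& shouldRetry) {
        for (int attempt = 1; ; ++attempt) {
            try { return op(); }
            catch (const std::exception& e) {
                if (!shouldRetry(attempt, e)) throw;  // policy says give up: rethrow
            }
        }
    }

    int main() {
        int calls = 0;
        auto flaky = [&] { if (++calls < 3) throw std::runtime_error("timeout"); return 42; };

        // Caller-supplied strategy: at most 5 attempts, fixed 10 ms pause between them.
        int result = withRetries(flaky, [](int attempt, const std::exception&) {
            if (attempt >= 5) return false;
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
            return true;
        });
        std::cout << result << " after " << calls << " calls\n";
    }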


>Is having parameters at all "The Parameter Pattern"?

Yes

> Should we call passing a boolean flag "The Flag Pattern"?

The fact that we call it a flag, in itself a metaphor, shows that this is exactly what is happening already. We just omit the "pattern" as it is not necessary for understanding.

Pattern just means "something that occurs repeatedly". Some things do so obviously, others a bit less obviously.

My experience teaching comp. sci. is that when faced with problems that call for design patterns, ~40% of 20/22 y.o. students are able to come up with the "classical" design pattern implementations on their own in a couple hours without prior exposure to them. For the rest, well, we have the books.


> students are able to come up with the "classical" design pattern implementations on their own in a couple hours without prior exposure to them.

I think this is even mentioned in the GoF book. The point of naming these patterns is not that they're amazingly innovative. The point of "Patterns" is that when we give them a name we can (a) discuss them more easily with a common language, and (b) recognise when we use them again, so we remember the pitfalls of last time.


Yeah, pretty much. All languages have design patterns, but somehow everything in Java becomes a Pattern. So whether something is called a pattern or not is definitely very culture-dependent.


I think this about monads in Haskell. They're a design pattern (well, several different patterns). They just aren't called that.


Design patterns are considered to be things that you can’t express directly in the language (that’s the Design Patterns book definition). Monads can definitely be expressed directly in Haskell. The only thing you can’t do is prove the Monad laws in Haskell (I think?).

Other languages, like e.g. Rust (because of lack of HKT), cannot express monads like Haskell can.

But yeah, you can definitely call it a design pattern if that's what the culture around the language is like. But it would be like Java folks calling interfaces with default implementations something goofy like the "Interface with stateless implementations pattern", and even they don't go that far.


Late reply; I didn't see this earlier.

But in Haskell, a lot of the time you don't use a monad because you want a monad. You use a monad to do something else - to log in a pure way, for example. That use of a monad to solve a particular problem is a pattern.

I'd even argue that it's using the pattern to compensate for a weakness of the language. (Or, if you prefer, to help with things that are difficult within the language design and philosophy.)


Those are sometimes called language idiosyncrasies.


>>Should we call passing a boolean flag "The Flag Pattern"?

You mean like a bit field?

Did you realize that with only two words you can convey a multitude of information regarding its use, implementation, and properties?

Two words.

Perhaps there is more to design patterns than mindlessly complaining about stuff you don't fully understand.


The Flag Antipattern


I have a colleague who calls a higher-order function the Reader pattern, which is obviously absurd.


Yes, things like thread-local storage and singletons are so over-complicated.

/s

https://en.wikipedia.org/wiki/Software_design_pattern


There's a balance to it, and finding that is what (I think) you should strive for in software development.


unless you're writing a framework


Either you understand these or you don't. What I like about them is that I don't reinvent the wheel every day: small differences here and there, but the structure is the same, and years later I see it and remember how it all works, real quick.


> The C++ in the GoF book is not representative of C++

Looking back at when the book was published, we had PowerPlant, CSet++, SOM, COM, CORBA, Turbo Vision, OWL, MFC, VCL, Motif++.

Definitely representative.

I like C++, and yes there is more to C++ than those OOP frameworks, yet let's not pretend that what gave birth to Java-like enterprise coding didn't come from C++ development practices during the decade that preceded it.


Useful book, thank you for sharing!



