Hacker News new | past | comments | ask | show | jobs | submit login
Complexity and Strategy (hackernoon.com)
219 points by mgdo on March 25, 2017 | hide | past | favorite | 30 comments



> What I found is that advocates for these new technologies tended to confuse the productivity benefits of working on a small code base (small N essential complexity due to fewer feature interactions and small N cost for features that scale with size of codebase) with the benefits of the new technology itself — efforts using a new technology inherently start small so the benefits get conflated.

I see this a lot. It's exciting to work on shiny new $x app, and productive, but it's most likely productive because it's new, not because it's $x. The old code base has accumulated complexity that the new one doesn't have.


I think that one key source of needless complexity is a lack of "spring cleaning". Design artifacts and quick hacks can, over time, create high levels of complexity that become a major hindrance when adding new features. This might be due to some people taking the "if it ain't broke, don't fix it" approach to the extreme and not taking future considerations into account.


I think what you describe is what people usually call accidental complexity. The complexity described in the article is about the incidental complexity, meaning the actual complexity of the features, rather than the complexity of their implementation.

Let's take an example: if Hacker News added a new feature that allowed people to delete their account, then you would have to decide what to do:

- do you delete the user's comments?

- if yes, what do you do about people replying to that comment?

- do you delete the user's submissions?

- if yes, what do you do about people commenting on the thread?

- do you downvote the submission / comments that the user upvoted?

- do you unflag the submission / comments that the user flagged?

- do you keep a copy of their yc applications?

- do you keep their email address along with their karma, so that if they open a new account with the same address, their karma will be restored?

As you can see, adding a single feature means that you have to figure out how it is going to play with all the other features. That's what the author means when he says that the cost of implementing the N+1th feature is closer to N than 1.
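The cascade of decisions above can be sketched in code. This is a toy Python model (every name is hypothetical, not HN's actual codebase) that makes the "one feature, many interactions" point visible:

```python
class FakeDB:
    """Minimal in-memory stand-in that just records which subsystems
    one change has to touch (illustrative only)."""
    def __init__(self):
        self.touched = []
    def __getattr__(self, name):
        def record(user):
            self.touched.append(name)
        return record

def delete_account(user, db):
    # Each line is a design decision forced by an existing feature.
    db.remove_user(user)               # the account itself
    db.anonymize_comments(user)        # delete their comments?
    db.anonymize_submissions(user)     # delete their submissions?
    db.revert_votes(user)              # undo their up/downvotes?
    db.revert_flags(user)              # undo their flags?
    db.archive_applications(user)      # keep their YC applications?
    db.remember_email_for_karma(user)  # restore karma on re-signup?

db = FakeDB()
delete_account("some_user", db)
print(len(db.touched))  # one new feature, seven feature interactions
```

The body of `delete_account` only grows as the product grows, which is the whole point: the feature is "one" feature, but its cost scales with everything already shipped.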


Yes. A subtle but important distinction. Indeed, these two kinds of complexity feed off each other. As the author states: "If essential complexity growth is inevitable, you want to do everything you can to reduce ongoing accidental or unnecessary complexity."


"Accidental complexity" and "incidental complexity" are usually taken to be synonymous, to be contrasted with "inherent complexity".


I've more often heard "inherent complexity" called "essential complexity".


This is the term used in Out of the Tar Pit IIRC.


Well said.

Another example is how adding a new programming language feature can make every other feature more complicated as you consider how the interactions should work.

This is especially true as you think about how all the tools (formatters, debuggers, etc) should work.


Well worth the time to read, one statement in particular stood out to me:

So “free code” tends to be “free as in puppy” rather than “free as in beer”.


Yes, that was a great line, and a thought provoking one.

This whole topic makes me a little sad, as I thought the improving tools and techniques would vastly increase developer productivity over the many years I've been involved in software development. Instead it appears complexity is winning so far.


Many of the points are about what happens in regards to specific projects and changes to those projects.

I didn't see as much about what is happening on industry wide scales over decades


I think improved tools and techniques actually are vastly increasing developer productivity, it's just that our expectations for what developers can do keep going up as well. We always just push until we hit a wall of complexity and have to slow down.


Make no mistake: developer productivity has VASTLY improved.

We still hit some pretty fundamental limits when making a feature-rich product though.


I never thought to equate my PR when I see a great library/framework/tool, with my kid's PR (puppy request) as we pass a pet store.

My eyes have been opened. I am enlightened.


I wish I could upvote this article 1000 times - definitely the best thing I've read all year. My favorite quotes:

"Features interact — intentionally — and that makes the cost of implementing the N+1 feature closer to N than 1."

(When explaining to folks why it takes longer to build something in the CRM than as a standalone app - you realize the standalone just won't work with 10 critical features, right?)

"What I found is that advocates for these new technologies tended to confuse the productivity benefits of working on a small code base (small N essential complexity due to fewer feature interactions and small N cost for features that scale with size of codebase) with the benefits of the new technology itself — efforts using a new technology inherently start small so the benefits get conflated."

This describes like 20% of the articles that make it on hacker news. Or maybe 20% of the ones I read. Might be a personal problem.

"So 'free code' tends to be 'free as in puppy' rather than 'free as in beer'."

Anyone debugging js build chains or trying to fix ng1 performance after you get out of the toy app stage can probably relate.

"Bill wanted (still wants) a breakthrough in how we build rich productivity apps. He expected that the shape of that breakthrough would be to build a highly functional component that would serve as the core engine for all the applications. For a time he believed that Trident (the core engine for Internet Explorer) could become that component. That model leads you to invest in building a more and more functional component that represents more and more of the overall application (and therefore results in more of the cost of building each application being shared across all applications).

This view that I have described here of increasing feature interaction causing increasing essential complexity leads to the conclusion that such a component would end up suffering from the union of all the complexity and constraints of the masters it needs to serve. Ultimately it collapses of its own weight. The alternate strategy is to emphasize isolating complexity, creating simpler functional components, and extracting and refactoring sharable components on an ongoing basis."

It is super common to see people want to do Big Framework Up Front design for applications, and then find out that the framework makes things slower for the primary use case than the old, "crappy" way, aside from the cost/opportunity cost of spending all that time on a framework instead of business value. I guess it makes me feel a little better that Bill Gates also suffers from that delusion.


> debugging js build chains

I just spent several days trying to build a deb package for an open-source native C project (fontforge).

It was no walk in the park either.


This is a long but excellent article by the guy who led Microsoft Office development. It will make you think twice before you add that super-easy-to-do button to your app!


The flowchart of how Slack's notifications work comes to mind: https://twitter.com/mathowie/status/837735473745289218


The additional length is a boon here. I don't think the provided insights could be properly conveyed in just a few short paragraphs. The article does a fantastic job of distilling some of the fundamental dynamics involved in building software.

Anyone know of good places to find other similar high quality content about software development? I really appreciate when experts are able to effectively communicate the most important pieces of their painstakingly acquired mental models of complex topics.


> Bill wanted (still wants) a breakthrough in how we build rich productivity apps. He expected that the shape of that breakthrough would be to build a highly functional component that would serve as the core engine for all the applications. For a time he believed that Trident (the core engine for Internet Explorer) could become that component. That model leads you to invest in building a more and more functional component that represents more and more of the overall application (and therefore results in more of the cost of building each application being shared across all applications).

Is this not essentially what Electron is?


Was mixing MS Office and Internet Explorer code, so that the one couldn't be removed without breaking the other, one of the strategies discussed?

"The battle we are fighting is over who controls the next generation applications and system architecture, APIs and services" Shirish Nadkarni 1991

http://edge-op.org/iowa/www.iowaconsumercase.org/011607/0000...


From the article

"Bill wanted (still wants) a breakthrough in how we build rich productivity apps. He expected that the shape of that breakthrough would be to build a highly functional component that would serve as the core engine for all the applications. For a time he believed that Trident (the core engine for Internet Explorer) could become that component."


I've just added The Mythical Man-Month to my basket. It was long overdue :) Its author would probably agree that there is no silver bullet in tech.


There is literally a chapter titled "No Silver Bullet."


Sure, the more features that interact with each other, the more the cost graph looks convex. But I strongly disagree with this: "Project after project has demonstrated there is nothing about language or underlying technical infrastructure that changes that fundamental curve."

So should we all code in machine code? Perhaps what this shows is that all these languages are very similar: Haskell, Clojure, Java, C#, C++, Rust, C, etc. are all statement- or expression-oriented and data-structure-oriented. So, until I see languages that explore rules/constraints and relations (similar to "Out of the Tar Pit"), I will remain unconvinced of this "all languages are the same" claim, because assembly certainly isn't the same as Java.
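The convexity claim can be made concrete with a toy cost model (the numbers are purely illustrative): if each new feature must be reconciled with every feature already shipped, the marginal cost of feature N grows like N, and the total cost of an N-feature product grows quadratically:

```python
def total_cost(n_features, base_cost=10, interaction_cost=1):
    """Toy model: feature n pays a flat build cost plus a
    reconciliation cost for each of the n features already shipped."""
    cost = 0
    for n in range(n_features):
        cost += base_cost + interaction_cost * n
    return cost

print(total_cost(10))   # 145  -> roughly linear at small N
print(total_cost(100))  # 5950 -> interactions dominate at large N
```

In this model a better language or framework can shrink `base_cost`, but only reducing feature interactions (the `interaction_cost * n` term) changes the shape of the curve, which is what the article's "fundamental curve" claim amounts to.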


This is also my experience: that complexity curve varies. I use the plugin design pattern, where no feature/plugin is allowed to interact with or depend on another.
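A minimal sketch of that plugin pattern, assuming a Python host with a simple event interface (all names are illustrative): plugins talk only to the host, never to each other, so the N+1th plugin adds one connection rather than N interactions:

```python
class Host:
    """The only thing a plugin is allowed to know about."""
    def __init__(self):
        self.plugins = []

    def register(self, plugin):
        self.plugins.append(plugin)

    def handle(self, event):
        # Plugins only ever see the host's event, never each other,
        # so adding a plugin cannot complicate existing ones.
        return [p.on_event(event) for p in self.plugins]

class SpellCheck:
    def on_event(self, event):
        return f"spellcheck saw {event!r}"

class AutoSave:
    def on_event(self, event):
        return f"autosave saw {event!r}"

host = Host()
host.register(SpellCheck())
host.register(AutoSave())
print(host.handle("keypress"))
```

The trade-off, per the article's argument, is that features which genuinely need to interact (intentionally, for the user's benefit) can't be built this way, which is why the pattern flattens the curve only for features that are naturally independent.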


> ... file formats continued to serve as a critical competitive moat with immensely strong network effects.

I couldn't agree more.


Now who's betting this quote won't be cited in a court of law one day? Maybe shortly after they straighten up the law on "patent misuse," for example.

(Which doesn't mean I didn't like or don't agree with the article.)


> Even applications like OpenOffice that were specifically designed to be clones have struggled with compatibility for decades. By embracing that complexity, and the costs, we would deliver something that we knew was fundamentally hard to match

> Google Apps have been announcing some variant of offline editing for almost 8 years now and it is still semi-functional. The other “real soon now” promise is “better compatibility with Office”. This has the flavor of the laundry detergent claims of “now with blue crystals”! The claim of “better” is hard to argue with but history would seem to indicate they are not getting any closer to actually being compatible, especially as Office continues to evolve

It's a bit difficult when they're deliberately embracing the complexity and cost of the current format, and keeping it closed source. The fact that Google is able to achieve its own network effects with a different format should actually encourage Google to not support Office, ever.

For us users, we should ensure Google Docs stays compatible with open formats like ".odt".


I think you've put your finger on it. Funny thing that there are laws imposing standards on so many other technologies... but not software, not even for basic file formats. Hmmm. Nobody thinks software is without economic consequence, yet this remains so. Hmmm.

For example, once upon a time screws were made however anyone wished to make them (and railways were whatever width the company wished.)

"In 1918 Congress passed a law establishing an organization called the National Screw Thread Commission, with the goal of ascertaining consistent standards for screws. The goal of this effort, which you might guess given the timing of the law's passage, is military-related: As you might guess, the military uses a lot of screws, and inconsistencies were apparently bad enough after World War I that Congress had to do something about it.

John Q. Tilson, a Connecticut congressman, argued that the measure was necessary due to the problems a lack of consistent screw thread were creating. He also made the case for businesses—who he argues also will benefit from screw compatibility."

http://tedium.co/2016/09/15/screw-history-standards/



