Just let me code (drdobbs.com)
186 points by TheCraiggers on July 23, 2014 | 133 comments



Oh the irony of an analogy... The point of these complications is that you are able to travel farther and faster on a jumbo jet than on a bicycle and these journeys will take you to many more interesting places (once you are set up, that is).

Version control, build systems, complicated makefiles, testing frameworks, profiler sessions, deployment scripts - all that takes time to set up, but it pays handsomely on any project of non-trivial size and length.

The right thing to do is not to get rid of this "overhead" but make this setup as easy as possible.


> Oh the irony of an analogy... The point of these complications is that you are able to travel farther and faster on a jumbo jet than on a bicycle and these journeys will take you to many more interesting places (once you are set up, that is).

Yes, but his analogy was about how he got into programming because it was enjoyable, like riding a bike, and recently the bike-riding process was replaced by an airport-processing-and-jet-flying process, which is not fun at all. He cares more about the travel experience than the destination.

As an aside, this is also often the difference between "business guys" and "tech guys" - at the extreme ends of the spectrum, the former care about making money doing pretty much whatever (be it selling potatoes or building jet engines, whatever sells), while the latter care about doing what they like regardless of "business value".


That's the difference between a job and a hobby. He likes biking, but to deliver pizzas efficiently from shop to homes, he needs to drive a car. He could still ride a bike in his spare time, but he shouldn't expect to get paid the same way as someone driving a delivery car. Note that even if he's a super fit guy and is as efficient as another guy with a car, there are always other rules that make it complicated: insurance, customer expectations, etc. Now we can always dream about a world where we can just work on our hobbies, but the reality is we are not there yet.


That's very well put.

Having been in the industry for what feels like a long time, I definitely struggle with that. Having a deep understanding of systems was useful and valuable back in the day. It still can be useful today, but an awful lot of software gets made by relatively unsophisticated people gluing together a bunch of stuff they don't understand very well.

I suspect a much greater proportion of cyclists really understand their bikes than car drivers understand their cars. That would be problematic if those people were, say, race car drivers. But that's not the case.


> Version control, build systems, compilated makefiles, testing frameworks, profiler sessions, deployment scripts - all that takes time to set up, but pays handsomely on all projects of non-trivial size and length.

Part of what the OP was trying to say is that while these tools sure are valuable and necessary in non-trivial projects, you can't get away without them even when the task at hand is trivial - like the simple mobile app the OP was talking about.

Once a project grows too big to be one or two source files, you're going to have to invest time to set up version control, build systems, testing, deployment, etc. It's as if you had to take a jumbo jet to the grocery store and weren't allowed to ride a bike.

I recently went through bootstrapping a project like this. I had been working in a single C file in Vim (:setlocal makeprg=gcc\ -o\ %:p:r\ %:p $CFLAGS), which was rather nice, and I was very productive initially (and enjoying myself). Now I've spent about 5 working days setting up builds and automated testing, test coverage, profiling, continuous integration, etc., and I'm still not done. That's 5 days' worth of time not going toward solving a problem.


It's your personal project; no one's forcing you to use version control, a non-standard build system, etc. Many of us spent many years/decades coding without them, and personally I never never never want to go back to working that way. It's the standard tradeoff of automation: some extra time up front learning and setting up workflow tools (download them, write your own scripts, whatever) ends up saving you from an absurd amount of repetitive, error-prone manual activity.


> It's your personal project; no one's forcing you to use version control, a non-standard build system, etc

This is simply not the case any more. If we go back to the example in the OP - a trivial mobile application - even if the code would fit in one simple file and would take no more than a few hours to write, all the mundane setup work would still be there. You can't build an Android or iOS app without using the toolchains the platforms provide. You're going to have to write XML manifests and build scripts and whatnot (and/or learn the respective preferred IDEs for both platforms).

The same applies to any practically sized application, no matter how simple.

> personally I never never never want to go back to working that way.

Me neither; I just wish the overhead of setting all this up were less.

Right now the best we have are language-specific tools and conventions; no universally established best practices exist. This is also why "Java shops" hire "Java programmers" and not C# programmers. A proficient programmer in either would be up to speed on the other in no time if it weren't for all the new tools that have to be learned.


> You can't build an Android or an iOS app without using the toolchains the platforms provide.

That's because of the way Android and iOS were designed. The amount of crap around coding is not a constant; it's very easy to inflate it with things like XML manifests and IDEs.


> Now the best thing we have are language-specific tools and conventions

Sometimes we do, sometimes we don't. I've been learning Ruby toolchains to code Erlang in a company, seen C++ code for robots running AVR being compiled and deployed by a Scala build tool, and webdev is a total mess; every other library uses weird tools written in unrelated languages.


That's a big reason why I developed the centipede framework

https://github.com/paulhoule/centipede

It handles all the BS that it takes to do Java projects so you can start a new project with all the boring stuff taken care of for you in 1 minute.


But by the author's logic now I have to learn your new tool! And it's on Github...the horror.


>The point of these complications is that you are able to travel farther and faster on a jumbo jet than on a bicycle and these journeys will take you to many more interesting places

That's a rather insightful thought that I didn't think of while reading the article.

Keeping the analogy going, I'll point out that there is a large difference between bike riding and jetting around the world. As far as traveling goes, one could think of bikes as analogous to enjoying the journey, and flying as analogous to enjoying the destination. Despite the fables and memes, enjoying one over the other isn't inherently wrong in my opinion, but different people will prefer one over the other.

Some people will prefer the simple act of creating something with their own hands using (relatively) simple tools, like a craftsman building a nice table. Others will prefer huge projects, the skyscrapers and the like. When I first read the article, I was only thinking of the one side, but your post has helped me see that creativity takes many forms.


I have another apt application of this already-stretched analogy, but I think it works well w.r.t. the pain of doing large projects without the right tools:

A bicycle ride to the next town is enjoyable. Paddle boating across the Atlantic gets old fast.


> The point of these complications is that you are able to travel farther and faster on a jumbo jet than on a bicycle and these journeys will take you to many more interesting places (once you are set up, that is).

That's the theory of those complications. Often it never pays off. You can have too many moving parts.

What development teams need to do is assess the TCO of the various tools they use to see if they can reduce it and get a sense of whether they are getting a real advantage after they have.


I know plenty of developers who are all too ready to reassess the TCO of source control or bug tracking, or who are already as free as a newborn puppy of the complication of a "deployment process"---which can be a problem if you are the one responsible for making sure their app gets deployed and stays deployed.


But don't overdo it.

You need version control, not an extensive strategy for how to do it from day one.

You don't need to automate deployment of your Hello World web app from day 1 so you can press a button and spin up a load-balanced setup in AWS. You may eventually need something, and whatever it is will be simpler.

Also, don't go for overbloated tools.

OH but you NEED a CSS minification tool! You HAVE TO HAVE IT!!!!1111

No

You don't have to have it, in fact I've caught some major websites not using it.


    OH but you NEED a CSS minification tool! You HAVE TO HAVE IT!!!!1111
One way to keep things simple while still getting the benefits of minification is to write your whole site simply, and then later install an optimization package like mod_pagespeed on the server. This keeps your code and workflow simple, and later on, when it's worth your time, you can install/enable the optimizer.

(Disclaimer: I'm the TL for mod_pagespeed.)


Precisely my thoughts. I would summarize it as being mindful of all of these things, and building with flexibility in mind so that, should the need arise, these additional "complications" can be added.

To put it bluntly, it just comes down to being smart about what you're doing. Don't mimic the other guy for the sake of appearances.


I really like the go(1) approach to testing and profiling; together with Docker and git, it makes for an easy-to-setup environment.

Library-first (as opposed to framework-first), write your foo.go and foo_test.go, run go test -bench ., iterate, deploy to devenv, run integration tests, push to live. Works great, no hassle.
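A minimal sketch of that library-first layout (the Reverse function and file names are illustrative, not from the comment above). In a real project, the check in main below would live in foo_test.go as a TestReverse/BenchmarkReverse pair driven by `go test -bench .`:

```go
// foo.go: a tiny library-first package (illustrative names).
package main

import "fmt"

// Reverse returns s with its runes in reverse order.
func Reverse(s string) string {
	r := []rune(s)
	for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
		r[i], r[j] = r[j], r[i]
	}
	return string(r)
}

func main() {
	// In foo_test.go this check would instead be:
	//   func TestReverse(t *testing.T) { ... }
	//   func BenchmarkReverse(b *testing.B) { ... }
	// run with: go test -bench .
	fmt.Println(Reverse("hello")) // prints "olleh"
}
```

The appeal is that the library compiles, tests, and benchmarks with nothing beyond the standard toolchain: no build scripts to maintain.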


Yes, but ultimately the old farts won't admit that code is overrated when it comes to writing real software. The old "programmer" job died years ago. Not to mention that most of the tools he rails against are certainly code in any real sense.


It's funny how all of these complexities are put in place to make software as convenient for the user as possible, but in doing so they just ramp up the difficulty for developers. There are some sites out now that try to make the setup as easy as possible while keeping the power that the "overhead" yields. Nitrous.io, orchardup.github.io/fig, and bowery.io do a great job of simplifying the setup. This laracast explains it all pretty well: https://laracasts.com/lessons/bowery-is-pretty-darn-insanely...


can you pass the salt? http://xkcd.com/974/


>The point of these complications is that you are able to travel farther and faster on a jumbo jet than on a bicycle and these journeys will take you to many more interesting places

But perhaps really good tools would offer all of that without losing the ease of use of a bicycle.

The problem is that current tools are working at the toolchain equivalent of assembler - or maybe C, for something like git.

Tools that work at the toolchain equivalent of a good functional programming language - never mind a toolchain with useful AI assistance - haven't been built yet.


This is exactly the reason why I avoid any "Java/Android/iPhone" job out there. I code software in Python on Ubuntu machines. Simple bike rides. The build tools are either nonexistent (Python) or old friends (make/automake/CMake); the IDE is bash and vim, like a handlebar and pedals. Git, much like vim, is a mastery in itself, but when the disciple reaches a certain level, most of the annoying things make sense and work as desired. The Zen of Unix and the Tao of Testing are my companions. Nothing disturbs my chi.

What I am trying to say so eloquently (or not) is that maybe the problem is not the modern development world but the developer. The idea that you need just another tool to solve a problem, that is what is flawed. Skill and balance and a set of powerful, basic tools, that is what solves the problem.


I like this metaphor because, if you get any two bike enthusiasts together, they will likely be able to argue endlessly about the details of their respective machines. And if either of them was to try to ride the other's bike, they would have lots of small annoyances and pain points.

Try watching someone who is an expert at Cocoa/XCode and you will see that they also have extremely tight, natural interaction with their toolset. They know where the rough corners are and avoid them. Their environment is already configured so that builds just work. etc.

Those of us who work in the browser have similar tools and workflows so that, once we have them working well and meld with the tools, we work very productively. Yet any time I have to show someone new how I do my work, it involves many layers of learning over the course of months before they can get to that point.

Similarly, though your workflow seems quite natural to you, I doubt it would make any sense to me. Yes, I know git and vim, and I can probably read the python code just fine. But knowing how to build the project, how to test incremental changes, how to debug and do performance profiling, and how to have a basic feeling for when I'm done? It would take me quite a while pairing with you before I felt comfortable.

What's the lesson? While there certainly are better and worse development environments, if you judge the one you know well against the one you know poorly, you will always think that what you know is best. I appreciate the sentiment that says learn one thing and learn it well, rather than chasing every fad. But I also think that, if you're going to be well-rounded or wise, you need to have the perspective and take the time to go deep in other environments before judging them.


Well, I tell you to write utf-8 text files, following a well-known coding standard for your language (which is PEP-8 for Python), and show me the git repository with your code. I don't care which text editor _you_ use, which git client, or which Python interpreter. If you want to use XCode, you can go ahead. You like web GUIs instead of bash? No problem. No need to learn my toolchain.


That's because you are using the tools that you have learned over time and are comfortable with. Someone who has never used make before would find building a makefile non-trivial, with all its implicit rules. Someone who has used Xcode extensively would have no trouble whipping out a simple iPhone app using that toolchain.

Even with Python and the Unix toolchain, you still have to learn new libraries and look up API docs as they come out, unless you plan to keep doing the same old things.

A software career is a process of learning new stuff from time to time, expanding your comfort zone.


Most people write code into text files (instead of into a DB for instance), therefore I expect you to edit text files. Most people use git, therefore I expect you to use git. These are skills you need to learn to call yourself a software developer in my eyes. If you can code but are not able to edit text files, that is bad (e.g., I had a friend who was so much into Eclipse he wasn't able to edit a very simple README.md without making an Eclipse project first).

I completely agree that there is no common denominator when it comes to build scripts. There are so many variations out there. But I think if I know make, automake, CMake, and Ant, I'm fine with many projects, and with some more if I ask the authors nicely to provide one of these.


Honestly, I've never seen anyone use anything other than XCode for iOS. Android, it's true there is Eclipse and Android Studio, but Eclipse support for modern build system features like binary packaged Android libraries is lacking, so people are being forced off it. So there isn't much choice for either any more. Android does have a lot of devices and versions to test on, but iOS tends to have almost everyone on the latest version and just few screen sizes, so it isn't an issue. So I can see avoiding Android, but not iOS for this reason.


An extremely good argument. There are tasks for which the people involved developed a very specific development chain that has nothing to do with anything that existed before. I have not found a way to deal with that yet and decided to simply avoid it (e.g., given the choice between a $25 Python coding hour and a $50 iPhone coding hour, I'd choose the Python task!)

But it's not really a problem, actually. Often the people who work in these environments don't understand what freedom of mind is anyway (try telling an Apple guy what a FOSS OS would do for them). The urge to reimplement the whole Unix philosophy and all its tools didn't stop the Java world from avoiding Linux and BSD, which already have the Unix philosophy and tools available. They wanted to develop everything themselves again (and sometimes did a brilliant job!) because that's how they are. I don't like that. I am lazy and don't want to develop what is actually already there, and I am not that smart; all my tools would suck compared to what already exists. So I am actually quite happy not to work in the Java world, even though that's what I studied at university. When git came out I learned how to use it (so thoroughly that I could write my own git now), although in school we always used SVN. Now I can happily merge, rebase, and cherry-pick, and can focus on coding, because everybody uses git. Learning git even made my coding better, because I learned some great things about software development along the way.

(This now became more of a novel than a response to your text. Sorry!)


>Android does have a lot of devices and versions to test on

Ironic how the choice of Java and sandboxing was meant to make this unnecessary.


You are comparing apples and oranges.

If you are running the same application on equivalent hardware at the same API level then you should be fine.

Android has issues with hardware as well as the various API versions. The closest parallel in Java would be comparing a Pentium 4 running Java 1.3 vs some 12 core server running Java 1.7 or something.


Equivalent hardware in the Android universe is the Nexus 5 (plain Android UI), Samsung S5 (using Samsung's UI customization, TouchWiz), HTC One M8 (Sense UI), and Moto X (almost plain Android UI, but not quite).

So a closer parallel would be that 12-core server running Java 1.7 having its UI skinned by different device manufacturers, with the UI actually mattering a great deal for acceptance of the app in the marketplace.


> This is exactly the reason why I avoid any "Java/Android/iPhone" job out there. I code software in Python on Ubuntu machines.

This is great, but if everybody did as you do, we wouldn't have mobile apps, or in fact anything that has to run on something non-standard.


I believe we would. Mobile platforms would evolve or die. What we ended up with is the simplest thing (for the platform) that developers would use. Sure, it is a horrible experience, but as long as there isn't too much developer revolt, why bother improving it? That would add cost that doesn't contribute value to the bottom line. No one buys a phone because the dev tools are nice.


Well, Android runs on Linux. Python runs on Linux. Git runs on Linux. Vim runs on Linux. See where I'm going here?


I code in Python and JS, but I believe there is more money in mobile these days.


Yep, it's a pain in some regards. But my job is already like a long holiday. And I get paid for it. No iPhone developer can say this (I suppose). ;)


Couldn't agree more. Most of the tools I see and try are usually just ways to make coding harder and less joyful. Special mention to IDEs, which make everything super slow, provide a huge lot of unnecessary plugins, and can't even do editing as well as proven editors such as vim or emacs.


"The idea that you need just another tool to solve a problem, that is what is flawed. Skill and balance and a set of powerful, basic tools, that is what solves the problem." If we were using your 'idea' on programming, we wouldn't have moved on from the power, basic tools such as C and Assembler.

"The Zen of Unix and the Tao of Testing are my companions. Nothing disturbs my chi."

Good, then keep doing your chee. The rest of us are going to use the right tool for the job, instead of some mythical pain-modifier that you think is the "One True Way".


I occasionally have to work on (fairly complex) Android apps, and I only use emacs and make for editing, compiling, and deploying: the makefile calls the SDK CLI toolchain to compile and deploy (ant & ndk-build for the native side). There's really no reason to use Eclipse IMHO; it's just a horrible mess of an IDE that hinders more than it helps.

For iOS, however, it seems you're indeed kinda screwed - you have to go through Xcode and the atrocious storyboards stuff.


You can write iOS apps without using storyboards or xib files. That's how Apple does it.

The main benefit is that you end up with no merge conflicts in semi-opaque files.


Where did you hear that Apple writes iOS apps without xibs or storyboards? I'm in the same camp myself (I think xibs are a ticking timebomb), and having some supporting evidence showing Apple itself does the same would make my argument much stronger.


Ex-Apple employees on the 'Edge Cases' podcast - xib files were extremely brittle initially. Merge conflicts would be impossible to handle without dedicated tooling.


Really? I heard from a colleague that this would get you a likely refusal during the Apple store validation process when deploying your app (he said they were looking for storyboard files for each and every screen or something). Must have been wrong :-)

Though I think you're still stuck with Xcode? Can you use a CLI toolchain (directly calling clang and so on) to compile, build, and sign an IPA? From the little I've seen of Xcode, it sucks (a lot): random crashes, GUI & behaviour inconsistencies, bad file management, etc...


Xcode still has some weird bits, but it has greatly improved lately. I well remember cursing the crashes multiple times a day. NSHipster recently covered some wonderful additions to it that I wish Apple included by default.

FWIW, Apple has never required storyboard. You can do everything in code if you -really- want to do it that way, no problem. Personally, I like offloading a lot of the visual stuff to storyboard/xibs unless you start getting crazy with custom animations and so forth.


Definitely wrong; I've submitted more than 10 apps and updates without a single storyboard.

There is a third-party IDE, AppCode by JetBrains, which does most of what you need (and apparently does lots of things better than Xcode), but I've never used it for any long period.

Xcode gets a bad rap, but I use it every day and can't remember the last time the current version crashed or did something strange. Maybe I'm just used to it by now, though.


Thanks!


Yes! In iOS world you are screwed. You can decide to avoid iOS though. ;)


At some point, I think it is important as a programmer to take pride not only in the result of the code, but in the code itself. Efficiently managing a project can and should be rewarding as well. Developer usability is a very important feature. As programmers, we write and iterate on code until it is ready to be published to the upstream repository. Our tooling has come about to help take our rough and ugly hacks and massage them into well-written code we can be proud to push.

Also, banging out code is overrated. I've found that when I take time to think, my actual coding becomes closer to what you see in the movies. I can find my flow and quickly materialize my ideas, and most importantly, solve the problems that arise. When I try to just "code" it ends up being poorly designed and written.

The point here is that when you do have the desire to "just code" take a step back and consider taking pride in the code you publish. The tools and systems in place are there to make you a better code publisher. The goal is to write well written code that others (including yourself) can maintain. Rather than being frustrated you have to deal with the intricacies of rebasing or 3-way merges, take a moment to understand these skills will help you publish better code.


The desire to "move fast and break things" keeps you firmly in the "1 year of experience repeated 5 times" camp. That mindset teaches you to overvalue tools like frameworks and languages and to undervalue your use of them. It reinforces a consumer notion of technology, whereby you passively play in conceptual sandboxes constructed by The Great Framework Authors.

"Just use Rails!" you hear, ignoring the small voice in your head wondering if all of the beauty of computation can really be foisted into ill-chosen paradigms. The cognitive killswitch that is mass acceptance and cultural success overrides that voice, though, so it's all good.

Then, when you inevitably get things into a mess, you consult the inevitable cottage industry of people who give you tricks to ward off the pain temporarily.


> When I try to just "code" it ends up being poorly designed and written.

this. Not really important for small and/or personal projects, but for anything beyond that, simply spitting out code ends up in a bad design violating SOLID and everything else out there, more often than not. After I figured that out, "code" now actually means for me something along the lines of "think about how components should cooperate with a minimum of dependency, [enter rest of SOLID and everything else out there here], then start spitting out the tiniest functional pieces needed to implement it from the bottom up". Might not work for everyone, but it has done wonders for me. Especially as my knowledge/experience of the first part grows (i.e., thinking about design upfront), the better the second part (i.e., spitting out actual code) gets, and the more proud I am of it. Heck, the code from the past two weeks is actually a new personal record for me. So beautiful and well-crafted :]. Then again, I thought the same thing about a piece I wrote 6 months ago. And the same goes for the way I use all the other tools in the build process, actually.
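A tiny Go sketch of that design-first, bottom-up habit (all names here are illustrative, not from the comment): decide on a minimal contract between components first, then implement the smallest pieces that satisfy it, so callers depend on the contract rather than the implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// Step 1: think about how components cooperate with minimum
// dependency. Here the whole contract is a one-method interface.
type Formatter interface {
	Format(words []string) string
}

// Step 2: implement the tiniest functional pieces, bottom-up.
type commaFormatter struct{}

func (commaFormatter) Format(words []string) string {
	return strings.Join(words, ", ")
}

// report depends only on the interface, never on commaFormatter,
// so the implementation can be swapped without touching callers.
func report(f Formatter, words []string) string {
	return "items: " + f.Format(words)
}

func main() {
	fmt.Println(report(commaFormatter{}, []string{"a", "b", "c"}))
}
```

The point isn't the trivial formatter; it's that the dependency direction (caller depends on the small interface) is fixed during the "think" step, before any code is spat out.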


I think this is something that happens to any field.

I trained as a 2D animator. When you start, it's great - there's just you and the drawings. You can make anything happen. But once you start to want to make something bigger than a couple of seconds, things get complicated; you have to plan a script, wrangle multiple scenes, start involving other people to break up the process in various ways - inbetweeners fill in frames between what the lead animator draws, cleanup artists neaten things up, colorists, background painters, 3D people for stuff that's a pain to do by hand... plus all the non-drawing support people. Watch the credits at the end of any animated feature. All of them. Every single one of those people put in a lot of hours to make that film happen; not all of them did work that shows up directly on the screen. But all of those people had an important part to play, and all of them had to have ways to communicate with each other.

I burnt out on that because I didn't want to be a cog in a machine designed to tell stories to the largest possible market. Now I draw comics. And guess what? I've got to deal with learning how to put together a website, put together a book for print, promote my work, go to conventions and sell stuff, etc. It is just barely doable by one person; I've been spending three years doing a ~400p graphic novel by myself, and I've got about a half a year left before it's done. I have template InDesign documents, scripts to help fill those templates, friends willing to try to make sense of the in-progress work and tell me when things don't make sense, sprawling directory structures full of source files and web/print res files, sketchbooks full of planning, etc etc etc. It helps that I was already able to wrangle a website; I'm slowly turning my custom templates into something anyone who wants to present a comic online the same way I am can use. If I wanted to start telling a lot more stories a lot more quickly, I could start parallelizing: separate the job into writer/artist/colors/lettering, find an existing publisher willing to handle packaging/printing/distributing to comic stores, bookstores, and e-stores, as well as shop it around to Hollywood.

I can knock out a nice standalone drawing in a few hours. I still do that every now and then when I need a break. But my aspirations are higher; I want to create a world for the reader to inhabit for a few hours. And that takes a lot more work.

Every kind of complex project has its own kind of scaffolding that users of the final project will never see.


About scale (in programming) https://news.ycombinator.com/item?id=8072730

Similar patterns: short is exciting until you hit the next complexity wall.


That's a very nice example, thanks for sharing.

Good process makes your work of being "a cog in the machine" much easier (and abstracts the rest away for you - and of course, your work will be abstracted away for other people).


This is true. The difference, however, is that software development is often considered to be largely an art centered around managing complexity.

While we're generally OK at doing this in the scope of one piece of software, we suck at doing it in workflow terms, particularly where other humans, inter-organizational dependencies, money and existing systems are involved. What this article is really talking about is shifting the complexity of the varying development and deployment toolchains and processes adopted by each different project and organization away from the developer.

The author cites a fairly simple example from the commercial programming world. The complexities he refers to are version control systems, editors/IDEs, build tools, language selection, database/persistence layer selection, test deployment, bug tracking, release processes.

Let's assume learning basic git, how to use web-based issue trackers and a text editor are reasonably considered non-negotiable, per-programmer overhead. The author still has a fair point that IDEs can be a plague in some sectors (mobile and Java in particular).

He is also correct that build tools are numerous, but what makes them so painful? Doug McIlroy famously stated, "Keep it simple, make it general, and make it intelligible." Is part of the pain that many of them are language-specific, with zero or poor integration with external (e.g. system-wide) software package databases and dependency trees? It seems that, today, every operating system distribution and every programming language running upon them generally has (at least) one half-baked, error-prone, cross-platform package management solution with its own versioning scheme, configuration overhead, network assumptions, poorly implemented caching system, and probably a broken use of cryptographic primitives. They're not general.

Database/persistence layer selection is an architectural question that is easy to either delay or perhaps - if extensive performance tuning is certainly expected, or deployment environments will certainly be heterogeneous in their persistence layers - sidestep through a layer of abstraction (CRUD) to facilitate provider abstraction. These paths are considered good practice - Ken Thompson teaches us to "spell creat with an 'e'" (i.e., premature optimization is a fallacy), Eric S. Raymond restates "Prototype before polishing. Get it working before you optimize it.", and RFC 3439 summarizes most eloquently: optimization considered harmful - in particular, optimization introduces complexity, as well as introducing tighter coupling between components and layers.

The final points of pain (test deployment, release processes) are essentially process design and infrastructure interaction and management issues. While the author mentions the additional complexity of the cloud, in truth this area has become a point of pain so intense that many projects now write their code and internal test/release processes specifically targeting one infrastructure provider (EC2, docker, etc.). That's not very general either, now is it?

So where is there clear room for improvement? The build tool issues are mostly a convention versus configuration question, which is philosophical and not going away. The process/infrastructure issues do seem to be the main candidates for change, and this does appear to be the focus of the author's call for approaches.

Tesler's Law of Conservation of Complexity: "Every application has an inherent amount of irreducible complexity. The only question is who will have to deal with it - the user, the application developer, or the platform developer?" - Larry Tesler (ca. 1984)

One could take the same view of the development process itself. Well, then... who are we pushing the complexity to? Is it...

(1) a single cloud provider of choice (EC2)?

(2) a platform-specific virtualization wrapper of choice (docker)?

(3) a person or team within the organization who just delegates truth from above (BOFH-style)?

(4) Some lightweight, open source workflow solution that enables us to step beyond the provider/technology lock-in of (1) or (2) and remains painless and configurable enough for mass adoption?

I believe that (1) and (2) do not meet the complexity requirements of larger scale users, and that (3) is not scalable or efficient. I think (3) will exist, but focus more on policy, whereas (4)-style solutions will appear in the same vein as the literate, simplistically ops-oriented command line interfaces championed most recently by heroku, vagrant and docker. Regarding (4), if we get it right, we don't even need to notice the implementation. As Minsky said: "We're more aware of simple processes that don't work well than of complex ones that work flawlessly."


The false comparison here bothers me.

Sure, back when I learned to code I could just type some stuff in on the machine in my basement, type "RUN", and have magic happen. Yes, it was stuck in my basement, and it couldn't do much, and it wasn't connected to anything. And yes, it could do exactly one thing at a time. But it was still pretty magic.

This guy wants the same experience when building for a computer a million times more powerful, one that fits in his pocket. And he'd like to access a global network of satellites that shout timestamps so that he can, in real time, calculate his precise position and then record it. While that computer continues to do all the things that he expects of it, like letting him take phone calls and receive email from anybody in the world while playing whatever song or movie he'd care to watch. Oh, and he'd like other people to be able to run his app, suggest improvements, view his source code, and offer fixes.

I think it's great to want things to be easy; that sort of irritation is what drives us to make better tools. But a big problem here isn't that the tools are overly complex, it's that he wants a lot more than he did 30 years ago.


Not sure I see how that's a false comparison. What do the additional capabilities of the platform have to do with the incremental amount of yak shaving? You're implying that the notoriously tedious Android builds are the result of the phone's underlying prowess? Maybe they're difficult because the tools used are balky.


It's not the complexity of the platform as such, it's the complexity of the desired result. If he wanted to write a little C and burn it to a simple, tiny, single-purpose computer that he just used himself, the toolchain for that is still pretty straightforward.

I'm sure the tools could be improved. But I think he should acknowledge that a simple developer experience 30 years ago was simple because his desires were relatively simple. (And because the tools had been polished enough that simple desires were simply accomplished.) What he's asking for is more complicated. Should the tools mainly hide that complexity? Sure. But that doesn't come for free.


The complexity of the platform's capabilities and the complexity of the toolchain don't seem to be necessarily related to each other; in fact, by pointing out the sophistication of the platform, you're sort of making his point for him. Writing a GPS mapping application for the Commodore 64 would be a vastly complicated undertaking, but writing one for a modern smartphone should be trivial precisely because the phone has an integrated GPS module and the processing power to parse its output in realtime without having to implement a vast set of hacks and optimizations. The point is that relative to the platform's capabilities, his desires aren't any more complex than those of the basement coder on a C64 thirty years ago, so why does he need to contend with excessively complicated development and deployment tools to implement them?


Even if I grant that his desires aren't more complex relative to the platform, that doesn't help, because the platform is enormously more complex.

But his desires are more complex relative to the platform. He wants to do open-source development with distributed version control and collaborative bug tracking. Try that on a C64.

Further, the C64 was a consumer device, carefully engineered to make it easy for novices to do a constrained set of basic stuff. I agree that we should have Android development tools like that. Sort of a Logo or Visual Basic for the modern age. But the current mobile tools are for professional developers to do complex, professional things. That's because they mostly want to make complicated, highly polished, consumer-friendly apps.

Those tools may also be needlessly complex for the purpose, and I think we should fix that. But we won't do that if we can't acknowledge the essential complexity of the domain. As Einstein wrote, "Everything should be made as simple as possible, but no simpler."


I hear him on the added overhead of build tools, testing tools, ticket trackers, code review, and the like. They have their place, but I wouldn't describe them as 'fun'.

But for me, Git is integral to writing code. I couldn't imagine writing code without version control. Git makes experimenting risk free and encourages me to try things and move quickly. There is no risk of breaking anything because you can always go back to the previous working state.

There are other tools that are surely just as good or better than Git, but I'm pretty convinced it is a 40 year technology, much like Unix or Vim. It's worth the time to get really, really good at Git. I see it no differently than learning how to use a decent text editor.
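As a sketch of that risk-free experimentation (the repo, file, and branch names below are made up for illustration):

```shell
# Throwaway demo repo; names and contents are arbitrary.
git init demo && cd demo
git config user.email "dev@example.com" && git config user.name "Dev"
echo "v1" > file.txt && git add . && git commit -m "working state"

git checkout -b crazy-idea        # experiment on a branch
echo "v2" > file.txt && git commit -am "risky change"

git checkout -                    # back to the last working state
cat file.txt                      # prints: v1
git branch -D crazy-idea          # the experiment vanishes without a trace
```

Had the experiment panned out, a `git merge crazy-idea` (instead of the delete) would have kept it.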


"Git makes experimenting risk free and encourages me to try things and move quickly. There is no risk of breaking anything because you can always go back to the previous working state."

Having built a large project without version control I can safely say git is the best thing to ever happen to my coding productivity and enjoyment.

Without version control my anxiety goes through the roof any time I have to work on a large codebase.


Tim Bray recently wrote [1] about feeling "stuck and discouraged" by complex tooling and frameworks. Excerpt:

"In fact, for every hour I've put into actually fiddling with Java code, I've invested another fighting git submodule subcommands, and now I'm staring at what feels like a thousand-meter-high Gradle rock-face."

"I should count myself fortunate that I'm not building a browser-based app; I'd have to budget an even higher proportion of the time staying on top of this week's funkiest new JS libraries and scrambling to have Wave effects before everyone has them and they're boring."

He also links to Ed Finkler's "The Developer's Dystopian Future" [2], which was on HN a while ago and addresses the same topic.

[1] https://www.tbray.org/ongoing/When/201x/2014/07/17/Discourag...

[2] https://the-pastry-box-project.net/ed-finkler/2014-july-6


From the instigator of XML, which is generally respected as a beacon of simplicity, straightforwardness and common sense[1].

[1] http://en.wikipedia.org/wiki/Billion_laughs


"Premature optimization is the root of all evil... and tools are just another optimization." -Me, ripping off Knuth

The vast majority of tools exist to optimize and scale your time. Very few tools beyond a compiler/linker are required for software development. Depending on the size of your project, taking the time to learn a tool or set it up may not be worth it. Setting up build scripts and CI systems may be totally stupid for one project, and essential for another.

The trick is to know when it makes sense to optimize your workflow and manual tasks with tools.


> For a long time, I've attributed this frustration to the complexity of today's software. The simple programs of a few hundred lines of C++ long ago disappeared from my experience. What was the experience of riding a bicycle has become the equivalent of traveling by jumbo jet; replete with the delays, inspections, limitations on personal choices, and sudden, unexplained cancellations — all at a significantly higher cost. Where is the joy in that?

I dunno... The significant overhead is mostly because we attempt to latch onto and leverage existing platforms and useful technologies... so complaining about it would be like Isaac Newton, instead of appreciating standing on the shoulders of giants, complaining about how in the good ol' days, pi was just 22/7.

A while ago, I tried to make a simple Chrome plugin that would let me filter Reddit IAmAs by dynamically hiding all threads that didn't involve the IAmA host... simple enough to conceive of as a jQuery snippet, but I had to spend an hour learning how Chrome plugins work and all of their conventions... and then about three hours digging up why inline click handlers were unaffected by my jQuery code (Chrome had recently decided that manipulating existing onclick events was a security risk).

A total pain in the ass, considering that the code to solve the problem was "easy"... but I understand the tradeoff here... I didn't have to code my own web browser, and when playing in other people's browser environment, security can't be ignored. If I wanted to avoid those hours of learning about the Chrome ecosystem, there was nothing stopping me from coming up with my own bespoke approach (such as querying the Reddit API programmatically and re-constructing the IAmA as I wish on my own webpage), but that has its own tradeoffs.

I do think, though, that for novice programmers...the overhead must seem immense. It drives me a little crazy to see bootcamps attempt to teach non-programmers Rails. Yes, Rails is in demand, but for a long time, these novice coders think that running "rails new" is a prerequisite for every web project.


> I do think, though, that for novice programmers...the overhead must seem immense.

Agreed. This is why I recommend sites like Codecademy, Khan Academy, etc., to just get started and see if coding is for you. Purists will argue that they're teaching you their way to code, but it at least gets your foot in the door in two important ways: getting over the hurdle of the environment, and getting the syntax of the language. In my personal experience, teaching someone (very interested but with zero experience) to code without some sort of boilerplate already in place was very frustrating indeed. Only after several hours and lengthy explanations of what things are and why they are used could the actual coding begin.

After completing enough of one of those options to grasp the syntax and language constructs, you can slowly introduce things like terminals, version management, environments, scripts, etc. As they wrap their head around each piece such that it's not in their way, they can get back to the coding that they're trying to focus on.


You could probably have written a userscript which is considerably easier, although not as easy to distribute.


One key is to stick with a toolset long enough to work out all the kinks and become effective with it. Constantly churning your tools leads to frustration.

Everyone also has their own preferences. The author was stating how much harder a web app is vs. a mobile app, but I find exactly the opposite to be true.

Just pick your tools, learn them, get good at them, and then code for a long time before changing. Don't get caught in the trap of always needing to try the newest thing. Focus on your core purpose of delivering a product, and you'll find that even a non-ideal toolkit still gets the job done, and you can stick with it for quite a while.


Exactly. You always have a choice; most of these things can be punted on. For example, I continued to use svn for many years before transitioning to git, despite everyone around me saying how great it was. It was a tradeoff: I am happy to know and love git now, but at the time I wasn't ready to add it to my plate of complexity. In the end, I was able to focus on shipping sooner instead of learning a new SCM. The trick is learning how to recognize when something is no longer a fad but a true shift, and git eventually, obviously became one, so I took the time to learn it. I think the ThoughtWorks Radar is a good resource for this type of thing:

http://www.thoughtworks.com/radar/#/

The assumption in this article is that you always have to be using the latest whiz bang tools and platforms, but that's up to you -- the "fashion" aspect of software development is the problem, not the tools per se.


"I just want to paint," said Michelangelo, "but now I spend all my time mixing paint and building scaffolding and such..."

SCM, frameworks and such are part of the price we pay to be able to build larger and larger projects.


"Coding" as a primitive activity has become subservient to the goals of coding. What are the goals of coding? To make money, and to solve a problem.

In any non trivial pursuit along those two vectors, complexity typically creeps in because more people are involved. Since more people are involved, this becomes a human organization. A human organization is not concerned with only the primitive activity to be done, but also what are the right approaches, methods, measurement, priorities, politics, scaling, learning, etc. Therefore, one cannot just be doing the primitive activity and ignore all of these higher level goals.

I am going to use an analogy: let's say coding is bricklaying + painting + wall building + carpet + etc., and we are going to build a building. The worker says she just "wants to do work," but if we let her do that, the building would be a disaster - unless you are just building a dog house, in which case the risks are low and a single person can do the work.

At the beginning (startup), one can just "do coding" because the problem is ill defined and an organization has not been established to manage the higher level work.

I wonder to what extent this mentality of "just want to code" is related to the (stereotypical) aversion of the developer as anti-social and introvert.


To use your example, the workers can generally just turn up and work. The bricklayer doesn't have to be concerned with the carpets. The joiner doesn't have to worry about zoning restrictions.

I'm curious if software development (in large organisations) will ever reach that level of specialization, where a coder will be able to 'just code', because other workers will be employed to handle everything else.


I think this actually does happen to some extent. The larger an organization and longer it's been around, the more likely it is that some people on the team spend some or all of their time working on tooling for the rest of the team. When done right, this means that most of the team members just need to follow some well documented or automated process to get fully up and running, and never need to really think about all the moving parts.


This is a great point. How many 'full stack' builders do you find? They obviously exist, but would generally fall under the 'handyman' moniker, which is maybe equivalent to a 'hacker'?

Unfortunately, the world of software development seems to have been pushed away from specialisation and each person knowing certain topics well, into an 'everyone is just a resource - anyone can code' mindset.


It does get there already, in some places. I've been in large organizations that have a build person, a source control person, etc. They take care of it all, so the coders can "just code".

Those environments have totally sucked, because to pigeonhole people into those roles in the first place requires a non-flexible, non-creative culture.

There is a proper balance to everything, but few organizations succeed in finding it.


I was fascinated with Brooks' proposed "head surgeon" solution to the coordination problem for large projects - basically (and forgive me if my summary butchers this a bit), instead of having 1000 programmers on one repo arguing, you have a smaller number of vertically-integrated teams with a "head surgeon" at the top of each, and it is the head surgeons who have to coordinate. (Needless to say, the "head surgeon" has to know how to code pretty well and be significantly involved in the project rather than hovering at 20,000 feet...)

I guess most people would only like to be the "head surgeon" because of the implied status (and not really knowing what it is like to be e.g. Linus Torvalds). But personally, I wouldn't mind doing a specialized job like the tools programmer or language lawyer because these are still interesting jobs allowing good forward progress to be made.

If the load of tedious work like build systems is too much, and the best we can do is either load-balance it across the team or dump it on some schmuck, maybe that just means we should try harder to break up projects into smaller units so that process doesn't completely dominate them.


>a source control person, etc. They take care of it all, so the coders can "just code".

What do you mean by a source control person? Just someone who manages the source control when it goes wrong? The programmers are still doing the committing, pushing, pulling, and branching themselves.


Committing yes, everything else no. The source person managed the servers, branched and merged as necessary, and oversaw the build processes to be sure we had good dev, test, UA, and prod deployments at any given time.

She hated that job, BTW. :)


Let's push this:

Before any carpet or brick is placed, a foundation needs to be poured, but before that, a hole must be made, but before that, a machine must be purchased. Then, someone is going to ask: "how big should the hole be?" And the answer would be, "Whatever, I am just going to put these bricks on it."

and the whole thing just falls apart because nobody can just lay bricks.


Fred Brooks wrote about the organizational complexity around larger projects in one of those books many cite but nobody reads, "The Mythical Man-Month." For example, he highlighted time spent on coordination among the n parties required to meet production targets. As you add more people, you might need anything up to n*(n-1)/2 lines of communication; in other words, you have quadratic growth in meetings. So he discussed some things IBM had tried, and some possible ways to mitigate that growth in coordination events.
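To put rough numbers on that coordination cost (a quick sketch of Brooks's arithmetic, not anything from the book itself):

```shell
# Pairwise communication channels among n people: n*(n-1)/2.
paths() { echo $(( $1 * ($1 - 1) / 2 )); }

paths 3     # 3    - a chat over coffee
paths 10    # 45   - standing meetings appear
paths 50    # 1225 - coordination starts dominating the calendar
```

Doubling the team roughly quadruples the channels - quadratic growth, which is painful enough without any organizational bloat on top.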

Bear in mind that Brooks was mostly talking about his work on OS/360 at IBM. OS/360 wasn't a doghouse. IBM wasn't a small organization. His interest in reducing meetings wasn't caused by being anti-social, or having some antipathy toward management or PMs. It was caused by an interest in shipping products on time, which programmers are held accountable to do. (Consider that the next time you feel like accusing developers of being anti-social for wanting to get work done.)

Of course Brooks was discussing an idealized version of the problem, aimed at the essentials. But reality includes a lot of kinds of organizational bloat which really aren't necessary to software production goals, they're just hard to avoid in large organizations.

Say you are building a big house, and you can't wait for one person to finish it by herself, so you have to hire and manage and pay n workers and keep the pipelines full; so you need funding up front; long story short, you make a construction company.

Everything you're doing to scale up the organization encourages "leaks."

With more funding, more self-interested contention over where it goes, more fighting over who got it for the company, etc.

With more hiring, more competition among hiring managers to build empires underneath themselves.

With more management structure, more regulation and enforcement to justify and expand the management structure.

With more decisions, more attempts to influence or take credit for those decisions, and endless arguments down to useless bikeshedding. ('what are the right approaches, methods ...')

With more people, more internal social dynamics that become ends in themselves and change how the organization is steered and how resources are allocated in ways that often are actively harmful to the organization's stated goals.

Now relative to these real-world phenomena, does it make sense for software-producing organizations to see their chief risk as letting the programmers who know about code write the requested code as they know best how to do? Or to solve that "problem" with micromanagement, creating more and more barriers and distractions (and morale drains, and reasons to leave) for programmers?


> I wonder to what extent this mentality of "just want to code" is related to the (stereotypical) aversion of the developer as anti-social and introvert.

Being anti-social is not really a developer stereotype. Being a pedant is, though. ;)


"Having fun, yet? We haven't even begun to code."

Whoah, waitaminnit...

At least some of those decisions can be deferred if you like.

You've decided on Git or some other SCM you're more familiar with. OK, fine. Nothing stops you from using it locally and deferring your decision on where to put a shared repo until later. Same goes for ticket tracking, continuous integration, testing, and so on. And if you want, just skip them for now. Heck, if you really want to, you can even defer using an SCM.
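For example (a minimal local sketch - no shared server, hosting account, or remote of any kind is assumed; the names are arbitrary):

```shell
# Everything here is local to one machine; no server is involved.
mkdir myapp && cd myapp
git init
git config user.email "dev@example.com"   # placeholder identity for commits
git config user.name "Dev"

echo 'puts "hello"' > app.rb              # any file; the language is irrelevant
git add app.rb
git commit -m "first cut"

# Months later, if a shared repo ever becomes worthwhile:
#   git remote add origin <wherever>
#   git push -u origin master
```

Every later decision - hosting, tickets, CI - layers on top of this without redoing any of it.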

Nothing is stopping you from starting your project, building it privately, sideloading it into your personal device, and deferring almost everything else infrastructure-wise until later.

Yes, if you are building for iOS, you need XCode. If you are building for Android, you probably want Eclipse. But that's about it in terms of required infrastructure.

You almost certainly can 'just start coding' if you want to.

In fact, given that it is smart to 'build one to throw away' anyway, starting out by avoiding the initial overhead isn't necessarily a bad idea.


No one is stopping anyone from writing applications by themselves and distributing them by standard installation packages on a website, like what has been common since about 1995. No one is forcing you to set up an open-source project with distributed source control. No one is making you deploy on the app stores. Those are complex systems, and so it's no surprise that setting them up is non-trivial. Learning about Git is almost certain to be simpler than re-writing something Git-like every time you want a DVCS. Submitting to an app store is probably easier than trying to get your program listed in every software directory that exists for a given platform outside of that app store. Of course these things can be simplified a bit, and likely will be, but I think the current state of things is expected as part of the transition to new models of collaborative work and software distribution.


Don't forget to change the toolchain every few years. "Sorry, we need someone with 3 years of Docker experience."


In web programming it's especially complex, IMO. Take a new programmer who can program basic stuff. He wants to learn server-side programming and suddenly needs to set up a bunch of servers (VMs, Vagrant or Docker), needs to learn git, and needs to learn how a framework works, because you don't want to teach spaghetti code. Then there's HTML5, CSS and probably some template language, plus a lot of JS in modern apps. Add LESS, Sass, CoffeeScript, testing, REST, a JS framework and websockets, and you have a totally overwhelming setup, which is the standard for most apps today.


You don't generally need to "setup a bunch of servers" when you're first learning web programming, and a "new programmer that can program basic stuff" can hopefully already use git - counting that as part of "web programming" as distinct from other programming is hugely disingenuous (git was created for programming that was not web programming). What's left isn't trivial, to be sure, but I don't know that it's harder than getting set up to do development on a microcontroller. More languages are involved, but languages are rarely the hard part.


>He wants to learn server side programming and suddenly needs to setup a bunch of servers(vm, vagrant or docker)

I didn't use any of those things while learning web/server programming. Just a laptop with Debian and Apache installed.


I feel his pain. This trend starts to seriously scare me.

So the other day I wanted to learn a JavaScript client-side MVC framework. For various reasons I picked Ember. Suddenly, I had to spend a day playing with... Node.js (hey, I was trying to do client-side work!). Hundreds of megabytes of tools and libraries; package managers, web servers, a half of an operating system, just to generate a Hello World example in the way Ember documentation says is right (I also skipped the Ruby on Rails part). An example that, generated, had literally tens of thousands files and directories.

Now the question is, why on Earth do I need to install Node to play with client-side JS (that is supposed to work, like, in a browser)? Why do I need to subscribe to a particular TDD-ish development regime? Why does every project nowadays use a random build tool written in/for a completely unrelated programming language? I've recently seen C++ code for AVR being compiled and loaded by a Scala build toolchain.

I get that some of this may be useful in large multi-team projects. But for small code written by one or few programmers? I feel it's only a distraction. I smell cargo cult.


Is it my imagination, or is this problem fairly JavaScript centric?


I don't think so. More like webdev-centric (everything nowadays uses random build and test tools that are not necessarily in Node - sometimes it's Python, the other day it's Ruby...), but then again, while working in an Erlang shop some time ago I went through two different build tools (one of them was made for Ruby libraries, btw.). And don't get me started about Java.


The webdev stack is so, so bad.

It gets an enormous pass because programmers love programmer culture, and it's open. (Exactly like the lackluster Linux of the nineties being immune to criticism due to openness).

You know what a great workflow is? Vim, C, and a Makefile. C offers a far more conceptually complete model that isn't forced client/server. The C compiler does basic checks on my code (revelatory, for sure), and the Makefile, as horrible as the syntax is, can be added later.

If I'm teaching someone, they can open something like gedit, type some code in, and run gcc manually. Boom! A program to run!


I agree with the concept that no one wants to deal with the overhead that comes with working on a distributed product or team, but tools like git have largely reduced the complexity and allowed programmers to get back to coding. Without build scripts, you will spend a lot more time outside of coding.


I am wholly in favor of writing tools that do one thing well and make it a priority to get out of your way. That said, claims like "Git is that it's a whole subworld on[sic] to itself" seem absurd in this context.

Perhaps this is the author's problem in general: the trick to productivity is being able to quickly reach a functional understanding of a tool/library you've decided to use - enough to 1) evaluate the tool/library itself and 2) use it in a way that's at least somewhat conformant to how its authors intended. Obviously, doing these two things requires good authors and good client programmers. However, if being a large body of work kept developers away from things, you certainly wouldn't have a zillion apps on the iOS app store.


I do agree with the author a bit. It's not that all these toolchains are useless (they aren't) - it's just that they should be a far more invisible part of the process than they are. The pinnacle of this frustration was when I used to contract native iOS apps. The sheer amount of crap you have to crawl through to do something nontrivial is ridiculous - and I'm not even talking about getting into the app store.

This is one reason I started to fall in love with javascript. I've been a longtime critic of the language, but I'll be damned if there is not an easier way to build, test, debug, and distribute code.

Growing up, I learned from MS QBASIC - which was an incredibly easy portal into coding. It is a shame no such things are commonly available today.


I started with QBasic also. Things like that still exist. See http://plnkr.co/ and others.


I agree with the overall point: tool complexity can be a major downer and impose residual drag throughout a project's life cycle. I too "just want to code".

However, I don't agree with some of the author's examples.

Git has proven to be rather simple, minus the initial one-week learning curve (of casual use, give or take). I use it in solo private projects and in teams. Branching and merging, in particular, are part of why I love it so much; much better than the CVS and SVN days.

And isn't using SCM considered a best practice? I think so, but I'd be interested in hearing any counters.

I also find build tools essential. Even for web apps, simple ant scripts that automate remote tasks have proven invaluable, especially wrt deployments.


And now you have two problems.

What if the source code needs to be maintained internally? Our source code cannot be stored in "the cloud" where someone outside or most likely inside the CDN (our competitor) will take it.

There are a lot of things deemed best practice. Best practice for who? And, to what end? Here is a best practice: release developer tools for your platform that are the only way to build software for your platform. Locks in developers to your platform. Makes it difficult to switch or port to other platforms. Enables ways to thwart competitors, by slowing down their development cycle or even stopping it at the approval point. Enables ways to steal ideas and even code. Why do you think each major platform has their very own programming language and plethora of platform specific APIs? A floor of moving snakes is not much fun. One vendor even called their platform ASP.

Simple build tools. The ant script is simple, but installing ant is not. And, we don't really want ant and the dependencies it requires. But, whatever works for you.


> What if the source code needs to be maintained internally? Our source code cannot be stored in "the cloud" where someone outside or most likely inside the CDN (our competitor) will take it.

It sounds like you are mixing up Github with Git itself? You don't have to host your code at github to use git. It's just one of the more popular ways for open source projects.

> The ant script is simple, but installing ant is not.

It isn't? Ant is packaged in most major distros; what kind of servers do you normally deploy on that don't have it?


> What if the source code needs to be maintained internally? Our source code cannot be stored in "the cloud" ...

Then move your SCM server wherever "internally" is? SCM does not require "the cloud".

Regarding ant not being simple to install ... would you mind elaborating? I've installed ant a handful of times, either via "sudo apt-get install ant" or as a plugin to my editor; both ways were simple IMO.

And I agree with simplicity, particularly with dependencies. I mentioned ant off the cuff, though I also use shell scripts when I can. They handle most remote tasks I can do in ant, but w/ a smaller footprint.
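A shell deploy script in that spirit is a dozen lines with no dependencies beyond `tar` and `ssh`. A sketch (the host and paths are made up; here it's only written out and syntax-checked, not run against a real server):

```shell
cat > deploy.sh <<'EOF'
#!/bin/sh
set -eu
HOST="deploy@example.com"    # hypothetical target host
APP_DIR="/var/www/myapp"     # hypothetical install path
RELEASE="myapp-$(date +%Y%m%d%H%M%S).tar.gz"

tar czf "$RELEASE" --exclude='.git' .    # package the working tree
scp "$RELEASE" "$HOST:/tmp/$RELEASE"     # ship it
ssh "$HOST" "mkdir -p $APP_DIR && tar xzf /tmp/$RELEASE -C $APP_DIR && rm /tmp/$RELEASE"
echo "deployed $RELEASE"
EOF
sh -n deploy.sh   # syntax check only; don't point it at a real host as-is
```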


I worked at a place that had people whose responsibility was the workflow, precisely so that developers could focus more on writing code. Then people started doing their own projects (good) so they wouldn't have to use .NET anymore (good), but they never took the time to get the workflow right (bad), and each new project worked slightly differently or was built on a completely new and different platform (bad), so now everyone has a more complicated workflow than they used to, and they're all different.

No automated testing, no CI, no reasonable way of deploying code and certainly no rollback, who has time for such frippery? I left.


Most of this article is hyperbole. I work on a large Android application, and while there's some overhead the majority of my time is spent coding. Look at any other engineering discipline and the overhead they have to deal with for their projects. Overhead is intrinsic to any large engineering project. As programmers we are actually spoiled with the fast feedback loop available to us. Learning all the tools is part of the trade. Yes, it can be a pain to do everything, but these tools wouldn't exist if they didn't produce some sort of net gain.


There's a lot to be said for reducing the "time to first results" for people getting started on a platform, whether that's absolute beginners or people retraining to the new trendy thing.


This is my problem with Unity3d. Just like XNA, they picked a very good language (C#) but then reduced it to scripting. So creating an app became just integrating pre-made chunks of third-party scripts/binaries + some assets from the store sprinkled on top. Furthermore, it forces their architectural vision of components and entities on you, and there is no customization available at lower levels.


This is why I have side projects. My SCM is Darcs (ridiculously simple); I don't use an IDE, only a text editor; I build with Make or by throwing all the source files at the compiler; I test sparsely by hand; and I deploy only to my own client and server so I don't need a configure script or any JavaScript feature-testing madness.

It's wholly unsuitable for anything professional, and it's wholly a joy.


Is there something to back up the claim that you must use git and a popular bug tracker to get any community involvement?


If you want the community to be involved with your code you should use what the community use. For a lot of languages today this is git and GitHub.


But is the community so narrow-minded today that it won't bother if your project doesn't use their favorite hosting provider and version control? Before this git religion, I think it was normal for everyone to accept and live with whatever tools the project they want to get involved with used. And there was more variety as to where projects would host; certainly sourceforge and the like were popular, but so were projects that hosted themselves.


git, mercurial, svn, cvs... different names for the same tools and challenges.


> But if a potential contributor wants to fork your code, you'd better understand pull requests, branching and merging, and a whole series of other operations that might or might not have counterparts in the SCMs you've used.

I don't think these tools are all the same, at least as traditionally used. And you don't have to (ab)use all their features. Also, you don't have to use these tools at all.

But the original author more or less claims that you have to use git to get any community involvement.


I read it as github being the most popular example of that type of tool. If not github, then SourceForge, or code.google.com, or launchpad, or... the list goes on.

And whatever community you ascribe to, you'll have to learn how to operate within it to get community contributions.


I guess such communities exist, but it looks weird to me. It shouldn't be like that. I mean, your project's community should consist of people who use or are interested in using your code. Whether it be hosted on github, launchpad, your own yourproject.com or another server capable of hosting the project. Hosting it is technical details; people should come to the project for what it is, not for where it is hosted at. Therefore the community shouldn't be "github community" or "sourceforge community" or whatever. Or did you mean communities such as "git users", "cvs users" and so on? I think that's still the wrong way to think of it. It sounds like a religious cult formed around specific tools.

They're just tools. I shouldn't need to ascribe to "vim community" hosted on "vimhub" even if I wrote my software using vim (I don't).


Sure, but when you set up the project, wherever you host it, you're going to have to choose an SCM before you have any community. If you want to provide the least friction for potential contributors, you'll want to use the SCM most commonly used today in open source, namely Git.

Add'l note: Other hosts, BitBucket, Google Code, etc. also offer Git hosting. So, yes, choose whatever host you want, but choose an SCM that will make it easy for folks to contribute.


This might sound crazy, but I'm on the "other side" of this conversation: I'm incredibly passionate about the management side and "shit umbrella" functions of software development. I'm driven by helping make sure developers can do what they love and do best: code! :)


Personally, the scaffold doesn't scare me. It's the fact that the scaffold moves with regularity.


This isn't the internet we were given. This is the internet that we lost. Everything is locked down. Web pages with more style than content. ISPs capping upload speed, preventing me from running a small home server.


My own pet peeve is Ant/Maven. Sooooo complicated to do so little more than Make. (Running on Windows instead of *nix isn't worth much to me, personally)


> ...now so overwhelms the developer experience...

Now? This is not a new sentiment. In fact, it's older than I am. That may qualify it as a cliché.


In theory, the tools that the article rages about are designed to help us solve problems, make us more productive, and help us make more money.


Programming small memory footprint embedded systems still requires you to figure out how to pedal that bike the best you can.


This is why I recommend that all "learn to code" people start with PHP


My issue with your advice is that I then have to trouble myself with learning Apache or some similar server stack. Last time I worked with Apache, over 10 years ago, it was annoying, to say the least.

What we need is a simple solution. I recommend bottle or flask. Documentation is available. Easy to start. A good community to work with for learners.

Will it scale? To Hell with that. No beginner should have to worry about that. They need to worry about first principles. Then work their way up.
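A Flask app really is small enough for first principles. A sketch (assumes Flask is installed, e.g. via `pip install flask`; the run step is left commented so this only writes the file):

```shell
cat > app.py <<'EOF'
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # One route, one response: information comes in, your code returns an answer
    return "Hello, beginner!"

if __name__ == "__main__":
    app.run()   # serves on http://127.0.0.1:5000 by default
EOF
# python3 app.py   # uncomment to run (needs Flask installed)
```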


virtualenv and deployment are why I don't recommend python for step 0 beginners

Most people I've talked to at this level are so far away from being "technical" that HTML/CSS is already a huge (unavoidable) hurdle, and the simplest thing I can imagine is PHP + FTP. If they get on an FTP client that mimics the GUI of a folder system, people get it real quick: they are copying this local file to another folder on the server. To me, the quicker you can get them to deploy something the better, because the first steps are to understand how information is sent to your app, and that it's your app's job to make sense of that information and return a response.

I also tell them to get a cheap standard shared host, so most of the Apache setup is transparent to them anyway. When they get to the point of needing to customize Apache, that's an incremental step from where they have already gotten.


I remember, during a short-lived consulting gig back in 2012, trying to convince a company to use Scala. They panicked about the lack of IDE support and refactoring tools.

"IDE support is immature." Response: well, true, but you don't need an IDE. Java makes IDEs mandatory because it's a crappy, verbose language. Scala doesn't. Clojure especially doesn't. Haskell doesn't (although convincing that company to move off the JVM would have been impossible).

"Where are the refactoring tools?" Response: when you have good static typing, the compiler is far better at enforcing refactoring safety than any tools on the market. (Java itself combines the worst of static and dynamic typing.)

People-- especially non-programming managers-- have become so dependent on the trappings of productivity that they've forgotten what actual work looks like. Java culture is full of this nonsense. The business becomes dependent on a convoluted toolchain that it doesn't actually need for most purposes.

Then there is the myth of the "full stack developer". It's a good idea in theory, but no one knows what it means. It's just management "at the end of the day, it's all about leveraging our core synergies" bullshit. What's really going on is that execs and professional managers can't evaluate the work in most specialties, so they've decided to pretend that specialties don't exist. All types of programming get thrown together into "it's all code; just make the thingy work" and what that means is that programmers get stuck cleaning up any kind of technical mess they have the bad fortune of having thrown to them. The result is that a good number of programmers (those who aren't savvy enough to play the game and be selective in what work they do) end up learning a bunch of crappy, parochial, non-transferrable "how we do things here" details but (a) never get a coherent career or develop the ability to protect a specialty, and (b) are constantly learning "new" but ill-thought-out and incoherent things.

For what it's worth, I used to think that this was a "new" problem, or a symptom of the industry getting worse. I'm not so sure that it's very new. Maintenance programming, 20 years ago, was the same slog that it is today. If anything, the situation is getting better. I agree that Git is not user friendly at all, but the damn thing works, and it's far better than other VC tools (e.g. Perforce).


Back in 2012 Scala already had some decent IDE support in IntelliJ. It's far better now, as pretty much everything that is supported in Java works in Scala too.

Clojure is in a different situation, though, because of its type system, or lack thereof. Just like in JavaScript, people end up falling back to REPLs and other such primitive tools, because the IDE just can't tell you very much about your code. Code-is-data is not a strength, but a double-edged sword.

As for Git's usability, yes, it's not very good, which interestingly seems to be a key reason why it has so many advocates: it forces you to understand a whole lot of internals to be proficient at it, and while git's command line interface is pretty ugly, its internals are elegant. It's kind of the opposite of what happens with Mercurial: it's easier to start working with, so people don't look into it long enough to fall in love and start advocating.
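You can see that elegance directly: the object store is a simple content-addressed database, and `git cat-file` lets you poke at it. A sketch (`git init -b` needs git >= 2.28; the identity config is only there so the example commit works):

```shell
git init -b main peek && cd peek
git config user.email "dev@example.com" && git config user.name "Dev"  # example identity
echo "hello" > f.txt
git add f.txt && git commit -m "one"

git cat-file -t HEAD            # object type of the commit ("commit")
git cat-file -p HEAD            # raw commit object: tree hash, author, message
git cat-file -p 'HEAD^{tree}'   # the tree object: mode, type, blob hash, filename
```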

What I really find interesting about this whole 'let me code' thing is that, while I really don't want to spend valuable mental space dealing with a complex environment, I don't want to spend that space on code either, but on figuring out what the right code to write is. We obsess over the mechanics of coding, over our tooling, but what really matters is that we are actually solving the right problem.

I spent some time this month with some code written by my predecessor at this job: A well known library designer and speaker. He had looked at a hard problem, wrote a proof of how to solve it, and wrote a mathematically correct implementation matching his paper. The problem is that his premises were incorrect: He failed to account for some unavoidable issues. That made his last 4 months of work here a waste because he was just solving the wrong problem.

I want to spend less time coding, and more time making sure that the coding is actually solving the real problem at hand, not an incorrect idea of the problem.


I think proving your implementation correct can be a great thing after it's had some time in production to verify that it accounts for the issues that actually come up.


People-- especially non-programming managers-- have become so dependent on the trappings of productivity that they've forgotten what actual work looks like. Java culture is full of this nonsense. The business becomes dependent on a convoluted toolchain that it doesn't actually need for most purposes.

Web programming is now, too. Look around at job postings for frontend dev positions and you'll see basically just lists of fashionable tools, e.g. "Grunt, Bower, Angular, Phantom, Karma, Jasmine..."


Great point. Many devs won't touch a tech stack that doesn't have tons of tooling and lots of ceremony possible.

(To those used to such things, you will find something tremendously freeing if you drop down to using no language specific tools but the compiler/interpreter).


That's because most of us know just how much time and frustration a good debugging (note: not editing!) toolchain saves. There's a huge difference between using IDEA with its integrated debugger and, for example, typing everything out manually in GDB. And here Scala isn't any different: it's noticeably easier and takes far less time to debug a complex piece of software since we have good IDEA integration.

And that's what most people bashing IDEs don't get: it's the ease of debugging that's important, not typing code!


Honestly, I've seen people who swore by their IDEs and paired with them... usually what was going on is they were writing code they literally did not understand, and used the IDE as a lever to help them get by without understanding. IDEs simply allowed them to write very bad systems without any conceptual integrity, and they did.

Which wasn't my original point really.


Personally, I'm a believer in the concept of a read-only IDE. Sometimes code visualization becomes easier with a GUI. I'll grant that. Likewise, IDEs can make debugging easier. Writing of code with an IDE has the problems you discussed.



