Software eats software development (cdixon.org)
147 points by goronbjorn on April 14, 2014 | hide | past | favorite | 130 comments



>In the pre-Internet era, tools like Hypercard and Visual Basic allowed hundreds of millions of semi-technical people to become software developers. Since then, there hasn’t been much work in these areas, but from what I’ve seen that might change soon.

I doubt that there will be much progress on this in the foreseeable future. Not for general purpose programming anyway. It's not true that there hasn't been a lot of work in these areas since Visual Basic. Not a week goes by without news about some tool that finally allows everyone to create UIs easily, wire up some logic and one-click-deploy everything on the web and mobile devices all at the same time.

But the bottleneck is somewhere else entirely. It's thinking in terms of models. Formalizing and abstracting what we know intuitively about the world and about the things we want to automate.

In my opinion, there have only been two inventions that really "democratized" access to thinking in models: Spreadsheets and SQL. Both are from the 1970s. There has been no progress since then, perhaps with the exception of some of the visual things you can do with modern game engines. Maybe R and Matlab deserve mentioning but they are for people who know exactly what they're doing.

I think the next step has to be abstracting away some of the intelligence required to analyse data and make all the tedious micro decisions about how to transform it into something fit for the task. What we need to make progress is some kind of AI enhanced version of Excel.


* Not a week goes by without news about some tool that finally allows everyone to create UIs easily *

That's because they start at the easy part (the UI) and move down until they hit the hard problem. But there's reason for hope: there are breakthroughs happening in this space (reactive like spreadsheets, declarative like SQL). The results are really exciting.


> there are breakthroughs happening in this space

Could you elaborate on this one? Any links?


I don't have any public links but send me an email, it's in my profile.


I am curious too...I'll send you an email.

I am the developer of HiveMind (crudzilla.com), a web app platform that aims to tackle web app development in a way that lets moderately skilled developers do a lot.

So I am definitely curious!


> But the bottleneck is somewhere else entirely. It's thinking in terms of models. Formalizing and abstracting what we know intuitively about the world and about the things we want to automate.

This is a bottleneck but one that exists even without programming.

In my opinion, the biggest bottlenecks are two fold. Software frameworks are coded to be coded against and the abstractions we use to communicate between sub-systems in programming languages require us to code (parameterized sub-routines).

Software frameworks that are composition-centric (you hook up parts instead of calling parameterized sub-routines) make programming a lot easier. You don't even need to code against composition-centric frameworks: you can just hook things up using general-purpose VPLs.

Parameterized sub-routines lead to way too much specialization of interfaces, making software frameworks both difficult to use and not composition-centric. This is why we have to code against our software frameworks. I also feel that parameterized sub-routines are the reason why "there has been no progress since the 1970s".
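For what it's worth, the contrast between the two styles can be sketched in a few lines. This is my own illustration, not from the linked blog post; all names and the message shapes are invented:

```python
# Sketch of the contrast (invented names, not from the linked post).

# Parameterized sub-routine style: the caller must learn each
# framework's specialized signature and write glue code against it.
def resize(image, width, height, keep_aspect=True):
    scale = min(width / image["w"], height / image["h"]) if keep_aspect else 1.0
    return {"w": int(image["w"] * scale), "h": int(image["h"] * scale)}

# Composition-centric style: every part exposes the same tiny
# interface (receive a message, emit a message), so parts can be
# hooked up rather than coded against.
class Part:
    def __init__(self, fn):
        self.fn, self.next = fn, None

    def send(self, msg):
        out = self.fn(msg)
        return self.next.send(out) if self.next else out

def pipe(*parts):
    # Wire each part's output to the next part's input.
    for a, b in zip(parts, parts[1:]):
        a.next = b
    return parts[0]

pipeline = pipe(Part(lambda m: m * 2), Part(lambda m: m + 1))
print(pipeline.send(10))  # 21
```

The point of the second half is that `pipe` never needs to know any part's signature; that uniformity is what a visual programming tool can exploit.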


Can you give good examples of the both kinds of software frameworks?


Apple's Objective-C and its associated frameworks were designed to be coded against (and even though the language supports message passing, it still heavily uses parameterized sub-routines).

A lot of 3d modeling software provides frameworks where you hook things up instead of coding out solutions (Foundry Modo, Eyeon Fusion, Dynamo are a few). These are very domain specific and not designed to work as general purpose languages/frameworks.

We are working on some ideas for a general purpose compose-able framework and language that uses messages with behavior as the abstraction for communication between sub-systems (http://blog.interfacevision.com/design/design-composition-ba...).


Funny, I was just looking at Google Apps Script with the idea of making an add-on for Documents. The tool looks brilliant. It supports many Google services (from Fusion Tables to Gmail to...), provides simple sharing with anyone and proper authentication. But it does not provide any tool for building better models or architecture. It's VBA on the web.


"A Small Matter of Programming" by Bonnie Nardi is an excellent book which delves into the history of end-user programming. Spreadsheets, CAD and so on, all based on researching users.


> Since then, there hasn’t been much work in these areas

Our startup is working on exactly this, with a simple block-based system which is easy to use (like Scratch) but still very powerful (working with lists of objects, recursive functions, loops, local variables etc). It's for our general purpose game editor Construct 2 (http://www.scirra.com), which is far, far beyond "create UIs easily".


There are modeling innovations in research; a few I know of (probably many more available):

https://news.ycombinator.com/item?id=7458922

Another huge innovation in modeling is machine learning, which can build very complex models (in many cases far more accurate than rule-based models) using a relatively simple interface.
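A rough illustration of that "simple interface" point: the user supplies examples and the algorithm builds the model, with no rules written by hand. This is a hand-rolled least-squares fit; a real tool would hide even this behind a fit()/predict() call, and the data here is made up:

```python
# Learn a linear model from examples instead of writing the rule.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least squares for slope and intercept.
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx  # slope, intercept

xs, ys = [1, 2, 3, 4], [3, 5, 7, 9]   # hidden rule: y = 2x + 1
a, b = fit_line(xs, ys)
print(round(a), round(b))  # 2 1
```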


The conversational programming idea you pointed to in the other post looks interesting, because a conversation provides information to both the human user and the learning algorithm at the same time.

I always felt that the feedback in supervised machine learning is too far removed from the actual usage of the system. Robotics seems to be the most natural area where this could be improved.


> Not for general purpose programming anyway.

> Spreadsheets and SQL.

Those aren't democratized. Other examples like Matlab or Mathematica come to mind, where you use a domain language to simplify the programming model drastically.


Where I work, we've got lots of account managers who would never dream of writing C#, who are perfectly happy writing SQL queries.
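The kind of query they write is genuinely approachable, since it's declarative: say what you want, not how to compute it. A sketch using Python's built-in sqlite3 module; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the company's real one.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (account TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])

# The declarative part an account manager actually writes:
rows = db.execute(
    "SELECT account, SUM(amount) FROM orders "
    "GROUP BY account ORDER BY account").fetchall()
print(rows)  # [('acme', 200.0), ('globex', 50.0)]
```

No loops, no data structures: the grouping and summing are described, not coded, which is plausibly why non-developers manage it.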


Agree on spreadsheets, but I think the promise of SQL was not fulfilled.

What about visual XML mappers? They're aimed at business analysts, i.e. non-coder domain experts. They are selling, but I don't know how well this aspect has worked out... as a coder, I wouldn't hear about it even if it was a success...


>the promise of sql was not fulfilled

I know quite a few domain experts who are able to use a little SQL. I haven't seen any non-technical people use XML mappers.


When I was something like 13, I went to my first computer trade show, where one vendor was touting tools that would let non-programmers write software for the minicomputers of the day, eliminating the need for those expensive programmer guys.

Ever since, I've seen people selling similar notions. Whether or not they work, I expect we'll see a number of similar tools on the market. As things like magnetic healing bracelets demonstrate, not actually working is no barrier to selling a product.


Programming for non-programmers is an oxymoron to begin with. Even using Excel is programming.

I've become convinced that one of the reasons this field has failed so spectacularly in the past is because people are trying to achieve something that is literally impossible.

In order to communicate with computers you have to become computer literate. Or at least, it's far easier (and will remain so for a while yet) for a flexible human to become computer literate than for an inflexible machine to become human literate.

Finding better ways to make humans computer literate is and will for a long time be a much better path to make computers more accessible, and it's largely what's worked to get us to the point we're at now.


I wouldn't say it failed spectacularly, quite the opposite.

OK, tools for the complete non-programmer maybe. Those are vaporware anyway. But tools for people with domain knowledge to do a little bit of programming were hugely successful. Visual Basic (pre .NET) is the best example. Click together a GUI, add a bit of Databinding, and write some business logic in BASIC. Spreadsheets are another, and they are a gateway drug to VBA. To a lesser extent the .NET ecosystem with XAML and so on falls into this category, too.

Another example was Flash. Artists could make little videos with the vector graphics tools (which are still pretty unique in how you can "paint" vector graphics). You could then animate them and make little movies, and add simple interactive features. Then you could add some ActionScript (~Javascript) to get more complex behavior and make little games.

I'd say this kind of programming-language-light was hugely successful. Visual Basic, Access, and the likes, and the huge collection of commercial ActiveX controls was probably one of the most overlooked reasons for Microsoft's dominance in the late 90s. Millions of small businesses used MS products and small programs to generate reports, form letters, and so on. And the lasting legacy of Flash is probably the indie-gaming scene that is really successful nowadays (the other driver for that scene comes from mobile apps, which ironically was a main source of Flash's demise).

I don't know why this kind of programming tool slowly faded away. Maybe because people realized that it can cause huge maintenance headaches. Maybe because there is more supply of "proper" developers now. But I guess if someone built a decent IDE and allowed you to click together apps in a high-level fashion (and without boilerplate code!) and using Javascript or Python, it could be a success even today.


I don't think of either VB or Flash as being in any way 'non-programmer'. VB has tools to eliminate a lot of the more mechanical aspects of GUI programming, but in the end you're still writing a form of BASIC to make it actually do anything. Likewise for Flash and ActionScript.

If anything, these both go to my point. They are very much programmer tools, they just lower the bar to getting something on the screen so you can get to the hard parts of programming sooner. If you aren't computer literate, or refuse to become computer literate, you still won't get anything done with either of them.


Let's operationalize it then:

We need tools that bring the learning curve down and allow people to develop their own basic tools to make their lives easier, while storing the data in a central location.

Amazing things have been done in Excel, but have you ever seen a department have locking issues? "Can you close the spreadsheet so I can get in?"

Yeah, we need multiuser excel, or perhaps a google docs equivalent that can be hosted internally.


>We need tools that bring the learning curve down and allow people to develop their own basic tools to make their lives easier

This won't achieve much because the issue is that most people can't think about the problems they face in a structured way.

Modern programming isn't hard: all you need to learn is a handful of control structures and how to do I/O, and there are a multitude of tools to take away the tedious bits.
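To make that concrete, here's roughly the ceiling of what such everyday scripting demands: a loop, a conditional, a bit of string handling, and output. The data is a made-up stand-in for reading a file:

```python
# Roughly the whole skill set this class of problem needs.
lines = ["widget,12", "gadget,0", "gizmo,7"]   # stand-in for a file read

out_of_stock = []
for line in lines:                  # a loop
    name, count = line.split(",")   # a bit of string handling
    if int(count) == 0:             # a conditional
        out_of_stock.append(name)

print(out_of_stock)  # ['gadget']
```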

But if you can't describe the problem you are trying to solve in a structured way, then no non-AI tool will help.

P.S. I remember being considered a Wizard in a non-technical university course because I looked up some Excel tutorials and built a spreadsheet with some basic calculations and graphing. This wasn't technically hard, but most of my classmates lacked the ability to comprehend the relationships between different cells and tables in Excel.


Currently, programming requires critical thinking and coding.

Enabling people to become critical thinkers is a much bigger problem than just programming: programming will always require critical thinking.

Making it easier to program a computer is a difficult problem, but very solvable as long as we assume that programmers are critical thinkers.


See I don't think coding can get any easier for the level of problem we are talking about here. The kind of small problems we seem to be talking about are the kind of thing that is within the realm of "scripting". You don't need a deep understanding of Computer Science or advanced data structures to write good enough code for those problems.

"Coding" will never go away before AI because at some point you need to formally express the design you have thought up. It doesn't really matter if you do that with arcane syntax, symbols, or "natural" language.


> "Coding" will never go away before AI because at some point you need to formally express the design you have thought up. It doesn't really matter if you do that with arcane syntax, symbols, or "natural" language.

It sounds like you are saying, and I'm not disagreeing with you, that the act of speaking requires coding.

That is, speaking is taking input from the senses and memory, converting that input into a natural language (coding/encoding) through thought and saying the final thought out loud.

How does this help in understanding the differences between typing out syntax, manipulating symbols, expressing things in natural language, or an AI using its own coding approaches?


No, I am saying that coding will always require some kind of formal expression, and that the difficulty of the grammar used is not the major barrier (as these issues can be dealt with by compilers and IDEs).

Attempts to create "natural" language programming haven't worked because actual natural languages are too informal and imprecise for simple algorithmic comprehension.

A system that can comprehend natural language and turn imprecise specifications into a program should be an AI.


Modern programming isn't hard, you say; for a programmer, I'd like to add. I have never been able to program well with something like a programming language, as I can't seem to hold enough information in my head structured that way.

Building giant spreadsheets is easy though. I don't think programming is for everyone, but more people could learn than today.


Sounds like you need to modularise the problem more, which you are naturally forced to do by the limitations of spreadsheet "programming".


That's why SharePoint is so prevalent in corporate networks, despite being hated by programmers and users alike.


> Yeah, we need multiuser excel

Welcome to seven versions and seventeen years ago: http://www.techonthenet.com/excel/questions/shared.php


For a rudimentary definition of multiuser, that works. :)

I was thinking more of an Access/Excel fusion that has a more robust backend, but sure.


> Programming for non-programmers is an oxymoron to begin with. Even using Excel is programming.

Ya, it should be something like "Programming for non-coders". We used to say we would "program our VCRs". Somehow programming has become synonymous with coding but, as you pointed out, using Excel is programming.


We've been waiting for a long long time now.

http://www.cs.nott.ac.uk/~cah/G51ISS/Documents/NoSilverBulle...

Abstract reasoning doesn't become any easier because it's done in the form of pictures.

If any of this stuff was actually remotely possible, we'd all be building our software from UML diagrams right now.


Right tool for the right job. Wiring UI controls and data sources together in LabVIEW is easy, but it all falls apart when you need to implement a non trivial algorithm.


These vendors (like Pega Systems) will never go away, because they always find ways to dupe senior management into believing that there's a silver bullet to software development.

So as long as there are non-tech savvy senior management around wielding disproportionate influence over IT procurement, billions will be made selling Hocus Pocus IT solutions. "Never write code! Use drag and drop interface! Let business make on-the-fly changes!"


Well, it kind of evolved into 'programming for everyone!' nowadays.

And I believe, that is the wrong direction.

Sure, we need to make learning programming more available. There are A LOT of people in non-English speaking countries where the language is a barrier for entry.

Many people say that people could make tools for their own use. It looks easy for us programmers, who eat and breathe code. But for people with no background in programming, it's a lot to learn.

Let's make programming even easier to study, but please stop shoving it down people's throats, pitching it as something you can learn in a weekend when you are bored.


I wasn't aware that English was used as a programming language.

Seriously though. Most languages are relatively small (tens of reserved keywords, plus a handful of symbolic operators).

I don't think that changing the keywords to be based in another language would eliminate the complexity of getting the syntax correct, nor would it eliminate the logic required to understand the problems a level higher than that. And I am fairly sure there are tutorials available in other languages. I suppose documentation and libraries will be a different matter though.


Libraries are central to the role natural language plays in programming and the barriers it creates: programming languages ship with built-in libraries, which amount to a large vocabulary of more or less strange concepts. And finding and grasping libraries is the way to effective programming.

Imagine turning the tables and learning programming with the APIs in Chinese if you can't distinguish the characters let alone "spell" them.

Even those who can program often hit a wall in the land of Haskell with monads, monoids and functors, which are more scary words and new formulations than complicated concepts. Heck, I'm sure some people first stumble in Hello World because it has little to do with a printer.

Also outside reserved words: How's an "object" different from an "entity" or how are "variables" in programming like "variables" in mathematics?


It has nothing to do with reserved words, it's about learning resources.

>>> And I am fairly sure there are tutorials available in other languages.

It depends. If we are talking about Russian, Spanish, Chinese, etc - definitely. But it's a totally different situation in smaller countries with their own languages (I am Lithuanian).

Of course at some point many people learn English but it's hard for teenagers who want to learn programming and don't speak any of the major languages.


Silly gimmicks like that are not the way programmer jobs get eliminated. More powerful programming languages and better tools are what do the job. That and the ever-expanding universe of libraries for every task under the sun. If you can hire one programmer to use an open source library to build an application that'd take 10 programmers to write from scratch, you eliminate 9 jobs (obviously).


I have noticed the same. I am now building a database for my work with a good web framework, something that would have taken a team of 5 a few years back.


And enterprise software development has been going in the opposite direction. I work for a $100 billion company. Some days, I want to stab myself and throw myself out the window.


I find enterprise development pretty fun these days. Now that many of the easy things are done, we're being asked to do complex things with the expectation of a fit and finish like the apps they're used to using elsewhere.

But enterprise is so varied -- some places are amazing, some places are terrible. The trick is to find a group with whom you love going to battle. Find people who are looking to continually improve. People who believe in the campground principle -- leave code better than you found it. People at every skill level who are willing to get better.

If that's not your group, find a group that is. Better yet, inspire that in your group.

It's not about the company -- it's about the people with whom you sit at lunch.

(Okay, and sometimes the business owners... yeah, sometimes they suck.)


I'm amazed at how many brilliant minds I find in the enterprise. Most of them are rolling boulders uphill. Sure, there's a ton of slackers and dead weight, but there are a lot of people who would be the envy of any startup.


All of us are rolling boulders uphill.


some of them are just pretending to roll boulders, but would crumble if others stopped :)


In these situations, I believe there are two types of people:

A: "Wow, things are really messed up. I'm out of here."

B: "Wow, things are really messed up. There's opportunity everywhere."

Either can be right, depending on the person and situation.


I'm a B, for sure. I'm trying to create new tools to deal with the "things are really messed up" problem... in part because I can create a working business and get myself out of the enterprise, and in part because I genuinely want to make this industry better. What I like about what I'm building is that it changes the human/computer ratio in debugging and problem-solving for complex systems.

Have computers do what computers do well, so people can do what people do well. That's the core of my approach to building software. A big part of the reason enterprise development sucks is because debugging is so manual. Automate parts vulnerable to automation, and engineers won't just be more efficient - they'll be happier.


At a megacorp, as a developer, you have as much chance of changing the culture as you have of stopping a hurricane with a fart.


There are two Bs, though: one sees opportunity to sell change. The other sees opportunity to perform arbitrage against the inefficiencies.


Actually, A and B are the same person. Just two different phases.


That age-old saying: "Change your job, or change your job."


Opportunity to fix other people's mess at the expense of your health (since you are not likely to get dedicated time for refactoring) and have a "thank you", while CEO gets an oversized bonus?


ok coach.


All those tools you talk about still need to be maintained.

Anything built for "non-programmers", as you say, is also many orders of magnitude more complex, and at some point that complexity will grow to where it is cheaper to replace the whole system entirely rather than upgrade it.

APIs and stuff are easier on some level, but distributed systems are also much more complex. Upstream APIs change, or get EOL'd, or change license.

Actually, instead of listing all the reasons: every point you made is just going to end up with more software being developed. It all feeds into itself. Endlessly.


This also means that design is playing a proportionally larger role in startups.

A lot of the engineering work needed to build the next Instagram/WhatsApp/etc has become commoditized. Design is what's left.


Wow. That's actually something that hadn't really struck me until reading this comment. While there is development to be done in every project still, the upfront amount is growing smaller and smaller with the (P|I|S)aaS techs available to companies. A lot of it does really come down to making the front end of a product.


This is my feeling as well. I'd add the caveat that design shouldn't be interpreted merely as "front end work" or "ui". You also have to design your algorithms, data models/structures, features, and an optimal user experience.


Building Instagram required a lot of engineering work?


Here's why this is, at least on a certain level, wrong...

Software complexity expands to exceed available tools.

This is a Big Truth. No matter how fast hardware gets, we'll beat on it until it's burning up. Give us better tools, and we'll just write more and more complex software until the tools break down under the complexity. Give us better processes, and we'll make bigger projects.

Better tools don't make it any easier to write software, never have, never will. They just change the kind of software we can write. You don't buy a Porsche to drive 55 more effectively, and you don't use better software development tools in order to work less.


Why aren't developers responsible for managing the complexity of their project?


They do - if they have ownership too. But project managers don't like to give up ownership; they just push for more features in less time. There goes your hope for responsible complexity management...


You know, every time we create some innovation that lifts a burden from programmers, we invent some other crap to waste their resources on.

Circa 2010, enterprise Java managed to take all the benefits of automatic memory management and waste them on absurd abstractions and XML permutations (you throw some settings into XML and see what happens).

The universe is conspiring against us shipping real code.


LOL, so true! I'd say the opposite is true: software isn't eating programming in any significant way. At most, it's eating the overhead you describe.

Breakthroughs? Show me!


Competitive market forces will keep programmers busy for quite a while. As soon as it becomes easy to do things like make basic online stores with Shopify, stores will want more fancy functionality to differentiate themselves from their competitors.


Economically, these are not different from technology in general. We can do more with less. There are two sides to that statement.

We can do with less. The first two points are especially relevant to this side of things. The most "talented" can be extremely productive. What if Google search & Gmail could be built and run by 14 talented individuals? DuckDuckGo seem(ed) like an experiment in this direction. Doing with less is not necessarily painful or revolutionary though. Often, it's smooth. The answer to a lot of "how did people do ___ before computers" questions is "secretaries."

We can do more. That has been the overwhelming result of technological change up to now. Better tools -> More stuff gets made. Most of the resources freed by "doing with less" go into doing more. Agricultural machinery frees peasants to work in factories. Industrial robots free laborers to work as social marketing content creators. Progress.


It's not hard to imagine rote programming becoming a universal skill among high school graduates. I could even see this happening within our life-times; it's really only a matter of societal will and available teaching staff. The basics are... basic.

If this happens, then APIs and libraries that make development easy and intuitive for end-users could easily become as in-demand as well-designed websites and mobile apps are today.


I can see general programming being added to a school curriculum, as most people think it's a guarantee of a good career (for some reason).

I don't mind this mindset, and would enjoy teaching such a course.

I look at it this way, though: we theoretically teach every single American student how to write; has that caused an increase in high-quality literature being produced? Theoretically, every high schooler can read; I'm curious what percentage of adults actually read literature for fun?

Most long form writing (which I'd consider to be analogous in complexity to large scale programming) is not very common. The average adult's writing is quick and "dirty". I'd expect to see an increase in programming similar in size, scope, and usefulness to email, text, lists, and meeting notes. It would be personal, private, and not meant to last. Tents, not marble palaces.

I think a much more valuable tool would be to teach everyone git (or something easier but still with good branching). For quick and dirty, excel is fine, and some improvement on it likely will come in our lifetimes. What I think is far more valuable is a way to version all text a person produces, with branching and history.


> we theoretically teach every single American student how to write, has that caused an increase in high quality literature being produced?

Maybe, who knows? How many American authors would have been illiterate if we didn't cast a wide net?

More to the point, you don't have to contribute a great piece of literature in order to get some value (and give society some value) from basic reading and writing skills. Basically everyone does some reading in their everyday life which is crucial to their level of productivity (and personal enjoyment).

> I'd expect to see an increase in programming similar in size, scope, and usefulness to email, text, lists, and meeting notes. It would be personal, private, and not meant to last. Tents, not marble palaces.

Very much agreed. I don't mean to suggest that software engineers would be displaced in any meaningful proportion, just that we could increase the productivity of everyone by a little bit :-)


The problem with quick-and-dirty Excel is that the temporary gaps it fills become permanent solutions. Then, a couple of years down the line, too many parts of the system hinge on an unreliable, error-prone Excel sheet, when something better should have been written.


Well that's just it, what is "better"? Currently, the software to replace a typical excel spreadsheet is prohibitively expensive. (As someone who currently makes a "spreadsheet replacer", I've heard our BA regularly comment that it takes us tens of thousands of dollars to add in features he was able to add to his spreadsheet in a few minutes.)

I recognize that we are building a cathedral meant to last, and he was building a tent that was error-prone and easy to break. The thing that gets me is how fast he was able to turn data into the answers he needed. Why don't my tools let me put something that fast together?

I've become lately interested in the idea of a "better excel" that could solve the issues excel has: easy to "forget" a new row or cell, hard to version, hard to share, stuck behind a complex gui, hard to add bigger data, etc. I think such a tool could greatly increase the "general programming" of the world far better than just teaching high schoolers to program. Most people who would program either do so for someone else (trying to automate something) or for work because they need to get an answer out of some data. I think excel IS a programming language, it just has a gui instead of text. I think if that was improved upon, it would meet a big need.
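The "Excel IS a programming language" point can be made concrete with a toy model: cells hold either plain values or formulas over other cells, and the runtime recomputes on demand. All cell names and formulas here are invented, and real spreadsheets track dependencies and cache rather than recomputing everything, so this is only a sketch of the idea:

```python
# Toy "spreadsheet": cells map names to values or formulas
# (functions of other cells); get() evaluates on demand.
cells = {
    "A1": 10,
    "A2": 32,
    "A3": lambda get: get("A1") + get("A2"),   # like =A1+A2
    "A4": lambda get: get("A3") * 2,           # like =A3*2
}

def get(name):
    v = cells[name]
    return v(get) if callable(v) else v

print(get("A4"))       # 84
cells["A1"] = 100      # edit one cell...
print(get("A4"))       # 264: dependents pick up the change
```

The GUI grid is essentially a friendly front end over exactly this dataflow model, which is why improving on it looks so promising.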


I (pessimistically) don't think that creators will ever be the majority. Maybe in the future it will be easier for all who create things to incorporate custom software, but I don't think that the majority of the population will ever feel the need to create their own software, even if it's as easy as editing music with GarageBand or manipulating images with Photoshop. I would love to be proven wrong within my lifetime though!


What do you mean by creators?

I know very few people who don't create something. Most people do so at their jobs, and those who don't usually have a hobby or two (sewing/knitting, hobby construction projects around the home, gourmet cooking, etc.)

So I think you're using the term to mean something else.


> It's not hard to imagine rote programming becoming a universal skill

Rote learning is a process of memorization that maps from an exact context to a specific answer (3 * 3 = 9).

Writing a program very rarely has the exact same context to which a rote learner can map a solution. There may be aspects of a program that can be mapped to a specific solution, but I really feel this will never be the case for an individual coming up with a complete solution to a unique, never solved before, problem.

Critical thinking is required to come up with a solution to a unique, never solved before, problem.


> Critical thinking is required to come up with a solution to a unique, never solved before, problem.

1. Perhaps I'm being slightly too liberal with my use of the word rote. But by "rote programming", I mean the sort that requires no understanding of anything beyond CS1 concepts, the domain and your toolset. The programs are very simple and sometimes hacky, but get the job done (e.g. "Move all those files over there and then watermark them using this snippet I found on SO and then add them to the database so they show up on the website").

2. Even for many professional software engineers, often times most of the critical thinking is probably about the domain as opposed to the programming. Most people aren't (or shouldn't be) coming up with truly unique algorithms/approaches; rather, they are mapping new problems to tried-and-true techniques usually already implemented as libraries.

So it's possible to solve new problems (in the sense of your domain) with rote programming (of the sort described above), if all the critical thinking is about modeling the domain as opposed to the software.

Of course, every once in while, the "tried and true" technique is a piece of CS that only an educated engineer has heard about/knows how to correctly use. But my point was that there are lots of problems you can solve without getting to this level.
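The kind of task quoted in point 1 ("move all those files over there and then watermark them using this snippet I found on SO and then add them to the database") is easy to picture as a script. A hedged sketch below, with made-up paths, a stubbed-out watermark step, and an invented one-column schema purely for illustration:

```python
# Sketch of the "rote programming" task described above: move some files,
# (pretend to) watermark them, and record them in a database.
# All paths, the schema, and the watermark step are made up for illustration.
import pathlib
import shutil
import sqlite3
import tempfile

src = pathlib.Path(tempfile.mkdtemp())   # stand-in for the source folder
dst = pathlib.Path(tempfile.mkdtemp())   # stand-in for "over there"
(src / "photo1.jpg").write_bytes(b"fake image data")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE images (filename TEXT)")

for f in list(src.glob("*.jpg")):
    target = dst / f.name
    shutil.move(str(f), str(target))     # "move all those files over there"
    # watermark(target)                  # the snippet "found on SO" would go here
    db.execute("INSERT INTO images VALUES (?)", (target.name,))
db.commit()

print([row[0] for row in db.execute("SELECT filename FROM images")])
# → ['photo1.jpg']
```

Nothing here goes beyond CS1 concepts plus knowing your toolset, which is the point: the critical thinking is in knowing which files, which database, and why.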


> Most people aren't (or shouldn't be) coming up with truly unique algorithms/approaches

Rote approaches to solving problems are great when the problem is very well defined and repeatable (need to fill an order for a coke and two fries). However, as soon as the mapping changes even slightly, Rote learning[1] and, in my opinion, Rote programming don't work so well.

Even if every tool ever required to implement a software solution (algorithms/approaches/etc.) was available, we still need to map a given problem to one or more solutions. This is why I wrote "Writing a program very rarely has the exact same context to which a rote learner can map a solution."

When I lectured, students could "get" what tools were available to them. If I asked them to describe a sorted list, they could in great detail. However, when given a generic problem, it was a lot harder for them to select from the tools available to them ("Oh, so to speed up this program, I could use a sorted list"). That is the critical thinking part.

[1] "Umm, I've already told the register that you are giving me 10 dollars so you will need to give me that instead of $10.50". - This happened to me and is a big signal that the person had done rote learning their whole life.


This. Thanks for putting words to a thought I've had for a while. It's already happening. We're transitioning from web/mobile app demand to services demand, which explains the relatively recent explosion of the SaaS/PaaS/IaaS models.

Also, there's no reason to base it on an arbitrary threshold like high school graduation. The amount of rote programming I have to do (as a professional working in the field of 10 years) is drastically decreasing.


There's already debate over the merits of introducing this in the Utah public schools. I think it's a great idea.


Software eats software development is nothing new. Everyone who writes libraries has been doing just that for the past 30+ years. In fact, good software developers take pride in writing software that requires less code. It's in the inherent nature of software development itself. Come to think of it, the software field could be the most egalitarian of all. You won't see lawyers lobbying for less complicated laws!


"Developers have steadily marched upwards from Assembly to C to Java to, today, scripting languages like Ruby and Python."

Downwards in terms of language design, though; you can add JavaScript to the end of that sequence to make it even more obvious. Otherwise we would've been using some sort of sane-syntax version of Haskell by now. If software is "eating" whatever, then computers should be doing more for you, not less; hence even more pre-runtime checking, not less.

"... allowed hundreds of millions of semi-technical people to become software developers. ... these tools act as a force multiplier for the software industry."

Multiplying the code mess professional developers will then have to maintain? Example: converting Excel and Access spaghetti into sane programming models in finance.


Who moved to Java? Write once - debug everywhere.


Guess you don't code in Java, as that's bullshit. But maybe you're referring to GUI (as opposed to HTML) applications. And even there it's not true.


It's interesting that there is no mention of the so called "citizen developer". According to some Gartner reports very soon a fairly large amount (approx. 25%) of all business applications will be "written" by so called "citizen developers". These are mostly business people with minimal or no programming knowledge but are able to use software tools to create other "software tools". A good, VERY BASIC example is building an Excel "app" with some VB scripts...


Programming for non-programmers works best when they don't know they're programming. It's easy if you don't know you're doing it.


> General purpose tools for non-programmers: Since then, there hasn’t been much work in these areas, but from what I’ve seen that might change soon.

Really? Loads of examples. One just off the top of my head is Scratch: http://scratch.mit.edu/


I love Scratch and teach it in my gen ed class, but I'm not sure I'd quite call it a "general purpose tool" in the same way that Hypercard was. I could certainly see it evolving (or rather, forking) into such a thing, but as it stands it's a little too kid-education-animation-game-oriented.


Maybe it's worth mentioning Livecode here (http://livecode.com/); it's open source, it's nice, and by now it's quite portable; I have compiled it for ARM on the Chromebook (Crouton) and the Pandora.


Scratch is still programming. Following a recipe is interpreting a program: logical sequences of steps.


> Developers have steadily marched upwards from Assembly to C to Java to, today, scripting languages like Ruby and Python.

Since when does being a dynamic language and not having to have a main() function make you a "scripting language?" I didn't know people still said those words.


It's not the dynamism, or the lack of a main() that makes Ruby & Python scripting languages. It's the fact that you don't have to compile them before executing them.


The phrase "scripting language" makes less and less sense in a world where C interpreters exist (e.g. Ch, CSL, picoc, CINT, tcc -run), and in a world where Ruby, Python and PHP have compiled implementations (both JIT and AOT against various virtual machines and native targets).


what terminology do you think is more appropriate for these high level "glue" languages?


I think the point is that you cannot group them like that.

Statically typed languages vs. dynamically typed languages is a much better separation, but even then it is grey.

The pejorative "scripting language" was a term that Sun pushed really hard a while ago to separate them from "real" languages (i.e. Java) that you would trust your company to.

[Edit] I may have mis-remembered and it might have been Oracle rather than Sun trying to dismiss "scripting languages" as toy languages at the time.


> The pejorative "scripting language" was a term that Sun pushed really hard a while ago to separate them from "real" languages (i.e. Java) that you would trust your company to.

Fascinating. I do believe this without hesitation, but for the sake of sharing this in other social circles, is it the sort of claim that's possible to find a citation for?


As soon as I wrote it I have tried to find a source I can link to but my google-fu is failing me.

I will keep looking. It may have been Oracle rather than Sun.


Here's this from Oracle (I think), might be relevant:

http://www.java.com/en/download/faq/java_javascript.xml


I remember reading about it in an article where they got their hands on a "guideline document" for an expo (One of the big Java ones) for Sun/Oracle people.

In it, it basically talked about the need for constant low-level differentiation and playing down of scripting languages.

The link you posted above is the kind of thing this encouraged. Don't be actively hostile but more "Oh, a scripting language? How cute"


Sounds reasonable, but a bit outdated. Then again, who would've guessed that the historical ubiquity of JS, a scripting language, would eventually put it on the server side?


I have no trouble believing that any (or even many) players abused and over-extended the distinction for their various ends, but it appears Ousterhout (author of Tcl) was using "scripting language" before Java came on the scene. In that case, it clearly wasn't intended pejoratively. I would like better sources than I've been able to find, though...


I phrased it badly. What I should have said was:

"scripting language" used in a pejorative sense was pushed by Sun/Oracle.


Fair enough.


They aren't glue languages to begin with. People are building substantial applications with these languages.


>scripting language

It's just a categorization. Not an insult.


It's a categorization with no meaningful difference. There's no honest reason to create something like that.


It's a categorization that was created to be an insult.


Because they are more often used for scripting than anything else. Let's not add political correctness to programming.


"Because they are more often used for scripting then anything else. Let's not add political correctness to programming."

Citation? You're just making stuff up.


python.org

Look at main page:

"Python is a programming language that lets you work quickly and integrate systems more effectively." - it's about integration (where dynamic scripting can shine).

Look into history:

https://docs.python.org/2/faq/general.html#id6

"It occurred to me that a scripting language with a syntax like ABC but with access to the Amoeba system calls would fill the need." - it was created as a scripting language.

Look into FAQ about compilation:

https://docs.python.org/2/faq/design.html#can-python-be-comp...

"Can Python be compiled ...? Not easily. ... a “compiled” Python program would probably consist mostly of calls into the Python run-time system, even for seemingly simple operations like x+1. ... ... improves the start-up time of Python scripts. ... Note that the main script executed by Python ... Usually main scripts are quite short, so this doesn’t cost much speed." - no comments.

Of course now it's increasingly being marketed as a general purpose programming language because it makes devs feel better.


Marketing is not evidence. Stop making stuff up.

Mods: Please delete my account. I'm done.


Right, because people aren't building real systems with Python and Ruby. We're just building toy systems, duh! Scripting language is a pejorative.


heh, but isn't it also the reverse? it's an insult laid at Java insinuating that it is not nearly as lean/simple/modern. That's the feeling these flamewar phrasings leave me with anyway.

I left Ruby for the JVM and I'll never look back despite all these articles implying Java is not "modern" because it lacks all the features that make scripting language devs "so much faster". Rock-solid, efficient, type-checked code on an efficient platform should be considered modern as well :D

that said, i love Clojure so it's not like i'm a hater -- i just like Java interoperability & don't see the coding paradigms as mutually exclusive. Can't we all just git along?


For those in machine learning and programming languages: what would be the major hurdles in achieving voice programming (or at least scripting)? A way to interact with a computer so that one could ask things like:

"Computer, search google's rss news for arduino articles. Sort them by lenght, in the top 100 results search for the world circuit. Copy paste the paragraphs containing those results into a text file. Send it to Jane".

Is it really that far off? Wouldn't such a program enable some kind of programming for non-coders (assuming, of course, that one would need to learn some rules and have a clear idea of what one was looking for)?
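Part of what makes this hard is visible if you write out what that sentence would have to compile down to. A hedged sketch of the middle steps, with a hardcoded stand-in for the feed and the delivery step left as a hypothetical stub (`send_to` is not a real function):

```python
# A sketch of the machinery the voice request above implies.
# The feed content is a hardcoded stand-in; real RSS fetching, ranking,
# and the "send it to Jane" step are all elided or stubbed.
import xml.etree.ElementTree as ET

FEED = """<rss><channel>
  <item><title>Arduino circuit basics</title>
    <description>A long article about a circuit ...</description></item>
  <item><title>Arduino news</title>
    <description>Short note.</description></item>
</channel></rss>"""

items = ET.fromstring(FEED).iter("item")
articles = [(i.findtext("title"), i.findtext("description")) for i in items]

# "sort them by length" -- ambiguous! Here: by description length, longest first.
articles.sort(key=lambda a: len(a[1]), reverse=True)

# "in the top 100 results search for the word circuit"
hits = [desc for _, desc in articles[:100] if "circuit" in desc.lower()]
print(len(hits))   # → 1

# send_to("jane@example.com", "\n\n".join(hits))   # hypothetical delivery stub
```

Even this toy version had to resolve an ambiguity the sentence leaves open (length of what? title or body?), which is exactly the kind of micro-decision a voice system would have to negotiate with the user.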


The problem is that it's already hard to be precise about what you're saying when talking to a computer just with text, so we'd need functions specific to each of those voice requests, more reliable voice recognition, and some AI to understand the English you're using. Even that simple request would be difficult unless we pre-scripted the functionality for that whole phrase.

It's not impossible, the technology just isn't there yet.


Question: does this follow the same trend as other relationships? For example, mathematics can be said to be the "software" of high-level physics. Do we see the same trends, in that there is a high level in mathematics where new theories are being created, but a lower level where a large number of people use tools to apply these new theories?

If so, then are we facing the same exact problem, just in a different field?

My take on it is that we never fully mastered teaching how to come up with abstractions and new models; tangential proof is the large number of religions created to come up with new models of explaining how the universe works.


> do we see the same trends in that there is a high-level in Mathematics where new theories are being created but a lower level where a large amount of people use tools to apply these new theories?

Definitely.

If you allow longer-running examples, this is exactly the situation of calculus. The Calc I-III that many use regularly is the application; Real Analysis is the theory. Ditto for any "applied" sort of math you can think of (stats, crypto, machine learning, etc. -- all basically just use theorems, which are the "abstract interfaces" to pure mathematics).

> we never fully mastered teaching how to come up with abstractions and new models

And we never will, as long as the human race continues progressing :-)


> Software is eating the world, and doing so using smaller and smaller teams. WhatsApp was able to disrupt the global SMS industry with only a few dozen engineers. Small teams can have a big impact because software development (and deployment) has improved dramatically over the past decade.

While this is true, I'm coming to realize more and more that a lot of would-be highly needed disruption is just not feasible because laws and monopolies are standing in the way, much of it due to lobbying and sometimes patents. It's sad.


Halide makes me think we can give a bunch of that power that Moore's law gave us back to actual problem domains without making programmers any less effective.


Pretty much his entire list is wrong.

"Deploying a commercial website ten years ago required significant upfront capital"

Actually... you could get a very large website up 10 years ago on almost nothing. $30/month dedicated servers were available that could do 10,000 connections at once. Which means you could serve millions of people cheaply. You could also get lots of virtual shared hosts for $5/month or less.

Heck, even 15 years ago you could do that. Maybe not everyone realised it till later, but it's not a new thing at all.

"Startups created simple APIs that abstract away complex back ends. Examples: Stripe (payments), Twilio (communications), Firebase (databases), Sift Science (fraud)."

There were plenty of services around on the internet 15 years ago with APIs too. Including payments, communications, databases, and fraud. I know because I developed some of them.

"Open Source. Open source dominates every level of the software stack, including operating systems (Linux), databases (MySql), web servers (Apache), and programming languages (Python, Ruby). These are not only free but generally also far higher quality than their commercial counterparts."

Guess what? Open source was around 10 years ago.

"Programming languages. Developers have steadily marched upwards from Assembly to C to Java to, today, scripting languages like Ruby and Python. Moore’s Law gave us excess computing resources. We spent it making developers more effective."

Ruby and Python were both around ten years ago. The same as perl, php, and haskell amongst others.

"Special-purpose tools for non-programmers. These tools let non-programmers create software in certain pre-defined categories, thereby lowering costs and reducing the demand for developers. Examples: Shopify (e-commerce), WordPress (blogging), and Weebly (small business websites)."

um... blogging, e-commerce, and lots of other non-programmer tools were around in the 90s.

"General-purpose tools for non-programmers. In the pre-Internet era, tools like Hypercard and Visual Basic allowed hundreds of millions of semi-technical people to become software developers. Since then, there hasn’t been much work in these areas, but from what I’ve seen that might change soon. By allowing more people to program, these tools act as a force multiplier for the software industry."

Have you heard of the internet? Seen the reams of cut and paste code out there from semi-technical people? Whole industries are centered around letting less skilled people make software. This hasn't decreased, but increased. There's a whole "everyone should code" movement hitting us.

Is this the quality of article we want on this website? Please delete my account!? I'm done.


His point is not dependent on getting the number of years ago exactly right. Mentally replace "10 years" with 15 or 20 if it helps.

I don't really blame the author for getting it wrong. It's easy to lose track of how many years have passed as one gets older. I can still remember the Internet before the Web, so it can't have been that long ago, right? Right??


The way I remember it is ten years ago was the tail end of an era in which hosting was expensive and bandwidth was even worse and small sites that got a little more popular than expected would regularly go dark at the ends of months because they hit a resource cap.


Is this the quality of comment on this website... please delete your account!

What were the "plenty of API services" 15 years ago? I don't remember that many; you are talking pre-Google days... like flashing "Under construction" websites... If there were APIs, I think they were probably an accident ;)

Ruby and Python were both around ten years ago, and nobody cared. Python was primarily used as a CGI layer, and then later in Zope... Ruby just wasn't used (as far as I recall). After Rails and Django, people started paying attention, and then Google gobbling up the Python community worldwide changed that.

What were the Blogging/E-commerce tools around in the 90s? Do they compare at all with the ease of use/scale of those quoted?


The e-commerce shop sold to Yahoo in 1998 by pg, the guy who made this website. [0] eBay was also doing lots of business in the 90s.

Paypal was founded in 1998, and it wasn't the first e-commerce company by far. [1]

Perl, php, TCL via AOL and yes python were definitely used for much of the web. Google founded in 1998 used a little old language called... python. [2]

mod_python was released in 2000, which came after the non-CGI version of Python in the Netscape Enterprise Server... which also used JavaScript as a server language. [3] However, lots of people were using Perl for web services.

SOAP, designed in 1998, was one of the libraries or protocols with which people published APIs, along with CORBA and XML-RPC, amongst others. [4]

[0] http://www.paulgraham.com/yahoo.html

[1] https://en.wikipedia.org/wiki/PayPal

[2] https://en.wikipedia.org/wiki/Google

[3] https://en.wikipedia.org/wiki/Mod_python

[4] https://en.wikipedia.org/wiki/SOAP


... You still didn't mention a single API.

And yes, Python was in such complete disuse that only obscure academics used it (which is who Google was founded by). I mentioned in my comment that it only gained traction after Google and Django; before then it was niche Zope and CGI. All your citations don't actually refute it; in fact they strengthen it...

So show me an API that existed in 1998... all you showed me was that SOAP was 'designed' in 1998...

Are you really citing "Netscape Enterprise Server" as proof Python was in wide usage?

Paul Graham sold a business to Yahoo; wasn't it in Lisp? Did it have an API? Does it actually prove any of the points you made, or is it just a statement that "there was business on the internet," which I never refuted?

You were really scrounging to come up with pretty much nothing, so thanks for further illustrating my point :)


Livejournal was a pretty big blogging center that started in the 90's (well, '99, so the end I guess). If you look at Wikipedia's article on blogging, it seems blogging started in the mid-90's but was probably mostly done on self-hosted sites.


> Since the pre-Internet era, there hasn’t been much work in general-purpose tools for non-programmers, but from what I’ve seen that might change soon.

What startups might he be referring to?


Developers keep pushing the envelope to make tools easier to use and people keep coming up with increasingly complex requirements.

Neither side is ever content.


This. This is the reason why I never say "We can build X". We can never know because the requirements always get pushed an order of magnitude higher whenever we can actually build X.


"Software is eating software" is true, but not good. It's not good at all.

Take a look at the manner in which so many companies use project management software like Jira: to create a Big Brother system. Developer's velocity drops below 15 "story points" per week? There's your bad guy! Some companies use time-tracking software (designed to help an individual manage her own time better, and certainly not to abuse workers) to take micromanagement to a new level. Then there's LinkedIn, which may have been started with good intent but makes it astronomically more difficult for people to reinvent themselves (giving even more power to corporate management, which is thus empowered to fuck with peoples' careers long after they leave).

The real bad guys have been using us (programmers) for decades to wage their war on the workers. That's most of what they want us to do. Our effective purpose is not build flying cars or cure cancer, but to vaporize jobs for the poor and deliver the proceeds, efficiently, to the rich. Since we are also workers (we're the upper-working class) it's not surprising that we'd see our own weapons used on us.


> Then there's LinkedIn, which may have been started with good intent but makes it astronomically more difficult for people to reinvent themselves (giving even more power to corporate management, which is thus empowered to fuck with peoples' careers long after they leave

Just curious, how do you mean?



