Why aren't developers interested in Ada? (embedded.com)
54 points by lbrandy on Feb 9, 2009 | 52 comments



OK, so I read this article and I want to give this Gnat thing a try. I go to this AdaCore company's homepage and I'm looking for a place to download a batteries-included Ada distribution. Where? Where?

Finally I click "pricing" and I find a ridiculously long survey form of some kind I'm supposed to fill out. At that point, I'm gone and I'm never coming back.

That is why developers are not interested in Ada.

Compare the experience to getting Python: go to Python.org, click "download" on the left-hand menu, click an appropriate installer. Or if you're in Linux, fire up a package manager, type "python", see it's probably already installed, open up a terminal, type "python", type "2+2" at the REPL and see that you get "4" back, and go from there.


In all fairness, you can do apt-get install gnat, too.


Yep, yast and yum will do it too.

There is also the Hibachi plugin for Eclipse.


As stated, you can do the exact same thing in Linux for Ada as you can for Python.

As for AdaCore, you've gotten sidetracked onto their dual-licensed (and costs-real-money) compiler. The free stuff is at: https://libre.adacore.com/


True, but the free site is also more annoying than Python's: "Downloading the GNAT GPL Edition is free of charge and requires a brief sign-up process."


i've written lots of code in ada (dod related airplane systems). at the time, i was a student, and pretty sure i was programming myself into a dead-end skill.

ada, for me, was like a teacher who taught the entire class at the speed of the slowest student. it tries to prevent you from doing anything marginally likely to cause a bug, at the expense of speed or creativity. it assumes the worst in a programmer. good programmers are forced to program simply so that future generations of bad programmers can read their code.

(i'm very glad to never touch it again)


good programmers are forced to program simply so that future generations of bad programmers can read their code.

Is this necessarily a bad thing? I generally try to follow the "code such that nearly anyone can understand" mantra. It helps keep things simple and concise.


I actually don't like forced readability like Python's whitespace; I think it stifles creativity. The only way it makes code easier to read is through the forced formatting itself, while your code can still have hard-to-read loops, blocks, and variable names.

In a way, it stifles creativity for only a small gain in formatting/readability.

Plus, with modern IDEs it is very simple to format code the way you want. In Eclipse you have Right-click -> Source -> Format and voilà, your code looks the way you want.

Having worked with both python and java, I have encountered more hard-to-read code in python than in java (maybe this is due to the superior tools/IDEs in java).

As for Ada, I thought it was a great language to start learning programming. Exceptions are verbose, but it is an easy language to pick up, as it looks a lot like Pascal. I learned it my freshman year, but I never used it in a production capacity and I wouldn't consider doing anything serious with it right now.


'and the only way it makes code better to read is by forced formatting (the forced white space)'

Indentation is required when writing in any language as a matter of courtesy. Python just doesn't ask you to delimit your blocks of code a second time by unnecessarily requiring brackets.
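The "delimit your blocks a second time" point is easy to see in a small sketch (the `sign` function here is just an illustrative example): the indentation a courteous programmer writes anyway is the block structure, with no braces restating it.

```python
# In a brace language you indent the body *and* wrap it in { },
# so each block is delimited twice. In Python the indentation
# alone carries the block structure:
def sign(x):
    if x > 0:
        return 1
    elif x < 0:
        return -1
    return 0

print(sign(5))   # 1
print(sign(-3))  # -1
print(sign(0))   # 0
```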


at the high cost of crippled one line lambdas, no thanks


I'd argue that there is nothing crippling about it. You can define functions anywhere, if you need a multi-line lambda just define a function and be done with it. Multi-line lambdas are messy... this constraint keeps your code sane.

You can nitpick all you want, but the fact is I can start reading just about anyone's python code and feel like it's my own because styles are consistent across the board. A lot of this has to do with the tight community, but the fact that it's all reinforced through syntax constraints is a huge plus.
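The constraint under discussion, sketched with hypothetical examples: Python's lambda is limited to a single expression, so anything longer gets promoted to a named function, which is exactly the trade-off the parent comment defends.

```python
# A lambda may only be a single expression:
double = lambda x: x * 2

# Anything needing statements or multiple steps becomes a named
# function instead of a multi-line lambda:
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

print(double(21))        # 42
print(clamp(15, 0, 10))  # 10
```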


* Because there's a ton of C/C++ code out there. As such, there are lots of jobs asking for C skills. As such, lots of people learn C. As such, lots of schools teach C. Ada never hit that critical mass (as Java and C/C++ did) where people learned it because so many other people knew it rather than because it was cool. Why does everyone in the world learn English? Because more of the world's money speaks English than any other language. So, if you want one of those C++ jobs, you'd better know C++ - not because it's good, but because so many other important things use it.

* Because Ada isn't that fast. It's significantly slower than C according to the language shootout (http://shootout.alioth.debian.org/). Its one saving grace is that it's decently efficient on memory: not as efficient as C, but efficient enough that it can be used for things you wouldn't want to use Java or Lisp for.

To an extent, it was too little too late. C was introduced in 1972 and quickly gained fame. Ada, a decade later in 1983, did have some value, but it also had things many programmers would find annoying (like a syntax that requires more typing). You need to provide enough reason for people to switch; a tiny enhancement isn't always enough. Plus, companies like C because it's everywhere and it's easy to find programmers (whether that argument holds true is debatable, but companies still base their decisions on it).

There might not be anything particularly wrong with Ada, but that still leaves no catalyst to get programmers to learn it. There isn't a lot written in Ada (compared to C/C++/Java), so there's no "I'll get a job with it" incentive. It isn't cool like Python/Ruby/etc. It isn't widely used in academia like Lisp and Scheme. Ada became widely used in defense and aerospace, but that hasn't created enough of a catalyst.


Ada is a fine language. For those of us who did the Turbo Pascal thing and enjoyed it, Ada isn't that far off. There was this odd sort of period in my life: I learned Pascal, took classes that used it, and wrote fairly complicated programs that by and large worked. Most of the debugging I did was really logic debugging. Then I switched to C, and between the esoterics of doing C on a DOS x86 machine (anyone remember the difference between ASCII files and binary files?) and the language itself, I started spending a lot more time debugging C, and I put a lot more effort into making programs radically simpler just to avoid complex debugging. At times I kind of pine for those old Pascal days... Had I not experienced that era, I might have gone in a different direction entirely.

Through out the 90's I tought C++ to undergrads for a couple of years, and there was this disease of sorts: some went through so much pain trying to make C++ do simple things (and they had no understanding of what the machine was really doing; they were just trying to learn a language) that they either a) would drop out of computer science altogether or b) would have such a better taste in their mouth that they'd never fully learn Lisp or ML or really anything else and would always revert back to C++. Ada is quite similar to Pascal in syntax and sort of behaves the same way. FWIW, Ada is kind of interesting: if you've ever wasted a lot of time tracking down where your C program bumped the stack or leaked memory or something, then the things Ada requires of you are kind of nice in a way.

Seems Ada missed the timing on a number of issues. The first Ada compilers were built to be sold to the feds; the feds essentially designed the language and mandated its use, so compiler writers assumed they could charge $20,000-a-seat prices. They completely missed the market, largely due to costs. This put a very serious damper on things.

The other thing they missed on was tooling. When Java first came on the scene there was a vocal group of folks who were used to teletypes and modems and such, and the difference between "{" and "}" versus "BEGIN" and "END" was substantial to them. With a Visual Studio- or Eclipse-like programming environment, those differences pretty much vanish, but that argument took place in the early and mid 1990s. There are still people who feel that their coding ability is strongly related to their typing ability.

By the time tooling caught up and the GNAT compiler was around, Ada had accumulated a huge perception deficit. Also, the world has moved on, and a lot of developers want protected runtimes like the JVM, the CLR, or the interpreters of Ruby, Perl, and Python. There are still efforts to create higher-level C competitors (the D programming language, Eiffel, Modula-3, for instance); it just seems that not many people are interested.

I like to look at Rails, and I think a big part of its success has nothing to do with any technology so much as the effort they put into teaching their users a good, working structure for building web apps. They automatically generate the skeleton and you fill it in; by the time you've done a real-world production Rails app, you've done everything yourself and don't even realize it, because they got you started. Ada and Pascal sort of provided that structure for building machine-code applications, but people really rebel against that structure unless you sort of trick them into it like Rails does.


One thing I would add to that is that Ada at that time (Ada83, that is; Ada95 fixes this) lacked function pointers. There was no mechanism in the language to support the runtime selection of behavior that function pointers allow. Such a feature was intentionally left out of the language in order to facilitate static analysis.
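For readers who haven't met the feature: "runtime selection of behavior" is the dispatch pattern sketched below in Python, where functions are first-class values. C function pointers and Ada 95's access-to-subprogram types enable the same thing; Ada 83 deliberately omitted it to keep static analysis tractable. The names here are illustrative, not from any real codebase.

```python
# Which function runs is chosen by data at runtime rather than
# fixed at compile time -- exactly what complicates static analysis.
def add(a, b):
    return a + b

def mul(a, b):
    return a * b

# A dispatch table: the Python analogue of an array of function pointers.
operations = {"add": add, "mul": mul}

def apply_op(name, a, b):
    # The callee is looked up at runtime by key.
    return operations[name](a, b)

print(apply_op("add", 2, 3))  # 5
print(apply_op("mul", 2, 3))  # 6
```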

For the class of problems where you are willing to accept decreased expressiveness in order to get increased reliability (e.g. missile guidance software), Ada is great. But in a lot of application development scenarios, expressiveness is important.

As I said above, Ada95 fixed this, but it is still a bit cumbersome and before one could reliably use Ada95, it was necessary to indirect through C to do things dynamically.


When Ada was new, the compatibility suite for Ada also cost around $20k (at least that's what I remember). Since the name was trademarked, no one could sell a compiler without passing the suite. There was no way two guys in a garage could sell a compliant compiler without getting it certified. But making a C/C++ compiler and selling it for $25? That happened. So the only compilers in the beginning were from big companies, for big money.


geez. no coffee and I'm glad I "taught" C++ rather than grammar or English. I also meant to say "bitter" instead of "better"

It's moot, I need to proof-read more before submit.


Another reason is that many places that develop safety-critical software use coding standards like MISRA C, a restricted subset of C that's supposed to make code more reliable and static analysis easier.


Somebody wrote this comment on the original page

"Ada enforces the engineering principles that we should be following but the hacker mentality the dominates this field pushes back against that since they have a cheap & "easy" language like C to use instead."

What do you think about this "engineers versus hackers" debate?


I think the target environment makes a big difference. Quick iterations, hacks, and experiments are nice when creating a web app or an application prototype that can be refined iteratively. However, if you program a spaceship or a nuclear missile, you have to get it correct the first time, and the program has to work correctly under any conceivable input. I think this calls for different approaches and different languages.


I actually think the bounds checking etc. of Ada is good. If you had put that in a very C-like language, it would have been better accepted. Instead, Ada came with a bunch of readability 'improvements' that turned programmers off.


I don't think (some) hackers prefer less strict languages because they like to be sloppy, but because they honestly believe it is the better engineering principle in the long run.


Hackers don't work in maintenance perhaps?


Actual hackers (usually) make things that require minimal maintenance. The idea that code needs to be "maintained" (it's not like it degrades over time) is actually silly once you think about it. Maintenance is just a way of covering up for software that was poorly designed or has too many bugs.


Unfortunately, in the real world requirements are often poorly understood not only by the development team but by the customer (making it impossible for the "hacker" to create the right software the first time), and only after continued use of the software in production are the requirements truly fleshed out.

I'm not sure if you have had to re-engineer an app before, but it's not pretty regardless of how well built it was. This is what I consider maintenance, considering most applications have to be continually adapted for ever changing requirements.


For instance, three years after deployment, the production line is modified and the code must be adapted to suit the new configuration (which may contain devices that didn't exist when the code was originally designed). Do you consider that maintenance?


Praxis High Integrity Systems uses Ada to write large systems that are practically bug-free, for a fraction of the cost of standard approaches:

http://www.spectrum.ieee.org/sep05/1454

While most projects don't need perfect code, Praxis' success makes me think there is something to the engineering argument.


For me, personally:

    No guaranteed memory safety.
    No discriminated unions.
    Verbose syntax
    No metaprogramming/macros.
    Costly polymorphism (compared to templates, but not as bad as Java)
    Few employment opportunities compared to the mainstream.


One reason Ada is not used instead of C on embedded platforms is its RAM requirements. Many embedded platforms have 3 to 8 KB of RAM, while Ada's runtime needs something in the range of 100 KB.


Ada advocates always talk about these safeguards, but if it's so easy to see how much better and safer it is, show us some code. It is telling that the only snippet we see is from C.


The impression I get is that Ada's benefits are most apparent when implementing safety-critical systems, so it's hard to find individual snippets that show it off. Here is an open source project that supposedly demonstrates good Ada development practices:

http://www.net-security.org/secworld.php?id=6619

(The cynic in me wants to point out that the only snippet we see in the article is buggy C.)


The last time I looked, there were no advanced tutorials on the net (this chimes in with "don't only talk, show us the code demonstrating the benefits"), and the cheapest book on the market was $130. That was when I gave up, even though I suspect it is an interesting language, with things like calculating with dimensional quantities (you notice that half of your team uses imperial units when you compile the code, not when your probe crashes into another planet).

Maybe I wouldn't like the style of the language or maybe I would, but my limit for books about languages I am not sure I even want to know is well under EUR 100.


There is the Wikibook on Ada programming:

http://en.wikibooks.org/wiki/Ada_Programming

That's about as free as it gets. I have the "Programming in Ada 2005" book by John Barnes. I use both it and the Wikibook.


Ada has had a high level interface for multi-tasking for a long time. I'm surprised that doesn't get mentioned more often.


Dijkstra talks about it: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EW... . It seems that without DARPA, Ada would have faded away. Perhaps things have changed from his time, but the problem still remains: Ada appears to have something of a bad name.


Ada was the first language I learned in college. Our CS professor at junior college was a DOD programmer by day (most of our CS courses at the JC were at night).

Looking back, I think it was a good language to start with at the time (this was '94). Plus, it was nice having a professor who actually had current, real-world experience as a software engineer.


I had the unfortunate experience of being forced to use Ada as my primary programming language for three years during college. I hated it with a passion, though I can't tell you exactly why at this point.

Ada took all the fun out of programming. I think the major problem was that there weren't any good IDEs that supported it. The one we used was ugly and counterintuitive. Additionally, the compiler was very strict and the error messages it spit out were cryptic. We spent so much time just trying to get the syntax right that it discouraged us from exploring and trying new things, the exact opposite of what a good programming language should do.

Not only that, but only a few organizations actually use Ada. We graduated with expertise in a programming language no one used. A year later, someone made the decision to stop using it and switch to Java. I think that person deserves a medal of some kind :)


I worked at a company where we very seriously talked about adopting Ada. Let me tell you what else takes the fun out of programming besides an overly strict language: knowing that if the customer encounters a bug, he will send you an email, stop using your software, and use a competing solution until you ship him a new release.

At that job I shipped the least-buggy, most-correct code I have ever shipped in my life, yet I also had the highest sense of incompetence and failure I've ever had. The stress level when making code changes was incredibly high, especially since the code was mostly legacy (i.e., lacking documentation and tests) and introducing even rudimentary testing required rewriting large parts of the system -- which of course created bugs and made regression tests rather less useful than they should have been.

I even had a dream once about reading the gcc manual and finding a new warning flag, -Wsomethingorother. I was really excited. Then I was really disappointed when I woke up.

I REALLY wish that codebase had been written in Ada. The code was mostly correct (though hideously designed) when I inherited it; had it been written in Ada, the runtime checks would have caught a lot of cases where an innocuous-looking bugfix in one part of the code caused another part to start dereferencing stale data and mixing a subtle spurious signal into the customer's results. In the worst case, we would have had an immediate failure and a good head start on debugging. Ada might have actually preserved some of the fun of programming for me.

Nothing else I've worked on has called for Ada, though. It's a niche language and should definitely be considered whenever writing a mostly-static system where you're willing to invest a lot of time per loc to avoid bugs.


...if the customer encounters a bug, he will send you an email, stop using your software, and use a competing solution until you ship him a new release.

What kind of software was it?


The software was part of a scientific computing system. We were an upstart product competing against a decrepit seventies-era product with an eighties-era interface. The competition was markedly inferior in performance and usability, but it was a known quantity and thoroughly integrated into the customer's workflow. By comparison our product was faster and usually more accurate, but less predictable. Our new bugs were much more disruptive to workflow than the existing product's old, well-understood bugs.


"It makes one work very hard to get a compileable source file."

That seems to be a sufficient explanation to me... (Haven't tried it myself).


Ada does have a huge following... In its incarnation as Oracle PL/SQL.


Though PL/SQL is a subset that gives you only a flavor of Ada.

I think Ada just plain did not gain momentum at a time when developers looked for a new paradigm to aid in building larger, more complex systems. The nod went to OO and C++ was the lucky recipient of attention because of its relationship with C.

Now that OO, as implemented by C++ and Java, has not proven to be the Holy Grail, developers are looking for another paradigm. It appears to be a dogfight now between dynamic and functional languages for the nod.

Not to stray too far off topic, but Lisp seems to have failed to gain momentum for almost the polar opposite reason from Ada. Ada's OO stuff was a little late to the table when OO got the nod; Lisp was too far ahead of everything: programmers were too used to machine-level interaction and were not ready for high-level abstractions. (Mind you, I am coming at Lisp as a newbie who looks at what it has offered for decades and shakes my head that we passed it up.)

Languages often do not get a second chance, but Haskell and Lisp, as implemented by Clojure (or Arc), may actually become mainstream.

Ada falls into the same category, for me, as Eiffel. I know intuitively that they could reduce some common errors but they're just not interesting.


Yeah, an Oracle guy once told me that PL/SQL was inspired by Ada. And, having written my share of complex PL/SQL, it inspired me to never learn Ada.


Because it reads as if BASIC and C had an unholy love child: vaguely C-ish in style, but it lacks case-sensitivity and uses stupid stuff like BEGIN and END.

Note that I've never done straight-up Ada, just the derivative VHDL.


In other words: "I don't like the syntax."

I'm always skeptical of this argument; every programmer has a favorite syntax -- some prefer C-like, some prefer Pythonic, some prefer Lispish, etc. In most cases, it's not the syntax they first learned, but it's the syntax in which they do the majority of their work. This suggests that familiarity is the basis of syntax preferences, which supports the "you'd like it if you gave it a chance" school of thought.


I assume you mean that syntax is less important than other language features, and I mostly agree. However, while there are of course many variations of syntax on the same basic set of features, some variations are better than others. It may be difficult to say whether Python, Ruby, or C syntax is best, but that doesn't mean that some aren't better than others.

For instance, terse languages, as long as they remain understandable, generally seem better. Whether to use braces or indentation may be a matter of preference, but few would argue that a COBOL-like syntax is preferable.


My list of least favorite languages:

Pascal, Ada, MUMPS, in that order. But MUMPS has a few saving graces: it's quirky enough that you get some insights you would otherwise never have.

To me Ada is like Esperanto: it's designed to be universal, but in the end that means it fits nowhere really well and it doesn't feel natural to anybody.


Ada was adopted by the D.O.D because it fit their needs.

I heard once that Ada experience + US DOD Secret or higher security clearance was life-long job security.


"Ada was created for the D.O.D according to their needs."

Fixed that for you.


The fix is correct. I was there at the time, though not directly involved. There was a competition to design a language to replace the large number of languages and variants used by the military. Pascal was in vogue at the time, and three of the four proposals were Pascal derivatives. OO pretty much didn't exist; Simula and Smalltalk were there, but nobody in the competition was paying any attention to them. The requirements documents were a laundry list of things from every language but Cobol, including a number of ideas that had never been implemented in a production language (rendezvous comes to mind). The compiler for the language turned out to be extremely difficult to implement, and it took much longer than expected, not a first for the military. The colonel in charge was insistent that there be no subsets. When it finally arrived, the world had passed it by.


Perhaps Ada could be thought of in terms of Battlestar Galactica. Ada would be like a Centurion, clunky and mechanical. Lisp would be more like a "Six," organic and capable of independent thought. :)


Because it's far away, oh, what?



