One thing I liked about Ada is that the authors of the language also wrote down why they chose things the way they did, what was rejected, etc... There is a document about this called "The Ada Rationale", available on the web:
Yes, but many ISO documents are not publicly available (they're paywalled instead, even though ISO doesn't pay the authors). The Ada Rationale is publicly available, so anyone can see exactly why various decisions were made. That's useful for someone using Ada (obviously), but it's also useful for people designing any other programming language... you can often learn a craft by learning why people made various decisions.
I spent about 4 years of my early career working with Ada in the aerospace industry (specifically, I worked on the GPS satellite program). It was rather career-limiting because I ended up as a young engineer with a lot of experience in an extremely niche language.
It took a while, but I was finally able to escape and start working with more modern technology (involved a lot of self-learning on side projects and finally finding a rare employer that believed engineers are capable of learning new skills).
I do occasionally get emails from recruiters looking for Ada programmers - they're exclusively for legacy maintenance. Fortunately, I'm in a position to ignore them now :)
As a language, Ada isn't bad. I'd prefer it over C++ or Golang for sure, but there just isn't enough interesting work going on using Ada to warrant investing time in learning the language.
Ada is just a language. 99.99% of the experience you've acquired there transfers perfectly to any other domain or language - only uneducated HR gatekeepers won't recognize this.
Then (after you make sure you learn enough of another language to not embarrass yourself) be a little bit creative in your resume/CV in order to pass that stupid filter.
(by which I don't suggest lying, but rather making sure you mention you have some experience in a popular language even if your primary job didn't use it. E.g. mentioning the side projects that taught you C++ will let a recruiter tick that box)
I agree, I see nothing unethical about tailoring your various job-related communications to the type of person you're speaking with. Especially at large firms, HR employees are often quite literally just looking to check off boxes. Obviously you should not outright lie, but we engineering types have a tendency to take these sort of questions too literally.
I use C# at work. I've programmed in Java here and there and have done some small projects in it, but we don't use it at my office. If I apply for a Java job, and the HR rep asks me during the 1st round phone screen if I have Java experience, I'm going to say yes. When I speak with a more technically minded person, I can go into the full story, because they have the background necessary to put it in context.
This comment feels more idealistic than realistic.
Companies rarely want to pay for someone to learn on their dime unless there's a specific gain. It typically takes 3-6 months for an engineer to become truly effective and that's with domain knowledge transfer.
Domain experience is valuable and allows speed to get up and running.
One of the issues that does matter is how/where you are applying the language. Ada use is super niche, so unless you're looking to stay in the same domain, it does matter. As an educated, non-HR gatekeeper, I do care about the context of your experience.
It's a brilliant language, and the fact that it can limit you in the eyes of employers is one reason why I'm dabbling with other languages (including Go), and also with a syntax for a new one inspired by Ada.
I suspect that you wouldn't agree about the desirability, but I've long thought that doing a simple substitution on some of the terminals in the Ada grammar to turn it into a curly brace language would do a lot for adoption. For someone to do it the right way would mean a tool capable of also doing the reverse substitution on a curly brace text, which would produce output that could be consumed by "legacy" (non-curly brace-aware) compilers. With the rest of the grammar and semantics all staying the same, this would mean a level of fluidity that would allow existing Ada programmers and the programmers working in the hip new "Attica-B"† dialect to work together on the same projects with no more friction than the way that a lot of people already use gofmt.
> there just isn't enough interesting work going on using Ada to warrant investing time in learning the language.
You can help change that -- by doing interesting work! The nice thing about open source and self-directed projects is that you can use what you want based on its own merits, and leave bandwagoneering for when you have to find a job.
I don't know, if I'm trying to create an open-source project with the intent of making something that becomes useful and popular, I'm not sure I'd want to artificially limit the number of potential contributors by picking an obscure language.
That said, I don't really have a strong enough desire to get back into Ada anyway. I'm a big fan of Clojure and Kotlin at the moment :)
I was a fairly junior software engineer at the time, and took the job only because it was 2008 and I was unemployed. I had never used Ada, but the employer (rhymes with "Going") didn't seem to mind. My mother was kind enough to supply me with all of her old Ada books when she finished laughing about me taking a job using a language she had long since left behind.
The money was about... market rate for the time. Not good, not terrible. Better than taking a job completely outside of my field.
And precisely because it's a niche language, and you have the experience now, you should definitely accept those legacy maintenance requests, as a freelancer, and ask for big bucks. You're wasting opportunities here.
Ada is/was heavily used in the defense/aerospace industry. Most of these legacy maintenance requests come from defense contractors who do not want to pay top-dollar for expertise. They want to put butts-in-seats to soak up billable time and keep the govt customer just content enough not to cancel their contract.
So they'll happily hire anyone with a clearance. If they can't find someone who already knows the language they'll eventually just hire someone who doesn't and let them figure it out on the job. Defense contractors do not pay big bucks for niche skills on legacy maintenance programs. Not to mention, it is some of the most boring work in the industry.
Let me tell you a story. 25+ years ago there was a bid in my city for renovating the main postal office. Architects all over the city submitted their bids, as required by law, including the biggest of them all, who had basically the entire local council in his pocket.
At the time I was a student, working as a simple designer (AutoCAD) for an architecture teacher who had opened his private firm only 6 months before. But the guy went to the board in question on the last day with a different bid. He proposed bigger changes: not just renovation of the exterior walls, but also design changes to the interior space. By doing that, he made them go back to the Ministry of Internal Affairs, where the entire project, instead of being billable as a simple renovation, was reclassified, and the entire budget qualified for a superior plateau within the Ministry's budget levels. It jumped the qualification from (roughly estimated in today's $$$) $1M to $5M. And my boss at the time won. For the next 6 months I worked my ass off in AutoCAD to do all the changes, and I am very proud of it even today, as the building still stands as-is.
Now here's the takeaway from my story. Get interested in their proposals, see what the projects are about, and see whether they can't actually be helped to jump to a bigger price plateau within the government. You have the technical expertise; maybe it's time to build your relationship expertise as well. They will jump through hoops and all if you bring them something they can bill at a much higher rate to their client (the defense department). Then you can really enjoy a bigger pay. Good luck and have fun.
Here is another example: when evaluating which language to pick for their security-critical firmware, NVIDIA ended up picking Ada/SPARK.
At FOSDEM, the Ada room has been a continuous presence for more than 10 years, maybe even longer, as the language still gets quite some use in high-integrity computing around this side of the world.
The GNAT Ada compiler is open source, generates good code, and has really good error messages. There's a story behind its development, which I tried to summarize here:
Thank you for posting that. Especially interesting in light of Walter Bright's comment(s) in this thread.
I didn't quite understand why someone who wrote a C++ compiler would think Ada was insurmountable - this adds some context.
In my naiveté I'd have guessed that writing an Ada compiler was on par with Pascal (but with more types and stuff). This is probably the same (wrong) reasoning by which Mr Bright tricked himself into writing a C++ compiler ;)
Edit: note, I was introduced to Object Pascal - not plain Pascal - so that's the comparison I'm alluding to.
Oh, Ada is so much more complex than Pascal. I think getting something with the numeric types is doable in a smaller timeframe; the OO and tasking stuff would be much harder.
You could knock out an object-Pascal / Oberon compiler in a matter of weeks imo.
As noted in the article I posted earlier, Ada isn't that difficult a language to compile, it's just that early Ada compiler developers assumed it was hard, so they made it hard.
The GNAT Ada compiler is OSS, so you can look at it yourself. It has a hand-crafted lexer (for speed), but otherwise it's not complex. It uses recursive descent (for good error messages), but lots of compilers do that, and it's not hard to understand either. (If you've never written a recursive descent parser you'll need to learn how, but that's a one-time cost.)
Ada is really easy to parse compared to C++. There isn't the backwards-compatible-with-C historical baggage that makes things complicated.
Ada has built-in support for tasks, including safe communication mechanisms for them. Many other languages today do too, so that's not such a big deal today.
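To give a flavor of one such safe communication mechanism, here is a minimal sketch of a protected object (names invented for illustration; it would need to sit inside some enclosing declarative region). The runtime serializes access, so there are no explicit locks to get wrong:

    protected Counter is
       procedure Increment;             -- exclusive, read-write access
       function  Value return Natural;  -- concurrent, read-only access
    private
       Count : Natural := 0;
    end Counter;

    protected body Counter is
       procedure Increment is
       begin
          Count := Count + 1;           -- callers are serialized by the runtime
       end Increment;

       function Value return Natural is
       begin
          return Count;
       end Value;
    end Counter;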
Generating very good code is hard in any language. GNAT "cheats" by building on gcc, which already works on this problem. GNAT parses source code & then generates the internal structures needed for gcc to do its thing. Many language implementations parse & then pass things on to an infrastructure (gcc or LLVM), so that's not unusual.
PL/I is complicated in part because of its baroque "automatically figure out all the conversions" rules. Ada in general makes you be explicit about types, so that is a non-issue.
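For instance (a tiny illustrative sketch), Ada won't silently mix integer and floating-point operands; the conversion has to be spelled out:

    declare
       I : Integer := 3;
       F : Float;
    begin
       F := Float (I) * 2.5;  -- fine: the conversion is explicit
       -- F := I * 2.5;       -- rejected: no implicit Integer/Float mixing
    end;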
"Dewar personally told me that when implementing the GNAT compiler, whenever they looked at the language specification and thought it might be complicated to implement, they assumed they didn’t understand the language specification and looked for a simpler way to do it. This mindset that “there is probably an easy way to do this” seems to have been different from others, who seemed to assume that complexity was always an inevitable part of the job."
I've been on the losing side of this battle my entire career. While I thought striving for simplicity was core to the job, it's only harmed me. Though it's nice to read about another person who apparently thinks like me.
If anyone needs me, I'll be in the basement with my stapler, sipping my now-cold coffee.
(No NetBSD ones though, forcing us to use emulation)
Ada reminds me of SPITBOL, and SETL can be written in SPITBOL. R. K. Dewar's SPITBOL, written in a portable assembly language called MINIMAL, has been open-sourced. There's a version of SETL written in SPITBOL, too. Hopefully we will have the source for SETL one day.
The author's dissertation on setl is a favourite of mine:
Interesting. Never heard of SETL before. There is even a Wikipedia entry: https://en.wikipedia.org/wiki/SETL. References 1 and 2 confirm that it was indeed used to implement the first validated Ada compiler. Didn't know that.
Another factoid: It was actually SNOBOL (later SPITBOL from Dewar, who was a founder and president of AdaCore), not AWK, that was the first language to offer "associative arrays", which are simply called "tables".
I got to use Ada for eight months in 1997, on a co-op at Rockwell-Collins in their General Aviation department. I learned and wrote a lot of Ada and I loved it. When I had to go back to school in the fall, I was no longer interested in writing C++ and it kind of bummed me out to have to use it again.
I had a similar experience. I learned Smalltalk and then, when I tried to learn C++, I saw the bit-shift operator be used to output stuff to the terminal.
It took me 8 years to get over the initial disgust and learn C++.
Ironically, what got me into C++ in 1993 was being able to use a type system similar to Turbo Pascal's, which I had come to love (I was using TP 6/TPW 1.5 by then), alongside "cheap" compilers and widespread availability on home computers.
I just used C for a couple of months before being given a copy of Turbo C++ 1.0 for MS-DOS, and since then I have only used C when the choice was outside of my control.
It already felt primitive in 1992/1993 vs the alternatives, and so far C17 has hardly changed in that regard.
If you have the time, I'd love to hear a bit about why you ended up creating D rather than writing an Ada compiler? (Seeing how you wrote a C++ compiler...)
At the time, I had only written a couple toys. I thought writing an Ada compiler was impossible. It's also impossible to write a C++ compiler, but I didn't realize that and did it anyway (several years later, after I'd written a C compiler). I thought that C++ was just adding member functions and a few keywords to C.
Shame you didn't write an Ada compiler; the docs were available. It would've been interesting to see what you would've come up with. It'd still be interesting, I think.
A lot of the older closed compilers are still based around the program library idea, which is no longer required, considering GNAT is file / project based. I've only used GNAT and have no idea how the program library versions work.
I bought the original spec MIL-STD-1815 10-December-1980 back in 1980 or so, but it didn't survive a purge I made when moving. A couple years ago I went looking for it. It's not online anywhere, but I found an old library copy for sale and snapped it up. It's green, and I don't have any of the other specs.
I'm sad I also purged my DECSystem-10 "Orange Book". I've never been able to find a replacement anywhere.
Do you remember if the nickname was "Orange Book" or was it "phone book"? I didn't see the former when searching for more information, but the latter was referenced here [0].
If it's actually "phone book", then [1] seems promising.
It looks like the "Assembly Language Handbook" in the first reference. It was a very thick book, printed on very thin paper like a phone book. I clearly remember it as "Orange Book"; after all, it is orange, but that just might have been Caltech vernacular. (Caltech had many local words unheard of off campus.)
A marvelous find, thanks! I'm glad they haven't all vanished. I'd still like a scanned PDF!
Another thing is that the language backers made sure the ISO standard was available at no cost, which was not a given in the eighties, so students could read it.
I was told that the Ada guys fought with ISO to have per-paragraph numbering to ease translations and references; ISO (at the time, I don't know about now) used line numbering. But they lost.
I wish Rust had Ada's type system. It makes a lot of sense to define types specific to your algorithms and have the storage type determined at compile time.
Sorry, can you elaborate? I'm skimming through the documentation and I can't find a section on having "the storage type determined at compile time", except for the section on arbitrary-range integers.
Ada's type system allows you to separate the high-level specification of a type (which models a problem) and its low-level representation (size, alignment, bit/byte order, etc.). The language also requires explicit conversions for two different types even if their underlying representation on the hardware is the same.
Example 1:
    type Byte_Count is range 1 .. 4
      with Static_Predicate => Byte_Count in 1 | 2 | 4;  -- Aspects are Ada 2012

    type Component_Count is range 1 .. 4;

    V1 : Byte_Count      := 3;   -- compiler error: expression fails predicate check
    V2 : Component_Count := V1;  -- compiler error: requires explicit conversion
For these two types I'm not really concerned about how they are represented by the hardware, but I could if I needed to.
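For completeness, here's a sketch of what the compiler would accept instead, using the same declarations as Example 1:

    V1 : Byte_Count      := 2;                         -- satisfies the predicate (1 | 2 | 4)
    V2 : Component_Count := Component_Count (V1);      -- explicit conversion, now legal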
Example 2:
Extra constraints added to some pre-defined types:
    subtype String8 is String
      with Dynamic_Predicate => String8'Length <= 8;

    subtype Even_Integer is Integer
      with Dynamic_Predicate => Even_Integer mod 2 = 0,
           Predicate_Failure => "Even_Integer must be a multiple of 2";
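And a quick usage sketch (assuming predicate checks are enabled, e.g. with GNAT's -gnata):

    S : String8      := "way too long, sorry";  -- 19 characters: predicate check fails
    E : Even_Integer := 3;                      -- fails, reporting the message above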
Example 3:
Use big-endian for some network packets:
    type Packet_Type is record
       Header : Header_Type;
       Data   : Data_Type;
    end record;
Low-level representation (placed in the private part of a package spec):
    for Packet_Type use record
       Header at 0 range 0 .. 255;
       Data   at 0 range 256 .. 1855;
    end record;

    for Packet_Type'Bit_Order use System.High_Order_First;
    for Packet_Type'Scalar_Storage_Order use System.High_Order_First;
    for Packet_Type'Size use 232 * System.Storage_Unit;
Ada allows you to customize the size and memory representation of a lot of types. These are called representation clauses [1]. They can be very useful if you want to exchange data structures with a different language without marshalling. I have used them to interop with C++. It can be quite nice, and we had some very unpleasant packing bugs in the C++ code that were caught thanks to Ada's good error messages.
Overall, working with Ada was quite pleasant. The type system is nice. The module system is nice. Writing concurrent code was really nice. The Algol-like syntax somewhat reminded me of my days working in OCaml. I did miss true variant types, however.
I'd be very interested in someone with substantial experience in Ada/Spark providing some insight:
* Why did Ada not manage to get a significant foothold outside of some small domains?
* What's great, what's not so great about it?
* How "modern" does the language feel, including the tooling, documentation, etc? Specifically Spark 2014. ( I see a LSP server and VSCode plugin, for example)
* Spark2014 seems to be open source, with a GPLv3 license. Is this what most companies use, or are there significant closed source parts that must be bought?
Are most companies still on older versions?
I don't know SPARK and haven't touched it, but... I had to learn Ada 9X at uni in 1995, I've been using Ada for the last 15 years, and I've been on #Ada on Freenode for longer.
1) The creator of #Ada, caracal, once said that when he was in the army, there were two groups of people: i) those who were interested in learning about Ada, and ii) those who were totally against it without having seen any of the language at all. The second group were the ones who, when they came around with the green manuals, just refused to even read them.
2) I went back to Ada after burning out in a shitty games company, essentially sitting in a debugger for 19 hours straight isn't good. The only time I have to use a debugger is when using pointers, Ada allows you to avoid pointers for the most part.
Ada's type system, data modelling, it's unparalleled anywhere else ever.
There's not enough people working with Ada in OSS.
It's easy to burn yourself out on a big project sometimes.
3) Very modern, the docs are good, but the tooling is lacking. But then I come from a time of command lines and no package managers.
I don't have extensive experience, but I learned it a few years ago and believe I can answer some of your questions. This is subjective; others might disagree or perhaps add their own experiences.
> Why did Ada not manage to get a significant foothold outside of some small domains?
Probably mostly due to licensing issues. Ada has a runtime that you basically have to use and the most complete AdaCore version forces you to release executables under GPL. The FSF version is less complete (many libraries are missing) and allows you to use the much more permissive mGPL version. However, it always lags behind the version from AdaCore and many developers in the past were scared away by the licensing issues.
Other reasons often mentioned:
- Ada has a certain (perceived) background in the US military and aviation industry, which some people don't like.
- Ada is a very complete language and not easy to learn. The syntax is special, in that it specifies each construct on its own and re-uses keywords.
- Overuse of pragmas and "glued on" Unicode string handling
> What's great, what's not so great about it?
Great: It's an extremely fast and safe general systems language suitable for all kinds of programs. It allows high level and extremely low level programming and is suitable for embedded programming.
Not so great: The type constraints for generics are not as expressive as they could be. Aliasing rules ("pointers") are stricter than they would need to be. There are a few oddities with anonymous vs. named types to watch out for.
> How "modern" does the language feel, including the tooling, documentation, etc? Specifically Spark 2014.
Tooling and documentation are excellent, thanks to AdaCore and several good books, as well as the official Ada reference. The language does not "feel" modern, because many libraries are old and appear to be outdated. In reality, old Ada code will just compile and run, so there is no need to constantly update and maintain libraries.
> Spark2014
Can't answer that question.
> Are there any close alternatives that are production ready?
The GNAT Ada ecosystem, mostly maintained by AdaCore, is as production ready as a language could be. AdaCore expects you to buy a commercial license if you're doing safety-critical work. As for other compilers, that depends on the vendor, and most of them do not support the latest Ada standard.
> Ada has a runtime that you basically have to use and the most complete AdaCore version forces you to release executables under GPL.
See above.
> The FSF version is less complete (many libraries are missing)
Incorrect: AdaCore just packages their libs with their version of the compiler; with FSF, you have to build them yourself, which is a pain. But they are working on that.
> and allows you to use the much more permissive mGPL version.
MGPL hasn't been a thing for at least 10 years now. The FSF version is GPLv3 with a linking exception.
> However, it always lags behind the version from AdaCore and many developers in the past were scared away by the licensing issues.
And the cost of licensing from AdaCore. Not small company / startup friendly. Apparently that has changed, but you still have to email them for prices!
> - Ada is a very complete language and not easy to learn. The syntax is special, in that it specifies each construct on its own and re-uses keywords.
This makes no sense, all languages are "complete" to some degree. The syntax is not special, it's like any other, it's defined by a grammar, as are all languages. A lot of languages re-use keywords.
Thanks for the "corrections". Some of them are fine (I didn't know about the FSF license change) but with some I really don't agree.
> Incorrect, AdaCore just packages their libs with their version of the compiler, with FSF, you have to build them yourself, which is a pain. But they are working on that.
Various fairly essential libraries maintained by AdaCore are under GPL, so you could not use them for non-GPL software that is distributed. That's what I meant by "incomplete", because the FSF version was traditionally used as the free option for developing distributed, proprietary software. AdaCore has always blocked that by making essential libraries GPL instead of, say, LGPL.
If you used AdaCore it didn't matter, since either you were forced to make your program GPL anyway, or you paid for a commercial license.
> MGPL hasn't been a thing for at least 10 years now. The FSF version if GPLv3 with linking exception.
The question asked was why Ada was not more successful / popular. Your correction just adds more evidence to the licensing confusion. The problem is you can't even be certain it stays a certain way.
> The syntax is not special, it's like any other, it's defined by a grammar, as are all languages.
No, Ada has a rather special grammar in comparison to most other languages, because it specifies every construct on its own, and that is one of the reasons why it is harder to learn than other languages. Grammars for other languages are much more compact and use more shared non-terminals.
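To make the keyword-reuse point concrete, here's a made-up sketch; "is" alone appears in several unrelated constructs:

    type Color is (Red, Green, Blue);   -- "is" introduces a type declaration

    procedure Paint (C : Color) is      -- "is" introduces a subprogram body
    begin
       case C is                        -- "is" again, in a case statement
          when Red    => null;
          when others => null;
       end case;
    end Paint;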
> all languages are "complete" to some degree
That's not the point. Ada provides much more functionality than most other languages, for example memory alignment for embedded programming and OOP with tagged types. It's as expressive as C++ for systems programming. That's what makes it way more complex than languages like, say, Scheme R5RS, C, or Go. It's easier to learn than C++, but it's much harder to learn than a related language like Pascal or Modula, for instance.
> A lot of languages re-use keywords
Whataboutism? Languages that re-use keywords are generally a bit harder to learn than those with only a few constructs and no keyword reuse. You don't think so?
I've never met any Ada programmer before who thought Ada does not have a relatively long and steep learning curve. You'd be the first I've met.
> Various fairly essential libraries maintained by AdaCore are under GPL, so you could not use them for non-GPL software that is distributed.
That's not true, the AdaCore libraries are covered by the Runtime Library Exception of GPL and available on GitHub: https://github.com/adacore
And soon available in the package manager: alire.ada.dev
> The purpose of this Exception is to allow compilation of non-GPL (including proprietary) programs to use, in this way, the header files and runtime libraries covered by this Exception.
> I've never met any Ada programmer before who thought Ada does not have a relatively long and steep learning curve. You'd be the first I've met.
Make that at least two, I think Ada's pretty easy to learn.
One thing that makes Ada easier is that if you get something wrong, it's usually a syntax error that's caught by the compiler. GNAT's error messages are very good and will often point you the right way. The whole language was designed to try to catch errors at compile time. Obviously some errors will slip through anyway, but it helps.
Ada does have some features that you have to learn about before you can use them. That's true for any language. E.g., it has a task (thread) system built-in; if you want to use it, you'll need to learn it :-).
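For a flavor of it, a minimal sketch of a task (names invented, nothing real, just the syntax):

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Task_Demo is
       task Worker;                 -- declared like any other entity

       task body Worker is
       begin
          Put_Line ("hello from the worker task");
       end Worker;
    begin
       Put_Line ("hello from the main program");
    end Task_Demo;                  -- the main program waits here for Worker to finish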
Regarding 'why did Ada not manage to get a significant foothold outside of some small domains?' here are some factors that were relevant at that time (late 80s to early 90s):
* The internet was either practically non-existent, or in its infancy. At that time, getting compilers, documentation, and sample code was much more difficult and time consuming.
* I recall there being a fair bit of criticism about Ada being 'designed by committee' and having a reputation as such.
* C was still shining brightly and was overall well-liked. C++, being an improved C, was seen as a very desirable new language by the software development community.
I wonder if Ada and the GNAT company (AdaCore) have created barriers to entry that prevent a community from forming that could get Ada beyond the tipping point.
The GNAT and base library license confusion has been discussed in the thread.
In addition to that, I'm missing a succinct and to-the-point introduction to Ada. The signal-to-noise ratio of the standard "Programming in Ada 2012" book was freaking me out. I stopped looking into the language not because the language was hard, but because it was explained so poorly.
I spent a lot of time circa 2009/2010 teaching myself Ada for similar reasons. I work in avionics systems (or used to, present job is not) and Ada is grossly underutilized. I’d be willing to wager half the bugs that I saw make it to the testers or, worse, the field would not have happened if we’d used Ada (and used the type system properly).
And these days, using Spark, even more bugs could be detected before compilation even succeeds. But no one wanted to use it at any of my jobs, and that knowledge has mostly atrophied from disuse.
Using Spark contracts to specify and verify low level requirements should be a huge time saver.
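For anyone who hasn't seen one, here's a sketch of what such a contract looks like (an invented example, not from any real project). GNATprove tries to prove the Post from the Pre and the body; the same aspects can also be checked at run time as assertions:

    function Clamp (X, Lo, Hi : Integer) return Integer
      with Pre  => Lo <= Hi,
           Post => Clamp'Result in Lo .. Hi
    is
    begin
       if X < Lo then
          return Lo;
       elsif X > Hi then
          return Hi;
       else
          return X;
       end if;
    end Clamp;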
I too don't understand why everybody seems to use C/C++ nowadays. I guess commercial Ada is just too expensive.
There is a group trying to bring contracts to C++, and SPARK is something they have looked at for guidance. I'm not sure of the progress; last I checked there were conflicts on how to use them. That is, the compiler guys wanted to use the contract to optimize further, while the safety guys said that is insane: you don't know if the contract is proved.
It was the first language I learned in school... in Radford (Virginia), before my university switched to Java. Ada was/is popular with defense contractors, so a lot of schools in VA teach it.
It is a very safe language, being both statically typed and strongly typed. It was originally a procedural language, but later they added some OO to it. (For folks that don't know what a procedural language is, think of a language where all methods are like Java static methods, and you store data in some global space.)
It can get a bit wordy to program with it, but I also like the lack of the '{' brackets that C-like syntax uses everywhere. With some modifications, Ada could be transformed into a great/fun language.
Anyway, due to its small ecosystem and adoption, I think it will remain a 'defense/aeronautics industry' only type of language.
This is a simple program, and I think it is fairly elegant, but a bit wordy:
    with Ada.Text_IO;         use Ada.Text_IO;
    with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;

    procedure Check_Positive is
       N : Integer;
    begin
       Put ("Enter an integer value: ");  -- Put a String
       Get (N);                           -- Read in an integer value
       if N > 0 then
          Put (N);                        -- Put an Integer
          Put_Line (" is a positive number");
       end if;
    end Check_Positive;
Readable is a matter of education. As a C++ programmer I've learned what {} means, so that is more readable to me than begin/end, which happens not to be the block syntax in any language I know well. Considering all the languages I know, I've concluded that block delimiters are important enough to be worth learning as a separate symbol (looking at Python, where it is whitespace). Of course I can learn begin/end, but it isn't worth typing those extra letters when something that common can be done with one character.
Note too that {} graphically indicates the open/close in a way that begin/end doesn't, which is another reason not to like begin/end.
That's not necessarily the same thing though.
Verbosity can obscure the information you're trying to gather from the code you're reading. A "can't see the forest for the trees" kind of deal.
I cannot see the source code for the documentation. This happens to me in so many cases. I work with a couple of codebases that have 90% documentation, 10% code. In Ada, code IS the documentation that also can be formally verified. It is wonderful.
Oh, double-hyphen for comments! So maybe that's where SQL got its comment style from? I'm not sure which SQL spec defined it and what its overall availability is, but it's defined in DB2 and Oracle SQL, so I'd say it predates the SQL-1992 standard. I think it's also available in MySQL and pgSQL, with a required space between the dashes and the comment text.
Ada is basically boomer Rust. You don't need a specific reason to use it -- just jump in! My first Ada programs were crappy reimplementations of Unix utilities, just to see how the language felt when doing common tasks.
Man, there are so many of those languages for me. A lot of mainstream languages seem to mass-add features, but the two things I'd like to see people look at are Ada's type system and Erlang-like native bitstring handling.
Looks like a good tutorial, but it doesn't go into much depth on memory-management. The More About Types section has a little. It mentions that automated reference-counting is available through the GNATCOLL library, which seems to have since been broken apart into three different packages [0], so it now resides in gnatcoll-core [1].
We agree that it's a hole in the current curriculum, that we intend to fill at some stage with the advanced lessons.
The state of memory management in Ada is:
- You have a lot of facilities to stack allocate / avoid heap allocating tons of stuff that you would heap allocate in pretty much any other low-level language (see the sketch after this list).
- When that doesn't cover you, in Ada you're basically at the level of C++: you have refcounted pointers and unique pointers (enforced via limited types) in GNATCOLL, managed containers in the stdlib, and manual memory management. You have storage pools which, with the 2012 additions, are roughly similar to custom allocators in C++/Rust.
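Here's the sketch promised above (names invented): a dynamically sized object that simply lives on the stack, where C or C++ would typically reach for the heap:

    procedure Process (N : Positive) is
       type Int_Array is array (Positive range <>) of Integer;
       Buffer : Int_Array (1 .. N) := (others => 0);
       -- Sized at run time, yet stack allocated: no heap, nothing to free.
    begin
       Buffer (Buffer'First) := 42;
    end Process;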
Anecdotally, I have a friend/colleague working on a fun side project, https://github.com/Roldak/AGC, meant to plug a garbage collector into Ada, since the language is much more amenable to that than C or C++. (completely prototype/for fun project, hence why I'm not including it in the "serious" options at your disposal above)
Fun fact: GNAT (the GCC Ada front-end) will use a biased representation for a range type when packing tight:
    with Ada.Text_IO; use Ada.Text_IO;

    procedure T is
       type T1 is range 16 .. 19;
       type T2 is range -7 .. 0;

       type R is record
          A    : T1;
          B, C : T2;
       end record;

       for R use record
          A at 0 range 0 .. 1;
          B at 0 range 2 .. 4;
          C at 0 range 5 .. 7;
       end record;

       X : R := (17, -2, -3);
    begin
       Put_Line (X'Size'Image);  -- 8 bits
    end T;
For those not familiar with Ada and bit representations, the compiler is exploiting the fact that types in Ada can have constraints that limit their size.
T1 is a type of integer ranging from 16 to 19 (inclusive). Thus, it can only be four distinct values (16, 17, 18, 19). Two bits are enough to represent these four distinct values, so the binary representation of the type (in record R) can be reduced to 2 bits. The "biased" part is that the stored bits encode the offset from the lower bound: 16 maps to 00, 17 to 01, 18 to 10, and 19 to 11.
The type system in Ada was always so nice to use, and it was incredibly useful when using SPARK to really constrain the bounds of your types and make the runtime exception freedom proofs easier. I miss working in Ada, but I don't really miss working on those kinds of projects.
I heard that the only reason Ada had an "Integer" type, as opposed to just having range types with user-provided bounds, was because String indexing needed it - String is just a standard array of Character in Ada.
Very nice introduction. For those interested in a modern language that has a strong type system as in Ada and it is also growing safety capabilities à la SPARK, you might want to check out Nim.
Ada was the first language I learned at university back in 2003 (I had previously taught myself C/C++, Assembler and VB). It had a huge impact on me and influenced my entire career and how I approach software engineering. These days languages like Erlang and Haskell are carrying on some of the ideas (modularity, type safety, concurrency, etc.) in more modern incarnations. But there is still something special about Ada.
Their marketing folks should introduce Ada certification.
Many developers would jump through hoops to get any dodgy certification. I have seen dubious FEMA certifications and NFA certifications on some LinkedIn pages.
I have plenty of certifications, not because I value them, but rather because they are the way many corporate vendors make money off software used by the Fortune 500, who use them to gate access to their software.
Sales usually are done as X licenses + at least Y certifications.
Petty and inconsequential as this sounds, I'm convinced Ada would have gone farther if it didn't have capitals and underscores in all its libs, and long keywords like `procedure`. All that shift-key action makes Ada a physically painful language to work in.
One of the "Steelman" requirements that Ada was built to satisfy was that any program should be able to be represented with a subset of ASCII that would be compatible with the largest range of terminals, teletypes, and punched cards as possible. Less relevant today, but that's why Ada has so many keywords in it's syntax.
Seriously, the curly braces are a huge improvement. That's no joke.
The "fn" and ":" are good too.
I'm less sure about "include", because I doubt Ada does textual inclusion. If it does, then that is good too. Otherwise, better choices might be: using, use, require, requires, depend, depends, external, library, unit, module...
I can't stand that autocomplete. It gets in the way of actual typing. What if I want nothing in the list? What if I actually want to use the arrow keys immediately after that thing has decided to trigger?
It probably also cuts down on human ability, like never taking the training wheels off. In the long run productivity might be lower.
This is actually one of my few gripes with Rust. I see lots of code that starts to resemble what I call "cryptosyntax" (also a feature of C++).
In this regard, Ada explicitly had maintainability as a design requirement, and I wonder how well Rust will fare with its concise, symbol-heavy code in that regard.
Ada is case insensitive, so if you really wanted to you could write it in all lower or all upper case. Although, your source would be considered hard to read.
People had similar complaints about Python and all the space/tab/indentation. I like to think it was not just aesthetically pleasing but also way ahead of its time. If Rust had been released in '95, it may not have been received as well.
There are many industries that could benefit from Ada, one in particular being the games industry; but then, they'd also have to start designing stuff up front, and stop the feature creep, the constant changing of every little thing, and the crunch.
I like that the primitive types are so refined in Ada. That's something that is missing in a lot of languages. You can define subtypes or new types with ranges from the integer type, for example.
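For instance (a small sketch):

    type Temperature is range -40 .. 125;        -- a new integer type, incompatible with Integer
    subtype Percent  is Integer range 0 .. 100;  -- a constrained view, still an Integer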
Any recent popular languages with similar support for types like Ada's? Is the cost of implementing such a type system the reason not many other languages pick up the feature?
Ada is peculiar because the checks for the ranges are done at runtime. This means you need a small runtime. A lot of languages are inspired by C and C++, and the types are simple in those languages, so that could be an explanation.
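For instance (a sketch; "Some_Input" is a made-up stand-in for a value unknown at compile time):

    declare
       subtype Dice is Integer range 1 .. 6;
       N : Integer := Some_Input;  -- run-time data, can't be checked statically
       D : Dice;
    begin
       D := N;  -- the range check happens here, at run time:
                -- raises Constraint_Error if N is not in 1 .. 6
    end;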
Haskell has `Numeric.Natural` but I don't think you can define an integer constrained to a range. It probably goes beyond what the type system can do (statically). I think it's possible with Dependent Types (in Idris) but this is rather new.
Runtime checks are a compiler implementation issue rather than a language issue. With static analysis, the compiler can eliminate redundant runtime checks.
For example, if SPARK (an Ada subset) is used and the absence of runtime errors can be proved by GNATprove, then runtime check generation can be disabled in the Ada compiler.
GNATprove generates verification conditions (conjectures) from SPARK code and assertions. Then it feeds these verification conditions to a proof tool (Why3, Alt-Ergo, CVC4 or Z3).
You only need to go on programming language or compiler subreddits to see that they only seem to know C or C++, due to the "hey, I made a language and it's based on C(++), only better." I look, and it's basically the same mistakes each time: the for loop, the basic data types, etc.
Does any other programming language use "Ada case" identifiers (mixing uppercase and underscores, like "XML_HTTP_Request") instead of camel case ("XMLHttpRequest" or "XmlHttpRequest") or snake case ("xml_http_request")? Ada case avoids the ambiguity of lowercasing acronyms or smooshing words together.
(Yes, I know the actual capitalization of my example identifier in JavaScript is "XMLHttpRequest". That's what makes it such a good example. :)
Indeed. It always bugged me that languages like Rust chose snake_case for lowercase identifiers, but CamelCase for the rest. Surely it should be Camel_Case.
It's Pascal-inspired, but certainly not any of Wirth's syntaxes, with their half-line ifs and semicolons in weird places making blocks look odd. Ada has a more inclusive syntax, whereby "is" or "begin" and "end" encase what is inside them. Compare Modula's:
MODULE X;
...
END X.
and Ada's:
package X is
...
end X;
Ada's package feels like one statement containing others, whereas Modula's looks like it ends on the first line.
Pascal-like syntax seems to have been popular around that time. Outside the Wirth languages (Pascal, Modula, Oberon) and Ada, I was reading about CHILL [0] the other day, which is supposedly similar to Ada but was rather domain-specific. Sadly, I don't think any modern, free implementations exist.
Indeed, VHDL is supposedly based as much as possible on Ada syntax. Naturally there are several differences due to the purpose of the languages, but they are _very_ similar IMO.
Maybe somebody remembers better, but when I looked into it over a decade ago, GNAT had the problem that it didn't have a GNU library linking exception or some such. Does anybody remember the history?
Also, I still find the whole environment confusing for Windows. The GNAT Programming Studio IDE seems to be useful, but what ballpark is the pricing for the commercial version? Does anybody have an idea?
There are several GNAT compiler versions: the FSF version has a linking exception; the AdaCore Pro version requires a commercial license.
IMHO they should have renamed the FSF GNAT (1) to clarify the situation.
I was a freshman CS student in 2007 and this was the language that I learned in CS 101. I wonder how common that was.
Frankly, that was my first exposure to programming, and I thought the language was strange enough, and the prof wasn't good enough, that I switched majors. Still the only formal programming education I've had.
Totally unfamiliar with Ada, but reading below, it seems that type safety and concurrency are big selling points of it. In which ways would this be better than a modern functional language, say F#?
I've been intrigued by Ada for some years, but never really dived into it too much. I don't mind the language, but the tooling always seemed weird to me, and it feels like real-world use of the language has been relegated to legacy maintenance. Most of the time if I see Ada in a job description, it seems like it's something that's nice to know because there's a sliver of a chance you might have to rewrite or modify it, but the majority of work will be done in something like C++, which is a shame.
Shame that the tools for Ada go from bad to worse even in the best case scenario. It's no wonder nobody really wants to use the language when other languages have all sorts of tools and ecosystems around them that make everything a breeze.
Almost every language has a less advanced type system than Haskell. But compared to most mainstream languages, Ada's type system is still quite advanced.
So Ada's type system is better than an "average" language but not as good as a good language. Therefore, if I'm looking to trade up from an "average" language, it's in my interest to leapfrog Ada and go with a good language.
Actually, using Ada + SPARK, there are many static properties that can be verified which would be unverifiable in Haskell, and would be awkward to verify even in Agda or Idris.
But you've got your mind made up already -- please just enjoy your tools, and let others enjoy theirs.
It makes much use of English words rather than symbols. This was intended to maximise readability, even at the expense of writeability. It's not the way most modern languages are designed, sure enough, and I'm not sure it really improves readability, but I'm not especially au fait with Ada so I'm not the best person to comment.
> unintuitive
This strikes me as another way of saying unfamiliar. Forth and Haskell are also extremely unintuitive languages, to people who don't know them, but that's not really much of a criticism.
http://www.ada-auth.org/standards/rationale12.html
I did contribute the HTML version of the Ada 1995 Rationale (first language revision) back in the day:
https://www.adaic.org/resources/add_content/standards/95rat/...