ATC is much more impressive than my little story of Emacs in unexpected places...
My research group needed to very rapidly make an interactive piece, for a looming big open house event for sponsors and VIPs. For Reasons, we had trouble getting computers, but there were some ancient ones that didn't run much of anything. I'd been programming in Java heavily before this, but the solution to the immediate emergency of deadline plus resource constraints was... Emacs.
Emacs would run on the box we had available, and, with it, I could develop all the things I needed, including complex HTML generation, even faster than with Perl (which I also knew). I ended up having a new Emacs process exec'd for every request, by Apache CGI, and displaying in Netscape. Process startup of the dumped Emacs itself was fast, and it worked fine.
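For flavor, here is a minimal sketch of what a per-request, batch-mode Emacs CGI handler could look like. This is not the original code: the wrapper script, function name, and page contents are all hypothetical, and the real setup exec'd a pre-dumped Emacs with the application already baked in.

    ;; hello.el -- hypothetical sketch, not the original code.
    ;; A CGI wrapper script might exec Emacs once per request, e.g.:
    ;;   #!/bin/sh
    ;;   exec emacs --batch --load /var/www/cgi-bin/hello.el
    ;; In batch mode, `princ' writes to standard output, which is what
    ;; the web server returns to the browser.
    (defun my-cgi-page ()
      "Emit a complete CGI response with a small generated HTML page."
      (princ "Content-Type: text/html\n\n")
      (princ "<html><head><title>Hello from Emacs</title></head><body>")
      ;; CGI passes request parameters through the environment.
      (princ (format "<p>Query string: %s</p>"
                     (or (getenv "QUERY_STRING") "(none)")))
      (princ (format "<p>Generated at %s by Emacs %s</p>"
                     (format-time-string "%Y-%m-%d %H:%M:%S")
                     emacs-version))
      (princ "</body></html>\n"))

    (my-cgi-page)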
I don't have that code anymore, but some other non-editor application for which I used Emacs before then was a kind of code generator: https://www.neilvandyke.org/jomtool/ You can see Emacs was supporting objects and even syntax macros, which was very respectable for any programming language at the time, and this was only the extension language of a text editor. What you can't see is that Emacs was additionally a super-productive "IDE" for developing with its own extension language.
This is a great story - especially given that someone "almost hugging me" in Germany in a professional context is more or less the equivalent of a marriage proposal. He must have made their lives really, really better.
I witnessed the disconnection of a (large) sensitive system because the 5yo grandson of a high-ranking person, who was giving him a tour of the center, flipped the switch of an AIX system. The system came back with a 513 error code (I think I still remember it right after 30 years, such was the trauma).
The high-ranking person's jaw dropped to the floor when he realized what had been switched off; he looked at me, smiled and zoomed off with the kid. I recovered the system with a toothpick and a manual in Swahili and, boy, that was quite a night.
Fascinating story. I wonder what other dirty secrets state agencies have in stock for us tech stack-wise.
This being said, Excel is most likely the most widespread tool for anything that involves office work. I've seen entire financial algorithms being implemented using Excel alone.
About 15-20 years ago, apparently one of the major ISPs in my country was running off Excel. Every customer was a row in some file.
ISP agents would come to your house to collect your bill. If someone didn't pay, the agent would flip a cell in their Excel row, and that would trigger that person's internet being suspended. Sometimes they would flip the wrong person's cell, and you had to call them, and they would fix it.
Sorry, but I just don't buy this. As far as I know, there was no collaborative editing in Excel 15-20 years ago (no Office 365). The only way for several people to edit the same Excel file stored on a network drive would have been to a) make a local copy of the file, b) do the modifications and then c) overwrite the file on the network drive. This would have destroyed _all_ modifications that occurred to the file between a) and c), and would have resulted in such a high rate of errors that the company would have been simply impossible to operate.
EDIT: As many people have pointed out, Excel did support collaborative editing at the time.
Excel supported collaborative editing of binary Excel files on SMB shares under Windows NT, back in the 2000s.
Now, "supported" does not mean it was in any way a good idea, or that it resembled how collaborative editing works in things like Google Sheets or O365. Conflicts, deadlocks, etc. were common - it's even why I built my first Rails app: we were given an Excel sheet and asked to do data entry on it over the network from multiple machines. I didn't trust it despite a demo of it working, and built a quick and dirty webapp instead.
You could do it. Excel worked well with querying SharePoint. To most users it would be no different from just editing a file. You could also do it via Samba.
(Source: I built one of those during my school holidays but I didn't get paid and I didn't have the balls to ask for payment)
I never said it was one file. I think every agent had one file for their area (a few hundred houses). Then some Excel macros pulled all those files into one mega file, which the backend system dealt with.
There are entire trading stations built in Excel, used by multiple traders of one firm, synchronized via network drives, doing requests to the broker using VBA. I saw that around 2003.
About 15 years ago or so, back when I still did web development gigs and the like (because I hadn't accepted yet that the low level system development I loved was what I should do for a job, hard to imagine now), I got requests to build complex or not so complex systems, but with the restriction that "it has to be written in Excel". Most of the time, what they really needed was a CRUD app and a database. Needless to say, I never accepted any of them.
Yes, but from the point of view of an office (lowercase o) worker, Access is that thing that, in decreasing order of importance:
1) it's not clear what it does
2) it doesn't show them all the data at once all the time
3) it doesn't let them write formulas to work on the data
4) the few that saw Access in action say that they have to call a programmer to do what any Joe can do in Excel
So whatever Access is doing, Excel does it better, and they can't understand why Microsoft developed a crippled version of Excel to manage tables of data. At this point I almost don't understand it myself anymore, despite having seen all the years of Access and Excel ;-)
In a past life the most “fun” I had as a developer on a CAD file processing solution was discovering that engineers loved linking Excel sheets to the CAD files so crucial calculations could be done there. And then having to make this work on Linux servers.
I had to run a very large accounting Excel file inside a web service about ten years ago. The service was born as a manually operated Excel sheet plus interaction by email/phone, and they wanted to scale it on the web.
I ran it on Windows Server and paid somebody to write the Windows service that managed the Excel sheet. I can't remember which language I used for the web service itself.
They eventually gave up, due to commercial problems, not technical ones.
I'm torn on this. I think many SaaS-type services can be a spreadsheet, but Excel is only 98% of the way there. The main issue with Excel is that it is local only, and even then cannot be accessed by other local programs. Hence all the wacky VBA "scripts" that turn into full-fledged programs. I wish Excel _would_ become a database, that is, a frontend to a Postgres instance that could be run locally or on a server. Then we just set the PostgreSQL connection string in our app and everything just works: Excel is the frontend, special features run in a sane programming language, can be updated separately, etc.
Since that will never happen, someone needs to make an open source spreadsheet that runs on Postgres. And yes, it does have to be open source, because otherwise nobody will learn it. And it needs at least 80% feature parity with Excel.
I am well aware of ODBC. Critically, you cannot edit the database itself, only copy from it. This makes sense so as not to let some sales guy tank an entire company because he misspelled a formula. What I mean is a spreadsheet that literally runs on a database. It could even be SQLite, _anything_ that allows two-way communication between the spreadsheet and the application.
spreadsheet gui <--> database <--> application
The spreadsheet does simple data manipulation and acts as a GUI
The database can be mysql, postgres, sqlite (even JSON...)
The application does batch processing, generates documents, sends emails, etc. - everything Excel can't or shouldn't do (see the sketch below).
This stack could solve, in my opinion, >50% of all business software.
I have also considered some sort of general data storage, but it's too slow. Dropbox et al. are not made for anything close to real-time communication, which is critical for a spreadsheet that needs to talk to a program - something like 100ms at most. This would also presumably require a perfect .xlsx implementation to read and edit the data, which doesn't exist to my knowledge.
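To make the "application" box in the stack above concrete, here is a rough sketch of a batch step that reads and updates the shared database directly. It's in Emacs Lisp only to stay in this thread's language (any language with a SQLite or Postgres driver would do), it assumes Emacs 29+ with the built-in SQLite bindings, and the `orders' table and file path are invented for the example.

    ;; Illustrative only: the "application" side of
    ;;   spreadsheet gui <--> database <--> application
    ;; Assumes Emacs 29+ compiled with SQLite support; the table and
    ;; path are made up for the example.
    (when (and (fboundp 'sqlite-available-p) (sqlite-available-p))
      (let ((db (sqlite-open "/tmp/orders.db")))
        ;; Shared table that the spreadsheet front end would also edit.
        (sqlite-execute db "CREATE TABLE IF NOT EXISTS orders (
                              id INTEGER PRIMARY KEY,
                              customer TEXT,
                              total REAL,
                              invoiced INTEGER DEFAULT 0)")
        ;; Batch work the spreadsheet shouldn't do itself: pick up rows
        ;; that still need an invoice, "send" it, and mark them done.
        (dolist (row (sqlite-select
                      db "SELECT id, customer, total FROM orders WHERE invoiced = 0"))
          (let ((id (nth 0 row))
                (customer (nth 1 row))
                (total (nth 2 row)))
            (message "Invoicing %s for %s" customer total) ; stand-in for real work
            (sqlite-execute db "UPDATE orders SET invoiced = 1 WHERE id = ?"
                            (list id))))
        (sqlite-close db)))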
Google Sheets, and maybe Excel, can make HTTP POST and GET requests. It's trivial to write an API that can read/write to a database on behalf of the spreadsheet.
The trouble with that is it's a central server which means an IT department to fuss with it.
Better would be something like Excel but that works P2P like SyncThing, or even directly using SyncThing itself.
I don't think it would actually be too hard, as long as you didn't need truly realtime collaboration: use file sync as your backend, and give up the 2D sheet model in favor of a table/records model that lends itself to conflict-free operations instead of edits to specific numbered cells.
Depends on your definition of dirty. I'd take "air traffic control running on emacs" over "guy prints out a form he received digitally and other guy types it into their computer by hand" any day. The first seems dirty, the second one is downright idiotic.
Yeah, I used to work at a large UK university and I saw (helped manage) a vast Excel spreadsheet which calculated students' final degree classifications. It was so complex I never managed to unpick the algorithms, just assumed it was right.
For the rare cases where digital comms between planes and the ground ATC system are used, planes run the OSI Internet protocol stack to talk with ground stations.
The rest of this is essentially about ATC talking to other ATC components (for example, an airport's CTR zone passing a plane to another region after takeoff).
Original author here. C worked on VMS but it was not a good match. VMS used complex data structures and async for kernel comms ($QIO) and the C standard library was built with other expectations. Setting up and handling syscalls was cumbersome and the POSIX style library was slow.
I still remember picking up VAX Pascal for another gig and getting a decent taste of the delight that VMS systems-level coding was. Yes, Pascal was better suited for that than C on VMS. Fun times :)
Ah, thanks for the response! My experience in the same timeframe was using C and Fortran on VMS, though I suppose the work I was doing wasn't digging into the same level of detail as you did.
My understanding is that, even for the brand new x86-64 port, large parts of OpenVMS are still written in BLISS and MACRO, although I believe they prefer C for brand new code. A key component of the porting effort was modifying the existing BLISS and MACRO compilers to produce x86-64 binaries instead of Itanium ones. [0]
> This would have been 1991 or 1992 so DEC was probably still a BLISS shop. Maybe they didn't ship C manuals or something.
I suspect the actual reason is this: in 1993, DEC released a new C compiler for OpenVMS, DEC C, to replace their earlier VAX C compiler. [0] DEC C was designed to be a fully ANSI C compiler; VAX C was a pre-ANSI (K&R) compiler with various unusual proprietary extensions, along with some half-finished ANSI C compatibility tacked on the side. Probably they were used to writing ANSI C on other platforms, were given VAX C to use as a compiler, and were (quite understandably) frustrated by its inability to compile ANSI C code correctly.
Timing-wise, that could be correct. However, the biggest issue to me was the inability to nicely interact with VMS system and library calls. Even with the best that DEC could do in their header files, setting up a complex $QIO was a headache, certainly compared to more “native” languages.
I was a bit of a fan back then. Much more powerful and stable than pretty much anything I encountered in Unix-land. And once you tossed out the C compat libraries, very fast, too :)
That's pretty cool. Are there also other editors that could, at a stretch, pick up similar tasks? Emacs always seems so singular in these accounts (or in the it's-an-OS-sense) but I just wonder what some other editors in that class, or close to it, might be.
Probably not as easily. One way to look at Emacs is as a Lisp system in which someone has written an editor. It has scads of functions for file and network IO, a zillion useful general functions, and a decent, well-tested runtime. While Emacs Lisp is most often used to write functions that manipulate documents and files, there’s no constraint at all on what you can use it for. I mean people have implemented web browsers, window managers, MP3 players, and other wild stuff in it.
Now, I think you could theoretically implement all those things in Vimscript, in the sense that both are Turing complete. However, Emacs Lisp is a decent general-purpose programming language in its own right. I wouldn’t necessarily choose it over other modern languages for implementing non-text-editing related things, but if that were the only option conveniently available to me, I wouldn’t cry too much about it.
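To make the "file and network IO" point concrete, here is a tiny illustrative snippet (nothing to do with the actual ATC code): an ordinary line-oriented TCP client using Emacs's standard process machinery. The host, port, and function name are invented for the example.

    ;; Illustrative only -- a minimal TCP client in plain Emacs Lisp,
    ;; using the standard `make-network-process' facility.
    (defun my-send-line (host port line)
      "Open a TCP connection to HOST:PORT, send LINE, and log any reply."
      (let ((proc (make-network-process
                   :name "demo-client"
                   :host host
                   :service port
                   ;; The filter runs on whatever the server sends back.
                   :filter (lambda (_proc output)
                             (message "server said: %s" output)))))
        (process-send-string proc (concat line "\n"))
        proc))

    ;; Example use: (my-send-line "localhost" 7000 "HELLO")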
Pretty much every modern IDE with a plugin system could be repurposed into a message router, since they all have underlying code execution in their respective runtime environment (just as how in this case, Emacs was really just a Lisp runtime).
Certainly any of the major Java-based IDEs (Eclipse, IDEA, etc.) could host a message router, although this is quite boring as it's just Java. Amusingly though, Volkswagen's automotive dealership service tool, called ODIS, is built in Eclipse.
On the contrary, software designed as a fully integrated ecosystem, meant to be extended and refactored indefinitely in the same codebase, at any level, the user programming level included, regularly (even if rarely) shows that it can do countless things with simplicity and power. Oh, sure, such systems are hard to design and evolve at first, BUT they pay back so much.
These stories, put side by side, clearly show the complete failure of the divide-et-impera commercial model of software and the need for classic desktop computing as designed back then by Xerox and kept up by Symbolics and co. Some in Unix land saw that late and tried to correct the aim a bit with Plan 9, but they failed. Nowadays the modern web, with its substantial U-turn from widget-based UIs back to document-based ones, and modern UIs in general, are another small brick few seem to see. In a decade, perhaps, we will be back at the original desktop model, having forgotten we had it in the past...
Well, certainly not the best choice. But of course, Emacs has a full Lisp implementation (ELisp), so it's certainly doable. The problem is not lack of Turing-completeness or features, it is maintainability and performance. Apparently, the latter was not an issue, and the former has probably been solved by now by reimplementing the code in something more common.
Believe it or not, last year I operated a $100k piece of semiconductor manufacturing equipment (a magnetron sputtering tool) with a GUI written in VBA in Microsoft Access 2003.
Great story. For me, another interesting part is that lots of the tools/utils used in their dev work were copied from that dude's home - surely that is very reproducible & auditable.
We're talking early '90s here. Security was a thing, of course - it meant you set a five-character root password on your FTP server ;-). MD5 wasn't even around, so you had to trust that your source tarballs were not tampered with, whatever their origin.
So whether I brought them from home or the company (if it had internet at all...) pulled them from gnu.org probably wouldn't have made a material difference. It was one of the reasons there was a big antipathy towards free software: at least with the vendor tapes you had someone to sue if they got tampered with.
There is a slight risk with auditability, but were it me in the mid 90s I'd be honoured to hire someone who is excited enough to keep source copies of the GNU coreutils at home.
Someone who is eager and creative like that is unlikely to be a sociopathic jobsworth, i.e. the type most likely to steal secrets or undermine your business.
In 1995/96 I was the first tech employee at a startup. We had a Solaris server at the core of our network, and needed a C compiler.
Paying Sun whatever stupid amount of money they wanted didn’t seem to make sense, GNU still didn’t have their own domain, and for whatever reason I couldn’t find a gcc binary for Solaris to download (probably related to the terrible state of web search engines at the time). So I visited my university sysadmin and copied gcc from his Solaris network to use to compile our own fresh copy.
It’s sometimes hard to remember just how bad things used to be.
Mine was on a couple of dozen floppies from a BBS.
Downloading them over my 28.8kbps modem was so expensive and tedious and unreliable that I put my PC in the car, along with a stack of floppies and the 15" CRT monitor, mouse, and IBM Model M keyboard, and drove six hours to the opposite side of the country, to where my friend who ran the BBS lived.
We spent the weekend copying floppies and installing very early Slackware on a couple of machines.
Thanks for confirming that my option of asking a local admin with internet access was the right way.
Still... I had to load the tape on our office AIX machine (the only Unix box I had access to), then wire up a null modem cable to a PC, install Kermit on both ends, transfer all the floppy images, find an MS-DOS image writer, and finally copy the works onto that stack of floppies.
Yes, good times! I remember copying games onto multiple floppies; though it might be seen as an inconvenience tech-wise, it had its own fun: travelling to a friend's place, yapping while it was getting copied. Cutting the archive into pieces and joining it back together was a small personal learning exercise, and more. Thanks for the story!