IBM COBOL for Linux on x86 (ibm.com)
133 points by belter on April 7, 2021 | 105 comments



I'm just imagining the level of jaw dropping and brain exploding that would have gone on had that same headline been published in 1998.

Obviously at this point almost three decades into the internet era no one really cares. We all know that "somewhere" old COBOL code is running, but in practice the companies managing the bulk of data for society didn't even exist before Y2K. Everyone with unmaintainable COBOL has long since migrated to really bad Java code instead.

But... yeah. Linux is now an official heir to the S/360. We've made it. Finally.


> Everyone with unmaintainable COBOL has long since migrated to really bad Java code instead.

I can assure you that’s not correct, as I work in an enterprise that has plenty of unmaintainable COBOL still running, though they’ve supplemented it with lots of really bad .NET code more recently.


I too have seen lots of badly written .NET code out in the wild.

However, as with most languages, that is more a measure of the quality of the developer than the quality of the language.

As a C++ developer now writing C# on .NET, I find the new language and framework a pleasure to work with.


> However, as with most languages, that is more a measure of the quality of the developer than the quality of the language.

No, it’s more a measure of the quality of the development organization and process. The quality of individual developers is a factor here, but not the only one, and often not the limiting one on software quality (and particularly often not on, e.g., whether or not business requirements and system design are documented in a useful way to support maintenance and change analysis).

There are lots of fairly common institutional factors at the kinds of shops that tend to have lots of COBOL legacy code still running, factors which also tend to produce instant-legacy code regardless of platform. They also have institutional patterns in common with the shops that tend to be on platforms like Java and .NET.

This is not an indictment of Java or .NET as platforms, or of the individual developers in those environments.


Quite right, and to the extent that the blame goes to any individual developers, that comes back to the organization: do they pay well, offer a good working environment, etc.? One bad hire, sure, it happens; but if it keeps happening, that’s structural.


> Everyone with unmaintainable COBOL has long since migrated to really bad Java code instead.

Sadly this is entirely untrue. What's happening in practice is they're still running COBOL and they're paying some woefully underpaid and underappreciated developers to write PHP ("Zend Core") for System-i.

This allows companies to run web applications on these aging systems.


Plenty of jobs for remote COBOL devs in the 120-160k range, still. I wouldn’t call “live-where-you-want” at 150k/year woefully underpaid. That affords a great living in a lot of America. ‘Specially if you take two.


I wasn't talking about remote COBOL devs.

Pay for PHP devs, even on these systems, is generally lower, and a lot of the work is in Europe, where developer pay is half.


I know that Mono for i, at least, is the result of an IRC bet between a couple of hobbyists about whether it was technically possible. I wouldn't be surprised if the PHP port was similar.


Mono's port was done on their own initiative. Zend Server is, however, an IBM-supported product. The same goes for e.g. node.js, where IBM ported all of V8 to ppc64. (There is a community version of PHP, released in 2019, but the commercial version precedes it.)

This same work has proved useful in running chromium on ppc64le/Linux, even if I think running node.js on IBM i sounds ridiculous ;)


COBOL working on Red Hat OpenShift/Kubernetes:

https://developer.ibm.com/technologies/systems/videos/gettin...


> We all know that "somewhere" old COBOL code is running, but in practice the companies managing the bulk of data for society didn't even exist before Y2K. Everyone with unmaintainable COBOL has long since migrated to really bad Java code instead.

Not quite. I'd suggest that there are still many trillions of dollars worth of people's money managed in COBOL on mainframe core banking systems. Certainly the "big 4" banks in Australia all still have such systems in production (though they also have Java systems).

Disclosure: I have worked for a few banks.


> I'm just imagining the level of jaw dropping and brain exploding that would have gone on had that same headline been published in 1998.

I worked at IBM at the time and we had internal-use-only versions of a lot of major IBM AIX/Unix software running on Linux by 1998. Our experiments with running it directly on big iron started around then too. At least in my area most of the engineers were moving from X terminals to Linux desktops at that point so porting the software they were working on was inevitable.


> Everyone with unmaintainable COBOL has long since migrated to really bad Java code instead.

Unfortunately, not true. Most of the companies decided it was too much time and/or money to do that.


Sounds like another opportunity to extract millions of dollars from hard-headed customers that are 20 years behind.


1998 was a bit early, but I could easily have imagined it any time after 2000. I'm frankly surprised it has taken this long.


One of the VAX computers I rescued from disposal at the height of the impending Y2K apocalypse came with licenses (PAKs, for those in the know) for DEC COBOL, DEC RDBMS, and DEC FORMS. These three applications formed the backbone of payroll processing for the company that was tossing the VAX.

This machine processed the payroll for a bit over 5,000 employees on a "clustered" setup that had a combined total of 512MB of RAM and 4GB of disk. I can assure you that "QuickBooks Payroll" would not be able to handle this on a similarly resourced x86 machine. Not even close.

Unicode support is nice, and the fact that once you've compiled your application to binary it doesn't have to go over the Internet to load key parts is super nice.

And yes, there are people that still use COBOL, still think it is the best choice for the task that it is doing, and keep it going with a modest amount of maintenance.


> I can assure you that "QuickBooks Payroll" would not be able to handle this on a similarly resourced x86 machine. Not even close.

Why? I've been into x86 databases since the Windows NT days, and 5,000 doesn't seem like an impressive number to me.


The OP's comment was conditional: on a similarly resourced x86 machine.

Windows, let alone QuickBooks, is not going to run very well on a machine with just 512MB of RAM.


I know today's Windows is not going to run very well. But why? What are they doing to consume so many resources? Today's Windows and Office do exactly the same job they did 20 years ago (I even used 3rd-party bells & whistles to make Windows 2000 and 98 look at least as "cool" as Windows does today; I was a kid and enjoyed that).

Also, I can imagine using an Excel spreadsheet with several sheets, thousands of records on each one (Excel 97's max was 65,536 rows), plus formulae and macros, even 20 years ago. So, again, 5,000 doesn't seem impressive.

I can't remember how much SQL Server with a database would take in those days, though, so perhaps a 4GB HDD might indeed be insufficient.


Windows NT/2000/98 would run really well on 512MB of RAM.


Similarly priced machines would run circles around it?


Can't you get all this stuff with a hobbyist license and a full media kit?

Doesn't macro32 on alpha or itanium run old VAX binaries transparently? Why keep the antiques? Even SimH seems a better option (assuming the licenses work).

http://www.openvmshobbyist.com/news.php


You could, except that, sadly, they stopped the hobbyist program.


VMS Software, the newest maintainers of OpenVMS, have continued the hobbyist program. https://vmssoftware.com/community/community-license/


Except this bit: "Please note that in accordance with the license agreement between VMS Software Inc. and HPE, VMS Software Inc. are not able to distribute VAX licenses."

Since I've got actual VAXen, it doesn't help.


So back to the original question, what about macro32, the VAX emulator built into VMS on alpha and itanium?

Where you have the source, you can build a native binary. Where you lack it, perhaps the emulation is sufficiently robust.

https://en.m.wikipedia.org/wiki/VAX_MACRO


Running VAX binaries per se isn't a problem; you can run them on SIMH quite reasonably[1]. The challenge is running VAX binaries that use authorization keys (PAKs). Unless you have the key (or forge one), the binary won't run.

I have keys for the VAX assembler (and Fortran, BASIC, and Ada) so running source, and building applications that DECUS distributed back in the day, isn't a problem for me.

What annoyed me was that VMS/VAX support was discontinued by the hobbyist program. When it was announced and I asked for details, they suggested I contact licensing. Which I did, and they suggested I could buy new licenses for VAX machines for $5,000, good for up to 25 users. For a machine that was headed for the skip. So someone, somewhere (and I'm thinking US GOV here) "can't" upgrade their VAX hardware, and HP has them over a barrel, so they continue to hold those licenses as saleable products. But that is purely speculation on my part.

[1] In fact, the earlier models (anything before the VAX 4000 series) run faster on a modern PC under SIMH than they do on the original hardware.


There are two separate programs under VMS:

MACRO32 is a VAX assembly compiler - given a VAX assembly source listing, it can turn it into object code for Alpha, IA64 or x86. In some cases, the code will need special annotations.

There's also VEST on Alpha which does translation of user-mode VAX binaries into Alpha binaries. I think that was an optional product. Many products (including things unrelated to the operating system) don't fall into the user-mode category due to the use of the "Privileged Image" mechanism of VMS.

Nitpick: Neither are emulators in the strictest sense of the term.


What's so burdensome about payroll processing? It seems straightforward enough.


It's really not that simple, because payroll processing isn't just the part where an amount of money is sent to a bank account. It's tied up in a lot of complexities like accrued time off, time off used during the pay period, which pool of time it came from, potential for comp time or overtime, etc. That's not even getting into benefits like payroll deductions for health insurance, the choices of different insurance types, performance bonuses, annual cost of living increases... There are a ton of variables involved, this doesn't scratch the surface.



>"Unicode support is nice, and the fact that once you've compiled your application to binary it doesn't have to go over the Internet to load key parts is super nice."

Which is what I, or anybody else who wants it, can have now anyway. I do not recall my applications "loading their key parts" from various places. Everything is local and native, yet connected when needed.


COBOL will always have a soft spot in my heart as the first programming language I learned. I did not use it much - 2 years max - but when reading COBOL it was really clear that there were some programmers who just saw code differently - in an abstract, artistic sense. It was the ultimate litmus test.

COBOL is really easy to write spaghetti code with - perhaps even the default. But some of these programmers (not including myself back then admittedly) just wrote code that came together beautifully like a jigsaw puzzle. I’m sure if given a C++ or other language with modern features, they could do amazing things. This was before a time when Clean Code (capitalized on purpose) and such were in the zeitgeist. I didn’t appreciate the full beauty of these modular, generalized, programs until a decade or more later.

I also didn’t realize that Z systems natively ran Linux. Must have missed that news. I suspect they’ll open source this soon; it makes a lot of sense, if only for keeping consistency with their C/C++ work on Z systems. They are a big contributor to LLVM in that area.


I worked on the IBM COBOL compiler in a past life. It's not LLVM based; the optimizer and code generator are derived from J9, the IBM JVM. I would be a bit surprised if they open-sourced it; OpenJ9 is open-sourced, but the COBOL compiler for Z is one of those rare compilers that is still being sold to paying customers.


Did I work with you!? I also worked on the COBOL backend for TR. And yeah, I'd also be HIGHLY surprised if it ever got open-sourced.


We must have just missed each other. I started in January 2014.


Thanks for the insight. I strongly suspect the COBOL compiler buyers are really just buying ongoing support and updates. Open sourcing should result in minimal revenue impact, but shared code might make it quite problematic license-wise. Wouldn’t blame them for turning down that headache.


> I worked on the IBM COBOL compiler in a past life. It's not LLVM based; the optimizer and code generator are derived from J9, the IBM JVM.

Does it have anything in common with XL C? Does it use W code as an intermediate language?

And what language is it written in? (PL/X? PL.8? C/C++?)


Assuming nothing's changed since I left, the front-end is taken from the old COBOL compiler, and is written in PL/X. It emits WCode for the backend, which is written in C++.

XL C is a separate codebase, with very little shared code IIRC.


> some programmers who just saw code differently - in an abstract, artistic sense

Yes, some people like this are absolute geniuses... and they also produce code that's incredibly difficult to maintain, especially when it wasn't well documented or the documentation binders have been sitting in boxes at some off-site cold storage for 25 years.

For anything that isn't pushing the boundaries of computing, I'd much prefer a few experts to a single genius.

If you find yourself with a true genius, you need to realize that they are not replaceable and you cannot rely on them alone, doing their work as usual. To harness genius like that, you need to build a structured team around them, not try to fit them into an existing structure.


^^ I understand my comment on genius could be seen in a negative light, e.g., genius-level intellects are problematic and perhaps undesired. I mean quite the opposite:

1) I prefer geniuses to be working in areas where they are pushing boundaries further. I don't think such talent is used wisely in service of finding novel solutions to mundane problems merely faster than the average expert would. Better to throw three experts at the problem instead.

2) However, sometimes a large organization finds itself with a true genius in the ranks. (It might just be the sort of work they enjoy, or the best job in their region and they don't want to relocate, or any other factor.) The problem is that even a well-organized large organization is highly structured, set up to organize the vast majority of people, who are not geniuses.

Yet a true genius is inherently an agent of chaos. They are a genius precisely (in part) because they see things beyond the current structures & paradigms, and realizing their potential requires them to either break or circumvent the status quo. If you put such a person in an agile scrum and assign them stories, they will either:

1) Quit out of boredom

2) Fuck off on their responsibilities out of boredom.

3) Complete their assigned tasks, finding novel and interesting ways to solve (relative to their capacity) stupidly simple problems, which will make future code maintenance a nightmare when a mere expert or newbie has to decipher it.

4) If #3, they may have their productivity recognized and be promoted into a PM/Manager type of role they will hate, at which point see #1 & #2.

Under #2 & #3, if you're fortunate, they will spend their copious free time finding more interesting things to do that benefit the organization. If not, they'll be off pursuing their own curiosity.

So, any organization should have some mechanism for recognizing these individuals. (This is not something I have an easy solution for, because, given #1 & #2 above, they can sometimes be indistinguishable from the lazy or incompetent.) However, once recognized, you need to set them to very challenging tasks suited to the direction of their genius:

Build a team around them. At least one of those team members needs to be an expert. This is the person who will act as a buffer between the genius and everyone else. The expert will also coordinate with the rest of the team to work out the practicalities and logistics involved in making use of the output of the genius. They will systematize the chaotic creative forces at work. They will document, they will disseminate to the rest of the organization, and they will ensure that the incredible value of the genius's work isn't lost if/when the genius moves on, retires, or whatever.

How do I know this? I've seen one or two geniuses myself. I have even experienced this dynamic first hand. Let me be clear, though: I am absolutely not a genius on that level. I am very much a generalist with a few areas that extend close to expertise. I have, however, found myself working in organizations that are so far behind what is possible that even very basic things have had people label me a "genius", which is very embarrassing because I'm not. And what I've done, to use an artistic metaphor, is kindergarten finger painting. Yet, presented to a crowd that's never seen art, I am praised as a Picasso. Seriously: pulling down 5 years of data, pre-processing it in Python (my preference), and running some basic regressions in R to show that a current, very time-consuming process was useless was thought to be revolutionary.

As a result, I've been through #1, #2, and #3, and I'm currently resisting #4. When in #3, which is most of the time, I try to document as much as possible and make sure anyone who would be responsible for my work if I left is fully in the loop on what I'm doing. And I get to spend about 10% of my time on things I find truly interesting that push my own limits. But I like working for organizations that are behind the times, if they are flexible enough to accept change. I like it precisely because, while I am not a genius, I am pretty good, and working with such organizations still allows me to make an outsized impact on them for the good.


I worked full time maintaining a huge COBOL system in the 90s. Seeing how a massive code base could be elegantly structured around COBOL gave me a new respect for the language, and the original developers. I always felt a warm sense of pride every time I managed to improve upon their work.

It's not so much that languages are good or bad, it's developers that are. COBOL can be made elegant, Swift can be made ugly. Even driftwood can be made beautiful.


35% of the total installed capacity worldwide today is Linux on Z, and it is growing at double-digit rates each year. RHEL, SLES, Ubuntu, and most recently CoreOS (with OpenShift) all run beautifully. Ultimately, it's just hardware, and it's what people do on it that matters.


I remember years ago there was a contest for the most error messages from the fewest lines of code from a compiler.

I believe you could get the IBM COBOL compiler to generate 600 lines of error messages if you put a single period in column 6.

Of course, times have changed, and I don't know how to get a punched card into a Linux machine. :)


> I remember years ago there was a contest for the most error messages from the fewest lines of code from a compiler.

When I started being exposed to and interested in computers around 1982, I inherited a stack of Creative Computing magazines dating back to maybe 1978-ish.

I remember reading about these error message contests in Creative Computing.

Today, I understand why that was. In the olden days, it was important to get as much information as possible from a single compiler run, because interactive computing wasn't universal. Programmers had to share a machine and submit programs in the form of decks of punched cards at a job submission window. You would not want to fix one error per round trip.

This means compilers had to be smart about error recovery: to try to repair the program after an error, and then keep going to get more information out of it. Whenever error recovery inserts a token into the program (like a suspected missing closing parenthesis), it is making the program longer, and risks confusing itself and creating a runaway loop that has to be curbed somehow.

I suspect the best solutions to these contests must have been exploiting features of error recovery; tricking the compiler into creating a cascade of errors out of something small.
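
To make that concrete, here is a hypothetical fixed-format COBOL fragment (a sketch of the general trick, not one of the actual contest entries) built around a single stray period:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CASCADE.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-FLAG PIC X VALUE "Y".
       PROCEDURE DIVISION.
           IF WS-FLAG = "Y"
               DISPLAY "YES".
      * THE STRAY PERIOD ABOVE ENDS THE IF SENTENCE EARLY, SO THE
      * COMPILER SEES A DANGLING ELSE AND AN UNMATCHED END-IF BELOW.
           ELSE
               DISPLAY "NO"
           END-IF.
           STOP RUN.

One character, several diagnostics; how far that snowballs depends entirely on how aggressive the compiler's error recovery is.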


> it was important to get as much information from a single compiler run

That's a good point. I remember debugging a core dump... in hex... and going and circling memory addresses with a red pen and chaining them together until I found the problem.


Five spaces and a period.


with a proper drum card, you could just press skip. ;)


You could get a $600 AWS Lambda bill for forgetting a ; today, too. Not much has changed.


> Memory requirements are as follows:

> Minimum 250 MB for product packages

> Minimum 2 GB of hard drive space for paging

> Minimum 512 MB for temporary files

> Minimum 2 GB RAM, with 4 GB more optimal

Ouch; this ain't your grandpa's lean and mean COBOL.


IBM COBOL for Linux on x86 1.1 - Data Sheet: https://www.ibm.com/support/pages/system/files/inline-files/...

- "Communication between COBOL and C/C++"

- "Compatibility with Enterprise COBOL for z/OS and COBOL for AIX"

Integrated with IBM CICS TX on Cloud 11.1

- "Certified Red Hat operators that can simplify deployment and management of CICS TX applications on Kubernetes"":

https://www-01.ibm.com/common/ssi/cgi-bin/ssialias?infotype=...


Probably includes a hidden z/OS emulator.


I expected it to be Open Source, but it is not.

I had a few jobs where I programmed in COBOL when I was in college. It's probably just as well that it's not Open Source, because I was tempted to install it just for fun.


GnuCOBOL is open source though:

https://gnucobol.sourceforge.io/


Finally! This is what I've been waiting for.

Remember that Skynet was written in COBOL:

https://twitter.com/ThrillScience/status/1249742678532620293


> supports one of the following operating systems

Red Hat and Ubuntu.

Is that a common thing for software to be marked as "available for Linux" but only supported for 2 flavours?

I guess it's fair; a Debian and RHEL base would cover "most" use cases.


In the enterprise software world: yes. Remember, this is not open source. They probably only test on those two platforms, and compile against the libraries for them. Could you make it work on other distributions? Sure. Certainly in a docker container or chroot.


You can make it a static binary and thus make it portable on all distributions that have a minimum specified kernel version. Or you can use formats like AppImage that are designed to encapsulate a program into an image that can be run as a normal executable, without depending on the distribution but again only on the kernel.


It may be possible, but enterprise type companies like supporting a limited number of differences. It limits the risk of having to diagnose issues that are due to some sort of weird compile option for the kernel or someone using musl instead of glibc.


> Is that a common thing for software to be marked as "available for Linux" but only supported for 2 flavours?

Yes, because it keeps the support matrix smaller. As much as people like to think “Linux is Linux is Linux”, there’s all sorts of subtle and not-so-subtle system and library-level differences between distributions that can cause software to not work properly. IBM can’t and won’t test against every single distribution nor will they support their software running on whatever obscure distribution someone decides to use.


Yes, there is an expectation that if there's an OS issue there's an enterprise support organization the application developer can work with.

I'm actually surprised to see Ubuntu. It's typically just RHEL and SLES with Oracle Linux occasionally thrown in.


> Is that a common thing for software to be marked as "available for Linux" but only supported for 2 flavours?

Very common. For example Steam for Linux only really supports Ubuntu.

In my experience, extracting stuff from debs and rpms to run on, say, Arch Linux was very easy, using debtap, rpmextract, etc.


Too often "supports linux" means "supports Ubuntu, might work on other debian based distros". But I can't blame people for not wanting to support every distro under the moon


The key word here is "support". Will it run? Probably. Is some IBM support engineer going to help you if you tried it on some other distro? No.


Generally yes and that is an enormous step.


If IBM had a clue the first flavor on their list would be Amazon Linux.

(yes, I know it's a RHEL derivative, but I also know some large fraction of 'developers' are oblivious to these matters, which is why no one hesitated to list 'CentOS' when that still made sense.)


As a former (retired) old-school CICS-COBOL programmer (in the credit card industry), it might be fun to set up an instance and see if I can get Base I/II up and running for old times' sake.


'Ordering information.' Yeah, IBM is still IBM.


This is pretty clearly targeted at rehosting CICS applications on Linux. No one will move batch processing without JCL support; does anyone know if they offer that for Linux? Also, this appears to be commercially licensed only, so no way to download and play with it that I saw?


I had to chuckle at this, one of the bulleted features:

"Offers an extended source format that lets source text vary in length up to 252 bytes per line. COBOL for Linux on x86 supports fixed source format and extended source format. Fixed source format consists of text that varies in length up to a 72 bytes per line."

Yeah I'm never going to be in the same room as COBOL code


Hehe, as usual they don’t even talk about how much it costs. Will it be against the TOS to benchmark it?


Seeing this and the other thread about https://gnucobol.sourceforge.io, I wonder, and guess not, whether there is any interoperability between the two.


I wonder how much better than GnuCOBOL it is.

https://gnucobol.sourceforge.io/


Probably a lot. GnuCOBOL has some annoying limitations, such as no easy-to-use network API; nor does it make it easy to read or write files whose names are not declared at compile time.


They should've announced this last Thursday :)


"Offers an extended source format that lets source text vary in length up to 252 bytes per line" - punch cards are big nowadays.


I recall hard-to-debug issues due to the compiler silently dropping everything past col 80...


Predicting apparently highly effective file compression benchmark results due to files with larger LRECLs whose lines are padded out to 80 or 252 with space characters. “iZip-in-Cobol on Linux now 3x better - order now!”



There is also COBOL on Wheelchair:

https://github.com/azac/cobol-on-wheelchair

IIRC both of them were featured on the HN front page once.


I remember the first IDE I ever used back in 1988 on MS-DOS was a COBOL IDE from 1983. It felt ancient back then too!


> It felt ancient back then too!

Sir, the politically correct term today is... vintage.


Retro?


They say that UTF-16 is a fixed-width encoding, but I don't think this is the case: code points outside the BMP take two 16-bit code units (a surrogate pair).


Releases? It's not released yet.

>Planned availability date: April 16, 2021


That's old news. COBOL on Kubernetes is the current bleeding edge: https://github.com/IBM/kubernetes-cobol


Yay, just what we always wanted!


Costco still uses IBM AS/400.


I know several places still using AS/400s; I wish they would stop calling me about software I wrote over 30 years ago.


My dad worked on AS/400. When I first saw it, it looked like he was working in the matrix with that black/green color scheme.


I was a consultant at Allstate when they ordered on the order of 13,000 AS/400 machines, one for each field office.


What’s this for? To get new people on board with the language to maintain states’ old unemployment systems?


IBM can piss off... they bought RH, and now RH can piss off.


COBOL is the new Lisp.


The lack of comments vs position on front page is intriguing.


The post just went up, people are still writing their insightful essays on their experience with COBOL, what this means for the future of the language, and speculating as for the reason IBM is porting their compiler to Linux. Can't wait!


There's nobody here (besides maybe me) old enough to have anything to say.


You’re not alone - I worked with COBOL a decade or five ago.


I used it back in the late 80s to handle bank account processing on a Burroughs mainframe. I got moved from a night shift sorter operator to a day shift maintenance programmer because I was literally the only available programmer in town. Had next to zero experience at the time, too.


Well, I worked with a COBOL developer team at a Canadian bank a few years ago. It is absolutely alive and well in this industry.


According to my father, it is alive and well in Canadian insurance as well. But he said where he worked they don't change the COBOL code much anymore; they add features by adding layers in other languages around the COBOL core.


In the Culture books by the late, much-lamented Iain M. Banks, one of the oldest drones began life as a house minicomputer that, over time, accreted many programs and uncountable layers of software, and achieved sentience by the time the Culture came about, some 9,000 years later.


I worked with COBOL just in time to fix Y2K problems, and I still have nightmares about it -- not about Y2K, mind you, but about COBOL and banking systems. Such a horrible, horrible programming language.


I worked with it about 10 years ago.



