IBM will offer free COBOL training (inputmag.com)
322 points by jacobedawson on April 11, 2020 | 227 comments



It's a trap.

The problem with those old codebases that governments, hospitals, and big businesses are struggling with is not really the language; it's the engineering practices of that era, shaped by the constraints of the old technology. The language is not the problem - lack of comments, bad variable naming, bad structure (little or no procedures or readability), and just sheer volume of it, is.

It would be very interesting to see the old systems rewritten in a modern language, with modern engineering practices, but keeping the old UI and UX (which often is incredibly ergonomic) - so as to limit scope and not mess it all up by trying to introduce mouse navigation and windowing mess.


Not to mention GOTO. If you're one of the people who hyperventilates when you see a goto because you learned that it was considered harmful in programmer school then Cobol might not be for you. ;)

You might be surprised about the comments though - depending on the age of the codebase. Mainframes were rented back in the day, you paid by resources consumed, terminal time was precious, and mainframes were often turned off outside business hours.

Because of this a lot of the development actually happened between terminal sessions in flowcharts, pseudo code, documentation, and peer review before the programs were ever modified and run.

If you ever run across really old comp-sci books you’ll typically see them divided into three sections - the first section was usually a guide to the author’s terminology and symbology, the second part was usually a guide to flow charting and documentation (IBM had standardized forms for developers to use), and the remainder of the book was the content, with lots of explanations of how to work with datasets hugely larger than the memory available to you.

But as time passed and computer time became cheaper many of those formal development practices started to get lax.


> Mainframes were rented back in the day, you paid by resources consumed

To expand on that, IBM used to rent mainframes based on a 40-hour week. The computer had a usage meter (like a car odometer) that would keep track of how much time the computer was running. If the meter ran over, you would be billed for the excess charge.

The computer actually had two usage meters, with a key to select between them. When an IBM service engineer maintained the system, they used their key to switch from the customer meter to the maintenance meter. Thus, customers weren't charged for the computer time during maintenance.

One interesting thing I noticed about the IBM 1401 is that it's built from small circuit boards (SMS cards) that are easily pulled out of the backplane for replacement. Except the cards driving the usage meter. Those cards are riveted into place so they can't be removed. Apparently some customers discovered that they could save money by pulling out the right cards and disabling the meter.


I found a picture of the meters for those that are curious - http://static.righto.com/images/ibm-360/epo-30.jpg


The terminals were turned off. The mainframe kept running. In the computer room, the second shift operators ran batch jobs, printed reports, and did backups.

Didn't matter if the terminal was turned off or not either. The UI was burned into the phosphors.


>The UI was burned into the phosphors

Oh wow, so true; that brought back many memories. It's also one of those overlooked aspects when you change a system: the fields burned into the old screens would, in the right lighting, create a whole avenue of data-input errors that really came down to giving the user a new monitor - which can be fun if you're treating it as a software issue in some new rollout. Yeah, that can be a fun one, and sometimes you can't beat site visits, as the local environment will never be replicated in any user transition training setup, however well it is done.


I used to have an IBM flowcharting template I inherited. I mostly used it to draw shapes and didn't appreciate what it meant to have a standard you could make tools like that for.

https://americanhistory.si.edu/collections/search/object/nma...


Awesome! I used to have one of those templates, unfortunately it got lost in a move. Surprisingly you can buy similar templates on Amazon but I’ve not seen one used in decades.

https://www.amazon.com/Rapidesign-Computer-Flowchart-Templat...


The New York State Civil Service exams for IT positions (meaning anything involving software development as well as other stuff) still have a flow chart section.

I actually quit the interview process with an insurance company a year or two ago after they wanted me to take a test involving reading flow charts, but now I'm in the position of having to pass something similar if I want to get promoted.

However, I don't think people use flow charts on the job anymore, even in state government.


I've found that flowcharts are enormously helpful for software design and communicating the design decided upon. Maybe in your role you don't have a need to communicate how software should be written?


I like to make hierarchical text outlines.

I feel like once you need a general graph sort of structure, you're too far down the road to spaghetti and/or excessive detail.


I make flowcharts all the time. I find them particularly useful for showing stakeholders how different pieces of an integration project work together. Usually the moving parts are pretty coarsely grained, like an ETL job or something that writes out a file and something else that picks it up. IMHO, your 70s-era flow chart diagram is pretty good at that kind of stuff.


>However, I don't think people use flow charts on the job anymore

Only for our service desk. Much easier to read than walls of text.


>If you're one of the people who hyperventilates when you see a goto because you learned that it was considered harmful in programmer school then Cobol might not be for you. ;)

I don't know, at this point seeing a GOTO in my « native language » (the C language) is so incredibly weird that my first thought is « this guy is trying to do something weird and interesting ». It just wouldn't cross my mind somebody would be using a GOTO as a result of ignorance or laziness.


Really? Not trying to be derisive, just surprised. One of the common paradigms in C is GOTO for cleanup and error handling. I know it can be avoided if you really try; however, when used correctly, it can greatly improve the readability of code.


C gotos are scoped to the function, making them fairly reasonable to use. COBOL and FORTRAN have global gotos.
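To make the contrast concrete, here's a rough, made-up sketch (fixed-format COBOL, not from any real codebase). A COBOL GO TO can target any paragraph in the whole PROCEDURE DIVISION, including jumping clean out of a paragraph that's in the middle of being PERFORMed - legal, but far harder to reason about than a function-scoped C goto:

           IDENTIFICATION DIVISION.
           PROGRAM-ID. GOTO-DEMO.
           DATA DIVISION.
           WORKING-STORAGE SECTION.
           01  WS-AMOUNT   PIC S9(5) VALUE -3.
           01  WS-TOTAL    PIC S9(7) VALUE ZERO.
           PROCEDURE DIVISION.
           MAIN-PARA.
               PERFORM VALIDATE-INPUT.
               PERFORM UPDATE-MASTER.
               GO TO WRAP-UP.
           VALIDATE-INPUT.
               IF WS-AMOUNT < ZERO
          *        jumps out of the paragraph being PERFORMed,
          *        abandoning the PERFORM's return point
                   GO TO WRAP-UP
               END-IF.
           UPDATE-MASTER.
               ADD WS-AMOUNT TO WS-TOTAL.
           WRAP-UP.
               DISPLAY "TOTAL: " WS-TOTAL
               STOP RUN.

Here the bad-input path skips UPDATE-MASTER and the rest of MAIN-PARA entirely, and nothing about VALIDATE-INPUT's signature warns you it might do that.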


Real programmers use INTERCAL's COME FROM


Oh lord, it's been a while since the last time I've heard someone mention INTERCAL.


That language with no pronounceable acronym remains an important teaching tool for programming language learning and design.


Yes, indeed. I started as a bank programmer, working on an IBM mainframe.

At night, nearly the whole machine was consumed by batch processing. If there was some problem that required a late-night fix, it was worked out first on green-bar print of the program, using hex to determine what was in the registers. Then the code could be submitted via ICCF, but the wait for a compile could take literally hours. If you mis-typed something (say a forgotten period), the compile would fail and you would have to resubmit the job. Waiting hours again!


> But as time passed and computer time became cheaper many of those formal development practices started to get lax.

I agree.

~15-20 years ago we used to have teams that specialized in creating flowcharts for anything that had to be implemented.

Nowadays I do a bit of everything (project mgmt, development, support, analysis, etc.), and in my area I'm the only one drawing logical overviews (very primitive stuff - I use "MS Visio" and I like it a lot) whenever we have to implement something that has the potential to become a bit challenging. So far I've always been glad to have done it: all conflicts/complications/flaws of the proposed logic get caught at that stage, so there are no problems later during the core dev phase, and we have fewer problems with the resulting implementations as well.


Yip, I trained in the early '80s in JSP (Jackson Structured Programming) and didn't even learn that GOTO was a verb, as you could code around it with JSP.

Then you hit reality and all the baggage legacy code has. As a standard, JSP never gained much traction, and where it did - well, maintenance of that code... there's a lot of legacy spaghetti out there.

Why is COBOL still in use? It's a robust data processing language, and that is the bulk of the work - batches of data that need processing, mailing lists for the post, bills.

COBOL handles data well when you need to know when and how it rounds and truncates, and it's good at formatting output. It came along at a time when nothing else fitted the job, and it all runs on robust hardware designed not to fail, back when consumer-grade kit was still a glint in many eyes.

So the legacy grew and bloated. I've worked on a fair few migration projects for a software house, and migrating your large blob of legacy code on legacy hardware to run on something modern is not a quick process and not cheap - the planning, due diligence, data integrity work, and testing involved are alone a huge cost.

So you end up with legacy code hanging in there, as no management team can justify a 6-10 year budget in a mindset that works to a 5-year plan and budget.

Which ends up with many systems being literally too big to fail and too costly to ever be fully migrated, as the risks and costs just grow. Having the guile and drive to push against the status quo in management is often a path to career suicide, so they carry on with the herd mentality that prevails among management. Those that do stick their neck out come in two types: those who care and know what's needed, and those who just want to be seen to be doing something big. The latter rush into it, the rushed decisions unfold, and before you know it they have flown off to another company telling everyone how they initiated a project - one that dies a horrible death not long after they leave, once people realise what a mess it is and what the true costs are.

Hence the many reasons COBOL is still around today: it just works, and in some ways you can't knock legacy. Old Nokia phones run for days and just do the job of being a phone. For that task they do the job much better than anything modern; modern Android phones and iPhones do much more, bells and whistles of all flavours, and yet if you just want or need to make a call, they are not as robust as an old Nokia that just works, and works well, for the task at hand.

That, and the "if it works, don't change it" mentality, does have merit, and is something you learn over time.

But there is always hope: bits can be pulled away from the legacy, and if it's planned and managed right by people who know what's needed, understand the business needs and requirements, and are mindful of minimising risk and interruption, there will always be a way.

Though I've seen many a project with the best will in the world be doomed from the start because the bites of the cake made it an all-or-nothing approach, and nobody in the business-customer software world wants to wait or budget/plan for something that takes longer than 5 years. There are always exceptions, but many projects are planned for 5 years when they're known to take longer, on the basis that 3 years in you push out a new 5-year plan, bolt on a few trinket features, and find justification to hide the fact that it was never going to go smoothly and land on target.

The best approach is bit by bit; batch processing and the like can be migrated more easily, though the data, as always, and everything that interacts with it, will be as big a part of any migration as the code.

But yeah, GOTO: when you're working on code that needs performance, on a platform that is more costly to upgrade than most, you will see a lot of GOTO in the code. Then you get wonderful things like variable-length records that many won't even know about - few realise that in COBOL you can define, say, a top-level record as 80 characters and then redefine it with a PIC X OCCURS DEPENDING ON some variable. Write that field and you get variable-length records stored, saving data storage and other expensive resources that we take for granted today. So yes, many gotchas and much creativity to eke out performance and reduce storage costs.
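For anyone who hasn't seen the idiom, it looks roughly like this (a trimmed sketch - names and lengths are invented, the SELECT/OPEN plumbing is left out, and it uses the FD RECORD VARYING form rather than the REDEFINES trick, though OCCURS DEPENDING ON is the heart of it either way). Only WS-REC-LEN bytes of each record hit the file, which is exactly the storage saving being described:

           FILE SECTION.
           FD  OUT-FILE
               RECORD IS VARYING IN SIZE FROM 1 TO 80 CHARACTERS
                   DEPENDING ON WS-REC-LEN.
           01  OUT-REC.
               05  OUT-BYTE    PIC X OCCURS 1 TO 80 TIMES
                                     DEPENDING ON WS-REC-LEN.

           WORKING-STORAGE SECTION.
           01  WS-REC-LEN      PIC 9(4) BINARY.

           PROCEDURE DIVISION.
          *    only the first 20 bytes of this record go to the file
               MOVE 20 TO WS-REC-LEN
               MOVE "SHORT PAYLOAD" TO OUT-REC (1:WS-REC-LEN)
               WRITE OUT-REC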

With that, GOTO is not your biggest problem in legacy code of the COBOL flavour - never mind the linked-in machine code specially crafted to sort the data because it was faster, where now nobody knows what that blob actually does or how to change it. So yeah, lots of traps in legacy code of any flavour.


> Mainframes were rented back in the day, you paid by resources consumed, terminal time was precious, and mainframes were often turned off outside business hours.

Sounds very cloud


Cloud is a return of the old "computer centre" model, similarly billed by usage and with various levels of vendor services provided.


Which is why everyone should be laughing at IBM Cloud right now for not succeeding that well at IBM's original business model. ;)


Azure Stack Hub and AWS Outposts are both fairly mainframe-like: you rent racks for your physical premises that are essentially opaque to you, managed by the cloud provider, and billed according to usage.


I doubt that mainframes were turned on and off every day; even later, with superminis, we left them powered up.


Every mainframe I worked on until the mid-80s was turned on by the first person to arrive in the morning and turned off by the last to leave in the evening.


Many sites had operations plans that involved weekly or biweekly "IPLs" done on weekends.


Doesn't mean the power was off


> keeping the old UI and UX (which often is incredibly ergonomic)

I have the feeling that with the success of the iPhone many people forgot that a thing like a UI can have a target audience as well. If you make a tool that is being used twice a week for a minute at a time, it has to look fundamentally different from a tool that is used 50 times every day.

With the former, being intuitive is more valuable, while with the latter reducing friction is more valuable. This is a choice which has to be made – and sadly I often don't see it being made. People just make a UI that is akin to the ones Google or Apple make and call it a day.


>People just make a UI that is akin to the ones Google or Apple make and call it a day.

It's worse than that. Lots of people involved in the creation of software don't just follow the trends; they have internalized the idea that a UI exposing any complexity is inherently bad. That if something can't be easily expressed in the interaction language currently fashionable on mobile, then it must be a misfeature.

A distant but perhaps illustratively analogous example can be seen in non-nerdy teens and young adults. Take one that does class writing assignments in a google doc on their phone (they're not hard to find, you can even find some that try to do CAD on mobile devices). Try suggesting that if they learned to properly touch type on a real keyboard they'd find the whole process easier and faster. Then tell them apple's bluetooth keyboard can pair to iPhones. Compare the reactions.

tl;dr: In the TV show Metalocalypse the characters derisively called acoustic guitars "grandpa's guitars." That's the UX world in a nutshell.


Just finished a contract with a hospital that built a lot of stuff in-house in the '80s in C (still pre-C89 in many places) and Delphi/Pascal. The problem is indeed volume (over 10 MLoC), two or three wizards supporting it all, but firmly coding like it's '83. No training of newcomers whatsoever, and thus really no way to contribute. If you manage to get some support, it will be once a year and in the form of code they wrote for you, without much feedback possible.

Management prefers not to think about long term, because management obviously does not think long term.


Is it possibly not a management problem at the core? I mean, I know exactly what you mean about suits liking to ignore the future, but I've worked with a few 'wizards' who, once they've got the job under their belt, use it to keep other people out.

They'll not comment, help, document or whatever, and once they're doing that, they are uncontrollable. They can't be sacked because they're the only ones keeping the system running, and they won't help others train up. That seems a very difficult situation for the suits to deal with, even if they want to.


I worked with a guy like this. He bragged he was the only one that knew the entire legacy codebase and he only shared parts with colleagues on his pretty sizable team to ensure he had a job and fat salary for life.

Then the multinational decided on a change of direction and fired the entire office, relocating it for regional diversification (this was not offshoring, just spreading the work out across multiple teams). This was pretty niche stuff, and he was unemployable until he moved across the country.

And I know of a company that something similar was outsourced to. Fully outsourced, with seemingly no internal expertise, bleeding their client with support prices that increased year by year, imagining they could do this forever. I was on the team at that client that spent two years re-implementing the functionality from scratch - no cheap proposition, but within 3 years of in-sourcing the savings were already there.

Not all wizards know magic.


That smells familiar. I worked with one of those wizards, and he designed everything to keep himself in the job. That was until he rode a motorbike into the side of a car at 100mph and someone else (me) inherited it. I had to start again because it was that impenetrable. As it turned out, it didn't do a lot, and I rewrote the bulk of it in ASP/SQL Server at the time in a couple of weeks, reducing the cost of the entire platform to an hour or two a week rather than an entire "tier B" (whatever that was, but I was told it was a lot, and I was tier F myself) salary. When I quit it took a few hours to hand over to the next guy.


Indeed, the word wizard was perhaps ill-chosen. These guys know this particular codebase, in all its hairy glory, but not much else.


I don't think ill-chosen. In their eyes, they are wizards.


> Is it possibly not a management problem at the core? I mean, I know exactly what you mean about suits liking to ignore the future, but I've worked with a few 'wizards' who, once they've got the job under their belt, use it to keep other people out.

> They'll not comment, help, document or whatever, and once they're doing that, they are uncontrollable. They can't be sacked because they're the only ones keeping the system running, and they won't help others train up. That seems a very difficult situation for the suits to deal with, even if they want to.

This sounds _exactly_ like a failing of management to me. If developers are expected or allowed to "just code" without documenting anything or training anyone, management is absolutely to blame for allowing it to go on.


well if you have 3 guys that are the wizards that can't be sacked, hire 3 contractors for a year long project to document things sufficiently that these guys become more sackable. Contractors of course have to be highly experienced as well, and well-remunerated, so maybe management doesn't want to go down that expensive avenue, which means it's a management problem.

Not to mention that the years in which they did not comment, help, document or whatever, and became uncontrollable, sound like a management problem over those years.


Your first suggestion might work, but having been there I know obnoxious programmers can be very obstructive, and put up some major barriers to newbies. That said, I think yours is a good solution if it can be afforded.

Per your 2nd point about long-term management problems, no doubt of it at all, but sometimes the new (and sometimes good) management simply inherits what previous mismanagement left behind.

Also perhaps you underestimate the power that programmers in well-bedded-in positions have. They can outright ignore management orders - experience speaking.

But a good post nonetheless, thanks.


The other side of the coin is that new hires can be very obstructive. 80% of new hires here are relentless politicians who snitch to management that they are "underappreciated" while making one idiotic suggestion after the next to show their relevance.

In reality they cannot do anything, so they scheme to get rid of people who can. During the battle no useful work is being done.


not that that doesn't happen, but in the context of the thread here, I imagine it wouldn't happen (or be more rare). You're needing to hire senior COBOL experts, and the 'senior' part is where the sort of behavior you're describing doesn't happen as much. Usually it's a mixture of confidence in your own ability, a dislike of politics, and an assurance that you can get work someplace else if/when you decide to leave.

The people that 'snitch' and feel 'under appreciated', and so on... my experience is they have trouble keeping work, are afraid of being found out, and do what they can to get rid of others who can recognize their lack of skills. I just don't think you'd find as many of those getting hired in the context of the needs of this thread.


I'd like to back up what you say because I wasn't clear enough in my first post - I've known a few people like that, a very few. Most are good and do their best. Despite the impression, idiots as I described are very much a minority.


If you're hiring people like this, I don't think old code bases are your biggest problem.


If you hire people like this you need to involve HR in better selecting who the org hires.


> if you have 3 guys that are the wizards that can't be sacked

Don't forget to also sack the management that looked the other way for years while this situation got where it is.

The zero asshole rule is non-negotiable.


Maybe if job security were a real thing, those guys wouldn't create obstructions to guarantee it. If employees don't get loyalty from management, why should the street go one way?

The problem is that management doesn't care about the human cost of their decisions and it causes technical problems.


In the real world this approach will most likely lead to getting 6 indestructible wizards instead of 3. Magical staff that needs to be supported by wizards should be shipped to Hogwarts and replaced


Management that allows one or a few persons to become that indispensable is a management problem.


I propose that most of the code written in the old days, in COBOL in particular, was not written by computer scientists. There were many lanes for learning COBOL, and they included community colleges and industry training courses. The amount of "computer science / software engineering" concepts that were widely known (or even possible) in those days was very limited. 'Re-engineering' the code is a valid operation, but the amount of code is extreme, and without a deep testing regime it's a dangerous journey. So we keep on keeping on, the same way.


I believe COBOL was invented prior to the first computer science department in the US, never mind CS degrees becoming a common thing.


This is true. Back in the day, COBOL was marketed to business people as an easy-to-learn language for just getting stuff done. Serious programming was done in FORTRAN.


Actually, FORTRAN was marketed to scientists, not for "serious programming" which was still done in assembler.


Which is why, still today, someone's Jupyter notebook of data-science Python is sometimes calling into FORTRAN libraries under the hood.


still true today. (not really, but there is a lot of f77 out there...)


I don't know anyone who uses f77 anymore, but f90 and up is still popular in scientific research.


Now imagine 30 years from now when you are going to have to track down the documentation for version X of the web framework for version Y of some language and version Z of frontend framework.


> Now imagine 30 years from

For the javascript ecosystem this is already true for projects that are > 2 years old.

Not kidding, try to build a 2 year old React web application, you'll see what I mean.


TBF, part of the problem is that dependency locking only caught on recently (Yarn came out in 2016), so older projects often predate lockfiles, which makes that kind of breakage more likely.


So... don't do that.

There are plenty of options that have a reasonable probability of being stable.

(Also, consider committing the docs right in to the repo. In the 1970s such an idea would have been absurd. Today, a lot of my projects technically already have this, thanks to vendoring and docs embedded into the programs themselves.)


The best documentation is getting the business logic mapped out, plus the code itself, along with the data mapped out so you know what does what to it, when, and how. Any code documentation will be out of date in one way or another; even at the best sites it will be a case of getting that documentation and then mapping a few decades' worth of change management, bug tracking, and the other avenues through which that code was modified.

I will say though, on every migration project I worked on, the documentation was carefully worded in the contract as being the customer's liability. With that, code gets migrated logic for logic, bug for bug, with testing so anal that it shows that what goes in is the same as what comes out of the migrated code. The business documentation will still be good, as will much of the code documentation if it's high-level enough, but the code itself will be the best documentation.

Until we get a standard in which the documentation produces the code, and all changes are made to the documentation rather than by quickly hacking the code, documentation and code will always drift apart.

So you see many bespoke solutions that go through the code and produce documentation from it, with varying levels of success; what handles one site's quirks in code may not work as well for another's.

Hence even the best documentation is wisely treated with a pinch of salt, and in many instances it's like comparing a book to the movie it spawned: some are close to the original, many not even close. That is documentation and code in a nutshell.

It's always best to map the data first - that can be done more easily and with more automation, especially for databases, by generating a schema - and then map what code talks to what, and gradually get to see what is happening.


> It would be very interesting to see the old systems rewritten in a modern language, with modern engineering practices, but keeping the old UI and UX (which often is incredibly ergonomic) - so as to limit scope and not mess it all up by trying to introduce mouse navigation and windowing mess.

I was the tech lead on one of these projects. Personally I was sad and frustrated that we had to keep the old UX/UI; I would much rather have made something more ergonomic for our users. Alas, retraining would have been too expensive, even though the result would probably have been more intuitive.

I do agree with you that there is some benefit to being able to do everything on a keyboard without having to deal with the baggage of what we consider modern.


At some point on the web, "ergonomic" came to mean all form and minimally viable function. Efficiency just isn't a real word in the UX/UI vocabulary when it comes to complex data entry.

30 years ago, those of us on green screens at Big Org knew more shortcuts than any emacs user. Now whenever I have to use a “modern” CRM it’s the most anti-productive aspect of my job.


A web UI is optimized for a user who can close the page at any time if he doesn't like something. You have 100 competitors and the user can flee to any one of them.

An enterprise UI is optimized for the speed of trained personnel. They can't close the page if they don't like it; they are paid to do the work.

These are completely different situations, and when someone mixes them up, it leads to bad UI.


I think it would make a lot more sense to pay COBOL developers at the same rate as other software developers. The era where we had analysts writing pseudo-code so clear that it could simply be "coded" into COBOL is over, if it ever really existed in the first place. In addition, paying COBOL developers more is so obviously less expensive than trying to completely rewrite all of that legacy code.

It is likely true that many people would not be excited to learn COBOL. That said, I do think there is a good number of developers, perhaps new to the field, who would be willing to write the code in return for a fair wage and the work experience. But this attitude that COBOL code is somehow worth less than JavaScript code needs to be worked out of the system. It simply is not true and it is clearly doing harm.


Reverse engineers can manage to stare at assembly and figure things out. Every programmer can learn the skills of staring at really fucked up representations of logic and eventually figure things out. It's definitely slower of course.

Now is definitely not the time for a rewrite. As much of a trap as this seems to be, it's actually the best move given the circumstances.

I may actually take IBM up on this since my father was a COBOL programmer, but I wouldn't plan to make a career of it.


Other than the constraints, most of the issues you raise are just as likely to happen in modern languages as well. That comes down to how the team was managed. If anything, I have found a lot of older code over-documented. My difficulty, when I had to assist in migrating some old COBOL from CICS on a zSeries, was understanding their file structure techniques. However, that was easily remedied by understanding the task at hand so that the data could be better represented on the new system.

Is it a trap? Well, if you want a secure position maintaining a code base and maybe eventually working with others to move it to a new platform, I do not see how. I have been around enough new languages to know we are always going to run into code bases we want nothing to do with, but here we are.

The problem to me is that you may land in a development shop that is not well maintained. The code has worked for so long that management outside the department just assumed that everyone knew everything.


I completely agree. I work with a large APL codebase and the main problem is not the language but the culture. Overly long expressions, one-letter variable names, gotos etc. make the code obfuscated.


I was the team lead to rewrite a large APL codebase with a very small team as part of a much larger group that refused to change and I'll agree with this. Once we started documenting and building tests for the code that replaced APL it became clear that the complicated bits were easy to replace and the hard part was just man hours converting all the conditional logic.


And APL requires supreme discipline to prevent that from happening.


> one-letter variable names

Just like in math!


Except in math, the notation is surrounded by natural-language prose which carries the brunt of the semantic load.


It’s not a trap, just IBM PR.

The legacy stuff is usually fine, it’s the layers of middleware scaffolds around the mainframe.

Mainframe workloads are 90% batch, so even under stress the mainframe can handle it. Your circa-2002 scaffolds are the problem.


But who would rewrite it for the low salaries? Most people I know are not interested in low-level correctness at all, the open source C++ projects on GitHub with a high churn rate always have tons of corner case bugs.

If you don't get the right people, the rewrite would be an overengineered object oriented nightmare with an endless stream of bugs.


> modern engineering practices

Heck, I’d be willing to settle for seeing modern software languages used with modern engineering practices.


Absolutely (and I say that as an erstwhile mainframe COBOL programmer). The language itself is dead easy.


The language isn't dead (and doesn't have to be). If anything, it must be taught as part of a programming language design course. In a way COBOL is similar to SQL which is far from dying. (IMHO it would have been useful if they had merged into one language; the available embedding of SQL into COBOL was kinda clunky.)
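For anyone curious what that embedding looks like in practice, it's roughly this (a minimal sketch in the usual DB2-style precompiler syntax; the table and column names here are made up):

           WORKING-STORAGE SECTION.
               EXEC SQL INCLUDE SQLCA END-EXEC.
           01  WS-CUST-ID      PIC S9(9) COMP.
           01  WS-CUST-NAME    PIC X(30).

           PROCEDURE DIVISION.
               MOVE 12345 TO WS-CUST-ID
          *    host variables get a leading colon inside EXEC SQL
               EXEC SQL
                   SELECT CUST_NAME
                     INTO :WS-CUST-NAME
                     FROM CUSTOMER
                    WHERE CUST_ID = :WS-CUST-ID
               END-EXEC
               IF SQLCODE NOT = 0
                   DISPLAY "SQL ERROR, SQLCODE = " SQLCODE
               END-IF

A precompiler rewrites the EXEC SQL block into calls before the COBOL compiler ever sees it, which is a big part of why it always felt bolted on rather than merged.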


The person you're responding to didn't claim that COBOL is dead. He/she wrote that COBOL is "dead easy", meaning "very easy".


> The language is not the problem - lack of comments, bad variable naming, bad structure (little or no procedures or readability), and just sheer volume of it, is.

I wonder what basis, evidence or data you might be using to make this assertion. You are assuming quite a bit there as well as generalizing all problems across all affected systems to have your list of issues as the root cause.

Could it be that nobody bothered to maintain and modernize these systems because spending more money on software that "works" isn't going to earn anyone in government points? Government and politics have metrics and fitness functions that do not align very well with the real world (anything outside of government or large stagnant companies).

And yet, at the same time, have you looked at open source libraries lately? The phrase that comes to mind is: rotten smelly stinking mess.

I just had to deal with one of those a few weeks ago. No comments, horrible code structure, massive class hierarchies, just awful stuff. The complexity and thickness of the interface they created was astounding. We re-wrote the entire thing in about thirty lines of code. Yeah. A massive multi-source-file library got boiled down to a handful of lines - no classes, just clean, simple, easy-to-understand code with comments anyone could understand.

I know it might be difficult for modern programmers to understand the kinds of constraints software developers had to work with in the '80s or before. A simple example of this would be single-character variable names. When you only have a few thousand bytes of memory and you are working with an interpreted language, variable names consume memory you desperately need, not to mention CPU cycles. So, yes, people resorted to using single-character names to conserve memory and improve execution time. Context is important.

I took one semester of COBOL back in the dark ages, FORTRAN also. Thankfully I never had to use them professionally. I started professional life using APL, C and FORTH. I realized, years later, how lucky I was to have been shoved into that path by a physics professor who insisted I veer away from COBOL/FORTRAN and take his APL class.


lack of comments, bad variable naming, bad structure (little or no procedures or readability), and just sheer volume of it

This has nothing to do with technology and everything to do with people. For all the progress we've made with technology in the past 50 years, we've made little or no progress with any of the items in this list.

The second biggest problem building software has always been programmers too small for the task at hand.

The biggest problem has been managers who are even smaller.

So don't blame COBOL or any other technology. Fix the people and you can build anything excellent from almost any tech.


> This has nothing to do with technology

Oh, it has everything to do with technology. Wouldn't want to waste that perfectly serviceable punch-card by punching a * in column 7.


The mess-up never comes from introducing the mouse or any other technical thing - it comes from your keyword, "introduce"...

What do you think it takes to get such a proposal off the ground? "Just reimplement granddad's system" is not it, I'm afraid.

Saw that so many times...


Anyone familiar with current node/npm/front-end web dev would have very little trouble dismantling the false notion of implicit chronological progress underlying this argument.


The IRS, treasury, and others locally are looking for COBOL programmers. I can explain why they're struggling to find any just by looking at the pay scale: $30,113 - $86,021 (GS-5 through GS-12, and most won't get the top end of the GS-12 scale). The $30K GS-5 starting is almost as low as working for their call center which is farcical.

By contrast state, county, and even city government are paying $65K - $85K starting going all the way up to $100K at the top end. And these are jobs where you're doing modern development, not COBOL.

People will start learning COBOL as soon as it makes rational sense. As it stands a lot of organizations aren't looking for any COBOL developer, they're looking for CHEAP COBOL developers. It doesn't make sense to learn when you'd lose money for taking those jobs.


That mirrors my experience in mainframes. Out of college I was writing HLASM for zOS systems making roughly 65k in Houston. Definitely not bad for right out of college from a non-prestigious state school, and I was happy to have work. However, it seemed the upper bound on what I could earn there after climbing the entire ladder was 100k.

I decided to move to NYC and was making 100k immediately as a Jr RoR developer at a lean startup that was paying probably 10-15% below market.

It makes little to no sense for anyone with other options to take these incredibly low paying jobs relative to what they could be earning, doing what I would argue is far easier development work. It continues to baffle me that these major organizations claim a shortage of mainframe developers, especially as all the senior ones are retiring, yet their pay scale looks comparable to what department managers made when I worked at Circuit City.


This is exactly true. The pay is this low because there is a glut of cheap cobol developers in the US due to decades of off shoring. There have been 2000+ cobol developers laid off here over the past 5 years alone (medium midwest metro).


There cannot simultaneously be a surplus of COBOL developers and a shortage of COBOL developers.

Well, supposedly there could be one in the Midwest and one in NJ, but theoretically those who do not have a job in the Midwest would be motivated to move, since their only options are:

A) Remain jobless in Midwest.

B) Switch careers in Midwest.

C) Move to NJ for immediately available position.

The bigger question is why does a COBOL dev accept work for 60k when they could make 120k in Javascript?


Low salaries on offer convincingly reveal the true narrative. The question is why is the false narrative being promoted? Rather effectively at that.


> The bigger question is why does a COBOL dev accept work for 60k when they could make 120k in Javascript?

Structural unemployment due to ageism?


Pay scales are idiotic on purpose to force the government to fail to hire required employees and pay 5 times as much by hiring a private sector company. This is all working as designed.


Does IRS work with consultant firms? I can see how someone sets up a consulting shop, finds COBOL brogrammers and charges IRS $5k/hour.


> As it stands a lot of organizations aren't looking for any COBOL developer, they're looking for CHEAP COBOL developers.

This is one of my pet peeves. I am also thinking that it is outrageous that I am facing such lack of supply of Porsches. What? What do you mean, I should be paying more than a thousand bucks for a brand new Porsche? What sense would that make?


Here is the official description & link to the course:

“Open Source COBOL Training – a brand new open source course designed to teach COBOL to beginners and refresh experienced professionals. IBM worked with clients and an institute of higher education to develop an in-depth COBOL Programming with VSCode course that will be available next week on the public domain at no charge to anyone. This curriculum will be made into a self-service video course with hands-on labs and tutorials available via Coursera and other learning platforms next month. The course will be available on IBM’s own training platform free of charge.”

[1] https://github.com/openmainframeproject

SOURCE: The full IBM press release is here:

[2] https://newsroom.ibm.com/2020-04-09-IBM-and-Open-Mainframe-P...


I'll also add the Open Mainframe Project blog post here as well...

https://www.openmainframeproject.org/blog/2020/04/09/open-ma...

A few other bits to clarify things ( coming from me being Director of the Open Mainframe Project )...

- The coursework itself is being contributed to a new open source project being hosted by Open Mainframe Project ( CC BY 4.0 license ).

- We would have liked it to be ready at the time of announcement, but it literally got approved by the Open Mainframe Project TAC as a new project about an hour before the blog post went live ;-). Have no fear, it should be landing next week ( there will be a bit of work to come on translating docx files to markdown, in case anyone wants to help ).

- Right now the course work focused on VS Code as an editor, but the project is very open to contributions that leverage other IDEs ( such as Eclipse Che, Atom, etc )

- Open Mainframe Project is part of the Linux Foundation, with IBM being one of the 30+ sponsoring organizations.

- On the notes I've seen around "hey let's rewrite all that COBOL code in some modern language", I won't add more fuel to that fire ;-). I will however say there is some interesting work in a project hosted by Open Mainframe Project called Zowe ( https://zowe.org ), which basically makes connecting to mainframe apps and data on z/OS much easier ( think REST APIs, CLI interface you can use on your laptop, App framework for creating browser based apps, etc ).

Anyways - hope this helps! Feel free to ping me if you want more details or help getting engaged ( @jmertic on Twitter ).


My first serious computer job (early 1990's) was COBOL.

I last used it in 1997, then moved on to Oracle PL/SQL, Java, Oracle's Java software stacks and now iOS ObjC/Swift.

Now I'm 52 years old. I have my own apps now on the store, don't need the money but I am looking for a new challenge - something a bit more social than working for myself.

I think I'll do this COBOL refresher. Only issue is I'm in Australia but I see there may be some demand here too. Nothing to lose - plenty of time at the moment to do the courses. Good for a laugh anyway.


Serious question, why do you use COBOL and "looking for a new challenge" in the same post? I found it a trivial language with a syntax that goes on for pages. It's entirely uninteresting. The only challenge is understanding any business requirements, and that's not a language thing.


The language itself is trivial to learn, the environment COBOL is running in is usually the challenge. I really enjoy bug fixing existing software, and I enjoy learning about existing systems. As I said I am done with working for myself, and as for the social aspect, I'd find it interesting to work with peers who may be closer to my age, and not a group who is just starting in the game working on their first big piece of software. I was there 30 years ago, this seems an opportunity to work with more mature and experienced folk. Also back in the day there were far more women working in software than there are now, there may be more gender balance in the older technologies.


It's interesting to me that systems could be written that function correctly once they go live and don't have to be taken offline for decades... the architecture, not the language


This really, really interests me as well, and I've written my iOS apps the same way too. They have no server side, no real dependencies, all client based and fully automated internally - so as well as being fully contained they schedule required notifications etc into the future without my intervention. As long as the user runs the app once about every four months (which isn't an issue given they get many notifications during that four months) then the app will work for a very long time and bring in passive revenue. I consider this a holy grail of software.

I like pure solar powered calculators for the same reason. If you were to send one back in time it would continue to work and be useful without requiring intervention or dying.


>(which isn't an issue given they get many notifications during that four months)

I assume users can turn notifications off though? Almost nothing on my iPhone is allowed to send me notifications.


Yes, they certainly can. Apple allows any apps notifications to be switched off at the OS level - it isn't optional. My users love the notifications though, and they commonly customise to get _more_ notifications. The reason it is 4 months is because iOS only allows 64 notifications to be scheduled in advance for any app, and if they want users can configure to schedule that many in a 4 month period.


Terrible code and terrible situations can be encountered in any language.

I had a Java-based project that gave off all sorts of hints of stink. It turned out the project involved integrating a system into a pre-existing code base with tens of millions of lines of code. Limited documentation, and so much abstraction that Java was just the syntax; the real language was the myriad of interdependent abstractions you had to understand to accomplish the integration task. Documentation was incredibly limited given the scale of everything. Obviously, progress was painfully slow.

Java itself was not the problem, everything else was.


Write a high-level-to-COBOL compiler.


You totally miss the point. This is not about creating new code. It's about understanding and changing old code. To understand old dense code that is written with no comments, variable names like N1, X2 etc - finding out what the code does is the problem.


I don't enjoy that stuff; someone else can do this academic project.


From the article: *State unemployment agencies are notoriously underfunded*

And there’s the problem summarized in less than a single sentence. Yes, they need help, but it’s the same kind of “help” as in “I want a BMW but I only have a dollar, please help”

There would be a ton more COBOL coders if companies paid the equivalent of FANG companies - I say this as a former COBOL coder myself


Leaving aside what they're willing to pay or not pay for COBOL developers, does anyone seriously believe that a bunch of developers can waltz into an old code base and speed it up by a large amount in a relevant timeframe? Which seems to be what these articles are at least implying.


The government doesn't know that. That's why they're calling it COBALT.

I honestly can't wait until all the old non tech people retire.


Do you think that new non tech people are better?


These new COBOL jobs in New Jersey and elsewhere are all volunteer positions? They are not paid jobs?


I am curious about this as well. If we take this "volunteer" word being used in these articles at face value, it means these states are asking for people to work on their systems for free… you've got to be frickin' kidding me. It's such a preposterous idea that I can only believe that it must have been an error or misinterpretation at some level which has entered into and lingered in the news zeitgeist like a fart in the shower and that these states actually do intend to remunerate their contractors at a fair rate. Why else would anyone accept these contracts? Are there really people who are both experienced COBOL devs and such huge fans of the bureaucracy of their states that they'll serve them for free? If there really is any overlap on that Venn diagram, it must be quite small - and shame on every single one of them for devaluing themselves.


They might mean 'volunteer' like people 'volunteer' for the army in the US. you get paid, and get the tools provided, etc., but you're not going to be forced out of your current life and conscripted in to COBOL work.

It's a bad word though, because it creates these confusions. However, they might be at a max of existing COBOL 'full timers'. Who would 'volunteer' to leave their current dev roles to jump in to the NJ COBOL life? Might mean moving to NJ as well...


The NJ site actually asks if you plan to seek compensation and that compensation would have to comply with state procurement laws. So don’t expect boku $$$ for these jobs. They want freebies. There’s no shortage of COBOL devs just a shortage of people willing to pay for them or pay market rates.

https://form.jotform.com/200966530056150

Why aren’t we asking other government employees to volunteer for free?


NY pays everybody who does any sort of IT or software dev work a starting salary of $56K at the moment. I expect NJ would be similar. You think that is too much for current employees and not enough for prospective COBOL programmers? Explain further if you please.


I, for one, am glad that there are people out there willing to sacrifice their time and experience for the greater good. This is one of those “ask not what your country can do for you, ask what you can do for your country” moments in time that will testify to the character of us all.


I remember reading some years ago that there are or were schools in India teaching COBOL because of the demand for maintenance programmers and the aging of US programmers who did that sort of thing in the 1970s. I wonder if that is still a thing.


They probably had their own Banking industry in mind. This was in Argentina so YMMV. Back in 2010 i had friends in university looking for internships and first jobs and COBOL was a real prospect. A friend took a deal with accenture where they would give you a 2 month-ish course and then send you to consult for banks. I remember the whole "COBOL devs are retiring! salaries are gonna be huge!" which was likely to be true for the agency's rate but didn't trickle down to coders, they used to make as much as the lower rung of IT (QA tester, etc). Later in life I worked at a bank and met the fabled close to retirement AS400/Cobol developer. He didn't make more than any ssr developer with a modern stack in demand.


There are three main efforts in the official announcement[0]:

- Forum for COBOL programmers to express interest in volunteering or hiring[1]

- Forum monitored by experienced COBOL programmers to help developers[2]

- Open source COBOL training materials from IBM[3] (Available “in the coming days”)

[0]: https://www.openmainframeproject.org/blog/2020/04/09/open-ma...

[1]: https://community.openmainframeproject.org/c/calling-all-cob...

[2]: https://community.openmainframeproject.org/c/cobol-technical...

[3]: https://github.com/openmainframeproject


I'm working at IBM (Europe) on a project whose main business app is in COBOL. Using Java and jt400.jar we just expose the business logic via web services. Even now we have COBOL programmers on the team who create new programs.

I find the main issue when you want to learn COBOL is access to a machine as close to a real one as possible, not a simulator. Maybe the Open Mainframe Project is the missing link between learning and practice.


I agree. I have attempted to learn COBOL on a few occasions, but it always gets stopped by the fact that there are, as far as I can tell, only three ways to do it without investing sometimes insurmountable amounts of money:

The first is GNU Cobol. It's quite functional for its intended purpose: porting mainframe applications to Linux. However, when writing a new application on Linux you usually want to do things like access arbitrary files (without hardcoding the filename in the source) or make network connections. It does have nice interactive screen support though.
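That said, getting a first program going with GnuCOBOL is about as painless as it gets. Something like this (a minimal sketch) compiles straight to a native executable:

           IDENTIFICATION DIVISION.
           PROGRAM-ID. HELLO.
           DATA DIVISION.
           WORKING-STORAGE SECTION.
           01  WS-ARGS     PIC X(100).
           PROCEDURE DIVISION.
          *    grab whatever was passed on the command line
               ACCEPT WS-ARGS FROM COMMAND-LINE
               DISPLAY "HELLO FROM GNUCOBOL, ARGS: " WS-ARGS
               STOP RUN.

Build and run with something like: cobc -x hello.cob && ./hello some-arg (cobc defaults to fixed-format source, hence the old fixed-column layout).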

The other option is to run MVS 3.8j in Hercules. This is a predecessor to the current z/OS which runs on mainframes today. The problem with this is that it's stuck in the 70's (which is when the last free version of MVS was released). A lot of work has been done by the community to keep it up to date, but the Cobol compiler is the language of 40 years ago, not modern Cobol.

None of the above options are really appealing unless you're like me and have a thing for messing around with stuff in their non-native environment.

The third option is: http://mtm2019.mybluemix.net/

When signing up, it gives you access to a z/OS account where you have a surprising level of access. It's probably running on an emulator (it's quite slow) but it does give you access to modern software, including compilers for Cobol, C, Java etc. It also has DB2 installed.

However, the purpose of this option is to learn z/OS, not necessarily to learn Cobol. And developing using the ISPF editor isn't particularly nice. They do give you ssh access to the Unix-compatibility environment though so maybe it's possible to edit files using Tramp in Emacs locally. I haven't tried that.

In any case, what is needed is a proper Cobol development environment that you can run locally on your workstation. As far as I understand, that's how Cobol developers normally work. IBM would do well by releasing such a product for free. However, I'm not having high hopes given the fact that the mainframe division seem to be actively hostile to any free software (look into the difficulty the community has to get even the smallest community-made improvements accepted by z/OS, or releasing some small tool for the MVS community).


>> They do give you ssh access to the Unix-compatibility environment though so maybe it's possible to edit files using Tramp in Emacs locally.

It's a UNIX shell. You can always use ed :)


You can actually ssh into it and use vi. That's probably the easiest way to do it. However, the edit-compile-test cycle is somewhat complicated.

You need to first edit the file, fine, you can use vi. Then you need to go into ISPF (or TSO) on a 3270 terminal to submit the batch job that compiles the code (and possibly runs it). Then you need to go into SDSF to view the results of the compilation.

Back in the 70's this was an acceptable way of working, but not what a modern programmer would expect.


You can submit jobs to JES2 from USS with a shell script. Easiest way I found was to fuse-mount a working directory, edit everything with VSCode/Vim With COBOL Extensions and have two shells, one constantly reading the output data file and another for submitting job cards.


Thanks for the information. I assume actual mainframe developers do these things. But can you do that from the environment provided to you by Master the Mainframe?

On the MVS 3.8j side, I can submit a job directly via the virtual card reader and then read the results directly from the printer and feed it into Emacs. That's the most efficient way to work with Hercules but you'll be stuck with a very old version of Cobol.


>> You need to first edit the file, fine, you can use vi. Then you need to go into ISPF (or TSO) on a 3270 terminal to submit the batch job that compiles the code (and possibly runs it). Then you need to go into SDSF to view the results of the compilation.

Well, I don't know how you can avoid that part, i.e. submitting a batch job and looking at the results separately. SuperPaintMan describes an alternative but I'm not sure how this works. When I was working on a mainframe, it was like you say, except I couldn't edit files remotely with vi - because big financial corporation security :)

To be fair, I didn't try. There probably was a way. I didn't mind the editor I had on ISPF, the EZY editor. The only annoying thing was that, if I understand this correctly, EBCDIC doesn't have an end-of-line character, so I couldn't just hit Ctrl-End to go to the end of a line; I had to use the arrow keys or touch the mouse (yuck!).


This sounds like university HPC today


This isn't the first time the world has needed lots of new COBOL programmers to burst on a single project. I'd suggest the same solution as last time: pay people $500-$1,000 per hour on short term contracts to do so.


As a healthcare worker who also has an interest in programming - why does this comment not surprise me in HN? If a worldwide pandemic doesn't make you reflect on problems in our societal structure - that our key workers are the most poorly paid, but part of a necessary backbone of any country, and that the most over-paid workers are often the least 'key' at times like this - then I guess nothing will. But sure, the solution is just to pay programmers more money, because that is all they care about.


“Poor planning on your part does not necessitate an emergency on mine.”

Which is to say that these are organizations that have been sitting on this problem for thirty or more years, doing nothing about it, and suddenly they cannot handle the load due to THEIR greed, apathy, and incompetence. But programmers are meant to queue up and work for free in order to fix it because it is now a "crisis" of their intentional making.

This is an argument for privatized profits and socialized losses. How about no? How about we eat into these organizations' balance sheets in order to fix the massive financial and technical debt they let build up, because frankly they deserve it and it is the only way they'll learn. Any other solution will just give them an excuse to do this again.

Why should programmers bear the cost of these organizations' mistakes while the organizations seemingly get a free ride and keep the "good year" windfall? It is immoral for programmers to profit, but not for these organizations that created these problems out of their greed? Nope.


This comment seems to ignore who the "they/THEIR" are. The organization that chose not to invest in unemployment-benefits IT was the voters and taxpayers of New Jersey/"name that state".

Yes, it might be nice to have voters with more foresight and longer-term horizons who don't mind paying higher taxes. But let's not blame some nameless profit-centered corporation.


“the most over-paid workers are often the least key”.

The ONLY reason that social distancing is an effective option is that so much is possible online and that all that infrastructure is good enough to handle the surge in traffic it is currently experiencing. Imagine an alternate world in which these services were not as competent.


I don't know about that. I suspect that most of the truly essential work (health care, driving trucks/trains, keeping power plants running, producing food and toilet paper, etc.) still can't be done from home.

I also think that, precisely because of that, social distancing will only slow the rate of exponential growth, not prevent it.


That's harsh criticism, maybe well deserved, but it doesn't provide a solution. The GP provided a reasonable solution involving money. This is a short-term need that requires a fast turnaround. Training brand-new programmers doesn't sound efficient at all.

There's also the context of spending more money at a time when the Fed prints trillions of dollars every week. This will be a drop in the bucket, and in an environment where governments have decided to spend their way out of the problem.


You can have the solution fast, cheap or good. Pick two.


But you chose your profession, it wasn't assigned to you. You knew going in that either it's more about people, less about money if you're a nurse, or more money if you're a cardiac surgeon. Am I missing something?


How will you get them otherwise? There are way more jobs than qualified developers. Why would anyone want to work on an antiquated technology stack like Cobol? By the time they get up to speed on fixing the Cobol systems, the pandemic will be over. The real solution is to think long term and rewrite all of these antiquated systems so that the next time there is an emergency it will be much simpler to find qualified developers.


>> Why would anyone want to work on an antiquated technology stack like Cobol?

Because it's actually a hell of a lot of fun. I did a year in a Cobol shop as a graduate developer at a large financial corporation. Working on a mainframe was the most fun I had on that job. And why not? You're logging on to a gigantic computer with millions of users and billions of transactions daily, with a text-based user interface that looks like it was designed by Tarn Adams. And I say that 100% as a compliment.

Seen another way, the "antiquated Cobol stack" is like a deep dive into the history of computer science and programming languages. You can see with your own eyes how stuff used to be done 50 years ago. And there is so much to learn. It's not just Cobol: there's a whole bunch of other languages, like Rexx for example, which is like JavaScript on a mainframe, or, well, JCL, which is an absolute horror to behold of course. There's a whole new operating system to learn, or three, and there's a whole new computer architecture to become familiar with. How is that not absolute programmer heaven?

I mean, seriously, when I first got all the permissions and so on that I needed to work on a mainframe, I was giggling to myself like a little girl. "Really? They gave me access to all this?". It was like someone had given me the keys to the playground.

The job sure got a bit boring after a while, which is the reason I left, but for a few months it was just sheer tomfoolery, poking at things and finding how things worked.

>> The real solution is to think long term and rewrite all of these antiquated systems so that the next time there is an emergency it will be much simpler to find qualified developers.

That is not a sustainable solution. Fifty years from now people will be making jokes about "that antiquated Python stack" and state agencies will be ringing alarm bells for the lack of experienced young Python programmers. You can't just keep throwing out all the old code and replacing it with whatever new language is cool right now.


> Fifty years from now people will be making jokes about "that antiquated Python stack" and state agencies will be ringing alarm bells for the lack of experienced young Python programmers.

I wonder how many people realise Python is more than 30 years old already.


> The real solution is to think long term and rewrite all of these antiquated systems so that the next time there is an emergency it will be much simpler to find qualified developers.

Assuming the 're-written' version isn't similarly out of date by that point in time, the real solution is for business/government to finally understand that if your org depends on software then it's infrastructure that requires constant maintenance and scheduled replacement.

Getting them to understand that is an exercise for the reader.


Quite. A proposal for yet more of the consumer-capitalist program of dehumanising via inculcation of greed which is devastating the globe. I entered tech because I was broke at the time. But it truly is an ethical wasteland, and I bitterly regret not having done something more hospitable to human fraternity.


You should consider joining the Tech Workers Coalition[0].

0. https://techworkerscoalition.org/subscribe/


That just gets them through this crisis but leaves everything else unchanged.

Someone needs to decide whether Cobol is going to live or die. If it is going to live then people have to stop pretending that it is not there, create proper training courses for it and recruit people to maintain and develop the systems properly. And of course the requirements should be reviewed and updated to include the capacity to cope with the volume of work that is being experienced by these systems.

Of course this won't happen because, as soon as the crisis is past, all those USD 1000 per hour people will be laid off, and the employers will breathe a sigh of relief that it is all over and go back to their old ways.


Had to recruit COBOL engineers 8 years ago. It was almost impossible to find someone with less than 10 years of experience. Most had 15-20+ years of experience and were about to retire. Many with COBOL experience had moved to cloud tech and would never move back to COBOL, which would leave them outdated. New people in the field seemingly only got trained because they were willing to learn it at a bank.

It's a struggle to hire these people because there are so few of them, and many companies only want to hire local talent (within 20-50 miles, and they don't want to sponsor visas).

There's money to be made on supporting tech this old, likely more on the consulting side than becoming an internal employee.


This whole recent COBOL surge reminds me of a friend I had who was upset that he had a Bachelor's in Computer Science but still couldn't find a job while his friend with no college degree taught himself COBOL and got a job at a local bank.


Yeah, the devs I've met with the most stable and laid-back jobs are all AS/400, COBOL/CICS, or IBM RPG experts.


That's probably because they work in isolation from newer technologies which are constantly adding business rules. Most of the mainframe work I see involves maintaining old code bases and handling batch processing.


Good guess. If the rest of the organization bends over backwards to ensure that you never have to update your code to accommodate, say, more than 8 ASCII characters in a password, or hashed passwords, and the rest of the org puts its security at risk to make sure your codebase never has to change... yeah, it might be a bit laid back... :)


"Why? Because coding a percentage in COBOL would take an estimated five months."

Five months to code a percentage, or five months to add the calculation, integrate that data point with other systems (does this need to be shown as its own field on any unemployment forms or reports? etc.), test it to the point where it has been demonstrated stable enough to be included in a critical system, and then deliver this change?

I don't know the first thing about COBOL but the fact that government is (for good reasons) slow to move, the stability needed in critical things like unemployment processing, and all the other things that go into "coding a percentage", explain that timeline a hell of a lot more than what language was used.


I'm sure after you finish the training you'll find all the ads looking for people with 5 years experience.

Assuming the demand for people still exists by then.


No, they need senior programmers, so 10+ years of experience.

In the same way they ask for 10+ years of programming experience in Swift or Rust.


Young COBOL specialists are being fired here and there. They can't find a job. They can't "volunteer" either; they need to get paid at the end of the day.

I see Fiserv firing all their 3-6 years COBOL juniors (last in, first out.) They retain only the most expensive 15-25 years pros.


Next: COBOL 30 day bootcamp with guaranteed job at the end.


I'd say yes to that in a heartbeat. Thirty days is probably not enough for someone new to programming, but for experienced programmers in various other languages it's probably enough to get the basics down and get over the "know just enough to be dangerous" hump.

If these orgs desperate for devs want to put their money where their mouth is and start a program like that, hit me up. Sure.


I'd sign up for something like that in a heartbeat.

I used to do COBOL, but the problem is that the possibilities for advancement are very limited - there are far more jobs available - and promotion pathways - for a Python or Java coder than there are for a COBOL coder.

In exchange for a refresher in COBOL and a guaranteed job, yes that is a much better deal.


Same here.


Personally I would like to be paid to learn such a technology.


There are techs people would accept pay cuts to work with, and then there's COBOL.


I'd imagine a lot of the problems are due to the increased volume, so they would show up as abends due to exceeding data set size definitions in JCL, VSAM, IMS or DB2 (or even IDMS and the other database technologies in use 20-30 years ago). If the unemployment system is adding a COVID-related unemployment code, then they would need someone to crawl through the code looking for where the logic needs to change in calculating the benefit amount, duration of benefits, and such.
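
To make that concrete, the change itself is often tiny; a completely made-up sketch (none of these names or rates come from any real state system):

    01  Claim-Type           PIC X(2).
        88 Covid-Claim       VALUE "CV".
    01  Annual-Wages         PIC 9(7)V99.
    01  Weekly-Benefit       PIC 9(5)V99.

        if Covid-Claim
            compute Weekly-Benefit rounded = Annual-Wages * 0.60 / 52
        else
            compute Weekly-Benefit rounded = Annual-Wages * 0.50 / 52
        end-if

The hard part isn't writing that; it's finding every program and copybook that tests the claim type, every form or report the new field has to show up on, and every JCL job and data set definition that now has to survive the surge in volume.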


COBOL programs are divided into paragraphs that start with a paragraph name and have a group of statements within the paragraph. A program starts at the first paragraph and continues sequentially through multiple paragraphs until a STOP RUN statement is executed or it executes the last statement in the last paragraph. Easy peasy.

Well, not so fast. If you PERFORM a paragraph, then it's like a BASIC GOSUB statement, or "jump and store" in assembly language. Sort of like a function call, but with no local variables.

Or you can do PERFORM A THRU B to jump to paragraph A, continuing sequentially until the end of paragraph B.

Or you can do PERFORM A THRU B VARYING X FROM 1 BY 1 UNTIL C, which is sort of like a for loop.

The nasty thing about all this is that looking at a paragraph (block of code), you can't tell shit about how it gets executed. Does it fall through to the next paragraph? You can't tell: that's determined dynamically at runtime. Is it a loop? Can't tell. That's one of the things that makes large COBOL programs really hard to understand.

Another fun COBOL statement is ALTER:

https://riptutorial.com/cobol/example/19820/a-contrived-exam...

If you think GOTO is bad, it's a walk in the park compared to the lethal combination of ALTER + GOTO. Basically, when you say GO TO A, it can go to some other place based on a previous ALTER statement. Now we're having fun!
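
Here's a contrived little program (made up for illustration; it should compile under GnuCOBOL modulo source-format/column fiddling) that shows why you can't trust your eyes:

    identification division.
    program-id. flowdemo.
    data division.
    working-storage section.
    01  I    PIC 9.
    procedure division.
    main-para.
        perform count-para varying I from 1 by 1 until I > 3
        alter switch-para to proceed to the-end
        go to switch-para.
    count-para.
        display "pass " I.
    switch-para.
        go to count-para.
    the-end.
        stop run.

Reading switch-para, you'd swear it always jumps back to count-para, but the ALTER in main-para silently retargets it to the-end before it ever runs. Delete the ALTER line and the program becomes an infinite loop - which you also can't see from switch-para. And nothing in count-para itself tells you whether it's a loop body, a GO TO target, or code that just gets fallen into from above.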


All this, Cobol programmers are in short supply and now let's train new ones, sounds like a sort of technical debt bailout.

As some have noted, management failed to look to the future, and now their business has a major problem it is struggling to deal with. Just as we debate not financially bailing out companies that failed to plan for future disasters or catastrophes, I would argue we shouldn't bail out these companies that failed to maintain their own internal technical architecture by planning for upgrades and future maintenance.

It's akin to the BS ISPs argue about not being able to afford to maintain and upkeep their infrastructure. It's largely BS. Put less in upper management's pockets and more into the business, and all of a sudden the business works and has resiliency against unexpected events. If you think these "vital" banks and hospitals can't afford it, then you haven't been paying attention.


What other languages can we expect to see a similar surge in demand for? FORTRAN? In the data analysis space, my bet is on SAS.


Fortran (standardized without caps for at least the last couple of decades) is current to the 2018 release. I haven't come across anybody who cares or is using it. You do still see a lot of older Fortran code in symbolic and algebraic maths in the deeper code bases, because the algorithms are complex and nobody really wants to touch someone else's work if it can still be wrapped in something like R or Python and is nearly as performant as C/C++.


There's Fortran inside your numpy/R installation doing the heavy lifting.

Not sure if there's real demand for paid Fortran programmers, but it's certainly not a dead language.


BLAS benchmarks show that Fortran is actually faster than C in lots of scenarios.


Sure if you turn off bounds checking or similar.


Modern Fortran (2008-2018) is indeed actively used, although less so than 40 years ago, as there are many rivals now. Just check out the Fortran projects on GitHub.


I guess the thing about Fortran, to circle back to your question, is that it isn't common in critical infrastructure.


It's often critical, just in a completely different field. The best explanation is simply the names - Common Business-Oriented Language vs Formula Translation.

So the Fortran equivalent of the current situation was NASA being desperate for Fortran programmers to patch Voyager a few years ago (which may have been overblown - my understanding is that they had the programmers, someone just took the "what's the problem?" and ran with it). Orbital models, weather models, stuff that's deep science put into code. COBOL is stuff where business logic is put into code.

They're both titans in their fields to this day - just different fields. At a very, very high level it's essentially Matlab vs Excel.


I work at a place that writes new Fortran code today. Also C++, Python, Julia. Scientists will choose whatever has the least impedance mismatch to a library, some needed data set, or their brain.


To be sure. And I don't think modern Fortran is so obscure that you couldn't basically pick it up over a weekend from the language specification.

Every modern Linux distribution has a current gfortran bundled.


Modern Fortran is amazing. Force your company to move to Fortran 2008 and beyond. A good reference book is Modern Fortran Explained: Incorporating Fortran 2018.


Are accurate weather forecasts not part of society's critical infrastructure?


They are but I'm saying that the code often isn't redundant in the same way a piece of infrastructure may be.


You joke, but I still see more FORTRAN jobs around where I live than Ruby jobs. I even worked for a consultancy that did a Matlab -> FORTRAN rewrite.


Fortran is super fast. For math/analytical stuff it's preferred over C. But to avoid reliance on a language with so few devs, these orgs should only keep the heavy-lifting math code in Fortran, and expose it as wrapped functions to some other language that is more common and easier to use, maintain, and hire for.


Indeed - with its stricter semantics, Fortran is much easier for the compiler to generate highly optimized machine code for.


Unemployment claims processing is something that each State does and yet each has its own unique way of handling it. This approach is followed for every multimillion dollar endeavor, building costly, redundant solutions to the same problem. This has been great for the private sector but costly for the public.


It's very frustrating to me that we're even talking about these COBOL programs today. During Y2K there was suddenly a huge demand for COBOL programmers to come out of retirement and young programmers to frantically learn it. The claim I remember at the time was that those systems should have already been rewritten, but the departments were underfunded, and they were finally doing it since it was an emergency. And yet, all they did was a little hacky patch and 20 years later it's still running. So we're 40-60 years on with these pieces of software. Look for whatever hacks they made for Y2K to be biting people in the butt again in 20-40 years, and programming historians to be pulled out of universities to patch again.


This month I wrote two Rexx stored procedures: 1. one that does a Db2 bind, and 2. one that executes any DML, DDL, DCL, and SELECTs in Db2.

These stored procedures will be triggered by REST APIs and Endevor (version control). What makes me happy is that they will work for another 10+ years without any upgrades or me tinkering with the code again for a new version.

If it were any web technology or cloud application, I would be getting an email saying they are going to decommission a version of the language, so I would need to rebuild my code, only to find it fails in the new version (every 2 years). That doesn't happen on the mainframe.


Do you really want to end up in a situation where you're in a position of maintaining someone else's ancient codebase? I guarantee you this is not a path to anywhere but misery.


I really like working with ancient codebases. There is nothing more satisfying than deleting 10k+ lines of spaghetti code. Yes, you have to spend days or even weeks figuring out how it works, hunting down edge cases and bugs. If you like playing detective, you will like it.

I'd rather work on a huge legacy codebase than rewrite the same CRUD app in the JS flavor of the week.


As a freelancer who does this in 2 out of 3 jobs, I can tell you it's definitely fun if you like a challenge - as in, debugging for 3 hours only to realize that the original coder in the '80s never initialized a structure, and that a newer tool was used to create the modern platform. And when you come on board for maintenance/upgrades, nobody tells you any of this. So misery or challenge depends on your point of view, but the other side of the coin is... well, the coin itself - it's huge :). You get paid handsomely.


Reportedly, there are definitely more interesting challenges. We see this in academia quite frequently, where older professors have some hard-worked program in BASIC from whenever, and they try really hard to get someone to keep it alive, since otherwise their efforts really do disappear. For the person taking this on, it's tedious, thankless, and of dubious tangible merit, when they should be focused on their own contemporary work.

Have the reports of high pay been confirmed? AFAIK the push now is for volunteers.


I have some experience at porting ancient software to more modern languages, and I found it very satisfying.

Maintaining it as it is, without the authority to document and refactor it might not be as much fun though.


Forget the talk about pay being too high or too low. This is a way to help our neighbors who are hurting, a way to help with our skills.

A question I'll ask: could someone like me - retired from a long development career, with experience in many languages but not COBOL - take this course, or a similar one, and be useful in propping up a rickety code base on a volunteer basis? If yes, are any of these states allowing remote work on it?


I looked out of curiosity and there are fewer than 10 Cobol jobs where I am (4 million metro area, USA). Seems like this is mostly marketing?


I'll likely run through this course. I've always been interested in this kind of tech; I did Master the Mainframe previously.


The article mentions a training class, but I don't actually see any link to it. Would love to see the announcement from IBM.


https://www.zdnet.com/article/ibm-open-mainframe-project-lau... seems to have more information but the course itself (https://github.com/openmainframeproject/cobol-programming-co...) is just an empty Github repository.

edit to add https://www.openmainframeproject.org/blog/2020/04/09/open-ma... is the actual announcement.


Thank you for this


Unless they are going to pay me way above my current pay to compensate for dealing with an obsolete language and the programming practices of its era, no way.

This is a great non-monetary example of one of the biggest issues of our time: as a civilization, we have become terrible at investing in future contingencies. In the last few months, we have witnessed how our biases prevented most of us (and businesses) from building up meaningful savings for stormy weather, and the COBOL situation is hardly different. Banks could have invested in gradually switching from older mainframes to modern ones, with software built in any modern language, and they were (or should have been) in one of the best positions to do this. Instead, they said "whatever" or were simply complacent about where technology was headed, and did little or nothing.

As a society in its current state, we need to take a serious look at ourselves in the mirror.


Okay so I can learn it, but can I run it? Is there a mainframe simulator out there?


I would not have attended one even if I got paid for it. Why should I flush my career down the toilet, sentencing myself to digging into piles of 50-year-old crappy code? Stop riding a dead horse.


meanwhile, in the 24th century... https://abstrusegoose.com/323

(gonna keep posting this on all the COBOL threads haha)


That's a bit like Tom Sawyer offering free fence painting.


Let it die


The site is horrendous with ads


I don't want to sound condescending, but why can't these systems be rewritten in a modern language? Now is obviously not the right time to do this; I am wondering whether there are technical limitations to doing this.


They’d have to be revalidated. Don’t fix it if it’s not broken.

Edit: and when do you upgrade? If they upgraded in ‘85, it would be in C++. ‘95, and it would be Java 1.0. ‘05 and it would have been VB6. None of those would have been substantially better save that they would have been easier to hire maintenance programmers for.


C++ or Java seem like great alternatives. And I don't get the reference to VB6 - in 2005 it would still have been Java, and in 2015 as well.


Java 1 could actually be progressively migrated to up-to-date Java 11. It would have been a fantastic choice if they had started a reimplementation (unless performance was an issue).


Ok got it. But it seems to be broken somehow if they need programmers now.

Maybe write the tests first so it can be validated easier?


Upgrade? You mean re-write.


And now that they're broken, they can't be fixed. ;)

With so many legacy systems that are often in very important places, I wonder whether it wouldn't be smarter to spend money on systems that convert a modern language to COBOL, e.g. python2cobol. Is that impossible?


I guess there are technical reasons. A big bank does risk assessment; I can't imagine a problem like this going unnoticed for long.


The problem with these systems is rarely the language itself. It's that they're spaghetti monoliths - with decades of growth and tangles. And it's really difficult to overstate monolith here - government and finance are often the Himalayan examples, monoliths on tectonic scales.

The closest that modern design patterns come to these systems, is using them as the nightmarish example that justifies why modern practices exist.

It's rarely actually a technological problem. It's scale, scope, documentation, budget and motivation. You have to take Mount Everest, and carve it into 2 million separate boulders. Document every single one of them. Paint some of them. Replace some of them with stronger materials. If any of them move, you failed. If any snow is disturbed, you failed. If the climbers even notice this is happening, or has happened, you've failed. And on top of this herculean feat, the person paying for it needs to understand that despite the insane cost of this endeavour, he's probably not going to see a single benefit - but his successor in 10-20 years will. But if you fail, he's going to feel that hard and fast.


>> I don't want to sound condescending but why can't these systems be rewritten in a modern language?

Cobol was a modern language 50 years ago. Python will be an ancient language 50 years from now. What do we do? Keep rewriting everything every 50 years?

And those ancient Cobol codebases actually have a big advantage: they've been maintained for so long that all the major bugs have been virtually eliminated. Creating a new system from scratch means another 10 to 20 years of maintenance until the new system reaches the stability of the old one.

This is no joke. Given that most of the Cobol running today is running on mainframes at banks and card networks and the like, "a new bug" may translate to a few hundred thousand dollars of losses.

There's no point in building a new house of cards every few decades.


> What do we do? Keep rewriting everything every 50 years?

Uh… sure? What would you consider the minimum amount of time after which rewriting a large but fairly critical codebase to modern standards becomes acceptable?

> And those ancient Cobol codebases actually have a big advantage: they've been maintained for so long that all the major bugs have been virtually eliminated. Creating a new system from scratch means another 10 to 20 years of maintance until the new system reaches the stability of the old one.

That's not quite fair. Having a known-good code base while porting means that you can just rewrite the existing algorithms in the new language and then run some automated tests to make sure you get the same output from the same input across both systems. Unless you're doing a black-box rewrite for some reason, you're not really throwing away all of the maintenance done on the existing system.


Unfortunately, nothing works that effortlessly in software.

Re-writing an existing algorithm is one thing, but even that is likely to be a big source of new bugs, given that Cobol is actually a quite low-level language and much code will rely on its specific view of a mainframe's architecture.

The bigger problem is that any implementation of complex business logic is going to depend very heavily on the facilities provided by whatever language it's originally implemented in (Cobol, in this case, obviously). A direct translation to a new language is likely to be completely impossible. And the bugs will grow in all the semantic gaps between the old language, and the new.

And that's before considering that, for Cobol in particular, the Cobol code itself is only half the story. Cobol programs run as batch jobs controlled by JCL ("Job Control Language"), which often means that crucial aspects of business logic are spread over multiple files in _two_ languages. And the JCL part is a mess. I didn't mind Cobol when I was working with it, I even came to like it a bit actually. JCL is really, really awful.

But, aesthetics aside, where does all the JCL-encoded logic go? Is that translated to the new language as well? That's going to be really hard given that JCL is operating-system specific. Is it going to be translated into scripts in a new shell language? The difference between concepts in JCL and, say, bash or PowerShell, is going to be impossible to bridge without making drastic changes - and cultivating new bugs.

In general, translation of a large codebase between two very different languages is going to cause lots and lots of new bugs. So, if you rewrite everything every 50 years, in 200 years you'll spend a total of 40-80 years fixing bugs. If you write it once and let it be, you'll spend at most 20. I don't see a good reason to do it.

And what's wrong with an "antiquated language" anyway? I mean is it just aesthetics we're talking about here? Is it the lack of programmers that's the problem? The latter is sure to make translation even harder and more bug-prone. What is the real reason to change a working codebase every n years?


> What do we do? Keep rewriting everything every 50 years?

Yes, or sooner.

And you keep the 'institutional knowledge' externalized in documents, and you use testing tools and use virtualized systems and what not (virtual systems were available to consumers even 20 years ago - this is not that new).

"Upgrade" or "rewrite" every 10-15 years. This should just be a cost of maintenance. I'm at the point where I've had PHP code running on systems for 15+ years (had a call from someone in 2017 about software started in 2002 and last touched in 2004). There's a 'on the public internet' distinction with web apps vs internal bank systems, for example, agreed, but it doesn't remove the need for upgrading old systems. Doing it on your own schedule, on your own terms, vs having to deal with systems in crisis, is where the benefit is.


> and you use testing tools and use virtualized systems and what not

"what not" has been running in production since 1972: https://www.ibm.com/it-infrastructure/z/zvm


Sure... I know my experience of virtualization tools is relatively limited, but I was using consumer versions of VMware in... 1999, IIRC. These ideas aren't terribly new.


But every change to the Cobol codebase carries a risk of introducing new bugs (and presumably there are changes if they employ programmers to work on it) so if a newer language and platform makes changes easier and less risky it could still be a win.


Can the hardware not be upgraded? The article says the system is struggling with 400k people a week in New York. That's terribly slow. No overclocking?


Thanks for your comment, that's something I can relate to.

Somehow I think C or C++ wouldn't run into the same problem, while I can see this happening to Python.


Here's some simple COBOL code:

http://www.csis.ul.ie/cobol/examples/Conditn/Conditions.htm

    identification division.
    program-id. letters.
    data division.
    working-storage section.
    01  Char               PIC X.
        88 Vowel           VALUE "a", "e", "i", "o", "u".
        88 Consonant       VALUE "b", "c", "d", "f", "g", "h"
                                 "j" THRU "n", "p" THRU "t", "v" THRU "z".
        88 Digit           VALUE "0" THRU "9".
        88 ValidCharacter  VALUE "a" THRU "z", "0" THRU "9".
    procedure division.
    begin.
        display "Enter lower-case character or digit. No data ends.".
        accept Char.
        perform until not ValidCharacter
            evaluate true
                when Vowel display "The letter " Char " is a vowel."
                when Consonant display "The letter " Char " is a consonant."
                when Digit display Char " is a digit."
                when other display "problems found"
            end-evaluate
        accept Char
        end-perform
        stop run.
(I removed some chaff which GnuCOBOL doesn't need and fixed an apparent bug.)

So... where is it actually testing what kind of character you input? Where is the code for that? You input a specification for what a Vowel is, for example, and you write explicit code for what to do when a Vowel is input, but where is the code which goes through the specification and decides, yep, that's a Vowel?

COBOL is kind of an odd language. It's verbose in some respects and quite concise in others. Rewriting COBOL into something else would take actual human effort if you wanted the "something else" to look like code a human wrote, as opposed to the intermediate pass of an optimizing compiler, which is what the C GnuCOBOL can output looks like. Re-writing the COBOL might be the best move in some cases, or replacing it with entirely new code, but it isn't something you'd be able to do "for free" in any sense, especially with regards to time.


"when Vowel" is not very different to bog standard switch-statement magic. Actually in this case it looks like Cobol is almost doing a kind of pattern matching. Which is cool. Although (my Cobol is a bit rusty) I think there might be a more elaborate way to write "when Vowel" that hides less of the actual process? Not sure.

Cobol is really not such a bad language. It's just got a lot of ...ceremony. All those forced divisions and sections. But that's a feature: in the olden days, structured programming was a big thing. And an experienced Cobol programmer can take a quick look at a big Cobol file and find where everything is in a blink.
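
If it helps, the more explicit spelling I was thinking of (rusty, so treat it as a sketch) just evaluates the variable and stacks the WHENs, which is what the 88-levels are hiding:

    evaluate Char
        when "a" when "e" when "i" when "o" when "u"
            display "The letter " Char " is a vowel."
        when "0" thru "9"
            display Char " is a digit."
        when other
            display "problems found"
    end-evaluate

The condition-name version just moves those value lists up into the data declaration and gives them names.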


Java similarly imposes a lot of structure that allows you to more easily find where everything is, but the trade-off is that there is a lot more of this everything to find, whereas in a more flexible language you would have less code to read through and therefore less need for organizing it.


> "when Vowel" is not very different to bog standard switch-statement magic. Actually in this case it looks like Cobol is almost doing a kind of pattern matching.

It is, but the "magic" is that the pattern-matching is part of the variable declaration, so you can reuse those patterns wherever you can use the variable.


It's actually still easier on my eyes than JavaScript.


That looks a lot like Visual Basic, except it allows you to name the patterns.

  sub letters
      dim char as string
      do
          char = inputbox("Enter lower-case character or digit. No data ends.")
          select case char
              case "a", "e", "i", "o", "u"
                  msgbox "The letter " & char & " is a vowel."
              case "b", "c", "d", "f", "g", "h", _
                   "j" to "n", "p" to "t", "v" to "z"
                   msgbox "The letter " & char & " is a consonant."
              case "0" to "9"
                  msgbox char & " is a digit."
              case else
                  exit do
          end select
      loop
  end sub
(This example is actually LibreOffice Basic, as I'm not on a Windows machine right now, but it should be the same in VBA and VB6.)


> That looks a lot like Visual Basic, except it allows you to name the patterns.

More to the point, it allows you to keep the patterns near the variable declaration, so you can reuse them.

The pattern matching is part of how the data is declared.


Can you also reuse them for different variables, if you want to store two of these chars? This program appears to have the patterns declared as part of the variable declaration.


Thanks for posting this. Our computer teacher in high school skipped the outdated COBOL sections (just an intro, around 2004). I realise I'd never actually seen a code sample until now!

Aside, RE testing, a wild guess: could the "accept char" be tested with emulated keyboard inputs?


Never saw COBOL in my life before; I guess it's the "evaluate true when" part. Not very explicit, but readable. Maybe it's just a well-written example, but it's something to work with.


If it were open sourced, there'd probably be some people willing to modernise the system for free.

Govt IT systems and procurement are such a mess, putting it all out there for people to review and complain about is the only way it'll ever get better.



