Fourth-generation programming language (wikipedia.org)
120 points by luu on April 29, 2022 | 103 comments



Back around 1993, plus or minus, I was working for a company whose name you'd recognize but I won't mention. We had an essentially failed project that they wanted to get back on its feet by changing to a more conventional platform.

But which platform? We had a few contenders: PowerBuilder, Gupta SQLWindows, and good old C++ (using Borland's OWL framework, iirc). So they divided us up into separate teams, each to try a small proof-of-concept with our respective assigned platform, seeing who could accomplish the most in a month.

I was on the SQLWindows team, and we managed to implement the most of the PoC project. But when we developers got together and compared notes, we found that although my team had won at the finish line, the details told a different story. The C++ team finished 2nd, but if you looked at the specifics of our progress, my team got off to a running start, while the C++ team had to build a bunch of infrastructure first. After that, their velocity was much higher than ours, so if the PoC test had been extended for another week, they'd have wound up the winners.

We had drawn this all out on a whiteboard, with the intention of explaining this to management the next day. But when we got into the office that morning, our whiteboard had been erased, replaced with just "SQLWindows", and management told us that whatever the results, it had been decided to go with SQLWindows.

I cannot deal with office politics.


Even if you had presented, there was a big chance that SQLWindows would have been chosen anyway. The risk calculation senior leadership would have made was: deliver something sooner with iterative improvement, versus a long ramp-up with more efficiency later. While the first may end up being more expensive, it's less risky and more likely to pay off. The longer and more complicated a project's ramp-up, the higher the risk and the more likely it will fail before it even delivers.

Leadership is FAR more likely to lean on a sure thing than higher risk/reward. To bypass that you need a proven strong leader and team who deliver big complicated projects, and a VERY strong presentation to show the problem was thought through.

Even then, competing priorities and other factors may come into the decision making process.

And I am not saying I agree with the decision making much of the time, but this is what I have experienced.

To be very honest, the only time I managed to get a project like this approved was when I proved with 100% certainty that all the alternatives would fail, why, and how. Even then it took me 1.5 years and significant buy-in from principal engineers to convince senior leadership that it was the only path forward.


A friend asked me offline to elaborate on my scenario, answering publicly.

We had a gap of about 12 capabilities in our environment. The capabilities were in the same family, but functionally different. To make matters more complicated, different teams/leaders only required 1-3 of the capabilities and did not know about or care about the others.

So senior leadership (about 4 VPs and a dozen or so directors) saw this as 12 problems that could be solved with 12 solutions.

The problem is that solving it as 12 solutions required 12x the engineering to implement, 12x to operate, and 12x the user support -- actually I'm simplifying; it was worse in reality. The financial cost was also higher, but more like 4x.

Solving it as a single solution required a far longer ramp-up time to a functional solution, but would take significantly fewer engineering resources at every step. To boot, it would also scale far, far better, requiring far fewer people to operate and scale.

So two problems needed to be overcome: convincing leadership that it was not merely a matter of solving their perceived problem of 1-3 capabilities, but all 12, and that if we had to support all 12 capabilities we'd need a significantly larger team.

Adding to that, the upfront financial cost of a unified solution was higher than solving just the perceived 1-3 capabilities any individual leader knew or cared about.

The solution was shot down 3 times with me being directly told "This is the wrong solution, please have someone else propose a better solution."

And to be very clear, this company actually had good leadership. But with such a large environment they all had their own priorities, and limited people and budget to address them with.

The problem was they needed to understand that a problem existed outside of their domain, in an area they didn't see as affecting them very much, so they didn't want to dedicate time to learning it.

The solution was to convince their trusted advisors: principal and senior engineers, senior program managers, etc.

Politics suck, but they affect any organization with people. The more people you have, the more politics. The more people you affect, the more politics. If your company becomes successful, you will ultimately have more politics. Period.


I’m disposed to liken it to an anytime algorithm (https://en.m.wikipedia.org/wiki/Anytime_algorithm). You want your projects to be as ‘anytime’ as possible, IMHOBOLHE. I think that’s what your comment is driving at: that there’s a significant benefit to a plan that delivers half the benefits if it only runs for half the (eventual, real; not planned) time, vs a plan that requires a long investment of time but only - hopefully - pays off in full right at the end.
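For anyone who hasn't met the term, here is a minimal Python sketch of the anytime idea: the optimizer always holds a usable answer and simply improves it until its time budget runs out, so stopping it early still yields value. The toy problem (shortening a tour) and every name in it are invented for illustration, not taken from the article.

    import random
    import time

    def anytime_tour(points, budget_seconds):
        """Toy anytime optimizer: always holds a valid tour and keeps improving
        it until the budget expires, so it can be stopped at any point and still
        return the best answer found so far."""
        def length(tour):
            # points are complex numbers, so abs() of a difference is Euclidean distance
            return sum(abs(points[tour[i]] - points[tour[i - 1]]) for i in range(len(tour)))

        best = list(range(len(points)))            # immediately usable, if poor
        best_len = length(best)
        deadline = time.monotonic() + budget_seconds
        while time.monotonic() < deadline:
            i, j = sorted(random.sample(range(len(points)), 2))
            candidate = best[:i] + best[i:j + 1][::-1] + best[j + 1:]   # 2-opt style reversal
            cand_len = length(candidate)
            if cand_len < best_len:
                best, best_len = candidate, cand_len
        return best, best_len

    points = [complex(random.random(), random.random()) for _ in range(30)]
    print(anytime_tour(points, 0.05)[1])   # a bigger budget usually yields a shorter tour

The half-time/half-benefit trade-off above maps directly onto this: run the loop for half the budget and you still get a usable, partially improved answer.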


It was easier to get free stuff from Microsoft for choosing their products than the C++ standards committee.


https://en.wikipedia.org/wiki/Gupta_Technologies doesn't seem to have anything to do with Microsoft?


You're right. I didn't realize they had a SQLWindows tool that wasn't associated with Microsoft. Microsoft used to be real aggressive with their trademark on Windows.

> SQLWindows was one of the first GUI development tools for Microsoft Windows.


There was no C++ Standards Committee in 1993.


That's the joke.


Law of Focus


> PowerBuilder, Gupta SQLWindows, and good old C++

Was there any sense in 1993 that C++ would still be alive and kicking in 2022, while most people wouldn't have even heard of the others? Half the lesson is just "choose boring tech."


Good question, because from 30 years later, I think you're right.

But my recollection of the zeitgeist then was all about progress. The trend of the computer industry was always about upgrading, whether that's your hardware, or your OS (this was in the midst of moving from DOS to Windows 3), and there were tons of examples of important languages being left behind, like COBOL or FORTRAN.

And prejudicing the decision against C++ were other related but separate technologies that had led to the failure of the project's original inception. The VP in charge of this major project (it was upgrading the company's core customer tooling from DOS to Windows) had us trying to do all kinds of crazy things. CORBA comes to mind, although I don't recall the details, and trying to make the entire architecture data-driven.

So there was a sense - not entirely unfounded - that C++ pulled in too much complexity, and we needed to shift gears to something much more straightforward.

But I do recall realizing even as the decision was being made that the kind of simplifying abstractions that these 4GLs gave us may be great so long as what we want to do is "inside the box", but stepping out of the box leads to greater pain than if we'd been trying to accomplish it with conventional tools. Of course everyone realizes this, at least today. The difficulty was management believing that the entire application, or nearly all of it, could be built inside that box, and that product owners wouldn't demand special and fancy behaviors.

So I remember spending a good deal of time (and fun!) writing wrappers around Windows message handlers, with their WPARAMs and LPARAMs and the challenge that entails, so that other SQLWindows developers could do these things - whereas the C++/OWL path is designed to deal with such stuff natively.


Speaking of progress... Progress 4GL was the main tool used by the small company I first worked for after university. (https://en.wikipedia.org/wiki/OpenEdge_Advanced_Business_Lan...)

It worked in the context of the company being a VAR for solutions that used it to run big business ERP systems and the like, but our manager saw it more as a universal hammer for any job. Fair enough; he had become the manager after being the sole wunderkind programmer for this company, and had no management experience, or really much other prior job experience. I like to think we've all grown beyond our initial roots since.


The amount of hype surrounding C++ in 1993 is really hard to believe these days. You couldn't walk into a Waldenbooks or a B. Dalton bookstore at the mall without seeing it at the entry. Usenet had many different flavors of C++ groups as well. I think at this time even Microsoft was converting a lot of their development towards C++.


C++ was the future at the time, and for some time after. You may recall that there really were not many choices if you wanted to deliver a mass-market or even small-market product to end users. Your realistic options if you wanted to avoid assembly were C or C++.

Pascal was fading. Smalltalk, Lisp, Objective-C weren't viable in 1993 for various reasons outside of very specialized niches. Java didn't exist. (I don't think GNU Ada existed yet? But if it did, you had to use GCC, which kinda sucked back then compared to commercial compilers -- slow and poor codegen.)

We are so spoiled today compared to then. I could automate my house with a system written in bare metal Scheme on a cluster of Raspberry Pis if I had enough time and energy.


There was Object Pascal, supported by Apple and Borland. In 1993 the earliest versions of TP for Windows were available already, and Delphi would come out not long after.


>choose boring tech

This is one of those things that is easy to tell someone but profoundly difficult for certain novel-stimulation-driven personality types (which I would say is predominantly what the HN audience is made of) to internalize without feeling the pain of it themselves a few times.


On hacker news today, choose boring tech is celebrated - unless it’s Java


PHP!


Oh God, SQLWindows. At The Instruction Set (a quite well-known UK training company), I re-wrote the only UK training course for this (the Gupta course was basically unteachable). It's really a terrible kind of Visual Basic, but with few of the good points that VB sort of had. And teaching them to use the tree-structured editor was, shall we say, "interesting".

The second time I presented the course a couple of goons from Gupta sat in and started complaining because I was showing the punters how to use the Windows API to add features that SQLWindows simply did not support. Eventually I had to get my boss in to consign the goons to the smoking lounge.


Ah yes, that outline/tree code editor was one of the worst ideas I ever had.

I'd gotten hooked on some of Dave Winer's stuff - ThinkTank and GrandView - and thought everything should be an outliner.

Be grateful that you and the world were spared at least some of this: at first I thought that even expressions should be done in outline form. We punted on that and left them in text form.

After 2-3 years I left Gupta to work for Cooper Software on Ruby (no relation to the programming language). This was originally going to be a user-programmable shell for Windows 3.0, but Microsoft decided to go a more conventional route with Program Manager and File Manager, and turned Ruby into the "Visual" part of Visual Basic.

So you can blame me for the worst parts of both SQLWindows and Visual Basic!


As much as the management decision to use SQLWindows bugged me, I was rather fond of its outline-based UI.

As it happens, around this time I had also attended a usergroup presentation demoing a beta test of OS/2 Warp. It incorporated a file manager with an outline metaphor.

This inspired me to "invent" an outline-based UI for our own product that, a couple years later, turned out to be exactly what Microsoft did in Windows 95 for their File Explorer, except that I'd also included a type-ahead search at the top of the outline pane to search for nodes matching the input (there were quite a lot of nodes).

Our management rejected my idea, though. Instead, their favored UI metaphor was tabs and pages, which was quite popular at the time. The thing is, such was the complexity of the data we were collecting (lots of outline nodes, as I mentioned) that this metaphor led to major tabs across the top of the window, minor tabs down the right side. Then they originally had a third row of sub-minor tabs across the bottom until someone realized that this broke the metaphor in the physical world - how could the metaphorical notebook be bound? So the sub-minor tabs were switched to be pages within the sub-tabs, using left and right arrow buttons to flip from one page to the next.

I still maintain that the tabbed design was an unwieldy mess, and the outline-based one that I proposed, with the search on top, was vastly more usable.


> So you can blame me for the worst parts of both SQLWindows and Visual Basic!

Something to be proud of :-)

Actually, I didn't dislike SQLWindows _that_ much, because I knew how to write C & C++ code for Windows, so I could program round its little foibles. But it was tough to teach!


I haven't heard "Borland's OWL framework" in over 20 years. Wow, that brought a flood of memories and nostalgia back. I used Borland products starting with Turbo Pascal back in the 1988-1992 time frame before switching to Microsoft C/C++ and MFC. Thanks!


Did you make a video about this on YouTube?

The YouTube app on my TV autoplayed a video last night while I was doing chores around the house. I didn't get to listen to it closely, but I overheard "Gupta SQLWindows" (for the first time ever, mind you) several times and I remember hearing something about INTERCAL.

I don't have access to my YouTube play history at the moment or I'd check it instead of writing this reply.


4GL was a buzzword in the late nineties when I entered the software profession. It was pushed by ERP (Enterprise Resource Planning) software majors of that era such as SAP and Baan.

Even though such software is the core of an enterprise today, many technologies that they pioneered did not gain mainstream adoption after the entry of Internet software companies such as Google/Facebook/Amazon.

Take for example what is now the buzzword "low code / no code": it is a poor ghost of Model Driven Architecture (MDA) - https://www.omg.org/mda/ . Enterprise software always wants to push software into the realm of standard components and modules. That cycle was broken around 2010 due to distraction from the smartphone wave.

Other techs of that era - ESB, WSDL, SOAP, SOA, UDDI, BPM, business rules, EDI, data mapping - are all but forgotten. Many of them are getting re-bottled, e.g. as microservices.

Enterprise software has not had any major breakthroughs since. The glamour of new tech from Internet giants, who invent it for their own specific purposes, is more often than not pushed onto enterprise systems by other software vendors/consultants touting it as the new and best. This has made standardization almost impossible and hence kept software forever in "handicraft" mode.


Model-Based Software Engineering still pops up in some environments, like LabVIEW or whatever that Matlab one is (they use it at NASA on some of the critical code, name escapes me). I've done some pretty cool things with Model-Driven Testing but never had a chance to use it as part of my job. GraphWalker [0] for example lets you build a model of an application where nodes are states and edges are actions to get 100% automated coverage (not No Code, but Cool Code). I used it to test HackerNews once in my free time combining it with the Page Object Model in Selenium which worked surprisingly well. You still need to build the framework and now an additional model, but you get all the tests for free.

[0] https://graphwalker.github.io
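To make the nodes-as-states / edges-as-actions idea concrete, here is a minimal Python sketch of a model walk. This is not GraphWalker's actual API; the model, state names, and actions below are invented purely for illustration.

    import random

    # The model: each state maps to the (action, next_state) edges leaving it.
    MODEL = {
        "logged_out": [("log_in", "front_page")],
        "front_page": [("open_item", "item_page"), ("log_out", "logged_out")],
        "item_page":  [("go_back", "front_page"), ("upvote", "item_page")],
    }

    def walk(model, start, steps, rng=None):
        """Random walk over the model; each edge taken becomes one test step."""
        rng = rng or random.Random(0)
        state, path = start, []
        for _ in range(steps):
            action, state = rng.choice(model[state])
            # In a real harness each action would drive the application
            # (e.g. through a Selenium page object) and assert that the
            # new state's invariants hold.
            path.append(action)
        return path

    print(walk(MODEL, "logged_out", 10))

Once the edge actions are implemented against the real application, every walk over the model is a new test sequence, which is what "you get all the tests for free" means above.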


Matlab Simulink, probably. It's actually pretty neat, but I haven't touched it in over a decade now. It was useful for our embedded systems to have an executable model versus a prose specification document, and then it did get used to feed into the testing routine. Since the models were simpler to understand, confidence in them was higher. Differences in test results could be readily determined to be problems in the system and not the model after a quick analysis of the model ("Yep, that's supposed to be X, not Y" or "Nope, Y is correct, that's an error in the model here.").


Simulink! That's the one! Thanks for the reminder - I googled but couldn't find it straight away.


The original generations referred to hardware: (1) thermionic valves, (2) transistors, (3) integrated circuits, (4) VLSI. Japan's MITI initiated and funded a project to develop a fifth generation of computers, which were to be highly parallel, but MIMD (Multiple Instruction Multiple Data), specifically dataflow computers, rather than SIMD. The system language for this was to be a Prolog variant, and many of the applications running on fifth generation computers were to be Symbolic AI - expert systems, machine translation, etc.

These generations were later mapped onto software. The first two generations were machine code and assembly language. Prolog and other similarly high-level languages would be fifth generation.

The Japanese Fifth Generation Computer Project was largely unsuccessful, but after it was announced it spurred governments in various countries to start competing projects (e.g. Alvey in the UK).


Dataflow analysis is commonly used to extract small-scale parallelism for modern superscalar compute - either done by the compiler, or directly in hardware. On a slightly larger scale, high-performance async programming using work-stealing scheduling strategies across multiple compute cores can also be viewed as a dataflow approach. So all of this stuff is in fact being commonly used today, a lot more than most people might realize. Your database queries are probably handled via methods quite indistinguishable from much "Symbolic AI" or "expert systems", and parallelization is gaining ground there too.


Yes, I'm aware of this. At the time (early 1980s), the aim was to build machines in which data was sent between separate general-purpose CPUs in a multiple CPU machine. The University of Manchester had a working prototype.


The language Forth ("fourth" but that's one of these stories where the filesystems of the seventies didn't allow free naming) also refers to hardware evolution, but in the mind of its author it was about hard disks, which for sure were a revolution compared to punch cards and tapes.


> MIMD

At that point, why not just have another core?


More fine-grained.

Because separate cores each have their own cache, and their own program counter, there's a lot of communication overhead between them. When you're using instruction-level parallelism[1], you can actually do stuff like "perform a single add and a single multiply in parallel, then add them together", and the system is designed so that you aren't adding any overhead when you do that.

[1]: MIMD instructions are one way of doing this, but the computer you're reading HN on is probably an out-of-order superscalar, which accomplishes the same thing implicitly.


MIMD does include today's bog-standard multicore systems.


IMHO the idea that the history and future of programming languages follow a pre-determined ladder, or 'generations', is misguided. More abstraction isn't necessarily better than more explicitness. Functional style isn't necessarily better than OOP, and OOP isn't necessarily better than 'imperative style'. All the different languages will not converge into a "perfect" language some time in the future; if anything, it looks like the opposite is true.

I sometimes wonder if Marxism has a bigger influence in some "language designer circles" than (for instance) Darwinism ;)


Rust needed 25 years to bring us a significantly improved abstraction over C++ without sacrificing resource usage. It just shows that it's much easier to say that we need a higher-level language than to know what would work well enough in practice to be useful as a systems-level programming language.

At the same time Linux was released 20 years after C, so of course Rust has to mature more to really take over C++.


Is Rust an abstraction over ideas in C++ or is it in fact just a better set of abstractions over the same underlying operations? I would say the latter, mostly due to jettisoning OO nonsense.


Sure, I agree with you. I meant that Rust's type system is a better abstraction over manual memory management than C++'s type system, not that Rust is an abstraction over C++; I'm just not that great at expressing myself clearly in this case.


It's not just OO. It's also making things safe by default, and getting rid of a lot of accumulated jank.


Nah. We had stuff that was better than C++ for a long time. (C++ has improved over time, but so have the alternatives.)

A large part of C++'s staying power is due to status quo bias. So in that sense you are right, if you take 'significantly improved' to mean improved enough to overcome status quo bias.


> IMHO the idea that the history and future of programming languages follows a pre-determined ladder, or 'generations' is misguided.

Generations of programming languages aren't a predetermined ladder, order of quality, or teleological progression; they are broad, backward-looking groupings of the order in which particular combinations of features appeared.

> OOP isn't necessarily better than 'imperative style'.

OOP is imperative (you might mean procedural, like C/Pascal, or unstructured, like BASIC).


Is Smalltalk's OOP imperative? What about CLOS?


Yes, both Smalltalk and CLOS are imperative.


You can do something like OOP in Haskell or Scheme, without being imperative.

But yes, most OOP incarnations are imperative.


I think most people have given up on the concept of "more abstract" = better. However, there are other ways to "advance" the state of programming. One is to find better metaphors to express the same concept. For example, C is a wonderful language. But if you spend much time at all in an advanced C codebase, you will find lots of code that is attempting to do things that it's hard to tell it's attempting to do, because C has this really half-baked type system and its only generalization of data is the void pointer and so on.

There are two solutions to this. Take every feature of C and explode it to its logical conclusion and beyond. And there you get C++, which has its uses but is really one of the few languages I would ever call straight up bad. (I know, it's beloved by many and I came up on it, but it is bad, I'm sorry.) But if instead you back up, look at what is good about C, and start modernizing its metaphors - filling them out instead of leaving them incomplete like in C - you get something like Zig. Which really is a beautiful language. No more abstract than C but much easier to understand.


> Take every feature of C and explode it to its logical conclusion and beyond. And there you get C++, which has its uses but is really one of the few languages I would ever call straight up bad.

I think you could take the approach you described, and get something better than C++. Especially with the benefit of hindsight.


> I sometimes wonder if Marxism has a bigger influence in some "language designer circles" than (for instance) Darwinism ;)

Or some (programming) language designers are creationists deep inside their hearts (without being willing to admit it). ;-)


> IMHO the idea that the history and future of programming languages follows a pre-determined ladder, or 'generations' is misguided.

> I sometimes wonder if Marxism has a bigger influence in some "language designer circles" than (for instance) Darwinism ;)

That would be Hegelianism. (Marxism is sort-of a specific application of Hegel. Not that either makes more sense.)


Parallel History? :) Circa 1968:

"For fun, he [Moore] also wrote a version of Spacewar, an early video game, and converted his Algol Chess program into the new language, now (for the first time) called FORTH. He was impressed by how much simpler it became.

The name FORTH was intended to suggest software for the fourth (next) generation computers, which Moore saw as being characterized by distributed small computers."

https://www.forth.com/resources/forth-programming-language/


I think Yannis Smaragdakis' Onward! 2019 paper fits here: "Next-paradigm programming languages: what will they look like and what changes will they bring?" https://yanniss.github.io/next-paradigm-onward19.pdf

He has done a lot of work using datalog and argues, convincingly in my opinion, that it is reasonable that some of the elements of datalog-like programming will be part of the future. For example, that there is a very simple way to express a relational-join and the underlying machinery figures out the most efficient way to do it, just like a DB query optimizer does.
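As a flavour of what "datalog-like" means here, a classic example is transitive closure: you state two rules - path(x,y) :- edge(x,y). and path(x,z) :- path(x,y), edge(y,z). - and the engine iterates them to a fixpoint, choosing the evaluation strategy itself. The toy Python evaluator below mimics that naively; the edge facts are invented for the example, and real engines such as Soufflé are far cleverer about join order and indexing.

    def transitive_closure(edge):
        """Naive fixpoint evaluation of:
             path(x, y) :- edge(x, y).
             path(x, z) :- path(x, y), edge(y, z)."""
        path = set(edge)                           # first rule: copy the base facts
        while True:
            derived = {(x, z)
                       for (x, y) in path
                       for (y2, z) in edge
                       if y == y2} - path          # second rule: one join step
            if not derived:                        # nothing new -> fixpoint reached
                return path
            path |= derived

    edges = {(1, 2), (2, 3), (3, 4), (2, 5)}       # example facts for illustration
    print(sorted(transitive_closure(edges)))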


If you agree, check out the HYTRADBOI (have you tried rubbing a database on it) conference happening later today: https://www.hytradboi.com/

This is a meeting of people who agree, or at least are willing to explore the idea, that Datalog and related systems are the way forward. Lots of interesting programming systems will be presented!


Not OP, but thanks for that! Will check it out.


Datalog is one of my favorite languages, especially the Soufflé implementation with static types. Just as mainstream languages started adding OO-related and later FP-related features, I hope logic programming / relational programming is next in line, and Datalog is (IMO) by far the best implementation of the paradigm. It's super simple to understand, it pushes you towards good modeling practices by design, and it's highly efficient; the only downside is weak language integration / lack of popularity...


Yeah, but in my experience, DB query optimizers are terrible. Ideally you'd be able to write "select * from A, B, C where A.col1 = B.col2 and B.col3 = C.col4", but in practice, you have to hold the SQL interpreter's hand by using inner joins and carefully choosing which table to start with and what order to join them, in effect doing the query optimizer's work for it.
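For readers who haven't fought this battle: the reason hand-ordering matters is that the same declarative query can have wildly different costs depending on which join runs first. The Python sketch below (tables, sizes, and column names all invented for illustration) joins three relations two ways; starting from the tiny, selective table keeps the intermediate result small, which is exactly the decision a good optimizer is supposed to make for you.

    def hash_join(left, right, lkey, rkey):
        """Join two lists of dicts on left[lkey] == right[rkey]."""
        index = {}
        for row in right:
            index.setdefault(row[rkey], []).append(row)
        return [{**l, **r} for l in left for r in index.get(l[lkey], [])]

    A = [{"col1": i} for i in range(100_000)]
    B = [{"col2": i, "col3": i % 10} for i in range(100_000)]
    C = [{"col4": 3}]                                # tiny, highly selective

    # Plan 1: join A with B first -> 100,000-row intermediate, only then filtered by C.
    plan1 = hash_join(hash_join(A, B, "col1", "col2"), C, "col3", "col4")
    # Plan 2: join B with C first -> ~10,000-row intermediate, then joined with A.
    plan2 = hash_join(hash_join(B, C, "col3", "col4"), A, "col2", "col1")

    assert len(plan1) == len(plan2)                  # same answer, very different work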


Following one of these languages (that I had not heard of) back to its wikipedia page, I see:

> In JavaOne 2003, in June, JavaSoft announced its intention to create a wholly-owned scripting language for the Java platform. This monopolous move has stifled the lives of most of its competitors. The result was the Groovy language, whose name is apparently a phonetic morphing from "jruby", with its first version released in 2007. Its built-in Ant support is suspiciously connected to Judoscript's Ant scripting feature, chronologically speaking.

Some bad feeling there!

https://en.wikipedia.org/wiki/Judoscript


I've always associated Fourth-Generation Languages (4GL) with database products like FoxPro, Clipper, Clarion, DBase. They combined database and programming language in one package. They were powerful and more accessible to beginners than stand-alone programming languages.

These 4GL database/programming products seemed poised to succeed in the web era. Think of building small or medium web sites with database and programming language all-in-one. Instead of succeeding, these 4GL products disappeared. I wonder why they failed to adapt to the web era - they seemed ideal for providing web-based database solutions.

Those 4GL database tools made accessing data far easier and more seamless than the solutions today do. Also, I suspect there was some snobbishness amongst programmers who didn't view these products as 'real' programming languages.

Today, we have clumsy, glued-together solutions for web development with databases. Don't believe me? See this recent discussion on SQLite on HN [1]. I sense that lots of posters and readers in that thread are looking for a concurrent database that is simple and easy-to-deploy and turn to SQLite even when it is not designed for concurrency.

[1] Have you used SQLite as a primary database? https://news.ycombinator.com/item?id=31152490


The problem I had with 4GLs back in the day is that they had a more extreme version of the 80/20 rule. It was gloriously easy to prototype but if you had to interface to external systems or any other thing they didn't think of, you would spend an enormous amount of effort on making it work. Some of them didn't even have any significant abstraction support.


> It was gloriously easy to prototype but if you had to interface to external systems or any other thing they didn't think of, you would spend an enormous amount of effort on making it work. Some of them didn't even have any significant abstraction support.

The only one I know of that had any decent level of abstraction was the Clarion template system, and I haven't seen anything as powerful in any other coding tool. On a basic level it's like a macro language, like the ones you see encapsulating data structures in Windows .h files, but it's much more than that: put some data into it and it built stuff for you in a structured framework. It can be expanded to do practically anything, and if it didn't do something you could extend it by calling external DLLs, which did things like pop up windows for further input or process stuff.

I remember reading that it won some code productivity thing where it had beaten factory teams from MS and/or Oracle or some other big names but this was a long time ago.


> These 4GL database/programming products seemed poised to succeed in the web era.

Access pretty much murdered them all right before they really had a chance, but Access (while it eventually got web-ish features) wasn't Microsoft’s strategic focus for web.

Simultaneously, a host of new languages and frameworks erupted with much easier on-ramps than the tools those 4GLs had originally been defined against, and communities leveraged their open source nature to rapidly build out ecosystems far better than any of the proprietary 4GLs'. So the 4GLs lost most of their original market and faced a kind of competition they weren't prepared to deal with when pivoting to the web market.


> These 4GL database/programming products seemed poised to succeed in the web era.

The lack of adaptation to anything, not just the web, defines these tools.

There were plenty of 4GL-like tools for building websites in the late '90s, but just like these tools, they all suffered from being good at prototyping and terrible for long-term development.


> I wonder why they failed to adapt to the web era

At the beginning of the web era, Microsoft owned the software market and did not believe in the web.

MySQL and PHP were the successors to the "4GL" databases; all the commercial efforts to compete with them ran into the brick wall of "only M$ may make money" vs "free".


It was more than a question of Microsoft. Microsoft itself fielded two applications in this space: Access and FoxPro; both went on the wane as the web took over. There were a number of factors at work:

1) The rise of n-tier architecture. Clipper and dBASE do not scale well past a single PC. Once you have use cases that mandate multiple (kinds of) clients, separating the concerns of storage, business logic, and presentation starts to make a lot of sense.

2) The fall of ISAM-style "navigational" databases and the rise of relational databases and SQL. The programming advantages of xBase are sapped once you stop thinking of iterating through rows and start thinking in terms of queries.

3) Object-oriented programming became trendy in the 90s, and with it, ORMs. This added complexity to the programming model and didn't dovetail well with xBase's BASIC- and Pascal-influenced procedural model.

There were a number of web application products like ColdFusion that flourished in the 90s and early 2000s. But to your point, open source ate most of their lunch. Microsoft lasted a bit longer, but ASP and ASP.NET have had to make a retreat into the corporate/enterprise space. And of course now the latter is open source.


I’d rather use the best (or at least my preferred) general purpose language on top of my preferred database than use an all in one solution that is a compromise on both fronts.


Surprised by the definition and the examples. I used ICL Quick Builder and Quick Master in the 1980s, and the key selling point was being end-result oriented, not procedural etc.: you state what you want and the computer figures it out. Not really, but that was the dream and the selling point. Just like the 5th generation using Prolog: the system figures itself out.


> end-result oriented, not procedural etc.: you state what you want and the computer figures it out

Yeah, this is how I've always understood fourth-gen programming language as well. SQL and Prolog were always the prime examples of "state what you want and let the system figure out how to get it".


They seem to list the Unix shell as a 4th gen language... Considering my experience with shell scripting I think I'm fine with the tools provided by my 3rd gen languages.


Don't take the following as a criticism of your disdain for shell. Your comment reminded me of this:

https://www.gnu.org/software/nosql/4gl.txt

Note to phone users, the formatting works better in landscape.

I don't remember when this was written, but it's an interesting read. It requires a slightly different approach, but it is augmented well by 3GL programs. Commands that don't exist can be written in shell, Python, C, Rust, etc. I usually use a shell-first approach: start the command as a shell script and rewrite it in something else if necessary. It is quite flexible and extensible if you work within its data types (probably the wrong term, but e.g. files and text streams).


Stockholm syndrome


LISPer looks at this and rolls his eyes. Pfft. Muggles.


I view this as a highly accurate response from LISPers.


I’m a lisper and i did roll my eyes :D


Anything substantive to contribute or just condescension?


> In spite of its lack of popularity, LISP (now "Lisp" or sometimes "Arc") remains an influential language in "key algorithmic techniques such as recursion and condescension"[1]

1: https://web.archive.org/web/20220422061858/https://james-iry...


Neither LISP nor Prolog can be feasibly extended to enable development for large scale parallel computers. Erlang and Rust are true Fifth Generation Programming Languages, everything else might just as well be a purely amateurish effort.


Apart from *Lisp (https://www.mirrorservice.org/sites/www.bitsavers.org/pdf/th...), BaLinda Lisp (https://www.sciencedirect.com/science/article/abs/pii/S00960...) and probably quite a few others I'm unaware of.

Lisp as a language is particularly well suited for parallelization, e.g. function arguments can be evaluated in parallel, and functions can be mapped in parallel over lists. This would be MIMD parallelization.


Lisp isn't semantically different from other languages, only syntactically - sure things are expressed as `map` instead of a for loop[0], but what's important is the memory layout of your data and whether or not it'd actually be faster to parallelize it, rather than if you can force it to work that way. So as long as the program works the same way it still can't/shouldn't be parallelized.

[0] except when Lisp programmers brag about how cool their complicated `loop` macro is


I take it that you're unaware that there are already Lisps running on Erlang, like LFE [0]?

[0] https://lfe.io/


Sssh, this is clearly a religious flamewar -- don't go inserting facts into it!


I was introduced to programming in *NIX environments by Informix 4GL, which, after I learned I could generate the intermediary C code files with an option provided by its 4glcc (or something like that) command, led me to C. I was fascinated by how many lines of C code were generated from a 'let' assignment statement, and also by how assignments in 4GL were mapped to push and pop operations on a stack variable.

I also saw the link to RPG in the article. I took an RPG programming class in 1983, and I'm glad I forgot most of that.


My Informix 4GL days were simply the most productive of my career. We were able to do so much, so quickly using these systems. Along with the domain expert, I was able to, single handedly, create an entire distribution system in less than a year. Order Entry, Credit/Debit memos, Inventory and planning, shipping and traffic, the gazillion reports, custom forms (let's hear it for dot matrix printers and their myriad of escape codes), and workflows, posting to AP, AR, GL, all of the exotic (and ever changing) pricing, discount, and royalty models. It was not uncommon for a typical order to hit 50 different accounts in the GL.

One on one interviews with the users, direct user support ("Yea, hang on, I'll be right there"), seeing their face light up when you showed them something they liked, getting yelled at when things got rocky, the whole kit. The back office staff was about 50 people. The IT fellow knew the business, so I interacted directly with him, and coded everything up. The GL/AR/AP system was pre-existing, I didn't have to write that.

We were a small consulting firm that customized an existing accounting package. Most customers didn't need anything as big as this. But even then, we had another fellow do essentially the same thing for a warehouse company. Hook up with an internal domain expert and pound out their entire system.

We're talking green screen, and green bar paper here. So, "UI" was trivial. UI discussions centered around how best to abbreviate field names or codes to cram more on particularly loaded screens, and what field order to use.

CRUD screens and reports in an hour. Thank heavens for decimal math.

A simpler time to be sure, but we got so much accomplished.

Informix 4GL was imperfect, as are all things, but we rarely had to say "no" to something because of it.


A snarky tagline used to say, "Informix-4GL is not 4G, and it's barely an L."

I understand the sentiment, although in reality it wasn't that bad. What I4GL and its ilk accomplished was to make a passable bridge between SQL and a vanilla procedural programming language, which still had the usual SQL/procedural impedance mismatches, but was miles ahead of horrors like Embedded SQL/C (as you've alluded to with the remark about generated C's verbosity.) I did my share of I4GL programming and still remember it semi-fondly.


I don't know if this counts, but I built so much software in Access 97. Mostly for small businesses and individuals. I could build a whole inventory management system in a weekend (a simple one anyway). It was phenomenal. Once I learned Java and SQL (how to correctly use SQL, lol) I quit using it as much. But sometimes I still prototype software in old versions of Access just to model everything out.


Game consoles, fighter jets... Anyway, it's interesting that there is no 5th generation language, at least on Wikipedia.


There very much was something that claimed to be the "5th generation" project, out of Japan. At the time it was taken very seriously. Most people now regard it as having been a failure.

https://en.wikipedia.org/wiki/Fifth_Generation_Computer_Syst...


The common joke is that it was a very successful project in that it secured the academic careers of many people.


Yeah cheers. I missed that.


It's probably because everyone realized that a marketing person claiming that something is "fourth generation" doesn't actually mean anything in a technical sense.

But also: https://en.wikipedia.org/wiki/Fifth-generation_programming_l...


I know all languages evolve over time, but it seems like languages on VMs like Java have the ability to radically reinvent themselves while not losing compatibility with old code. For example the improvements/differences in ideology from Java that are explored with languages like Kotlin, Scala, and Clojure.


I don't see what is special about VMs that would impart any exceptional ability to reinvent a language that (compiles into something that) runs on it.

As opposed to what?

There's more "modern replacements" for the C language that are backwards compatible with existing code/binaries than I can count.


Oftentimes there's some optimization that it'd be really nice to perform, but it's possible to write code that the optimization would miscompile, and ruling that out requires analysis that's too expensive to do without killing compile times. (And often, popular large frameworks do all these terrible things hundreds of times per second!)

By compiling to a VM like the JVM, you expose the opportunity for speculative optimizations to run and be falsified at runtime without affecting correctness. This isn't a silver bullet, but can certainly be useful for an optimizing compiler developer.


My favourite is NATURAL: https://en.wikipedia.org/wiki/ADABAS#Natural_(4GL) It looks a bit like COBOL, but makes it much more fun to code on a mainframe. Ah, I miss those days ...


My first programming job was working with Data General Business Basic, which was marketed as a 4GL but probably doesn't fit the definition these days. It took high-level code and generated really ugly pure BASIC from it.


My only personal experience with this term was FOCUS, which in the hands of Information Builders got wielded as a marketing term and felt kind of like saying it was "a space age language" (new and modern! ... In the 70s).

Not sure if anyone out there has experience and can change my view, but the language seemed a bit...wildly rogue, opinionated, idiosyncratic, cavalier, and I avoided it as best as I could at that client.


Long ago, as a summer intern, I had some brief exposure to FOCUS. As best I can recall, I found the environment pretty cool. But I was constantly stymied by having to use an IBM 3270 terminal, which insists that Return and Enter are different things, and Enter is in the wrong place.


4GLs live on through environments like Salesforce and Zoho Developer.

The web and cloud ended the era of the desktop and local client-server 4GLs, in my opinion.


SCULPTOR was heavily advertised in some computer magazines when I was a kid. I had no idea what a 4GL was but the ad copy sure convinced me it was going to be the greatest revolution in the history of computing. Luckily, like most language products of the time, it was too expensive for a teen hobbyist to afford and I didn't go down what would have been a rather shallow rabbit hole.


Nomad/2 - https://en.wikipedia.org/wiki/Nomad_software

Never used anything more productive than this - mid 1980s, IBM 4381 VM/CMS.


Low-code, no-code, etc., I sometimes think, are a regurgitation of 4GL. Am I right?


The problem was that there never was an accompanying wave of fourth-generation programmers. 4GLs were targeted at analysts and non-technical personnel.

Turns out, making people care about how the machine does it is the real challenge. A higher level lang isn't going to change that.


Lots of people live out full productive lives in SQL, R, etc.

I like the article's idea that DSLs are 4GLs, i.e. crafting the nouns and verbs needed to work a problem.


A huge number of in-house corporate developers never needed anything beyond PowerBuilder to make the apps their companies needed. Certainly they didn't give a damn about the hardware. Those apps spend 100% of their time waiting for the user to do something or waiting for the database.





