Why use Pascal? (castle-engine.io)
292 points by mariuz on July 8, 2023 | 307 comments



Many people who used Delphi back in the day will probably already know Lazarus, which is essentially an open-source recreation that runs and compiles natively pretty much everywhere, Raspberry Pi and similar ARM boards included. Installing libraries, however, can be tedious, so FpcUp and later FpcUpDeluxe were created to automate the task of installing the IDE along with other modules and some quite interesting addons.

https://wiki.freepascal.org/fpcupdeluxe

Here's a quick&dirty instrument panel widgets demo I just put together using some free widgets available with FpcUpDeluxe.

https://ibb.co/9bchx7T

FpcUpDeluxe also works under Alpine Linux (get the musl version on the releases page), which opens up possibilities for adding instrumentation panels to very small systems. All code is compiled natively on the various platforms and runs fast: no interpreters, no web browsers, etc.


> free widgets

Are you saying there are paid widgets? I use Qt5, which comes with more widgets than I know what to do with. Are the ones that come with Lazarus limited?

Last time I tried Lazarus (and Delphi), Unicode was difficult. Everything seemed to assume ASCII.


Unicode was fixed in 2007 (or thereabouts). I think that's a problem with Delphi (and Lazarus): everyone compares it to the Delphi they used back in 1999, when it didn't do the cool things that languages in 2023 do, but modern-day Delphi does a lot more stuff. It's worth a revisit. I use it with the Skia graphics engine to create multi-platform apps very easily.


I had to use Delphi professionally in 2018, still ran into this everywhere?


Legacy components perhaps? There's a TN here (pdf) https://www.embarcadero.com/images/dm/technical-papers/delph...


> Many people using Delphi back in the day will probably already know Lazarus

If anyone is interested in Lazarus, Qt and 2D/3D CAD software, here is an open-source boat (and not only boat) design & simulation software project which was rewritten from Delphi. [0,1,2]

[0] https://github.com/markmal/freeship-plus-in-lazarus

[1] http://web.archive.org/web/20160128212152/http://hydronship....

[2] https://sourceforge.net/projects/freeship


I used to use Delphi/Pascal as my main language. In the last decade or two I have been on a bit of a journey looking for a language that feels right to me. I used Haxe for quite a while but I felt Haxe fell into the trap of 'There's a macro for that' (macros can do anything, but ultimately enough macros leave everyone programming in their own macro-augmented language). JavaScript developed decent improvements (but could still use more) and I do a lot of stuff there now.

A year or three ago I was writing some 8-bit AVR code and I gave FreePascal another go. I found it extremely enjoyable. I got to use the newer features and it felt like a truly modern language. Part of what made it so enjoyable was that, because I was coding for a tiny space, I was not using most of the standard library and was just building custom, specific code as I went. This meant I did not have the layers of backwards-compatible namespace clutter that FreePascal has accumulated over many years (TList, TFPList, TFPGList, TFPGObjectList, etc.). My main pain point was that the compiler did not allow constant floating-point expressions on a target without an FPU (or emulation). Since these could be done at compile time, it would have been nice.
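
To illustrate (identifiers made up), this is the kind of constant expression I mean; per my experience above it could be folded at compile time but was rejected on an FPU-less target:

    const
      SampleRateHz = 8000.0;
      TickSeconds  = 1.0 / SampleRateHz;  { foldable at compile time, yet refused without an FPU }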

Having worked in other spaces, I do now find the inability to define variables mid function to be restrictive, being able to have sub-function scoping is nice too. Begin End vs { } doesn't really bother me. I think I now prefer case sensitive languages and would rather not have letter prefixes on types. I think this is more due to the advancements in editors than anything. Syntax highlighting and active linting can remove text clutter by moving information into separate domains.

I would probably be quite enthusiastic for a descendant language of FreePascal that had a clean slate approach. A new standard library that used the newest features as first class citizens. Maybe when I retire I'll have a go at it.


> Begin End vs { } doesn't really bother me.

but they don't really mean the same thing - {} defines a local scope, begin...end doesn't.


They are both used to define blocks. Eg Begin … End in for loops https://wiki.freepascal.org/For

From what I recall, Pascal doesn't support variables scoped within a specific block. But then neither do some languages with C-style curly braces.

So {} and Begin…End are pretty much the same thing.
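
To make that concrete, a small classic-Pascal sketch (names made up): the var section at the top holds every local, and the begin..end around the loop body only groups statements rather than opening a new variable scope.

    procedure SumDemo;
    var
      I, Total: Integer;        { classic Pascal: every local is declared up front }
    begin
      Total := 0;
      for I := 1 to 10 do
      begin                     { compound statement only; not a new variable scope }
        Total := Total + I
      end;
      WriteLn(Total)
    end;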

BASIC (and Visual Basic especially) often gets a lot of criticism for its syntax, but the End Thing style of block terminator was very clear.

It's a pity that the only popular alternative to C-style braces is Python, because I do think ALGOL-style syntax has a lot more going for it than people give it credit for.


The most C-style language is C and it does have variables scoped to a block, I believe. The same for C++.


I was using “C-style” with respect to {} syntax for compound statements rather than language specification because the GP is discussing language grammar.


i do not think you understand the concept of scope.


I’ve written a few compilers in my time so I have some idea about scoping rules ;)

As I and others have already said, it depends on the specific language. Some languages scope local variables at the function level (irrespective of {} or Begin End), and other languages scope local variables inside the containing {} or Begin and End block.

In that regard {} and Begin End serve identical purposes of defining a local execution block and then it’s up to the language or dialect, or sometimes even the compiler authors, to decide upon the rules of variable scoping.

What you're doing here is conflating syntax grammar with deeper rules about a language's compiler or specification.

To come back to your original point, Delphi does support variable scoping inside Begin End blocks. I don't know if FreePascal also does, but older dialects of Pascal didn't, because Pascal was originally intended to be compilable in a single pass. Equally, not all languages that have C-like curly braces support variable scoping in the way you're trying to describe. JavaScript originally didn't and had to create a whole new keyword to add it (a little like what Delphi did, funnily enough). My own shell scripting language very intentionally doesn't support scoping despite having curly braces.
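
For example, a minimal sketch using Delphi 10.3+ inline declarations (Free Pascal doesn't accept this syntax, as noted elsewhere in the thread; the identifiers are made up):

    program ScopeDemo;
    {$APPTYPE CONSOLE}
    begin
      for var I := 1 to 3 do              // inline declaration, type inferred (Delphi 10.3+)
      begin
        var Square := I * I;              // visible only inside this begin..end block
        WriteLn(Square);
      end;
      // I and Square are out of scope here
    end.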

As for inlining other contexts like functions, classes, etc, it has been a while since I’ve written any Pascal so I can’t recall the nesting rules here. But like with our discussion about variables, it’s more a question of whether the specification and/or compiler has support rather than a quirk of the Begin End grammar that forbids it — just as is the case with {} too.


> As I and other have already said, it depends on the specific language. Some languages scope local variables at the function level (irrespective of {} or Begin End), and other languages scope local variables inside the containing {} or Begin and End block.

And some languages do either, depending on the keyword used to declare a variable. Looking at you JavaScript(var vs let).


this post is specifically about pascal - my comment about {} vs begin...end was intended for that context. pascal has no way of defining variables within a local scope - it was added to delphi as a kludge much later. i admit i should have said something like "... {} in C and C++".


> this post is specifically about pascal - my comment about {} vs begin...end was intended for that context

That link was Pascal. It was just one of many dialects of Pascal.

> pascal has no way of defining variables within a local scope

I just did a quick search and turns out Pascal does support nesting variable scopes too.

> it was added to delphi as a kludge much later

Is it a kludge though? Or are you just calling it that because it disproves your point? The syntax is pretty similar to a lot of C-like languages. So you’d be calling their variable declarations a kludge too.

It feels like you’re being overly dismissive.

> i admit i should have said something like "... {} in C and C++".

it certainly would have helped your case to have opened with a little more context rather than assuming other people don’t understand how scoping works ;)

But even that aside, Pascal does support scoping inside compound statements.

Begin End and {} are just tokens to denote the start and end of a compound statement. Just like how Python uses indentation to group compounds.

Everything else you put is missing the point about how language grammar is parsed.


Well the good thing about this thread is that I found out that Pascal has grown inline vars since I last used it. With type inference too by the looks of it.

Maybe I will make my way back after all.


Perhaps you're being a little short here? Explaining what you mean would go a long way to fostering good will.


More of these replies on HN please!


Would you explain it? I'm self taught, have programmed in Python, C, and am looking at Pascal, and I don't understand the difference.


Depends a lot on the language, doesn't it?


we are talking about pascal


You are probably talking about C++ vs Pascal. But you should be talking about C.


no, c and c++ have basically similar rules for the scope of variables, with regard to braces


I don't think you know much about pascal, C or C++


About pas Al in comparison to languages using {} for blocks, where some have block scoping and some don't.


no idea what you are talking about here - what is "pas Al" - post an example of what you mean.


Pascal. It’s a typo.



Without that, one can use a repeat...until loop to simulate it, I think.


I used Delphi professionally for several years in the 90s, and liked it, but really got tired of Borland (and subsequent owners) mismanaging the product and the language, as did Anders Hejlsberg, who left Borland for Microsoft, where he created C# - IMHO a much better language & architecture to invest your time in.


Reliably deploying a C# desktop app is a total nightmare compared to the single exe generated by Delphi / Lazarus.


C# can generate single executables now.


Kind of.

Depends if we are speaking about Xamarin/Mono AOT, .NET Native, Native AOT or IL2CPP.

Each of those has pluses and minuses; none of them lets you pick a random .NET application and just compile it straight into native code.


I've found .Net deployments to be rock solid and easy. What problems do you hit?


A couple of months ago I started working at a company that (mainly) develops C# libraries.

The pipeline to deploy is complicated. Using a tool like ILMerge to combine DLLs into a single DLL (to prevent DLL hell issues). Using Babel to obfuscate the code base. Using nuspecs to create a Nuget package from the DLLs and various metadata. Signing the DLLs with some certificate. Then, if native stuff is included, also ensure the Nuget package can be deployed on all supported architectures. Probably some more stuff.

Setting up a pipeline in Azure devops for a product I am working on, for the company, is already taking me a week and will probably take another couple of days to complete. But at least it’s a one-time task.


Well, a lot of those requirements are highly bespoke to your task and to your team.

My C# pipelines and C# apps are far easier to package and deploy than my Python apps for example. And, as you said, the pipeline is normally a one time cost.


Comparing to Python is not exactly fair. I don't think there are any languages that have a worse deployment experience.

How does it compare to Go or Rust?


They want obfuscation. Rust libraries necessarily include the AST for public generic functions. So it may not be suitable.


I think that's a totally different topic, but regardless it's the same with C++ and I don't see that being a big issue to C++ library vendors.

Rust doesn't have a stable ABI so it's not like you could realistically distribute a closed source "native" Rust library anyway.


We were talking Delphi vs C#. Where does Python fit in?


Ah, I've not had to combine DLLs, obfuscate code, sign code or create Nuget packages.

All of that does sound painful.

What makes it easier in other languages? Is it a matter of better tooling?


I am not sure if it’s easier in other languages (no experience except mobile dev). But I do know packaging macOS and iOS apps is also a painful process and becomes more painful over time (e.g. Apple’s recently introduced notarization process).


I'm finding myself using Pascal for fun these days due to resource constraints.

The old release of Borland's Turbo Pascal, version 3.00A, runs under CP/M and provides an editor, compiler, and libraries all under 64k.

It's fast enough to use interactively, and produces code that is good-enough to implement low-level utilities, simple games, and other random hacks.

I've not used Pascal on anything larger, or more modern, but I have to say that my recent experience has been rewarding enough that I wouldn't rule it out!


Turbo Pascal and later Delphi were the first time when a programming language really spoke to me. Eventually, I went the C and C++ route, but I’ve always had a desire to return to more expressive languages.

I had no idea that there was still a modern Pascal implementation, so I may just have to get reacquainted with my first love.


I've also had an early acquaintance with Borland TP 7.0 at a time when C99 was already around, and Pascal felt rather clumsy in comparison. The strong typing, the lack of a preprocessor, the verbosity in data declarations, and the small set of available libraries just made my early programming self consider C superior.


Nowadays you look at pascal and see go without braces.


Check out Free Pascal (FP) if you haven't already. It supports many platforms.

You can check on their site if it supports CP/M if you need that. I would not be surprised if it did.

The TUI IDE is very similar to Turbo Pascal, and lightning fast. The language has a lot more (advanced) features than Turbo Pascal, but you don't have to use them if you don't want to.

The generated binaries were also very small, similar to C, when I did a quick test of it on Windows, some time ago. Under 50 or 60 KB for a simple hello world program.


Sadly it seems not to be available for such systems.

Though for reference compiling my sample "Hello World" script on TP gives me a "hello.com" file which is 8320 bytes. 64k is as much memory as I have on the machine, so if it were 50-60k I'd be worried there would be no space left for anything else!


Free Pascal has support for Z80 which might work with some CP/M systems (AFAIK Z80 is backwards compatible with 8080 and some CP/M systems used Z80). You need to compile the compiler from source but it apparently can target the ZX Spectrum, so with some modification it might be able to target CP/M too.
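
If memory serves, building such a cross-compiler from an FPC source checkout is roughly along these lines; treat the exact target names as an assumption on my part (they are the ones used on the FPC wiki):

    make clean all CPU_TARGET=z80 OS_TARGET=zxspectrum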


Oh yeah, I realised that just after hitting submit. I had worked on machines almost as low-powered for a while, much earlier, as a hobby. Sorry for leading you down the wrong track.

It looks like the broad cross-platform capabilities of FP are mainly for modern machines and platforms, even if low powered like the RPi.


Use optimisation to get the binary down to 1k. I did just that yesterday.
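
Presumably with something along the lines of the standard size-related switches (optimize for size, smart-link, strip symbols); the exact binary size will depend on the platform and how much of the RTL gets pulled in:

    fpc -Os -CX -XX -Xs hello.pas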


'under 60 kb' for a cp/m machine is like 'under 60 gb' for the laptop you're probably using


Oh Pascal!, oh the nostalgia. I saved up the princely sum of $80-$100 back in TRS-80 Model III days to buy the Pascal 80 package, the New Classics version, which was basically a proto-IDE, and a pretty sweet one at the time. http://www.trs-80.org/pascal-80/ I recall it being a huge step up from the M-III BASIC dev environment and the games and utilities I wrote ran well enough in the 48K of RAM that was available. I wonder what became of the Pascal-80 codebase? Not that it would be any better than the modern tools available for retro hardware, probably far worse. Oh yeah, "Oh Pascal!" was the book from which I learned the language, and I doubt there were many other options, as this predated Pascal's commercial use by quite a few years.


Anyone can check out Oh! Pascal! [1] just like a library book.

Even now, it's one of the best introductory programming books for any programming language.

[1]: https://archive.org/details/ohpascal0000coop/mode/2up


I was interested to revisit the book, but when I tried, it says:

DOWNLOAD OPTIONS download 1 file ENCRYPTED ADOBE PDF download High Quality Page Images

In order to access your downloaded book you will need LCP-compliant or Adobe-compliant software on your device. The Internet Archive will administer this loan, but Adobe may also collect some information.

That's a turnoff.


Thanks for the update… I've only checked out books, not downloaded them.


Welcome.


Wow.

"Oh Pascal!" was a fantastic book, both for the content (going from beginner to a bit intermediate (1), in a single, not too thick book) and for the fun and engaging writing style, and with very good production values too, meaning the book's appearance.

IIRC, it was written by one or two US postgraduate college students or professors. My uncle, who was himself a college professor in the States, brought it for me on one of his visits to us, as a present, because he knew I was learning computers.

And during that visit, I wrote a small scientific program for him in Turbo Pascal, to use in his lab.

(1) Why I said "a bit intermediate" is because it had, near the end of the book, a program to do text concordances, and maybe one or two other non-trivial ones too.


I disagree with some of their reasons.

Modern: Object Pascal isn't a modern language. It was modern in 1998, maybe, but it hasn't evolved much since then. The latest big change was the addition of generics, later than almost any other language.

Fast: FPC doesn't generate particularly fast code, and the nature of OP objects doesn't help with locality. It's faster than scripting languages, but generally slower than AOT compiled languages, even those with GC.

On the other hand, the ecosystem is great. There are lots of good libraries and tools (the most remarkable one being Lazarus, the Delphi clone). In my experience, people working with FPC or Delphi don't care much about modernizing the language or things like that, they just get things done (TM). I don't see the language being attractive to new coders though, so I don't know what its future will be...


> Latest big change was the addition of generics, behind almost any other language.

FPC added generics 17 years ago, that is far from recent. Also i'd say that anonymous functions and function references (closures) are better candidates for "big change that was added recently".

> FPC doesn't generate particularly fast code, and the nature of OP objects doesn't help with locality. It's faster than scripting languages, but generally slower than AOT compiled languages, even those with GC.

In practice the performance is fine and you can actually optimize the code as much as you need for any hotspots you find - it can be a bit more of a PITA if you use the "high level" classes compared to C++ but it isn't impossible.

Though if you really want performance out of the box with minimal effort from your side, there is a new LLVM backend. You need to compile the compiler from source to enable it as the entire runtime library, FCL, etc need to be built with the LLVM backend, but that takes only a couple of minutes. On the other hand the compiler becomes much slower (and IMO the difference in performance isn't worth it).


> Fast: FPC doesn't generate particularly fast code, and the nature of OP objects doesn't help with locality. It's faster than scripting languages, but generally slower than AOT compiled languages, even those with GC.

You're selling it short.

Firstly, as the other poster downthread pointed out, it benchmarked as fast as C++ in the past.

Secondly, its main use is local GUI apps, and there's nothing I've seen, including C# GUI apps and the like, that even comes close to how snappy it is.

So I am curious what benchmark you used to determine that it's about 50 times slower than it actually is.


> it benchmarked as fast as C++ in the past

Does this mean it's no longer as fast as C++ today? Do you have specific benchmark results?


You can find some in the Debian benchmark game. In general FPC-generated code is around 1.5 to 2 times slower than the fastest entry (often C++). Note though that this is with FPC's own code generator and there is a new LLVM backend in the development version (FPC's own code generator is the default and always will be; the LLVM backend is for those who really want it and compiles much more slowly).

I'd expect synthetic benchmarks like those in the Debian benchmark game to be closer to C++'s performance with the LLVM backend.


When I had a look at the CLBG results last time the code generated by FreePascal was even about three to four times slower than C/C++; but the benchmark rules are not particularly well suited for fair comparison (some code is obviously written with inside knowledge of the particular compiler/version and not what you usually see for the given language, and the Pascal code likely includes range and overflow checks which make it slower compared to a language without these, etc.). The LLVM backend is not officially supported by FP and doesn't support all platforms as far as I know; and unfortunately micro benchmarks are usually not representative for the daily overall performance of an application; the Are-we-fast-yet benchmark suite would be better in this regard.

Anyway, I would be interested in why FP benchmarked as fast as C++ in the past, but no longer today.


> The LLVM backend is not officially supported by FP

It used to be a separate project but these days is part of the main development branch. Though indeed the OS and CPU support is very limited.

Also i agree about the micro benchmark comparison, they tend to exaggerate differences. FWIW in my own programs i never found Free Pascal's code generator to be inadequate.

I do not remember the exact difference but last year i did compile my 3D game engine with the LLVM backend and the difference was small enough for me to decide that i don't want to bother with the much slower compile times.


I read other comments of people claiming the code generated by the LLVM backend was less than a factor of 1.5 faster than the one generated by the original backend, which is not worth the effort (and the humongous overhead and additional dependencies) from my point of view; but I'm still trying to find information about the specific optimizations done in the current FP compiler.


> I'm still trying to find information about the specific optimizations done in the current FP compiler.

AFAIK there isn't any explicit documentation but the "toptimizerswitch" and "twpoptimizerswitch" (the latter is for whole program optimizations) types in the compiler define the available optimizations in globtype.pas and have the following values:

    cs_opt_level1,cs_opt_level2,cs_opt_level3,cs_opt_level4,
    cs_opt_regvar,cs_opt_uncertain,cs_opt_size,cs_opt_stackframe,
    cs_opt_peephole,cs_opt_loopunroll,cs_opt_tailrecursion,cs_opt_nodecse,
    cs_opt_nodedfa,cs_opt_loopstrength,cs_opt_scheduler,cs_opt_autoinline,
    cs_useebp,cs_userbp,cs_opt_reorder_fields,cs_opt_fastmath,
    cs_opt_dead_values,cs_opt_remove_empty_proc,cs_opt_constant_propagate,
    cs_opt_dead_store_eliminate,cs_opt_forcenostackframe,
    cs_opt_use_load_modify_store,cs_opt_unused_para,cs_opt_consts,
    cs_opt_forloop

    cs_wpo_devirtualize_calls,cs_wpo_optimize_vmts,cs_wpo_symbol_liveness

level1/2/3/4 are basically collections for some of the above and are enabled for -On where n is 1 to 4. You can enable optimizations explicitly with the -OoXXX (for per-module optimizations) and -OwXXX (for whole program optimizations). The -io and -iw parameters can be used to obtain the available names for these.

I think the names are more or less self-explanatory, at least for the most part (not sure what "uncertain" does... which i think is appropriate :-P).

Some brief documentation (though very brief) is available in the programmer's guide:

https://www.freepascal.org/docs-html/current/prog/progch11.h...
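
For example (program name made up), a couple of invocations using the -O and -Oo switches and the names from the list above:

    fpc -O3 myprog.pas                            # enable the -O3 collection
    fpc -O2 -OoLOOPUNROLL -OoFASTMATH myprog.pas  # -O2 plus individual optimizations
    fpc -io                                       # list the available per-module names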


Thanks for the hints. I was already concerned that I would have to analyze the source code directly myself, and so I started to build tools for this purpose (https://github.com/rochus-keller/FreePascal).

I read somewhere that there are issues with higher optimization levels. Can you confirm that?


> Thanks for the hints. I was already concerned that I would have to analyze the source code directly myself, and so I started to build tools for this purpose (https://github.com/rochus-keller/FreePascal).

FWIW you may want to check out the fcl-passrc package[0] which provides units for scanning and creating a syntax tree from FPC source code, including some helpers like resolving references.

It is used by the fpdoc documentation generator and the pas2js transpiler that converts Free Pascal source code to JavaScript, both official FPC projects. fcl-passrc is itself part of Free Pascal.

> I read somewhere that there are issues with higher optimization levels. Can you confirm that?

AFAIK the main issue is that -O4 enables some optimizations that may break certain code: code that relies on having exactly the same math output as non-optimized versions, since -O4 enables FASTMATH; code that relies on classes having a specific field order (technically class field order is not guaranteed to remain the same), since -O4 enables reordering the fields to remove unnecessary padding; and code that depends on runtime errors or exceptions being thrown from statements that the compiler (at -O4) decided have no explicit side effects and removed.

These are not "issues" per se, but Lazarus labels -O4 as having "aggressive optimizations, beware", which might give a false impression.

Aside from that there might be bugs, but that is the case with any compiler.

[0] https://wiki.freepascal.org/fcl-passrc


Thanks again for the hints; I'm not (yet) fluent with FreePascal and assumed I would be much faster quickly implementing a rudimentary parser to make arbitrary queries over the source code (the one I implemented for Lisa Pascal/Clascal, including the browser/cross-referencer, took a week as a side project) than discovering and learning all the required FP libraries and tools; but the FP language turned out to have some pretty dark corners which are much harder to parse than I expected from a Pascal descendant.


Yeah, Free Pascal is far from simple. It not only tries to implement everything Delphi provides (itself never being a "simple" language) but also as much as other Pascal dialects provide (via compiler modes) and various useful features from other languages - and that in backwards compatible ways (often via "modeswitches" - kinda like submodes).

With that in mind it is kinda interesting that there are (AFAIK) three parsers for it anyway: FPC itself, fcl-passrc and Lazarus' CodeTools. It does mean that any new stuff in the language gets some time to be implemented in the others though.


> Free Pascal is far from simple.

It's still much less complex than e.g. Ada or C++, and unfortunately, the language has redundant, competing concepts that are probably explained by - as you say - putting different languages and design styles together; it's definitely not my favorite language.

Unfortunately, there was also no complete and correct grammar, so I had to create one myself, and even in this one there are still productions without a definition, e.g. array_constant, record_constant and procedural_constant. At least the syntax diagrams were not left-recursive, and removing ambiguous alternatives was also straightforward.

Surprisingly, the FP language specification does not distinguish between language variants; only the compiler knows different modes. There seems to be a common grammar for all modes.


Delphi and C++ Builder share the same compiler backend.

Nowadays it is based on LLVM.


> It's faster than scripting languages, but generally slower than AOT compiled languages, even those with GC.

Not sure about FPC, but Delphi 7 was on par with C++ ten years ago in competitive programming, that is, about 2-3 times faster than Java. By competitive programming I mean very short (1-3 seconds per execution max, so any JIT is at a disadvantage), CPU-bound, single-threaded, heavily algorithmic computation on mostly default compiler/runtime settings, with no external libraries or ability to tweak. The common knowledge was: you either use C++/Pascal, or use Java and occasionally rewrite your 200-300 line solution in C++ if you get a "time limit exceeded" error, and it passes with flying colors.


reasons why Delphi was/is a strong option for CP:

1. Fast compilation enables edit-compile-run workflow. Fast iteration enables solving CP problems faster.

2. Native dynamic arrays, native String type

3. Easy to learn language for middle/high schoolers.

4. Usefulness outside CP: for example, I created GUI programs in Delphi for clients and earned $$$ while in 8th grade, right after I was done with the competition season.


The greatest difference between Pascal and C++ is developer experience.

Pascal uses a single-pass LL(1) compiler, which allows you to compile in milliseconds. Pascal enables a REPL-like experience where you can Edit->Compile->Run in less than a second.

C/C++ with macros and slower compilation times is a worse developer experience; at least that was the reason for me to learn Pascal instead of C and Delphi instead of C++.


Just to add more context: Niklaus Wirth's foresight was to design Pascal's syntax to allow a single-pass compile. But it took more than 15 years until we actually had a blazingly fast single-pass compiler and that is totally Anders Hejlsberg's merit.


Every so often a thread pops up about Turbo Pascal and I'm astonished as to how nice an IDE you could fit onto a 64K CP/M system. (See also comments above/below.)


Oh, it was very very nice. I still miss it today sometimes. Here are a few highlights:

- Pressing F1 gave you reliably context sensitive help and the help content was really well put together.

- The debugger was great and was basically what we now know from Eclipse or IntelliJ and completely not like gdb. It had the same keyboard shortcuts for stepping as IntelliJ still has today.

- Computers in the 90s really did not support more than one display, but there was a weird trick that allowed you to connect one color and one monochrome monitor. Turbo Pascal fully supported that and could display the app on the color display while you saw the debugger on the monochrome one. This was before Windows and GUIs, everything was fullscreen. Without that there was no way to see debugger and app at the same time.

- The editor was so good that I preferred it over a word processor even for writing prose.


Back in the days of Turbo Pascal 5.5 I had a secondary monitor, a monochrome orange phosphor, connected to an EGA card, I think. The main screen was VGA. Two monitors provided a huge productivity gain, with common and cheap hardware, and it was super cool. Turbo Pascal was the only software I used that could display on the second monitor. Sometimes I miss seeing everything in orange.


I had a VGA and a Hercules card, and it's probably hard to convey how super cool that was at a time and in an environment where computers alone were pure magic to most people. Then there were these kids, like us, who could not only use these magic programs but could control and manipulate them from a second screen :-)


The best thing about built-in help was that it not only contained documentation of a function, but usually a short example code. That was extremely helpful and I relied on it a lot.


TP IDE was indeed magical. Except for one pain point, from an emacs user POV, no extensibility. You can't tweak the ergonomics so you're stuck with what the devs put in. It was 99% perfect but that broke me out of it.


My memory is fuzzy, but even before the Hejlsberg magic, Wirth's parser was a light and fast (but maybe unreadable) hand-coded piece of code.


> Pascal uses a single-pass LL(1)

No longer the case with the Pascal version used by the game engine.


which Pascal compiler is it - Embarcadero or FPC?


It's the language itself; I'm currently building a parser for FP 3.2.2 and I need more look-ahead than LL(1) in different parts of the syntax. And it has features which require more than one pass.


One thing this thread seems to forget: Pascal sources do not include literal kilometers of headers, compared to C++, and there's no code in interface sections either. Every C++ translation unit explodes into a full set of libs at the preprocessing phase. One pass or LL(1) is just a cool bonus compared to that.
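
For anyone who hasn't seen it, a minimal unit sketch (names made up): the interface section carries declarations only, so pulling a unit in via uses means reading its compiled symbol table rather than re-preprocessing a pile of headers.

    unit Greeting;

    interface                   { declarations only; nothing executable lives here }

    procedure SayHello(const Name: string);

    implementation

    procedure SayHello(const Name: string);
    begin
      WriteLn('Hello, ', Name)
    end;

    end.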


Delphi and FreePascal have compiler directives and include files quite similar to C, and it's commonly used e.g. in the FP source tree, albeit to a lesser extent than in a typical C/C++ project.


> On the other hand, the ecosystem is great.

Hard disagree. I worked with Pascal for about 10 years, and the lack of modern libraries was a source of frequent frustration, meaning we often had to develop the solutions ourselves, or abandon an idea entirely.


In regards to Delphi there were plenty of them, sold by companies specialised in component libraries.

Naturally one had to be willing to pay for them.


Yep. Unless it's part of a legacy project, or I'm developing something as a fun experiment, I have no interest in languages that lack modern, robust, stable, up-to-date libraries.


> I have no interest in languages that lack modern, robust, stable, up-to-date libraries.

I guess JavaScript is out then. ;-)


May we please have a few examples of what's missing?


The situation may obviously have changed today; it's been a few years since I last worked with Pascal. Back around 2016, we were looking for a different way of communicating between our services (you could kind of see them as micro-services, but I would not describe them as "micro"), which until then had relied on our own communication implementation on top of Windows sockets. Our implementation was OK, but it could not communicate beyond a single Windows machine, and it definitely did not allow multiple instances of the same services. We wanted to scale up, so we started looking into new technologies.

Our lead developer was interested in building a RabbitMQ prototype. There were two libraries to be found, but neither of them was finished, so we spent a lot of time finishing one of those implementations. We also looked into Protobuf, but that was a steeper hill to climb. When I left in early 2019, we were still using the old WinSock communication, though I understand from former co-workers that they've switched to a UDP-based system now.

That's the most vivid one I can remember, because we pushed on despite the available libraries being weak. I also remember trying MongoDB, which required a lot of polishing of the available libraries as well. Other ideas I remember pursuing only briefly, until I recognised that finishing some of the libraries would be too tall an order.

For the record, I don't have a problem with Pascal in general, nor its community. But considering how trivial it is to find a solid robust library for modern technologies (even if kind of fads) for other popular languages, compared to Pascal, it definitely feels weird to call its ecosystem "great".

Of course, the libraries I've mentioned may be great today. But RabbitMQ and MongoDB are also a bit older technologies today.


I also think the modern part is exaggerated here. Object Pascal had cutting-edge features and developer experience in the 1990s, but I think it's hard to see it this way nowadays.

There are two big features that one expects in a "modern" language that are missing in object pascal:

1. Some sort of automatic memory management that prevents memory leaks and dangling pointers. Modern languages usually have GC, automatic reference counting (like Swift or Python) or some form of static analysis (like the Rust borrow checker) - Zig is probably the only outlier here, but it does have better memory safety tooling (like defer). Object Pascal really feels like going back to C++ as it was during the 1990s and early 2000s, before smart pointers and later move semantics caught on. If I remember correctly, COM-style interfaces are reference counted (a small sketch follows after this list), but they carry the COM baggage and nobody is using interfaces for everything.

2. A package manager that is isolated (not impacted by the global environment), supports specifying dependency version in a configuration file and supports repeatable builds with lockfiles. fppkg doesn't seem to go that far.
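
A minimal sketch of the interface-based reference counting mentioned in point 1 (names made up; in FPC's default COM-style interface mode, TInterfacedObject supplies the counting):

    program RefCountDemo;
    {$mode objfpc}

    type
      IGreeter = interface
        procedure Greet;
      end;

      TGreeter = class(TInterfacedObject, IGreeter)
        procedure Greet;
      end;

    procedure TGreeter.Greet;
    begin
      WriteLn('hello')
    end;

    var
      G: IGreeter;
    begin
      G := TGreeter.Create;   { reference count becomes 1 via the interface assignment }
      G.Greet;
      G := nil                { count drops to 0; the object frees itself }
    end.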

There are other features that you could argue for I guess, like type inference, simplified expression syntax, asynchronous I/O and more, but they would be more controversial.

I don't think I can honestly call Object Pascal a modern language, and it's sad, since I still have a warm place in my heart for Pascal.


> A package manager that [...] supports specifying dependency version in a configuration file

This is an anti-pattern common among modern programming ecosystems that they should ideally be moving away from—and ecosystems that are not already afflicted with this sort of thing shouldn't make the mistake of making it a milestone to aim for.

These versions listed in a text file that you refer to are nothing more than a way to route around one's own version control system (example: package.json and Git). The version control system should be used for version control—not text-file based hacks that end up being the source of unreproducibility. And to the extent that one's version control system can't be used because it doesn't do what you want, then that's something that should be fixed at the appropriate level (i.e. the version control system itself)—not with band-aids applied there, either.


There are other reference counted types with copy-on-write, like strings and dynamic arrays, it isn't only interfaces.


FPC (if my experience with Lazarus is anything to go by) doesn't even compile quickly - certainly not compared to the Delphi compiler.


> but generally slower than AOT compiled languages, even those with GC.

Is this an assessment by experience or do you have specific performance data, e.g. comparing a set of benchmarks with C++? What's the difference if checks (range, overflow, etc.) are disabled?


The Object Pascal as originally created by Apple, no.

The Object Pascal adopted by Borland, and evolved into Delphi, is on par with plenty of C++ capabilities, given the Delphi/C++ Builder symbiotic relationship.


Why? Old languages must remain around for legacy support. However, it's hard to see the point of shoe-horning all those modern features into an old language.

I write a lot of Java, and really, almost everything since Java 8 should not have been added. Lambdas, for example, are a kludge in Java.

Want modern features? Use a modern language. If you want to stay in the Java ecosystem, for example, you could use Kotlin.

I'm talking about Java only because I haven't seen Pascal for literally decades. I last programmed Pascal in the 1980s. Adding that laundry list of features to Pascal is just silly.


lambdas are a kludge? Really? You prefer creating one-method anonymous classes instead?


I think they meant in terms of how the feature was implemented, not the feature itself.


Ok, yeah, that's true.


> Use a modern language. If you want to stay in the Java ecosystem, for example, you could use Kotlin.

That's easier in Java given how easy it is to interop between languages (see also Clojure). What if I'm using C++? There's no easy upgrade path to anything else.

There's still value to "shoe-horning all those modern features". I might not start a project with C++ today, instead choosing a modern language (e.g. Rust) instead. But there are millions of lines of C++ already written that would be improved by modern features (see how C++11 changed the landscape).


> I might not start a project with C++ today, instead choosing a modern language (e.g. Rust) instead.

This would depend on what your constraints and needs are. Rust isn't a "modernized C++", it's a language/ecosystem with a different set of strengths and weaknesses. You may very well prefer C++ over Rust today.


Then we have to switch PLs much more often. It seems reasonable to me to add/integrate new features that have proven to add a benefit, to delay this process.


Java until recently had less going for it than Delphi did 15 years ago, except the insanely large ecosystem - which trumps everything else, admittedly.

Java the language was misguided in its decision to use libraries instead of language features, leading to impressive amounts of boilerplate. Yes, you can do everything you can do in Python, Lisp or Haskell. But you can also do that in brainf*ck, by definition, so the actual complexity of Java solutions is actually higher than in a more featured language, because you end up reimplementing all those language features anyway in a slightly broken way or pulling in tons of dependencies that do that for you, in either case with added API surface to learn.


> because you end up reimplementing all those language features anyway in a slightly broken way or pull in tons of dependencies that do that for you, in either case with added API surface to learn.

I suppose this statement is true for absolutely all programming languages?

You have to use libraries in all languages, every library is somehow broken because it implements functionality in its own unique way, by using any library you have to learn a unique API, etc. etc.


Yes.

Except sometimes you don’t have to use libraries because the language just does the right thing out of the box, e.g. go with channels, selects and goroutines. This is what Python’s ‘there should be only one way to do it’ is really about.


Only in the last few years have we had a competing JVM language.

Scala was a dead end. Clojure was a fad. Rhino/JS was very limited in its ambition.

Java improvements allowed JVM to stay relevant and fend off DotNet, which is a good thing considering Microsoft's history.


> Scala was a dead end.

I'm sure all the companies, like Microsoft, Twitter, Disney, and basically every bank out there, will be very surprised to hear this.


Where does Microsoft use Scala?

Or every bank out there?


Machine learning:

- https://github.com/microsoft/scala_torch

- https://microsoft.github.io/SynapseML/

- https://www.infoworld.com/article/3236869/what-is-apache-spa...

And Microsoft of course owns LinkedIn which is built on a Scala backend.


Small projects, most of the stuff they do in ML is based on C++ and Python. Hence why they hired Guido and are putting money into optimising CPython.

LinkedIn is a separate entity, even if owned by Microsoft.

Zero Scala content at BUILD.

What about the banking world across the globe, no numbers?


Just search any job board for 'spark', you will see plenty of banks and many other companies.


Plenty of banks isn't every bank.


Those are just the ones you will see from a random sample of job postings. There are many others out there that are not posted at any given moment. If you don't believe me just talk to any data scientist who has worked at or does work at a bank.


Those busy using R, Python, Excel, alongside PowerBI and Tableau?


No, the ones using Spark. Do keep up!


It's a bad sign if use of a language is tied to a singular framework such as Spark or Rails.

The next fad is in, the language is out.


Interesting, then Java is in trouble I guess because it's mostly tied to Spring Boot?


It's not. Maybe it's selection bias: I don't see it around that much.


It is a bit hard when, after all, it isn't every bank as originally described.

Need to move that goalpost around.


Nothing gets past you, Mr Literal!


Often overlooked, the Ada embedded ecosystem has advantages of maturity in static analysis, debugging, and target support.


I always thought that Ada was underrated. Personally I appreciate features like memory safety, a standard concurrency model, support for unit/measurement types, and ahead-of-time compilation.

Ada's Pascal-like syntax seems verbose to me but isn't hard to read. And VHDL is based on Ada syntax, perhaps making it easier for people who are working in both languages.

Swift seems to check some of those boxes (and adds many other convenient features including closures, automatic reference counting, type inference, etc.) I don't know about measurement types, but it seems they should be doable. There doesn't seem to be a Swift-like HDL yet though.


Does Ada support Android and iOS targets yet? Last I checked there didn't seem to be any obvious way to build for NDK.



It isn't the friendliest to get started with on macOS.


Alire mostly resolves that. It can manage the compilers you have installed. Much better than the situation a few years back.

https://alire.ada.dev/


Pascal is actually good and performant. It was the second language I picked up in high school, after BASIC.

I don't understand why, as an industry, we had to regress to Python, Ruby, Java, & JavaScript in the late 90s & early 2000s.


I saw a talk that touched on this recently: https://www.youtube.com/watch?v=Tml94je2edk

He explains that dynamically-typed languages like Python, Ruby, and JS became popular in the 90s because they offered a fast feedback loop for website building, and they didn't need an IDE or compilers, which were slow (and often not free). It ended up not being worth trading your development time for the performance increase when all your users were connecting via 56k modem anyway.

The talk makes a lot of other cool points about how development has changed in the past decade or two, and why the trend is moving back to static typing.


Do not forget Pearl.


Yes, it is a gem of a language.

Now, I wonder who its mother is?


> gem of a language

PP already mentioned Ruby.


Perl or Rakudo?


> I don’t understand why as an industry we had to regress to Python, Ruby, Java, & Javascript in late 90s & early 2000

Computers got faster and added more memory, so they had to come up with a use for it. ;-)

Also the web browser became a popular programming platform and mainly used JavaScript. They're still trying to cram a full OS into web browsers in addition to the browser part. Chrome is getting close.


My first project as a freshman in university was an elevator simulator (with text graphics) in Pascal. I enjoyed the language, there's something elegant about it when compared to C-like languages. This is the reason I'm enjoying Nim nowadays, which afaik is inspired by Wirthian languages (Pascal, Modula 3, Oberon, Delphi), in addition to Ada and Python.


Pascal isn't too bad; actually I think that it has some advantages over some more modern programming languages. It isn't perfect though and does have disadvantages too. I sometimes use Pascal and BASIC for DOS programming; Pascal does look OK for that, at least (and C seems not as good for DOS programming (at least in real mode), even though it is sometimes done). However, for programming in Linux, I generally prefer C (although I also use PostScript; I think that both C and PostScript have some advantages compared with some more modern programming languages). Pascal can be used for other programs too; even TeX is written in Pascal.


What's your use of PostScript? I find it fascinating when people use it for anything other than print-related stuff.


you may appreciate http://postscriptcode.com/ by nathan laredo

i wrote a parametric 2-d cad system in postscript: http://canonical.org/~kragen/sw/laserboot/cut-3/README.md.ht...

it's a very flexible dynamic language, much like lisp, and a little less like python, but with much better performance than python


I think that it is not such a bad programming language. I have used it for both graphical and nongraphical stuff. For example:

- A library to read/write the ZZT file format. I had then also used it to make a program which will draw a graphical map (I also defined a "PCEncoding" vector, although some characters seem to not work somehow). I had also done other things with it that are not graphical, including constructing ZZT world files automatically or making batch modifications to existing world files.

- A library to parse a UHS hint file. Later, a program could also be written to allow it to be printed out, perhaps even using invisible ink or scratch-off layer if you have a suitable printer. It could also be used for interactive mode without printing.

- I wrote a program to convert the levels from a DOS game into the Free Hero Mesh level import format, in PostScript. (A similar thing can be done to convert levels from other games, I suppose.)

- I wanted to measure the computer's temperature and load over time, so I wrote a program in C to write the measurements to a file (just simple binary data with fixed length records) and then wrote PostScript program which will plot it on a graph and make a PNG file.

- I also implemented getopt and JSON and multicodec in PostScript. I don't use JSON much, although I do use the getopt implementation sometimes.

- I wanted to write a Pokemon battle simulator in C. I had used PostScript to manage some of the configuration-related stuff that is used before the program is compiled. As a side-effect, it can also make the type matchup charts in PNG as well (in addition to producing the files in the format used by the battle simulator I wanted to write), so now I have those as a reference even when playing a different game.

- I also implemented Infocom's Z-machine in PostScript, although that is mostly just to see if I can. It implements both interactive mode (without the printer, using standard I/O), and the transcript (with the printer).

Nevertheless, there are some things I would improve in PostScript. For example, add a "unread" operator (similar to ungetc in C), a built-in procedure called "#!" which just skips the rest of the line from the source file (the following slash is a new token in PostScript (which will be skipped by this procedure) so this will work), optional auto-allocation for stuff such as "readline", making warning/diagnostic/error messages on stderr instead of stdout (although "print", "=", "==" would still write to stdout; you can use "write==" etc to write to other files, but being able to do this with "pstack" would also be useful), and some other stuff. (Some of the stuff in Ghostscript is good, such as the ARGUMENTS array; that is probably the feature of Ghostscript which I use most often. %pipe% is also useful.)


My first language after BASIC was Modula-2, which looks a lot like Pascal. I studied OOP, algorithms and data structures with Pascal in college, so it's a language for which I have great memories.

Nice to see it's still around.


Modula-2 is a much better language than original Pascal, but if you want OO features Oberon-2 is the corresponding Wirth language.


Both made it into modernity as large swaths of Golang. Modula-2 even had coroutines.


Go has surprisingly little in common with Oberon or Modula (mostly the receiver syntax of Oberon-2 bound procedures). The coroutines were not a language feature of Modula, but a library feature; also the Oakwood guidelines of Oberon (not by the original authors) include a coroutine API, but I'm not sure whether there was a working implementation.


golang is basically newsqueak


Go is basically Limbo + Oberon-2


it's a lot closer to newsqueak, but limbo of course drew a lot from newsqueak as well


And it's the fastest language in the benchmarks game, whilst also producing the smallest code. Not only better than Pascal or Go, but even better than C++.



Can you post a reference, please?



Thanks for the link, though I don't think the fellow referred to that; it was last updated in 2003 and Modula-2 is far behind C or C++ (see https://dada.perl.it/shootout/craps.html); and the results also seem quite questionable since Lua and Python are too good to be true.


Does Pascal still require all variables to be declared at the top? I did a quick search and that seems to be the case, but I want confirmation from someone who knows the language.

If so, that's an immediate "no" for me. We know by now that keeping variable declarations close to their usage is a big boost in readability, and at times even in performance (you only declare variables you actually use).

I know Pascal has evolved a lot over time. If this is indeed still a requirement, it's hard to understand why, and a barrier to those used to scoping in "modern" languages (even C99 has this).


Delphi allows inline variable declaration but Free Pascal does not (the developers explicitly decided against implementing it).

Note that in Pascal (FPC, Delphi, whatever) you can have nested functions, so this isn't as much of a limitation as it sounds. For example if i have a procedure that is very long i often end up splitting it into multiple nested procedures and each one has its own variable section.

Also from a more practical perspective Lazarus (the most common FPC IDE) has a shortcut key to automatically insert variable declarations so you don't have to move up and down in code even if you have a large function body. I'd expect Delphi to have something similar.
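
For example, a small sketch of that nested-procedure pattern (names made up), where each nested routine gets its own var section:

    procedure ProcessReport;

      procedure EmitHeader;
      var
        Line: string;           { local to this nested procedure only }
      begin
        Line := '=== report ===';
        WriteLn(Line)
      end;

      procedure EmitBody;
      var
        I: Integer;             { a separate, independent var section }
      begin
        for I := 1 to 3 do
          WriteLn('row ', I)
      end;

    begin
      EmitHeader;
      EmitBody
    end;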


> Delphi allows inline variable declaration

it may do so now, but for the longest time it didn't, which made it compare poorly with C++ and even with C, which allows definitions in a scope. in fact the OP concept of scope is a bit crap - basically the whole function.


Your concept of Object Pascal scope is also outdated, in relation to Delphi.


Seems a bit shallow. JavaScript equals is an abomination, which to me seems like a larger issue, and people use it just fine.


I agree that triple equals is an abomination. It's one of the many things ensuring I use JavaScript as little as possible.


You can pretty much ignore mostly anyone most of the time who screeches at you to use triple equals (strict equality comparison) instead of regular ol' "==" (abstract equality comparison). It's one of modern programming's dumbest, least well-founded memes, resulting in such brilliant advice as insisting during code review that

  typeof(foo) == 'string'
be changed to

  typeof(foo) === 'string'
because the latter is "safer". Anyone saying—anyone—this is guaranteed not to know what the hell they're talking about—and shouldn't be in a position where they're pretending that they do. It's unjustifiable and pure cargo cultism.


A well designed language should not have an equals that results in a cargo cult


I have missed the part of the discussion that contains an imperative for folks to debate the resolution of whether JS is a "well designed language"—and not about JS's equality comparisons per se and whether it's still usable or not despite that. These:

> JavaScript equals is an abomination, which to me seems like a larger issue, and people use it just fine.

... are your own words...


One of my primary languages is C89 and don't find it an impediment. It's been over 2 decades since I last touched Pascal so I'm not sure how it works there, but in C you can always open a new inner scope with { and declare more variables there.

> and at times even in performance (you only declare variables you actually use).

All but the stupidest compiler (which usually means no optimisation at all, not even precomputing constants) will not be affected by completely extraneous variables.


> All but the stupidest compiler (which usually means no optimisation at all, not even precomputing constants) will not be affected by completely extraneous variables.

The statement is certainly false in this general form, even with full link-time optimizations. If the type has a constructor with side effects external to the program (e.g. it makes syscalls), the compiler cannot remove the variable.


If they cause visible side-effects, they are by definition not extraneous.


I'm not interested in playing this semantics game. The compiler can't know whether a syscall has side effects. And some side effects can be benign and irrelevant, and you definitely left the variable in by mistake, but they're still side effects.


If that's the worst thing you can find about Pascal (and it's just a style issue) then Pascal must be the best thing ever :)


I found a beautiful beginner book, "High-Level Languages and Their Compilers" by Des Watson, and I want to run all the examples, which are given mostly in Pascal.


> Why use Pascal?

Because you like Niklaus Wirth's languages, and only those ... but only up to Pascal.

You don't think that the improvements in his subsequent Pascal-like languages are Wirth a damn.

You believe that Wirth went soft in the 1970's and 1980's, and sold out Pascal.

If that is you, you probably write code in Pascal, implement Pascal, write about Pascal ...

(Everyone else should probably skip Pascal and take a look at Modula-2 and Oberon.)


> (Everyone else should probably skip Pascal and take a look at Modula-2 and Oberon.)

I did check out Oberon-07 because of its minimalism but i really couldn't get over the SHOUTY keywords :-P.

(Wirth used uppercase for Pascal's keywords too but unlike Oberon, Pascal is not case sensitive - my pet theory is that Oberon is case sensitive as a reaction to Pascal programmers not using shouty capitalization :-P)


Yes, forcing uppercase keywords is rather unfortunate; have a look at https://oberon-lang.github.io/


I think the main reason for keywords being uppercased in Oberon (which is case sensitive) is to free up the space for user-defined identifiers. As long as the programmer sticks to lowercase and mixed case identifiers, new keywords and predefined procedures can be introduced in the language without invalidating old programs.


Wirth could have figured out a better way to do this, or asked around.

You can have a way for a program to declare the version of the language it wants. Then keywords introduced in all newer versions of the language disappear (are treated as ordinary identifiers).

Via a mechanism like Racket's #lang, Component Pascal could support Oberon-2, Oberon, Modula and even Pascal translation units in the same program.

It does seem as if Wirth operated with serious academic blinders on. He had no concept of treating the language as a product that supports existing users with dogged backward compatibility. The result was a fragmented landscape of languages, which helped erode the popularity of the languages.

The C preprocessor has been much maligned, and usually rightfully so, but with the preprocessor, I can take C code that uses C++ keywords like class and new, and make it work. We can easily write a macro convert(type, expr) which will expand to static_cast<type>(expr) when the code is compiled as C++, and to (type) (expr) when compiled as C. C itself has been able to bring in new keywords like bool, true and false without breaking old programs which use these, by requiring programs to include a <stdbool.h> header from which you get things like #define bool _Bool.

Speaking of C++, most C programs can be easily converted to C++, often with little or no modification. Or at least that used to be true for a long time back in an era when it was important to be able to do that.


For a "Loberon" dialect!


s/For/Fork/


For the first couple of items on the list, Austral might be a language worth considering:

https://austral-lang.org

It's new so it obviously doesn't have the community of libraries to use, but it does have a very friendly and accessible Pascal-like syntax, while also having a state of the art linear type system.


I'm using Free Pascal for my 3D game engine (recent-ish screenshot[0]). For me there are really two main, simple reasons:

1. Lazarus. A game engine is -waaay- more than just a 3D engine with the tools being a very important aspect. Lazarus and LCL provide a rich and well featured WYSIWYG RAD IDE and framework for making desktop applications. As a bonus Lazarus as an IDE (even ignoring the LCL framework) is very fast.

2. The language is decent and i know it. This is important because i'm not interested in learning a new language or framework or whatever while also making the engine, i want to focus on one thing. Also i have a lot of code written over the years to cherry pick (i started the project in 2020 but some of the code goes back to 2007-2008). And finally it has a strong aversion to breaking existing code - there is a culture of preserving backwards compatibility.

Free Pascal is far from perfect though and TBH i'm annoyed by some of its constraints - some of them being quite pointless too IMO. For example the language has three ways to create compound types: records, objects and classes. They are almost all identical, except each one of them has limitations that aren't found in the others (a small sketch follows the list), e.g.:

1. Records. They are like C structs. With the "advanced records" language submode (which enables more functionality in a backwards-compatibility-preserving manner) they can also have methods, properties and "management operators" allowing the compiler to insert calls when things enter or exit the scope (so you can e.g. implement smart pointers, RAII or whatever). However they do not have any form of inheritance or virtual functions. They are value types and as such can be put on the stack, on the heap, or be part of another compound type.

2. Objects. They are basically records with inheritance and virtual functions, though they cannot have management operators or some other record-only functionality like a variant section (allowing, e.g., functionality similar to C unions). They are also value types and can be put on the stack, on the heap and/or be part of another compound type.

3. Classes. They are objects with extra functionality and have extra bells and whistles like dynamic/message-based methods, a published section that exposes properties and attributes via RTTI, etc. However they are reference types - they can only be allocated on the heap[1] and as such cannot be put on the stack and cannot be part of another type (only references can be stored).
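
To make the contrast concrete, here is a minimal sketch (the type names are made up for illustration) with the three kinds side by side:

  program CompoundTypes;
  {$mode objfpc}{$H+}
  {$modeswitch advancedrecords}

  type
    { record: value type; "advanced records" allow methods and properties }
    TVec2 = record
      X, Y: Single;
      function SqrLength: Single;
    end;

    { object: value type with inheritance and virtual methods }
    TShape = object
      Position: TVec2;
      constructor Init;          { sets up the VMT for virtual calls }
      procedure Draw; virtual;
    end;

    { class: reference type, instances always live on the heap }
    TSprite = class
    private
      FName: String;
    public
      property Name: String read FName write FName;
    end;

  function TVec2.SqrLength: Single;
  begin
    Result := X * X + Y * Y;
  end;

  constructor TShape.Init;
  begin
    Position.X := 0;
    Position.Y := 0;
  end;

  procedure TShape.Draw;
  begin
    WriteLn('shape at ', Position.X:0:1, ',', Position.Y:0:1);
  end;

  var
    S: TShape;         { can live on the stack }
    Sprite: TSprite;   { only a reference; the instance is heap-allocated }
  begin
    S.Init;
    S.Position.X := 3.0;
    S.Draw;
    WriteLn(S.Position.SqrLength:0:1);

    Sprite := TSprite.Create;
    Sprite.Name := 'player';
    WriteLn(Sprite.Name);
    Sprite.Free;
  end.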

The thing is none of the above limitations have a technical reason - especially those between objects and records. In fact in the compiler source code all compound types are implemented with mostly the same code with some specialization checks here and there, it isn't like each of the above gets its own separate code path.

But this is far from the only limitation. Another is that there is no way to expose a readonly view of an instance - something like C/C++'s "const" - so you can't -e.g.- have a mutable collection as a private field in a class that is exposed as an immutable one directly, so the compiler knows it can generate code for directly accessing the underlying field but not allowing any modifications to it. The weirder - and misleading - aspect is that the language does have some ways of specifying readonly views in some cases - specifically function parameters can use "const" or "constref" (the latter ensures a value is passed by reference, the former lets the compiler decide) which disallows the value to be modified. However if an object is passed like that you can still make method calls that mutate it even though you cannot assign it to another value, which kinda makes the whole thing feel only skin deep.
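
A small sketch of how skin deep that const is (using a hypothetical TCounter class for illustration):

  program ConstDemo;
  {$mode objfpc}{$H+}

  type
    TCounter = class
    private
      FValue: Integer;
    public
      procedure Bump;
      property Value: Integer read FValue;
    end;

  procedure TCounter.Bump;
  begin
    Inc(FValue);
  end;

  { "const" only prevents reassigning the parameter itself;
    calling mutating methods on the referenced instance still compiles }
  procedure Inspect(const C: TCounter);
  begin
    { C := nil; }   { this would be rejected }
    C.Bump;         { this is accepted and mutates the object }
  end;

  var
    Counter: TCounter;
  begin
    Counter := TCounter.Create;
    Inspect(Counter);
    WriteLn(Counter.Value);   { prints 1 }
    Counter.Free;
  end.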

On the topic of arbitrary limitations and types, there is also a weird limitation on properties: you can have property getters (and setters, but what i'll describe makes sense only for getters) be either fields or functions. For the former the compiler will generate code that'd be equivalent to accessing the field directly, for the latter it'd be a function call. When you define a property you also declare its type - but if you use a field as a getter it must match the type exactly, whereas you can have a function that returns a field as another type. This means that you cannot work around the "lack of const" limitation for collections by making a base class collection that only provides a readonly view and a subclass that allows mutation and then exposing the mutable instance as its immutable parent class while letting the compiler generate code that accesses the field directly - you have to go through a function call, and not only is the compiler very stupid at eliminating unnecessary calls, this also changes the RTTI data if you want this to be handled for e.g. serialization or via a GUI property editor.
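
A sketch of that getter limitation, with made-up class names, based on the behaviour described above:

  program PropTypes;
  {$mode objfpc}

  type
    TReadOnlyList = class
    end;

    TMutableList = class(TReadOnlyList)
    end;

    THolder = class
    private
      FItems: TMutableList;
      function GetItems: TReadOnlyList;
    public
      { property Items: TReadOnlyList read FItems; }   { rejected: the field type must match the property type exactly }
      property Items: TReadOnlyList read GetItems;     { accepted, but now every access goes through a function call }
    end;

  function THolder.GetItems: TReadOnlyList;
  begin
    Result := FItems;
  end;

  var
    H: THolder;
  begin
    H := THolder.Create;
    WriteLn(H.Items = nil);   { FItems was never assigned, so this prints TRUE }
    H.Free;
  end.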

Also when it comes to RTTI, which is still better than what you'd get via e.g. C++ (i.e. nothing), it has limitations like only allowing primitive types (so if you have a class for a 3D object and you want to expose its position to be edited via a property editor that uses RTTI to figure out what to edit, or to have automatic serialization and deserialization, you cannot expose a "Position" property of a "Vector3D" type directly with the RTTI specifying what that "Vector3D" means - instead you have to specify separate "PositionX", "PositionY" and "PositionZ" properties of "Single" type, which is a primitive type for 32bit floats). You can still have a "Position" property of a "Vector3D" outside of RTTI and you can still use a private field of "Vector3D" to store the position and then use getters and setters behind the scenes to access the individual vector fields for the "PositionX/Y/Z" properties.
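
Roughly what that boilerplate looks like (hypothetical TEntity/TVector3D names; only the X component shown, Y and Z would be analogous):

  program RttiDemo;
  {$mode objfpc}{$H+}

  uses
    Classes;

  type
    TVector3D = record
      X, Y, Z: Single;
    end;

    TEntity = class(TPersistent)
    private
      FPosition: TVector3D;
      function GetX: Single;
      procedure SetX(AValue: Single);
    public
      { fine as a plain property, but it cannot go into the published section }
      property Position: TVector3D read FPosition write FPosition;
    published
      { published (RTTI-visible) properties have to use simple types,
        so the vector ends up exposed component by component }
      property PositionX: Single read GetX write SetX;
    end;

  function TEntity.GetX: Single;
  begin
    Result := FPosition.X;
  end;

  procedure TEntity.SetX(AValue: Single);
  begin
    FPosition.X := AValue;
  end;

  var
    E: TEntity;
  begin
    E := TEntity.Create;
    E.PositionX := 3.5;
    WriteLn(E.Position.X:0:1);   { prints 3.5 }
    E.Free;
  end.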

All of the above have the end result of needing tons of - often inefficient - boilerplate to work around language limitations that in 99% of the cases have no reason to exist in the first place (aside from nobody implementing them - which would be fine, it is a free project, but the thing is the above are things the FPC developers refused to acknowledge as problems - in fact i personally tried to submit a patch that implemented the property case).

As an example, in my 3D game engine i have a "dynamic array" generic (an incredibly common container you'd find in game engines) that i want to be able to allocate on the stack, on the heap, or as part of another object, i want it to separate allocation from number of elements (the language itself has dynamic arrays but they always allocate the same number of elements as the array has, which means that when you insert items you do a lot of allocations and deallocations - that not only fragments the heap but the FPC memory manager isn't that fast in the first place) and of course i want it to be able to automatically finalize (i.e. destroy) any managed objects (e.g. a dynamic array of strings, with string being a copy-on-write "smart" object defined by the language itself).

In order to get the "allocate on stack or part of object" i couldn't use classes, only objects or records. Because i wanted to expose it directly but without allowing mutation, i really could only use objects since records do not support inheritance. However objects do not support management operators so i'd have to explicitly initialize and finalize the objects. So the workaround there was to store the data in a separate type that is a record with management operators and specify a single field in the (immutable / readonly / base) object of that separate type so that when the object goes out of scope or is finalized itself, its contents are also finalized (remember that i mentioned how this is an arbitrary limitation? The compiler already knows this can happen and handles this, otherwise this workaround wouldn't work!). And also because i cannot have the mutable field accessed directly as its immutable/readonly base type, wherever i use it and i want to expose it i have to use a method getter which returns a pointer to the underlying field (but as the immutable type) - this is needed because otherwise the getter would return a copy of it (remember these are value types).

[0] https://i.imgur.com/zzIH4dl.png

[1] (there are workarounds like using a proxy array with in-place initialization but they are very bug prone so i ignore them as they are not practical)


(continued)

These are some of the annoyances with Free Pascal as a language, but there are others, like how a unit (think module in other languages) has an "initialization" and "finalization" section that is executed on startup and shutdown respectively. This is a good and useful thing, however the problem is that the initialization also has implicit code that zeroes out any global memory in that unit. This means that if unit A's initialization code runs a function from unit B that accesses some global in unit B, but the initialization code in unit B hasn't been called yet, any modification made by unit A calling that function would be lost, ending up with weird bugs. The order the initialization code runs in depends on the order the units are used in a program, but if there are circular dependencies this order can be whatever the compiler encounters first. The workaround for this is to make sure the program (not any unit) that you compile has in its "uses" section the units whose state you don't want clobbered, and to make sure those units do not have any such dependencies themselves.

Found that the hard way because in my units the initialization section contains RTTI registration calls that were later clobbered by the RTTI registry unit somehow getting initialized after some units that registered some types - this ended up with some assets not being loaded in the engine because the registry couldn't find the type of some serialized objects (because these types happened to be registered before the registry's own initialization code zeroed out its globals and erased any registered types).

The obvious fix would be for the global zeroing out to happen before any initialization section code is executed (and that only in systems which actually need it), but for whatever reason FPC doesn't do that.

On the positive side the compiler is relatively fast - it takes about a couple of seconds to build the full engine + editor on my (now 5 year old) PC. And while there are issues like i mentioned above, the positives (which include the fast IDE, rich framework - and also i had a much easier time when trying to contribute to Lazarus - fast compiler... and the fact i already have a lot of FP code and know the language) outweigh the negatives.

It isn't like there are many alternatives anyway, especially when it comes to something like Lazarus (of which the only potential alternative i can think of - which i haven't explored much - is Qt Creator but i find Lazarus both the better IDE and LCL easier to work with than Qt, not to mention how FPC is waaaaay faster than G++ or Clang++).


What about performance, e.g. compared with C++?


I never really had a problem with it, at least so far.

Actually a few years ago i wrote a 3D game for DOS using Free Pascal[0] (here is a review from a YouTuber[1]) and the performance was decent, though i did write the rasterizer inner loop in assembly (not that great assembly TBH but still slightly better than what FPC generated).

I did optimize it over time and expectedly, the biggest gains weren't from micro-optimizations but from changing how rendering works (e.g. i got a boost by adding a PVS and then another by replacing the PVS with portals since the PVS wasn't that great :-P and then yet another when i added mesh occlusion culling using data from the previous frame).

About the engine i linked at in my post, i also did a few optimizations but again they weren't microoptimizations but just algorithmic changes. I did write a profiler[2][3] a couple of years ago that helped (the video shows it in practice with the engine) but it uses some Windows-specific functionality and i've switched to Linux since then. Perf works fine under Linux with FPC, but FPWProf (my profiler) has some useful functionality and i'd like to port it to Linux at some point.

[0] https://bad-sector.itch.io/post-apocalyptic-petra

[1] https://www.youtube.com/watch?v=Lo7VlrYiTeE

[2] https://www.youtube.com/watch?v=yF0wmN9J8Ts

[3] http://runtimeterror.com/tools/fpwprof/


Your level editor looks awesome :)


I see Go as a direct competitor to Pascal, in particular for all the points mentioned in the article.

But Go has a modern runtime (with garbage collection), a modern ecosystem (with major libraries such as HTTP and TLS in stdlib) and an active community.


I do not. Co-routine thinking is more complex than imperative coding regarding the teaching aspects.

Java and, more concretely, C# are the ones that took Pascal and Delphi out of the game for day-to-day programming. And Go is already so much more popular than Pascal for real-life work that competition is really the wrong word.


> modern, readable, fast, type-safe, cross-platform programming language

Let's assume this is all true (although I wouldn't say it's particularly modern, and I don't know about some of the rest). And let's forget that some of these adjectives are relative or subjective.

Even under that assumption - there are other programming languages which meet these criteria. Why would we prefer Object Pascal over them? Especially when it has existed for many years and not gotten a lot of traction?


"For those who like this sort of thing, it is the sort of thing they like." - attributed to a number of people.

There are other languages which meet these criteria. But if Pascal fits the way your brain works, then it does. So use it.

And if it doesn't fit the way your brain works, then don't.

People are different. It doesn't hurt to have different tools.


Related, and of interest, IMO:

Delphi – why won't it die? (2013)

https://news.ycombinator.com/item?id=7613543


The funny thing about that, is here we are in 2023. Lots of its competitors and haters are likely still asking, "Why won't it die?"

TIOBE index has Delphi/Object Pascal ranked at #11. That's ahead of "top players" like Go, Rust, and Swift.


In Germany we keep having a Delphi / C++ Builder developers conference, and maybe due to their relation to Anders, it is not surprising to see related articles on the .NET Developers Magazine.


Too bad the TIOBE index is not representative of the real world. And it certainly won't get you a job either. If I had to choose between go/rust and pascal/Delphi for a healthy ecosystem or for job offers, I know which one I would choose. Especially early in a career.

Ok maybe rust jobs aren't so common either, but swift and go are for sure.

And that's not even mentioning the de facto vendor lock in when using Delphi in most professional settings.


TIOBE is arguably the most reliable one, though all of them are arguably flawed in various ways.

Not saying the other languages mentioned don't have jobs, they are after all "top 20" languages too. But Pascal/Object Pascal/Delphi is around a lot more, and has more opportunities, than many might be thinking, especially from a worldwide point of view and not just for the U.S.A. Pascal/Delphi is still taught and used in a lot of schools around the world, while Go, Rust, and Swift are not. There are still massive amounts of legacy projects and code, not to mention Embarcadero is still selling Delphi and doing well enough to stay in business.

I think the mistake that some make is underestimating the time frame it takes to displace older and previously popular or dominant languages. Being popular this year or for a few years does not usually and instantly erase the fact that some other language was popular for 20 or 30 years.

People run around saying COBOL, or even languages like Ada or PHP are dead, yet various institutions and companies are begging for programmers that know them. Those claiming a language is "dead", can be quite suspect too, because they are competitors who "wish" it to be so. "Is dead", "wish it was dead", or promoting sites "claiming the competitor's language is dead" are clearly a bit different.

The fact that a language is popular this year does not necessarily offset all the years that an older language was popular and in use. Lastly, Object Pascal/Delphi did not just fall completely off the charts either or stop being used at all. It still maintains significant levels of competitive popularity around the world.


>Pascal/Delphi is still taught and used in a lot of schools around the world.

That's interesting, and good to know.


Good analysis, including the point about competitors.


Pascal was one of my first languages and I love it for that, but I can’t imagine going back to it. It never had a modern ecosystem and the OOP stuff was super janky


Pascal was the first prog language I ever learned as a 12-y/o kid. It's been almost twenty years. Shoutout to everybody from Kaitnieks who was scripting SCAR in Pascal to macro old RuneScape- how time does fly…


I wouldn't use it.

...but I'm kinda glad Delphi was my first language. I feel it primed my thinking along helpful lines - typed & pretty rigidly structured.

Spent a lot of time in Python since, which always felt a little... fuzzy, for lack of a better word. Recently took up Rust and had a bit of an "oh this feels familiar" moment. Clean and unforgiving.


We need to bring back Oberon. Fast compile times, automatic memory management, live environment.


Oberon is even older than Delphi; a more modern version is Oberon+: https://oberon-lang.github.io/


Is this still actively developed?



It was great for its time, nowadays we would be better off having a similar experience based on Go, C#, D,...


Add Vlang in there too, which inherited the Oberon-2 method syntax and has Pascal/Wirthian influence.


The original is much less jumbled.


Here is the language specification:

https://miasap.se/obnc/oberon-report.html


I would recommend everyone to learn Oberon just to see how concisely defined an imperative, structured, procedural and modular language with support for object oriented programming (extended types) can be. The language reference is only around 16 pages long including examples and formal grammar.


Pascal and Logo Turtle are programming languages I learned at school, but never used outside of it. I hated them back then. I don't know if this was the teacher's or the language's fault, but those IT lessons couldn't have been more boring and confusing.


Teacher and kids not being focused. I learnt using Pascal and it was perfect.


What I don't really like about Pascal is it's pretty verbose. It was originally for educational purposes so everything is very explicit.


I'd rather use Oberon or Modula-3 if I have to use an algol style procedural language.

But really, I'd rather use a modern lisp.


In the future, if you ever create a teaching language, please point out clearly to the students that it's a teaching language.


Pascal is fairly isomorphic to a C subset (with a few exceptions such as sets, range types and indices, or printing out enumerated constants). As a result there are readily available Pascal-to-C translators that produce working, compilable C code.

Pascal isn't a memory safe language (e.g. use after dispose()), but the parts of C that are harder to mechanically translate into Pascal are often the unsafe bits.
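
For instance, the classic use-after-dispose compiles and runs without complaint (a minimal sketch):

  program DanglingDemo;

  type
    PInt = ^Integer;

  var
    P: PInt;
  begin
    New(P);
    P^ := 42;
    Dispose(P);
    WriteLn(P^);   { use after dispose: neither the compiler nor the runtime catches this }
  end.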


I won't even bother clicking this click bait title. I learned Pascal as a sophomore in high school and by my senior year I knew I'd never touch it again. At the time it was a great introductory language but there's very little reason anyone should even be thinking about Pascal these days.

Fun fact: I still have my old pascal files on floppy disks in a box somewhere with a not so thin layer of dust on them.


I still think it is brutal we let our industry rookies learn complicated programming concepts in complicated languages like Java, python, js or god forbid anything functional.

I still think it is a huge mistake by Microsoft to let Basic rot away, and that our industry should focus teaching on something like Pascal, which does not stand in your way.


Visual Basic 6 remains unmatched in how to deal with COM.

It is incredible, given how much WinDev doubles down on COM, that they keep failing to produce any kind of tooling that is as productive as the VB 6 experience used to be.


Same here. Including the fun fact.


Question for Castle Engine users:

How is the file size when exporting to web, for an near-empty project?


Seeing devexpress components in Lazarus would be a dream come true.


Unless you want a C++ with a different syntax and less features then there's no reason to learn Pascal in 2023.

And the syntax is much worse, you can't even declare variables in the middle of a function; they have to go before the function body. This slows you down from real work (art code comes later). It also makes for worse code, as declaring near the usage point makes code more clear.


Just a few days ago I found a bug in someone's code where a variable was introduced twice with slightly different names, like "var jobno" and then "var jobn". Made me think that Pascal's approach of keeping var declarations in one place is not a bad thing. As for slowing down - probably yes, but I doubt that speed-typing is really a big virtue for a software developer. Also, in a decent IDE it should be possible to make a binding/macro to insert a declaration after the current function header without jumping away from the current position (though I don't know what Pascal devs use nowadays for an IDE).


That's completely missing the point.

While Pascal can be used for serious work, it is first and foremost a teaching language friendly to beginners, meant for introducing people (who back in the day likely started with something like BASIC) to the concepts of structured and object-oriented programming. The syntax is easy to learn, fairly clean and the language has very fast compilation (C++ really can't compete here). Which is important when teaching - permits rapid iteration.

Concerning archaic/rigid syntax - there are some good reasons why the syntax is like it is. E.g. the concept of declaring a variable in the middle of a block didn't exist in Pascal for a reason - you had to actually allocate memory before you used it, the compiler didn't hide this from you. That's why variables are declared at the start of the block.

When Pascal was designed it was running on various 8-bit and 16-bit machines and was originally compiled into bytecode (p-code), so optimizations like the statement reordering that is common today (which permits declaring variables in the middle of blocks) didn't exist / weren't possible or common.

These days you have a ton of abstractions between you and the "metal" running your code, Pascal is a language designed for much simpler time and hardware.


I was teaching programming with Pascal until three years ago, and I think the pedagogical argument might have made sense in the past, but not in 2023 (or 10 years ago for that matter).

The syntax is rigid in all the wrong places. For example, not being able to declare variables anywhere doesn't really help students, this isn't something that typically leads them to errors.

What does lead to lots of errors? Dereferencing uninitialized pointers or going out of bounds in arrays, for example. And the language and compiler won't help you one bit there. Same opaque segmentation faults as with C, same unpredictable behavior as with C, but with worse tooling for debugging.

Compilation time is not an issue in 2023 - well, maybe it is if you compare with C++, but C++ is about the worst non-joke language I can think of for learning (and it was my first language!). C, Java, Python, etc. are fast enough.

Pascal had its time but I don't think there is much reason to use it now, outside of pure curiosity or wanting to try a historical language.


>For example, not being able to declare variables anywhere doesn't really help students, this

Disagree. Creating variables in the middle is the reason why beginners create spaghetti code and mix up control flow. This is the reason for dangling pointers, memory leaks, and de-referencing pointers in wrong places.

If a student needs to declare a variable in the middle of a method, that is a good sign to split the method or rethink the control flow.

What I really learned from Pascal is managing clean control flow, and the var section at the beginning forces you to think in advance about what you are going to need in this particular method vs. what needs to be done in a separate function.


>> Pascal had its time but I don't think there is much reason to use it now, outside of pure curiosity or wanting to try a historical language.

I use Pascal only for side job/hobby, not for daily work. Lazarus+FPC is still my #1 choice for desktop app (is this still a "thing" in 2023?). No Electron, no JVM etc etc.

For writing web/server/backend stuffs, kinda hard to argue with Go.


> While Pascal can be used for serious work, it is first and foremost a teaching language friendly to beginners

But that's not what the linked article argues as a reason to use (Object) Pascal.

> The syntax is easy to learn, fairly clean and the language has very fast compilation (C++ really can't compete here). Which is important when teaching - permits rapid iteration.

1. There are other languages than C++ you know...

2. For language-teaching purposes, basically _any_ language compiles essentially instantaneously on modern hardware (including C++).


Much of the original Mac was Pascal based with 68k for performance pinch points.

Of note it was faster to compile than C with MPW Shell.


Many years ago I learned programming that way. Pascal then C, Assembler and VB Script. I completely agree with the argument that it is a great teaching language. Our teachers even explained this to us. JS, C, C#, Python and Java syntax is just crazy for real newbies. Curly braces basically do not exist in normal life and scare people. And Go and the functional group of languages are just not understandable with their concepts. Imperative programming is much easier to understand.

Pascal is in a sweet spot.


I seem to recall Golang being inspired by it too, but I could be misremembering.


I believe you're right. iirc, that's where the := notation comes from


I've been playing around with Lazarus the last couple of weeks, and I can say there is at least one reason to learn pascal in 2023: it's fun!


Can’t speak about Object Pascal, but the old Pascal was objectively better than C. You can like or dislike the differences in syntax, but Pascal’s type system was pretty solid which can’t be said about C


The way I remember the Pascal type system is that arrays of different lengths were different types.


In the original Pascal, that was true. The standard (and, I think, Jensen & Wirth) were that way. And it was a fatal flaw.

I had a program that I took over that did a numerical simulation on a 2D grid with user-specified size. The original author simulated that 2D grid with a 1D doubly-linked list. As you might expect, this was both slow and error-prone. But he did it because there was no possible type that you could give to a user-sized array. There was literally no way to talk about such a thing.

We eventually "fixed" it by allocating the largest 2D array possible within our memory limitations, and using only the user-specified part of it.

Most "real" Pascals (Turbo Pascal, but others that were intended to be used in the real world, not just in the classroom) developed some way to give a type to a variable-length array (and also to do a C-ish cast). Unfortunately they all did it in different ways, so there was no code portability between compilers if your code needed such things.

So, yeah, "objectively better than C" is quite a stretch, even just in the type system.


There's more to it than that. You could (from early Pascal days) and can have arrays where the array index range is a previously defined subrange of integers, or maybe even of an enumerated type.

Then you have sets as a built-in type, which is big.

Some more.
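
For instance (a minimal sketch with made-up names):

  program SubrangeDemo;

  type
    TDay     = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);
    TWorkday = Mon..Fri;                    { subrange of an enumerated type }
    THour    = 0..23;                       { subrange of integers }
    THours   = array[TWorkday] of THour;    { the index range is the subrange }
    TDaySet  = set of TDay;                 { built-in set type }

  var
    Hours: THours;
    FreeDays: TDaySet;
  begin
    Hours[Mon] := 8;
    FreeDays := [Sat, Sun];
    if Sun in FreeDays then
      WriteLn('weekend: ', Hours[Mon]);
  end.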


They are pretty much everywhere when statically allocated.


well, not in c.


Even in C, citing the standard (6.2.4, 20):

> Array types are characterized by their element type and by the number of elements in the array.


I believe they are in C, but arrays degrade to pointers when you look at them wrong, and _those_ don't have length as part of their type.

At least I'm seeing: `array type 'int[3]'` in messages when I tried to do something wrong with it quick to get an error message (using llvm), maybe that's a compiler specific thing.


> when you look at them wrong

no, almost whenever you look at them. it is very hard to carry around the size of an array in C.

> At least I'm seeing: `array type 'int[3]'` in messages when I tried to do something wrong with it quick to get an error message (using llvm), maybe that's a compiler specific thing.

post some code that illustrates what you are talking about


I was trying to get clang to throw an error that included type information to double check what it thought the type was. Quickest thing I could think of was trying to assign to an array.

The exact error in question was: error: array type 'int[3]' is not assignable

Which seems to imply that clang is considering the length to be part of the type.

edit: You asked for code (my apologies that it's rather trivial).

    int x[3] = {1, 2, 3};
    int y[3];
    y = x;


> but Pascal’s type system was pretty solid

To be honest, this is a matter of styles, coming from a dynamic type background I don’t appreciate strongly typed systems as an advantage.


Strongly typed is orthogonal to dynamic and static typing. Python and Common Lisp are both "strongly" typed and dynamically typed. There's no reason to shun strong typing if you also like dynamic typing.


I would be curious to see references that claim that dynamic and static typing are orthogonal to strong typing, as "strongly typed" is rather ambiguous and the only reason I used the term was because that was how Pascal was promoted back in the day (or at least how it was taught to me).

From Wikipedia: In 1974, Liskov and S. Zilles defined a strongly-typed language as one in which "whenever an object is passed from a calling function to a called function, its type must be compatible with the type declared in the called function."

Note that the definition refers to type declaration, both being optional in Python and Common Lisp, so I wouldn't use either as an example of a strongly typed language.


"whenever an object is passed" clearly refers to run-time. An object is not being passed when we are compiling the function call.

"declared in the function" clearly means that the function has an internal type check.

An interface declaration (Modula-2 interface file, C header file with prototypes) is not "in the function"; it's compile-time meta-data about a function.

A function call between separately compiled translation units has no idea what is in a function.


You already found the wikipedia page, try reading it. See where it puts a lot of dynamically typed languages under the category of "strongly typed". You've decided it's a good enough source apparently.

But really, I used "strongly" in quotes deliberately. It's a terrible phrase since it means nothing in practice because it can mean too many things (as that same page notes) that often are at odds with each other.

> Note that the definition refers to type declaration, both being optional in Python and Common Lisp, so I wouldn’t use either as an example of strongly type languages.

Even this definition would potentially exclude SML and OCaml where types are inferred, not declared. So according to you those two languages are weakly typed? I think a lot of people would be surprised to learn that.


Dynamic and static merely point to the fact that type analysis is either done at runtime or at compilation time.

It is a concept unrelated to which type rules the software applies, whether at runtime or at compilation time.


how can Python be strongly typed if it doesn't enforce types for declared arguments?

What is the value of this supposedly "strongly typed" Python's type system?

  class Object:
      pass

  def f(arg:int):
      print("type of arg = ", str(type(arg)))

  f(1)
  f(666.0)
  f("kek")
  f(Object())

  type of arg = <class 'int'>
  type of arg = <class 'float'>
  type of arg = <class 'str'>
  type of arg = <class '__main__.Object'>

and not a single error/warning thrown


How is OCaml strongly typed if it doesn't have declared types?!?!? (EDIT: In case it's not obvious, I'm being sarcastic, I'm pretty sure some people in this discussion won't get that though.)

  let f x = x + 1;; (* What's the type?!?!? *)
Turns out that "strong typing" is a shitty phrase that people should stop using because it means too many conflicting things, and, consequently, means nothing. Static and dynamic typing have well-defined meanings, stick with those terms instead of ones that mean nothing.

But for a demonstration (compare to Perl) try this in your Python REPL:

  >>> 1 + "1"
Does it work? Probably not unless you futzed with the language implementation. In Perl it does, though. So to the extent that "strong typing" means anything, Perl is "weakly typed", Python is "strongly typed", and both are dynamically typed. It's an orthogonal characteristic of the type system and language from when type checking occurs.

----------

EDIT: BTW, formatting code blocks on HN is really easy. Prefix each line of code with two space characters.

  __Replace those _'s with spaces
The result is much cleaner than your comment:

  def foo(x):
    return x + x
No extra newlines needed, more compact, easier for most people to read.


people on HN often claim that Python is "strongly typed" while PHP is loosely-typed, but I don't see the difference honestly. both are pretty loose.

re: 1 + "1"

I didn't really get your point. My reply was to counter the claim that Python is supposedly "strongly typed", and I don't understand how this strong typing helps developers. I know that languages can infer types, which is a tangential subject. I don't know why you brought this up.


> I don't see the difference honestly. both are pretty loose.

There's a clear difference, in PHP 1 + "1" is 2, in Python it's a TypeError, (and as a bonus, in Javascript 1 + "1" is "11").

The definition of "strongly typed" being used is related to type coercion, not type inference. In PHP the string is being coerced to an integer, but Python requires you to explicitly say 1 + int("1") if you want to add the numbers together. This can be helpful to developers because it requires you to make a decision about what behavior you actually want rather than assuming you want to add two numbers or concatenate strings when you may have wanted the opposite.


String concatenation is probably the most reasonable result aside from a type error given how + is used in JS for concatenation elsewhere, but JS actually gets funny.

  "1"*2 // => 2 (not "2")
  "1"*2+3 // => 5
  "1" - 2 // => -1
  "9"/3 // => 3
  "9" + 3 // => "93"
So some mathematical operators will convert string parameters to a number, but not +.


What's the difference between PHP giving "2" as the result of 1 + "1" and Python giving "999" as the result of "9" * 3?


Try that line in both Python and Perl and see how they behave, you'll see that one (Python) respects types (the most useful notion of "strong typing" is no or limited automatic type coercion and "weak typing" as an excessive permissibility around mixing of types in operations) and the other (Perl) does not.

I pointed out the OCaml example because both you and your sibling poster brought up type declarations as somehow mattering with respect to "strong typing". OCaml doesn't require type declarations, so I guess it's weakly typed according to both of you. Which is a surprising result.


I never do 1+”1” in my code, so this example is not useful to me.

I do however annotate types and expect Python to respect type annotations, which is not the case. So I don't understand what the point of annotating types is if they are not respected.

If your argument is that Python doesn't convert from one type to another - well, it doesn't need to do that if it doesn't care about types in the first place and lets you pass any junk into any method (and this is the #1 thing that a type system is supposed to prevent).

Is it only for documentation so that people reading code could understand what types to pass?


Python type annotations were added to be used by an external type checker, so no they are not enforced by the interpreter itself.

This was an explicit decision: https://peps.python.org/pep-0484/#non-goals


Python type hinting is not useful at runtime (in fact it's flat out ignored). It is useful at "linting" time when run through mypy.


Strong agree. I used the old Turbo Pascal for many years. It was quite routine to produce substantial programs where the compiler (and type system) had caught ALL the bugs.

Meaning: you used sensible types (and had sensible libraries) and once the obvious compile-time errors were fixed, there would be zero bugs found in testing or in live!

I like the flexibility of python, but I know that every substantial program has untested pathways that will fail when they come to be exercised :(


Delphi/Lazarus seems like a fairly productive environment.

Also if I recall correctly some Pascal compilers (similarly to Ada) could actually generate somewhat memory safe code with array bounds checking, which is something that C/C++ usually lacks.
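
For example, Free Pascal and Delphi can insert bounds checks via the $R directive (a small sketch):

  program BoundsDemo;
  {$R+}   { turn on range checking }

  var
    A: array[1..3] of Integer;
    I: Integer;
  begin
    I := 4;
    A[I] := 0;   { with $R+ this raises a run-time range-check error
                   instead of silently writing past the array }
    WriteLn(A[1]);
  end.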


> Unless you want a C++ with a different syntax and less features then there's no reason to learn Pascal in 2023.

For me, personally, there is a good reason to learn Pascal in 2023: the quick and almost painless ability to produce fast GUI applications.

The alternatives are not nearly as polished or as nice.


I recently did a conversion of a Delphi program to Java with some C++, and this is a strange argument to use. Pressing F9 at any point in Delphi and having your program run a second later is incredible; that, plus catching the vast majority of errors at compile time, certainly does not slow you down.

Speed-wise, on this last conversion project with the same first-level optimizations, GCC is a good 50% faster; the Java parts feel about the same or a bit slower though. Disclaimer: I never go to extreme optimization settings on compilers - I was bitten several years back and am happy to not go cutting edge.

Also Pascal has a much less toxic community for noobs. I got to the point about ten years back where I would rather not ask a question online for GCC, due to the fact I would be branded stupid for not knowing some obscure linker setting or suchlike.

I'm at the end of my career now and happy Pascal was part of it, especially in the early Turbo Pascal DOS days. I doubt I would have carried on at the time if I had been forced to use C (I later did and it wasn't that bad, but I had to scratch my head at the time at how people thought it better).


> you can't even declare variables in the middle of a function

You can since Delphi 10.3.

https://en.m.wikipedia.org/wiki/History_of_Delphi_(software)


Seems like you haven't been using the Object Pascal dialects of Delphi or PascalABC, which let you declare variables everywhere. In the case of Delphi, Embarcadero has been making it as easy as possible to switch between their Delphi and C++Builder products lines.

For Free Pascal, the developers and many users voted against that, because it appears the opinion was that it leads to sloppy code and unnecessary errors. The other point is that Object Pascal has for a very long time allowed nested functions, so arguably it wasn't that much of an issue for those on that side. It seems more like C/C++ users wanted to force their habits on Pascal, which Embarcadero sold out to for a few extra dollars. PascalABC copied what Delphi was doing, which they use as a quasi-standard.


I never understood why declaring variables in the middle of the code is a "pro" for some. In my experience it's a "con", especially because of maintainability. The same way you write a comment before declaring a function to explain its purpose, you write a comment for each variable to declare its goal within said function. That's proper maintainability for a project.

Also Delphi has had this "middle" variable declaration style for the past 2 years, and I hate it, never used it. So your comment missed the mark. If you want to advocate "don't learn Pascal in 2023" please do more research and come up with a better argument.


Like dynamic typing and other things, declaring things in the middle of your code block seemed like a great idea to young me, but as I’ve gotten older, things like organizing variable definitions, strict use of types (or the kind of sucky type hints in Python), and other kinds of self-documenting declarations have become much more appealing to me. It might save “real work” from happening if you measure by lines of code, but you pay that price later when debugging or reading code you haven’t touched in a long time.


A clear example of why it is a pro is the index of a for loop. Why would it be more maintainable to declare all for loop indexes at the beginning of a function? They are basically bound variables that are only meaningful within their loop, and they are usually not even worth commenting.


More often than not, when maintenance really matters and juniors really choke, the algorithms that power core functionality usually have nested loops and conditionals. So if you start using inline variable declarations you really have a hard time figuring out "where was this declared". Combine that with compiler switches and you have a recipe for disaster when declaring inline variables vs. a clear section of variables right after the function header, where you put all the comments specifying what those variables do.

Remember, the initial phase of a project is only 10% of it; maintenance and expanding its functionality are the other 90%. Inline variable declarations are a real PITA for that phase.


> I hate it, never used it

You should at least use it in for loops. It's a useful improvement, and especially handy in combination with type inference. For example:

  for var i := 1 to 10 do
    Inc (Total, i);
i is declared inline, it's only in scope inside the for loop, and the type is inferred from the bounds of the loop. You could of course use "for var i : Integer := 1 to 10 do" if you wanted to be explicit about the type.

Documentation: https://docwiki.embarcadero.com/RADStudio/Alexandria/en/Inli...


Please do another "for" right after this one, identical, and then revisit this comment (hint: you'll get a syntax error).


> hint: you'll get a syntax error

No. You won't. You didn't even try it, did you.

Read the documentation, use inline variables, and be happy.


defining all variables as near to their point of use as possible is a pretty obviously good thing, for readability if nothing else.


>"And the syntax is much worse, you can't even declare variables in the middle of a function it has to go before the function."

I wish people would bother to check the facts before posting. It's been available for quite a while along with many more modern features.


> you can't even declare variables in the middle of a function

losing time with "static declaration follows non static" in C programs doesn't help either.


> you can't even declare variables in the middle of a function

Yes you can:

https://docwiki.embarcadero.com/RADStudio/Alexandria/en/Inli...


C used to be that way too, but Pascal stopped evolving when it fell out of popularity.


Why I prefer brainfuck.


The first word they use to justify using Pascal is that it's "modern".

I glanced at the source code screenshot. No, that syntax is very much of the past.


The syntax is timeless, because it is much closer to math than B/BCPL/C heritage.

It has its own issues but overall it is much better thought out than most other languages.

Here is a comment I wrote a couple of months ago with more arguments for Pascal's syntax:

"Yes, Rust does indeed and a long time before that it was Pascal. I really love Pascal's syntax, it makes a lot of sense when you approach it with a math background.

- '=' is for equality only

- assignment is ':=' which is the next best symbol you can find in math for that purpose

- numeric data types are 'integer' and 'real', no single/double nonsense

- 'functions' are for returning values, 'procedures' for side effects

- Function and procedure definitions can be nested. I can't tell you what shock it was for me to discover that's not a thing in C.

- There is a native 'set' type

- It has product types (records) and sum types (variants).

- Range Types! Love'em! You need a number between 0 and 360? You can easily express that in Pascal's type system.

- Array indexing is your choice. Start at 0? Start at 1? Start at 100? It's up to you.

- To switch between call-by-value and call-by-reference all you have to do is change your function/procedure signature. No changes at the call sites or inside the function/procedure body. Another bummer for me when I learned C.

Pascal wasn't perfect but I really wish modern languages had syntax based on Wirth's languages instead of being based on BCPL, B and C."

https://news.ycombinator.com/item?id=32983878
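
For the call-by-value vs call-by-reference point, a minimal sketch of how only the signature changes:

  program ParamDemo;

  procedure Bump(var N: Integer);   { call-by-reference }
  begin
    Inc(N);
  end;

  procedure Show(N: Integer);       { call-by-value }
  begin
    WriteLn(N);
  end;

  var
    X: Integer;
  begin
    X := 1;
    Bump(X);   { the call site looks the same either way }
    Show(X);   { prints 2 }
  end.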


> numeric data types are 'integer' and 'real', no single/double nonsense

Actually in Free Pascal (and AFAIK Delphi) there are Single and Double (and Extended) that map to 32-bit and 64-bit floats (Extended depends on the target CPU, e.g. for 32-bit x86 it is 80-bit floats but for 64-bit x86 it is 64-bit floats), since you actually do need to differentiate between the two in practice.

> 'functions' are for returning values, 'procedures' for side effects

FWIW even in Turbo Pascal (i don't remember which exact version) functions could also be used as procedures (the return value was ignored). While in theory separating the two sounds nice, in practice it is often useful to be able to ignore function results.

> Array indexing is your choice. Start at 0? Start at 1? Start at 100? It's up to you.

One additional neat bit is that you don't even have to use numbers, any ordinal type will work. Enums are ordinal types so you can do "type Foo = (Bar, Baz, Bad); FooArray = array [Foo] of Integer;" and then use "Bar, Baz, Bad" to access the array. You can use ranges too.

> To switch between call-by-value and call-by-reference all you have to do is change your function/procedure signature. No changes at the call sites or inside the function/procedure body. Another bummer for me when I learned C.

FWIW i prefer the C# approach of being explicit when you pass something as a reference since it makes it obvious on the call site just by reading the code.

Beyond these i agree with your comment.


When it comes to real vs float I should not have called the latter nonsense. I agree that both have their place, depending on whether you want to express things at a lower (closer to hardware or wire protocol) level or more abstractly. It is still sad that languages like C (and even Rust) only offer the lower-level types.

What you said about functions and procedures is also true. I still think that it is valuable to have a distinction syntactically, even if they are relatively similar under the hood. Maybe one day we will have a Pascal compiler that can enforce that functions are side-effect free...


> Maybe one day we will have a Pascal compiler that can enforce that functions are side-effect free...

Pure functions are actually one of the WIP functionality in Free Pascal :-). AFAIK the ultimate goal is to have the compiler evaluate them at compile time where possible.


It's only "mathematical" if you chose very specific parts of the language from a very specific version of Pascal when looked at at a very specific angle.

E.g. the argument falls apart immediately:

- there's no assignment operator in maths

- there are no procedures in maths, everything is a function

- functions cannot be nested in maths (unless I'm mistaken)

- types exist in maths, but Pascal uses a very narrowed-down and dumbed-down version of them

etc. etc.

Pascal is the way Pascal is because Wirth wanted the simplest possible language according to Wirth's own criteria that could be compiled in a single pass on a 1980s computer.

It's the go [1] of its time.

That is it. Both the syntax and Pascal's concept are severely outdated. It's a very good thing that modern languages never adopted Pascal's syntax and went for something that is actually usable. Hell, Erlang has a more mathematical and timeless syntax than Pascal, and it's a trivial language at its core.

[1] Can't find the actual text now, but there was a rationale by Rob Pike that Go was aimed at junior engineers, and needed to be simple.


"- there's no assignment operator in maths"

There is no assignment but in math we use := to express that two things are equal per definition. I never said it was the same thing, just that it is the closest thing. From a math perspective it makes total sense, while singular = for assignment makes no sense at all. Especially when you want express the concept of equality as well and cannot use the obvious choice = anymore because you already used it for something else.

"- there are no procedures in maths, everything is a function"

Exactly and that's why functions and procedures in Pascal are separate things, like it is meant to be. Functions have an equivalent in math, procedures don't. Mixing the two concepts up into the weird thing C calls a function is just wrong.

"- functions cannot be nested in maths (unless I'm mistaken)"

In some sense they can. In C they can't because of technical restrictions, no one ever was fond of that restriction, not even back in the day.

"- types is maths, but Pascal uses a very narrowed down and dumbed down version of it"

Pascal is a programming language, not math. My point is solely that Pascals syntax is superior to C's because (among other reasons) the former is closer to centuries old tried and tested and well established syntax of mathematical notation. It has some consistency and elegance and certainly flaws as well. C's syntax decisions were more driven by long gone technical restrictions than what makes sense to a human. Now we have to live with that baggage.


Edit: In his Pascal report Wirth mentions math zero times: http://pascal.hansotten.com/uploads/books/Pascal_User_Manual...

And in fact in 1971 he wrote that it was basically copied from ALGOL: https://oberoncore.ru/_media/library/wirth_the_programming_l...

Edit2: Most relevant part from second link: "The syntax has been devised so that Pascal texts can be scanned by the simplest techniques of syntactic analysis". That's it.

On to other comments which are basically relevant given Wirth's own words:

> but in math we use := to express that two things are equal per definition.

Then it isn't variable assignment. It's what you pretend it is because Pascal defined variable assignment this way, and now you're trying to find an angle for which "Pascal is closer to math" works.

When we write "x = f(y)" or "x = y + z where z = f(t)" in maths there's no confusion as to what this expresses. No need for "equal by definition".

Note: Interestingly enough, Wikipedia doesn't list `:=` in its glossary of mathematical symbols [1]. And then there's another sign used for definitions: the equality sign with delta [2].

> Especially when you want express the concept of equality as well and cannot use the obvious choice = anymore because you already used it for something else.

Math also has the same problem, because equalities are not equal :)

Hence you have:

- equal

- equal by definition

- ~ has six different definitions depending on context

- ≡ has two different definitions

etc.

> Exactly and that's why functions and procedures in Pascal are separate things, like it is meant to be.

It's not "meant to be". Programming languages are not math. The distinction between functions and procedures in Pascal exists only because Wirth decided that's how it should be.

> In some sense they can.

It means that it makes no sense to pretend that nesting functions in Pascal has anything to do with math.

> Pascal is a programming language, not math.

Precisely. And yet, just two paragraphs above you argue for a distinction between functions and procedures because math ;)

> My point is solely that Pascals syntax is superior to C's because (among other reasons) the former is closer to centuries old tried and tested and well established syntax of mathematical notation.

Modern math notation didn't become "old tried and tested" until something like the 19th century, and even now it still remains somewhat fluid. And it's only closer if you arbitrarily twist definitions and meanings, like "equal by definition" is surely "variable assignment". As I said in the very first line of my original comment: "It's only 'mathematical' if you chose very specific parts of the language from a very specific version of Pascal when looked at at a very specific angle."

It's also "better than C" only for some vague defintion of "better" where "is closer to math" has no relation to either reality or to being better.

[1] https://en.wikipedia.org/wiki/Glossary_of_mathematical_symbo...

[2] https://math.stackexchange.com/questions/1289339/what-is-mea...


Yeah. weinzierl has his/her syntax preference, and that's fine. As the ancients said, "There's no disputing about tastes." But the attempt to provide a rationalization for why the taste is correct is complete nonsense.


It's not about taste, really. We all learn mathematical notation in primary school. It is a universal language. Programming languages support many concepts that are either the same as in math or very similar.

Using the symbols for the same concepts in both worlds is the obvious and sensible choice.

The reason C chose a different convention was that it preferred similarity with FORTRAN, which is so old that it was punched on cards that simply did not have a : character.

Now it is 2023, and I argue that we should use the same symbols in programming that children are taught in primary school, instead of ones that were chosen merely because, more than 70 years ago, the proper characters were not yet available.
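
To make that concrete, here is a minimal sketch of my own (program name and values are made up) showing how Pascal divides the work between the two symbols:

    program AssignDemo;
    var
      x, y: Integer;
    begin
      y := 5;
      x := y + 5;        { := is assignment }
      if x = 10 then     { = stays a pure comparison, as in school math }
        WriteLn('x equals 10');
    end.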


> Using the symbols for the same concepts in both worlds is the obvious and sensible choice.

Yeah, but as has already been established in this conversation, you're not using them for the same concepts. So your argument doesn't work. Repeating it over and over isn't going to make it work, either.

> The reason C chose a different convention was that it preferred similarity with FORTRAN, which is so old that it was punched on cards that simply did not have a : character.

I'd be interested in seeing your evidence for this assertion that you make so confidently. From my understanding, the logic was this: Assignment happens twice as often as comparison, so assignment gets the shorter symbol. Perfectly reasonable from an information-theory point of view.

And, C didn't avoid := because the proper character wasn't available. They used : in the ternary operator, so they clearly expected it to be available.

So all in all, your data is suspect, and doesn't support your argument.


"Yeah, but has already been established in this conversation, you're not using them for the same concepts. So your argument doesn't work. Repeating it over and over isn't going to make it work, either."

No, we've established that math and programming languages are different things. Both share some concepts that are close enough that using the same symbol is the obvious and sensible choice. It's not that equality in C is such a different idea that a completely unrelated symbol had to be chosen. == is still similar to the = used in math, just not the same, for technical reasons that have been obsolete for decades.

"And, C didn't avoid := because the proper character wasn't available."

I never said that. I said C chose to reuse the symbols from FORTRAN. To repeat part of the Wirth quote from above (but it's easy to find other sources):

"It goes back to Fortran in 1957, and has blindly been copied by armies of language designers."

FORTRAN was initially made for the IBM 704, which had a 6-bit character set that included =, but no : character. The original FORTRAN manual from 1956 [1] has the character table in Appendix A on page 49.

FORTRAN could not use := for what we would now consider assignment [2]. It had not yet developed equality testing, just subtraction and branching on the sign of the difference, so there was no need for an equality operator. Under these constraints, = for assignment wasn't the worst choice; the mistake was sticking with it when technology evolved.

[1] https://web.archive.org/web/20220704193549/http://archive.co...

[2] The original FORTRAN manual from 1956 never speaks of assignment in this context. The ASSIGN statement is unrelated.


> I never said that. I said C chose to reuse the symbols from FORTRAN. To repeat part of the Wirth quote from above (but it's easy to find other sources):

> "It goes back to Fortran in 1957, and has blindly been copied by armies of language designers."

That's a pretty broad brush that Wirth is painting with: all those other language designers were blindly following FORTRAN, and I'm the only one with the wisdom to break precedent. Yeah, um... that's way too broad a statement. Each language followed precedent and broke precedent in certain areas, and they all did so for what they thought were good reasons. "What people are used to" is one reason, but far from the only one. The "number of characters" reason was one I read from (IIRC) Brian Kernighan. I trust him to have a better grasp of the design logic of C than Wirth does.


Wirth doesn't mention math as inspiration because it is obvious.

Certainly he derived the syntax from ALGOL, that is no secret, but it was his choice to do so rather than invent something unconventional, as Thompson and Ritchie did.

In addition to that, I find it quite telling that ALGOL's designers were all mathematicians, while Thompson and Ritchie were an electrical engineer and a physicist, respectively.

I don't know why you put "equal by definition" in quotes as if I had invented it, or why you falsely claim that it is not listed on the Wikipedia page you referenced. It is there, with := as the symbol, in the section about equality.

If you need another reference:

":= (the equal by definition sign) means “is equal by definition to”. This is a common alternate form of the symbol “=Def ”, which appears in the 1894 book Logica Matematica by the logician Cesare Burali-Forti (1861–1931). Other common alternate forms of the symbol “=Def ” include def “=” and “≡”, the latter being especially common in applied mathematics." [1]

Sure, there are alternative forms, but := is what was taught in schools and universities in Germany and Switzerland when I was there, and I'm pretty sure also when Niklaus Wirth was.

[1] https://www.math.ucdavis.edu/~anne/WQ2007/mat67-Common_Math_...

EDIT:

In Niklaus Wirth's own words:

"A notorious example for a bad idea was the choice of the equal sign to denote assignment. It goes back to Fortran in 1957, and has blindly been copied by armies of language designers. Why is it a bad idea? Because it overthrows a century old tradition to let “=” denote a comparison for equality, a predicate which is either true or false. But Fortran made it to mean assignment, the enforcing of equality. In this case, the operands are on unequal footing: The left operand (a variable) is to be made equal to the right operand (an expression). x = y does not mean the same thing as y = x."

—Niklaus Wirth, Good Ideas, Through the Looking Glass


> - '=' is for equality only - assignment is ':=' which is the next best symbol you can find in math for that purpose

That is just syntax. I am not so sure whether it is better or worse, but there it is.

Different programming languages do have different syntax, and sometimes that can be helpful if their structure is different. However, I think the syntax for types in C is confusing when you start combining them, e.g. a function type involving an array of integers, or whatever.
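
For contrast, a rough sketch (identifiers made up, Free Pascal dialect) of how such combined types read in Pascal, where a declaration spells out "array of Integer" and "function taking a TRow, returning an Integer" left to right:

    program TypeDemo;
    type
      TRow     = array[1..3] of Integer;        { reads: array of Integer }
      TSumFunc = function(a: TRow): Integer;    { reads: function from TRow to Integer }

    function SumRow(a: TRow): Integer;
    var
      i, s: Integer;
    begin
      s := 0;
      for i := 1 to 3 do
        s := s + a[i];
      SumRow := s;
    end;

    var
      r: TRow;
    begin
      r[1] := 1; r[2] := 2; r[3] := 3;
      WriteLn(SumRow(r));   { prints 6 }
    end.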

> numeric data types are 'integer' and 'real', no single/double nonsense

It is helpful to be able to specify how many bits you want. (In the case of integers, range types (as you also mention) might help, though.)
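
For what it's worth, Free Pascal and Delphi do offer fixed-size integer types these days (the exact aliases vary by compiler and version); a small made-up sketch:

    program SizedIntDemo;
    var
      a: ShortInt;   {  8-bit signed }
      b: SmallInt;   { 16-bit signed }
      c: LongInt;    { 32-bit signed on common targets }
      d: Int64;      { 64-bit signed }
    begin
      a := 100;
      b := 30000;
      c := 2000000000;
      d := 4000000000;
      WriteLn(a, ' ', b, ' ', c, ' ', d);
    end.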

> 'functions' are for returning values, 'procedures' for side effects

It is useful in C to be able to have side effects and return values, and to be able to ignore return values sometimes. (BASIC doesn't have that, and it sometimes annoys me.)
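
For the record, Pascal dialects can also discard a result: with Free Pascal's extended-syntax switch ($X+, which as far as I know is on by default) a function may be called like a procedure. A made-up sketch:

    program IgnoreResultDemo;
    {$X+}   { extended syntax: a function call may stand alone as a statement }

    function StoreValue(value: Integer): Boolean;
    begin
      { imagine some side effect here, e.g. writing to a log }
      StoreValue := value > 0;
    end;

    begin
      StoreValue(42);   { the Boolean result is simply ignored, as one might do in C }
    end.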

> Function and procedure definitions can be nested. I can't tell you what shock it was for me to discover that's not a thing in C.

In GNU C you can do that, although a nested function that refers to variables of the function it is inside of is only valid before the outer function returns. (And you are not allowed to declare the inner function "static" to make it usable after the outer function returns, even when it does not access anything that is only valid before the return.)
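
For reference, a minimal made-up sketch of the Pascal side being discussed; the inner routine is visible only inside the outer one and can see its locals:

    program NestedDemo;

    function SumOfSquares(n: Integer): Integer;
    var
      i, total: Integer;

      function Square(x: Integer): Integer;   { nested: scoped to SumOfSquares }
      begin
        Square := x * x;
      end;

    begin
      total := 0;
      for i := 1 to n do
        total := total + Square(i);
      SumOfSquares := total;
    end;

    begin
      WriteLn(SumOfSquares(3));   { 1 + 4 + 9 = 14 }
    end.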

> Array indexing is your choice. Start at 0? Start at 1? Start at 100? It's up to you.

That is good, and it is useful. Usually I think starting at zero makes sense, but sometimes it makes sense to start at some other integer, positive or negative, so it is good to be able to specify any number.
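
A tiny made-up example of what that looks like in Pascal, with a negative lower bound:

    program BoundsDemo;
    var
      depth: array[-10..10] of Integer;   { the bounds can be any integers }
      i: Integer;
    begin
      for i := -10 to 10 do
        depth[i] := i * i;
      WriteLn(depth[-10], ' ', depth[0], ' ', depth[10]);
    end.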


> Range Types! Love'em! You need a number between 0 and 360? You can easily express that in Pascal's type system.

I did not know those existed that early.
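
A quick made-up sketch of the 0..360 case; with range checking on, out-of-range values are caught (at compile time for constants, at run time for computed values):

    program DegreeDemo;
    {$R+}   { enable run-time range checking }
    type
      TDegrees = 0..360;
    var
      heading: TDegrees;
    begin
      heading := 270;
      WriteLn(heading);
      { heading := 400 would be rejected; an out-of-range computed value
        would raise a range-check error at run time }
    end.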

> Array indexing is your choice. Start at 0? Start at 1? Start at 100? It's up to you.

Reminds me again of GNU Guile's arrays, which also allow you to specify what index an array starts with. Very flexible.


> Reminds me again of GNU Guile's arrays, which also allow you to specify what index an array starts with. Very flexible.

Reminds me of the "OPTION BASE" statement in BASIC. Found in Microsoft BASICs all the way from GW-BASIC (and probably even earlier than that) through to classic Visual Basic, VBA and VBScript (but not VB.NET). Not sure when it originated, possibly it goes all the way back to some version of the original Dartmouth BASIC (they changed the array base at some point from 1 to 0, so "OPTION BASE" enabled backward compatibility). Also found in some non-Microsoft BASICs, e.g. IBM System/34 BASIC. ANSI Full BASIC (did anyone ever implement it?) supported arbitrary array bases, not just 0 or 1, e.g. "DIM A(100 TO 200)"


Why? I mostly write Ruby and the code in the screenshot doesn't offend me at all.


The syntax that has influenced Kotlin, Scala, Rust, TypeScript, and the various MLs.

Yeah, really old-fashioned.


That could just be C-style prejudice.


What makes syntax "modern"?

Power matters more than style.


All imperative languages are essentially descended from ALGOL, but in the case of Pascal, it also kept its original syntax.

Others have moved on.


Yes they have "moved on".

Which is why ALGOL was "a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors." — Tony Hoare


- no struct/record type

- call by name instead of call by reference

- no bitwise operators

- probably a lot more, I'm not too familiar with the language

The quote may have been closer to the truth at some point, but certainly not today. And while begin/end syntax may be a matter of taste, I don't think it's a coincidence that it's not very common anymore.


Of course it's not true today [1]. That's a 70s or 80s quote.

But still, "moving on" is not always "improving upon" was the point.

[1] Well, mostly not true (the author has backed down somewhat, but the post is still illuminating as a comparison):

http://cowlark.com/2009-11-15-go/


Too good a quote :)


The syntax is a lot more readable than Rust or C++.


Misread this as Rascal and thought it would be about Turbo Rascal Syntax Error.


As you can see in the comments, Pascal draws a significant negative reaction.

I would meditate on that fact if I were a Pascal evangelist, instead of reposting such texts over and over as the years go by.



