blindseer's comments | Hacker News

Having had to work in a large pascal codebase, I don't ever want to use this language again.

No metaprogramming in the language meant we had some parts of the code that were written in pascal to generate pascal code before compilation of the main project.

The code was so verbose, and getting the same thing done in alternative languages would have easily taken half as much code.

Refactoring always took twice as long as I thought it would. Just moving semicolons around was enough to break my concentration when I was in the flow.

I think the kicker was identifiers being case insensitive. That alone would have been enough to drive me crazy. People complain about Nim's case insensitive features a lot, but Nim's implementation is actually good and orders of magnitude better than Pascal's.

Also hiring a good pascal programmer was next to impossible for us at the time.

I don't know why anyone would pick Pascal today over Nim, Zig, Rust, Julia, Go etc.


> No metaprogramming in the language meant we had some parts of the code that were written in pascal to generate pascal code before compilation of the main project.

Weird... if you are talking about Generics, FreePascal has them and they work just fine, plus it compiles instantly!

Here's a demo:

    program UseGenerics;

    {$mode objfpc}{$H+}

    type
        generic TFakeClass<_GT> = class
            class function gmax(a,b: _GT): _GT;
        end;

        TFakeClassInt = specialize TFakeClass<integer>;
        TFakeClassDouble = specialize TFakeClass<double>;

    class function TFakeClass.gmax(a,b: _GT): _GT;
    begin
        if a > b then
            result := a
        else
            result := b;
    end;

    begin
        { show max of two integers }
        writeln('Integer GMax: ', TFakeClassInt.gmax(23, 56));
        { show max of two doubles }
        writeln('Double GMax: ', TFakeClassDouble.gmax(23.89, 56.5));
    end.


> > No metaprogramming in the language meant we had some parts of the code that were written in pascal to generate pascal code before compilation of the main project.

> Weird…if you are talking about Generics

But aren’t they very clearly talking about metaprogramming, which is a completely different thing than generics?


Yes, but this example of matrix multiplication does weird stuff:

    {$MODE DELPHI}
    {$modeswitch advancedrecords}
    type
      TMatrix<T, R, C> = record
        fCoordinates: array[R, C] of Double;
      end;
    function mul<T, R, X, C>(A: TMatrix<T, R, X>; B: TMatrix<T, X, C>): TMatrix<T, R, C>;
    var
      i: R;
      j: X;
      k: C;
    begin
      Writeln('yep');
      Writeln(High(R)); // 3
      Writeln(High(X)); // 4
      Writeln(High(C)); // 3
      for i := Low(R) to High(R) do
        for k := Low(C) to High(C) do begin
          Result.fCoordinates[i, k] := 0;
          for j := Low(X) to High(X) do
            Result.fCoordinates[i, k] += A.fCoordinates[i, j] * B.fCoordinates[j, k]
        end
    end;
    type
      R1 = 1..3;
      R2 = 1..4;
    var
      A: TMatrix<Double, R1, R2>;
      B: TMatrix<Double, R1, R1>;
      C: TMatrix<Double, R2, R1>;
    begin
      mul<Double, R1, R2, R1>(A,B); // should fail because B would have to have as many rows as A has columns--and it doesn't.
      mul(A,C) // does not work even with a right-size C--even though it should
    end.


The second bit is because you lack the explicit specialization - you must always tell the compiler how to specialize a generic. Personally I prefer to use generics in objfpc mode as you are more explicit that way. However, FPC trunk can allow implicit function specialization if you request it with {$modeswitch implicitfunctionspecialization}. Your code compiles with that.

The first bit is an interesting case and I wonder why it happens. It seems the root cause is that the compiler considers the two type specializations equivalent: even ignoring the generic functions, something like "A:=B" is allowed. I trimmed the code down to this:

    {$MODE DELPHI}
    type
      TMatrix<R, C> = record
        X: array[R, C] of Double;
      end;
    type
      R1 = 1..3;
      R2 = 1..4;
    var
      A: TMatrix<R1, R2>;
      B: TMatrix<R1, R1>;
    begin
      A:=B;
    end.
This is clearly a bug and not intentional behavior, as doing the specialization by hand shows an error as expected. Also it seems related to multidimensional arrays, as with a single element it shows an error as expected. Note that the other way (B:=A) doesn't work, so my guess is that at some point the type compatibility comparison only checks whether assigning one array would fit memory-wise into the other.

I'll check against latest trunk later and if it happens I'll file a bug report.


Thank you!

I also tried

    {$R+}
    type
      R1 = 1..3;
      R2 = 1..4;
    var
      a: R1;
      b: R2;
    begin
      b := 4;
      a := b;
      Writeln(a)
    end.
... and that only gives me a runtime error. So maybe those types count as equal anyway?


The types themselves count as assignable since the values can be compatible, and unless the compiler can tell statically that a value is out of range it won't complain at compile time (but will at runtime if range checking is enabled).

However, arrays indexed by different ranges are not compatible: if you have A: array [R1] of Double and B: array [R2] of Double, you can't assign one to the other.


Generics are a very recent addition, though.


Free Pascal has had support for generics since 2006, predating even Delphi's. While that isn't as long as Wirth's Pascal, it still is almost two decades old, which I wouldn't call recent.


From the point of view of someone who has worked on MLOC delphi projects for seven years:

* The language has evolved. Delphi's pascal compiler has generics, lambda expressions for both proc and function, it has containers, map and filter functions.

* Design-time: You know how your code will behave without running it, as you are dragging and dropping components. It will draw data from your database and show it in dropdowns and grids. You have fine-tuned control over widget alignment and anchoring.

* Data modules: Having your ORM and data layer as composable components enforces separation of UI and DB operations.

* Devexpress: They make the best grid components by far. I use their web products as well.

For the cases where Delphi is a great choice, I'd steer as far clear as possible from Julia, Zig, Go and Nim. Maybe C# comes close.


Why steer clear of Julia, Zig, Go and Nim?


Because I think Delphi is a better choice if you want to program a desktop application with design time support and a decent nested grid.


> I don't know why anyone would pick Pascal today over Nim, Zig, Rust, Julia, Go etc.

When writing desktop apps probably the familiarity with well known IDEs such as Delphi or Lazarus makes the difference. Unfortunately decades passed, still nothing comes even close to them for rapid GUI development. Should one day Lazarus support different languages such as the ones you mentioned, we'll probably hear a rumble around the world. However things in certain fields progress much quicker, and I wouldn't be surprised at all if in some time we could use AI to analyze a drawing then create the corresponding desktop interface description to be linked with non GUI code.


Lazarus is a compelling reason to pick Pascal. But I share your pain and had been looking for alternatives for machine-code-compiled RAD GUIs. To add, only modern Delphi supports var declarations placed somewhere other than at the top of the function.


Actually, the PascalABC dialect[1][2] supports using variables in that way too. However, I've seen arguments that allowing variable declarations anywhere tends to lead to sloppier coding and unnecessary errors, so Free Pascal/Lazarus has not followed that trend. It also appears that Delphi/Embarcadero may have gone in that direction to synchronize their C++Builder and Delphi products, to make it easier to jump between them.

1. https://pascalabc.net/en/ (PascalABC) 2. https://github.com/pascalabcnet/pascalabcnet (PascalABC GitHub)


PascalABC seems to only target .NET though.


I don't write desktop apps these days... but if I have to, my #1 choice is easily Pascal + Lazarus.

For non-GUI apps, sure probably Nim, Rust or Go.


If I had to do a desktop app I'd go with Java Swing because of the Java backend libraries and developers. Were I to ignore those factors I'd go with Gambas for the drag & drop GUI.


I still write Java code... for Android, though.

When was the last time I coded in Swing/JavaFX? 4 or 5 years ago, perhaps. The 3rd party libs available for Delphi/FPC are more than enough for my needs.


> Having had to work in a large pascal codebase, I don't ever want to use this language again.

Maybe Pascal isn't good for large code bases, but the same is true for Python, too. And nobody hates Python.


I hate Python, and I've met others who feel the same.

It's a language which had a reason to exist 25 years ago, but has since been surpassed in every way by other faster, more robust, more compatible languages that don't have such quirky syntax. So a lot like Pascal actually. (To be fair to Pascal, its syntax wasn't anything as inconsistent and frustrating as Python's.)


Python has become the lowest common denominator. It's a practical "no-frills" language that can be picked quickly by a sysadmin, a web developer, a scientist doing number-crunching or a kid automating their homework.


The problem with the “no-frills” proposition is that it’s loaded with frills, complexities, and decades of tech debt. Just installing dependencies for a Python project can feel insurmountable.

IMO JavaScript should be the default for any Python use case. It has the same deceptive veneer of beginner-friendly dynamic behavior, same kinds of footguns; but at least JS has a single package manager, much faster optimized engines, and you’d have to learn it for front-end work anyway so it has long-term dividends.


Having used both Python and JS, the only real benefit the latter has is that it may be faster.

The language has way too many gotchas. I wish they could shed all its legacy aspects. My experience with Python, along with that of many other folks, is that if we don't know something (API, syntax, etc.), we often just guess and it turns out to be right. Fairly intuitive.

Agree with the other commenter: Using pip + venv tends to solve the majority of packaging problems. In my career I deal with a Python dependency headache once every few years.

> and you’d have to learn it for front-end work anyway so it has long-term dividends.

Except when you don't do front-end work ;-)


Poetry does a good job for package management


One of the things I really like about Python is the huge standard library, contrary to JavaScript (which I think is a really weird recommendation), where you need to npm install half the world.


It wouldn't matter if those dependencies were stable, but JS has a culture of abandoning libraries or making backward-breaking changes every week, so upgrading a project after just a few months is a major pain.


That's what I learned when I had to do front-end development using Angular: never upgrade packages.


And you never know if package Y required by package X will break the security of your app.


I've learned Javascript and a framework or two and I hate it with passion.

I patiently wait for the moment we can manipulate the DOM from within WebAssembly. Meanwhile I shamelessly push for Blazor for front-end, wherever I can.


Python is absolutely not a "no-frills" language. It is loaded with features; I'd be willing to bet that 99% of Python programmers don't know all the language features (even relatively old and frequently used ones like metaclasses) and that virtually nobody knows the whole stdlib.


Additionally if one cares about performance, knowing C or C++ is also a must.


When was the last decade someone cared about performance?


Apparently everyone that writes native libraries to be called from Python, while calling them Python libraries.


> It's a language which had a reason to exist 25 years ago, but has since been surpassed in every way by other faster, more robust, more compatible languages that don't have such quirky syntax.

The same could be said about C++, and yet, here we are. Stroustrup's quip comes to mind: "There are languages people complain about, and languages no one uses". Just look at how much people complain about JS.

I don't know how old people here are, but from the early 2000's till probably the mid 2010's, Python really was the great language, with few alternatives - at least when you consider the libraries available for it.

It's not an accident that people began using it heavily for numerical computing. The only viable alternative at the time was MATLAB.


I don’t understand how programming languages can get so much hate. If you don’t like it, don’t use it…


I think this sentiment underestimates the amount of work people do on existing codebases.

The most reviled languages are often previously loved languages that made writing code easier at the cost of reading and understanding it.

Team 1 secretes reams of instant legacy code, but gets money and promotions for shipping it. Team 42 gets handed a steaming pile of manure that is beyond understanding. They then pick up the pitchforks because it is obviously the language's fault.

So, yeah, it isn't the language so much as the average incentives and behaviors around development.


> The most reviled languages are often previously loved languages that made writing code easier at the cost of reading and understanding it.

You claim someone actually loved Objective C or Perl? Or even Scala?


Hahah. I like Objective-C with ARC (in spite of its complexity and limitations), but Swift is more compact and I'm probably never going back (except maybe for hobby projects like writing something for GNUstep.)

I also like how Objective-C++ can mix in C++ code bases. It's an amazing chimera of a language.


That's funny because to me, Perl, is the epitome of a fun write-only language!


I'm sure at least their "parents" did.

$tongue in @cheek


> You claim someone actually loved Objective C

The problem with Objective C is that it requires skill. The average developer really is better off using a more remedial language like Swift.


Most professional programming is done in already existing code and new projects often have to respect existing standards. You rarely get to choose what language to use.


I was recently asked to recommend a language for a new team. My boss (who won't be writing or even reading any of the code) suggested Python. His reasoning: it's what the kids are learning in college these days.

So here I am as the team Senior, looking at current and future team members and deciding on whether to be selfish and choose what's good for me, or trying to figure out what makes something good for the team. Is it better to pick something with a large hiring pool? Is it better to pick something with fewer footguns? Is it better to pick something with higher performance? Is C# really ok if we need Mac support? Is Swift ok for one-off glue apps that run on Linux? Is Zig too young? If a team of five has a Java geek, a Rust zealot, a C# fan, and somebody who only knows Python, can we just agree to all learn and use Go? If I know my replacement will undo whatever I choose, does it really matter?


The safe bet is C#, followed by Go. Swift, Rust, Zig, Python have their problems.

I love Zig and Nim but constantly pick C# or Go.

If time, money, support and performance are not an issue, you can pick anything, but I guess one of the above will always be an issue.


Unless you rewrite it in Rust and link an article on HN.


Surely you can appreciate that the language you're using isn't always your choice.


Not always, but most of the languages I have used professionally were my choices.

That means x86 assembly, C, C++, C#, Python and half of JavaScript. For JavaScript I mean half because I enjoyed doing simple things in JS on the front-end a long time ago, but these days I kind of dislike JS frameworks, and I've learned them only because I had to after I signed some contracts. I could have stayed 100% with backend work, but now it's too late to complain.

I managed to steer clear of Perl, Objective C, COBOL and other languages I don't enjoy. I managed to learn F# and some OCaml, which I would like to use but never got the chance to (personal projects excluded).


Yes, I do understand that, having programmed in business environments with legacy code for over a decade.

I've hated the poor decisions my predecessors had to make, the corners that were cut. I've hated that it's not maintained or documented. I've hated that it doesn't have tests.

Never have I hated the language.


And that sometimes, you make the wrong choice, and grow to hate that language.


> but has since been surpassed in every way by other faster, more robust, more compatible languages that don't have such quirky syntax

Pray tell, which ones are those?


Excelscript


ISO Pascal isn't, but Object Pascal and Modula-2 definitely are.

Unfortunately most complaints about Pascal keep being reduced to that 1970 version of the language.


Or the seemingly lost successor, Oberon. An entire graphical OS and compiler were written in it, and completely documented, in a book: "Project Oberon: The Design of an Operating System and Compiler" [0]

[0] https://people.inf.ethz.ch/wirth/ProjectOberon/PO.System.pdf


Do you consider Modula-2 to be a Pascal?

But yes, modern stuff should not be judged by the 1970 edition. I got soured on Pascal by having to use an ISO standard compiler on an embedded system; it was painful in several areas. But I shouldn't judge modern C by pre-ANSI C, or modern C++ by cfront. So, while I still say that ISO Pascal was deeply flawed, I shouldn't hold that against all Pascal versions.


Niklaus Wirth considers Modula-2 an evolution of Pascal, so there is that.

https://dl.acm.org/doi/10.1145/1238844.1238847


Totally agree. Various people seem to be out of touch with modern developments, misinformed, or are being disingenuous (really as advocates for a competing language). A lot of the discussions surrounding Pascal are as if Object Pascal (and its dialects), Delphi, Lazarus, or Oxygene (RemObjects) don't exist or never happened.


Except for anyone who has worked in a large Python code base.

I would take almost any other language, including Pascal, over Python for anything beyond a page of code.


Python is the Settlers of Catan of programming languages. Most people have a different favorite game they would rather play, but enough people don't hate it that you end up playing it anyways.


I don't hate it, but it's just mediocre. Python is fine for short scripting as long as you don't need dependencies (which still remain rocket science in the Python world); once you do, it's better to look elsewhere.


Dependencies in Python have been sorted for years. Pip works great for install and pip-tools compile works great for locking. Please stop spreading this untruth.


The existence of poetry, hatch, pdm, etc. (and lately rye) is proof that "just use pip-tools" isn't consensus.


It is a current truth that there are modern apps of interest to me written in Python that I can't install because I need some special dependency manager or environment manager or something and it looks like I will do serious harm to my system environment if I follow the installation instructions.


I'm not sure I understand you. If the project uses something "special" like poetry, hatch, pdm, etc., you only need to install those for development. If you simply want to use the project, they all should be installable directly with pip, as they all have pep517-compatible backends.

Even then, how can installing those tools harm your system? You can always use pipx and have each tool be installed in its own virtual environment.


My biggest problem with pip-compile is that it's really slow. Pip-compile runs can take upwards of 20 minutes, depending on how many dependencies your project has.

We use it because we don't have any alternative. I'd really love to have a faster tool though.


Nobody hates python?

The more I use Julia, the more I hate Python. Which is a lot by now.


Yeah, nobody hates Python


> People complain about Nim's case insensitive features a lot, but Nim's implementation is actually good and orders of magnitude better than Pascal's.

Can you elaborate on this? In what ways specifically is it better?


Go is basically Pascal with a sprinkle of first class concurrency anyway…


One of Go's designers, Rob Griesemer, was a PhD student of Niklaus Wirth. Hmm...


Go's method syntax is taken from Oberon-2, just as the unsafe package serves pretty much the same purpose as SYSTEM.


And, if I understand correctly, Go stole its object file format (or at least large-scale design) from Modula-2.


That I doubt, given Modula-2's package design, how symbols are made public, and the type system differences.

If you mean packages as modules, other languages predate Modula-2 in that matter.


No, I mean the object file defines the public symbols at the start of the file. Go very deliberately stole that feature. Rob Pike said that using that idea let Go do something faster when compiling, though I don't remember the details.


There is some embellishment in that statement, but people coming from Pascal/Oberon would likely feel much more comfortable using Go than other C-family languages. This would also include Vlang and, to an extent, Odin as well. These two have both been heavily influenced by Go and Pascal.


> I don't know why anyone would pick Pascal today over Nim, Zig, Rust, Julia, Go etc.

Pascal evolved into Modula-2 and then Oberon, which is both smaller and more powerful.

https://miasap.se/obnc/oberon-report.html


Go also lacks metaprogramming, right?


Yes, but there is decent support for code generation. E.g., you can easily get the syntax tree of a Go program.


Maybe the tools aren't there but there's no reason you couldn't parse Pascal the same way. It's not a complicated language.


FWIW Free Pascal's FCL has the fcl-passrc package that provides units for scanning FPC code, building syntax trees, resolving identifier references and writing/formatting source code.

Free Pascal comes with pas2js, which "transpiles" Free Pascal code to JavaScript and is written using fcl-passrc, so it should have enough functionality to parse most FPC code.


Go doesn't either, but parsing and the AST are in the standard library, and code generation is in the standard build process. All it requires is an (ugly) comment line in your source code. So you've got decent and standard tooling. No make magic required.


Because none of those alternatives have anything close to Lazarus.


This is only going to make it easier for people like crypto scammers to boost their activities. Just think about the people that are going to want to pay to have their voices prioritized, and have it be worth it.

Any influencer that wants to sell you ads, any organization that benefits from you buying into their product, any scammer that can trick you into parting with your money: all those people are going to want to pay for this and will be rewarded for doing so.

Meanwhile, I struggle to see why the people that generate actual good discourse (imo I guess), the scientists, the engineers, the writers, the thinkers, etc, would ever consider paying for this.

I'm sure there's a massive bot problem, but couldn't that have been dealt with in different ways? Getting people to pay to boost their tweets as a value-add for the subscription really devalues the platform.


I’ve never had any trouble understanding my code but understanding someone else’s multiple dispatch code has ALWAYS been hard for me.


> I like named parameters for self-documenting code. Julia allows named parameters but naming them doesn't provide any flexibility with regard to their position when calling the function.

I don't understand this complaint? "Named parameters" i.e. keyword arguments in functions work great in Julia!

    julia> foo(;x = 1, y = 2, z = 3) = x + y + z
    foo (generic function with 1 method)

    julia> foo(; z = 4, y = 0)
    5
Perhaps the author meant default arguments?

    julia> foo(x = 1, y = 2, z = 3) = x + y + z
    foo (generic function with 3 methods)

    julia> foo(3)
    8

    julia> foo(0, 0, 0)
    0
> this flexibility does tend to cause bugs working with other people's code. This is compounded by Julia's pursuit of composability. If you cram your custom data value into someone else's function, and if it seems to be the right shape, it will probably work! Unless it doesn't. In that case you just get silently wrong answers. This is a deal-breaker for a lot of scientific computing people where Julia would otherwise shine.

This is almost entirely mitigated by most people using `eachindex` or `axes` or other functions that make the array related code index agnostic. The only reason I say almost entirely is because there's probably some really old code that doesn't work the right way and would silently fail or do the wrong thing if you changed the indexing convention. That said, calling this a "deal-breaker for scientific computing" seems extreme.
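
To illustrate, here's a minimal sketch of index-agnostic code (OffsetArrays.jl is assumed here purely to demonstrate a non-1-based array):

    # Sums any AbstractArray regardless of its indexing convention.
    function mysum(xs::AbstractArray)
        s = zero(eltype(xs))
        for i in eachindex(xs)   # iterates the array's own indices, whatever they are
            s += xs[i]
        end
        return s
    end

    using OffsetArrays                # assumption: OffsetArrays.jl is installed
    a = OffsetArray([1, 2, 3], -1)    # indices are 0:2 instead of 1:3
    mysum(a)                          # 6; a hand-written loop over 1:length(a) would throw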

> I also find Julia's errors to be fairly obtuse. And I see a lot of them because dynamic typing means the tools can't catch most errors before run-time.

I 100% agree. Julia errors are my biggest gripe with the language at the moment.

> But the big dealbreaker with Julia is it only compiles on the fly. That means you can't just hand a compiled executable to someone. You must give them instructions to install Julia and all your dependencies first. That's no good for anything you want to ship like a desktop app or a game. You also don't want Julia on your server recompiling with every web request.

There are packages like PackageCompiler that work pretty well. And I'm positive in the near future (3 years?) we'll have a version of Julia where PackageCompiler will produce small binaries. That said, the convenience of writing in Julia and shipping a precompiled app is pretty awesome, and I personally don't mind it taking more space on disk.
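
For the curious, a rough sketch of that workflow (assuming PackageCompiler.jl and a hypothetical package MyApp that defines a julia_main() entry point):

    using PackageCompiler    # assumption: PackageCompiler.jl added to the environment

    # Builds a self-contained bundle: an executable plus the Julia runtime and
    # compiled dependencies. It works today, but the output is large on disk.
    create_app("path/to/MyApp", "MyAppCompiled")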

> Like all compiled languages, the development cycle involves a lot of recompiling as you go. For small programs it doesn't matter, but this creeps up as the program grows -- or when you add a heavy dependency, like a plotting library. Julia suffers from this compiled-language drawback, but without the normal advantage of getting a compiled executable you could distribute.

This used to be one of my biggest gripes, but things have gotten a lot better with every version of Julia.

----

My first order approximation when picking a language is this:

1) If I think it'll be easy to write in Python, I should write it in Julia.

2) If I want to ship a precompiled binary that exposes a command line interface, I'll think about using Julia first, and if the size of the binaries is an issue, I might pick Rust.

As for Nim, I'm waiting to see how the ORC story shakes out, for better documentation about ORC to appear on the scene, and for more packages to adopt it. I think Nim currently has too small a community. I've found that packages that are commonplace in Rust or Julia are just not even available in Nim. Sometimes a package is 5 years old and hasn't been updated. Yes, it is easy to write interfaces, but that requires a lot of work. If it is a project where I'm writing code just by myself, Nim might be a good choice, but if I'm working with someone else, having them learn Nim is much harder. The error messages in Nim also need to get better in my opinion.

> You don't have to decide between snake_case or camelCase. You can define your variable either way and both references will work. Ditto for most cases of capitalization. I thought this might be problematic, but in practice it's brilliant. I think that sentiment applies to many of Nim's unexpected design choices.

I personally don't like this at all. Every time I search a Nim codebase, I have to use `nimgrep` instead of `ripgrep`. I have SO many aliases built on top of things like `ripgrep` and `fzf` and none of them are certain to work on `Nim` code. It's frustrating that the community is so divided on this, because even though I can see there are benefits to this approach, the benefits pale in comparison to getting user adoption and buy-in to use the language.

One of my senior developers on my team agreed to learn Nim in their spare time as a favor to me, to evaluate it for a project at work, and as soon as they came across this "feature" they were so turned off, they wrote the language off as being too weird.

There's a LOT of average programmers out there, and a lot more analysts and data scientists that just want to get shit done, and from personal experience I think it's extremely hard to get adoption for Nim in the scientific community. If 7 out of 10 people have used Python, 4 out of 10 people may have heard of Julia and heard that it's new and modern, but 0 people have even heard of Nim. I've gotten SO many quizzical looks over the years, it is not even funny.


> I've gotten SO many quizzical looks over the years, it is not even funny.

May I ask what your involvement in the Nim community is? I don't recognize your username. Also, sorry to say, but if your colleagues look at you like that, maybe they don't respect you?


The idea is that if you just wanted to send someone something to read you could send them a PDF or Word document.

If you wanted to present it you could build a PPT.

And having the source for this in Markdown to use with pandoc is nice.


I agree.

What I've learnt from the last few years (decade?) is that you can make pretty much any claim on the internet, and you'll find an audience that agrees with that claim. If you dress up your delivery, your persona, your appearance, or even just evoke strong emotions with that claim, it'll amplify that audience. You'll find strong supporters of your claim even when evidence to the contrary is provided to them directly (sunk cost, difficulty in changing opinions because of human psyche etc).

I wonder where the Carl Sagans and George Carlins of this generation are. I have a sneaking suspicion that social media just doesn't allow such talent to flourish in today's world. It makes me sad to think about this and it would make me incredibly sad if that were true.

As for Lex Fridman, he's a dressed-up persona for spouting intelligent-sounding thoughts without any actual meat or gravitas behind them. I personally rank him in the same category as Joe Rogan and any other random interviewer/podcaster who puts in low effort to research content but a lot of effort to produce large quantities of work.

And with ad-supported social media, quantity over quality seems to be the name of the game.

Hopefully GPT-4 will make the Lex Fridmans of the world more easily replaceable? I still don't think we'll ever get to an era of high-quality content from public speakers.

Channel 5 with Andrew Callaghan is the only interview show (other than maybe some on PBS) that I've been able to watch and recommend to peers.

Any interviewers you all like?


I like Lex Fridman. His podcast exposes me to ideas that are novel to me. I suspect it appeals to many others like me. It's all relative. If you are the smartest person in the room...


A good example of this would be Jordan Peterson.

As for Lex Fridman, he at least sometimes has interesting guests, so those podcast episodes are interesting and, more so, entertaining. His questions and interviewing skills aren't bad unless he loses himself in "the beauty of life" and starts to overly romanticize. Although I don't really know anything else about him or his "research", he's a decent entertainer, but I would watch his show for the guests (some) and not him.


I find something like this a lot more readable:

https://github.com/jkrumbiegel/ReadableRegex.jl

It is in Julia, but if you have it installed locally it’s just a few taps away. You can even generate the regex, and use that in Python and just add the ReadableRegex in a comment nearby.


That reads better than regex, but at that point, why not just use parser combinators? At least for me, whenever I want something more complex than a basic regex, I go for a parser combinator library. Maybe that's what's happening under the hood anyways? I don't know Julia well enough to know.


Not a Julia user, but this looks great!


Wow, this looks really cool. Do I have to learn Julia though?


I found it! If anyone else is interested:

https://news.ycombinator.com/item?id=32300466


Nim is garbage collected / reference counted by default (but there's a way to turn it off; there are lots of GC options). It also ships with a much more batteries-included standard library. Nim has operator overloading, dynamic dispatch (although I think this feature is being deprecated), various types of macros, generics, and a whole lot of other features. Nim has everything but the kitchen sink, which can be either good or bad, depending on your perspective. Nim compiles down to C code and that lets you interface with C and C++ libraries in a very native way. If someone likes programming in C but likes the syntax of Python, they'll love Nim (imo). Nim also lets you write Nim code but transpile it to JavaScript, so it's an alternative to TypeScript in some ways. Like I said, everything but the kitchen sink.

Nim's LSP is great and editor tooling is good. The testing framework is only so-so. The package manager in Nim leaves a lot to be desired. The Nim community is well established and big, but without hard data, I wouldn't say it is growing all that much. It's pretty much the same community members from 2-3 years ago that are all doing amazing work, with the addition of a few folks.

Zig is more barebones. It uses LLVM to generate machine code, but a couple of other backends are in the works as I understand it. It has compile-time execution instead of macros, and generics are just compile-time features. Zig is a lot like C in that it is simple in its feature set. For example there's no operator overloading, which means when you read Zig, you kind of know exactly what the program is going to do. It also means code can be very verbose (especially math-y stuff). Try doing complex number arithmetic or 2-D vector calculations and the code is as verbose and ugly as C (imo). Some people will say that this code shows exactly what is going on, but (again imo) it is unnecessarily verbose. If people could opt in to operator overloading somehow it would make Zig really neat for math.

I can see Zig being used for web servers, although if it segfaults because of the manual memory management it could be bad. But really the use case for Zig is bare metal work, maybe software that needs to perform a bunch of work on data. Zig has a unique way of transforming an array of structs to a struct of arrays, so you get lots of speed improvements while still writing your code in an ergonomic fashion. Zig, in a rather unique twist, is a better C / C++ compiler than GCC or LLVM. So if you are interested in compiling a C program, you can use Zig to do that. I think Zig is a better alternative to CMake than anything else out there.

I can't speak to testing in Zig, and I don't believe there's even a package manager at this point. There are very few libraries for doing stuff in Zig, but it is growing.

I think a good way to get a sense of the community is to look for conference talks on YouTube or on HackerNews for a language. Nim has about 10 talks a year. Rust will have 30 talks roughly. Zig usually is like 5 talks, and one of them is almost always the creator of the language. Take that for what you like.

Both are great languages and I've had fun trying them out! They unfortunately don't fit my work requirements and are not personally interesting to me.


Yes, Nim is to Python as Crystal is to Ruby.

Nim really should be more interesting for folks who use Python for high-level work and C for low-level work. I was very interested in it as I do both Python and C, but it somehow was just not that popular; at the very least, my boss will never buy any idea of using it in production.


> A better catchphrase for Julia might be "The best expressiveness / performance tradeoff you have ever seen".

Man, Jakob just has a way of writing that gets right to the point. Nice post.

My only complaint about the post is that in Python you have many different REPL options. For example, both ipython and bpython are miles ahead of the Julia REPL. I agree though that the Julia default REPL is better than the Python default REPL. I particularly like the shell mode in Julia. You can also make your own modes, like a sql mode or any custom DSL mode.

I have been very critical about Julia in the past, so I'll say some positive things before I complain more :)

- The package management in Julia is god-like. Specifically, there's a whole subset of packages that are just binaries and libraries cross-compiled on VMs for Windows / Mac / Linux + x86 / x64 / ARM, and it just works. There's no more trying to compile code on a user's computer when trying to install a package; it's beautiful. It is truly a phenomenal piece of engineering, and I cannot praise it enough. Hats off to the core team responsible for this.

- Multithreading is such a joy to use. Compared to any other dynamic GC language, Julia and Go are pretty much up there in terms of usability + features. So much better than Python or R. But while I prefer Julia over Go, I do prefer Rust over Julia.
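
As a small illustration of how little ceremony the threading takes (a sketch; assumes Julia was started with e.g. `julia --threads=4`):

    results = zeros(1000)
    Threads.@threads for i in eachindex(results)
        # iterations are split across the available threads
        results[i] = sum(abs2, rand(100))
    end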

Now for some of my grievances. I'm a nobody in the Julia community so take my word for what it is worth.

- Optimizing Julia code is a joy. However, learning how to optimize Julia code is not straightforward. If you are coming from Python and/or don't have experience thinking about memory, cache lines, references, mutability etc., you have your work cut out for you in terms of learning how to optimize Julia code. In addition to that, there are Julia-specific things you need to learn to know how to optimize your code. Do you know what the function barrier paradigm is? No? Too bad, now your codebase is 10x slower and refactoring could take weeks. And that's just the theory of everything you need to learn. There are SO many subtle ways your Julia code can be slow because the compiler wasn't able to "see through your code". And the tooling here is getting better but still has a LOONNNGGG way to go. Being able to statically check and assert for problematic inferences will improve this for me, a la JET.jl, but right now JET.jl is too slow to run on a large codebase.
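
(For anyone unfamiliar with the term, here's a minimal sketch of the function barrier idea: when a container's element type is only known at runtime, hand it to a separate function, which Julia compiles per concrete type.)

    # `data` may be a Vector{Int} or a Vector{Float64}; code touching it directly
    # in `process` can't assume a concrete element type. The "barrier" is the call
    # to `kernel`, which gets compiled once per concrete array type it receives,
    # so its loop is type-stable and fast.
    function kernel(data)
        s = zero(eltype(data))
        for x in data
            s += x
        end
        return s
    end

    function process(flag)
        data = flag ? rand(Int, 1000) : rand(Float64, 1000)
        return kernel(data)    # the function barrier
    end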

- Thinking with Multiple Dispatch (MD) is much like Thinking with Portals. Once you get it, you have this ah-ha moment. But the type system is overloaded in my opinion. You HAVE to use the type system for MD but people also use it for interfaces (AbstractArray for example). I think adding inheritance in the abstract type system was a mistake, and a trait or interface like approach would be way better for this. Maybe something like concrete types for dispatch and abstract types for interfaces? I don't think this will EVER change though, not in Julia 2.0 or 3.0 or later, because it is SO ingrained into the Julia community. I'm not explaining this well here but I've complained about it before in previous comments on HN and am too lazy to go find it and copy paste :)

- There's a number of minor syntax / usability gripes I have that I don't think will ever be fixed as well. I generally think a programming language should incentivize you to "do the right thing", often by making it easier to type. In Julia this framework of thinking exists but isn't applied consistently. It is easier to create an immutable struct than a mutable struct:

  struct Immutable 
    x::Int
    y::Int
  end
  # vs
  mutable struct Mutable
    x::Int
    y::Int
  end
However, if you want to use it to store user data and you choose immutable structs, your interface for users is EXTREMELY annoying. For example, if they want to update `x` from `1` to `2`:

  im = Immutable(1, 3)
  im = Immutable(2, im.y)
With mutable structs:

  m = Mutable(1, 3)
  m.x = 2
There are third-party packages that make this easier, but this should ABSOLUTELY be in Base.
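
(For what it's worth, here's roughly what that looks like with one of those third-party packages, Accessors.jl, which a reply below also brings up; `@set` builds a new struct with one field replaced:)

    using Accessors              # third-party package providing the @set macro

    im = Immutable(1, 3)         # the Immutable struct defined above
    im = @set im.x = 2           # constructs Immutable(2, 3); nothing is mutated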

I have similar complaints about type names in Julia. I'm incentivized to write `::String` instead of `::AbstractString`, `::Int` instead of `::Integer`. In Julia using `AbstractString` is almost always preferred. The naming is also quite annoying. Why does `AbstractString` have `Abstract` in the name but `Integer` does not, when both of them are abstract types?
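
(A quick sketch of why that incentive bites: a `::String` annotation silently excludes other perfectly good string types, such as the `SubString`s you get back from `split`:)

    shout(s::String) = uppercase(s)
    shout_any(s::AbstractString) = uppercase(s)

    shout("hello")                          # "HELLO"
    shout_any(first(split("hello world")))  # "HELLO", works on a SubString
    # shout(first(split("hello world")))    # MethodError: SubString{String} is an
    #                                       # AbstractString, but not a String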

I've said this before and I'll say it again. I think Julia core devs should have a usability expert come in and review their whole codebase and workflows and make suggestions. I have no idea how Rust has nailed this so well. In Rust, so many things are just consistent. You can guess the name or behavior of what you want so often, it's awesome.

TLDR:

If you are thinking of using Julia for a large production codebase, wait 5 more years. I've learnt that the hard way. For personal projects it is amazing though.


> if you choose immutable structs, your interface for users is EXTREMELY annoying. For example, if they want to update `x` from `1` to be `2`

That's the whole point of immutability: that you can't "just update". I fail to see how obscure magical updates on immutable (?) structs, as advertised in [1] or [2], e.g.

  using Accessors
  @set obj.a.b.c = d
are beneficial. Note that there is _zero_ explanation on the front page of what the above snippet does, how to use it and what actually happens under the hood. One example of why I personally don't have much trust in JuliaLand.

Scala has `x.copy(field=newval)` for its (immutable) case classes. Note how clear it is that one is making a copy. Lenses are also outside of the stdlib (e.g. [3]).

[1] https://github.com/JuliaObjects/Accessors.jl

[2] https://github.com/jw3126/Setfield.jl

[3] https://www.optics.dev/Monocle/


As I understand it, the idea of immutability, and of using packages like Setfield.jl to change struct fields, is a safety feature for concurrency. When you pass a value in as an argument to a function, you can be sure that it will not be changed during execution, introducing races, without you needing to acquire a lock. Also, state machines become easier to reason about when immutables are used.


> I generally think a programming language should incentivize you to "do the right thing"

> It is easier to create a immutable struct than a mutable struct

This is exactly an implementation of that principle - since very often one of the goals when using Julia is performance, it makes sense to incentivize the more performant option. [1]

> I'm incentivized to write `::String` instead of `::AbstractString`, `::Int` instead of `::Integer`. In Julia using `AbstractString` is almost always preferred.

I'm on the fence about this one. I used to think this way, but now I think just slapping Abstract types everywhere in code leads only to fake-genericity. It's the sort of trap that leads to the kind of problems Yuri complained about in his (in)famous blog post.

I'd rather someone type ::String and be artificially limited in what they accept (at least for now, until someone complains), rather than be incentivized to say they accept an abstract type and then end up not actually supporting all of the abstract type interface's flexibility.

[1] https://docs.julialang.org/en/v1/manual/types/#Mutable-Compo...


The best part of Julia for me is just that it has numeric primitives built in, so the different ML etc. libraries don't have to implement their own (incompatible) versions. So Julia code can be 1/10th the length of the equivalent Python, because you literally just plug libraries together, instead of having to pipe distinct data structures together.

It also means the libraries are short and easy to understand, instead of having tens of thousands of lines implementing said primitives etc.

