
At this point any new systems language aiming at productivity has to prove itself not just superior to C++, but superior to Rust, without being significantly worse in any aspect. It’s already suspicious by virtue of having a hand-wavy "Int" type (what size/signedness is that?) and it appears to be object-oriented (so we have to rely on compiler optimisations to remove dynamic dispatch) and garbage-collected (so by default any non-trivial type is heap-allocated). These are just ways in which it is worse than Rust as far as performance goes, since "ergonomics"/"productivity" is so subjective. It seems to improve on C++ only by taking the most common C++ patterns and building them into the language so the language doesn’t have to be so immensely complicated, while also improving the syntax. That’s simply not enough for a modern language to be competitive.



> It’s already suspicious by virtue of having a hand-wavy "Int" type (what size/signedness is that?) and it appears to be object-oriented (so we have to rely on compiler optimisations to remove dynamic dispatch) and garbage-collected (so by default any non-trivial type is heap-allocated). These are just ways in which it is worse than Rust as far as performance goes, since "ergonomics"/"productivity" is so subjective.

While I agree that this language doesn't seem to be differentiated enough to compete, I disagree with your apparent premise that new languages need to be as fast as Rust. I personally would welcome a new language that gives me full-spectrum dependent types with great tooling and moderate performance. There are many aspects to programming languages beyond raw speed.

The world has enough cookie cutter procedural and OOP languages. I'd love to see a new language from a different paradigm succeed.


Have a look at Nim: https://nim-lang.org/

It's a systems language that's focused on readability and performance. It has OOP but isn't focused on it, and has some of the best AST metaprogramming out there built in as a core principle, so it's easy to extend the language. Strong static typing with type inference, a specific type for garbage collection (the ref type) - everything else is on the stack by default, or you can manually manage memory.

Looks a bit like Python; compiles to C, C++, Objective-C, JavaScript, and experimentally to LLVM. Good support for Windows, Linux, and Mac (and anything you can target a C compiler for). Performance matches equivalent code in C, C++, and Rust. Programs compile to standalone executables, making them easy to distribute. Compilation is very fast.

If I only have one thing to say about it, my personal experience has been that Nim makes programming more fun by being really low friction; it just gets out of your way, yet runs really fast. It's great for scripting out a prototype for something, but because of the high performance that prototype can be expanded into a full product. It also helps that you can write server and client code in the same language too.


The GP asked for a single key feature: dependent types. Nim doesn't have them and never will, so why bring it into the discussion?

Sadly, “Look at Nim” seems to be the new “rewrite it in Rust”…


You're right, no dependent types; though to be fair that wasn't the only thing mentioned, and none of the other replies have suggested a language with dependent types either.

I was responding to:

> ...great tooling and moderate performance. There are many aspects to programming languages beyond raw speed. The world has enough cookie cutter procedural and OOP languages. I'd love to see a new language from a different paradigm succeed.

Nim's paradigm is fairly open (in no small part thanks to metaprogramming and unified function call syntax), and it drops a lot of the baggage from the usual class (ahem) of OOP languages. There are loads of mainstream languages that focus entirely on OOP, and I really resonate with wanting to explore different approaches to creating solutions, as I think OOP tends to colour how a language approaches problems.

Seems a bit of a jump to treat my posting a reply here as being in the same vein as "rewrite it in Rust".

In terms of languages with existing dependent type implementations, it looks like the main options would be ATS, Agda, F*, or Idris. Some of these are pretty far away from the OOP paradigm too.

Also:

> Nim doesn't have them and never will

https://github.com/nim-lang/RFCs/issues/172


> In terms of languages with existing dependent type implementations, it looks like the main options would be ATS, Agda, F*, or Idris. Some of these are pretty far away from the OOP paradigm too.

This is an OK response to the original question.

> Seems a bit sudden to jump from my posting a reply to this as the same vein as "rewrite it in Rust".

The thing is: 90% of comments talking about Nim come from people like you, whose entire comment history is over 90% about Nim, and most of the time they come up in contexts where they're borderline irrelevant to the subject.

Aggressive proselytism like this has hurt Rust a lot, and it's definitely going to hurt Nim as well if you aren't careful.


My comment history is 90% talking about Nim because this is the account I talk about Nim on :) Probably it's the same for other people. My last comment was 7 months ago on the 1.0 release.

It seems like when talking about a smaller language you have to walk that fine line between putting an experience of using them out there, and being a PR ambassador. I'm really not into that, but I guess that's the reality.


For ‘moderate performance’ surely JVM based languages are what you’re looking for? There’s great tooling and a very low barrier to creating new languages.

Creating a new systems programming language like C++, Rust or Zig is by contrast a lot more effort and means having significantly worse support for debugging and IDEs unless you put a lot of effort in (generating good DWARF debug data for a new language is hugely complex).


> For ‘moderate performance’ surely JVM based languages are what you’re looking for? There’s great tooling and a very low barrier to creating new languages.

Not sure I understand what you're suggesting. I was asking for a language with dependent types (or anything that isn't just another procedural/OOP language). Such a language could use any runtime, whether it be the JVM or anything else.


I’m working on a language that will hopefully meet both of your criteria, so at least you can take encouragement that you are not alone.

I’m working on a language based around the recent work of Pfenning, Reed, and Pruiksma (Adjoint Logic) and Krishnaswami’s dependent/linear research (both of which go back to Nick Benton’s ’94 work). It is definitely not OOP; it is a compositional language (a lot like the concatenative language family) and is rooted in explicit parallel and sequential composition, with one of the adjoint logics being the type-theoretic implementation of intuitionistic logic (Martin-Löf dependent type theory).

There are people working on things all over the non-OOP and advanced-static-types spectrums, so don’t lose faith in progress yet. I have plans to release the 0.1 website and ‘compiler’ before July 1. Of course it is going to be a bumpy road, but I’m having a great time working on this project.


Presumably LLVM closes the gap significantly?


> I personally would welcome a new language that gives me full-spectrum dependent types with great tooling and moderate performance.

I'd propose to have a look at Swift. It is very similar to Rust in many aspects (particularly the type system), with slower performance (due to some of the abstractions). The tooling on macOS is already really good, on Linux it is getting there, and on Windows the next release will add official support.


I don't see how Swift is similar to Rust -- it's a garbage collected, OO language, not really appropriate for low level / systems level programming. It's a nice looking language for what it is, with some nicer modern features in its type system etc, but its niche is not the same as Rust.


> it's a garbage collected, OO language, not really appropriate for low level / systems level programming

I'm of the impression that Swift is reference counted, which, while technically a kind of GC, is also appropriate for low level / systems programming (which isn't to say that Swift is a good language for low level / systems programming; only that its memory management isn't the disqualifying factor).


I wouldn't consider reference-counted GC appropriate for systems-level work. It _can_ be more deterministic, but not when it's tracing and collecting cycles (which a decent modern RC implementation must do), and it usually plays havoc with the L1 cache (by touching counts).

You can make RC quite nice (I've written cycle-collecting implementations before), but there are reasons why C++ programmers are generally encouraged to avoid shared_ptr and use unique_ptr whenever possible: object lifetimes become harder to reason about or trace, and there are performance implications.
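
Roughly the same trade-off, sketched as a Rust analogy rather than the C++ API (my own toy example, not a claim about Swift's internals): unique ownership costs nothing per access and frees deterministically, while shared ownership pays for a count on every clone and drop.

    use std::rc::Rc;

    // Unique ownership: exactly one owner, freed deterministically when it
    // goes out of scope, and no reference count to touch.
    fn unique_ownership() {
        let data = Box::new([1u8; 64]);
        println!("first byte: {}", data[0]);
    } // `data` is freed here

    // Shared ownership: every clone bumps a count and every drop decrements
    // it; the allocation is freed only when the count reaches zero.
    fn shared_ownership() {
        let shared = Rc::new([2u8; 64]);
        let alias = Rc::clone(&shared);
        println!("owners: {}", Rc::strong_count(&alias)); // prints 2
    }

    fn main() {
        unique_ownership();
        shared_ownership();
    }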

Now if the garbage collection was optional, and one could still freely heap or stack allocate and free, then, yes, I could see it. But I don't think that's the case w/ Swift, at least not last time I looked at it. It's also why Go is imho not a 'systems' programming language.


> I wouldn't consider reference counted GC systems level appropriate. It _can_ be more deterministic, but not when it's tracing and collecting cycles (which a decent modern RC implementation must do)

Swift has what you would call an ‘indecent’ RC design, then, because it doesn’t.

> and it usually plays havoc on L1 cache (by touching counts).

Swift (like Objective-C) can store the refcounts either inline or in supplementary structures, for performance.

> Now if the garbage collection was optional

In the common case, it practically is, as (most) value types will only ever live on the stack, or inline in another data structure.


imho inline will still mess with the cache somewhat, as bumping a reference count often requires bringing something into cache (or evicting something from it) that a plain pointer reference wouldn't touch.

Glad to hear that value types are optimized well.


It is for Apple, where the long-term roadmap is for Swift to become the systems programming language of their platforms.


Swift is just C++ with rubberized corners. Distinctly “meh” as a language in its own right, embarrassingly knotty around its ObjC bridging (there’s a basic impedance mismatch between those two worlds), and certainly doesn’t have anything as powerful as dependent types.


> embarrassingly knotty around its ObjC bridging (there’s a basic impedance mismatch between those two worlds)

I think they've done an incredible job with their ObjC interop, given said mismatch.

But you're right — the person above who said that

> The world has enough cookie cutter procedural and OOP languages.

definitely isn't looking for Swift.


“I think they've done an incredible job with their ObjC interop, given said mismatch.”

Which is to say, the Swift devs have done an incredible job of solving the wrong problem.

Apple needed a modern language for faster, easier Cocoa development. What they got was one that actually devalues Apple’s 30-year Cocoa investment by treating it as a second-class citizen. Gobsmacking hubris!

Swift was a pet project of Lattner’s while he was working on LLVM that got picked up by Apple management and repurposed to do a job it wasn’t designed for.

Swift should’ve stayed as Lattner’s pet project, and the team directed to build an “Objective-C 3.0”, with the total freedom to break traditional C compatibility in favor of compile-time safety, type inference, decent error handling, and eliminating C’s various baked-in syntactic and semantic mistakes. Leave C compatibility entirely to ObjC 2.0, and half the usability problems Swift has immediately go away. The result—a modern dynamic language that feels like a scripting language while running like a compiled one, which treats Cocoa as a first-class citizen, not as a strap-on.

(Bonus if it also acts as an easy upgrade path for existing C code. “Safe-C” has been tried before with the likes of Cyclone and Fortress, but Apple might’ve actually made it work.)

Tony Hoare called NULL a billion-dollar mistake. Swift is easily a 10-million-man-hour mistake and counting. For a company that once prided itself for its perfectly-polished cutting-edge products, #SwiftLang is so very staid and awkward-fitting.


I don't have enough experience with Swift to agree or disagree with you about it, but it strikes me that there were already Smalltalk-like languages out there that fit this niche somewhat -- such as F-Script -- and Apple could have gone down that road instead of shoehorning Swift into the role.

Objective C already had Smalltalk-style message dispatch syntax, and something fairly close to Smalltalk blocks/lambdas. So it's not like existing Cocoa programmers would have been frustrated or confused.

Clearly the original NeXT engineers were inspired by Smalltalk and wanted something like it, but had performance concerns etc. Perhaps there would have been performance concerns with moving to a VM-based environment for mobile devices, but I think a modern JIT could alleviate those problems, as we've seen with V8 and others.

So I think it was actually a missed opportunity for Smalltalk to finally have its day in the sun :-)


Agreed. Swift is a language designed by and for compiler engineers; and it shows. Contrast Smalltalk which was designed by and for users and usability. Chalk and cheese at every level—especially user level.

Alas, I think decades of C has trained programmers to expect and accept lots and lots of manual drudgework and unhelpful flakiness; worse, it’s selected for the type of programmer who actively enjoys that sort of mindless makework and brittleness. Busyness vs productivity; minutiae vs expressivity. Casual complexity vs rigorous parsimony.

Call me awkward, but I firmly believe good language design means good UI/UX design. Languages exist for humans, not hardware, after all. Yet the UX of mainstream languages today is less than stellar.

(Me, I came into programming through automation so unreasonably expect the machines to do crapwork for me.)

Re. JIT, I’m absolutely fine with baking down to machine code when you already know what hardware you’re targeting. (x86 is just another level of interpretation.) So I don’t think that was it; it was just that Lattner &co were C++ fans and users, so shaped the language to please themselves. Add right time, right place, and riding high on (deserved) reputation for LLVM work. Had they been Smalltalk or Lisp fans we might’ve gotten something more like Dylan instead. While I can use Swift just fine (and certainly prefer it over C++), that I would have enjoyed. Ah well.


I mean, I work in C++ all day (chromecast code base @ Google), and I like the language. But I also know where it does and doesn't belong. For application development, particularly _third party_ app dev, it makes no sense. And neither did Objective C, which is the worst of both worlds. I had to work in it for a while and it's awful.

I agree Dylan (or something like it) would have been a good choice, except that it wouldn't mate well with the Smalltalk style keyword/selector arguments in Cocoa, also it has the taint of being associated with the non-Jobs years and so maybe there would have been ... political... arguments against it.

They just needed a Smalltalk dialect with Algolish/Cish syntactic sugar to calm people down, and optional or mandatory static typing to make tooling work better.


But it doesn’t have dependent types does it?


Nope. I think it periodically gets floated on Swift-Evolution, but someone would have to design and implement it… and I suspect the Swift codebase (which is C++; it isn’t even self-hosting) is already fearsomely complex as it is.

Or, to borrow another Tony Hoare quote:

“There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult.”

..

Alas, big complex codebases facilitate fiddling with the details over substantive changes; and that’s even before considering if the Swift language’s already-set syntax and semantics are amenable to expressing concepts they weren’t originally designed for. Stuff like Swift evo’s current thrash over trailing block[s] syntax reminds me of Python’s growth problems (e.g. its famously frustrating statement vs expression distinction that’s given rise to all its `lambda…`, `…if…else…`, etc nonsense).

It’s hard to scale after you’ve painted yourself into a corner; alas, the temptation is to start digging instead. I really wish Alan Kay had pursued his Nile/Gezira work further. That looked like a really promising approach to scalability. We really need better languages to write our languages in.


> has to prove itself not just superior to C++, but superior to Rust

... and Zig and Nim and D, I guess? None of them are seeing enough usage to even score them meaningfully against C++. The incumbents are C/C++, and there's no one else within two orders of magnitude. Just experiments at various stages; some are still in the lab, others have started some field trials, but that's about it.


As I'm writing this, I spent X hours trying to find where my C project was leaking memory. Turns out one of the OpenSSL pointers needed to be freed explicitly; partly my fault for having only skimmed their docs, and partly the docs' fault for not showing it explicitly.

Point is, with Rust this wouldn't be a thing. I wouldn't have to compile my program with a number of clang flags, then run the sanitizers and try to fish out where this could possibly be happening. That is just one clear, obvious productivity win for Rust.

Have you written any C/C++ recently, and have you used or played with Rust?


If the library were written in modern C++, then it also wouldn't be a problem. It would have given you a std::unique_ptr, ownership would have been clear, and deletion would have been handled automatically.


Yes, and Rust doesn't protect you from memory leaks, BTW, although it does make them less likely. The overall value of a language can only be evaluated after years and many projects. My personal favorite to replace C/C++ is, by far, Zig, but I can't claim that it's the one to beat because it's years away from proving its worth, as are Nim, Rust, and, well, Bosque, I guess. Fashion forums like HN can pass judgment quickly -- that's what fashion forums are for, and why they're good entertainment but not to be taken too seriously -- but the real world of the software industry takes much, much longer, and has a far higher bar for evidence.


Let me clarify: in this particular case it would have. In Rust, memory allocated in a function is freed when its owner goes out of scope. So the function returns and the memory gets freed, unless you explicitly tell the compiler not to.
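
For what it's worth, a minimal sketch of that behaviour (my own toy example, not the OpenSSL case above): the allocation is freed when its owner goes out of scope, and leaking has to be asked for explicitly.

    // Heap allocation owned by `buffer`; freed automatically when the
    // function returns.
    fn scoped_allocation() {
        let buffer = vec![0u8; 1024];
        println!("first byte: {}", buffer[0]);
    } // `buffer` goes out of scope here and the allocation is freed

    fn deliberate_leak() -> &'static mut [u8] {
        // Opting out is explicit, e.g. via Box::leak (or std::mem::forget).
        Box::leak(vec![0u8; 1024].into_boxed_slice())
    }

    fn main() {
        scoped_allocation();
        let leaked = deliberate_leak();
        println!("intentionally leaked {} bytes", leaked.len());
    }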

I haven't seen Zig and I'll check it out. But some of the "fanfare" is necessary to get people involved and things built. Many other langs and projects that are technically worthwhile never get any of it and just languish.


I don't have a problem with the fanfare, but let's not drink our own kool-aid, yeah?

If there's a new language that wants to try its luck, it still only needs to beat the incumbent, not the rest of the wannabes (one or some of which may well one day be the incumbent, but none are anywhere near that yet).


Indeed, at the end of the day there are certain domains where they are unavoidable, and regardless of countless rants from our side, those are the tools that get picked when one of said domains needs to be addressed.

Given your line of work, why not Java itself, in a hypothetical future where Valhalla is done and AOT is a standard feature on equal footing with JIT capabilities/performance?


Maybe, but I'm allowed to like more than one language, no? :)

But seriously, I don't think AOT can ever match a JIT on peak performance without reducing the abstraction level and significantly increasing programmer burden, and much of Java's design is around "we automatically turn RAM into speed." It's a great value proposition for huge swathes of the software industry, but I don't think it's necessarily the best strategy for niches that require performance and are RAM-constrained.

I like Java and I like low-level programming even though it requires more effort regardless of language.


Sure, just curious. :)

Currently I am on a mix of .NET (C#, F#) alongside C++.

I agree with the performance part; that is why the best option is to have both around, AOT + JIT.


You can't really compare Rust to Zig & Nim anymore. 2016 is over, Rust is deployed in the wild on millions of computers (Dropbox, VSCode, and obviously Firefox), there are dozens of Rust packages in Debian apt repositories and it's being used by many companies (Microsoft, Amazon, Facebook, Google, etc. even Nike!).

Of course it's incomparable to C and C++, and it will never replace them, but comparing it to Nim and Zig makes no sense either.


The difference between 0.000001% and 0.001% is much less important than between 0.001% and 20%. Rust has less penetration than Haskell and Erlang, and regardless of what you think of their merits, influential as they may be, none of them are factors in the software ecosystem. Languages with 10x more use and 10x more exposure than Rust disappeared with little impact over a very short period of time. Of course, Rust could become a serious force one day, as could Zig or Nim.


> The difference between 0.000001% and 0.001% is much less important than between 0.001% and 20%.

Even if your numbers are pretty much arbitrary, there is a big difference between having libraries and not having them, and between having jobs and not having them.

> none of them are factors in the software ecosystem

It will probably disappoint you, but C and C++ aren't really “relevant factors in the software ecosystem” anymore, and they haven't been for the past two decades. It's been PHP, Java, JavaScript, and some C#, all over for the past 20 years. And Rust won't change anything in that hierarchy even in the best case scenario, neither will Zig or Nim. And that's completely fine.

Yet, Rust has reached a significant existence, which means you can build stuff without having to build all the necessary libs by yourself, you can hire people knowing it, or even people who don't know it yet but will become proficient reasonably quickly because there is a ton of learning material. You can find a job at some company already using it, or use it for an internal project at your current company because you can show your manager that this isn't too much of a risk. All that even if you don't live in the Bay Area but in Europe. None of it was possible in 2016 for Rust, and none of it is in 2020 for Nim (and Zig isn't even stable, so it's more comparable to what Rust was in 2013 or something). If everything goes well for them, it could become the case by 2025, but not today. (First, they really need to bring more core contributors to the languages, as each is still mainly developed by a single lead: at this point Andrew Kelley has still authored 50% of all Zig commits, while Andreas Rumpf is responsible for one third of Nim's. The bus/burnout factor of both languages is still pretty much 1.)


Rust has a real existence that's somewhere in the vicinity of that of Elixir, Haskell, Ada, and Delphi. It's definitely alive, but it's not "the one to beat" or a major factor, even in the systems programming space, which is what I was responding to.


Then you misunderstood the comment you were responding to, because it claimed Rust was the one to beat in terms of design, never in terms of market share.

If someone wanted to release a new functional programming language, it would be totally legitimate to ask how it compares with Haskell, and to complain that it's a poor reinvention of Standard ML, even if Haskell only has a tiny market share.


Then you misunderstood my comment, because what I said was that, due to its market share and age, it isn't the one to beat in terms of design: we have no sensible way to judge how good its design is, even in comparison to other languages. The sample size is just too small, and the experiment too young.


Yes, that's what you said, and to make your point you compared Rust to two super new languages that are still mostly developed by a single person and not being used anywhere.

Then you corrected yourself and, more accurately, compared Rust with Haskell and other languages. At that point you had already given up on your initial argument, because no one would consider Haskell too small to be a proper benchmark for a new functional programming language.

Of course, as this is an internet argument, you're not going to recognize it. Now, since we are now circling back to the initial claims, I don't think this conversation is worth pursuing. Have a nice (and safe) day.


I think most people would think Haskell is too small and that we have enough experience with it to judge the merits of its design.


... and Pascal!

Many languages are superior to C/C++

C/C++ just win on the amount of available libraries and high-quality implementations.


I don't understand what people are trying to categorize when they say C/C++. They're incredibly different languages. It's hard to find similarities between C and modern C++ (the C++2x flavor). If we put C and C++ under the same umbrella, why not put Java there too? I just think when you say C/C++ you need to clarify what exactly you mean by that.


C/C++ => C and C++, simple plain English grammar simplification.

Used all over the place in ISO C++ papers, official documentation from all compiler vendors, long-gone famous magazines like 'The C/C++ Users Journal', reference books well respected in the C++ community, and so on.

Yet a couple of people still insist on making a point out of it, but don't start one when Java/C, Java/C++, C#/C++, Python/C, and so on get written somewhere.


Maybe I'm too much of a pragmatist, but to me, the amount of available libraries and the quality of the implementations is part of what "superior" means.

You may think that Pascal is superior on paper (that is, just the language specification). If we were programming on paper, that might matter. But we aren't.


It's very task-dependent.

If your needs are towards "indie software" that needs a snappy native UI and can make use of subprocesses to cover major dependencies, the Pascal options are really good and time-tested. It's probably underused for games in particular, in fact. Fast iteration times for game projects are a great selling point for contemporary use of Pascal, if you can justify the investment in engine code.

If you are writing server backends, there is nothing interesting going on in Pascal and you will be scraping around to find the bindings to do what you need. Likewise with a number of other common industry tasks.


> It’s already suspicious by virtue of having a hand-wavy "Int" type (what size/signedness is that?)

That is a common misunderstanding by people who haven’t spent time with high level languages.

In any language, more precise numeric types allow for faster arithmetic operations if used appropriately. This is the primary reason Java is still a few times faster than JavaScript on application benchmarks focusing on arithmetic, and why those performance differences drop considerably when comparing non-arithmetic operations.

The lower-level the language is (closer to the metal), the more these performance cases matter. This is also acceptable from a design perspective because you already need to worry about memory management, pointer arithmetic, type conversion, and various other low-level concerns anyway.

The whole point of a high level language is to not worry about those things. The compiler/interpreter does that heavy lifting. For example Java and C# are both garbage collected so you, by default, don’t get a say in memory management. If you wanted that control then just use C++.

A major pain point in Java is conversion of numeric types. JavaScript only has one numeric type, which is really shitty in all respects, yet writing numeric operations in JavaScript is so much cleaner and more fluid in the code.


This is correct, and applies (and is helpful) to many cases.

But the moment you need to shift an "Int", or do bitwise operations or transmit it in a network packet (think about endianness, etc.), is the moment you may regret that the compiler is doing too much heavy-lifting for you.


If it actually is an unlimited-width integer, bit logic/shifting can work fine, and network packets have to be x & 0xFF, x >> 8 & 0xFF, etc. anyway. But that's painful for direct translation to machine code (what happens when I try to return 340'282'366'920'938'463'463'374'607'431'768'211'456 from a function?), so either it's secretly a fixed-width integer (and they're being deliberately vague about what width, which is suspicious), or the language defaults to possibly allocating memory (and possibly getting an OOM error) on every single arithmetic operation, which is a catastrophically undesirable property in a systems programming language.
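
To make the packet point concrete, here's a minimal Rust sketch (my own example, not from the comment above): the byte-masking idiom only has an unambiguous meaning once the integer is pinned to a concrete width and signedness.

    // The "x & 0xFF, x >> 8 & 0xFF, ..." pattern, well-defined here because
    // the type is fixed at 32 unsigned bits.
    fn to_be_bytes_manual(x: u32) -> [u8; 4] {
        [
            (x >> 24 & 0xFF) as u8,
            (x >> 16 & 0xFF) as u8,
            (x >> 8 & 0xFF) as u8,
            (x & 0xFF) as u8,
        ]
    }

    fn main() {
        let x: u32 = 0xDEAD_BEEF;
        // With a fixed width, the hand-rolled version agrees with the
        // built-in big-endian conversion.
        assert_eq!(to_be_bytes_manual(x), x.to_be_bytes());
        println!("{:02X?}", to_be_bytes_manual(x));
    }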


A better C++ could be an attractive proposition. There are precedents for successor languages that improve on existing ones, e.g. CoffeeScript, Kotlin (at least at the beginning), and Reason.


Have you used Rust? It doesn’t have backwards compatibility with C++, but the semantic model really is very similar to modern C++, just without hundreds of the footguns.


Arguably garbage collection is a huge boon to productivity though. I agree about the first two, but I think the whole memory allocation debate is too contextual to be an issue in the general case. Big projects tend to have customized memory allocators, which makes the usefulness of most benchmarks dubious - that, and reference counting can be nondeterministic too (you deallocate an object, triggering a huge chain of frees).


Even a cascading deallocation is still deterministic. It’s true that it doesn’t have hard latency guarantees, though (and true GC with hard latency guarantees is arguably more useful in many cases).
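
A minimal sketch of what "cascading but deterministic" means, shown in Rust for concreteness (an analogy; the same cascade happens under reference counting when the last reference goes away): dropping the head frees every node, in the same order, at the same point in the program, on every run.

    struct Node {
        id: u32,
        next: Option<Box<Node>>,
    }

    impl Drop for Node {
        fn drop(&mut self) {
            println!("freeing node {}", self.id);
        }
    }

    fn main() {
        let list = Box::new(Node {
            id: 0,
            next: Some(Box::new(Node {
                id: 1,
                next: Some(Box::new(Node { id: 2, next: None })),
            })),
        });
        println!("dropping the whole chain now");
        drop(list); // the whole chain of frees happens right here, every run
        println!("done");
    }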


> It’s already suspicious by virtue of having a hand-wavy "Int" type (what size/signedness is that?)

This is a silly criticism. The language is brand new; they haven't gotten around to ironing out low-risk minutiae like naming numeric types. Anyway, "Int" can still be well-specified even if the name doesn't indicate size (the signedness is just as clear as with Rust and C++).
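
To illustrate (a hypothetical sketch, not Bosque's actual definition): a language can give a plain "Int" a precise, documented width even if the name itself doesn't advertise it.

    // Hypothetical: pin `Int` to a well-specified 64-bit signed integer.
    type Int = i64;

    fn main() {
        let x: Int = -42;
        println!("{} fits in {} bytes", x, std::mem::size_of::<Int>());
    }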

> it appears to be object-oriented (so we have to rely on compiler optimisations to remove dynamic dispatch)

Not a fan of OOP or C++, but implicit dynamic dispatch isn't a property of OOP as C++ demonstrates. And to that end, Bosque seems to copy C++ in this regard, or at least the code snippets show methods annotated with "virtual".



