Hacker News

Languages come and go but C++ remains. I wonder if this one will be different.

I'm not trying to be a jerk, just voicing my concerns. Learning a new language and producing code in it is a huge commitment. It had better be around in 10 years. Will it still be as clean and simple, or will it have grown complex? What if Mozilla stops sponsoring it?

C++ is 35 years old. It has stood the test of time. I would not be surprised if it was still popular in 35 years' time, when I retire. There is so much C++ infrastructure code in this world that it's not going anywhere.

In the meantime C++ has evolved. I can't remember the last time I had an uninitialized-memory, null-pointer, or buffer-overflow bug. Those are low-level C problems. If you avoid C-style programming and stick with modern C++ (RAII, value semantics, STL, Boost, etc.) you won't get them.

C++ is changing fast. In the next few years we are getting high-level concurrency and parallelism support in the language and libraries (no more low-level thread problems), modules (no more preprocessor hacks), compile-time reflection (no more manual serialisation), concepts (no more ugly error messages), ranges (no more cumbersome iterators), and a whole lot more.

And finally there is "C++ Core Guidelines" which aims to be checkable by tools. So you get a warning when you are relying on undefined behaviour.

I think C++ is still the future.




> Languages come and go but C++ remains.

And those other languages have slowly eroded C++'s dominance. You have to remember that in the early '90s, C and C++ had nearly 100% market share for all applications. That dwindled with the rise of Java, then dwindled more with the rise of dynamic languages, and the mobile/modern era made it dwindle even more with Objective-C, Swift, and Go. Now C++ is a popular language for a certain class of applications, but by and large new programmers aren't even learning it anymore. That's a huge shift.

> In the meantime C++ has evolved. I can't remember the last time I had an uninitialized-memory, null-pointer, or buffer-overflow bug.

I do. https://bugzilla.mozilla.org/buglist.cgi?query_format=specif...

> And finally there is "C++ Core Guidelines" which aims to be checkable by tools. So you get a warning when you are relying on undefined behaviour.

As I've said before, I have questions about this tool: how much like C++ it will really be after ruling out so much code, and how they plan to statically guarantee noalias on function boundaries in the presence of shared_ptr.


True, C++ has slipped since its heyday. It's natural in a way. C++ as a general-purpose language can't compete with domain-specific languages in their own domains. Even if every domain got its own language, there might still be room for a generalist language that works across many domains.

But C++ also has some domains of its own where it is king: systems programming, and any application where performance is a priority.

If C++ started slipping against new languages in those areas, then it might be a sign of the end for C++. I don't think that's going to happen for a very long time. It's hard to beat C++ when it comes to performance (while still having high-level abstractions). And any general-purpose language needs to know how to do "a bit about everything", so it would probably end up just as complex as C++.

(Even though C++ has slipped in percentage terms, I bet there are no fewer C++ programmers today than there ever were; there are just more programmers in general.)

If something is going to kill C++, it's probably a language that scales very well to multicore. Maybe a functional language. I'd be happy if that happened, as that would be a revolutionary breakthrough.

On your final points: I think C++ programmers are slowly but surely moving away from C-style coding towards modern C++ (and we will see fewer of those types of bugs). It takes time, though. I'd be happy if C++ were at some point sub-setted and the worst parts of C were deprecated. (That's the other aim of the "C++ Core Guidelines", as I understand it.)

About the aliasing problem: I'm not sure how it could detect that either. We will probably have to live with it. In general, though, smart pointers are still just pointers, and it's better to avoid them when one can (and use value semantics instead).

http://www.tiobe.com/index.php/content/paperinfo/tpci/index....


> It's hard to beat C++ when it comes to performance (while still having high-level abstractions).

Re: performance, Rust's doing pretty well (on a bunch of arbitrary microbenchmarks):

https://benchmarksgame.alioth.debian.org/u64q/compare.php?la...

Re: high level abstractions, Rust doesn't have many of the template shenanigans C++ does (but that's a good thing, IMHO). However Rust does have a phenomenal type system for a language with such predictable and solid performance. And it has many wonderful high-level abstractions (exhaustive pattern matching on algebraic data types, iterator syntax for collections, many functional idioms, an AST-based macro system, etc.).
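To make that concrete, here's a minimal sketch of the exhaustive matching and iterator style mentioned above (the `Shape` type and functions are invented purely for illustration):

```rust
// A toy algebraic data type. Exhaustive matching means the compiler
// rejects any `match` that forgets a variant.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn total_area(shapes: &[Shape]) -> f64 {
    // Iterator chains read like the functional idioms mentioned above.
    shapes.iter().map(area).sum()
}

fn main() {
    let shapes = vec![
        Shape::Rect { w: 2.0, h: 3.0 },
        Shape::Rect { w: 1.0, h: 4.0 },
    ];
    println!("total area: {}", total_area(&shapes));
}
```

If a new variant were added to `Shape`, every non-exhaustive `match` in the codebase becomes a compile error rather than a latent bug.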

> If something is going to kill C++, it's probably a language that scales very well to multicore.

Rust has fantastic concurrency, and the libraries for it are still in relative infancy. The type system keeps un-synchronized shared mutable state from happening (unless you intentionally turn off certain checks using unsafe code, IIRC). There are freely available libraries for lock-free data structures and stack-scoped threads. As an aside, it's also really easy to use those libraries without having to vendor some random header files that may or may not conform to your coding style.
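As a rough illustration of what the type system enforces, here's a sketch using only the standard library (the counter example itself is made up):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `n` threads that each bump a shared counter. A plain
// `Rc<RefCell<usize>>` would be rejected here: it isn't `Send`, so the
// compiler refuses to let it cross a thread boundary. Wrapping the
// state in Arc<Mutex<..>> is what makes this compile at all.
fn parallel_count(n: usize) -> usize {
    let counter = Arc::new(Mutex::new(0));
    let handles: Vec<_> = (0..n)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    println!("count = {}", parallel_count(8));
}
```

The point is that forgetting the `Mutex` isn't a race you discover in production; it's a type error you discover at compile time.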

> Maybe a functional language. I'd be happy if that happened, as that would be a revolutionary breakthrough.

Rust also has some pretty cool functional syntax; although I'm not an expert on functional programming, I've been enjoying it quite a bit.

> About the aliasing problem. I'm not sure how it could detect that either.

Not sure how C++ intends to do it, but Rust does it with a strict ownership system that is fully enforced at compile time.
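A tiny, hypothetical example of the borrow rules that make this work: at any moment a value has either one mutable reference or any number of shared ones, so the compiler always knows whether two paths can alias.

```rust
fn bump_first(v: &mut Vec<i32>) {
    // While `first` is live, taking a second mutable borrow of `v`
    // (or calling `v.push(4)`) is a compile error: at most one mutable
    // borrow at a time, so the compiler statically knows that `first`
    // aliases nothing else.
    let first = &mut v[0];
    *first += 10;
}

fn main() {
    let mut v = vec![1, 2, 3];
    bump_first(&mut v);
    // Any number of shared (read-only) borrows are fine once the
    // mutable one is gone.
    let (a, b) = (&v[0], &v[1]);
    println!("{} {}", a, b);
}
```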


Yeah, let's see how it plays out. I might have a look at Rust at some point.

On the benchmark, though, it doesn't seem quite fair. At least that first one is comparing a single-threaded C++ program to a multi-threaded Rust program. And for the regex example the author doesn't use C++ std::regex and std::future (std::async) like he does for Rust. It makes me think a proper C++ implementation would do much better. (Personally I use the Parallel Patterns Library. I like it a lot. We will get something similar in standard C++ in a few years.)

I don't know much about Rust. It worries me, though, that the examples are still using threads and mutexes. I think we need much higher-level abstractions (like ppl::task, coroutines, etc.) to get better scaling. Also, lock-free data structures don't scale that well either, as they still require synchronization and that hurts scaling. I think we need some kind of revolution in how we code to scale our programs to hundreds of cores.


  > And for the regex example the author doesn't use C++
  > std::regex and std::future (std::async) like he does
  > for Rust.
Rust's standard library implements neither regexes nor futures, so the code you're seeing must be coming from third-party libraries. If anything, this comparison would favor C++, since its own libraries are likely to be more mature and optimized than Rust's.

  > It worries me though the examples are still using 
  > threads and mutexes.
You shouldn't be worried. :) C++ may require higher-level abstraction to make concurrency tenable, but Rust was designed as a concurrent language from the outset. Rust's type system prevents data races at compile-time, so programming with raw threads isn't nearly as fraught as it is in every other language and refactoring concurrent code can be performed with compiler-assisted confidence. I recommend Aaron Turon's blog post "Fearless Concurrency with Rust": http://blog.rust-lang.org/2015/04/10/Fearless-Concurrency.ht...
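For instance, here's a sketch of "raw" threads plus channels from the standard library; the compiler's move semantics are what keep it honest (the worker computation is made up):

```rust
use std::sync::mpsc;
use std::thread;

// Fan work out to four threads and collect the results on a channel.
fn sum_of_squares() -> i32 {
    let (tx, rx) = mpsc::channel();
    for id in 0..4 {
        let tx = tx.clone();
        // `move` hands ownership of this clone of `tx` (and of `id`)
        // to the thread; accidentally sharing mutable state instead
        // would simply fail to compile.
        thread::spawn(move || {
            tx.send(id * id).unwrap();
        });
    }
    drop(tx); // close the original sender so the receive loop ends
    rx.iter().sum()
}

fn main() {
    println!("total = {}", sum_of_squares()); // 0 + 1 + 4 + 9
}
```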

  > Also lock-free data structures don't scale that well
  > either, as they still require synchronization and that
  > hurts scaling
This is another assumption that I suspect Rust obviates. Let me recommend another of Aaron Turon's blog posts, "Lock-freedom without garbage collection": http://aturon.github.io/blog/2015/08/27/epoch/


> hundreds of cores

That was the dream of the mid-2000s. But now hundreds of cores are never going to happen. The CPU vendors have decided that the industry has run out of time to parallelize their programs and are now refusing to scale up. Our hope for speedups now lies in using SIMD and GPUs effectively ("heterogeneous computing").


For what it's worth I actually think that Rust (and a smattering of other languages and tools) could push us to revisit the "hundreds of cores" thing again if these parallel-first/friendly tools get popular enough that CPU vendors see a market of highly parallel consumer-grade applications.


It is worth noting that the implementations for those benchmarks are contributed by a community, not all written by the same person (IIRC), so if someone is interested in using a language to the best of its hypothetical performance ceiling, they can definitely submit a solution.

I'm not sure about the details for the "fasta" benchmark, but I know it's at least partly I/O bound, and many implementations for the benchmark are single-threaded:

https://benchmarksgame.alioth.debian.org/u64q/performance.ph...

Notably, Rust also edges out a multi-threaded C implementation for that one. I'm sure they've done some very hairy optimizations to get to the top on that board, but it's cool to see that it's possible.

On the subject of regexes, I am not familiar with the C++ std::regex implementation, but it does look like (some rather slow) C++ implementations are using a boost regex library. Those are generally fairly "standard" in C++, right?


> Notably, Rust also edges out a multi-threaded C implementation for that one. I'm sure they've done some very hairy optimizations to get to the top on that board, but it's cool to see that it's possible.

The most important optimization I did was to avoid regex machinery as much as possible. In particular, Rust's regex library has very good support for prefix literal optimizations. For example, regexes like `abc[x-z]foo|bar` will compile down to an aho-corasick[1] automaton over the strings `abcxfoo`, `abcyfoo`, `abczfoo` and `bar` with the failure transitions completely evaluated. (The end result is an automaton represented by a matrix. This is memory intensive, so only prefix literals of a certain size can be accommodated. But you don't need a lot to realize huge gains!)
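A toy sketch of the idea (emphatically not the actual library's implementation): expand the alternation into plain literals and report the leftmost hit. A real engine would run an Aho-Corasick DFA over the haystack rather than one `find` per literal.

```rust
// Given the literal set that `abc[x-z]foo|bar` expands to, find the
// leftmost occurrence of any literal. This naive scan is O(literals ×
// haystack); the automaton approach does it in a single linear pass.
fn leftmost_match<'a>(haystack: &str, literals: &[&'a str]) -> Option<(usize, &'a str)> {
    literals
        .iter()
        .filter_map(|lit| haystack.find(lit).map(|pos| (pos, *lit)))
        .min_by_key(|&(pos, _)| pos)
}

fn main() {
    let lits = ["abcxfoo", "abcyfoo", "abczfoo", "bar"];
    let hay = "zzabczfoo and bar";
    println!("{:?}", leftmost_match(hay, &lits));
}
```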

I wrote more about it here: https://www.reddit.com/r/rust/comments/39unje/ahocorasick_fa...

Hint: most of the benchmark game regexes are relatively simple and compile down to either simple `memchr` calls (most regex engines will do that, nothing special) or an Aho-Corasick DFA that completely avoids the traditional regex evaluation machinery.

In general, regex implementations differ dramatically in the optimizations they perform. It's hard to draw conclusions about them without some explicit knowledge of how each is implemented.

[1] - https://github.com/BurntSushi/aho-corasick


I can't be sure about Rust, but C++ should at least be as fast as C for any fair benchmark.

About the regex example.

https://benchmarksgame.alioth.debian.org/u64q/program.php?te...

The Boost regex library is fast and widely used. It doesn't look like the example is using it, though, but rather a library called "re2". Never heard of it before. Googling turned up this: https://github.com/google/re2


There are 4 C++ benchmark submissions for regex-dna. One of them does indeed use Boost, but it is significantly slower than RE2: https://benchmarksgame.alioth.debian.org/u64q/program.php?te... --- I don't know much about Boost's regex support, but after a quick search, it does support backreferences, so it's likely a backtracking implementation. It's no surprise to me that it is beaten handily by an implementation that uses DFAs (RE2, and in this case, Rust's regex library as well).


Re: regex, I was referring to some of the other C++ implementations:

https://benchmarksgame.alioth.debian.org/u64q/program.php?te...

https://benchmarksgame.alioth.debian.org/u64q/program.php?te...

I have no idea whether it's the regex implementations that are causing the comparative slowness, to be honest. Just seemed interesting that some C++ implementations using what I would think are common techniques fall behind many other languages' implementations.

Of course, benchmarks are always arbitrary, and some of them will just disadvantage approaches that would be perfectly realistic ways to solve "real-world" problems, so it's always a grain-of-salt situation. However, I'm not sure it's necessarily fair to say that "C++ should be at least as fast as C for any fair benchmark." Doesn't that expose you to the danger of redefining fair benchmarks as "those benchmarks which reinforce my preconceived notions of which tools have what performance characteristics"?


Probably bad wording by me. C++ vs. C has been benchmarked so extensively that it would be big news if C were quicker than C++ at anything. C++ is often faster, though, because of inlining of templates. Finally, C is mostly just a subset of C++; it's difficult to see how compiling the same code would be slower with a C++ compiler.


Rust's ownership system makes shared memory much more safe than you'd expect. And you can also use shared-nothing approaches if you prefer those.


  > any general purpose language needs to know how to do "a 
  > bit about everything", so it would probably be just as 
  > complex as C++.
Unless C++ massively breaks backwards compatibility (which would be a death sentence for the language, IMO), new general-purpose languages will be capable of achieving an order of magnitude less complexity than C++ simply by learning from C++'s history and avoiding all of its unfortunate missteps. Rust is one such language, having studied C++ extensively.

  > If something is going to kill C++, it's probably a 
  > language that scales very well to multicore.
Rust was conceived in order to write a browser engine with extreme fine-grained concurrency and parallelism, at every level, and capable of scaling from one core to as many as you can throw at it. See https://github.com/servo/servo/wiki/Design for a basic idea of its structure.

  > I think C++ programmers are slowly but surely moving 
  > away from C-style coding towards modern C++
This is not my experience. As far as I can tell, C++98 remains the most popular dialect of C++ used in a day-to-day capacity in industry. Even among the relatively "hip" Hacker News-ish crowd, I have seen just as many people advocating that C++ usage revert to the "C with classes" approach as I have seen people advocating modern C++.

  > [TIOBE link]
Even if you accept TIOBE as an authority on language popularity, you need only scroll down to the historical graph to see that C++ has eroded heavily over the past 15 years, whereas Java and C have more-or-less held steady. Even among the tier of languages that C++ leads now, PHP, Python, and C# have all held historical market share greater than C++ does today, which to me indicates that we are past the point in time where C++ can be said to hold a commanding position in the market. And even disregarding that it has Java and C to compete with, it would be supremely difficult for C++ to regain a commanding market position now that it has to compete with new languages on multiple fronts where it formerly dominated: systems programming (Rust), server programming (Go), application programming (Swift).

So no, C++ isn't going to "die" (what language ever truly dies? COBOL still powers all the world's banks, and I've personally been paid to write RPG-LE), but I wouldn't bet on its ascendancy.


>In the meantime C++ has evolved. I can't remember the last time I had an uninitialized-memory, null-pointer, or buffer-overflow bug. Those are low-level C problems. If you avoid C-style programming and stick with modern C++ (RAII, value semantics, STL, Boost, etc.) you won't get them.

While your small walled garden within C++14 is nice, clean, safe, and well maintained, massive parts of it are built ON THE VERY FOUNDATIONS YOU ARE ATTEMPTING TO AVOID.

This is the irony of C++. While there is a very nice subset of the language that does nearly all the same things Rust does, you still have 30 years of legacy lying around. You likely won't find a job that EXCLUSIVELY follows the C++14 guidelines. You WILL have to learn the legacy patterns, and work with legacy code, and make legacy mistakes.

>C++ is 35 years old. It has stood the test of time. I would not be surprised if it was still popular in 35 years' time, when I retire. There is so much C++ infrastructure code in this world that it's not going anywhere.

I don't debate this fact. But legacy C code never stood in the way of C++'s adoption. There are always gonna be MORE programmers tomorrow than today. People adopting a language is hardly a zero-sum game.


> [...] This is the irony of C++. While there is a very nice subset of the language that does nearly all the same things Rust does, you still have 30 years of legacy lying around. You likely won't find a job that EXCLUSIVELY follows the C++14 guidelines. You WILL have to learn the legacy patterns, and work with legacy code, and make legacy mistakes.

Large existing codebases are a strength of C++ though. Any new replacement language won't magically make those codebases go away.

> [...] But legacy C code never stood in the way of C++'s adoption. There are always gonna be MORE programmers tomorrow than today. People adopting a language is hardly a zero-sum game.

The killer feature of C++ was the seamless interoperability with C which allowed trivial use of C libraries, and, most importantly, easy piecemeal evolution of large C codebases.

For example, GCC is being ported to C++ with relative ease, while a rewrite in Rust would be a significantly more complex endeavor.

Then again, beyond lifetime checking, Rust and C++ are very similar languages, and seamless interoperability is not inconceivable. I believe that's the missing feature for Rust to achieve world domination.


>Large existing codebases are a strength of C++ though. Any new replacement language won't magically make those codebases go away.

Which I already said...

>>I don't debate this fact. But legacy C code never stood in the way of C++'s adoption. There are always gonna be MORE programmers tomorrow than today. People adopting a language is hardly a zero-sum game.

>For example, GCC is being ported to C++ with relative ease, while a rewrite in Rust would be a significantly more complex endeavor.

Source or opinion?

>Then again, beyond lifetime checking, Rust and C++ are very similar languages, and seamless interoperability is not inconceivable. I believe that's the missing feature for Rust to achieve world domination.

No, I agree. Seamless importing of C++ would be amazing; name mangling in C++ is just a bit of an issue. Rust works great with C. The FFI is very stable. I use Rust libraries at work, as generating 32-bit DLLs is possible with nightly.


I haven't used it yet, but Rust supposedly works very well with C libraries, both calling into them and being called from them.


The hardest part is creating the comfortable, Rust-ish interface. Actually interacting with C code is trivial.
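For example, here's a minimal sketch of calling into libc from Rust (assuming a platform where Rust links libc by default, which covers the usual targets):

```rust
use std::ffi::CString;
use std::os::raw::{c_char, c_int};

// Declarations for two libc functions; no build script or bindings
// generator is needed for something this small.
extern "C" {
    fn abs(i: c_int) -> c_int;
    fn strlen(s: *const c_char) -> usize;
}

fn c_abs(i: i32) -> i32 {
    // FFI calls are `unsafe` because the compiler can't verify the C side.
    unsafe { abs(i) }
}

fn c_strlen(s: &str) -> usize {
    // The "Rust-ish interface" work is exactly this: converting to a
    // NUL-terminated CString and hiding the raw pointer from callers.
    let cs = CString::new(s).expect("no interior NUL bytes");
    unsafe { strlen(cs.as_ptr()) }
}

fn main() {
    println!("abs(-5) = {}, strlen(\"hello\") = {}", c_abs(-5), c_strlen("hello"));
}
```

The raw `extern` calls are the trivial part; the safe wrappers around them are where the design effort goes.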


I've read your replies and I think you're arguing too passionately and missing facts.

It doesn't matter that there's much legacy C++ code. Two things will happen:

a) it will stay C++03, regardless of what changes in C++ or what other languages pop up. This kind of code is probably part of a working product which fulfills its mission and maybe gets small updates or fixes.

b) it will continue to be maintained, and it will be able to benefit from some or most of the changes coming in new C++ versions. It doesn't mean that it will be converted to super duper C++14, but that's not how software maintenance works anyway! And guess what, they probably won't be converted to Rust either, but they will be better due to the new standards. You are building a strawman when claiming that it's either Rust or failing to use pure C++14. Successful software tools offer backwards compatibility when making improvements and iterate instead of replacing everything. Just ask the Python and Perl teams.

And this is how C++ was able to build market share, by offering a transition path from C. What exactly is that path from C++ to Rust, rewriting everything?

Finally, new projects will use the new standards. My current project is C++11. The next one will be C++14 and we will absolutely make use of the new features where it makes sense.


> And guess what, they probably won't be converted to Rust either, but they will be better due to the new standards.

What we need is memory safety, and new standards don't provide that. Maybe in the future they will, but I have significant questions about how suitable the ISO Core C++ lifetime checker will be.

> Just ask the Python and Perl teams.

Python and Perl were not backwards compatible with anything when they were released.

Rust is not a new version of C++. It is a new language.

> And this is how C++ was able to build marketshare, by offering a transition path from C. What exactly is that path for C++ to Rust, rewriting everything?

No. It's a very comprehensive FFI, IMO among the best in any language. We even have (preliminary, quite hacky) C++ support.

We had to spend time developing this, because we actually use Rust for Servo and we had to develop incrementally, retaining large C++ components. It works great.


>> Just ask the Python and Perl teams.

> Python and Perl were not backwards compatible with anything when they were released.

I think the implication is that the teams will tell you that the upgrades to Python 3 and Perl 6 were problematic because they didn't offer backwards compatibility. If so, it's a bit silly, because the point of those upgrades was to advance the languages beyond what could be done with backwards-compatible iteration. Asking the teams would probably yield a bunch of people who were adamant that the decision was sound and it needed to be done, but that there were missteps in execution. Asking the communities would likely yield a more polarized set of opinions.

That said, IMO using either of these as examples to bolster an argument is almost always a mistake; their own discussions are so large and polarizing that they rarely make a point more clear.


> Python and Perl were not backwards compatible with anything when they were released.

Perl 5 (and older) was highly backwards compatible from a programmer standpoint. Much of Perl is an agglomeration of preexisting Unix command line tools and little languages into one language.


Well, it's closest semantically to awk, but awk is not syntactically compatible with Perl [1]. That's pretty similar to the situation with Rust and C++: semantically compatible but not syntactically compatible.

[1]: http://www.arl.wustl.edu/projects/fpx/references/perl/learn/...


"semantically compatible but not syntactically compatible" would have been a more impressive way of expressing it.

If Rust is highly semantically compatible with C++, then automated porting of C++ to Rust isn't such a far-out idea.


Definitely it'd be interesting. But I feel like such a thing is going to have the same issues I worry about with the ISO Core C++ lifetime checker: existing C++ code just isn't designed for lifetimes, so you will have to rework the logic a lot to get the static analysis to pass. When you have that much manual intervention, I wonder whether you're porting or really just rewriting—and if you're rewriting, syntactic compatibility doesn't matter so much.


On the plus side 30 years of legacy laying around also includes a lot of mature and well maintained libraries.


Which don't follow the much professed C++14 guidelines and can introduce the errors that Rust is designed to prevent.

:.:.:

I don't have an issue with C++ itself; it's just the meme that "C++14 will solve all of C++'s problems". It won't. C++'s problems are now locked in, and fixed. They've been known for 25-30 years, and we'll deal with them for another 50+.

Yes, the tooling is great and there are amazing libraries, but there is still virtual templated necromancy, and there are still null pointers and use-after-frees. There still will be tomorrow, and there still will be in 50 years.


It's funny to see how these languages topics always explode. I read it when it had about 20 comments and then it predictably devolved into a mess of hype and comparisons.

Anyway, I think your strategy is sound. Rust is a young language and I would be pleasantly surprised to see code written today still compile in five years. I got bit by this with Swift v1 in a little app. When 1.x came, it failed to correctly transform the source and the whole thing was a major PITA to update. Completely soured me from using Swift.

I can understand why Mozilla wants to use Rust. It solves a problem for them, they largely control how the project evolves and can plan for it. To me, the big questions are what will happen to Rust if Mozilla changes strategies (Persona, Thunderbird, FirefoxOS) and whether Mozilla will continue to be a successful organisation. Personally, I root for them, in spite of their recent missteps.

I think Rust might be a good choice now for language aficionados and early adopters who can afford to waste time in exchange for other potential benefits. I would absolutely not use it to bring a project to market or for an OSS project which has a long-term vision. In 5+ years I expect such a decision could be worth revisiting.


> Rust is a young language and I would be pleasantly surprised to see code written today still compile in five years.

We have had a code stability promise for months now, since 1.0. So what you are saying here is that you do not believe that Rust will adhere to what it very publicly planned to do. Do you have specific reasons for this?

> To me, the big questions are what will happen to Rust if Mozilla changes strategies (Persona, Thunderbird, FirefoxOS) and whether Mozilla will continue to be a successful organisation.

Rust has a core team and community that overlaps a lot with, but is very much not identical to, Mozilla. Probably most Rust users aren't even Firefox users.

Rust is not "Mozilla's language". It's the Rust community's language.


> We have had a code stability promise for months now in 1.0. So what you are saying here is that you do not believe that Rust will adhere to what it very publicly planned to do. Do you have specific reasons for this?

Since 1.0, Rust has in fact had multiple significant stable-breaking compiler changes, not to mention that, among other things, adding any new method to a trait or type in the standard library has the potential to cause breakage. The official code stability promise has not been violated by these changes only because it's riddled with exceptions.


> Since 1.0, Rust has in fact had multiple significant stable-breaking compiler changes, not to mention that, among other things, adding any new method to a trait or type in the standard library has the potential to cause breakage. The official code stability promise has not been violated by these changes only because it's riddled with exceptions.

This is essentially the same as any other language. Adding a new method to traits or types can also break JavaScript, the other example you gave.

In fact, I know of one such example where code was broken in the wild because of new methods being added, and we did not revert the change. Marijn Haverbeke's winning js1k entry relied on the 4th and 7th letters (or something along those lines) of every CanvasRenderingContext2D method being unique, because it compressed code by renaming the canvas methods with what was then a perfect hash function. A new Firefox version was released months later that added new methods and broke the code. That was deemed an acceptable piece of breakage, and Rust's policies are very similar.

Anyway, the vast majority of breakage that we've seen so far was due to the libcpocalypse. This was an issue with people not specifying specific versions of libraries they depended on, not the language.


Here's another example with less trivial breakage, which was reverted:

https://www.fxsitecompat.com/en-US/docs/2015/string-prototyp...

Unless you mean to refute the idea that JS doesn't break anything ever - which it clearly does, occasionally, but proof-of-concept golf code that intentionally takes fragile shortcuts isn't the best example.

...But JS has moved away from monkey patching builtin objects for good reason, whereas Rust tends to be relatively promiscuous in adding methods to other people's types via trait impls. This isn't bad per se, but I do think it creates bigger compatibility risks.


I don't think this phrasing is particularly charitable, yet we don't need to argue about semantics, because we have numbers.

  > Approximately 96% of published crate revisions that build with the 1.0 compiler 
  > build with the 1.5 compiler.
https://internals.rust-lang.org/t/rust-regressions-2015-year...


I've seen it, and if that rate of 4% of code breaking in half a year were to continue, in the above stipulated 5 years it would look pretty bad. Hopefully it will not, given that some of those regressions involved soundness issues and other things (integer fallback) that were "clearly wrong", which ought to be mostly weeded out before long. That said, the rate of new features in the language and especially libraries seems likely to increase a bit in the nearish future, given the size of the feature request backlog (or not - you'd know better than I).

But even at a rate of significantly less than 4%, there's still trouble. That figure is based on the number of root regressions, right? (If I'm wrong, disregard the rest of this paragraph, but I don't think so...) That is, the number of crates that break of their own accord, rather than those that won't compile because one of their dependencies is broken. If one application depends on dozens of crates (or is just really large), the chances of it breaking somewhere are much higher.

This can be partially mitigated by the crates themselves releasing upgrades, but look at Python 3 - there will eventually be a lot of code out there that's completely unmaintained, especially within the "dark matter" of internal or closed source applications. Also, a fix in the latest version of a crate will probably not be backported to long deprecated major versions of it - which means that if I dust off a crate that hasn't been touched in years and uses such a version, I'd have to either upgrade the dependency or do the backport manually.

And the ancestor post did say five years: last I checked on the lists, there was no real consensus as to when Rust 2.0 should be released, on a scale from "in a year" to "never". Maybe there is more consensus internally, but I'd be surprised if it didn't happen in that long...

edit: It's not an entirely fair comparison, but for what my ideal stability promise for a programming language would look like, one need not look very far - only as far as Mozilla's other programming language.


I'm convinced that the breakage in new C++ compilers is worse than Rust's breakage. New C++ compilers and language revisions break more than 4% of code. Likewise with many other languages, such as Ruby. We're just very up front about regressions when they do occur.

> edit: It's not an entirely fair comparison, but for what my ideal stability promise for a programming language would look like, one need not look very far - only as far as Mozilla's other programming language.

As a matter of fact, JS's stability promise is very similar. See my other post for an example in which JS was broken due to, essentially, one of the reasons you cited, and we lived with it. The JS stability promise is not "we will never break any JavaScript in theory with new browser versions", because that would prevent anyone from, as an extreme example, changing the sorting algorithm for Array.sort. It's "we will not break you in practice, except in a carefully delineated set of circumstances." (Actually, JS's stability promise is much weaker than that: its stability in practice is decided by browser consensus and market forces rather than a clear set of rules.)
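To make the Array.sort point concrete in C++ terms (a generic sketch of my own, not code from the thread): std::sort makes no stability guarantee, so an implementation is free to change its algorithm without breaking any documented promise, even though programs that depended on the old order of equal elements would observe a change. std::stable_sort pins that order down.

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct Entry {
    int key;
    std::string label;
};

// std::sort gives no stability guarantee: elements with equal keys may come
// out in any order, so a standard library may swap its sorting algorithm
// without violating the spec - yet code relying on the old order of equal
// elements would still see different behaviour.
// std::stable_sort, by contrast, preserves the original relative order.
std::vector<std::string> labels_sorted_stably(std::vector<Entry> v) {
    std::stable_sort(v.begin(), v.end(),
                     [](const Entry& a, const Entry& b) { return a.key < b.key; });
    std::vector<std::string> out;
    for (const auto& e : v) out.push_back(e.label);
    return out;
}
```

With input keys {2, 1, 2}, the two equal keys keep their original relative order under std::stable_sort; under plain std::sort that order is unspecified, which is exactly the kind of observable-but-unpromised behaviour at issue.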


You're correct about the impact on popular crates; this is why crater is so important. A nice benefit is that popular crates should also have more attention on them, since they're popular, so if they are hit by a soundness issue, it should be addressed quickly.

The core team's feelings on a 2.0 are much closer to the "never" end of the spectrum than to the "in a year" side.


My only reason is that experience and surprising developments have made me into a skeptic when discussing technology. I was not aware of any promises.


Are there any large deployments of Rust outside Mozilla? (I'm not being sarcastic, just wondering who with resources besides Mozilla is incentivized to put money into Rust development.) Thanks!


The largest is currently Dropbox, who re-wrote the way they store the bytes on disk in Rust. It was deployed in late December, they have promised a report after it's been in production longer than a few weeks.


Thank you; I'll put a note on my calendar to check back in a couple of weeks.

edit: Once you google for dropbox rust, you find some really interesting stuff; here's an overview

https://news.ycombinator.com/item?id=9724849


Author of that comment here, let me update it with some new links for you. :)

Putting Rust into production at Dropbox: https://www.reddit.com/r/rust/comments/3wrgl0/what_are_you_f...

Some preliminary feedback on using Rust at Dropbox: https://www.reddit.com/r/programming/comments/3w8dgn/announc...


"Rust is a young language and I would be pleasantly surprised to see code written today still compile in five years."

I myself am willing to accept the promise of code stability, so I think you're looking at slightly the wrong problem. While I'd expect current code to continue to compile, I also expect that what is regarded as good Rust now won't be in five years. That's not a bad thing, but it also means that hopping on the Rust bandwagon might be premature.

The worst case? How about Haskell, where new language features make old code look worse even as it still works? Or C++, with its accelerating rate of style changes?


That's an interesting point regarding changing idioms/culture.

I can't speak to Haskell, but I believe that C++ changes can be adopted incrementally: we had C++98, then boost established itself, some parts of boost were included in TR1 and then became part of C++11. A similar process is taking place with other libraries such as asio and filesystem. I don't have the feeling that modern C++ style is in flux; its pillars continue to be RAII, generic programming and the STL.
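A tiny illustration of those three pillars working together (a generic sketch of my own; `sum_of_squares` and `demo` are made-up names, not from any codebase discussed here):

```cpp
#include <numeric>
#include <vector>

// Generic programming: works with any STL-style container of numbers.
template <typename Container>
auto sum_of_squares(const Container& xs) {
    using T = typename Container::value_type;
    // STL: std::accumulate expresses the loop as a standard algorithm.
    return std::accumulate(xs.begin(), xs.end(), T{0},
                           [](T acc, T x) { return acc + x * x; });
}

// RAII: the vector owns its storage and releases it automatically when it
// goes out of scope - no new/delete, no leak on early return or exception.
int demo() {
    std::vector<int> v{1, 2, 3};
    return sum_of_squares(v);
}
```

None of this required a new dialect: the same idioms have read essentially this way since C++98/boost, with C++11 mainly making them terser.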

I think Bjarne & co have explicitly stated that they want to grow the language while maintaining backwards compatibility as much as possible, and from my experience, the code I write today builds on patterns and idioms that started to crystallize a long time ago.


Yep, RAII was already a thing in the late '90s, for example.


That is an interesting point. I'd like to see pcwalton respond to it. That people are having problems with current Rust idioms means they might [rightly] change them. So, starting with Rust now could reinforce bad practices that come with a retraining or software re-engineering cost later. So, this is a real risk that I overlooked in prior discussions.

Good catch.


According to TIOBE, Rust is now more popular than Go. Both have corporate backing. Go is definitely used in production, and I wouldn't be surprised if people are dipping their toes in the water with Rust in production as well.

Even if it remains a small niche, I think we've seen that Rust is popular enough that using it long-term is fairly safe. Maybe you'll be one of only a few thousand shops who use it -- that's enough. The benefits of writing secure code vastly outweigh the drawbacks of using something unpopular.


TIOBE's good for trivia, but probably not for picking a specialisation path or choosing which tech to build a project on.

When choosing a language to learn, I consider:

* what a particular industry or project of significance is using

* job availability

* compatibility with my existing toolbox

* the power/fun factor

But that's far from the whole story, because programming languages are relatively fungible and investing in domain knowledge, people skills, architecture and many other things will bring bigger benefits than learning that brand new programming language!

And yet here we are, discussing how Rust will deliver us.


> * what a particular industry or project of significance is using

> * job availability

That just boils down to popularity once again. I prefer to pick languages based on suitability for the problem domain. For me, Rust is suitable in a way that C++ is not, due to memory safety among many other reasons.


I think we are mostly in agreement. If an industry is standardizing on a language, it's probably a good match for the kind of problems that they solve, i.e. for their problem domain.

Job availability is not just about number of jobs, but also about project quality, companies and personal preference. E.g: I don't have an interest in containers, and most Go jobs involve containers, so this is a minus for Go.

This works in the other direction too. Let's say I want to work for Mozilla; I notice they use mostly C++ & JS, so I decide to invest time brushing up my skills in those areas.


> programming languages are relatively fungible

I agree with that regarding learning them, but I disagree strongly when it comes to the end result.

I make a lot of my money fixing/replacing PHP projects. As has been pointed out many times before, there are issues with PHP that are guaranteed to cause bugs. The same is true of many languages.

But there is a spectrum! Some compilers/interpreters scream at you and terminate if you contradict yourself. Some of them even recognize security vulnerabilities.

I agree that people skills, domain knowledge, and architecture are hugely important, but great teams are still less effective when they use slow or confusing tools.

You might be the greatest contractor in the world, but if someone asks you to build a skyscraper with plywood and super glue, you're going to end up with something dangerous, fragile, and guaranteed to fall apart as soon as someone tries to use it.


> many other things will bring bigger benefits than learning that brand new programming language!

The other narrative is also true: using newer technologies is the best way to get onto new projects and avoid legacy-maintenance kinds of projects.


It's interesting that a synonym for "legacy maintenance kind of projects" is "successful".


"And yet here we are, discussing how Rust will deliver us."

So why are you here, debating about Rust, if you already know your time would be better spent investing in domain knowledge, people skills, or architecture?


Spending, let's say, an hour on HN is not at the same level of time investment as learning a programming language. ;)

I am not really discussing Rust per se, as I am not familiar enough with it, however, when I see biased or misleading statements I do sometimes feel obliged to offer a counterpoint.


IIRC, doesn't TIOBE have a problem with indexing Go because they only search for "Google Go" (or something restrictive like that)?


According to a post from a few years ago, they were using "Go programming", "Google Go", and "golang"[1].

You'd have a similar problem with Rust, Ruby, Python, Java, Ada, Julia, and probably tons of other languages. It must also be really difficult to differentiate between C, C#, and C++, considering most search algorithms will ignore the # and + characters.

1. https://groups.google.com/d/msg/golang-nuts/TtPzOvhG6bM/-XCr...


  >  I would be pleasantly surprised to see code written today still compile
  > in five years.
That's the goal, modulo soundness fixes. We don't currently have any plans for a 2.0.

  > what will happen to Rust if Mozilla changes strategies
Well, we're just starting the process of integrating Rust code into Firefox. Yes, Mozilla has killed off a lot of things, but I don't think Firefox is going away any time soon.


>Rust is a young language and I would be pleasantly surprised to see code written today still compile in five years.

Without changes to the code? Has there even been a language where that has been true?


Many, I'm sure. Most Java from 5 years ago will still compile on newer javac versions, for example.


Not just most, I'm betting the overwhelming majority of Java code will compile. But all popular languages break small things with every release (and this is probably a good thing in the long run), so look hard enough and you're guaranteed to find code that won't work.

For instance, here's the compatibility guide to Java 8: http://www.oracle.com/technetwork/java/javase/8-compatibilit...


C#. Yesterday I rebuilt a project (with the latest compiler version) that I created 12 years ago.


Note that C# famously broke backwards compatibility when it added generics. I say "famously" because it opted not to follow Java's type-erasure approach (which was chosen so as to explicitly maintain backwards compatibility), and history seems to agree that the minor amount of pain back then was well worth the improvements that it brought to the language (especially relative to Java).


They also changed the foreach loop variable capture semantics in C# 5.

There are a few other little breaking changes as well.


BASIC. Fortran. Pascal. Oberon. Any language designed to be simple and map well to pseudo-code. Even the source-to-source converters tend to have few problems with those. One of reasons I recommended them for certain projects in the past where longevity and talent were concerns.


There was a time not too long ago when nobody ever said "C++ remains"; what they said was "Nothing can challenge C++". It was the "business programming" language. The fact that it's now "surviving" and "remaining" is a testament to how far it's fallen, not how successful it's become. It didn't rise from 0 to "used" over the last 5 years - that's something a new language like Go can claim with pride; no, it went from "used everywhere by everyone" to "still used, somewhat". C is still more popular, at least according to TIOBE.

Java stole its crown for "enterprise" programming, and Microsoft pretty much abandoned it in favor of .NET. I'm not sure their massive push for "C++ is back!" that happened around C++11 has met with that much success - the world moved on, honestly.

There are fewer and fewer niches for it - the web couldn't care less about it, heavy-duty "enterprise" programmers aren't giving up their safe, GC-based Java/.NET languages and giant third-party ecosystems, and even video games can now be written in a higher-level language where only a small core - one that 99% of people never touch - is in C/C++.

The problem is, IMO, that it's too late for C++ to be a decent language. If its first iteration had been C++11, then sure. But there are vast code bases out there written in the horrible mid-'90s or late-'80s dialects of C++ that aren't going anywhere. Even something like Qt isn't truly "modern" C++, and don't get me started on MFC. Look at Google's official C++ guidelines - they barely allow any modern C++ into their code base.


  > Languages come and go but C++ remains. I wonder if this one will be different.
As I said the other day, unfortunately, the only way to get an old language is to start with a new one, and then let time pass.

You're not _wrong_, exactly, but with this logic, no new programming languages should ever be made. There are certainly valuable things about using a truly mature ecosystem, but we also need to build better mature ecosystems.


> but with this logic, no new programming languages should ever be made

Well I mean after LISP it was all down hill. /s


Languages come and go but COBOL remains. I wonder if C++ will be different.

I'm not trying to be a jerk. Just voicing my concerns. Learning a new language and producing code in it is a huge commitment. It better be around in 50 years. Will it still be as clean and simple, or will it have grown complex.

Note: That the C++ and COBOL counterpoints look identical and laughable never gets old for me.


I actually laughed out loud. Well played.

Taking the comparison seriously for a moment, though: in my opinion the difference is that C++ isn't just a better Cobol. It's not just the same thing re-imagined or cleaned up. There is a revolution between them. You can do things in C++ that you can't in Cobol.

I'm waiting for that revolutionary new general-purpose language that goes mainstream. I think it will be a language that scales much better and more easily to hundreds of cores. Maybe it will be a functional language.


"I actually laughed out loud. Well played."

It's a new meme of mine. Glad you had fun with it. :)

"I'm waiting for that revolutionary new general-purpose language that goes mainstream. I think it will be a language that scales much better and more easily to hundreds of cores. Maybe it will be a functional language."

I am too. I think Rust is a nice alternative to C++ for doing that sort of thing. As for the next step, I'm waiting too. I encourage you to check out Julia, because it seems to have some of those traits. It's a LISP on the inside with a productive, imperative skin on the outside. Lots of potential for such a hybrid.

Also, check out ParaSail if you're interested in languages designed for easy parallelism. It's an Ada-inspired language with some interesting design decisions. Might be worth factoring into the next, ideal language. ;)


They do come and go. 90% of a language's success is legacy apps that use it and can't switch off. The other 10% is how powerful the hype train is for getting new apps written in it, which then form that language's own 90%. Rust can actually get that done. The community has the power, and they have the hype. It's happening.


The thing is, it's quite hard to ban unsafe practices from C++ as long as people (rightly) insist on backwards compatibility. The checkable guidelines sound great if people can be persuaded to adopt them and make them stick - but people aren't necessarily using even the safe features that already exist.
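As a concrete (contrived) example of why outright banning is hard: the classic iterator-invalidation bug below is undefined behaviour, yet mainstream compilers accept it, because rejecting it would also reject decades of superficially similar legacy code. A Core Guidelines checker is the kind of tool that can flag it instead.

```cpp
#include <vector>

// UNSAFE: push_back may reallocate, leaving `it` dangling; dereferencing
// it afterwards is undefined behaviour. This still compiles cleanly today.
int dangling_read() {
    std::vector<int> v{1, 2, 3};
    auto it = v.begin();
    v.push_back(4);  // may reallocate; `it` now dangles
    return *it;      // undefined behaviour
}

// Safe variant: reserving capacity up front guarantees the next push_back
// will not reallocate, so the iterator stays valid.
int safe_read() {
    std::vector<int> v{1, 2, 3};
    v.reserve(4);
    auto it = v.begin();
    v.push_back(4);  // no reallocation: capacity suffices
    return *it;      // well-defined: returns 1
}
```

The two functions differ by one line, which is exactly why the compiler can't reject the first without sophisticated analysis - and why external checkers and guidelines carry the load instead.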

C++ isn't going to go away any time soon, but it might gradually fade into the background.


> compile time reflection (no more manual serialisation)

Are we? I've been hoping, but this wasn't on the agenda at the last C++17 meeting in Kona, Hawaii:

https://www.reddit.com/r/cpp/comments/3q4agc/c17_progress_up...


Which is quite interesting for someone like me.

Around 1994, C++ seemed the path forward from Turbo Pascal, given that I favoured type safety; C was meh compared with Turbo Pascal, but Turbo Pascal was a PC-only thing.

Back in those days C++ was regarded the way Rust and the other would-be C++ replacements are nowadays.

We were the hipsters of the '90s, with C devs targeting home systems slowly accepting that not all functions needed to be inline assembly wrappers.

So it is interesting for a greybeard like me to see C++ being described today just as C and Pascal were in the Pascal-vs-C++ compiler debates back in the mid-'90s.

Nowadays my area of work is dominated by JVM, .NET and the native languages of mobile OSes.

I wish Bjarne et al success in pushing the "C++ Core Guidelines" forward, but they will not change the mentality of those who program C with a C++ compiler, which is what I usually see at typical corporations.


"Learning a new language and producing code in it is a huge commitment. It better be around in 10 years."

10 years is a long time; a few weeks to learn a language is not.



