
Funny.

But the problem here is complex and hard to solve. Cycles and memory leaks are nigh impossible to prevent completely without sacrificing either the ability to create custom data structures or the no-GC-by-default design, and giving up either was a non-starter for Rust.
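As a concrete illustration of why, here's a minimal sketch (using today's std::rc::Rc and std::cell::RefCell; the names and structure are mine, not from the discussion) of a reference cycle that leaks memory in entirely safe Rust:

    use std::cell::RefCell;
    use std::rc::Rc;

    // A node that can optionally point at another node, so two nodes
    // can end up pointing at each other.
    struct Node {
        next: RefCell<Option<Rc<Node>>>,
    }

    fn main() {
        let a = Rc::new(Node { next: RefCell::new(None) });
        let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });

        // Close the cycle: a -> b -> a. Each Rc now has a strong count
        // of 2, so neither count ever reaches zero and both nodes are
        // leaked, without a single `unsafe` block anywhere.
        *a.next.borrow_mut() = Some(Rc::clone(&b));

        println!("a: {}, b: {}", Rc::strong_count(&a), Rc::strong_count(&b));
    }

Ruling that out statically would mean either forbidding this style of custom data structure or requiring tracing garbage collection by default, which is the trade-off being pointed at here.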




Well, I'm still not quite proficient with Rust, so maybe my feelings are misguided, but the whole situation seems really terrifying to me. Too terrifying, I think, to just note "let's not do that again" and calmly move on to 1.0.

After all, why Rust? Because we're all tired of the problems with C++ and the like. And the point isn't really to make a language that is safe "as often as possible"; that would actually be a pretty terrible goal. The point, I believe, is to make it nearly impossible to write unsafe code accidentally, without even noticing it.

And again, maybe it's just me, but what I took away from that post is totally non-intuitive to me. I don't really understand what the guidelines are, or how many more gotchas like this there might be. "Don't mess with unsafe code" is not a very good guideline, because it turns out it can be far from obvious that a given piece of code is unsafe.


Please see my comment below (https://news.ycombinator.com/item?id=9447938) -- a key point is that, as always, you have to write `unsafe` to get memory unsafety in Rust, period. This is Rust's core guarantee. The issue being reported here is a safety problem caused by a particular use of `unsafe` blocks. The debate is largely about what unsafe blocks should or shouldn't be able to assume.
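To make that concrete, a tiny sketch of my own (not from the post): the memory-unsafe operation simply doesn't compile until you opt in with `unsafe`:

    fn main() {
        let x = 42u32;
        let p = &x as *const u32;

        // Rejected outside an unsafe block:
        // error: dereference of raw pointer is unsafe and requires
        //        unsafe function or block
        // println!("{}", *p);

        // Inside `unsafe`, the compiler accepts the dereference and the
        // author takes on responsibility for the pointer being valid.
        unsafe {
            println!("{}", *p);
        }
    }

The bug under discussion fits that framing: safe code on its own stays safe, but it was able to violate an assumption that some `unsafe` code relied on, which is why the debate is about what such blocks may assume.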


Well, the thing is, until Rust is formally verified, as in a subset of Rust proven safe in Coq, you don't have any guarantees.

I assume most of it is safe, with a few caveats probably hidden in the more complex logic. In C++, writing code that triggers this kind of behavior is trivial; here you need to jump through a lot of hoops. So it offers comparatively more safety, but bugs like this are a potential gold mine for crackers.

Anyway, the core developers are already on the scene deciding how to tackle this issue.


Oh, sure, but let's be reasonable. If it's something like the notorious sort problem in Java, I'd say it's OK. It's preferable when things are simple enough that somebody with a good understanding of the whole system can reason about its safety without Coq. If things are complicated enough that we actually need Agda/Coq/Idris to feel safe, then things went really wrong at some point.

> but bugs like this are a potential gold mine for crackers

Exactly! And I'm just saying that a problem which is "not very likely to trigger, but very hard to find when it does" is far, far more dangerous than a problem which is "easy for a novice to miss, but one every experienced developer would notice".


> If things are complicated enough that we actually need Agda/Coq/Idris to feel safe, then things went really wrong at some point.

I disagree. Complex software/hardware is by its nature full of bugs. What Rust set out to do is complex and it's bound to have bugs, but that's no reason to abandon it, any more than we should discard the LHC or ITER because they had issues starting up.

Whether it's a wrong invariant in TimSort, a really obscure multi-threading case, or a weird glitch in a time library, it's still an error. Each one is exploitable.

ALL software could use more coverage/tests and theorem proving. I just wish Rust had started with a proven-safe subset.

> Exactly! And I'm just saying that a problem which is "not very likely to trigger, but very hard to find when it does" is far, far more dangerous than a problem which is "easy for a novice to miss, but one every experienced developer would notice".

Again, I disagree. Security is about making it less likely that an assailant can breach your software. If you take two comparable pieces of code, the one with more holes will take less effort to break.

If your language offers fewer avenues of exploit, it makes those exploits more expensive. A rare bug is worth more than a run-of-the-mill bug, which limits the number of assailants who can afford to purchase it, which in turn reduces the number of potential attackers.



