Hacker News | rumanator's comments

> I don't see a future where C survives

Meanwhile, C has been going strong since the 70s.

> the lack of package manager

What do you call Linux distros' package managers, then? I mean, in distributions like Debian you can even download a package's source code with apt-get.


>What do you call Linux distros' package managers, then?

If you want to count them as package managers, they're by far the worst ones of all the well known languages (with some notable exceptions e.g. guix's and nixos's).

They're not portable between distributions or even different versions of the same distribution (!), since it's non-trivial to install older versions of libraries (or, hell, different versions of the same library at the same time). Not to mention that it's a very manual and tedious process in comparison to all the other language-specific package managers. 'Dependency hell' is a problem virtually limited to distro package managers (and languages like C and C++ that depend on them).

Getting older, unmaintained C programs to run on Linux is an incredibly frustrating experience and I think a perfect demonstration of how the current distro package manager approach is wholly insufficient.


> If you want to count them as package managers, they're by far the worst ones of all the well known languages (with some notable exceptions e.g. guix's and nixos's).

They have the only feature I care about: cross-language dependency management.

Unless you are suggesting we reimplement everything in each language and then make users install ten different XML parsers, SSL implementations, etc., just because of not-implemented-in-my-favorite-language syndrome.


Only on platforms where UNIX is the name of the game.


C has been "losing ground" not because of the random pet peeves of those who never wrote a line of code in C, but because since C's last standard update other programming languages have appeared that offer developers something of value, so that the trade-off between using C or an alternative starts to make technical sense.

It also helps that C's standardization proceeds in ways that feel somewhat between sabotage and utter neglect.

Meanwhile, C is still the absolute best binary interop language devised by mankind.


> C has been "losing ground" not because of random pet peeves of those who never wrote a line of code in C

This is not a random pet peeve, and WalterBright is as far as you can get from someone "who never wrote a line of code in C". This is the cause of numerous security bugs in the past and currently, and the reason most C material written in the 70s/80s is unsafe to be used today (mostly due to usage of strlen/etc vs strnlen/etc).


Frankly I never would have made the proposal if I didn't love C. I've made proposals to add D features to C++, too.


A question: since your company also makes a C/C++ compiler (and the repo has very :), have you considered adding this to it as an experimental feature, perhaps to demonstrate its usefulness to other developers and standards bodies? (Although, now that I think of it, D itself might serve the same purpose.)


I don't see much point in it. I've proposed this change to C in front of several audiences, and it never received any traction. If you want to experiment with it, you can use DasBetterC, i.e. running the D compiler with the `-betterC` switch, which enables programs to be built requiring only the C Standard Library.

Fair warning - once you get accustomed to DasBetterC, you're not likely to want to go back to C :-)


> If you want to experiment with it, you can use DasBetterC, i.e. running the D compiler with the `-betterC`

I've been meaning to experiment with DasBetterC for a while, and I have a C project I've been wanting to migrate to something with proper strings (it's a converter for some binary file formats, but now I want it to import some obscure text formats too). Maybe that's the push I needed :)

After 20 minutes and about 250 out of 2098 lines converted, I must say the error messages are very good and give very nice hints about what to change; I prefer them to Rust's verbose messages.


Great!

DasBetterC's trial-by-fire was when I used it to convert DMD's backend from C to D.

I'm sure you already know this, but the trick to translating is to resist the urge to refactor and fix bugs while you're at it. Convert files one at a time, and after each run the test suite.

Only after it's all converted and passing the test suite can refactoring and bug fixing be considered.


I don't get why it hasn't gotten traction. When I read it, it was immediately obvious to me that this would be extremely helpful. I want it yesterday, and so should everyone.


Pro tip: Google the name of the person before responding to them, it can help avoid the taste of foot in your mouth which you are currently experiencing.


> Google the name of the person before responding to them

Is it a rule at HN that you can't take someone else's name? Otherwise, there's no guarantee that you're talking to the "Real" Walter Bright...

... or that you're talking to that Walter Bright, come to think of it.


There can be only one.


But The One doesn't get the username.


I’m new here, so this seems like a valid criticism to me — but judging by the number of downvotes, it may not be. Can someone explain why this comment is incorrect?


Perhaps because so many of us know Walter from his work and his history here on HN? Sometimes you have to just trust that someone is who we all say they are.


Cool, thanks for the explanation.


It's a fair argument, but I also checked: the karma is high, so it looks like a legitimate account in this case.


[flagged]


There's "arguments of authority" and then there's "accusing Walter Bright of having never written a line of code before".


AKA: Conflating authority with expertise


What argument from authority is being made by anyone?

The GP decided, out of the blue, to accuse the author of never having written a line of C code in his life. That's kind of inappropriate in any context, IMO, but just downright laughable when the author is well-known for singlehandedly writing several compilers and a whole new language.


Anyone that has to rely on their name for an argument isn't worth listening to.


You misunderstand what the conversation is that’s occurring. The parent implied the person had never written C.


He never said it explicitly; he was just making a general statement. Not that it matters whether he did or didn't: there's a lot wrong with C, and it will most likely eventually disappear, but not for the reasons outlined in this article. That's what he was saying.


Well, he dismissed Bright’s argument as a random pet peeve from people who haven’t written a line of code in C before, so yes, I do think he said it explicitly.


> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

This is one of HN's comment guidelines. If you're not sure that someone is who you think they are, you can just ask, e.g.: "Hey, are you Walter Bright who did X and Y?"


? I’m confused how your comment relates to mine. Did you post on the wrong thread?


Someone once said COBOL would disappear.

C will still be used long after you and I and everyone here have returned to dust.


Try becoming a COBOL developer and see how that works for you. Likening C to COBOL isn't doing it any favors.


What's the implication here? I only know one COBOL developer but they seem to be doing quite well for themselves, making over $400k a year for something like 15 hours of work a week.


COBOL developers commanding a high salary is directly related to it not being a thriving language.


> C will still be used long after you and I and everyone here have returned to dust.

There are also people still riding horses. That does not make it relevant in any way.


> Meanwhile, C is still the absolute best binary interop language devised by mankind.

You're mistaking the “C” ABI with the C language. The so-called C ABI should actually be called the UNIX-derived ABI, as (i) C doesn't define an ABI and (ii) C can perfectly produce binaries using another ABI (such as e.g. the “Pascal” one, common on the DOS platform).


Maybe people are voting this down because they think it's directed at Walter Bright in particular, but I think there is actually some truth in the harsh comment.

Nothing about Walter Bright in this statement, but some of the harshest criticisms from others I have seen of C are not from expert practitioners in C.

People who are experts and also critics seem to have a more practical, realistic, nuanced critique: one that understands history and the challenges to adoption, and admits that C's long history, and the difficulty of replacing it, aren't exactly for no reason.


That's the way I interpreted it, because it's true. A lot of the criticisms are misdirected ones, made by people who haven't used C except when forced to for a few assignments in school, C++ jockeys who think C is the 30-year-out-of-date version of C that's supported by C++, and people who haven't used it at all for anything real.

I also agree that what the standard committee has been doing for the last 20 years amounts to willful sabotage.


So what are the improvements between C89 and C18 in regards to UB and security, for any ISO C compliant compiler?


Between C89 and C18 is close to 30 years.

What about between C99 and C18? Is there anything you can think of? I think the _s() functions, advertised as security features, are a weak effort. Anything else come to mind?


Nothing, really; if anything, VLAs have proven such a mistake that Google led an effort to remove all instances of VLA use from the Linux kernel.

Also, the number of UB descriptions has only increased, and is now well over 200.

Annex K was badly managed, a weak effort as you say, given that pointer and size were still handled separately, and in the end instead of coming up with a better design with struct based handles, like sds, everything was dropped.

ISO C drafts are freely available, I recommend everyone that thinks that they know C out of some book, or have only read K&R book, to actually read them.


> some of the harshest criticisms from others I have seen of C are not from expert practitioners in C.

But were they expert practitioners of C in the past? My experience is that most of the harshest criticisms of C come from former C experts who moved on to other languages because it became clear to them that C would never be fixed - Walter Bright included.


I also have extensive (20 years) experience with the solution I proposed.


Yes I know, and for clarity I appreciate your work and insight, and frequently enjoy your comments here.

My point was that people were mistaking the comment for an attack on you, which I don't think was necessarily intended, nor needs to be, for it to be a valid point about a different set of critics.


Lol at “never wrote a line of code in C”. Surely you are not addressing the article’s author?


> If they reduce price to $10/m they'll need 5x customers to reach same revenue

This assertion could only start to make any sense if you believe that the law of supply and demand doesn't exist, and that economies of scale don't apply.

Meanwhile, I am a paying customer of cloud providers, even though I don't use them, just because they are affordable. This service expects that people like me spend 600€/year when I can have the same exact service somewhere else for 60€/year.


Well, the assertion is just a mathematical fact; the disagreement is over the demand curve.

Maybe the company isn't targeting anyone who is a paying customer of cloud providers, and is instead targeting full-time cloud architects who need more than the other products' offerings.


> Well the assertion is just a mathematical fact,

It really isn't. You're somehow assuming that the law of supply and demand doesn't exist. As the premise is blatantly wrong, this mistake renders all subsequent assertions moot.

> Maybe the company isn’t targeting anyone

Irrelevant. The initial assertion was that higher prices somehow have no impact on demand and thus revenue would be proportional. This assertion is blatantly wrong. The fact is that lower prices increase demand, and increased demand enables economies of scale, which lead to higher revenue. Conversely, higher prices lower demand, and so on. Market segmentation is tangential to this point.


> It really isn't.

The assertion was "$10/m they'll need 5x customers to reach same revenue"

Revenue = Price × Quantity Sold (i.e. customers, in this context).

At $50/m, x customers bring in revenue y = $50 · x.

At $10/m (one fifth the price), reaching the same revenue requires x' customers where $10 · x' = y = $50 · x, i.e. x' = 5x.

OP stated a fact, and you said it was incorrect. I'm just trying to say the disagreement is actually about how elastic demand is (i.e. whether a change of x% in price results in a y% change in the number of customers).


What a weird comment. I mean, it adds nothing to the discussion and intentionally sidesteps the core of any service: its business model.

If you refuse to even discuss the pricing model of a service, why bother wasting your time posting anything at all?


I think it adds a fair bit to the discussion. The business model must be aimed at larger cloud users whose savings will outweigh $50/mo.


Those cloud users can have the same benefit by spending a fraction of the price. That's the point. The question you're trying to ask is whether the added value being promised justifies a price tag that's 10x higher than the ones offered by established services.


> For EC2 instances and the like it will be almost all X86 for a long time.

I wouldn't bet on it being so lopsided. AWS is betting strongly on its ARM processors, which appear to have a better price/performance ratio.


You're focusing your argument on a specific person while ignoring the point about how a global increase in liquidity triggers global inflation.


Maybe; I'm no economist, but wouldn't the psychological effects of cash windfalls be more applicable? I doubt these amounts would have much effect in the scheme of the global markets.


> These concepts were developed for the web. Not APIs on the web, but HTTP itself.

This assertion is just plain wrong. REST is an architectural style focused on providing APIs for web services, and Roy Fielding was quite vocal in making it extremely clear that there is nothing in REST that is HTTP-specific, let alone tied to HTTP. REST is an architectural style that relies on the concept of resources, which should be linkable. That's it.


There is nothing HTTP- (or HTML-) specific in REST, but REST itself is specific to the domain where HTTP and HTML are used.


What domain problem is that? Communicating through a network?


it is the domain of CDN-like servers


> Most of the time, those “things” are dumb business logic applications that can’t explore the hypermedia to discover new functionality and furthermore are not designed to do anything with it anyway.

Dumb business logic is not the point. One of the benefits of HATEOAS is that it allows REST clients to be loosely coupled with REST services, in a way that APIs can freely change (i.e., move endpoints around, break functionality out into other services, etc.) without requiring clients to be updated under penalty of breaking compatibility.

The main reason why no one does REST and everyone does RPC-over-HTTP is that while REST clients require tracking and updating state to support service discovery, RPC-over-HTTP just requires a single HTTP request to achieve the same purpose, albeit without the resilience and future-proofing.


> One of the benefits of HATEOAS is that it allows REST clients to be loosely coupled with REST services, in a way that APIs can freely change

My point is that even though they're loosely coupled, the API actually cannot change freely because the actual consumer of the API, the business logic, is still tied to the API through the client. If your API changes and the client/browser is still able to traverse it fine, but your business logic breaks, does that actually mean that the API is free to change? I don't believe so.


> My point is that even though they're loosely coupled, the API actually cannot change freely because the actual consumer of the API, the business logic, is still tied to the API through the client.

The main promise of REST is that following that particular architectural style does indeed allow the API to freely change without breaking backwards compatibility.

The main drawback of REST is that no one actually follows those principles.

With REST, the client is not tied to an API. The client seeks resources, and relies on content discovery processes to determine how to access those resources. The only endpoint that may be hard coded is a home/root endpoint, and all other resources are accessible through it by following hypermedia. With REST you care about the what, not the where.
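A minimal sketch of what that discovery could look like, assuming a HAL-style JSON response from the root endpoint (all link relation names here are hypothetical):

```json
{
  "_links": {
    "self":   { "href": "/" },
    "orders": { "href": "/orders" },
    "search": { "href": "/orders{?status}", "templated": true }
  }
}
```

The client hard-codes only the root URL and the relation names ("orders", "search"); the server remains free to move the underlying URIs around without breaking it.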

> If your API changes and the client/browser is still able to traverse it fine, but your business logic breaks

This is where you're getting it wrong. REST is all about the interface. The interface has nothing to do with the business logic. If you have an interface that allows you to access the same data, but for some reason your business logic breaks due to non-operational reasons (i.e. extra latency from the content discovery process), you need to have a talk with whoever screwed that up.


> Dumb business logic is not the point. One of the benefits of HATEOAS is that it allows REST clients to be loosely coupled with REST services, in a way that APIs can change freely change (i.e., change endpoints around, break away functionalities into other services, etc.) without requiring clients to be updated under the penalty of breaking compatibility.

There is a very limited number of useful things that you can do here without breaking "conforming clients", at the cost of forcing those clients to implement much more expensive and error-prone logic in order not to break.

In reality, clients aren't going to conform, they're going to hardcode logic and they're going to break if you do any of the things that "shouldn't break conforming clients".

This means that in practice, on top of being more annoying to use and implement, REST APIs are more likely to break than RPC-APIs, which you can trivially extend with optional parameters or additional procedures.


That is true, but I would argue that typical applications are hard to implement in a RESTful way, since there is always some kind of understanding necessary. A human could understand that "inbox" was renamed to "incoming mail" in an email application. But what should some kind of RESTful client without understanding the semantics do?


If a random guy online uses an account created a couple of days ago to say he has knowledge of something without providing any proof or substance then everyone should just blindly trust him and take his word for it.

Right?


> Electron exists to poach web developers into app development, because native code and frameworks are hard.

I don't agree with the "poaching" statement, but I would argue that "frameworks are hard" should be replaced with "desktop GUI frameworks are appallingly poor".

Webview-based rendering frameworks trade away native look-and-feel for a myriad of tools and processes and techniques and workflows and expertise that you simply do not get with plain old widget frameworks. GUI developers know this, and in particular GUI framework vendors are well aware of this. In fact, check XAML or QML.

