D at 20: Hits and Misses [pdf] (digitalmars.com)
123 points by noch on Oct 18, 2019 | 53 comments



I think the presentation exposes the big picture: D is a very loaded language. It tries to serve every possible exotic need imaginable. Probably not as much as C++, but that's a committee-designed language, so it doesn't count.

The feature bloat makes it very hard to learn D, let alone become an expert in it. I think it's telling that its creator focuses on individual features to assess its success. Now, all those missed features either have to be maintained or carefully removed, which is a lot of work and might actually break existing code.

I like D because of its vision: C++ for humans. It fails at that vision simply because it has become a gigantic, incoherent product, like a research project.

It probably needed a harsher approach to feature requests to make sure the language stayed true to that vision. Now it's a programmer's dream and a programmer's nightmare at the same time.

I think Go, for instance, has made a better decision to stay barebones.

I see that the creator intends to include Rust's borrow-checking semantics, which reminds me that Rust is probably what D should have been in all aspects.


> The feature bloat makes it very hard to learn D

Not at all. For example, if you know C-style programming, you can be productive in D with a pretty trivial investment in time.

> Rust is probably what D should have been in all aspects

While D will likely get some sort of borrow checking, it will not adopt the look and feel of Rust.


Please define "productive", doing what, to what end, with whom, at what level of complexity? Your clear thoughts here will be illuminating and worthwhile.


Productive means simply that you're getting useful work done at an acceptable cost to you.


Ok, you're claiming productive in /all/ circumstances? Going to have to disagree there. Also with the tone of that response, but perhaps you didn't mean it.

Maybe giving those circumstances a bit more thought would be instructive and useful to you. Maybe not. Good luck!


> The feature bloat makes it very hard to learn D, let alone become an expert in it.

Learning 100% of any programming language should not be a goal. I know less than 10% of the features of my microwave oven or my washing machine. That doesn't prevent me from using them successfully and saving a lot of time.

Bjarne Stroustrup himself says learning all the nitty-gritty details of C++ is counterproductive and unnecessary.


The problem with feature bloat is related to a very well-known fact: code is read more often than it is written. As a consequence, if you don't know the little details of a language, it might be harder to read code written by someone else who is more knowledgeable.

Consider, for instance, Lua, which is often read by people who have only superficial knowledge of the language. Lua has a very standard syntax that anyone with basic general programming knowledge will understand:

> player:set_pos({x=0, y=0})

But it also has syntactic sugar:

> player:set_pos{x=0, y=0} -- note the braces instead of parens

This is where you lose half of your readers because they have to look for that weird syntax in the manual.

In defense of language implementers, it is sometimes difficult to "resist" feature requests. It's difficult to say "no" in general, in particular when you are not Niklaus Wirth with half a dozen language designs on your resumé; all you have is your gut feeling as justification.


I have 15 years experience as a professional C++ programmer in various teams and companies (and although I don't know 100% of C++17, I consider myself productive!).

When diving into new codebases, the couple of esoteric features you did not know existed in C++ was never a big deal. Learning the required C++ features was always a matter of days (because the codebase is full of real examples).

Understanding the overall structure of the code, the interactions/dependencies between the components, what file/directory your first feature should go in... this is the hard part. And it's many orders of magnitude harder than learning a couple of new language features.

Don't get me wrong, I'd really like C++ and D to be smaller, and I don't like feature bloat. I once wrote a source code processor based on Clang, and that's when you realize how big the language really is (there are many AST node types), and having all your code written in it starts to feel like technical debt.

However, I pragmatically have to recognize that feature bloat isn't a real problem for professional C++ programmers. (It might be for compiler vendors - at this point, I'm not expecting any new C++ compiler to be created.)

PS1: C++ certainly gives you some shiny tools to make interesting messes (I've seen codebases where simply including a header would change the program behavior). This isn't related to the feature count, but rather to the semantics of the individual features themselves, like macros/templates. At least we don't have monkey patching ...

PS2: About your Lua example: even without knowing the syntax, can you sincerely have the slightest doubt about what "player:set_pos{x=0, y=0}" actually does?


Weird, because I agree with the general sentiment, yet with C++ I feel you have to learn most of the language if you want to be productive with it...


To put it bluntly: there's already one too many languages that make one bend into a pretzel to satisfy a tool.

If D tried to humanize C++, which is not a given, Rust tried and succeeded in dehumanizing it into something that can be analyzed by a simple tool.


If the overhead of automatic GC doesn't fit your project, you only have two choices:

a) Keep all the ownership and lifetime information in your head and make sure it gets communicated to other team members. Then write tests to make sure nothing was missed.

b) Formally specify ownership and lifetime so it can be checked by the compiler.

I'm not sure (a) is any less dehumanizing than (b).


Between A and B there are quite a few type-system tricks and tools one can use that bring many benefits without incurring the costs of fully verified memory ownership.

I stand by my original statement though:

a) in a language like C or C++, one has to use their knowledge to come up with a solution and verify it. Pretty human.

b) in a GC language, one mostly does not care. When they do, they do a).

c) in Rust one does their best at a) and then does what the robot tells them to, even if the code is correct but the robot can't yet figure it out.


For Rust I am still looking forward to not having to scatter Rc<RefCell<>> everywhere when using Gtk-rs and similar UI toolkits.


Not too familiar with the inner workings of gtk-rs, but: would an ECS approach work there? Or, e.g., a generational slotmap, so you could pass around copyable integer keys as much as you want.


And then eventually get use-after-free access errors, unless I use vector clocks to keep track of index usage, or reach for a third-party library that does it for me.


> The feature bloat makes it very hard to learn D

I suppose you aren't using any maths either, because of the existence of semisimple Lie algebras and topological quantum fields?


Yep. Every team project in maths where someone is completely free to use "semisimple Lie algebras and topological quantum fields" has to count me out.

If a language feature exists and you program in that language, you have to know how to deal with that feature; even if you don't write code using it, you have to be able to refactor and rewrite code that does.

Unless we're the kind of person who never works in teams, or we just hate our teammates and care little for project success.

The point the parent makes is a legitimate one. It can be argued against, but your response hasn't achieved that in a meaningful way.


Looks like these are slides from a talk Walter Bright gave, which has been posted to YouTube [1]. It helps fill in some missing context.

For example, I wondered why in the slides he felt implementing contract programming in D was a miss; it seemed like a strong selling point for the language. According to the talk (around 1:34:30), however, he felt that contract programming was relatively unpopular/unused in D and that "assert" covered most of the use cases for it.

[1] https://www.youtube.com/watch?v=p22MM1wc7xQ


Contracts are interesting, but I think to make them useful they need to 1) be ON in release builds! and 2) become part of the docs.

One example:

     fun date(y:i32, m:i32, d:i32)
     {... code ...}
This screams for more context (and better param names!). With contracts:

     fun date(y:i32, m:i32, d:i32)
     assert
         y in 0..9999
         m in 1..12
         d in 1..31
     {... code ...}
But I need to see it in the docs, in the tooltips, etc. Now they're useful!
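
For reference, a minimal sketch of the same idea in D's actual expression-based contract syntax (the Date struct and the messages are made up for illustration). The catch, and part of the complaint above, is that D strips these `in` contracts when you compile with -release:

    struct Date { int y, m, d; }

    Date makeDate(int y, int m, int d)
    in (y >= 0 && y <= 9999, "year out of range")
    in (m >= 1 && m <= 12, "month out of range")
    in (d >= 1 && d <= 31, "day out of range")
    {
        return Date(y, m, d);
    }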


Good old February 31.


Nothing like a bug behind a safety feature :)


That’s … not a bug.

A buggy assertion would be one that fires when the input is correct.

Assertions aren’t formal verifiers or replacements for a test suite.



And year 0.


I think that is why making the contract visible in the docs/IntelliSense or similar is valuable: you will understand more about the expectations of the code, even if they are incomplete or partial...


> (and better param names!)

And/or better param (sub-)types.


I think it's because most of the time an assert can offer just about the same guarantees.


That's right.


I use binary literals all the time, for what it's worth, in code involving hardware interfacing. My only complaint with C++14's implementation is that they didn't allow underscores to be inserted for clarity, à la Verilog.

Those underscores turn out to be a really big deal, and leaving them out of C++ was a massive mistake. Nobody wants to read constants like 00110100101101011011011101011110, but 00110100_10110101_10110111_01011110 is fine. An implementation of binary constants that doesn't allow this is going to be less popular.


You can use ' as the digit separator, like:

    0b00110100'10110101'10110111'01011110
It's not ideal, but using underscores was off the table, because things like _100_ are actually valid variable names and it would have caused parsing issues.


_100_ is an identifier in D, too. The simple rule is _ can be embedded in a numeric literal, it just cannot start one.
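
A small illustration of that rule (the constant names are just for the example):

    // Underscores may appear inside a numeric literal for readability...
    enum mask  = 0b00110100_10110101_10110111_01011110;
    enum price = 1_000_000;
    // ...but a leading underscore makes it an identifier, not a literal:
    // enum bad = _100;   // refers to a symbol named _100, if one exists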


Actually I got this wrong - the ambiguity occurs with C++11 user-defined literals, not with variable names. See http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n274...


That's cool, I didn't know they supported that. Good enough!

Of course, it's not as if '0' wouldn't be ambiguous, so I don't really buy the underscore-as-variable-name argument. C++ left context-free grammar behind a long time ago...


Would you use a binary literal for a single bit set in a mask? I'd think a `1 << n` would be clearer in that case.


My favorite use for them is when you've got to set a multi-bit field in a mask or register. If the hardware says bits 25-23 have to be 110 and bits 13-10 have to be 0101,

    reg = (0b110  << 23) |
          (0b0101 << 10);
is by far the clearest way I know to write that. And since hardware descriptions usually (but not always) give descriptions in binary rather than decimal, it's the easiest form to check against the datasheet or manual.


Interesting that garbage collection is listed as a miss due to latency, while Go has gotten popular with a garbage collector that emphasizes low latency.


The big issue is a lack of technical resources to improve the GC's performance in D's runtime.

Some improvements have been made lately, but not with the level of manpower and CS background that other GC-enabled languages enjoy.

I am quite confident that if D had something like Microsoft behind it, its GC would long since have been as good as Sing#'s or System C#'s (M#) were.


> The big issue is a lack of technical resources to improve the GC's performance in D's runtime.

The underlying issue is that making the GC more performant requires the addition of "write gates", where writing through a pointer notifies the GC that the memory object has changed. This is a reasonable choice when a language is heavily GC dependent (like Java).

However, D is not heavily GC dependent at all, but it is very dependent on memory references. Hence the GC speedup from adding write gates would be more than lost to the cost of executing those write gates.
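
To make that cost concrete, here is a purely conceptual sketch (the helper name is hypothetical, not D's actual runtime API). With write gates, every store through a pointer would be lowered to a notification to the GC plus the store itself:

    struct Node { int value; Node* next; }

    // Roughly what the compiler would have to emit for `a.next = b`:
    void storeNext(Node* a, Node* b)
    {
        // gcRecordWrite(a);  // hypothetical call so the GC can track the mutated object
        a.next = b;           // the actual pointer store
    }

Pointer-heavy D code would pay that extra call on every such write, whether or not the GC ever benefits from it.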


Go's GC makes certain compromises that affect the language as a whole; this is not reasonable for D, which is frequently used without the GC.

The situation could definitely be improved by a lot; D's GC is just plain bad right now. But looking at other languages' GCs may give a misleading impression, because those GCs' designs are not suitable for D.


Will the "misses" be removed from D eventually?


Many of them have been already.


A tree needs pruning now and then :-)


I hope that doesn't mean removing the GC in favour of borrow checking or manual memory management.


The GC, although often maligned, is highly useful and will stay.


What I find interesting about D is that it is a really good fit for scientific programming. Maybe Julia is an answer, but in terms of producing interoperable code, D is the holy grail. And people can port their Java code mostly painlessly. Science is the killer app of D, where the GC is a blessing in disguise.


Some of the slides are contradictory: lack of safety by default is listed as a miss, and so is the focus on GC. Well, if you want safety by default, then either you use a GC or you're a Rust/Cyclone clone; I don't see a third way...


The third way is not throwing out the baby with the bathwater.

The path pursued in the past by Mesa/Cedar, Modula-3, and Sing#, or more recently System C#.

Provide automatic memory management by default, while offering tooling for stack and global allocation, with low-level stuff and untracked references requiring explicitly unsafe code.

For more modern approaches that combine those ideas with Cyclone's, see Chapel, Swift, Ada/SPARK, and the ongoing Haskell and OCaml support for linear types.

So you get the productivity of automatic memory management, graphs, self-referential structures and whatnot, while being able to go low level for more fine-grained resource management.
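
D already has the mechanics for part of this in its @safe/@system split (though, per the slides, @safe is not the default). A minimal sketch (function names made up) of GC-backed safe code next to an explicitly unsafe, manually managed escape hatch:

    import core.stdc.stdlib : malloc, free;

    @safe int[] squares(int n)
    {
        auto a = new int[](n);   // GC-managed: no ownership bookkeeping needed
        foreach (i; 0 .. n)
            a[i] = i * i;
        return a;                // escapes freely; the GC handles the lifetime
    }

    @system int* rawBuffer(size_t n)  // untracked memory requires opting out of @safe
    {
        return cast(int*) malloc(n * int.sizeof);
    }

    void main()
    {
        auto s = squares(10);    // the productive default
        auto p = rawBuffer(16);  // explicit opt-in to low-level resource management
        scope (exit) free(p);
    }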


There is a third way. We're working on it.


D is 20? That's.......telling.


Yes. We are persistent.


That is, if we count from the development of D1, which appeared in 1999. Later, in 2007, D2 emerged with breaking changes. D1 was discontinued in 2011 and D2 became simply D. So yes, formally D is 20 years old, but one could say that its newer, reworked version is just 12.


I started on D 20 years ago. It was a couple of years before I even managed to make a prototype that people could play with. 20 years ago it was a few notes on a piece of paper :-)


I guess they never miss huh



