
How would hygienic macros fix this? It seems like the macros were working fine, but the amount of generated code increased compile time significantly. Wouldn’t a more sophisticated macro system still generate a bunch of extra code and result in slow compile times? It even seems like the solution was to fall back to “dumb” macros when feasible for compile-time performance reasons.



> Wouldn’t a more sophisticated macro system still generate a bunch of extra code and result in slow compile times?

Not necessarily.

The preprocessor here takes the original source and blows up the initial code (which is about "min3(long_a, long_b, long_c)") to 47 MB of code. No fancy stuff, just 47 MB of C code on disk. That's a lot of code the compiler then has to parse and handle.

If hygienic macros are a first-class citizen in the compiler, the compiler parses the macro definition once and then just expands it in memory. There is no reason to write 47 MB of code to disk and read it back; the expansion would simply happen as an AST modification in memory.

But that is the smaller benefit. First-class macros allow the compiler to reason about the types and structure of the macro inputs. You don't have to guess whether something is constant, bounded, or unbounded; strong types can enforce this safely, and both the macros and the optimizer can rely on those guarantees. Sufficiently strong type information opens the door to far more powerful optimizations overall.

Just for the record - I'm fully aware why the kernel is where it is, and why it will stay there, but compiler and language theory has advanced far beyond that point.





