In my experience (Clojure), macro-heavy libraries tend to be powerful but brittle. A Lisp programmer can do things in a library that non-Lisp programmers can't realistically attempt (Typed Clojure, for example). But the trade-offs are very real, though hard to put a finger on.
There are several Clojure libraries that are cool and completely reliant on macros to be implemented. Eventually, good Clojure programmers seem to stop using them because the trade-offs are worse error messages or uncomfortable edge-case behaviours. The base language is just so good that it doesn't really matter, but I've had several experiences where the theoretically-great library had to be abandoned because of how it interacts with debugging tools and techniques.
It isn't the macros' fault; they just hint that a programmer is about to do something that, when it fails, fails in a way that is designed into the system and can't be easily worked around. Macros are basically for creating new syntax constructs - great, but when the syntax has bugs, the programmer has a problem. And the community tooling probably won't understand it.
In Racket you can define simple macros with define-syntax-rule, but in case of an error you get a horrible, unintelligible message that shows part of the expanded code.
But the recommendation is to use syntax-parse, which is almost a DSL for writing macros: for example, you can specify that a part must be an identifier like x rather than an arbitrary expression like (+ x 1). It takes more time to use syntax-parse, but when someone misuses the macro they get a nice error at expansion time.
(And syntax-parse is itself implemented in Racket using functions and macros. A big part of Racket's features are implemented in Racket itself.)
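To make the contrast concrete, here is a minimal sketch of the same one-binding macro written both ways. The names my-let1/my-let2 are hypothetical; define-syntax-parse-rule comes from syntax/parse/define in recent Racket versions (older versions call it define-simple-macro):

```racket
#lang racket
(require syntax/parse/define)

;; define-syntax-rule version: no input checking. Misuse like
;; (my-let1 5 5 (+ x 1)) expands first and then fails inside the
;; generated (lambda (5) ...), with an error about the expansion.
(define-syntax-rule (my-let1 var rhs body)
  ((lambda (var) body) rhs))

;; syntax-parse version: var:id declares that var must be an
;; identifier, so (my-let2 5 5 (+ x 1)) is rejected at expansion
;; time with an "expected identifier" error pointing at the 5.
(define-syntax-parse-rule (my-let2 var:id rhs:expr body:expr)
  ((lambda (var) body) rhs))

(my-let2 x 5 (+ x 1))  ; => 6
```

The extra annotations are exactly the "almost a DSL" part: they cost a little at definition time and pay off in every error message the macro's users see.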
> Eventually, good Clojure programmers seem to stop using them because the trade-offs are worse error messages or uncomfortable edge-case behaviours.
I am not so familiar with Clojure, but I am familiar with Scheme. The point is not that a good programmer stops using macros completely, but that a good programmer knows when to reach for a macro to make something syntactically nicer. Here are a few examples I've done:
pipeline/threading macro: It needs to be a macro in order to avoid having to write (lambda ...) all the time.
inventing new define forms: To define things at the module or top level without needing set! or similar. I used this to make a (define-route ...) for communicating with the Docker engine. define-route defines a procedure whose name depends on which route it handles.
writing a timing macro: This makes for the cleanest syntax, containing nothing but (time expr1 expr2 expr3 ...).
Perhaps the last example is the least necessary.
Many things can be solved by using higher-order functions instead.
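The first example above, a thread-first pipeline, can be sketched in a few lines of syntax-rules. This is a simplified, hypothetical version, not Clojure's full -> (it handles only the thread-into-first-argument case):

```racket
#lang racket

;; A minimal thread-first macro, roughly like Clojure's ->.
;; (~> x f (g a)) rewrites to (g (f x) a), so the call site
;; needs no (lambda ...) wrappers - the reason this must be
;; a macro rather than a higher-order function.
(define-syntax ~>
  (syntax-rules ()
    [(_ x) x]                                    ; nothing left to thread
    [(_ x (f arg ...) rest ...)                  ; parenthesized step
     (~> (f x arg ...) rest ...)]
    [(_ x f rest ...)                            ; bare-identifier step
     (~> (f x) rest ...)]))

(~> 5 add1 (* 2) number->string)  ; => "12"
```

A function version would need the caller to write (lambda (x) (* x 2)) at every step, which is exactly the noise the macro exists to remove.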
My canonical use-case for macros in Scheme is writing unit tests. If you want to see the unevaluated expression that caused the test failure, you'll need a macro.
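A minimal sketch of that idea (the name check is hypothetical; real test frameworks like rackunit do the same thing with more machinery): the template quotes the pattern variable, so the failure report contains the source expression itself, which a plain function could never see since it only receives the evaluated value.

```racket
#lang racket

;; On failure, report the *unevaluated* expression: 'expr in the
;; template substitutes the source form under quote.
(define-syntax-rule (check expr)
  (unless expr
    (printf "FAILED: ~s\n" 'expr)))

(check (= (+ 1 1) 2))   ; passes silently
(check (= (+ 1 1) 3))   ; prints: FAILED: (= (+ 1 1) 3)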
Python's pytest framework achieves this without macros. As I understand it, it disassembles the test function bytecode and inspects the AST nodes that have assertions in them.
Inspecting the AST ... that kind of sounds like what macros do. It's just that pytest is probably forced to do it in far less elegant ways, because Python has no macro system. I mean, if it has to disassemble things, then it has basically already lost, considering how much Python's introspection is sometimes lauded.
1) OMG confusing -> almost never use them
2) OMG exciting -> use them faaar too much
3) Huh. Powerful, but so powerful you should use only when required. Cool.
Quite a few things turn out to be like that skill-progression-wise and at this point I've just accepted it as human nature.
My plan there was to write ferocity0 in Java, able to stub out the standard library, and then write ferocity1 in ferocity0 (and maybe a ferocity2 in ferocity1), so that I don't have to write repetitive code to stub out all the operators, the eight primitive data types, etc. I worked out a way to write Java code in a form that looks like S-expressions,
which bulks up the code even more than ordinary Java. So to make up for it I want to use "macros" aggressively, which might get the code size reasonable, but I'm sure people would struggle to understand it.
I switched to other projects, but yesterday I was working on some Java that had a lot of boilerplate and was thinking it ought to be possible to do something with compile-time annotations along the lines of
> I want to use "macros" aggressively which might get the code size reasonable but I'm sure people would struggle to understand it.
A long time ago I wrote a macro language for the environment I was working in and had grand plans to simplify the dev process quite a bit.
It did work from one perspective: I was able to generate code much, much faster for the use cases I was targeting. But the downside was that it was too abstract to follow easily.
It required simulating the macro system in your head while reading the code, to figure out whether it was generating an apple or an orange. I realized there is a limit to the usability of extremely generalized abstraction.
EDIT: I just remembered my additional thought back then: a person who was really an expert in the macro language and the macros we created could support and maintain the macro-based code-generation system.
So the dev wouldn't be expected to maintain the underlying macro system; they would just use the system's templates and macros to generate code, which would give them significant power and speed. But it's also then a one-off language that nobody else knows and that the dev can't transfer to their next job.
There's also the use of macros for internal stuff, for instance automatic API generation. Obviously this is a thing everywhere, but with macros you can implement all of it inside the library itself rather than with a secondary tool.