I'm a big fan of the "backwards" build system Make. It's very concise, portable enough, and not language-specific. I use it on literally 100% of my projects to hold small scripts. Example: "set up the Python virtualenv", "do a fast feedback loop with lint", or "do a slower, high-quality feedback loop with Pylint and the full test suite".
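Something like this minimal sketch (the target names, requirements.txt, and the exact lint/test commands are just illustrative, not anyone's real setup):

    # Illustrative task-runner Makefile; recipe lines must start with a tab.
    .PHONY: venv lint check

    VENV := .venv

    # Rebuild the virtualenv only when requirements.txt changes.
    $(VENV)/bin/activate: requirements.txt
    	python3 -m venv $(VENV)
    	$(VENV)/bin/pip install -r requirements.txt

    venv: $(VENV)/bin/activate

    # Fast feedback loop.
    lint: venv
    	$(VENV)/bin/flake8 src/

    # Slower, higher-quality feedback loop.
    check: venv
    	$(VENV)/bin/pylint src/
    	$(VENV)/bin/pytest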
I'm thrilled people are developing other options, building a project "forwards".
FYI "strace" on Linux makes it doable to trace build dependencies. Alas it doesn't work on macOS, and I haven't found an easy equivalent.
I agree. While I like the idea of tup (https://gittup.org/tup/ -- the first "forward" build system I remember hearing of), writing a makefile is easy enough that thinking about the problem upside-down doesn't offer a compelling reason to switch.
I use Make the exact same way. Basically, it's a very concise way to express "here are some tasks with dependencies on each other. Abort if any of them fail" and sometimes "some of these tasks can be skipped if the inputs haven't changed". You also get free tab-completion on 99% of desktop Linux systems.
If I'm working on a complex software project, I'll probably use an appropriate build system. Maybe Bazel, CMake, sometimes even Autotools.
But I'll probably also keep a Makefile somewhere, and use it to automate the process of running that shiny build system.
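Something in this spirit, say for a CMake project (directory layout and flags are made up, and recipe lines need real tabs):

    .PHONY: all configure build test clean

    all: build

    configure:
    	cmake -S . -B build -DCMAKE_BUILD_TYPE=RelWithDebInfo

    build: configure
    	cmake --build build -j

    test: build
    	ctest --test-dir build --output-on-failure

    clean:
    	rm -rf build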
I've been using a simple ad-hoc unity-build build.bat/build.sh "system" for years now, and it works wonders (a rough sketch is at the end of this comment).
YAGNI will serve you well here: 99.99999(repeating)% of everyone's code will only ever be built and run on one, maybe two, platforms, so why bother with these insane monstrosities that we call "build systems"?
The few times I've needed to build for a new platform, I just wrote the build script then and there; it took a few minutes and that was it.
Modern machines can churn through tens if not hundreds of thousands of lines of C code in less than a second, so incremental builds aren't needed either (and if anything, with too many translation units, linking becomes the bottleneck).
Single TU benefits:
- Global optimizations "for free".
- Make all functions static (except main) and you get --gc-sections "for free".
- Linking is blazingly fast.
- Don't have to bother with header files.
- No one has to download anything to build my code; I make it work on a default MSVC/GCC/Clang install (i.e. if you have gcc, cl, or clang in your PATH when running build.bat/build.sh, it will build).
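Here's the promised sketch of the build.sh side; file names and flags are illustrative, and the MSVC (cl) case lives in build.bat instead:

    #!/bin/sh
    # Ad-hoc unity build: one translation unit, whichever compiler is on PATH.
    set -e

    SRC=main.c
    OUT=app

    if command -v clang >/dev/null 2>&1; then
        CC=clang
    elif command -v gcc >/dev/null 2>&1; then
        CC=gcc
    else
        echo "error: no clang or gcc found in PATH" >&2
        exit 1
    fi

    "$CC" -O2 -Wall -Wextra -o "$OUT" "$SRC"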
I split my project into two TUs:
* one TU with code that is fast to compile and modified often
* another TU with code that is slow to compile but modified rarely
I still use header files, but the fast-to-compile and modified-often code goes directly into headers, so I can still organize my code into separate files.
I got sick of juggling code that migrated from one category to the other, so I wrote a little script that deals with chopping up a large source file into multiple TUs before feeding them to the compiler.
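Roughly in this spirit (a toy sketch, not the actual script; the "// TU-SPLIT" marker and file names are invented, and each chunk still has to include whatever declarations it needs):

    #!/bin/sh
    # Split big.c into part_0.c, part_1.c, ... at "// TU-SPLIT" markers,
    # compile each part separately, then link the pieces back together.
    set -e
    awk '/\/\/ TU-SPLIT/ { n++; next } { print > ("part_" n+0 ".c") }' big.c
    for f in part_*.c; do
        cc -c "$f"
    done
    cc -o app part_*.o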
The problem with Bazel here is that every project, no matter how small, ends up needing a beefy WORKSPACE and .bazelrc, especially when you start bringing in third-party dependencies.
How do we streamline this? bzlmod and the BCR (Bazel Central Registry) are steps in this direction, but it still seems a long way off to me.
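For comparison, the bzlmod version of "small project with a couple of dependencies" is already fairly compact; a minimal sketch (module name and versions are made up):

    # MODULE.bazel -- replaces most of the old WORKSPACE boilerplate
    module(name = "my_project", version = "0.1.0")

    # Dependencies are resolved from the Bazel Central Registry (BCR).
    bazel_dep(name = "rules_cc", version = "0.0.9")
    bazel_dep(name = "googletest", version = "1.14.0")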
Yes, that declarative style makes more sense to me than a forward statement of build steps.
I developed my own tool for building C/C++ projects which follows this style. It automatically handles things like Qt moc and scans the source files for headers, so in your example even the hdrs line can be omitted.
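For context, the kind of declarative target being talked about looks roughly like this generic Bazel-style rule (not the parent's exact example); with automatic header scanning, the hdrs line is the part that can be inferred:

    cc_library(
        name = "foo",
        srcs = ["foo.c"],
        hdrs = ["foo.h"],   # this is what header scanning lets you drop
        deps = [":util"],
    )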
After all these years in the industry, ChatGPT has enabled me to be productive in GNU Make. Both are such wonderful tools.
I really like GNU Make now. I know it can solve the problems I have, and ChatGPT helped me get there. When dependencies are set up properly, it's quite amazing to see it work.