It kind of is for individuals. They don't bother punishing you too hard for pirating it anymore. I can't change my wallpaper unless I go legit? Oooo, I'm so scared!
Microsoft remembers Bill Gates's mantra that piracy at the individual level actually gets you an audience who will then sign on the dotted line once the big-money deals -- enterprise license contracts -- need to be made.
We have a giant Go monorepo at work. Bazel build. The ex-Google guy who set it up left, and no one around wants to put in the time to learn it. It feels like a general increase in complexity: it replaces the go mod workflow, so we have something called Gazelle to generate the build files, plus third-party IDE plug-ins. The plug-in for IntelliJ is janky. To top it all off, we somehow managed to end up with a build that is slow locally and slow on CI.
We had the same setup: an ex-Googler SRE demanded builds be done with Bazel for our Go monorepo. He convinced someone to switch it all over, and all builds, deploys, CI, testing, etc. became dependent on Bazel.
That person left, and the Bazel setup went unmaintained because no one had the interest or the time to learn it.
Later the Bazel ecosystem killed off rules_docker and replaced it with rules_oci, which meant we could no longer update our Go version without a painful migration, one where we ended up breaking production several times because of its quirks.
Eventually we invested the time to rip the whole thing out and replace it with standard Go tooling plus a multistage Docker build for the container image. Everything is easier now: running tests, using the normal Go tooling, understanding builds, CI, and deploys.
The best thing we did was remove it and move to standard Golang tooling.
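For anyone curious, a minimal sketch of that kind of replacement, assuming a single main package under ./cmd/app and a distroless runtime image (both assumptions for illustration, not the actual repo layout):

    # build stage: compile with the standard Go toolchain
    FROM golang:1.22 AS build
    WORKDIR /src
    COPY go.mod go.sum ./
    RUN go mod download
    COPY . .
    RUN CGO_ENABLED=0 go build -o /out/app ./cmd/app

    # runtime stage: ship only the static binary
    FROM gcr.io/distroless/static-debian12
    COPY --from=build /out/app /app
    ENTRYPOINT ["/app"]

Outside the container, plain go build and go test ./... keep working unchanged, which is a big part of what makes the standard tooling easier to live with.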
Personally, most cases I have seen are caused by systemic under-investment in developer tooling and infrastructure at the company. Management layers often don't understand that a software development assembly line is not composed of just workers (software engineers), but also of the tools and machines that enable faster workflows. This often results in critical processes in the pipeline from code to prod being maintained by one person: monitoring, alerting, deployments, builds, git, etc. And when that one person leaves, the system fails and the company suffers.
I think successful Bazel adoption in an org is often a signal that the company has grown in size and values its developers' time and happiness. Failure to adopt often means a lack of investment in dev experience in general.
I worked at a huge big-tech company with effectively infinite resources. They call the monorepo built on Bazel and Gazelle a success there, but I personally found it the worst dev experience ever... Everything was so slow, the IDE didn't work properly, and there was no easy debugging, IntelliSense, refactoring, or code generation... Now, whenever an enterprise reaches out to me on LinkedIn, I ask whether they use a monorepo, and if they do, I just answer that I am not interested.
I agree in principle, although I’m not sure Bazel was even a value-add in our scenario when it was maintained. It just meant our Go developers had to use a bunch of obtuse tooling like Gazelle instead of the normal Go tooling they were already familiar with. It meant a bunch of smart people ended up blocked and constantly asking other devs questions in Slack channels when they otherwise wouldn’t have been. Some teams even avoided joining our monorepo and built out their own CI, builds, etc. in another repo solely because we were using Bazel in the monorepo.
The reality is, in a post-layoffs world, practically every company is scrimping on these sorts of “back office” things. If people couldn’t even get these complex build systems to work in the pre-layoffs low-interest-rate tech world where labour was more abundant, how can they get it done and maintained now?
The builds were much faster afterwards, but Go builds are always fast anyway. Maybe in other languages that are slower to build, the Bazel cache is more beneficial.
This amazes me, since I switched to Go because its builds are so insanely fast and the module system is quite clean. Out of all the languages, Go probably needs it the least!
I did use Bazel long ago for C++ and it was quite good at it, but we didn’t have very many dependencies.
There are plenty of native languages that compile as fast as Go; unfortunately, the scripting-language revolution and too much focus on C and C++ tooling kind of messed it all up.
Rob Pike even has a talk about how fast ALGOL compilers used to be; naturally one has to account for the extra overhead of reading punch cards, but they were still quite fast for early-1960s hardware.
Or Object Pascal and Modula-2 compilers in the early 1990s.
Eventually we (as an industry) started focusing on the wrong points.
I have warm fuzzies for way back when I coded in Turbo Pascal. Even on the 8086 I had at the time, compilation was incredibly fast, even on midsized projects. Much faster than my day-to-day work with Go on an M3 Max.
I had to install it last night in DOSBox-X and compile the Breakout example project. The warm fuzzies are justified.
Yes. Example off the top of my head: on a recent project of mine (only 50 kloc), go build took 30+ seconds to run when you changed one line, even fairly high up in the dependency graph. When you have grpc/proto or large cgo dependencies and a team of people, you need to either dockerize all dev workflows (slow and complex) or write massive bash/makefile abominations (fragile and complex).
It sucks that we’re still in the “best practices” phase. We’ve been in this phase for the last three decades [1], and I really hope we enter the “good theory” phase soon.
Give ‘em a shout out by saying “hi” in a request parameter or something while you’re reverse engineering.