The simplest examples have over a thousand (literally) dependencies. Among them are GTK, GDK, Pango, etc. It depends on an entire other toolkit, which is the weirdest thing IMHO.
Because of GNOME's insistence on not implementing Server Side Decorations, you can't avoid depending on libadwaita. This is, I imagine, what pulls in all of the GTK dependencies.
You can very much draw a border on a window and a "close" button without any libraries.
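To make that concrete: client-side decorations are, at bottom, just pixels the application writes into its own buffer before handing it to the compositor. A minimal sketch (the buffer layout and function names here are illustrative, not any real toolkit's API):

```python
# Minimal sketch: draw a 1px border and a crude close "X" directly into a
# flat RGB framebuffer, no toolkit involved. Dimensions are made up.

WIDTH, HEIGHT, TITLEBAR_H = 80, 60, 10

def draw_decorations(buf, w, h, bar_h):
    """Write window decorations into a bytearray of w*h RGB pixels."""
    def put(x, y, color):
        i = (y * w + x) * 3
        buf[i:i + 3] = color

    border = bytes((0, 0, 0))
    for x in range(w):                     # top and bottom edges
        put(x, 0, border)
        put(x, h - 1, border)
    for y in range(h):                     # left and right edges
        put(0, y, border)
        put(w - 1, y, border)
    for i in range(bar_h - 4):             # close button: an X, top-right
        put(w - 3 - i, 2 + i, border)
        put(w - bar_h + 1 + i, 2 + i, border)

framebuffer = bytearray(b"\xff" * (WIDTH * HEIGHT * 3))  # white window
draw_decorations(framebuffer, WIDTH, HEIGHT, TITLEBAR_H)
```

The real work in a Wayland client is the protocol plumbing around the buffer, not the drawing itself, which is the point being made here.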
Usually I'd understand if you're lazy, can't be bothered, and just pull in some dependency to do it for you, but if you're implementing a toolkit, this is the kind of thing it SHOULD provide.
I think this is pretty common on Linux. You would want GTK (or Qt), I would think, to draw the top-level window and perhaps system menus, etc., even though the UI itself is drawn using a GPU canvas.
> You would want GTK (or Qt), I would think, to draw the top-level window and perhaps system menus, etc., even though the UI itself is drawn using a GPU canvas.
No, you would want to draw for Wayland or X directly. GTK and Qt themselves don't burden themselves with importing each other to work, for example.
My guess is that they import GTK only to get a title bar on GNOME, as GNOME forces applications to render their own decorations. They could go custom and cut the dependency, but it never looks quite right when apps do that.
> They could go custom and cut the dependency, but it never looks quite right when apps do that.
This is literally what the GNOME devs advocate: that each application draw its own borders and titles. You might consider that it doesn't look quite right, but that's the design choice they're going with.
No. On Wayland all of that should be in the compositor. Window sizing and positioning cannot be done by the apps, so it makes sense that the controls for them are drawn and handled by the WM. But GNOME's gotta gnome...
A browser runs complex untrusted code from the internet. Most desktop programs don't do anything like that. The Servo programmers were riding a motorbike. Using Rust for a desktop program would be more like wearing a crash helmet in a car.
No advantage to it. Worse-quality code to gain what? A smaller number hiding ultimately the same amount of code? Also, since the unit of compilation is a crate, there are fewer opportunities for concurrent compilation.
A multitude of tiny dependencies has a multitude of solo maintainers, who eventually walk away, or sometimes get compromised.
A few big dependencies each have a team and a reputation that has earned trust and established release process and standards. If there's a serious problem in a small part of a big dependency, there are a few trusted maintainers of the big dependency who can be reached and can resolve it.
The theory of small dependencies seemed good 10 years ago, when JS devs using NPM started the trend of making them "as small as possible". But the emergent pattern really seems to be the opposite of what was claimed: these JS and Rust projects end up taking longer to build and producing bigger outputs. Instead of a couple of "huge" 200KB dependencies, you end up with _thousands_ of 1KB dependencies, including different versions and alternative implementations, and megabytes of "accidental" code you don't really need.
And we can reason about why. In an ecosystem where something has 1 to 3 large deps, a dependency sometimes pulls in a sub-dependency with code you don't need. But in an ecosystem where something has 10 to 100 deps, this still happens, just 50x more overall. It's an exponential trend: compare 3 big deps that each have 2 big deps that each have 1 big dep, with 20 small deps that each have 15 small deps that each have 10 small deps.
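A quick back-of-the-envelope check of those two trees (the fan-out numbers 3/2/1 and 20/15/10 come straight from the comment; the helper is just illustrative):

```python
# Count the total nodes in a dependency tree where every package at
# level k pulls in `fanouts[k]` further packages.

def total_deps(fanouts):
    """Total transitive dependencies for the given per-level fan-out."""
    total, width = 0, 1
    for f in fanouts:
        width *= f      # packages at this depth
        total += width  # running total of all packages so far
    return total

big = total_deps([3, 2, 1])       # 3 + 6 + 6 = 15 dependencies
small = total_deps([20, 15, 10])  # 20 + 300 + 3000 = 3320 dependencies
```

So the "small deps" tree ends up over 200x larger by package count, even though each level's fan-out is only modestly bigger, which is the multiplicative effect the comment is pointing at.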