
> And when security issues are fixed, updating my system fixes them for all applications instead of just the ones which update their flatpaks (and from the article, it seems like many don't).

That is the ideal; the reality is somewhat different. It takes constant effort to ensure this. See https://wiki.debian.org/EmbeddedCopies for details.




Not sure why you'd cite that wiki page, as it describes how Debian maintainers should be removing embedded copies of dependencies and (if necessary) patching the software to use the system versions.

Sure, it's not perfect, and some packages end up having embedded dependency copies, but that's a far cry from flatpak, where everything is an embedded copy, by design.


Flatpak has that same problem?


I guess with Flatpak it is the responsibility of the producer of the package.

With apt/yum/etc it is the responsibility of the distro maintainers to package it.


And which one of them has a better track record pushing out security fixes, especially for older software?

Containers are a security nightmare because developers tend to have a release-and-forget mindset. Packagers are the ones doing the thankless work of constantly tracking issues, backporting fixes, dealing with dependency issues, etc. Take the packagers out of the equation and the result will be entirely predictable--much more stale software sitting on the hard drives of oblivious users, just as we see with containers.


> backporting fixes

IMHO this should be thankless. If you want updated software then update it.

By backporting fixes, you create an untested and unsupportable version. They pull patches that may or may not have dependencies on other changes to the software, which they avoid because they don't want to update, and generate a whole bunch of noise for application developers.


No software exists in a vacuum. It's the job of the packagers to create a platform that works as a whole. That's why distributions exist in the first place.

If the upstream releases a new version and the packagers just throw it into the distro, they've created an untested and unsupportable system.


> No software exists in a vacuum

This is true. So stop trying to pull apart tens or hundreds of other people's work hours on some attempt to create some frankenware.

The rest of what you wrote doesn't make sense. Of course it would get tested whether they backport or not. If it doesn't, then they aren't testing what they are doing now.

The current state is that we don't get much cross pollination as the version being pushed by RedHat isn't the same as the version being pushed by Debian, so bugs are introduced and resolution doesn't help each other.

Your distro doesn't live in a vacuum. Stop acting like it does.


I'm not quite sure what's rubbed you up the wrong way here, but I'll address these points in order:

> This is true. So stop trying to pull apart tens or hundreds of other people's work hours on some attempt to create some frankenware.

I'm not involved in this, so I'm not entirely sure what you want me personally to stop doing, but I definitely appreciate the efforts of distributions to create systems that work together as a whole, and don't spontaneously fail because an upstream author has put out a new release and is now forcing me to choose between carrying a vulnerability and breaking compatibility.

> The rest of what you wrote doesn't make sense.

I beg to differ. Perhaps I should simplify my language.

> Of course it would get tested whether they backport or not.

By whom? There's a reason Debian Stable has a painfully long release cycle, and it's partially because they're taking that time to make sure the complete set of versions you get when you do an `apt-get install <whatever>` works together, as a coherent system. If you change `libfoo-1.0.0` to `libfoo-2.0.0` and just push that out to everyone, the combination of `libfoo` with any of its dependencies or dependents is now untested.
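To make the backporting point concrete: stable distros typically ship a security fix by bumping the package revision while the upstream version stays put, so the fixed package still slots into the tested set. A toy illustration (all version strings here are made up; `+deb10u1` mimics Debian's stable-update suffix convention):

```shell
# Upstream stays at 1.0.0; the distro revision carries the backported fix.
# sort -V orders the patched revision between the original and the next
# upstream release, which is exactly how apt sees it.
printf '%s\n' 2.0.0-1 1.0.0-1+deb10u1 1.0.0-1 | sort -V
```

So users get the fix without the distro having to revalidate every dependent against a new upstream release.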

You can make a separate argument for rolling release distros if you like, but the problem doesn't go away.

Again, this is why the concept of "distribution" came into being in the first place: because throwing together arbitrary versions and expecting the result to work is madness. Or, if you prefer, Debian Unstable.

> The current state is that we don't get much cross pollination as the version being pushed by RedHat isn't the same as the version being pushed by Debian, so bugs are introduced and resolution doesn't help each other.

Correct. And if upstream authors took responsibility for their previous releases and backported patches themselves, they wouldn't have divergence in the first place. But that's not necessarily workable, so we've got the next best thing: other people keeping the lights on for them.

> Your distro doesn't live in a vacuum. Stop acting like it does.

I'm not claiming anything of the sort.


> And which one of them has a better track record pushing out security fixes, especially for older software?

Given the number of security issues that are fixed silently, sometimes even inadvertently, in newer releases of any software, I really believe that doing this instead of forcing people to update just creates a less secure world.


Which distribution provides security updates for all their packages for, let's say, three or five years? Debian Stable has exceptions, as does Ubuntu, which also makes a huge distinction for the thousands of packages in the Universe repository, most of which don't get any support at all. Fedora releases are AFAIK only supported for a year or two.

And then you also get the issue of bugs that are actually security issues, but weren't labeled/identified as such, and therefore never get fixed in those stable distributions.


RHEL and CentOS have 5 years of updates for their core packages.


According to their git repository[1] the last time they updated the WebKitGTK library was half a year ago. In the meantime there have been multiple upstream releases, fixing multiple security vulnerabilities[2-6]. Or does this git mirror not reflect the current state of the version they're shipping?

[1] https://git.centos.org/rpms/webkit2gtk3/commits/c8 [2] https://webkitgtk.org/security/WSA-2020-0001.html [3] https://webkitgtk.org/security/WSA-2020-0002.html [4] https://webkitgtk.org/security/WSA-2020-0003.html [5] https://webkitgtk.org/security/WSA-2020-0004.html [6] https://webkitgtk.org/security/WSA-2020-0005.html


Looks like that package is part of the AppStream collection and therefore does not have the same guarantees as the core packages. That's at least what some quick googling told me.

RHEL and CentOS have pretty good backporting support for packages that they support, but most installs of them that I have seen use/include packages from other collections that are not supported, which is of course the wrong way to do it.


> And which one of them has a better track record pushing out security fixes, especially for older software?

Why the question? I was only trying to state what I think is fact, not sell one approach or another.


Flatpak splits a package into three parts: the program itself, a runtime, and an SDK. The runtime includes the common libraries a program usually requires, and the SDK includes their development counterparts.

A Flatpak program is pinned to a runtime version, e.g. org.freedesktop.Platform//18.08 (with 18.08 being the major version), which receives security updates periodically. When a user installs a package, Flatpak installs the program and its runtime into the Flatpak directory.

If a program only depends on libraries in the runtime (and doesn't require any bundled dependencies), then the packager won't need to worry about upgrading those libraries at all, as updating the runtime will update those dependencies.
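The app/runtime split described above shows up directly in the flatpak-builder manifest. A minimal sketch (org.example.App and its module are made-up names for illustration):

```json
{
  "app-id": "org.example.App",
  "runtime": "org.freedesktop.Platform",
  "runtime-version": "18.08",
  "sdk": "org.freedesktop.Sdk",
  "command": "example-app",
  "modules": [
    {
      "name": "example-app",
      "buildsystem": "meson",
      "sources": [{ "type": "dir", "path": "." }]
    }
  ]
}
```

Anything the app needs beyond what the pinned runtime provides gets built and bundled as a module, which is exactly the embedded-copy situation being debated upthread.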



