I think PyPI should require larger packages, like TensorFlow, to self-host their releases.
All the support for this already exists: the PyPI index file contains an arbitrary URL for the data file plus a SHA-256 hash. Let PyPI keep storing the hashes, so there are no shenanigans with versions being silently overridden, but point the actual data URLs at other servers.
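The hash is what makes this safe even with third-party hosting. A minimal sketch of what a client does with such a link (Python stdlib only; the URL and package name below are made up for illustration):

    import hashlib
    import urllib.request
    from urllib.parse import urlparse

    def fetch_and_verify(link: str) -> bytes:
        """Download a file from a PEP 503-style link and check it
        against the sha256 digest pinned in the URL fragment."""
        parsed = urlparse(link)
        # The fragment looks like "sha256=<hex digest>".
        expected = parsed.fragment.split("=", 1)[1]
        data = urllib.request.urlopen(parsed._replace(fragment="").geturl()).read()
        actual = hashlib.sha256(data).hexdigest()
        if actual != expected:
            raise ValueError(f"hash mismatch: index says {expected}, got {actual}")
        return data

    # Hypothetical externally hosted wheel, pinned by the index:
    # fetch_and_verify("https://downloads.example.com/pkg-1.0-py3-none-any.whl#sha256=...")

So the index stays the source of truth: the external host can only serve the exact bytes it promised, or the install fails.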
(There obviously has to be a balance between availability and PyPI's costs, so maybe PyPI hosts only smaller files, and larger files must be self-hosted? Or PyPI hosts "major releases" while pre-releases are self-hosted? And there would have to be manual exceptions for "projects with funding from huge corporations" on one end and "super popular projects from solo developers" on the other...)