This seems to be somewhat of a Python side effect; the same goes for almost any Python project thrown together by people who haven't spent 10% of their life fighting dependency management in Python.
But I agree that uv is the best way. I'm not a "real" Python programmer, similar boat to the parent: I just end up running a bunch of Python projects for various ML things, and also create some smaller projects myself. I've tried conda, micromamba, uv, and a bunch of stuff in between; most of them break at one point or another, while uv gives me the two most important things in one neat package: flexible Python versions per project, and easy management of venvs.
So for people who haven't given it a try yet, do! It makes using Python a lot easier when it comes to dependencies. These are the commands I tend to use according to my history; maybe it's useful as a sort of quickstart. I started using uv maybe 6 months ago, and this is a summary of literally everything I've used it for so far.
# create new venv in working directory with pip + specific python version
uv venv --seed --python=3.10
# activate the venv
source .venv/bin/activate
# on-the-fly install pip dependencies
uv pip install transformers
# write currently installed deps to file
uv pip freeze > requirements.txt
# Later...
# install deps from file
uv pip install -r requirements.txt
# run an arbitrary file with the venv on the path, etc.
uv run my_app.py
# install a "tool" (like global CLIs) with a specific python version, and optional dependency version
uv tool install --force --python python3.12 aider-chat@latest
There's been a movement away from requirements.txt towards pyproject.toml. And commands like "uv add" and "uv sync" take most of the pain out of initializing and maintaining those dependencies.
Thanks! As mentioned, I'm not really a Python programmer, so I don't follow the trends...
I tried to figure out why anyone would use pyproject.toml over requirements.txt (granted, just for installing typical dependencies) and didn't come up with any good answer. Personally I haven't had any issues with requirements.txt, so I'm not sure what pyproject.toml would solve. I guess I'll change when/if I hit some bump in the road.
1. You can differentiate between dependency groups: build dependencies, dev dependencies, test dependencies, and regular dependencies. So if someone uses a dependency only for development, previously you either had to install it manually or your requirements.txt installed it for you even though you didn't need it. (See the sketch after this list.)
2. It adds a common format for project metadata (name, version, description, etc.) that other tools can use.
3. It adds a place where tool settings, like those of a linter or formatter, can be stored (e.g. ruff and black).
4. Its format is standardized and lets it integrate with multiple build tools; TOML is a bit more standardized than whatever custom file syntax Python used before.
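A rough sketch of a pyproject.toml showing those pieces; the project name, package names, and version numbers here are made up purely for illustration:
# regular runtime dependencies and project metadata
[project]
name = "my-app"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = ["requests>=2.31"]
# build backend and build dependencies
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
# dev/test-only dependencies (PEP 735 dependency groups; uv can manage these via `uv add --dev`)
[dependency-groups]
dev = ["pytest>=8.0", "ruff>=0.4"]
# tool settings live here too, e.g. for the ruff linter/formatter
[tool.ruff]
line-length = 100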
virtualenvs are great, but they're not great on their own. requirements.txt works, sorta, but any package with more than 50 requirements requires a non-trivial amount of manual labor to maintain. (Hell, even 20 deps are a pain.)
Astral uv and poetry both maintain the pyproject.toml for you -- and as a bonus, they maintain the virtualenv underneath.
Then for the complete python newbs, they can run 'uv sync' or 'poetry install' and they don't have to understand what a virtualenv is -- and they don't need root, and they don't have to worry about conflicts, or which virtualenv is which, etc.
So the simple case:
mkdir test
cd test
# init a new project with python 3.13
uv init -p 3.13
# Add project deps
uv add numpy
uv add ...
# Delete the venv
rm -rf .venv
# reinstall everything (with the exact versions)
uv sync
# Install a test package in your venv
uv pip install poetry
# force the virtualenv back into a sane state (removing poetry and all its deps)
uv sync
# update all deps
rm uv.lock
uv lock
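# (newer uv versions can also do this in one step with `uv lock --upgrade`, without deleting the file)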
Now cat your pyproject.toml, and you'll see something like this:
[project]
name = "test"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
"numpy>=2.2.5",
"pillow>=11.2.1",
"scipy>=1.15.2",
]
I use nix-shell when possible to specify my entire dev environment (including gnumake, gcc, down to utils like jq).
It often doesn't play well with venvs and CUDA, which I get. I've succeeded in locking a CUDA env with a nix flake exactly once, then it broke, and I gave up and went back to venv.
Over the years I've used pip, pyenv, pipenv, poetry, conda, mamba, you name it. There are always weird edge cases, especially with publication code that ships some combination of a requirements.txt, a pyproject.toml, a conda env, or nothing at all. There are always bizarro edge cases that make you forget whether you're using Python or Node /snark
I'll be happy to use the final tool to rule them all, but that's how they were all branded (even Nix; and I know poetry2nix is not the way)
I generally use nix-shell whenever I can too, only resorting to `uv` for projects where I can't expect others to necessarily understand Nix well enough to handle the nix-shell stuff, even if it's trivial for me.
AFAIK, it works as well with CUDA as any other similar tool. I personally haven't had any issues; most recently, last week I was working on a transformer model for categorizing video files, and it's all managed with uv, with PyTorch installed into the venv as normal.
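For what it's worth, installing a CUDA build of PyTorch into a uv-managed venv looks just like it does with plain pip; the cu121 index below is only an example, pick whichever matches your CUDA/driver setup:
# install a CUDA build of PyTorch from the official wheel index (cu121 is just an example)
uv pip install torch torchvision --index-url https://download.pytorch.org/whl/cu121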
I'm assuming most people run untrusted stuff like 3rd party libraries in some sort of isolated environment, unless they're begging to be hacked. Some basic security understanding has to be assumed, otherwise we have a long list to go through :)
https://docs.astral.sh/uv/
Sadly, it appears that people in the LLM space aren't really all that good at packaging their software (maybe on purpose).