generate a second requirements file after having done `pip install -r requirements.txt` to get the exact versions that were installed
So you'd need to have a `requirements.txt` with loose versions suitable for upgrading your app's deps, run `pip install -r requirements.txt`, and then `pip freeze > requirements.locked.txt`. Then everyone should be installing with `pip install -r requirements.locked.txt`, including during your build. But that's cumbersome and error prone, and it doesn't protect you from ending up with the wrong version of a dep if you `pip install some-package` later on.
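Concretely, the workflow looks something like this (the `.locked.txt` name is just the convention from this thread, nothing pip enforces):

```sh
# requirements.txt holds loose, top-level deps, e.g. "django>=3.2,<4"
pip install -r requirements.txt

# Snapshot exactly what got resolved, transitive deps included
pip freeze > requirements.locked.txt

# Everyone else, and CI, installs from the lock file
pip install -r requirements.locked.txt
```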
I'm not sure why you'd need two requirements files. You'd normally create a virtualenv, `pip install` what you need, then lock the versions with `pip freeze > requirements.txt`. You don't need an initial requirements.txt to install new packages.
This way you can run `pip install -r requirements.txt` when you want to update your dependencies, then lock the resolved versions in `requirements.locked.txt` so that you get deterministic builds when the code runs in production environments where reproducibility and reliability matter. It also gives you a clearer idea of which are top-level dependencies and which are transitive, because the transitive ones will only be listed in `requirements.locked.txt`. However, this system has limitations and isn't standardized. If you want different groups, say development, production, and testing, you end up with a separate pair of files for each group.
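Side by side, the two files might look like this (package names and versions made up for illustration):

```
# requirements.txt — top-level deps only, loosely pinned
django>=3.2,<4
requests

# requirements.locked.txt — pip freeze output, everything pinned exactly
django==3.2.12
requests==2.27.1
certifi==2021.10.8   # transitive, pulled in by requests
sqlparse==0.4.2      # transitive, pulled in by django
```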
And even if you can tell which are your transitive dependencies by diffing the .locked.txt against the .txt, it doesn't tell you why a given transitive dependency is in your locked set, e.g. you don't know which of your top-level dependencies is pulling it in.
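For what it's worth, a third-party tool like pipdeptree (not part of pip itself, so this is a suggestion rather than anything the two-file scheme gives you) can answer that question from the installed environment:

```sh
pip install pipdeptree
pipdeptree                                # full tree: top-level packages with deps nested
pipdeptree --reverse --packages certifi   # which installed packages depend on certifi?
```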
One common reason is avoiding hard pins on the versions of your transitive dependencies. In my current Django project I have 19 declared dependencies and 26 transitive ones. We have one file for the declared ones and another we generate with pip freeze. This way the transitive dependencies can evolve on their own without us having to keep track of them.
Pipfile looks like a definite improvement over the `pip install` + `pip freeze` workflow.
You're misunderstanding. The whole point is that nothing slips in, but at the same time you don't have to force a specific version of something to achieve that. The killer feature of Bundler for long-term maintenance is the ability to upgrade a single requirement in a minimal fashion.
So you start with a Gemfile that lists your minimum requirements with no versions specified; the first time you `bundle install`, it generates a Gemfile.lock, which is then sticky. Over time your requirements are completely frozen until you decide to update, which you can do piecemeal via `bundle update gem1 gem2 etc...`. If you have a reason to avoid a newer library, put a soft restriction in the Gemfile, preferably with a comment as to why that restriction is there, and you have a very powerful long-term system for managing versions over time.
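A sketch of that flow (the gem names and the reason for the pin are hypothetical):

```sh
# Gemfile: minimum requirements, mostly unversioned
cat > Gemfile <<'EOF'
source "https://rubygems.org"
gem "rails"
gem "nokogiri", "< 1.11"  # soft restriction: 1.11 changed behavior we rely on
EOF

bundle install          # first run resolves everything and writes Gemfile.lock; commit it
bundle update nokogiri  # later: upgrade just this one gem, minimally
```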
Just freezing and forgetting is a recipe for disaster when you have to update months or years later and the transitive dependency updates are overwhelming and conflicting. Similarly, specifying exact versions everywhere makes upgrades fiddly and makes it hard to tell whether there were reasons behind specific pins.