Today, we (and I mean me and my company) use poetry for prod code delivery and it just works. Poetry uses pip and virtualenv under the hood. It's worth understanding virtualenv regardless.
For every project you're developing, there will be a virtualenv that has all the dependencies that project needs (which may be different from what's installed on the system, and different from what other projects need).
"python -m venv init project/venv" will create it.
"rm -rf project/venv" will delete it.
Usually the virtualenv goes somewhere well known, like "project/venv". Sourcing the activate script ("source project/venv/bin/activate") changes your shell environment to use the virtualenv instead of the system python environment. Once activated, "pip install package" installs into the activated virtualenv. "deactivate" will turn off the virtualenv.
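Putting those pieces together, a typical session looks something like this ("requests" here is just a stand-in for whatever package your project actually needs):

    python -m venv project/venv            # create the virtualenv
    source project/venv/bin/activate       # point this shell at it
    pip install requests                   # installs into project/venv, not the system
    python -c "import requests"            # picks up the virtualenv's packages
    deactivate                             # back to the system python
    rm -rf project/venv                    # delete it, packages and all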
This is syntactic sugar for what's really happening behind the scenes. There's a python executable in the virtualenv (a copy of, or symlink to, the interpreter that created it): "project/venv/bin/python", which uses the virtualenv regardless of whether the virtualenv is activated or not. "activate" essentially just adds "project/venv/bin" to the start of the PATH (and sets a couple of environment variables like VIRTUAL_ENV). "deactivate" removes it.
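Which means nothing stops you from skipping activation entirely and calling the virtualenv's binaries by their full path (paths below assume the "project/venv" layout from above; "my_script.py" is just a placeholder):

    project/venv/bin/python -m pip install requests   # installs into the venv, no activation needed
    project/venv/bin/python my_script.py               # runs with the venv's packages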
Regardless, you can always see which python you're using by typing "which python". The system python (/usr/bin/python) will use system packages. The venv python (project/venv/bin/python) will use the virtualenv's packages.
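For example, the output before and after activation would look roughly like this (exact paths depend on where your project lives; "/home/you" is just an assumed location):

    $ which python
    /usr/bin/python
    $ source project/venv/bin/activate
    (venv) $ which python
    /home/you/project/venv/bin/python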
This allows you to have different virtualenvs to try out things like a new version of python, say. And each virtualenv is isolated from every other virtualenv.
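For instance, assuming you have both a python3.11 and a python3.12 interpreter installed, you could keep one venv per version, and packages installed in one won't show up in the other:

    python3.11 -m venv project/venv311
    python3.12 -m venv project/venv312
    project/venv311/bin/python --version   # Python 3.11.x
    project/venv312/bin/python --version   # Python 3.12.x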
And poetry is just a nice wrapper around this process that also resolves the project's full dependency tree and writes a "lock" file that pins every dependency to an exact version for reproducible releases. Pipenv is basically the same thing.
"pip install --user poetry" will install it in your home directory.
It's not recommended to use conda and pip together; the general advice is to pick one or the other, though I've heard miniconda behaves better in this regard. YMMV.