Pip-like python environment management
The mainline family of python package managers
April 18, 2011 — March 9, 2025
pip is the default Python package installer. It is the most widely used and most widely supported package manager for Python, especially if you include the many derivative versions which extend the baseline implementation.
1 Basics
It is best invoked as python -m pip, which ensures we use the pip that belongs to the interpreter we think we are using.
2 pip without venv
I assume throughout here that we are using venv to manage our Python environments. This is the best practice, and I recommend it.
Note that we can use pip to install packages outside of a virtual environment. It will happily execute outside an active environment without complaint.
But don’t do that. Doing so does extremely weird things, installing possibly-conflicting packages into a global Python. I have no idea what it would be good for, apart from creating confusing errors and introducing bugs. Depending on which platform I am on, it will either work, fail, or introduce subtle problems that I will notice later when other things break. I have not found a use case for installing packages outside a contained virtual environment in more than a decade of Python programming. Maybe in a Docker container or something?
Anyway, read on for the important bit.
3 pip + venv
venv is the built-in Python virtual environment system in Python 3, replacing virtualenv, which we still find documented about the place. It creates self-contained environments that do not interfere with each other, which is what most people want.
venv works, for all that I would like it to work more smoothly. While it doesn't support Python 2 (but, also, let Python 2 go unless someone is paying you money to keep a grip on it), it does fix various problems: e.g., it supports framework Python on macOS, which is important for GUIs, and it is covered by the Python docs in the Python virtual environment introduction. venv is a good default choice: a widely supported and adequate, if not awesome, workflow.
# Create venv in the current folder
python3 -m venv ./venv --prompt some_arbitrary_name
# or if we want to use system packages:
python3 -m venv ./venv --prompt some_arbitrary_name --system-site-packages
# Use venv from fish OR
source ./venv/bin/activate.fish
# Use venv from bash
source ./venv/bin/activate
Hereafter, I assume we are in an active venv. Now we use pip. I always begin by upgrading pip itself and installing wheel, which is a bit of installation infrastructure that is helpful in practice. Thereafter, everything else should install more correctly, except when it doesn't.
To snapshot dependencies in requirements.txt:
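For example, writing the current environment's package list to a file:

```shell
# Record everything currently installed, pinned to exact versions
python -m pip freeze > requirements.txt
```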
I do not recommend using the freeze command except as a first draft. It is too specific and includes very precise version numbers and obscure, locally specific sub-dependencies. Best keep a tally of the actual hard dependencies and let pip sort out the details.
To restore dependencies from a requirements.txt:
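The standard invocation:

```shell
# Install everything listed in requirements.txt into the active venv
python -m pip install -r requirements.txt
```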
Version specification in requirements.txt looks something like this:
MyProject
YourProject == 1.3
SomeProject >= 1.2, < 2.0
SomeOtherProject[foo, bar]
OurProject ~= 1.4.2
TheirProject == 5.4 ; python_version < '3.8'
HerProject ; sys_platform == 'win32'
requests [security] >= 2.8.1, == 2.8.* ; python_version < "2.7"
The ~= operator is a handy lazy shortcut; it permits point releases, but not minor releases, so e.g. ~=1.3.0 will also be satisfied by version 1.3.9 but not by 1.4.0.
Gotcha: pip's requirements.txt does not actually specify the version of Python itself when you install from it, although you might think so from the python_version specifier. See Python versions for how to stipulate the Python version at package development time.
4 uv
uv is “an extremely fast Python package and project manager, written in Rust.” It is getting strong reviews, e.g. Loopwerk: Revisiting uv. Source at astral-sh/uv.
Claimed highlights:
- 🚀 A single tool to replace pip, pip-tools, pipx, poetry, pyenv, twine, virtualenv, and more.
- ⚡️ 10–100x faster than pip.
- 🐍 Installs and manages Python versions.
- 🛠️ Runs and installs Python applications.
- ❇️ Runs scripts, with support for inline dependency metadata.
- 🗂️ Provides comprehensive project management, with a universal lockfile.
- 🔩 Includes a pip-compatible interface for a performance boost with a familiar CLI.
- 🏢 Supports Cargo-style workspaces for scalable projects.
- 💾 Disk-space efficient, with a global cache for dependency deduplication.
- ⏬ Installable without Rust or Python via curl or pip.
- 🖥️ Supports macOS, Linux, and Windows.
uv is backed by Astral, the creators of Ruff.
Notably, uv
has deep, highly specific pytorch support, so if you are doing ML on the GPU, this might be a good choice.
One thing it does not provide is a build back-end; we need to choose one. Who knew that was a thing? I did not know there was a build-front-end/vs build-backend distinction.
The uvx command invokes a tool without installing it.[…] Tools are installed into temporary, isolated environments when using uvx.
4.1 uv installation
We can install uv many ways. I prefer pipx on Linux and Homebrew on macOS.
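For example, assuming pipx (or Homebrew on macOS) is already set up:

```shell
# Linux: install uv as an isolated app via pipx
pipx install uv
# macOS: install via Homebrew
brew install uv
```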
Shell completions require manual intervention:
# Determine your shell (e.g., with `echo $SHELL`), then run one of:
echo 'eval "$(uv generate-shell-completion bash)"' >> ~/.bashrc
echo 'eval "$(uv generate-shell-completion zsh)"' >> ~/.zshrc
echo 'uv generate-shell-completion fish | source' >> ~/.config/fish/config.fish
Annoyingly, uv will not autocomplete project entry points, so e.g. uv run will not autocomplete with the list of permissible scripts.
4.2 uv project setup
uv requires us to specify a build system in the pyproject.toml file if we want to "install" the package we are currently working on (e.g. do relative imports, have sub-folders in the source tree…). It does not provide one by default. To get one, create the project as one of the following:
If the env is already created, we need to add the build system manually into pyproject.toml; some useful ones are included.
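For example, a minimal hatchling declaration in pyproject.toml (setuptools or flit-core would follow the same pattern):

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```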
Alternatively, we can set tool.uv.package = true in pyproject.toml.
4.3 Migrating to uv
- mkniewallner/migrate-to-uv: Migrate a project from Poetry/Pipenv/pip-tools/pip to uv package manager
- poetry-to-uv is a script which automates (some of) the process of migrating from poetry. (This one was a little fragile for me, but it might work for you.)
- uv-migrator: A New Tool to Easily Migrate Your Python Projects to UV Package Manager : r/Python
- python - How to migrate from Poetry to UV package manager? - Stack Overflow
- Sebastián Ramírez: “BTW, you can migrate from Poetry to uv in 5-10 min 🚀 uv uses the standard pyproject.toml format used by almost all others, Hatch, PDM, Flit… (except Poetry) 🤓 You can use another tool (PDM) to migrate the configs to this standard 📜 It’s 4 steps.
- Loopwerk: How to migrate your Poetry project to uv.
- Migrating to uv - Instructor.
4.4 uv shell completions
This seems to allow me to get tab-completion for uv run in fish shell.
function __fish_uv_run_completions
    # `uv run` with no arguments lists the available scripts; capture that output
    set -l uv_output (uv run 2>/dev/null)
    # Pull the script names out of lines shaped like "- name"
    string match -r -- '^\s*- (\w+)' $uv_output | cut -d' ' -f2
end
complete -c uv -n '__fish_seen_subcommand_from run' -a '(__fish_uv_run_completions)'
Add it to ~/.config/fish/completions/uv.fish and then source ~/.config/fish/completions/uv.fish.
5 Poetry
No! Wait! The new new new hipness is poetry. All the other previous hipnesses were not the real eternal ultimate hipness that transcends time. I know we said this every previous time a new Python packaging system came out, but this time it's real and our love will last forever ONO.
Surprise twist: it turns out this love was not actually eternal and my ardour for poetry has cooled. Poetry no longer has an edge over other similar projects in terms of function and has a problematic history of getting logjammed; see Issue #4595: Governance—or, “What do we do with all these pull requests?”.
It might be usable if your needs are modest or you are prepared to jump into the project discord, which seems to be where the poetry hobbyists organise, but since I want to use this project merely incidentally, as a tool to develop something else, hobbyist levels of engagement are not something I can participate in. poetry is not ready for prime-time, at least for my use-case.
Note also that poetry is having difficulty staying current with the (admittedly annoying) local versions, as made famous by CUDA-supporting packages. There is an example of the kind of antics that make it work below.
Poetry is a tool for dependency management and packaging in Python. It allows you to declare the libraries your project depends on, and it will manage (install/update) them for you.
From the introduction:
Packaging systems and dependency management in Python are rather convoluted and hard to understand for newcomers. Even for seasoned developers, it might be cumbersome at times to create all files needed in a Python project: setup.py, requirements.txt, setup.cfg, MANIFEST.in and the newly added Pipfile.
So I wanted a tool that would limit everything to a single configuration file to do: dependency management, packaging and publishing.
It takes inspiration from tools that exist in other languages, like composer (PHP) or cargo (Rust).
And, finally, I started poetry to bring another exhaustive dependency resolver to the Python community apart from Conda's.
What about Pipenv?
In short: I do not like the CLI it provides, or some of the decisions made, and I think we can make a better and more intuitive one.
Low-key dissing on similarly dysfunctional competitors is an important part of Python packaging.
Lazy install is via this terrifying command line (do not run if you do not know what this does):
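At the time of writing, the official installer pipes a remote script straight into Python, hence the warning:

```shell
# Official Poetry installer: downloads a script from the web and executes it
curl -sSL https://install.python-poetry.org | python3 -
```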
Poetry is similar to pipenv, in that it (by default, but not necessarily) manages dependencies in a local venv. It has a more full-service approach than systems built on pip. For example, it has its own dependency resolver, which uses modern dependency metadata but also works with previous dependency specifications by brute force if needed. It separates specified dependencies from the ones it resolves in practice, which means dependencies seem to transport much better than conda, which generally requires you to hand-maintain a special dependency file full of just the stuff you actually wanted. In practice, its many small conveniences and thoughtful workflow are helpful. For example, it sets up the current package for development by default so that imports work as similarly as possible across this local environment and when it is distributed to users.
poetry shell finds the wrong venv
Yes, it does this for me sometimes too. It is not consistent, though, and seems to be a particular shell environment that causes this glitch.
Force it to use the correct venv with:
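Something like the following, which tells poetry which interpreter (and hence which venv) to use:

```shell
# Point poetry at the interpreter on the current PATH
poetry env use $(which python3)
```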
In fact, they have removed poetry shell as of 2.0.0 because it is no good.
The new way is something like:
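In Poetry 2.x the built-in replacement is poetry env activate, which prints an activation command rather than spawning a subshell (the old behaviour lives on as the poetry-plugin-shell plugin):

```shell
# Evaluate the activation command that poetry prints
eval $(poetry env activate)
```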
5.1 CUDA and other local versions in poetry
As mentioned below, poetry does not support installing build variants/profiles, which means I cannot install GPU software, and thus in practice it is burdensome to use for machine learning applications. There are workarounds: Instructions for installing PyTorch show a representative installation specification for PyTorch.
[tool.poetry.dependencies]
python = "^3.10"
numpy = "^1.23.2"
torch = { version = "1.12.1", source="torch"}
torchaudio = { version = "0.12.1", source="torch"}
torchvision = { version = "0.13.1", source="torch"}
[[tool.poetry.source]]
name = "torch"
url = "https://download.pytorch.org/whl/cu116"
secondary = true
Note that this produces various errors and downloads gigabytes of supporting files unnecessarily, but it eventually works. It was too burdensome for my workflow, so I switched back to pip.
There is a new way for torch 2.0 and later:
poetry source add --priority=supplemental torch https://download.pytorch.org/whl/cu118
poetry add torch==2.0.1+cu118 --source torch
or:
poetry add "https://download.pytorch.org/whl/cu118/torch-2.0.0%2Bcu118-cp310-cp310-linux_x86_64.whl"
I have not tried it.
poetry and PyTorch notionally play nice in PyTorch 2.1, in the sense that PyTorch 2.1 is supposed to be installable with poetry, with CUDA. It is not yet clear to me how we would set up PyTorch so it works either with or without CUDA.
5.2 Jupyter kernels from my poetry env
Easy:
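One way to do it, assuming ipykernel is installed in the poetry env (the kernel name is a placeholder):

```shell
# Register the project venv as a Jupyter kernel
poetry run python -m ipykernel install --user --name my-project-env
```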
5.3 Dev dependencies
Poetry does not specifically support dev dependencies. What they do support are generic dependency groups which might happen to be dev dependencies but would say “don’t label me, man.”
[tool.poetry.group.dev] # This part can be left out
optional = true
[tool.poetry.group.dev.dependencies]
ipdb = "~0.13.13"
ipykernel = "~6.29.4"
scalene = "~1.5.41"
Now, we install:
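With the group marked optional as above, it must be requested explicitly:

```shell
# Install the project dependencies plus the optional dev group
poetry install --with dev
```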
6 pipenv
⛔️⛔️UPDATE⛔️⛔️: Note that the pipenv system does not support "local versions" and is therefore unusable for machine learning applications. This project is dead to me. (Bear in mind that my opinions will become increasingly outdated depending on when you read this.)
venv has a higher-level, er, …wrapper (?) interface called pipenv.
Pipenv is a production-ready tool that aims to bring the best of all packaging worlds to the Python world. It harnesses Pipfile, pip, and virtualenv into one single command.
I switched to pipenv from poetry because it looked less chaotic than poetry. I think it is, although not by much.
HOWEVER, it is still pretty awful for my use-case. To be honest, I'd just use plain pip and requirements.txt, which, while primitive and broken, are at least broken and primitive in a well-understood way.
At the time of writing, the pipenv website was 3 weeks into an outage, because dependency management is a quagmire of sadness and comically broken management with terrible Bus factor. However, the backup docs site is semi-functional, albeit too curt to be useful and, as far as I can tell, outdated. The documentation site inside GitHub is readable. See also an introduction showing pipenv and venv used together.
The dependency resolver is, as the poetry devs point out, broken in its own special ways. The procedure to install modern ML frameworks, for example, is gruelling.
For my system, important settings are:
To get the venv inside the project (required for sanity in my HPC) I need the following:
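The relevant setting is an environment variable; with this exported, pipenv creates the venv in ./.venv inside the project:

```shell
# Keep the virtualenv inside the project directory
export PIPENV_VENV_IN_PROJECT=1
```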
Pipenv will automatically load dotenv files, which is a nice touch.
7 pipx
Pro tip: pipx:
pipx is made specifically for application installation, as it adds isolation yet still makes the apps available in your shell: pipx creates an isolated environment for each application and its associated packages.
That is, pipx is an application that installs global applications for you.
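Usage is a one-liner; ruff here is just an arbitrary example of a Python CLI tool:

```shell
# Installs ruff into its own isolated venv and puts the executable on the PATH
pipx install ruff
```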
8 Rye
9 PDM
PDM is a modern Python package and dependency manager supporting the latest PEP standards. But it is more than a package manager. It boosts your development workflow in various aspects. The most significant benefit is it installs and manages packages in a similar way to npm that doesn't need to create a virtualenv at all!
Feature highlights:
10 Flit
Make the easy things easy and the hard things possible is an old motto from the Perl community. Flit is focused on the easy things part of that, and leaves the hard things up to other tools.
Specifically, the easy things are pure Python packages with no build steps (neither compiling C code, nor bundling Javascript, etc.). The vast majority of packages on PyPI are like this: plain Python code, with maybe some static data files like icons included.
It’s easy to underestimate the challenges involved in distributing and installing code, because it seems like you just need to copy some files into the right place. There’s a whole lot of metadata and tooling that has to work together around that fundamental step. But with the right tooling, a developer who wants to release their code doesn’t need to know about most of that.
What, specifically, does Flit make easy?
- flit init helps you set up the information Flit needs about your package.
- Subpackages are automatically included: you only need to specify the top-level package.
- Data files within a package directory are automatically included. Missing data files have been a common packaging mistake with other tools.
- The version number is taken from your package's __version__ attribute, so it always matches the version that tools like pip see.
- flit publish uploads a package to PyPI, so you don't need a separate tool to do this.
Setuptools, the most common tool for Python packaging, now has shortcuts for many of the same things. But it has to stay compatible with projects published many years ago, which limits what it can do by default.
11 Hatch
Hatch is a modern, extensible Python project manager.
Features:
- Standardised build system with reproducible builds by default
- Robust environment management with support for custom scripts
- Easy publishing to PyPI or other indexes
- Version management
- Configurable project generation with sane defaults
- Responsive CLI, ~2-3x faster than equivalent tools
12 Python versions
If we use conda or uv, the Python version is handled for us along with the other dependencies. With pip, we need to manage it ourselves. Poetry is in between: it knows about Python versions but cannot install Python for us.
See pyenv for the most common solution to manage Python versions.
13 GPU/TPU/etc pain
Users of GPUs must ignore any other options, no matter how attractive all the other options might seem at first glance. The stupid drudge work of venv is the price of hardware support for now. Only pip and conda support hardware specification in practice.
UPDATE: poetry now supports PyTorch with CUDA. uv has had a crack at it too.
Terminology you need to learn: many packages specify local versions for particular architectures as part of their functionality. For example, pytorch comes in various flavours which, when using pip, can be selected in the following fashion:
# CPU flavour
pip install torch==1.10.0+cpu -f https://download.pytorch.org/whl/cpu/torch_stable.html
# GPU flavour
pip install torch==1.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
The local version is given by the +cpu or +cu113 bit, and it changes what code will be executed when using these packages. Specifying a GPU version is essential for many machine learning projects (essential, that is, if I don't want my code to run orders of magnitude slower). The details of how this can be controlled with regard to the Python packaging ecosystem are somewhat contentious and complicated and thus not supported by any of the new-wave options like poetry or pipenv. Brian Wilson argues,
During my dive into the open-source abyss that is ML packages and +localVersions I discovered lots of people have strong opinions about what it should not be and like to tell other people they're wrong. Other people with opinions about what it could be are too afraid of voicing them lest there be some unintended consequence. PSF has asserted what they believe to be the intended state in PEP-440 (no local versions published) but the solution (PEP-459) is not an ML Model friendly solution because the installation providers (pip, pipenv, poetry) don't have enough standardised hooks into the underlying hardware (cpu vs gpu vs cuda lib stack) to even understand which version to pull, let alone the Herculean effort it would take to get even just pytorch to update their package metadata.
There is no evidence that this logjam will resolve any time soon. However, it turns out that this machine learning thing is not going away, and ML projects use GPUs. It turns out that packaging projects with GPU code is hard. Since I do neural network stuff and thus use GPU/CPU versions of packages, I can effectively ignore most of the Python environment alternatives on this page. The two that work are conda and pip. These support a minimum viable local version package system de facto which does what we want. If you want something fancier, try containerization using a GPU-compatible system such as apptainer.