… and there’s a reason: Dependency Management.
Coming from languages like Node, Go, and Rust, I am used to decent dependency management (although Go was really bad at it for its first years, and has become decent only recently with the addition of go mod). Python, in my opinion, lacks good and well-defined package management. I'm sure there are ways, but I've spent the past days working with and around virtual environments, global installations, and figuring out what the heck PEP 668 is supposed to mean. I don't want to rant; I could do that on Mastodon or Bsky. No, I want to share some of my thoughts as a non-Python developer having to work with Python as part of my job.
Disclaimer: There's a not-so-small chance I'm just not deep enough into Python and its ecosystem. I'm very happy to learn and discuss, so please tell me if and where I am wrong.
Also, I do enjoy writing Python. It’s a fun language, and I’m starting to like it. This is also part of the reason I’m so frustrated with the package and dependency management.
pip
My first touchpoint with Python dependency management was pip. It's a tool which can be used to install Python packages, for example by reading a requirements.txt or by being passed a package name on the CLI directly. All the things are installed… in a global directory?! In my case things are placed in /opt/homebrew/lib/python3.12/site-packages – and it looks as if this is where all the packages for all my installed tools live. 😳
So if two tools require different versions … what happens? According to the pip documentation, pip will try to resolve all the dependencies, and the dependencies' dependencies, and so on – it's a lot of resolving to do, and a complex task. This is something all package managers need to do, especially if they try to be smart about reusing already installed libraries.
Anyhow, with pip install we have global packages, and nothing is scoped to the tool being installed (or the library required by code). Everything is global. So we must be lucky that things don't break, and tool requirements must be loose enough to allow for minor or patch updates.
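By the way, if you want to see where your pip puts these global installs, you can ask Python itself – a quick check (the exact path depends on your interpreter and OS):

import sysconfig

# "purelib" is the install directory for pure-Python packages --
# on a Homebrew Python this is something like
# /opt/homebrew/lib/python3.12/site-packages
print(sysconfig.get_paths()["purelib"])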
venv
Enter venv, or virtual environments. venv itself is part of the Python standard library and solves the issue of globally installed dependencies. It creates a folder in the project path, installs all libraries into that path, and uses the pip and python binaries from this path as well, so we have an encapsulated environment in which we can execute our code. This is awesome!
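Creating one is a single command (python -m venv .venv), and since venv is part of the standard library, the same thing works from Python code – a minimal sketch:

from venv import EnvBuilder

# Create ./.venv with its own python binary;
# with_pip=True bootstraps pip into the new environment.
EnvBuilder(with_pip=True).create(".venv")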
It feels a bit like npm, which also installs copies of the used libraries in a local folder (node_modules). Why this is not something pip does, I do not understand. It seems reasonable to offer a way of managing and using virtual environments directly from pip – which brings us to the next tool.
pipx
Now things are coming together!
pipx is a tool made especially for installing Python executables – in self-contained virtual environments, all managed by pipx. This is functionality I would expect from pip, and it's similar to npm install -g, which installs a tool and its dependencies in a global scope.
With pipx, we finally have a tool which allows us to execute a command like pipx install ansible and get a fully working, venv-enabled Ansible installation in ~/.local/pipx/venvs/ansible (the path may differ on your system).
Perfection 🧑‍🍳😘
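Since every tool gets its own virtual environment under that directory, you can peek at what pipx manages with a few lines of Python – assuming the default location, which pipx environment will confirm for your system:

from pathlib import Path

# Default pipx venv directory on my system -- run `pipx environment`
# to find the actual path (PIPX_LOCAL_VENVS) on yours.
venvs = Path.home() / ".local" / "pipx" / "venvs"
for venv in sorted(venvs.iterdir()):
    print(venv.name)  # e.g. "ansible"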
Although pipx is a great solution, and I'm using it for all my own tool installations, it wasn't a good fit for my specific work-related case. The problem I faced was developers having issues updating one of the tools my team provides: they ran into the PEP 668 issue and couldn't update the tool unless they specified --break-system-packages, which isn't something they should do – and not something I'd want to tell them to do!
I ended up using venv directly to ensure our tool is installed in a reliable location on every user's computer. The installer we use also manages the PATH by adding the venv's bin directory to .bashrc or .zshrc, and it creates a symlink in /usr/local/bin.
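To sketch what that installer does (the install location and tool name here are placeholders, and the symlink step usually needs elevated permissions):

import subprocess
from pathlib import Path
from venv import EnvBuilder

# Hypothetical fixed install location for the tool's venv
VENV_DIR = Path.home() / ".toolname" / "venv"

def install():
    # 1. Create the dedicated venv, including its own pip
    EnvBuilder(with_pip=True).create(VENV_DIR)
    # 2. Install the tool with the venv's interpreter so it lands inside the venv
    subprocess.check_call([
        str(VENV_DIR / "bin" / "python"), "-m", "pip", "install", "toolname",
    ])
    # 3. Symlink the entry point into /usr/local/bin (usually requires sudo)
    link = Path("/usr/local/bin/toolname")
    if not link.exists():
        link.symlink_to(VENV_DIR / "bin" / "toolname")

(The PATH handling in .bashrc/.zshrc is left out here for brevity.)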
In the process I also added an “update” command to make further updates easier, and I found what I think is a nice way of doing it:
import subprocess
import sys

def update(index_url):
    # sys.executable is the interpreter running this code -- inside a
    # venv that is the venv's own python, so pip upgrades the tool in
    # exactly the environment it was installed into.
    subprocess.check_call([
        sys.executable,
        "-m", "pip",
        "install", "-U",
        "--index-url", index_url,  # e.g. the team's private package index
        "--extra-index-url", "https://pypi.org/simple",
        "toolname",
    ])
This code will use the Python executable which is used to run the code, and then execute a pip install -U which updates the tool (here named toolname). The Python executable is the one from the virtual environment, so the update command is always executed in the context of the venv – neat! 🐍
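Hooking that into the tool's CLI is then a one-liner per subcommand – a hypothetical wiring with argparse (the index URL is made up):

import argparse

parser = argparse.ArgumentParser(prog="toolname")
subcommands = parser.add_subparsers(dest="command")
subcommands.add_parser("update", help="update toolname to the latest version")

args = parser.parse_args()
if args.command == "update":
    update("https://pypi.example.com/simple")  # hypothetical internal index URL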
Conclusion
I wanna say that the title, “I don't like Python”, was a little clickbait-y. I do like Python, and I see its benefits, especially when compared to shell scripting, and especially for glue code in infrastructure and for one-off tooling and scripting. Still, for complex CLI tools I, personally, would choose either Go or Rust. Both have great CLI libraries (clap for Rust, and cobra for Go), compile to binaries, and are easy to distribute, with all dependencies bundled into the binary.
If you want to discuss this post, feel free to comment or join the discussion on Mastodon, Bluesky, or LinkedIn!