
Why Use a Package Manager?

Eric Apgar edited this page Feb 12, 2026 · 5 revisions

Effectively dealing with the worst part of Python.

Typical Life of a Python Developer

The following is the normal progression of installing third-party packages/libraries as you slowly gain more experience with Python:

  1. Install whatever you need directly into the system Python.
    • Pure chaos, as this just adds libraries to your single available system environment with complete disregard for version conflicts or whether the specific project you're working on needs them.
    • You do this when you're just trying to write your first hello world and you don't know any better.
  2. Thinking Anaconda is a good idea.
    • At first, it seems like someone collecting a bunch of libraries and versioning them for you is a good thing.
    • Someone else installing and trying to manage packages for you is never the solution.
  3. Discovering Virtual Environments.
    • The perfect way to switch between different setups for different projects.
    • Manage these with pip and keep careful track of the installed libraries with requirements.txt.
    • Life is good.
  4. Clean local imports start requiring pyproject.toml.
    • At some point during the typhoon of managing local imports, you will find yourself shipwrecked and washed upon the beach here.
    • There is only one good way to import local packages (i.e. classes from another file within your repo) cleanly: install the project itself into the environment, which gives you clean access to everything inside your project.
    • pyproject.toml is now "required".
    • Life is no longer good.
  5. Your Python skills grow beyond a simple requirements.txt.
    • For the majority of your Python life, a well managed requirements.txt file was sufficient for 99% of what you tried to do.
    • But now you want to do advanced things like: create your own libraries, have optional library installs, create indexes for handling weird libraries like PyTorch, etc.
    • A simple requirements.txt won't get you the flexibility you need.
  6. Might as well use Astral uv.
    • Now that we're already adding files and structure to manage our repo and environments effectively, and uv builds on exactly that framework, switching to uv is a smaller jump than it was back when all we needed was a simple requirements.txt file with basic libraries.
    • It works on Linux and Windows, which means no split instructions for library management. Your working process is the same everywhere!
    • It manages Python versions as well as virtual environments, which means no more working with pyenv on Linux but not on Windows, etc. Pyenv is replaced by uv.
    • It installs libraries into virtual environments more intelligently: packages are kept in a global cache and linked into each virtual environment, so a version shared across projects is stored on disk only once, which cuts down on size and install time.
    • Life is definitely more complex than in the early days, and there are more hoops to jump through initially, but that's because you're doing more. Ultimately you save a lot of time and headache working with uv.
    • Life is good again.
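For step 4, a minimal pyproject.toml is enough to make the project itself installable. A sketch, with a hypothetical project name and setuptools assumed as the build backend:

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "my_project"          # hypothetical package name
version = "0.1.0"
dependencies = ["requests"]
```

With this in place, `pip install -e .` installs the repo into the active environment in editable mode, and imports like `from my_project.some_module import SomeClass` work cleanly no matter which directory you run from.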
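The "advanced things" in step 5 (optional installs, custom indexes for libraries like PyTorch) are exactly what pyproject.toml handles and a flat requirements.txt can't. A sketch with hypothetical names, using the CPU-only PyTorch index as the example and uv's index configuration:

```toml
[project]
name = "my_project"
version = "0.1.0"
dependencies = ["numpy", "torch"]

# Optional extras: install with `pip install .[dev]` or `uv sync --extra dev`.
[project.optional-dependencies]
dev = ["pytest", "ruff"]

# uv-specific: route torch to its own index instead of PyPI.
[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[tool.uv.sources]
torch = { index = "pytorch-cpu" }
```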
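The virtual-environment workflow from step 3 can be sketched as follows; the package name (`requests`) is just an example:

```shell
# Create an isolated environment for this project (Linux/macOS shown;
# on Windows the activate script lives in .venv\Scripts\activate).
python3 -m venv .venv
source .venv/bin/activate

# Install what the project needs into this environment only.
pip install requests

# Record the exact installed versions so the setup is reproducible...
pip freeze > requirements.txt

# ...and recreate it later (or on another machine) with:
pip install -r requirements.txt
```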

What to do?

Create a template repo that has all the files uv requires (most importantly the pyproject.toml) so that you have a solid starting point for every new repo. Edit the .toml as needed and hit the ground running with uv.

uv is lightweight and installs on Raspberry Pis, Windows, Linux, basically anywhere. It does everything well and works very consistently.
