TLDR
- Python local development is frustrating due to dependency management and virtual environments.
- Tools like Hatch and Poetry try to help but often add complexity instead of solving the root problem.
- Mixing venv with Docker creates extra overhead.
- The simplest reliable setup I have found is to just use uv and Docker together.
Introduction
Over the years I have used many different technologies and tools, and none frustrates me more than Python when developing locally. To "fix" the dependency management problem we now need virtual environments. No need to worry, here come Hatch and Poetry to the rescue, adding even more complexity to "solve" the problem. I thought all hope was lost until I started using uv as my local development tool. It's lightweight, fast, and makes it super simple to get a local development environment up and running.
First I'll explain some of what makes the other tools so frustrating to use.
Virtual Environments without Tooling
It's not that getting a virtual environment up and running with Python is difficult; it's the pain that comes along with doing even something that simple.
Before anything else, I'd suggest setting the following pip configuration so that packages can only be installed inside a virtual environment:
PIP_REQUIRE_VIRTUALENV=true
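This is just an environment variable, so one way to make it stick, assuming a bash or zsh setup, is to export it from your shell profile:
# in ~/.bashrc or ~/.zshrc
export PIP_REQUIRE_VIRTUALENV=true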
First we create our virtual environment:
python -m venv .venv
source .venv/bin/activate
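From there you would typically install your dependencies (assuming the project keeps them in a requirements.txt):
pip install -r requirements.txt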
Easy enough, right? Well, unfortunately that isn't enough to get your IntelliSense working properly. You install your packages from requirements.txt, yet when you start importing them none of your packages show up. If you deal with this pain enough, you realize that you also need to point your editor's Python interpreter at the virtual environment. With that, we finally have a working virtual environment that we can code in. However, there is still something missing: Docker.
Adding Docker to the Equation
Assuming your repo is a standalone API, most technologies really only need a Dockerfile so everyone can build and run locally with minimal effort. With Python, though, Docker becomes an addition to your virtual environment rather than a replacement. The difference between Python and other languages is the mental overhead of managing two isolated environments, the venv and the container, if you are using Docker.
Most language ecosystems simply rely on Docker, because it is a proven technology with great tooling. With Python it gets murkier, because many engineers don't want to juggle both a venv and a container. You essentially have to choose between complexity (Docker and venv) or a local setup that differs from production (venv without Docker).
As long as the dependency management issues exist within Python, the goal should be to limit the complexity of juggling venv and Docker. If not, engineers will continue to lean more towards just using venv, or, worse, add yet another tool to work locally.
Hatch and Poetry to Solve Everything
I have only used Hatch in production settings, so I will stick to discussing that here.
When I came back to Python from Java, Go, .NET Core, and Node.js, I was shocked at how bad the tooling was. Then I started working with Hatch and it made my head hurt even more. Hatch in and of itself is a nice tool, but it doesn't solve the real problem Python has, so it ends up just adding complexity. Hatch is essentially tooling around virtual environments to make them "easier", but in my experience it makes everything more confusing. Here is a simple pyproject.toml that can be used with Hatch:
[project]
name = "my-fastapi-app"
version = "0.1.0"
description = "A simple FastAPI service"
authors = [{ name = "Your Name", email = "you@example.com" }]
dependencies = [
"fastapi",
"uvicorn[standard]",
]
requires-python = ">=3.11"
[tool.hatch.envs.dev]
dependencies = [
"black",
"flake8",
"mypy",
"httpx",
]
[tool.hatch.envs.dev.scripts]
start = "uvicorn main:app --reload --host 0.0.0.0"
lint = "flake8 app"
typecheck = "mypy app"
[tool.hatch.envs.test]
dependencies = [
"pytest",
"pytest-asyncio",
"coverage",
]
[tool.hatch.envs.test.scripts]
unit = "pytest tests"
Note that this is a simplified version of what you might see; most projects I have worked in get far more complex than this. We now essentially have two environments: one for running the API itself (dev) and one for writing and running unit tests (test). With Hatch we can spin these virtual environments up with something like:
hatch shell dev
Easy enough, but if I also want to spin up the test environment I need to open another terminal, as shown below. Also, since the two environments define different packages, I need to switch interpreters whenever I work on unit tests. You can argue this is not an optimal way to leverage Hatch, which I agree with, but it makes the point that if you have a monorepo with many different services and dependencies, keeping track of them all is a nightmare.
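For reference, entering the test environment and running its unit script from the config above looks something like this:
hatch shell test
hatch run test:unit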
Another pitfall of using Hatch is that engineers start seeing it as the tooling needed in CI/CD and in Docker. In my opinion Hatch is unnecessary in production, but because it's at the core of local development, it's confusing to then not use it everywhere else.
I do not believe this is engineers' fault. Engineers are conditioned by tools like Docker to think local dev and production should be as close to each other as possible. Leaving out a tool a lot of engineers are familiar with is confusing.
Docker already gives you an isolated environment, so adding Hatch to Docker means running an isolated environment inside another isolated environment.
Can uv Save Us?
I just want to preface this with: uv does not inherently solve all of the problems, but it sure does offer the simplest approach to local Python development. For starters it creates the virtual environment for you so you don't need to worry about doing that each time.
Another really nice thing uv does is create a lock file, uv.lock, which ensures everyone is working off the same dependency versions. The lock file also pins transitive dependencies, unlike a requirements.txt, which is just a flat list of packages.
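When a teammate pulls your changes, a single command brings their environment in line with the lock file:
uv sync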
The thing I loved the most, though, is how simple uv is to use and to leverage alongside Docker. To get started, all you need to do is run:
uv init
This will create a pyproject.toml file, and uv automatically creates a .venv for you the first time you add a package or run the project. This is why I said uv makes virtual environments easier; it does not eliminate them, it automates their creation.
Want to add a package?
uv add fastapi
Want to remove a package?
uv remove fastapi
Your pyproject.toml file stays super light and easy to understand.
[project]
name = "sample-uv"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
"fastapi>=0.116.1",
"uvicorn>=0.35.0",
]
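You can also run the app directly through uv, which will create and sync the .venv on demand, for example:
uv run uvicorn main:app --reload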
Now we can add a Dockerfile that can handle local development for us, while also keeping it configurable for our deployed environments.
# uv's official image ships both Python and uv; 3.13 matches requires-python above
FROM ghcr.io/astral-sh/uv:python3.13-bookworm-slim
WORKDIR /app
# copy only the dependency files first so this layer stays cached between code changes
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-cache --no-dev
COPY . /app
EXPOSE 8000
# --reload is only appended when UVICORN_RELOAD is set, so the same image works in production
CMD ["sh", "-c", "uv run uvicorn main:app --host 0.0.0.0 --port 8000 ${UVICORN_RELOAD:+--reload}"]
The --reload flag, combined with the volume mount below, lets us see local file changes in Docker without a rebuild. The only time we need to rebuild the image is when we add or remove dependencies.
To take advantage of this, we first need to build the image and then run it with the project directory mounted:
docker build -t sample-uv-api .
docker run -p 8000:8000 -e UVICORN_RELOAD=1 -v $(pwd):/app -v /app/.venv sample-uv-api
NOTE: the extra -v /app/.venv anonymous volume keeps the bind mount from hiding the .venv that was built into the image. Also, this assumes macOS or Linux; on Windows the $(pwd) volume syntax is different.
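On Windows with PowerShell, for example, the equivalent run command would look something like this:
docker run -p 8000:8000 -e UVICORN_RELOAD=1 -v ${PWD}:/app -v /app/.venv sample-uv-api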
Just like that, we have a minimal local environment without all the extra overhead of learning Poetry or Hatch, yet one that is much more feature-rich than plain pip with venv.
Someday Python might fix all of this and there won't be a need to figure out how to make local development more manageable. Until then, I highly suggest looking into uv for your local tooling.