Why Python Local Development Feels Overcomplicated

2025-09-07

TLDR

Introduction

Throughout the years I have used many different technologies and tools, and none frustrate me more than Python when developing locally. Dependency management is a mess, so to "fix" it we now need virtual environments. No need to worry, here comes Hatch or Poetry to the rescue to add even more complexity to "solve" the problem. Python needs to fix its core problem, not invent new tooling to work around it. No technology is perfect, but when local development is this difficult, it becomes hard to want to voluntarily work with Python, especially for API development. Everything mentioned below assumes the use of VSCode.

Let's get into more detail on how Python tooling makes local development difficult.

Virtual Environments without Tooling

It's not that getting a virtual environment up and running in Python is difficult; it's the pain that comes along with doing something so simple.

First we create our virtual environment:

python -m venv .venv
source .venv/bin/activate

Easy enough, right? Unfortunately, that isn't enough to get IntelliSense working properly. You install your packages from requirements.txt, yet when you start importing, none of them resolve. Deal with this pain enough and you learn you also need to point your Python interpreter at the virtual environment. Now we have a working virtual environment we can code in. However, there is still something missing: Docker.
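One way to spare teammates the interpreter dance is to commit a workspace setting. This is a sketch assuming the Microsoft Python extension and a repo-root .venv; it goes in .vscode/settings.json (which tolerates comments):

```json
{
  // Point the Python extension at the repo's venv so
  // IntelliSense resolves packages installed there.
  "python.defaultInterpreterPath": "${workspaceFolder}/.venv/bin/python"
}
```

With this in place, anyone opening the repo gets the right interpreter suggested instead of rediscovering the fix themselves.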

Adding Docker to the Equation

Assuming your repo is a standalone API, most technologies really only need a Dockerfile so everyone can build and run locally with minimal effort. With Python, though, Docker becomes an addition to your virtual environment rather than a replacement. The difference from other languages is the mental overhead of managing two isolated environments if you are using Docker.
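For reference, the Dockerfile for a standalone API like this stays small. A minimal sketch, assuming a FastAPI app in main.py served on port 8000 (both are assumptions, not from a real project):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

In most other ecosystems, this file alone is the whole local-development story; with Python it sits alongside the venv you already maintain.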

Most languages lean entirely on Docker, because it is a proven technology with great tooling. With Python it gets murkier, since many people don't want to juggle two isolated environments. You essentially have to choose between complexity (Docker and venv) or running locally in a way that differs from production (venv without Docker).

As long as Python's dependency management issues exist, the goal should be to limit the complexity of juggling venv and Docker. Otherwise, engineers will continue to lean towards just using venv or, worse, add yet another tool to work locally.

Hatch and Poetry to Solve Everything

I have only used Hatch in production settings, so I will stick to discussing that here.

When I came back to Python from Java, Go, .NET Core, and Node.js, I was shocked at how bad the tooling was. Then I started working with Hatch and it made my head hurt even more. Hatch in and of itself is a nice tool, but it doesn't solve the real problem Python has, so it ends up just adding complexity. Hatch is essentially tooling around virtual environments meant to make them "easier", but in my experience it makes everything more confusing. Here is a simple TOML file that can be used with Hatch:

[project]
name = "my-fastapi-app"
version = "0.1.0"
description = "A simple FastAPI service"
authors = [{ name = "Your Name", email = "you@example.com" }]
dependencies = [
  "fastapi",
  "uvicorn[standard]",
]
requires-python = ">=3.11"

[tool.hatch.envs.dev]
dependencies = [
  "black",
  "flake8",
  "mypy",
  "httpx",
]

[tool.hatch.envs.dev.scripts]
start = "uvicorn main:app --reload --host 0.0.0.0"
lint = "flake8 app"
typecheck = "mypy app"

[tool.hatch.envs.test]
dependencies = [
  "pytest",
  "pytest-asyncio",
  "coverage",
]

[tool.hatch.envs.test.scripts]
unit = "pytest tests"

Note that this is a simple version of what you might see; most projects I have worked in get far more complex. We now essentially have two environments: one for running the API itself (dev) and one for writing and running unit tests (test). With Hatch we can spin these virtual environments up with something like:

hatch shell dev

Easy enough, but if I also want to spin up the test environment, I need to open another terminal. And since the two environments define different packages, I need to switch interpreters when working on unit tests. You can argue this is not an optimal way to leverage Hatch, which I agree with, but it makes the point: in a monorepo with many services and dependency sets, keeping track becomes a nightmare.
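To make the juggling concrete, here is roughly what the day-to-day commands look like against the TOML above (a sketch of Hatch's CLI; the env and script names come from that file):

```shell
hatch env create dev     # build the dev virtual environment
hatch run dev:start      # run the 'start' script inside it
hatch run test:unit      # separate env, separate interpreter
hatch env find test      # print that env's path, for the interpreter picker
```

Each environment lives in its own directory, so every interpreter switch in your editor means hunting down another path that Hatch generated for you.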

Another pitfall of using Hatch is that engineers start seeing it as the tooling needed in CI/CD and in Docker. In my opinion Hatch is unnecessary in production, but because it's at the core of local development, it's confusing to then not use it everywhere else.

I do not believe this is engineers' fault; they have been conditioned by tools like Docker to think local development and production should be as close to each other as possible. Leaving out a tool you use extensively locally is confusing.

Docker already gives you an isolated environment, so adding Hatch to Docker means running an isolated environment inside an isolated environment.

What is the Solution?

Unfortunately, the real solution is for Python to fix its dependency management so that local development is actually enjoyable. Since that is not likely to happen, I'd say go with the simplest possible approach: venv and a Dockerfile. Extra tooling just adds bloat and confusion to your project. You can create a bootstrap file to do all the env setup for you:

#!/usr/bin/env bash
set -e

# 1. Ensure .venv exists
if [ ! -d ".venv" ]; then
  python3 -m venv .venv
  echo "Created .venv"
fi

# 2. Activate the venv
source .venv/bin/activate
echo "Activated .venv using $(python --version)"

# 3. Upgrade pip + install deps
python -m pip install --upgrade pip
if [ -f requirements.txt ]; then
  python -m pip install -r requirements.txt
fi
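One subtlety worth calling out: running the script as ./bootstrap.sh spawns a child shell, so the activation does not persist in your terminal; you need to source it instead. A quick generic demonstration of the difference (the demo script is made up for illustration):

```shell
# A stand-in for bootstrap.sh: it just exports a variable,
# the same way 'activate' exports PATH and VIRTUAL_ENV.
cat > /tmp/demo_activate.sh <<'EOF'
export DEMO_ACTIVATED=1
EOF

bash /tmp/demo_activate.sh      # child process: the export is lost on exit
echo "after running:  DEMO_ACTIVATED=${DEMO_ACTIVATED:-unset}"

source /tmp/demo_activate.sh    # current shell: the export sticks
echo "after sourcing: DEMO_ACTIVATED=${DEMO_ACTIVATED:-unset}"
```

So the habit to teach teammates is `source ./bootstrap.sh` (or `. ./bootstrap.sh`), not `./bootstrap.sh`.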

This still doesn't solve the Docker out-of-sync problem when you add packages to requirements.txt, but it makes your local setup much easier for other engineers to understand. I plan to follow this up with a post on keeping Docker and venv in sync automatically, so you can run from a Docker image that better mimics what runs in your hosted environments.