Python packaging has been “getting better” for fifteen years. There have been real improvements: pyproject.toml replaced setup.py, PEP 518 gave us build system declarations, pip gained dependency resolution. The community is clearly working on it.
And yet if you ask a developer coming from Node.js, Rust, or Go to set up a Python project, they will encounter a tooling landscape that requires a tour guide. The problems are not fully solved. Some are structural.
The Core Problem Has Not Changed
The fundamental issue: Python’s packaging ecosystem evolved without a single authoritative tool. The resulting proliferation of partially compatible solutions means every project must choose among several valid options, each with different trade-offs.
In 2026, if you want to manage a Python project’s dependencies, you have legitimate choices including:
- pip + requirements.txt (classic, limited)
- pip + pyproject.toml (modern pip)
- Poetry (full dependency management, but slow resolver)
- PDM (Poetry-compatible, PEP 517/518 compliant)
- Hatch (flexible but complex)
- uv (fast, newer, Rust-based, limited feature set)
- conda/mamba (scientific/data science context, separate ecosystem)
These tools are partially compatible with each other and partially incompatible. pip cannot read a Poetry lock file, so a developer with only pip can install the project but loses the pinned, resolved versions the team agreed on. A project built on conda environments cannot be fully replicated with pip-based tools at all.
Compare this to Rust’s cargo, Go’s go mod, or Node’s npm - single authoritative tools where there is essentially no choice to make.
The Virtual Environment Problem
Python requires virtual environments to isolate project dependencies. This is not a flaw; it is a consequence of Python resolving imports from a shared site-packages directory rather than from per-project dependency trees. But the tooling around virtual environments is fragmented.
You can create a virtual environment with:
- python -m venv .venv (stdlib, manual)
- virtualenv (third-party, more features)
- conda create (conda ecosystem)
- pyenv + pyenv-virtualenv (version management + environments)
- Any of the high-level tools above (Poetry, PDM, etc.)
The resulting environments are largely compatible - they are all just isolated Python installations - but activating them, knowing which is active, and managing them across projects requires either a specific tool’s workflow or manual bookkeeping.
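The "all just isolated Python installations" point is easy to verify. As a minimal sketch, the stdlib venv module can create an environment programmatically (here into a temporary directory, with with_pip=False to skip the slow pip bootstrap), and every such environment, whichever front-end produced it, is marked by the same pyvenv.cfg file:

```python
import tempfile
import venv
from pathlib import Path

# Create a throwaway virtual environment with the stdlib venv module.
# with_pip=False skips bootstrapping pip, which keeps creation fast.
target = Path(tempfile.mkdtemp()) / ".venv"
venv.create(target, with_pip=False)

# Every venv, regardless of which tool created it, contains a pyvenv.cfg
# pointing back at the base interpreter that created it.
config = (target / "pyvenv.cfg").read_text()
print(config.splitlines()[0])  # "home = <path to base interpreter>" (path varies)
```

Activation scripts, tool integration, and lifecycle management are where the front-ends diverge; the environment on disk is the common denominator.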
What uv Actually Changes
uv, the package manager built by Astral (the team behind Ruff), is the most significant development in Python packaging in years.
The core claim: it is 10-100x faster than pip. This is not marketing exaggeration; representative numbers for a mid-sized project look like:
| Operation | pip | uv |
|---|---|---|
| Cold install (no cache) | 45s | 3s |
| Warm install (cached) | 8s | 0.5s |
| Dependency resolution | 15s | 1s |
uv achieves this by being written in Rust, using parallel downloads, and implementing a more efficient dependency resolver.
uv also handles Python version management (replacing pyenv for many workflows), virtual environment creation, and project management with a pyproject.toml-based workflow.
The significant limitation as of 2025-2026: uv does not have full feature parity with Poetry for complex publishing and distribution workflows. It is excellent for application development. For library authors with complex publishing requirements, the choice is less clear.
The pyproject.toml Standardization
One genuine improvement: pyproject.toml is now the standard place to declare project metadata and build system configuration. This is specified in PEPs 517, 518, and 621.
```toml
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "httpx>=0.27",
    "pydantic>=2.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0",
    "ruff>=0.4",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
The structure is standardized. The tool you use to install from it is not. pip, uv, and pdm can all install from pyproject.toml. This is progress - at least the file format is agreed upon.
The Problems That Remain Structural
Conflicting dependencies with no resolution. When package A requires requests>=2.28 and package B requires requests==2.27, there is no solution. In npm, packages can have different versions of the same dependency. Python packages share a single version per environment. This means some combinations of packages are genuinely impossible to install together.
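A toy illustration of why no resolver can fix this (the version list is invented; real resolvers work over the full PyPI index, but the arithmetic is the same: the intersection of the two constraints is empty):

```python
# Hypothetical available versions of a shared dependency.
candidates = ["2.26", "2.27", "2.28", "2.29"]

def parse(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def satisfies(version: str, constraint: tuple[str, str]) -> bool:
    op, target = constraint
    if op == ">=":
        return parse(version) >= parse(target)
    if op == "==":
        return parse(version) == parse(target)
    raise ValueError(f"unsupported operator: {op}")

a_needs = (">=", "2.28")  # package A's requirement
b_needs = ("==", "2.27")  # package B's requirement

# Only one version can be installed per environment, so a valid
# candidate must satisfy both constraints simultaneously.
compatible = [v for v in candidates
              if satisfies(v, a_needs) and satisfies(v, b_needs)]
print(compatible)  # [] -- no version exists that satisfies both
```

npm sidesteps this by nesting different versions in the dependency tree; Python's flat, import-by-name model rules that out.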
Binary packaging is still fragile. Python packages with compiled extensions (numpy, scipy, pydantic, cryptography) distribute as wheels for specific Python versions and platforms. When a wheel does not exist for your platform and Python version, pip falls back to building from source, which requires compiler tooling and can fail opaquely.
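Wheel filenames encode exactly this compatibility information as PEP 425 tags (interpreter, ABI, platform). A rough stdlib-only sketch of the tags pip would look for on the current machine (simplified; pip's real matching logic handles manylinux levels, ABI variants, and more):

```python
import sys
import sysconfig

# A wheel like numpy-2.1.0-cp312-cp312-manylinux_2_17_x86_64.whl only
# installs where its tags match the running interpreter and platform.
interpreter_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
platform_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")

print(interpreter_tag)  # e.g. cp312
print(platform_tag)     # e.g. linux_x86_64 or macosx_11_0_arm64
```

When no published wheel matches both tags, pip downloads the sdist and attempts a local build, which is where the opaque compiler failures come from.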
Reproducible environments are not trivial. A requirements.txt without pinned versions is not reproducible; one with everything pinned is reproducible but not portable across platforms and Python versions. Lock files solve this, but only if everyone uses the same tool: pip cannot consume a Poetry lock file.
The Practical Recommendation
For new projects in 2026:
Use uv if you want speed and are developing an application (not a library). It is fast, handles Python version management, and has a clean workflow:
```shell
uv init my-project
uv add httpx pydantic
uv add --dev pytest ruff
uv run pytest
```
Use Poetry if you are building a library you intend to publish and need its mature publishing workflow.
Use plain pip + pyproject.toml if you are in a team that values minimizing tooling overhead and everyone is comfortable with the standard tools.
Do not use conda unless you have specific scientific computing dependencies (NumPy, CUDA bindings) that are better managed through conda channels.
What Would Actually Fix It
The Python community needs what every other modern language has: one tool, with one workflow, blessed by the language stewards. Not optional compatibility between five tools. One tool.
PEP 723 (inline script metadata) and ongoing PEP processes are moving toward this. The PSF and Python Steering Council are more engaged on packaging than they were five years ago. But the path from “multiple partially compatible tools” to “one authoritative tool” in an ecosystem this large and conservative is measured in years.
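PEP 723 is worth a concrete look, since it shows the direction of travel: a single-file script declares its own dependencies in a structured comment block, and a runner such as uv run creates a matching environment on the fly. A minimal sketch:

```python
# /// script
# requires-python = ">=3.11"
# dependencies = ["httpx>=0.27"]
# ///

# With the PEP 723 block above, `uv run script.py` resolves and installs
# httpx into an ephemeral environment before executing the script. Plain
# `python script.py` ignores the block entirely (it is just comments).
MESSAGE = "self-contained script, no project scaffolding required"
print(MESSAGE)
```

The metadata travels with the file, which removes the "first, set up a project" step for one-off scripts.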
Bottom Line
Python packaging is better than it was in 2020 but still worse than it needs to be. uv is the most promising recent development: it solves the speed problem and the Python version management problem in a single fast tool. pyproject.toml standardization means the project configuration format is settled even if the tooling is not. New projects should default to uv for application development. The structural problem remains: one canonical tool still does not exist, and that will continue to be a friction point for developers coming from other ecosystems.