Latest update (2022-02)
There seems to be a major update in pip that is just a few days old (version 22.0, release notes + relevant issue on GitHub).
I haven’t tested it in detail yet, but it really seems they optimized the installation order calculation in complex cases in a way that resolves many of the issues we all encountered earlier. I will need more time to verify this, though.
Anyway, the rest of this answer is still valid, and smart pinning of requirements suited to a particular project is a good practice imo.
Since I encountered a similar issue, I agree this is quite annoying. Backtracking might be a useful feature, but you don’t want to wait hours for it to complete with uncertain success.
I found several options that might help:
- Use the old resolver (`--use-deprecated=legacy-resolver`) proposed in the answer by @Daniel Davee, but this is more a temporary workaround than a proper solution.
- Skip resolving dependencies with the `--no-deps` option. I would not recommend this in general, but in some cases you can have a working set of package versions even though there are some conflicts.
- Reduce the number of versions pip will try to backtrack through by being more strict in your package dependencies. This means instead of putting e.g. `numpy` in my requirements.txt, I could try `numpy >= 1.18.0` or be even more strict with `numpy == 1.18.0`. The strictness might help a lot (see the sketch after this list).
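A sketch of how these options look in practice (the flags are real pip options; `numpy` is just an example package):

```
# Option 1: fall back to the old resolver (temporary workaround)
pip install -r requirements.txt --use-deprecated=legacy-resolver

# Option 2: install without resolving dependencies at all (use with care)
pip install -r requirements.txt --no-deps

# Option 3: tighten specifiers in requirements.txt so pip has fewer
# candidate versions to try, e.g. (one style per package):
#   numpy              <- any version: largest backtracking space
#   numpy >= 1.18.0    <- lower bound: smaller space
#   numpy == 1.18.0    <- exact pin: nothing to backtrack
```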
Check the following sources:
- Fixing conflicts
- Github pip discussion
- Reducing backtracking
I still do not have a definitive answer that always helps, but the best practice for requirements.txt seems to be to “pin” package versions. I found pip-tools, which can help you manage this, even together with a constraints.txt (but I am in an experimental phase, so I cannot tell you more).
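In case it is useful, this is roughly how a constraints file works (pip’s `-c` flag; the file contents here are made up): it limits the versions of packages if they get installed, without requesting any package itself:

```
# constraints.txt (hypothetical contents)
numpy==1.18.0

# install as usual, with the constraints applied on top
pip install -r requirements.txt -c constraints.txt
```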
Update (2021-04)
It seems the author of the question was able to fix the issue (something with a custom GitLab server), but I would like to extend this answer since it might be useful for others.
After reading and experimenting, I ended up pinning all my package versions to a specific one. This really should be the correct way. Although everything can still work without it, if you don’t pin your dependencies, your package manager may one day silently install a new version (when it’s released) with possible bugs or incompatibilities (this happened to me with dask earlier this year).
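For illustration, a fully pinned requirements.txt might look like this (package names and versions are only examples):

```
dask==2021.4.0
numpy==1.18.0
pandas==1.2.4
```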
There are several tools which might help you; I would recommend one of these approaches:
Easiest one with pipreqs
- pipreqs is a library which generates a pip requirements.txt file based on the imports of any project
- you can start by `pip install pipreqs` and running just `pipreqs` in your project root (or with the `--force` flag if your requirements.txt already exists)
- it will easily create a `requirements.txt` with pinned versions, based on the imports in your project and the versions taken from your environment
- then you can at any time create a new environment based on this `requirements.txt` (see the sketch after this list)
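A minimal sketch of that workflow (the project path is just a placeholder):

```
pip install pipreqs
pipreqs /path/to/project          # writes /path/to/project/requirements.txt
pipreqs /path/to/project --force  # overwrite an existing requirements.txt

# later, recreate the environment from the generated file
pip install -r requirements.txt
```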
This is a really simple tool (you do not even need to write your requirements.txt yourself). It does not allow you to create anything complex (so it might not be a good choice for bigger projects), and last week I found one strange behavior (see this), but generally I’m happy with this tool as it usually works perfectly.
Using pip-tools
There are several other commonly used tools, like pip-tools, Pipenv or Poetry. You can read more in Faster Docker builds with pipenv, poetry, or pip-tools or Python Application Dependency Management in 2018 (older, but it still seems valid to me). And it still seems to me that the best option (although it depends on your project/use case) is pip-tools.
You can (this is one option; see more in the docs):
- create a `requirements.in` (the same format as `requirements.txt`; it’s up to you whether you pin some package dependency or not)
- then you can use it by `pip install pip-tools` and running `pip-compile requirements.in`
- this will generate a new `requirements.txt` file where all versions are pinned and it is clear where each pin comes from
- (optionally) you can run it with the `--generate-hashes` option
- then you can (as with `pipreqs`) at any time create a new environment based on this `requirements.txt`
- `pip-tools` offers you the `--upgrade` option to upgrade the final reqs
- it supports layered requirements (e.g. having dev and prod versions)
- there is integration with pre-commit
- it offers the `pip-sync` tool to update your environment based on `requirements.txt` (see the command sketch after this list)
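A minimal sketch of the basic workflow described above (all commands are from pip-tools):

```
pip install pip-tools

pip-compile requirements.in                    # writes pinned requirements.txt
pip-compile --generate-hashes requirements.in  # additionally pin hashes
pip-compile --upgrade requirements.in          # bump pins to the latest allowed versions

pip-sync requirements.txt                      # make the environment match exactly
```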
There are a few more things you can do with it, and I really love the integration with pre-commit (see the sketch below). This allows you to use the same requirements as before (just with the .in suffix) and add a pre-commit hook that automatically updates requirements.txt (so you will never end up with a local environment that differs from the generated requirements.txt, which can easily happen when you run something manually).
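For reference, the hook configuration in .pre-commit-config.yaml looks roughly like this (the rev below is only an example; check the pip-tools docs for the current one):

```
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 6.5.0  # example only; use the latest release
    hooks:
      - id: pip-compile
```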