pip-tools (2024)


A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)


Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

```console
$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools
```

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either pyproject.toml, setup.cfg, setup.py, or requirements.in.

Run it with pip-compile or python -m piptools compile (or pipx run --spec pip-tools pip-compile if pipx was installed with the appropriate Python version). If you use multiple Python versions, you can also run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available. To compile from scratch, first delete the existing requirements.txt file, or see Updating requirements for alternative approaches.

Requirements from pyproject.toml

The pyproject.toml file is the latest standard for configuring packages and applications, and is recommended for new projects. pip-compile supports installing both your project.dependencies and your project.optional-dependencies. Thanks to the fact that this is an official standard, you can use pip-compile to pin the dependencies in projects that use modern standards-adhering packaging tools like Setuptools, Hatch or flit.

Suppose you have a 'foobar' Python application that is packaged using Setuptools, and you want to pin it for production. You can declare the project metadata as:

```toml
[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }
```

Now suppose you have a Django application that is packaged using Hatch, and you want to pin it for production. You also want to pin your development tools in a separate pin file. You declare django as a dependency and create an optional dependency dev that includes pytest:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]
```

You can produce your pin files as easily as:

```console
$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django

$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
attrs==22.2.0
    # via pytest
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
    # via pytest
iniconfig==2.0.0
    # via pytest
packaging==23.0
    # via pytest
pluggy==1.0.0
    # via pytest
pytest==7.2.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django
tomli==2.0.1
    # via pytest
```

This works well both for pinning your applications and for keeping the CI of your open-source Python package stable.

Requirements from setup.py and setup.cfg

pip-compile also has full support for setup.py- and setup.cfg-based projects that use setuptools.

Just define your dependencies and extras as usual and run pip-compile as above.
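As an illustrative sketch (the project name and dependencies below are made up, not from the pip-tools docs), a setup.cfg declaring a dependency and a test extra might look like:

```ini
[metadata]
name = foobar

[options]
install_requires =
    django

[options.extras_require]
test =
    pytest
```

Running pip-compile against such a project pins django and its transitive dependencies; adding --extra test additionally pins pytest.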

Requirements from requirements.in

You can also use plain text files for your requirements (e.g. if you don't want your application to be a package). To use a requirements.in file to declare the Django dependency:

```
# requirements.in
django
```

Now, run pip-compile requirements.in:

```console
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile requirements.in
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django
```

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

(updating-requirements)=

Updating requirements

pip-compile generates a requirements.txt file using the latest versions that fulfil the dependencies you specify in the supported files.

If pip-compile finds an existing requirements.txt file that fulfils thedependencies then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existingrequirements.txt, run pip-compile --upgrade.

To update a specific package to the latest or a specific version use the--upgrade-package or -P flag:

```console
# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0
```

You can combine --upgrade and --upgrade-package in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining requests to the latest version less than 3.0:

```console
$ pip-compile --upgrade --upgrade-package 'requests<3.0'
```

Using hashes

If you would like to use the Hash-Checking Mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

```console
$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
    --hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
    --hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
    # via django
django==4.1.7 \
    --hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
    --hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
    # via -r requirements.in
sqlparse==0.4.3 \
    --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
    --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
    # via django
```

Output File

To output the pinned requirements in a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django to test a library with both versions using tox:

```console
$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt
```

Or to output to standard output, use --output-file=-:

```console
$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt
```

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

```console
$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"
```

Configuration

You can define project-level defaults for pip-compile and pip-sync by writing them to a configuration file in the same directory as your requirements input files (or the current working directory if piping input from stdin). By default, both pip-compile and pip-sync will look first for a .pip-tools.toml file and then in your pyproject.toml. You can also specify an alternate TOML configuration file with the --config option.

Configuration values can be specified both globally and per command. For example, to generate pip hashes by default in the resulting requirements file output, you can specify in a configuration file:

```toml
[tool.pip-tools]
generate-hashes = true
```

Options to pip-compile and pip-sync that may be used more than oncemust be defined as lists in a configuration file, even if they only have onevalue.
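For example, assuming the configuration key mirrors the repeatable --extra flag as described above, a single extra would still be written as a list:

```toml
[tool.pip-tools]
# --extra may be passed more than once on the command line,
# so its configuration value must be a list, even for one value
extra = ["dev"]
```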

pip-tools supports default values for all valid command-line flags of its subcommands. Configuration keys may contain underscores instead of dashes, so the above could also be specified in this format:

```toml
[tool.pip-tools]
generate_hashes = true
```

Configuration defaults specific to pip-compile and pip-sync can be put beneath separate sections. For example, to perform a dry-run by default with pip-compile:

```toml
[tool.pip-tools.compile]  # "sync" for pip-sync
dry-run = true
```

This does not affect the pip-sync command, which also has a --dry-run option. Note that local settings take precedence over global ones of the same name whenever both are declared; thus the following would also make pip-compile generate hashes, but discard the global dry-run setting:

```toml
[tool.pip-tools]
generate-hashes = true
dry-run = true

[tool.pip-tools.compile]
dry-run = false
```

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

```console
$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    ./pipcompilewrapper
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django
```

Workflow for layered requirements

If you have different environments that you need to install different but compatible packages for, then you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production and when developing you want to use the Django debug toolbar, then you can create two *.in files, one for each layer:

```
# requirements.in
django<2.2
```

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

```
# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2
```

First, compile requirements.txt as usual:

```console
$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2023.3
    # via django
```

Now compile the dev requirements and the requirements.txt file is used as a constraint:

```console
$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile dev-requirements.in
#
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
django-debug-toolbar==2.1
    # via -r dev-requirements.in
pytz==2023.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.4.3
    # via django-debug-toolbar
```

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install the requirements in production, use:

```console
$ pip-sync
```

To install the requirements in development, use:

```console
$ pip-sync requirements.txt dev-requirements.txt
```

Version control integration

You can use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

```yaml
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
```

You might want to customize pip-compile args by configuring args and/or files, for example:

```yaml
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]
```

If you have multiple requirement files, make sure you create a hook for each file.

```yaml
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$
```

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

```console
$ pip-sync
Uninstalling flake8-2.4.1:
  Successfully uninstalled flake8-2.4.1
Collecting click==4.1
  Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
  Found existing installation: click 4.0
  Uninstalling click-4.0:
    Successfully uninstalled click-4.0
Successfully installed click-4.1
```

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

```console
$ pip-sync dev-requirements.txt requirements.txt
```

If no files are passed in, pip-sync defaults to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

```console
$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"
```

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, then yes, you should commit both requirements.in and requirements.txt to source control.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).
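One way to produce such filenames is to derive an environment tag from the running interpreter. This is only a sketch; the exact tag format below is an assumption for illustration, not a pip-tools convention:

```shell
# Build an environment tag like "linux-py3.11" from the current interpreter
ENV_TAG="$(python3 -c 'import sys; print(f"{sys.platform}-py{sys.version_info[0]}.{sys.version_info[1]}")')"

# A per-environment lock file could then be compiled with (command shown, not run here):
#   pip-compile --output-file="${ENV_TAG}-requirements.txt" requirements.in
echo "${ENV_TAG}-requirements.txt"
```

Running this on each target environment yields a distinct, committable output file per environment.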

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ for each environment, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for each said environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across Python environments safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still always execute pip-compile on each targeted Python environment to avoid issues.

Maximizing reproducibility

pip-tools is a great tool to improve the reproducibility of builds.But there are a few things to keep in mind.

  • pip-compile will produce different results in different environments as described in the previous section.
  • pip must be used with the PIP_CONSTRAINT environment variable to lock dependencies in build environments as documented in #8439.
  • Dependencies come from many sources.
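For the PIP_CONSTRAINT point above, a minimal sketch (assuming a constraints.txt has already been compiled into the current directory) is to export the variable before installing, so pip also applies the constraints inside the isolated build environments it creates:

```shell
# Point pip at the compiled lock file; pip reads PIP_CONSTRAINT even inside
# the isolated environments it creates for build dependencies.
export PIP_CONSTRAINT="$PWD/constraints.txt"

# With the variable set, a regular install resolves build requirements
# against the lock file too (command shown, not run here):
#   python -m pip install .
echo "$PIP_CONSTRAINT"
```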

Continuing the pyproject.toml example from earlier, creating a single lock file could be done like this:

```console
$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
#    pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
backports-zoneinfo==0.2.1
    # via django
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest
```

Some build backends may also request build dependencies dynamically using the get_requires_for_build_ hooks described in PEP 517 and PEP 660. This will be indicated in the output with one of the following suffixes:

  • (pyproject.toml::build-system.backend::editable)
  • (pyproject.toml::build-system.backend::sdist)
  • (pyproject.toml::build-system.backend::wheel)

Other useful tools

Deprecations

This section lists pip-tools features that are currently deprecated.

  • In the next major release, the --allow-unsafe behavior will be enabled by default (https://github.com/jazzband/pip-tools/issues/989). Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.
  • The legacy resolver is deprecated and will be removed in future versions. The new default is --resolver=backtracking.
  • In the next major release, the --strip-extras behavior will be enabled by default (https://github.com/jazzband/pip-tools/issues/1613). Use --no-strip-extras to keep the old behavior.

A Note on Resolvers

You can choose either the default backtracking resolver or the deprecated legacy resolver.

The legacy resolver will occasionally fail to resolve dependencies. Thebacktracking resolver is more robust, but can take longer to run in general.

You can continue using the legacy resolver with --resolver=legacy, although note that it is deprecated and will be removed in a future release.
