How do I read dependencies from requires.txt of a Python package

I need the dependencies because I want to add these to my RPM meta-data.

To build I use:

python setup.py bdist_rpm

When I build the package cryptography-2.2.2, it creates a file /src/cryptography.egg-info/requires.txt

It contains:


[:platform_python_implementation != 'PyPy']

[:python_version < '3']

How can I read all dependencies, evaluating the expression between []?

I'm using Python 2.7 (don't ask)

I need the following output:


I want to omit other sections like [doc], [test] etcetera.

The requires.txt is part of the dependency metadata, so you can use the same tools easy_install uses when installing the egg. Assuming the file requires.txt is in the current directory:

In [1]: import os, sys; from pkg_resources import Distribution, PathMetadata

In [2]: dist = Distribution(metadata=PathMetadata('.', '.'))

Now you can filter all dependencies for your current platform with Distribution.requires():

In [3]: sys.version
Out[3]: '3.6.4 (v3.6.4:d48ecebad5, Dec 18 2017, 21:07:28) \n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)]'

In [4]: dist.requires()

The list would be different if I used Python 2.7:

In [4]: sys.version
Out[4]: '2.7.10 (default, Oct  6 2017, 22:29:07) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.31)]'

In [5]: dist.requires()

or PyPy:

In [2]: sys.version
Out[2]: '3.5.3 (fdd60ed87e941677e8ea11acf9f1819466521bf2, Apr 26 2018, 01:25:35)\n[PyPy 6.0.0 with GCC 4.2.1 Compatible Apple LLVM 9.1.0 (clang-902.0.39.1)]'

In [3]: dist.requires()
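The sessions above can be reproduced anywhere as a standalone script; all that is needed is a directory containing a requires.txt. The file content below is a made-up stand-in for the real cryptography metadata:

```python
import os
import tempfile

from pkg_resources import Distribution, PathMetadata

# Write a toy requires.txt (hypothetical content standing in for the real file,
# which normally lives in <pkg>.egg-info/)
egg_info = tempfile.mkdtemp()
with open(os.path.join(egg_info, "requires.txt"), "w") as f:
    f.write("idna>=2.1\nsix>=1.4.1\n\n[:python_version < '3']\nenum34\n")

# PathMetadata(base_dir, egg_info_dir) points pkg_resources at the metadata
dist = Distribution(metadata=PathMetadata(egg_info, egg_info))

# requires() drops sections whose marker is false for this interpreter
print([str(r) for r in dist.requires()])  # on Python 3: ['idna>=2.1', 'six>=1.4.1']
```

On a Python 2 interpreter the same call would additionally return enum34, because the `python_version < '3'` marker then evaluates true.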

Now, if you want to generate a list of requirement strings (like when you want to generate a requirements file for pip), convert the requirements to strings:

In [8]: os.linesep.join(str(r) for r in dist.requires())
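For example, on two made-up requirement objects (any `Requirement` instances behave the same way):

```python
import os

from pkg_resources import Requirement

# Hypothetical requirements standing in for the result of dist.requires()
reqs = [Requirement.parse("idna>=2.1"), Requirement.parse("cffi!=1.11.3,>=1.7")]

# str() renders each Requirement back into a pip-compatible line
print(os.linesep.join(str(r) for r in reqs))
```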

PEP 508

If you also want to take PEP 508 environment markers into account independently of the current platform, things get a bit trickier, but it's still manageable. First, convert the requirements that have env markers:

In [22]: dep_map_pep508 = {k: v for k, v in dist._build_dep_map().items() if k and k.startswith(':')}

In [24]: reqs_pep508 = [str(r) + ';' + k.lstrip(':') for k, v in dep_map_pep508.items() for r in v]

In [25]: reqs_pep508
Out[25]:
["cffi>=1.7;platform_python_implementation != 'PyPy'",
 "enum34;python_version < '3'",
 "ipaddress;python_version < '3'"]

Now handle the platform-independent deps; these live under the None key in dist's dependency map:

In [26]: reqs_no_platform = [str(r) for r in dist._build_dep_map()[None]]

In [27]: reqs_no_platform
Out[27]: ['idna>=2.1', 'asn1crypto>=0.21.0', 'six>=1.4.1', 'cffi!=1.11.3,>=1.7']

Combine both lists into a string ready to be written to a requirements file:

In [28]: os.linesep.join(reqs_no_platform + reqs_pep508)
Out[28]: "idna>=2.1\nasn1crypto>=0.21.0\nsix>=1.4.1\ncffi!=1.11.3,>=1.7\ncffi>=1.7;platform_python_implementation != 'PyPy'\nenum34;python_version >= '3'\nipaddress;python_version >= '3'"
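That string can go straight into a file that pip understands; pip accepts marker-suffixed lines in requirements files (the file name here is arbitrary):

```python
import os

# A shortened stand-in for the combined list built above
reqs = ["idna>=2.1", "six>=1.4.1", "enum34;python_version < '3'"]

with open("requirements-runtime.txt", "w") as f:
    f.write(os.linesep.join(reqs) + os.linesep)

with open("requirements-runtime.txt") as f:
    print(f.read().splitlines())
```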

I was able to find one working solution; there might be other possibilities as well, but I think this should work on most versions.

import pkg_resources

lines = open("requirements.txt").readlines()

load_packages = True
for line in lines:
    line = line.strip()
    if line.startswith("#"):
        # comment line, skip it
        continue

    if line.startswith("[:"):
        # this is a marker, let's evaluate it
        load_packages = pkg_resources.evaluate_marker(line[2:-1])
    elif line.startswith("["):
        # this is a subsection like [doc] or [test], ignore it and its packages
        load_packages = False
    elif load_packages and line:
        print(line)

The output of the script is below:


If I change the requirements.txt as below, inverting the markers:


[:platform_python_implementation == 'PyPy']

[:python_version > '3']

The output changes to

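The flip is exactly what pkg_resources.evaluate_marker reports for each section header; a quick check (assuming a CPython 3 interpreter) makes it visible:

```python
from pkg_resources import evaluate_marker

# Original markers: on CPython 3 the first section is kept, the second dropped
print(evaluate_marker("platform_python_implementation != 'PyPy'"))  # True
print(evaluate_marker("python_version < '3'"))                      # False

# Inverted markers from the edited file: the opposite
print(evaluate_marker("platform_python_implementation == 'PyPy'"))  # False
print(evaluate_marker("python_version > '3'"))                      # True
```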


  • Have you tried ?
  • @ChristianSauer When I do a pip install it evaluates the expressions such as [:python_version < '3']. I'd like to access what has been used during the build of the package.
  • pip freeze after install gives the exact installed versions. Not per dependency, just the whole set. Otherwise look at pipenv.
  • @ThePjot Good idea, but it also lists build time dependencies which are not runtime dependencies. I don't want the tools required to build documentation on my target system, only the minimum it needs to run. I'll check out pipenv.
  • Is this file pip compatible? Because my pip will error out with this requirements file with error InvalidRequirement: Invalid requirement, parse error at "u'[:platfo'"
  • This is great. It allows me to pick up a full list after building a package. Thanks!
  • Nice. This works in most cases, but I want to omit other sections like [doc], [test] etcetera. The accepted answer covers that.
  • @EddyPronk My solution omits those sections but I agree, the accepted answer is what I would go with as well.