
I have a lot of packages to install in my pip requirements file and I'd like to process them in parallel.

I know that, for example, if I want n parallel jobs from make I have to write make -j n; is there an equivalent option for pip requirements?

Thanks!


2 Answers


Sometimes pip uses make to build dependencies. If you set MAKEFLAGS before it starts, like this:

export MAKEFLAGS="-j$(nproc)"
pip install -r requirements.txt

This may help speed up building native dependencies.

Note: nproc resolves to the number of CPUs on your system.
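For example, you can check what nproc reports, or pin an explicit job count instead of auto-detecting (as one commenter below did with -j8). A minimal sketch of that variant:

# see how many CPUs nproc reports
nproc

# or hard-code the job count used for make-based builds
export MAKEFLAGS="-j8"
pip install -r requirements.txt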


2 Comments

This works. My wxPython build is now happening on 8 processes in parallel. Thanks. I used -j8 just for good measure.
Hello, is there a similar alternative to this answer that works on Windows?

I think the best approach to getting better speed is to find out where the bottleneck is. Try to analyse which processes take place when you run the pip command (see the sketch below).
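For instance, a minimal sketch for spotting the bottleneck: time the whole run and enable pip's verbose output to see whether downloading or compiling dominates.

# time the overall install; -v shows what pip is doing at each step
time pip install -r requirements.txt -v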

Most of the time is probably spent downloading from PyPI and compiling libraries with native extensions (such as PIL). You could try running your own PyPI repository and pre-compiling the sources that need to be built; a sketch of the pre-building step follows. There has been a lot of talk about this in the past, but launching pip in parallel does not really give a speedup.
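As a sketch of the pre-compiling idea, you can build wheels once into a local directory and then install from there, so nothing has to be rebuilt on later runs (the directory name wheelhouse is just an example):

# build (and compile, where needed) wheels for everything in requirements.txt
pip wheel -r requirements.txt -w ./wheelhouse

# later installs resolve only against the local wheel directory
pip install --no-index --find-links=./wheelhouse -r requirements.txt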

1 Comment

OK, then just install that package once, go to site-packages, and copy it over when you need it. The OS and architecture must match for this to work; then invoke pip install so it sees that the package is already installed.
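If you want to try that, a minimal sketch for locating the site-packages directory of the current interpreter and copying a package into it (mypkg and the source path are placeholders):

# print the site-packages path for the active Python
python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"

# copy a previously installed package into it (mypkg is hypothetical)
cp -r /path/to/other/env/site-packages/mypkg* "$(python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])")"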
