140

I have a Dockerfile where I tried to activate a Python virtualenv so that it would install all dependencies within this env. However, everything still gets installed globally. I used several different approaches and none of them worked; I am also not getting any errors. Where is the problem?

1. ENV PATH $PATH:env/bin

2. ENV PATH $PATH:env/bin/activate

3. RUN . env/bin/activate

I also followed an example of a Dockerfile config for the python-runtime image on Google Cloud, which is basically the same as the approaches above.

Setting these environment variables is the same as running source /env/bin/activate:

ENV VIRTUAL_ENV /env

ENV PATH /env/bin:$PATH

Additionally, what does ENV VIRTUAL_ENV /env mean and how is it used?

4 Comments
  • Have you tried source ../bin/activate? Commented Feb 1, 2018 at 11:52
  • Are you running multiple python apps in the same Docker container? Commented Feb 1, 2018 at 12:24
  • It's likely not best practice to use virtualenv in a Dockerfile, since you'd ideally just install globally using the one-app-per-container practice. However, I'm glad I happened upon this, because I have a unit-testing use case that requires virtualenv in a Dockerfile. It might seem odd, but part of the test is for virtualenv integration. Thank you for asking this question. Commented Mar 26, 2018 at 22:32
  • re: "everything still gets installed globally". Most of the time when I see that happen, it's because someone is using the global pip. Build a venv in your Docker image, and then use the pip corresponding to the target virtualenv for installing packages into that virtualenv. If you call /path/to/venv/bin/pip (note the full venv path) you'll likely find success. Commented Jun 29, 2022 at 10:59

10 Answers

155

You don't need to use virtualenv inside a Docker Container.

virtualenv is used for dependency isolation. You want to prevent dependencies or packages installed in one application from leaking into another. Docker achieves the same thing: it isolates your dependencies within your container and prevents leaks between containers and between applications.

Therefore, there is no point in using virtualenv inside a Docker container unless you are running multiple apps in the same container. And if that's the case, I'd say you're doing something wrong: the solution would be to architect your app in a better way and split it up into multiple containers.


EDIT 2022: Given that this answer gets a lot of views, I thought it might make sense to add that, four years later, I realized there actually are valid usages of virtual environments in Docker images, especially when doing multi-stage builds:

FROM python:3.9-slim as compiler
ENV PYTHONUNBUFFERED 1

WORKDIR /app/

RUN python -m venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"

COPY ./requirements.txt /app/requirements.txt
RUN pip install -Ur requirements.txt

FROM python:3.9-slim as runner
WORKDIR /app/
COPY --from=compiler /opt/venv /opt/venv

# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY . /app/
CMD ["python", "app.py"]

In the Dockerfile example above, we create a virtualenv at /opt/venv and activate it using an ENV statement; we then install all dependencies into /opt/venv and can simply copy this folder into the runner stage of the build. This can help minimize the Docker image size.
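As a quick sanity check, one could append a hypothetical debug stage to the Dockerfile above (a minimal sketch of mine, assuming the stage names from the example) to confirm that the runner stage resolves python from the copied venv:

# Hypothetical debug stage, reusing the "runner" stage from the example.
FROM runner as debug
# Both commands should report paths under /opt/venv if the ENV-based
# "activation" carried over into this stage.
RUN which python
RUN python -c "import sys; print(sys.prefix)"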


18 Comments

The point is to save space. You can copy the virtualenv directory as is without the need of python3-virtualenv in the target image. That saves you the whole toolchain (gcc and friends) and thus a few hundred megabytes.
Many Python packages only support installation in a virtual environment, in which case it's useful to be able to activate the venv inside a docker container.
Downvoting for being off-topic. If the author is asking about a specific problem with using virtualenv together with Docker, it means that he actually needs to use virtualenv with Docker.
@GillBates that's an assumption (and perpetual debate on SO). Clearly if someone was asking how to put sugar in their gas tank we wouldn't all say, look they REALLY want to know how to get sugar in their gas tank. The author's ignorance regarding docker/venv is unknown, so it's hard to tell what they REALLY want. That said, don't agree venv in containers never make sense. My case: the base image packages conflict with python install
So, today, in 2023 when PEP 668 is out and basically blocking pip install outside of venv by default, what is the recommendation for Docker images? I tend to agree with this answer, as I'm already in Docker, I don't really care about venv. However now it seems I'm somehow forced to... Would be great to see one more update on the answer for today's situation...
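On the PEP 668 point raised in the comment above, a hedged sketch (my addition, not from the answer): the EXTERNALLY-MANAGED marker that blocks bare pip installs comes from distro-packaged Pythons, and pip offers an explicit opt-out, though a venv remains the cleaner route:

FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y python3 python3-pip
# Debian's python3 ships an EXTERNALLY-MANAGED marker (PEP 668), so a bare
# "pip install" is refused; this flag opts out of that protection.
RUN pip install --break-system-packages requests
# Note: the official python:3.x images do not set the marker, so a plain
# "pip install" still works there without a venv.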
102

There are perfectly valid reasons for using a virtualenv within a container.

You don't necessarily need to activate the virtualenv to install software or use it. Try invoking the executables directly from the virtualenv's bin directory instead:

FROM python:2.7
RUN virtualenv /ve
RUN /ve/bin/pip install somepackage
CMD ["/ve/bin/python", "yourcode.py"]

You may also just set the PATH environment variable so that all further Python commands use the binaries within the virtualenv, as described in https://pythonspeed.com/articles/activate-virtualenv-dockerfile/

FROM python:2.7
RUN virtualenv /ve
ENV PATH="/ve/bin:$PATH"
RUN pip install somepackage
CMD ["python", "yourcode.py"]

2 Comments

this will not work if yourcode.py creates a subprocess, I think. You also need to fiddle with $PATH, as explained in monitorius' answer.
If your code creates a subprocess that invokes python, use sys.executable to get the path of the virtualenv interpreter, e.g. subprocess.run([sys.executable, '-m', 'foo']), which is generally a good idea in a lot of other scenarios anyhow.
58

Setting these variables

ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH

is not exactly the same as just running

RUN . env/bin/activate 

because an activation inside a single RUN will not affect any lines below that RUN in the Dockerfile. But setting environment variables through ENV will activate your virtual environment for all subsequent RUN commands.

Look at this example:

RUN virtualenv env                        # setup env
RUN which python                          # -> /usr/bin/python
RUN . /env/bin/activate && which python   # -> /env/bin/python
RUN which python                          # -> /usr/bin/python

So if you really need to activate virtualenv for the whole Dockerfile you need to do something like this:

RUN virtualenv env
# activate the environment (comments go on their own line here: in an ENV
# instruction a trailing "#" would become part of the value)
ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
RUN which python   # -> /env/bin/python
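Putting that together as one complete file (a minimal sketch; the python:3.9-slim base, requirements.txt, and app.py names are my assumptions, not from the answer):

FROM python:3.9-slim
RUN python -m venv /env
# "Activate" the venv for every subsequent RUN, the CMD, and any subprocess
# that inherits the environment.
ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
COPY requirements.txt .
RUN pip install -r requirements.txt   # installs into /env
COPY . .
CMD ["python", "app.py"]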

4 Comments

Another pretty popular option is to run a bash script as an entry point and let it do the rest of the heavy lifting.
An entry point executes at runtime, when the image is already built and deployed. It would have to be a really special case for you to want to install your packages into a virtualenv at runtime instead of at image build time.
Thank you! This is the best answer from my point of view, because it is the only one that provided a solution to run pip install without any error related to the venv not being activated.
This should be the accepted answer. The only one that actually worked for me. Thanks!
18

Although I agree with Marcus that this is not the usual way of doing things with Docker, you can do what you want.

Using the RUN command of Docker directly will not give you the result you want, as it will not execute your instructions from within the virtual environment. Instead, squeeze the instructions into a single RUN line executed with /bin/bash. The following Dockerfile worked for me:

FROM python:2.7
RUN virtualenv virtual
RUN /bin/bash -c "source /virtual/bin/activate && pip install pyserial && deactivate"
...

This should install the pyserial module only in the virtual environment.

5 Comments

Thanks for the provided solution, although it did not work for me. Now the dependency (django) is installed, but I cannot find where, as python 2/3 cannot import it either outside or inside the virtualenv. I do not have a complex app, so I'll stick to the main purpose of Docker for now, although there are still threads explaining why creating a venv inside a Docker container is a fine practice. Example
Hope you solved the problem anyway. However that's odd, how do you check where the installation is done?
Is the "&& deactivate" at the end really needed? docker is starting subsequent RUNs in new shells anyway, right?
Right, I just added it to be clean in case the activation had any impact on the filesystem, which would remain in the resulting Docker image. It is most likely dispensable.
@pinty Would you maybe have an update to your answer, given that PEP 668 is out in 2023 and the system now basically blocks non-venv pip installs?
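On the "how do you check where the installation is done" question above, a hedged option (extra build-time RUN lines of mine, not from the original answer) is to ask each interpreter directly:

# Should print a path under /virtual/lib/... if pyserial landed in the venv.
RUN /virtual/bin/python -c "import serial; print(serial.__file__)"
# Should print "global site-packages is clean" if nothing leaked globally.
RUN python -c "import serial" 2>/dev/null && echo "LEAKED globally" || echo "global site-packages is clean"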
7

Sometimes you have to use venv within a docker container.

Some docker image authors build their containers in such a way that they will not allow you to pip install without creating a venv first.

(There may be ways around this, but why fight against the system?)

One way to make it work is to do the following:

RUN python3 -m venv venv
RUN ./venv/bin/pip install <list of packages to install>
ENTRYPOINT ["./venv/bin/python3", "main.py"]

In other words, call python3 and pip from within the venv directly.

If you have a requirements.txt:

COPY ./requirements.txt .
RUN python3 -m venv venv
RUN ./venv/bin/pip3 install --no-cache-dir -r requirements.txt
ENTRYPOINT ["./venv/bin/python3", "main.py"]

Also note: You can use this same method to maintain a global virtual environment on a regular system. (Nothing to do with Docker.)

e.g. create one with python3 -m venv /home/user/venv, which you can then activate with

source /home/user/venv/bin/activate 

or call python3 and pip3 directly with

/home/user/venv/bin/python3
/home/user/venv/bin/pip3

Create an alias to these paths for convenience in your .bash_aliases or .bashrc.

2 Comments

This is what worked for me; I built a django app from scratch using alpine. Attempting to install packages with pip install --no-cache-dir -r requirements.txt needed a venv to work, followed by CMD ["./env/bin/python", "manage.py", "runserver", "0.0.0.0:8082"]
@Katana24 The problem with your suggestion is which python are you running? It might be the "system" python, which could be /usr/bin/python2. It's hard to tell what that does. Better to explicitly call the python3 interpreter that you want to use.
0

The only solution that worked for me is this:

CMD ["/bin/bash", "-c", "source <your-env>/bin/activate && cd src && python main.py"] 

Comments

0

All Python programs executing within a virtual env must have that env activated first. Activation must be done by a parent process before running the child Python process, or very early in the child Python process itself. The parent is often bash, but in a Dockerfile the parent could be your ENTRYPOINT program. To activate, you must:

  1. Unset PYTHONHOME
  2. Prepend the virtual env's bin directory to PATH
  3. Pass at least these environment variables to the child python process when calling exec

For example, if your parent process or ENTRYPOINT were a golang process you might do something like this before executing the python sub-process:

// Our python program uses virtual environments, so activate the virtual
// environment for python sub-processes before running them, so the
// env vars can be inherited when they are executed.
execpath := os.Getenv("PATH")
os.Setenv("PATH", "/venv/bin:"+execpath)
os.Unsetenv("PYTHONHOME")

...if the virtual env were at /venv for example.
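For comparison, a Dockerfile-only equivalent of the same three steps (my sketch, assuming the venv lives at /venv; note that ENV cannot truly unset a variable, only blank it, which CPython treats as unset for PYTHONHOME):

FROM python:3.9-slim
RUN python -m venv /venv
# Step 1: blank PYTHONHOME (an empty value is ignored by CPython).
ENV PYTHONHOME=
# Step 2: prepend the venv's bin directory to PATH.
ENV PATH="/venv/bin:$PATH"
# Step 3: the exec-form CMD child process inherits both variables.
CMD ["python", "main.py"]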

Comments

0

Another (possibly cleaner) way to run commands inside a virtualenv is to use a heredoc (a BuildKit feature) to place multiple commands inside a single RUN call, for example:

# Create a Python virtual environment to hold installed packages.
RUN python -m venv ai

# Install packages inside the virtual environment.
RUN <<EOF
source ai/bin/activate
pip install --upgrade pip
pip install \
    onnx \
    onnxoptimizer \
    onnxruntime \
    onnxsim \
    ultralytics
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
EOF
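One caveat worth hedging here (my note, not from the answer): source is a bashism, and the default /bin/sh on Debian-based images may not support it. Switching the shell first, or using the POSIX dot command, avoids that:

# Make bash the shell so "source" works inside heredocs ...
SHELL ["/bin/bash", "-c"]
RUN <<EOF
source ai/bin/activate
pip install --upgrade pip
EOF
# ... or keep the default shell and write ". ai/bin/activate" instead.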

Comments

-3

If you are using Python 3.x:

RUN pip install virtualenv
RUN virtualenv -p python3.5 virtual
RUN /bin/bash -c "source /virtual/bin/activate"

If you are using Python 2.x:

RUN pip install virtualenv
RUN virtualenv virtual
RUN /bin/bash -c "source /virtual/bin/activate"

Comments

-6

Consider migrating to pipenv, a tool that automates virtualenv and pip interactions for you. It is recommended by the PyPA.

Reproducing an environment via pipenv in a Docker image is very simple:

FROM python:3.7
RUN pip install pipenv
COPY src/Pipfile* ./
RUN pipenv install --deploy
...

2 Comments

Sorry if this is a silly question but how can I use the dependencies that were installed by pipenv when using the actual image? My understanding is that pipenv installs to a virtualenv with a random name. So if I pull this image, clone my repo, and try to run pipenv run pytest then it doesn't have those installed requirements accessible from my folder. Thanks
@RayB That is a good question! I personally add the --system argument to the RUN from my answer; then you can just call pytest. But this has some caveats concerning the content of the system Python site-packages on a particular OS: the content can differ. So this way is not enterprise-ready, but it is usable for development. For an enterprise-grade solution you need to set or detect the virtualenv name, imho.
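A hedged sketch of the --system variant mentioned in that comment (the pytest CMD is my assumption and presumes pytest is listed in the Pipfile):

FROM python:3.7
RUN pip install pipenv
COPY src/Pipfile* ./
# --system installs into the image's global site-packages, so there is no
# randomly named virtualenv to discover at runtime.
RUN pipenv install --system --deploy
CMD ["pytest"]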
