
Best practice for Python is to use a venv to isolate a project to only the dependencies it really needs. I use python -m venv.

For development, it's very convenient to use IPython and notebooks for exploring code interactively. However, these need to be installed into the venv to be used. That defeats the purpose of the venv.

I can make two venvs, one for usage and one venv-ipy for interactive exploration, but that's hard to manage, and the two need to be kept synchronized.

Is there a solution or practice?

4 Comments
  • @Wayne uv may be much faster than pip, but how does it address the question of the OP? Can you post an example of using uv to do what's requested? Commented Mar 28 at 17:43
  • Can you explain why it's difficult to manage? What difficulties do you want to overcome? Commented Apr 2 at 12:51
  • Are you ok with having jupyter installed in your global python environment? If so you can use --break-system-packages flag to forcefully install jupyter. Commented Apr 6 at 10:29
  • You don't need iPython installed in the current virtualenv, only for the global Python that matches your virtualenv's Python version. See this question: stackoverflow.com/questions/20327621/… Commented Apr 7 at 23:31

5 Answers

1
+150

I work on multiple projects, most of which run in production, each with its own dependencies. The practice that works for me is to create separate requirements.txt files: one for the production environment, another containing the additions used only for development, and a small script that creates/updates both venvs in a single run: <ProjectName>_prod and <ProjectName>_dev.

So requirements_dev.txt contains only the extras like jupyter, matplotlib, and other packages used only in development, and there is no need to sync the files until you reach the production stage.

To install the dependencies in the dev environment, you mention both files:

pip install -r requirements.txt -r requirements_dev.txt 
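As an illustration of the split, the two files might look like this (the package names and pins here are hypothetical examples, not from the original answer):

```
# requirements.txt — production dependencies
Django>=5.1
requests>=2.31

# requirements_dev.txt — development-only additions
jupyter
matplotlib
```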

I also use conda to create/manage environments as I find it more convenient than venv.


1 Comment

Simple and effective!
1

I use Jupyter during development as an alternative to the REPL for quick hypothesis testing and some other purposes. I have solved the same problem with various methods over time; I won't describe all of them, I'll just share the two practices I settled on in the end.

  1. When working with requirements.txt, I started making entries manually instead of using pip freeze. This is a useful practice because it doesn't capture transitive dependencies and keeps the file clean. It also helps me keep sight of the project as a whole. You can then simply delete (or comment out) the ipykernel line at the production stage.

  2. Also, I only try new tools when existing ones are not enough. Recently, I realized it was time to move away from requirements.txt and use pyproject.toml. This tool had been lying around waiting for my attention and turned out to be the perfect solution, because pyproject.toml allows different dependency groups for different purposes.
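As a sketch of practice 1, such a hand-maintained requirements.txt might look like this (the package pins are hypothetical examples):

```
Django>=5.1
ipykernel>=6.29.5   # dev-only: delete or comment out at the production stage
```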

Here's an abstract example of how my pyproject.toml file looks:

[project]
dependencies = ["Django>=5.1"]

[dependency-groups]
dev = ["ipykernel >=6.29.5"]
test = ["pytest >=8.3.5"]

I appreciate that pyproject.toml is a native Python solution, just as requirements.txt always was. Nowadays I use uv to manage it, but that is optional, of course. So I think pyproject.toml is the best way to divide dependencies in a project.
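As a hedged sketch of how dependency groups are selected with uv (the flags below are from uv's CLI as I understand it; verify against your uv version):

```shell
uv sync               # installs [project] dependencies plus the default "dev" group
uv sync --group test  # additionally installs the "test" group
uv sync --no-dev      # production-style install without the dev group
```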

1 Comment

Thanks. This is a good answer and, if possible, I would accept it as well. It would be stronger if you showed how you switch between the dev and test environments.
0

I think you can manage this automatically, since the hassle is creating and switching between two venvs.

This is how I would go about automatic switching if I wanted to be hassle-free here. You will have to fill in some gaps, though, since you didn't mention which development environment you are using.

1. Install direnv

You can see details here based on your operating system https://direnv.net/docs/installation.html

2. Try to hook direnv into your shell

You can see details in this link how to hook it in your shell https://direnv.net/docs/hook.html

An example in my own case using Bash, I will run the commands below in my terminal:

echo 'eval "$(direnv hook bash)"' >> ~/.bashrc
source ~/.bashrc

3. Create a .envrc file in your project folder

Put the line use virtualenv venv inside the file; you can create it in one step with echo 'use virtualenv venv' > .envrc.

4. Once you are done, run direnv allow once to approve the .envrc file.
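If you create the environment with plain python -m venv rather than the virtualenv plugin, the .envrc can simply source the activate script directly — a minimal sketch, assuming the venv folder is named venv:

```shell
# .envrc — evaluated by direnv each time you cd into the project (sketch)
source venv/bin/activate
```

direnv then applies the activated environment's variables automatically whenever you enter the directory, and unloads them when you leave.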

This way, when you cd into your project's directory, direnv switches automatically and activates the correct environment without any hassle. This may or may not work for you, but I personally found direnv a blessing for cases like yours.

2 Comments

Thanks. I'm using python -m venv. Can your method work for that?
@SRobertJames, why don't you just try it, let us know if it solves your problem, and mark the suggestion as helpful? python -m venv typically creates a "venv" folder with the dependencies in your directory; use virtualenv venv, on the other hand, tells direnv to look for that created venv folder.
0

Jupyter already natively supports this via ipykernel and jupyter kernelspec (see https://ipython.readthedocs.io/en/latest/install/kernel_install.html). Similar to this post.

Assuming you have two environments base and dev (where dev does not have IPython installed and base does), you would have to install ipykernel in dev to have dev add itself as a kernel to base:

$ conda activate dev          # . ./dev/bin/activate in case of virtualenv
(dev)$ pip install ipykernel
(dev)$ ipython kernel install --prefix /path/to/env/base --name=dev

You can also use --user instead of --prefix /path/to/env/base to have the kernel installed in the user's Python.

Once you have the new kernel installed, start up your Jupyter server/notebook (I prefer jupyter lab) and, under "Kernel", select "Change kernel".

To see what kernels you have installed:

jupyter kernelspec list 

Comments

-1

The way I go about this is to install ipykernel in each virtual environment I need to use with Jupyter notebooks, so that I can switch to the appropriate environment when using the notebook.

Then all I have to do is switch between the registered kernels inside the notebook, so only the one environment used to launch Jupyter needs Jupyter itself installed.

P.S. a new utility is in town called Puppy. You might want to give this a read!

Comments

