
I am working on a project and I have cloned a repository from GitHub.
After the first build I realized that the project I cloned has some dependencies, which are listed in a requirements.txt file.

I know I have to install these packages, but I don't want to, because I am on a Windows development environment, and after finishing the project I am going to publish it to my Ubuntu production environment; I don't want the hassle of a double installation.

I have two options:

  1. Using a virtualenv and installing those packages inside it

  2. Downloading the packages and using them directly with import foldername

I want to avoid the first option because it gives me less control over my project, and the problem grows if, for example, I am inside another project's virtualenv and want to run my project's main.py from its own virtualenv, and so on. Also, moving the virtualenv setup from Windows (bat files) to Linux (bash/sh scripts) seems ugly to me and pushes me toward approaches I would rather avoid.
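For context on the first option, here is a minimal sketch of what the virtualenv route looks like (my own illustration, not from the repository; it assumes python3 is on PATH, and the Windows paths are noted in comments):

```shell
# Create an isolated environment for the project (same command on both OSes).
python3 -m venv .venv

# The environment ships its own pip; on Windows the path is .venv\Scripts\python.
.venv/bin/python -m pip --version

# On either machine, installing every dependency is then a single step:
# .venv/bin/python -m pip install -r requirements.txt
```

Because requirements.txt travels with the repository, the "double installation" is normally one `pip install -r requirements.txt` per machine rather than a manual process.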

The second option is my choice. For example, I need to use the future package. The scenario would be downloading the package with pip download future and, when done, extracting the tar.gz file. Inside the src folder I can see the future package folder, and I use it with import future_package.src.future without touching anything else.
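A common variant of this direct approach is to keep the extracted packages in one local directory and put that directory on sys.path, so the normal import names still work without running setup.py. A sketch (the "vendor" folder name and layout are my assumption, not something from the cloned project):

```python
import os
import sys

# Hypothetical layout (an assumption for illustration): the extracted
# packages live in a local "vendor" directory next to main.py,
# e.g. vendor/future/ and vendor/certifi/.
VENDOR_DIR = os.path.abspath("vendor")

# Put the vendor directory at the front of the import path so that
# `import future` resolves to vendor/future without any installation.
if VENDOR_DIR not in sys.path:
    sys.path.insert(0, VENDOR_DIR)
```

This keeps import statements identical on Windows and Ubuntu, since only the path manipulation is project-specific.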


Aside from os.path problems (which I assume I take care of):

Is this good practice?

I am not running setup.py, so nothing gets installed. Can that cause problems?

Is there a better approach that involves less work (like the second one), or is my first approach the better one after all?

UPDATE 1: I have extracted the future and certifi packages, which were part of my project's requirements, used them the direct way, and it is working in this particular case.

  • I am, quite honestly, still unclear on the original premise of why creating a virtualenv and letting pip install all dependencies is such a terrible move… Commented Jul 26, 2017 at 8:14
  • @deceze It is not a terrible move; I said it is an option. I know I will have to use virtualenvs some day, and they are a perfect choice for fast development, but I personally try to skip automated things like pip and want to have control over my stuff. If I can simply extract, move and call import, why should I bother using virtualenv for a simple project (and maybe it is doable on more complex ones)? Commented Jul 26, 2017 at 8:41
  • Because dependencies can get very complex and that's exactly the problem package managers like pip solve…!? If there's only two dependencies you can extract by hand and those happen to be self contained… great! Lucky you. But wait until you're trying to install a library that has dozens of dependencies which each also have dozens of dependencies. Good luck tracking all the right versions down by hand. Commented Jul 26, 2017 at 8:43
  • @deceze well I said simple projects, but you are right then :) Commented Jul 26, 2017 at 8:49
  • Modern software development which largely builds systems on systems on systems does have its own complexities, and tools like pip and virtualenv exist to make those complexities more manageable. If you want to be productive/build complex stuff, it's something you pretty much have to accept. Commented Jul 26, 2017 at 8:49
