I am trying to build a pip package from source code in a Git repository that has multiple packages that share a common package. (I am not allowed to change the structure of this Git repository.)
The structure is:

```
├── common
│   ├── __init__.py
│   ├── run_helpers
│   │   ├── __init__.py
│   │   ├── aws.py
│   │   └── s3.py
└── components
    └── redshift_unload
        ├── redshift_unload
        │   ├── __init__.py
        │   └── run.py
        └── setup.py
```

My setup.py is as follows:
```python
from setuptools import setup, find_packages

setup(
    ...
    packages=find_packages(),
    package_dir={"": "."},
    entry_points={
        "console_scripts": ["redshift_unload=redshift_unload.run:main"]
    },
)
```

Looking at other answers here, things I have tried so far include:
- Specifying the actual package names in the `packages=` line instead of using `find_packages()`.
- Passing `where="../../"` to `find_packages()`.
- Using `find_packages() + find_packages(where="../../")` in the `packages=` line.
- Everything I can think of in the `package_dir` line.
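To make the `find_packages()` attempts above concrete, here is a small sketch that recreates the repository layout from the tree above in a temporary directory and shows what each variant actually discovers (the layout is copied from the question; nothing else is assumed):

```python
# Recreate the repo layout in a temp dir and inspect what each
# find_packages() variant from the list above actually discovers.
import os
import tempfile

from setuptools import find_packages

root = tempfile.mkdtemp()
for f in [
    "common/__init__.py",
    "common/run_helpers/__init__.py",
    "common/run_helpers/aws.py",
    "common/run_helpers/s3.py",
    "components/redshift_unload/redshift_unload/__init__.py",
    "components/redshift_unload/redshift_unload/run.py",
]:
    path = os.path.join(root, f)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    open(path, "w").close()

pkg_dir = os.path.join(root, "components", "redshift_unload")

# Plain find_packages() relative to setup.py only sees the local package:
print(find_packages(where=pkg_dir))       # ['redshift_unload']

# find_packages(where="../../") sees common but NOT the component,
# because components/ has no __init__.py and is not descended into:
print(sorted(find_packages(where=root)))  # ['common', 'common.run_helpers']
```

So neither call alone yields both `redshift_unload` and `common`, which is why the `find_packages() + find_packages(where="../../")` combination was tried.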
When I run `pip install .`, the package installs fine, but when I run the installed console script I get:

```
# redshift_unload
Traceback (most recent call last):
  File "/usr/local/bin/redshift_unload", line 5, in <module>
    from redshift_unload.run import main
  File "/usr/local/lib/python3.8/site-packages/redshift_unload/run.py", line 9, in <module>
    from common.run_helpers.aws import get_boto3_session
ModuleNotFoundError: No module named 'common'
```

What did work:
- If I move the `common` directory into `components/redshift_unload`, it works fine. But I can't do this. I also tried placing a symlink there in its place, but that doesn't seem to work either.
Is there a way to make this work?