
Packaged script execute permissions lost from v46.1.0 onwards #2041

@dhallam

Issue

We use pipenv, which internally pins setuptools to its latest release, so we are effectively forced to use the latest version of setuptools.

We observed that executable scripts shipped inside Python packages lose their execute flag from setuptools v46.1.0 onwards. The example below demonstrates the bug with pyspark, which includes a number of executable scripts in its package.

The issue was introduced by commit 7843688, where copy_file() is now called with preserve_mode=False. The changelog gives the reason for the change as:

Prevent keeping files mode for package_data build. It may break a build if user's package data has read only flag.
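For reference, here is a minimal sketch (not setuptools' actual build code) of what the preserve_mode argument to distutils' copy_file() does; the source path and the build_demo directory are made up for illustration:

```python
# Minimal sketch (not setuptools' build_py code) of the effect of preserve_mode.
# "pyspark/bin/spark-submit" is just an example of a file with mode 0755.
import os
import shutil
import stat
from distutils.file_util import copy_file

src = "pyspark/bin/spark-submit"  # assumed to exist with execute bits set

shutil.rmtree("build_demo", ignore_errors=True)
os.makedirs("build_demo")

# Pre-46.1.0 behaviour: the source file's mode (including execute bits) is kept.
copy_file(src, "build_demo/kept", preserve_mode=1)

# Behaviour since commit 7843688: the copy gets the default mode from
# open()/umask (typically 0644), so the execute bits are lost.
copy_file(src, "build_demo/stripped", preserve_mode=0)

for name in ("kept", "stripped"):
    mode = stat.S_IMODE(os.stat(os.path.join("build_demo", name)).st_mode)
    print(f"{name}: {oct(mode)}")
```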

Unfortunately, the change has the side effect of stripping all execute permissions from packaged files, so users cannot run the scripts "out of the box"; they have to restore the execute permissions manually.
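Until this is fixed, one possible manual workaround (my own sketch, not something provided by setuptools; the site-packages path is an assumption based on the pyspark example below) is to re-add the execute bits after installation:

```python
# Hypothetical workaround: restore the execute bits the build stripped.
# scripts_dir is assumed; adjust it to your environment and package.
import os
import stat

scripts_dir = ".venv/lib/python3.7/site-packages/pyspark/bin"

for name in os.listdir(scripts_dir):
    path = os.path.join(scripts_dir, name)
    if os.path.isfile(path):
        mode = os.stat(path).st_mode
        # Mirror "chmod +x": add execute for user, group and other.
        os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
```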

Demonstration Script

```bash
#!/bin/bash
set -eu

wget -nc https://files.pythonhosted.org/packages/9a/5a/271c416c1c2185b6cb0151b29a91fff6fcaed80173c8584ff6d20e46b465/pyspark-2.4.5.tar.gz

for version in "46.0.0" "46.1.0"; do
    rm -rf .venv pyspark-2.4.5
    tar xzf pyspark-2.4.5.tar.gz
    virtualenv -q -p /usr/bin/python3.7 .venv
    . .venv/bin/activate
    python3 -m pip install --upgrade setuptools="==${version}" wheel
    pushd pyspark-2.4.5
    python3 setup.py -q bdist_wheel
    pushd dist
    unzip -q pyspark-2.4.5-py2.py3-none-any.whl
    echo -e "\n\n${version}: Here are the permissions for spark-submit:\n"
    ls -l ./pyspark/bin/spark-submit
    echo -e "\n\n"
    popd
    popd
done
```

Expected Result

-rwxr-xr-x 1 dave dave 1040 Feb 2 19:35 ./pyspark/bin/spark-submit 

Actual Result

-rw-rw-r-- 1 dave dave 1040 Feb 2 19:35 ./pyspark/bin/spark-submit 
