Rebuild Python package for PyPI and republish

After incrementing the package version number in setup.py, run the following to remove old distribution archives, regenerate them, and then reupload them to PyPI:

# remove any stale egg-info metadata left over from a previous build
EGG_INFO=$(find ./ -iname "*.egg-info");
if [ -d "$EGG_INFO" ]; then
    rm -rf "$EGG_INFO";
fi;
# clear out old build artefacts, then rebuild the sdist/wheel and upload them
rm -rf build/ dist/;
python3 setup.py sdist bdist_wheel;
python3 -m twine upload dist/*

Source: https://packaging.python.org/tutorials/packaging-projects/

As a convenient one-liner, that first part is:

EGG_INFO=$(find ./ -iname "*.egg-info"); if [ -d "$EGG_INFO" ]; then rm -rf "$EGG_INFO"; fi; rm -rf build/ dist/;

Also note:

  • an API token may be generated (see here)
  • ...and that API token may be saved with keyring (another Python module), preventing you from having to re-enter your credentials each time you reupload

The twine docs explain how to do that here: just run

keyring set https://upload.pypi.org/legacy/ __token__

and then paste in the API token (to clarify: the username is __token__ and the password is the token itself), and it will be saved to your [Python] keyring. When you next run python3 -m twine upload dist/*, twine resolves to that repository URL; since that matches the keyring entry with username __token__, it won't ask for your credentials and will just upload immediately.
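If you want to double-check that the credential was stored, keyring can read it back with its get subcommand (bear in mind this prints the token in plain text to your terminal):

keyring get https://upload.pypi.org/legacy/ __token__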


I've written some convenience functions to manage version bumping, in a workflow that (following a commit/push) goes like so:

tag_new_release_micro "Increment micro version manually via tag"
pypi_republish
  • ...where the part in quotation marks is the tag annotation

There are 3 tag_new_release functions in my .bashrc; before doing anything else, each one makes sure I'm at the top level of the git repo:

function pretag_gcd {
        # cd to the top level of the git repo if not already there
        git_tld=$(git rev-parse --show-toplevel)
        if [ "$git_tld" != "$(pwd)" ]; then
                cd "$git_tld" || return
        fi
}

function tag_new_release_micro() {
        pretag_gcd
        # turn the `version = x.y.z` line in version.py into a shell assignment
        # of comma-separated parts, e.g. version=1,2,3
        eval $(grep -E "^version =" version.py | tr -d " " | tr "." ",")
        # bump the micro part, then tag and push e.g. v1.2.4
        v_inc=$(python -c "v=[$version]; v[2]+=1; v=map(str,v); print('.'.join(v))") \
        && git tag -a "v$v_inc" -m "$*" \
        && git push origin "v$v_inc"
}
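To spell out what that eval line is doing (assuming version.py contains a single line like version = "1.2.3"; the value here is just for illustration):

# assuming version.py contains:  version = "1.2.3"
grep -E "^version =" version.py | tr -d " " | tr "." ","   # prints: version="1,2,3"
# so after eval, $version expands to 1,2,3, and the Python one-liner sees v=[1,2,3],
# bumps the last element, and prints 1.2.4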

function tag_new_release_minor() {
        pretag_gcd
        eval $(grep -E "^version =" version.py | tr -d " " | tr "." ",")
        v_inc=$(python -c "v=[$version]; v[1]+=1; v[2]=0; v=map(str,v); print('.'.join(v))") \
        && git tag -a "v$v_inc" -m "$*" \
        && git push origin "v$v_inc"
}

function tag_new_release_major() {
        pretag_gcd
        eval $(grep -E "^version =" version.py | tr -d " " | tr "." ",")
        v_inc=$(python -c "v=[$version]; v[0]+=1; v[1]=v[2]=0; v=map(str,v); print('.'.join(v))") \
        && git tag -a "v$v_inc" -m "$*" \
        && git push origin "v$v_inc"
}

...which bump either the micro, minor, or major part, tag it, then push the tag to GitHub.

  • The && ensures that any errors occurring (e.g. in the Python call) will stop the git operations going ahead.

Additionally there's one function pypi_republish:

function pypi_republish {
        EGG_INFO=$(find ./ -iname "*.egg-info")
        if [ -d "$EGG_INFO" ]; then rm -rf "$EGG_INFO"; fi
        rm -rf build/ dist/
        python3 setup.py sdist bdist_wheel
        python3 -m twine upload dist/*
}

...in fact, I updated this to ensure it always runs in the standard conda environment (not the 'base' env, just the default), and then reactivates the one that was running [if there was one] afterwards:

function pypi_republish {
        EGG_INFO=$(find ./ -iname "*.egg-info")
        if [ -d "$EGG_INFO" ]; then rm -rf "$EGG_INFO"; fi
        rm -rf build/ dist/
        # if a conda env is active, remember it and step out of it for the build
        if [ ! -z "${CONDA_DEFAULT_ENV+x}" ]; then
                STORED_CONDA_ENV_NAME=$CONDA_DEFAULT_ENV
                conda deactivate
        fi
        python3 setup.py sdist bdist_wheel
        python3 -m twine upload dist/*
        # restore the previously active conda env, if there was one
        if [ ! -z "${STORED_CONDA_ENV_NAME+x}" ]; then
                conda activate "$STORED_CONDA_ENV_NAME"
                unset STORED_CONDA_ENV_NAME
        fi
}

It's important to note here that the call to setup.py now happens within the pypi_republish step itself, and as such the version it uses may fall behind; that was not the case when the version was obtained interactively from a call to python setup.py --version.

Specifically, the version stored in version.py will fall behind immediately after git tag is run. I don't see this as likely to cause any misunderstanding; however, it would be possible to avoid it by not writing the version to file and solely obtaining it interactively.
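As a rough sketch of that alternative (the function name is made up, and it assumes python3 setup.py --version prints a plain x.y.z version), the micro bump could read the version straight from setup.py:

function tag_new_release_micro_from_setup() {
        pretag_gcd
        # hypothetical variant: read the current version from setup.py rather than version.py
        version=$(python3 setup.py --version | tr "." ",")
        v_inc=$(python3 -c "v=[$version]; v[2]+=1; print('.'.join(map(str,v)))") \
        && git tag -a "v$v_inc" -m "$*" \
        && git push origin "v$v_inc"
}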

These tag_new_release functions are best thought of as "bump release" functions, but I don't think it is possible to [mistakenly] run them more than once: the push would be rejected because the tag already exists on the remote, and in fact creating the git tag locally would already fail for the same reason.
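If you'd rather check up front than rely on that failure, git can tell you whether a tag already exists (v1.2.4 is just a placeholder here):

# exits 0 (and prints the message) only if the tag already exists locally
git rev-parse -q --verify "refs/tags/v1.2.4" >/dev/null && echo "tag v1.2.4 already exists"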

In the spirit of making it completely foolproof, then, here are some all-in-one convenience functions:

function retag_republish_micro() {
        tag_new_release_micro "$@" && pypi_republish
}

function retag_republish_minor() {
        tag_new_release_minor "$@" && pypi_republish
}

function retag_republish_major() {
        tag_new_release_major "$@" && pypi_republish
}

Finito!

So the next time I've committed and pushed my changes, and want to update GitHub and PyPI, I'll just run:

retag_republish_micro "Hello world I'm a new tag annotation"

...and I'll get a new tagged release with the micro part of the version bumped, both on GitHub and PyPI. Simples.