Showing multiple versions in Doctr
Doctr has many advantages over ReadTheDocs. However, out of the box, it does not have any way to show the documentation for different releases, the way ReadTheDocs does with its "version menu".
A similar version menu, as shown in the screenshot above (from the krotov project), can be generated through the interplay of several scripts:
- A file that is included in every rendered HTML page to produce the versions menu, based
  on information in a `versions.json` file in the root of the `gh-pages` branch. An
  example `versions.json` file is:

  ```json
  {
    "folders": ["master", "v0.1.0", "v0.2.0", "v0.3.0", "v0.4.0", "v0.4.1", "v0.5.0", "v1.0.0"],
    "labels": {
      "master": "master (dev)",
      "v0.1.0": "v0.1.0",
      "v0.2.0": "v0.2.0",
      "v0.3.0": "v0.3.0",
      "v0.4.0": "v0.4.0",
      "v0.4.1": "v0.4.1",
      "v0.5.0": "v0.5.0",
      "v1.0.0": "v1.0.0 (latest release)"
    },
    "versions": ["master", "v0.1.0", "v0.2.0", "v0.3.0", "v0.4.0", "v0.4.1", "v0.5.0", "v1.0.0"],
    "hidden": [],
    "outdated": ["v0.1.0", "v0.2.0", "v0.3.0", "v0.4.0", "v0.4.1", "v0.5.0"],
    "unreleased": ["master"],
    "latest_release": "v1.0.0",
    "downloads": {
      "master": [],
      "v0.1.0": [],
      "v0.2.0": [],
      "v0.3.0": [],
      "v0.4.0": [],
      "v0.4.1": [],
      "v0.5.0": [],
      "v1.0.0": [
        ["pdf", "https://dl.bintray.com/qucontrol/krotov/krotov-v1.0.0.pdf"],
        ["htmlzip", "https://dl.bintray.com/qucontrol/krotov/krotov-v1.0.0.zip"]
      ]
    }
  }
  ```

  The versions menu in the screenshot contains the following:

  - For every folder not in `hidden`, the corresponding label: in bold if the page being
    viewed is inside that folder, and otherwise as a link to the corresponding page in
    that version's folder
  - For the current folder, any links defined in `downloads`
  - Links to the project on Github and to the issue tracker
  - If the current folder is in the `unreleased` or `outdated` lists, a warning that
    links to the `latest_release` is injected into the page
- A script that is called during `doctr deploy` to generate the above `versions.json`
  file. It works in conjunction with `versions.py`.

  It looks for all subfolders in the `gh-pages` branch. The names of these folders are
  assumed to correspond to git tags or branch names; that is, for release versions, the
  tag name should be something like `v0.1.0`. The most current of these (non-pre-)releases
  (according to PEP 440 / `parse_version`) is detected as the `latest_release`. All other
  releases are detected as `outdated`, and non-releases (e.g. `master`) or pre-releases as
  `unreleased`.

  For every folder, download links are read from a text file `_downloads` directly inside
  that folder, if it exists. The labels for the download links are derived from the file
  extension (the part of the URL after the last dot).

  The script also verifies that there is a `.nojekyll` file in the `gh-pages` root (to
  avoid problems with folders whose names start with underscores) and writes an
  `index.html` that redirects to the default folder (the current `latest_release`, or
  `master` if there has not been a release). See the sketch after this list for an
  illustration of this post-processing step.
- The `doctr_build.sh` script, which is called in `.travis.yml` to build the documentation
  and generate the download links (in the `_downloads` file).
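To illustrate the post-processing step described in the second item, here is a minimal Python sketch that classifies the folders of a `gh-pages` checkout and writes a `versions.json` file. It is not the project's actual script; all function names and implementation details below are illustrative assumptions, and the `.nojekyll` check and `index.html` redirect are omitted:

```python
"""Minimal sketch (not the actual script): generate versions.json in a
gh-pages checkout. All names and details are illustrative assumptions."""
import json
import os

from packaging.version import InvalidVersion, Version  # PEP 440 versions


def parse_release(name):
    """Return a Version for a release folder like 'v0.1.0', or None."""
    try:
        version = Version(name)  # PEP 440 parsing accepts a leading "v"
    except InvalidVersion:
        return None  # e.g. "master" is not a release
    if version.is_prerelease:
        return None  # pre-releases count as "unreleased"
    return version


def get_downloads(folder):
    """Read download links from the `_downloads` file inside `folder`."""
    downloads = []
    downloads_file = os.path.join(folder, '_downloads')
    if os.path.isfile(downloads_file):
        with open(downloads_file) as in_fh:
            for url in in_fh:
                url = url.strip()
                if url:
                    # label = part of the URL after the last dot (the real
                    # script may map extensions to nicer labels like "htmlzip")
                    downloads.append([url.split(".")[-1], url])
    return downloads


def write_versions_json(root='.'):
    """Classify the subfolders of `root` and write `root`/versions.json."""
    folders = sorted(
        name for name in os.listdir(root)
        if os.path.isdir(os.path.join(root, name))
        and not name.startswith(('.', '_'))
    )
    releases = {name: parse_release(name) for name in folders}
    released = [name for name in folders if releases[name] is not None]
    unreleased = [name for name in folders if releases[name] is None]
    latest_release = None
    if released:
        latest_release = max(released, key=lambda name: releases[name])
    labels = {name: name for name in folders}
    if 'master' in labels:
        labels['master'] = 'master (dev)'
    if latest_release is not None:
        labels[latest_release] = '%s (latest release)' % latest_release
    data = {
        'folders': folders,
        'versions': folders,
        'labels': labels,
        'hidden': [],
        'outdated': [name for name in released if name != latest_release],
        'unreleased': unreleased,
        'latest_release': latest_release,
        'downloads': {
            name: get_downloads(os.path.join(root, name)) for name in folders
        },
    }
    with open(os.path.join(root, 'versions.json'), 'w') as out_fh:
        json.dump(data, out_fh)


if __name__ == '__main__':
    write_versions_json()
```

A script along these lines would run during `doctr deploy`, after the newly built documentation has been copied into the `gh-pages` branch.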
The `doctr_build.sh` script handles building the documentation both in HTML format and as binary artifacts such as pdf, epub, or zipped-html versions of the documentation. It is not recommended to include these binary artifacts in `gh-pages`: Git is not great at handling binary files, and these artifacts can quickly blow up your repository size. Instead, the `doctr_build.sh` script should upload the artifacts to some suitable provider and append the resulting download links to the `_downloads` file in `gh-pages`.
One recommended provider is Bintray. Another possibility is to attach the documentation artifacts to Github releases.
Bintray has a generous free account for open source projects. Create an account there, and set up a "repo" and "package" mirroring the names of the Github project.
Uploading to Bintray requires setting the `BINTRAY_USER`, `BINTRAY_SUBJECT`, `BINTRAY_REPO`, `BINTRAY_PACKAGE`, and `BINTRAY_TOKEN` environment variables in `.travis.yml`. In `doctr_build.sh`, you can verify that these environment variables are set correctly:
```bash
if [ ! -z "$TRAVIS" ] && [ "$TRAVIS_EVENT_TYPE" != "pull_request" ]; then
    echo "## Check bintray status"
    # We *always* do this check: we don't just want to find out about
    # authentication errors when making a release
    if [ -z "$BINTRAY_USER" ]; then
        echo "BINTRAY_USER must be set" && sync && exit 1
    fi
    if [ -z "$BINTRAY_TOKEN" ]; then
        echo "BINTRAY_TOKEN must be set" && sync && exit 1
    fi
    if [ -z "$BINTRAY_PACKAGE" ]; then
        echo "BINTRAY_PACKAGE must be set" && sync && exit 1
    fi
    url="https://api.bintray.com/repos/$BINTRAY_SUBJECT/$BINTRAY_REPO/packages"
    response=$(curl --user "$BINTRAY_USER:$BINTRAY_TOKEN" "$url")
    if [ -z "${response##*$BINTRAY_PACKAGE*}" ]; then
        echo "Bintray OK: $url -> $response"
    else
        echo "Error: Cannot find $BINTRAY_PACKAGE in $url: $response" && sync && exit 1
    fi
fi
```
Then, only when deploying the documentation for a tagged release, and assuming the documentation artifacts have been generated in `docs/_build/artifacts`, the following code uploads them:
echo "Upload artifacts to bintray"
for filename in docs/_build/artifacts/*; do
url="https://api.bintray.com/content/$BINTRAY_SUBJECT/$BINTRAY_REPO/$BINTRAY_PACKAGE/$TRAVIS_TAG/$(basename $filename)"
echo "Uploading $filename artifact to $url"
response=$(curl --upload-file "$filename" --user "$BINTRAY_USER:$BINTRAY_TOKEN" "$url")
if [ -z "${response##*success*}" ]; then
echo "Uploaded $filename: $response"
echo "https://dl.bintray.com/$BINTRAY_SUBJECT/$BINTRAY_REPO/$(basename $filename)" >> docs/_build/html/_downloads
else
echo "Error: Failed to upload $filename: $response" && sync && exit 1
fi
done
echo "Publishing release on bintray"
url="https://api.bintray.com/content/$BINTRAY_SUBJECT/$BINTRAY_REPO/$BINTRAY_PACKAGE/$TRAVIS_TAG/publish"
response=$(curl --request POST --user "$BINTRAY_USER:$BINTRAY_TOKEN" "$url")
if [ -z "${response##*files*}" ]; then
echo "Finished bintray release : $response"
else
echo "Error: Failed publish release on bintray: $response" && sync && exit 1
fi
Attaching files to a Github release requires a `GITHUB_TOKEN` for authorization in the `.travis.yml` file.

Note that such a token has very broad authorization to all repositories for a particular user account. If you use such a token, you might as well use it also for deploying Doctr (in lieu of the more fine-grained deploy key).

In `doctr_build.sh`, the `GITHUB_TOKEN` can be verified as follows:
```bash
if [ ! -z "$TRAVIS" ] && [ "$TRAVIS_EVENT_TYPE" != "pull_request" ]; then
    echo "## Check GITHUB_TOKEN status"
    # We *always* do this check: we don't just want to find out about
    # authentication errors when making a release
    if [ -z "$GITHUB_TOKEN" ]; then
        echo "GITHUB_TOKEN must be set" && sync && exit 1
    fi
    GH_AUTH_HEADER="Authorization: token $GITHUB_TOKEN"
    url="https://api.github.com/repos/$TRAVIS_REPO_SLUG"
    # --fail makes curl return a non-zero exit code on HTTP errors, so that
    # an invalid repo or token actually triggers the error branch
    curl --fail -o /dev/null -sH "$GH_AUTH_HEADER" "$url" || { echo "Error: Invalid repo, token or network issue!"; sync; exit 1; }
fi
```
Then, for tagged releases where the documentation artifacts have been built in `docs/_build/artifacts`, the files can be uploaded with:
url="https://api.github.com/repos/$TRAVIS_REPO_SLUG/releases"
echo "Make release from tag $TRAVIS_TAG: $url"
API_JSON=$(printf '{"tag_name": "%s","target_commitish": "master","name": "%s","body": "Release %s","draft": false,"prerelease": false}' "$TRAVIS_TAG" "$TRAVIS_TAG" "$TRAVIS_TAG")
echo "submitted data = $API_JSON"
response=$(curl --data "$API_JSON" --header "$GH_AUTH_HEADER" "$url")
echo "Release response: $response"
url="https://api.github.com/repos/$TRAVIS_REPO_SLUG/releases/tags/$TRAVIS_TAG"
echo "verify $url"
response=$(curl --silent --header "$GH_AUTH_HEADER" "$url")
echo "$response"
eval $(echo "$response" | grep -m 1 "id.:" | grep -w id | tr : = | tr -cd [[:alnum:]]=')
echo "id = $id"
for filename in docs/_build/artifacts/*; do
url="https://uploads.github.com/repos/$TRAVIS_REPO_SLUG/releases/$id/assets?name=$(basename $filename)"
echo "Uploading $filename as release asset to $url"
response=$(curl "$GITHUB_OAUTH_BASIC" --data-binary @"$filename" --header "$GH_AUTH_HEADER" --header "Content-Type: application/octet-stream" "$url")
echo "Uploaded $filename: $response"
echo $response | python -c 'import json,sys;print(json.load(sys.stdin)["browser_download_url"])' >> docs/_build/html/_downloads
done
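In either variant, the URLs appended to `docs/_build/html/_downloads` end up as the `_downloads` file inside the deployed version folder on `gh-pages`, which is where the post-processing script described above reads them from when it generates `versions.json`.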