PDS EN CI: A Design Memoir - NASA-PDS/nasa-pds.github.io GitHub Wiki

From @nutjob4life:

Since repositories are cheap (well, maybe not at ZenHub's speed), I'd make an integration repo (or add features to the "corral" repo) whose job is mainly to contain tests. The repo could contain the support files necessary to make an integration test happen, plus the Actions workflows with the triggers that get the tests going. Other repos can fire those triggers via `workflow_dispatch`, `repository_dispatch`, or our own "GitHub Ping". There are also scheduled runs, essentially cron for GitHub Actions.
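The trigger setup might look like this sketch (the workflow name and the `repository_dispatch` event type are hypothetical, not settled):

```yaml
# Sketch of the integration repo's workflow triggers.
name: integration-tests
on:
  workflow_dispatch:                    # manual or API-driven runs
  repository_dispatch:
    types: [run-integration-tests]      # fired by other repos via the REST API
  schedule:
    - cron: '0 6 * * *'                 # nightly run: "cron" for GitHub Actions
jobs:
  …
```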

So what would such a repo + workflow do?

For each kind of integration test, I'd envision a single directory plus the Docker Composition that goes with it:

📁  test 1/
    📄 docker-compose.yaml
    📁 etc/
    📄 run-tests.sh
📁  test 2/
    📄 docker-compose.yaml
    📁 etc/
    📄 run-tests.py
…
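One way to drive such a layout is a small script that walks the test directories and runs each one serially. This is only a sketch: `run_integration_tests` is a hypothetical name, and the `run` parameter is injectable so the loop logic can be exercised without Docker installed:

```python
import pathlib
import subprocess

def run_integration_tests(root=".", run=subprocess.run):
    """Run each test directory's Docker stack and driver serially (sketch).

    `run` defaults to subprocess.run but can be swapped for a stub in tests.
    Returns the names of test directories whose driver exited non-zero.
    """
    failures = []
    for compose in sorted(pathlib.Path(root).glob("*/docker-compose.yaml")):
        testdir = compose.parent
        print(f"=== {testdir.name}")
        # Bring up the services this test needs...
        run(["docker", "compose", "up", "--detach"], cwd=testdir, check=True)
        try:
            # ...then hand off to the run-tests.* driver, whatever flavor.
            driver = next(testdir.glob("run-tests.*"))
            result = run([f"./{driver.name}"], cwd=testdir)
            if result.returncode != 0:
                failures.append(testdir.name)
        finally:
            # Always tear the stack back down before the next test.
            run(["docker", "compose", "down", "--volumes"], cwd=testdir,
                check=True)
    return failures
```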

The docker-compose.yaml defines the services (containers) that participate in the test, while the run-tests driver (of whatever flavor) does the non-Docker setup and checks whether the integration works. The action could cycle through each directory and run each test serially. For example, an end-to-end Registry API test would:

  • Pull the latest published registry image from Docker Hub
    • Start a registry container with a published port
    • Start its dependent containers (Elasticsearch, database, etc.)
  • Pull the latest published API client from PyPI
    • Install it into a local venv
  • Run some Python against the API client
    • Here a doctest is ideal because you can mix narrative expectations with the code that makes the assertions, essentially turning the integration tests into a README.txt.
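Such a doctest-driven run-tests.py might look like this sketch; `StubRegistryClient` and the bundle identifier are hypothetical stand-ins for the real API client and registry contents:

```python
import doctest

NARRATIVE = """
Narrative and assertions interleaved, README-style.

Connect to the registry (the stub below stands in for the real
API client, which would point at http://server:8080/):

>>> client = StubRegistryClient("http://server:8080/")
>>> client.status()
'UP'

A known bundle should be discoverable:

>>> "urn:nasa:pds:example_bundle" in client.list_bundles()
True
"""

class StubRegistryClient:
    """Hypothetical stand-in for the real registry API client."""
    def __init__(self, url):
        self.url = url

    def status(self):
        return "UP"

    def list_bundles(self):
        return ["urn:nasa:pds:example_bundle"]

# Parse and run the narrative's examples, as run-tests.py might.
parser = doctest.DocTestParser()
test = parser.get_doctest(
    NARRATIVE, {"StubRegistryClient": StubRegistryClient}, "readme", None, 0)
runner = doctest.DocTestRunner()
results = runner.run(test)
print(f"failed={results.failed} attempted={results.attempted}")
```

The narrative text doubles as documentation of what the integration is supposed to do, which is exactly the README-like quality argued for above.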

The script would execute whatever registry operations we deem relevant for testing. Note that Docker Hub already has this feature (see https://docs.docker.com/docker-hub/builds/automated-testing/; automated builds and testing are available with a free Hub account):

```yaml
services:
    sut:  # :point_left: Here's the automated testing part; it must be called sut¹
        depends_on:
            - client
        build: .
        command: run-tests.sh
    client:
        image: nasapds/api-client:latest
        depends_on:
            - server
        volumes:
            - ./etc:/mnt/tests  # ²
        command: --testdir /mnt/tests --server http://server:8080/  # ³
    server:
        image: nasapds/registry-api:latest
        ports:
            - "8080:8080"
        networks: …
networks:
    …
```

But this requires Dockerization of every aspect to be tested. The API client would have to come from an image rather than from PyPI. (Which isn't such a bad thing, really: when it comes to runnable software, people are more and more looking to Docker Hub.) And although the testing would live at Docker Hub, it can still be triggered from GitHub.

¹ System Under Test.
² Assume the test files are in `etc`.
³ This violates some Docker principles, though: images are supposed to be lightweight and contain only the code germane to their operation. Here, the api-client image has some extra baggage to support the `--testdir` command-line option.