Testing - GermanZero-de/localzero-monitoring GitHub Wiki

Concept

We have the following kinds of tests:

  1. unit tests for our python code, located in cpmonitor/tests
  2. tests for our database migrations: e2e_tests/database
  3. end to end tests for our Django admin frontend: e2e_tests/test_admin.py
  4. end to end tests for our end user facing frontend: e2e_tests/customer_frontend

Tools used

All tests are written with pytest. End-to-end tests are written with the Playwright plugin for pytest. The end-to-end admin tests use pytest-django to set up a live server and provide a test database that is cleaned after each test. We use Docker to provide an end-to-end testing environment similar to the production server.

Running the tests

[!IMPORTANT] All commands should be run within the poetry virtual environment. See How to get started.

Install playwright

The first time playwright is used, let it download the tools it needs with:

playwright install

Playwright might print some instructions for installing further dependencies. If it does, follow those instructions.

Run unit tests for backend

Run backend unit tests:

pytest --ignore e2e_tests

Run a single test (for Playwright-based tests, add --headed to watch what's happening in the browser):

pytest <path-to-test>

Pytest offers further ways to select exactly which tests to run, e.g. by node ID (pytest path/to/test_file.py::test_name) or by keyword expression (pytest -k <expression>).

Run the end-to-end tests using the Django admin frontend and the Next.js customer facing frontend

Prepare the database, images and Docker networks (this only needs to be done once):

rm db/db.sqlite3
rm -r cpmonitor/images/uploads-backup-migration-0029
mkdir cpmonitor/images/uploads/local_groups
poetry run python manage.py migrate --settings=config.settings.local
poetry run python manage.py loaddata --settings=config.settings.local e2e_tests/database/test_database.json
cp -r e2e_tests/database/test_database_uploads/. cpmonitor/images/uploads
docker network create testing_nginx_network
docker network create production_nginx_network

Then, start the local staging environment and the reverse proxy compositions (see Containerization-and-Deployment) and run the tests:

docker-compose up --build --detach
pytest -k e2e_tests
docker-compose down --volumes

Test conventions

  • New test files have to be named according to the convention: test_*.py.
  • Test names should follow the convention: test_should_do_x_when_given_y.
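A minimal test file following both conventions might look like this. The `slugify` helper is a made-up stand-in for whatever production code the test exercises, not actual project code:

```python
# cpmonitor/tests/test_slug.py -- hypothetical file name matching test_*.py.

def slugify(name: str) -> str:
    # Stand-in for production code under test (not actual project code).
    return name.lower().replace(" ", "-")

# Test name follows the test_should_do_x_when_given_y convention.
def test_should_return_hyphenated_slug_when_given_city_name():
    assert slugify("Frankfurt am Main") == "frankfurt-am-main"
```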

Test Data

From a local database filled with suitable data, generate a database dump (here named your_database.json) with

python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/your_database.json

(The -Xutf8 and --indent 2 options ensure consistent and readable output on all platforms.)

The arguments -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions exclude tables which are pre-filled by Django, or filled during usage, and whose content may change depending on the models in the project. If they are included, everything works fine at first, since loaddata silently accepts data that is already there; however, as soon as the data to load clashes with existing content, it fails. -e admin.LogEntry excludes references to content types which may otherwise be inconsistent. -e sessions excludes unneeded data which would otherwise clog the JSON file.
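If the dump is ever scripted, the exclude list is worth keeping in one place. A small sketch, assuming nothing beyond the command shown above (the helper name and structure are hypothetical, not project code):

```python
# Hypothetical helper that builds the dumpdata argument list, mirroring the
# exclude flags used above; not actual project code.
EXCLUDED_LABELS = ["contenttypes", "auth.Permission", "admin.LogEntry", "sessions"]

def dumpdata_args(settings: str = "config.settings.local") -> list[str]:
    args = ["dumpdata"]
    for label in EXCLUDED_LABELS:
        args += ["-e", label]
    args += ["--indent", "2", f"--settings={settings}"]
    return args
```

Such a list could be passed to django.core.management.call_command, though invoking manage.py directly as shown above works just as well.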

Manual tests with data from production

On the test server

To start the testing app with production data from some backup, run the script:

/home/monitoring/start-testing-with-prod-data.sh [backup-folder]

Locally

Retrieve the current database from the server:

rm db/db.sqlite3
scp lzm:testing/db/db.sqlite3 db/
rm -r cpmonitor/images/uploads
scp -r lzm:testing/cpmonitor/images/uploads cpmonitor/images/

To find out which migration version this database is based on, use:

ssh -tt lzm docker exec -it djangoapp-testing python manage.py showmigrations --settings=config.settings.container
# or
ssh -tt lzm docker exec -it djangoapp-production python manage.py showmigrations --settings=config.settings.container

If necessary, migrate, then test the data and check that its size is reasonable. Then make it available to others with:

SNAPSHOT_NAME=prod_database_$(date -u +"%FT%H%M%SZ")
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/${SNAPSHOT_NAME}.json
cp -r cpmonitor/images/uploads e2e_tests/database/${SNAPSHOT_NAME}_uploads
echo "Some useful information, e.g. the migration state of the snapshot" > e2e_tests/database/${SNAPSHOT_NAME}.README
du -hs e2e_tests/database/${SNAPSHOT_NAME}*
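The shell snippet above names the snapshot with date -u +"%FT%H%M%SZ", i.e. a UTC date, a literal T, a compact time, and a literal Z. The same name can be produced in Python if the snapshot workflow is ever scripted there (hypothetical helper, not part of the project):

```python
# Hypothetical equivalent of: SNAPSHOT_NAME=prod_database_$(date -u +"%FT%H%M%SZ")
from datetime import datetime, timezone

def snapshot_name() -> str:
    # %F in date(1) expands to %Y-%m-%d; the trailing Z marks UTC.
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%SZ")
    return f"prod_database_{stamp}"
```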

Commit the result.

Use the database from production

# select the snapshot to use
SNAPSHOT_NAME=prod_database_<some date found in e2e_tests/database/>

# remove previous data
rm db/db.sqlite3
rm -r cpmonitor/images/uploads

# create the database
python manage.py migrate --settings=config.settings.local

# optionally migrate back to a suitable version (see the .README file corresponding to the snapshot you're about to load):
python manage.py migrate cpmonitor <some-earlier-migration> --settings=config.settings.local
python manage.py loaddata --settings=config.settings.local e2e_tests/database/${SNAPSHOT_NAME}.json
cp -r e2e_tests/database/${SNAPSHOT_NAME}_uploads cpmonitor/images/uploads

If the snapshot you want to use is based on an older model version, the remaining migrations have to be applied, which also serves as a test of those migrations:

python manage.py migrate --settings=config.settings.local

The e2e tests will most likely fail, since they are based on a different DB dump, e.g. with different password settings. However, manual tests with the dev server or container-based tests should be possible, and the images should be visible:

python manage.py runserver --settings=config.settings.local
# or
docker compose --env-file .env.local up --detach --build