Tortoise Tests

The Tortoise tests are divided into a few categories.

Docking tests use the headless version of NetLogo to run some NetLogo code, then use Tortoise to run the same code, and compare the results to make sure they are identical. The code under test can be a stand-alone snippet or a full-fledged model run. The state of each test is exported to JSON, and the JSON is compared to identify any discrepancies. As such, these tests can be very slow.
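Like the language tests below, an individual docking suite can be run on its own with sbt's testOnly; the suite name here is a hypothetical placeholder, so substitute the actual suite you're interested in:

# from the sbt console; *SomeDockingSuite is a placeholder, not a real suite name
netLogoWeb/testOnly *SomeDockingSuite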

Language tests are tests that run NetLogo code snippets directly and contain assertions as to what the results should be. Most of the language tests are actually defined in NetLogo desktop and are imported for use by Tortoise.

  • The exception is the Tortoise.txt file, which is a good place to temporarily put NetLogo code to quickly test while working on an issue in Tortoise. Run with netLogoWeb/testOnly *TestCommands -- -z Tortoise
  • The language tests are broadly split into TestCommands and TestReporters, with further divisions by NetLogo language primitive.
  • Because the TestCommands can take a long time to run, it's nice to pick just the one you're interested in, like this: netLogoWeb/testOnly *TestCommands -- -z Ask (this runs all command tests with "Ask" in the title).
  • If some docking or language tests fail, you can run the browseFailures sbt command in order to launch your default web browser with a local HTML file listing the failed tests. Clicking one of those will open an HTML file that you can inspect in the web developer console to more quickly debug the actual generated JavaScript of the language test (see the example after this list).
    • These reports are stored as HTML files in netlogo-web/target/last-test-run-reports/ if you need to examine them yourself. They are not automatically cleared or updated when tests pass, so you'll still see the files there after a successful test run, which can cause confusion if you're just reloading the files in your browser while fixing an issue.
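Running the failure report from the sbt console looks like this (depending on how the build is set up, the task may need to be scoped to a sub-project, e.g. netLogoWeb/browseFailures):

# from the sbt console, after a failing test run:
browseFailures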

There are also standard arrange/act/assert tests for different parts of the Scala codebase, in compiler/shared/src/test/scala, compiler/js/src/test/scala, compiler/jvm/src/test/scala/json, and netlogo-web/src/test/scala.
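Each of these can be run per sub-project with sbt's standard test task, for example:

# from the sbt console, run one sub-project's unit tests:
compilerJVM/test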

There is a Scala style checker set up to catch common code-formatting issues. You can use the stylecheck task in sbt to check all the Scala sub-projects.
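Running it from the sbt console looks like this:

# checks code formatting across all the Scala sub-projects:
stylecheck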

There are also model JavaScript compilation dump tests; more on those below.

Tests are also categorized by speed, with the options being crawl, slow, medium, and fast.
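Some of the speed categories show up as sbt tasks in the build-server list below; for example, to run only the fast netLogoWeb tests:

# run only the fast tests in the netLogoWeb project:
netLogoWeb/test:fast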

There is a bash shell script that runs a relatively thorough but not-too-slow subset of the tests. It's good to run before finalizing commits for a PR; pushing up to GitHub then runs the full test suite through the build server.

# from the root of the Tortoise repository:
./resources/scripts/run-quick-tests.sh

There is also a more in-depth run-medium-tests.sh script that takes a bit longer:
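# also from the root of the Tortoise repository:
./resources/scripts/run-medium-tests.sh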

Here is the full list of tests that are run by our build tool when a branch is pushed to GitHub:

compilerJVM/test:compile
compilerJVM/test:test
compilerJVM/depend
compilerJS/test:compile
compilerJS/test:test
netLogoWeb/test:compile
netLogoWeb/test:fast
netLogoWeb/test:language
netLogoWeb/test:crawl

These tests take a long time to finish, so in general it's easier to let the build tool handle running them and report back any problems.
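That said, if you do need to reproduce the full set locally, the same tasks can be handed to a single batch sbt invocation:

# from the root of the Tortoise repository; expect this to take a long time
sbt compilerJVM/test:compile compilerJVM/test:test compilerJVM/depend compilerJS/test:compile compilerJS/test:test netLogoWeb/test:compile netLogoWeb/test:fast netLogoWeb/test:language netLogoWeb/test:crawl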

Model Dump Checks

Tortoise contains a set of model dumps: the JavaScript generated by compiling the NetLogo code of some of our models. These are maintained to give a "last resort" check that changes made to the compiler result in the intended JavaScript output. The files are stored in resources/test/dumps.

To check the model dumps against your Tortoise changes, run netLogoWeb/testOnly *ModelDumpTests in the sbt console. When this task runs, it will store any JavaScript files that do not match in target/ and notify you by failing the test.

If the change is intended, then the model dump JavaScript files must be updated and committed. This is as simple as moving the target/*.js files back over the resources/test/dumps files. Using a macOS bash terminal, you can do:

mv target/*.js resources/test/dumps/

Note that this won't delete dumps for models that have been renamed or deleted.
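If you want to spot stale dumps, one rough check is to list files that exist in resources/test/dumps but were not regenerated into target/; this is just a sketch, and it assumes both directories hold only the relevant .js files:

# list dump files that were not regenerated into target/ (bash)
comm -23 <(ls resources/test/dumps) <(cd target && ls *.js)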

When model JavaScript dumps are updated, they're generally added as a single stand-alone commit directly after the commit(s) that caused the JavaScript output to change.