Quality Guidelines
Before releasing a new version, the following tests have to be conducted and checked successfully. All deviations have to be documented.

For all of the following checks, use the most current version of Microsoft Edge, unless stated otherwise.

SASUnit uses continuous integration (CI), therefore most runs are triggered automatically.

These tests consist of the following parts:
- **SASUnit self tests**
  Please note that the purpose of this test suite is to have a number of tests which check the correct functionality of SASUnit itself. This implies that some tests have to fail and that testing involves checking whether every test has the intended outcome.
- **SASUnit example project**
  Please note that the purpose of this project is to demonstrate SASUnit functionality to the end user (i.e. the SAS programmer who wants to test their programs) and to serve as a starting point for their own tests. When comparing boxplot results, please note that, as always in report inspection, there might be some small visual differences between the actual and the expected outcome.
**Common functionality moved into the CI tool (e.g. GitHub Actions / workflows)**

Using a pipeline helps to automate testing and to reduce the testing effort. Therefore, several workflows (GitHub Actions) were created. This section focuses on the workflow test.yaml, which runs after each push to a branch.
The workflow contains the following global tests as jobs:

- **Run Self Test**
  This is a matrix job that starts the self tests in English and German under Windows 10 (German) and CentOS 7 (English), so this job runs four tests.
- **Test Path with spaces**
  This job tests SASUnit under CentOS with a path that contains spaces to ensure proper handling of such paths.
Each of these jobs runs the following steps (a command sketch follows this list):

- **Run SASUnit self tests**
  - Create the SASUnit start scripts by executing the sasunit.setup.EE_CI_xx scripts
  - Run only the changed programs by executing sasunit.<SASVersion>.<Operating system>.<language>.fast.cmd (.sh)
  - Run all programs with full documentation by executing sasunit.<SASVersion>.<Operating system>.<language>.overwrite.full.cmd (.sh)
- **Run SASUnit example project**
  - Create the SASUnit start scripts by executing the sasunit.setup.EE_CI_xx scripts
  - Run all programs with full documentation by executing sasunit.<SASVersion>.<Operating system>.<language>.overwrite.full.cmd (.sh)
- **Test Distribution on Linux System (in /etc, /opt, /var)**
  This job recreates the distribution of files and data into the /etc, /opt and /var trees. The job runs under CentOS.
  - Distribute programs and data into the /etc, /opt, /var structure.
  - Create the SASUnit start scripts by executing sasunit.setup.EE_CI_CentOS_etcoptvar.sh.
  - Run all tests of the example project by executing sasunit.9.4.linux.de.overwrite.full.sh.
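For orientation, the step sequence of such a job can be reproduced on a local checkout roughly as follows. This is a minimal sketch, assuming a Linux environment with SAS 9.4 and English SASUnit settings; the concrete setup script name (EE_CI_xx is a placeholder) and the directory from which the start scripts are invoked depend on the environment.

```sh
#!/bin/bash
# Minimal sketch of one CI job's step sequence on a local Linux checkout.
# Assumptions: SAS 9.4, English SASUnit settings; script locations may differ.
set -e

# 1. Create the SASUnit start scripts (EE_CI_xx stands for the setup script
#    of the respective CI environment; for the /etc-/opt-/var job,
#    sasunit.setup.EE_CI_CentOS_etcoptvar.sh is used instead).
./sasunit.setup.EE_CI_xx.sh

# 2. Run only the changed programs (incremental run).
./sasunit.9.4.linux.en.fast.sh

# 3. Run all programs with full documentation (overwrite mode).
./sasunit.9.4.linux.en.overwrite.full.sh
```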
Procedure to thoroughly test all environments:
For the first platform (Windows 10, 64-Bit, German / SAS 9.4 64 bit / SASUnit settings English), perform the following checks:

- **Check the status of the pipelines in the CI workflow (GitHub: test.yaml)**
- Check the status of each run noted above.
- **Check the example project test suite**
- Check the test scenario page:
- All scenarios present? Check with scenario programs.
- Result as expected for each scenario? Result must be red where indicated in test scenario description, otherwise green or white!
- All scenario logs with exactly those errors and warnings expected from the various test cases?
- Links to test scenario details and programs functional? Check a small random sample!
- Correct tooltips on mouse hover for each column except duration? Check a small random sample!
- Correct overall appearance?
- Check the test cases page:
- All scenarios present? Check with scenario programs.
- Info and Links in header same as in scenario table? Check for a small sample!
- Links to test case detail pages functional? Check for a small sample!
- Links to units under test functional? Check for a small sample!
- Links to logs functional? Check for a small sample!
- Correct tooltips on mouse hover for each column except duration? Check for a small sample!
- Information and links on headers of test case details pages as in test cases page? Check for a small sample!
- Correct overall appearance?
- Check a sample of test case detail pages, covering at least two occurrences of every assertion type.
- Check the page Units under Test:
- Tooltips on mouse-over correct?
- Check navigation tree
- Same scenarios, test cases and tests as on scenario report pages? Check with a sample!
- Proper tooltips on all levels of navigation tree below scenarios? Check with a sample!
- Same units under test, test cases and tests as on units under test page? Check with a sample!
- Proper tooltips on all levels of navigation tree below units under test? Check with a sample!
- Check the main page of the output:
- Correct title with link to SF?
- Correct footnote with link to SF and correct version and build number?
- Correct overall appearance?
- run_all.log must not contain any errors or warnings
- **Check the self-test suite**
- Check the test scenario page:
- All scenarios present? Check with scenario programs.
- Result as expected for each scenario? Result must be red where indicated in test scenario description, otherwise green or white! Test scenario number 001 must be red although not stated in the description.
- Check the test cases page:
- All scenarios present? Check with scenario programs. Every program ending in "_test.sas" in the saspgm/test folder is a scenario program.
- For each scenario
- All test cases per scenario present? Check with source code for each scenario!
- Log with exactly those errors and warnings expected from the various test cases?
- Result as expected for each test case? Result must be red where indicated in test case description, otherwise green or white! Test case 001 of scenario 007 (reportsasunit_emptyscn_test.sas) must fail although not stated in description.
- For each test case open test case detail page
- Result as expected for each test? Result must be red where indicated in test description, otherwise green (or white for assertreport only)!
- Manually check for each occurrence of assertreport (white result) whether results are as indicated in test description!
- Check page Units under Test
- All units under test present that are specified in at least one test scenario? Check with the source code of the test scenarios!
- All units under test correctly specified per program library and with test scenario? Check with source code of test scenarios!
- Results correct?
- Check the main page of the output:
- run_all.log must not contain any errors or warnings (a log check sketch follows the table below)
- The main page must show the following values, and all links must be functional:
| Item | Macro variable | Expected value |
|---|---|---|
| Name of project | &g_project | SASUnit |
| Root directory | &g_root | [check root directory and link] |
| Path to test repository | &g_target | doc/sasunit/en |
| Program libraries (macro autocall paths) | &g_sasautos, &g_sasautos1, &g_sasautos2, &g_sasautos3 | saspgm/sasunit, saspgm/test, saspgm/test/pgmlib1, saspgm/test/pgmlib2 |
| SAS configuration file for test scenarios | &g_sascfg | bin/sasunit.9.3.windows.en.cfg |
| Folder for test data | &g_testdata | dat |
| Folder for reference data | &g_refdata | dat |
| Folder for specification documents | &g_doc | doc/spec |
| Path to SASUnit macros | &g_sasunit | saspgm/sasunit |
| SAS log of reporting job | | doc/sasunit/en/run_all.log |
| Platform | &SYSCPL | W64_7PRO |
| SAS Version | &SYSVLONG4 | 9.03[id of maintenance release] |
| User ID | &SYSUSERID | [user id of operator] |
| SASUnit Language | SASUNIT_LANGUAGE | en |
| Number of test scenarios | | Self Tests: 189 (6); Example project: 12 (1) |
| Number of test cases | | Self Tests: 516 (2); Example project: 55 (1) |
| Number of assertions | | Self Tests: 2165 (2); Example project: 142 (1) |
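To support the run_all.log checks above, the log can be scanned for error and warning lines from the shell, for example as sketched below. This is a hedged example that assumes the English report directory doc/sasunit/en from the table and the standard SAS log prefixes ERROR: and WARNING:.

```sh
#!/bin/bash
# Sketch: report whether run_all.log contains any SAS error or warning lines.
# Assumption: report directory doc/sasunit/en as listed in the table above.
if grep -E '^(ERROR|WARNING)' doc/sasunit/en/run_all.log; then
    echo "run_all.log contains errors or warnings" >&2
    exit 1
else
    echo "run_all.log is clean"
fi
```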
- **Test the incremental build facility with the example project**
  On a local checkout of the current version, do the following checks (a command sketch follows this block):
- Touch (save without changes) the programs example/saspgm/nobs.sas and getvars_test.sas
- Run the example project test suite in non-overwrite mode in English
- Check last run date/time on scenario overview page - must have been updated for nobs.sas and getvars.sas tests and unchanged for all other scenarios except for scenario tree1_test.sas which will always run.
- Check generation date/time for test case page - must have been updated
- Check generation date/time of a small sample of test case details pages of test cases for nobs.sas and getvars.sas - must have been updated
- Check generation date/time of a small sample of test case details pages of other test cases - must be unchanged
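The touch-and-rerun part of this check can be performed on a Linux checkout roughly as sketched below. This is a minimal sketch; the start script name (here sasunit.9.4.linux.en.fast.sh as the non-overwrite run) and the location of getvars_test.sas are assumptions based on the naming patterns used above.

```sh
#!/bin/bash
# Sketch: touch two programs and trigger a non-overwrite (incremental) run.
# Assumptions: Linux checkout, SAS 9.4, English SASUnit settings.
set -e

# Save without changes (i.e. only update the file timestamps).
touch example/saspgm/nobs.sas
touch example/saspgm/test/getvars_test.sas   # assumed location of the scenario program

# Run the example project test suite in non-overwrite mode (the "fast" script
# only reruns scenarios whose programs or units under test have changed).
./sasunit.9.4.linux.en.fast.sh

# Afterwards, check the last run / generation timestamps as described above.
```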
- **Check rendering in Firefox (latest version)**
- Open the report of the example project (Windows SAS 9.3 64 bit, OS settings German, SASUnit English) in Firefox (latest version) and compare with IE9
- Open the Doxygen documentation for the example project in Firefox (latest version) and compare with IE9
- **Check rendering in Chrome (latest version)**
- Open the report of the example project (Windows SAS 9.3 64 bit, OS settings German, SASUnit English) in Chrome (latest version) and compare with IE9
- Open the Doxygen documentation for the example project in Chrome (latest version) and compare with IE9
On each of the following further platforms, repeat the following steps:

- **Platforms**
- Windows 10, 64-Bit, German / SAS 9.4 64 bit / SASUnit settings English
- CentOS 7 / SAS 9.4 64 bit / SASUnit settings English
- CentOS 7 / SAS 9.4 64 bit / with spaces
- CentOS 7 / SAS 9.4 64 bit / etc_opt_var
- **Steps**
- Check the example project test suite: compare to baseline (compare all main pages and a sample of the test case details pages)
- Check the self-test suite: compare to baseline (compare all main pages and a sample of the test case details pages)
- "Compare to baseline" means: compare the results on each platform to the results of the baseline (Windows 10, 64-Bit, English / SAS 9.3 64 bit) above. Compare the HTML reports using Firefox (latest version) under Microsoft Windows 7.
- Please note that the sort order of test scenarios differs between Windows and Linux: Linux ignores the underscore when sorting scenario program names while Windows doesn't.
Back to Development Guidelines