# Physics Checking agenda July 20, 2017 - GeneralizedNuclearData/SG43 GitHub Wiki
- Wiki updated with initial list of physics checking rules.
- Kenichi and his Japanese colleagues don't use any locally developed checking codes; they use CHECKR, FIZCON, and PSYCHE from the ENDF-6 Checking & Utility Codes. SAMMY does internal checking. PUFF does some additional checking. The errors that these codes check for need to be added to the list.
- Severity levels have been expanded:
  - **fatal** — something really bad happened; can't continue.
  - **error** — the data is clearly wrong and there is no unambiguous way to fix it.
  - **warning** — the data is not perfect, but we can work around it.
  - **notice** — not an error condition, but may require additional investigation or special handling.
  - **info** — informational messages; typically given after an earlier problem.

  The first three levels (fatal, error, warning) should be followed by an informational message to provide additional information. These levels and definitions adequately span the phase space for our error checking.
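The five severity levels could be encoded as an ordered enumeration so that checks can compare and filter on them. A minimal Python sketch follows; the numeric ordering and the helper function are assumptions for illustration, not part of the notes:

```python
from enum import IntEnum

class Severity(IntEnum):
    """Severity levels from the notes; the numeric ordering is illustrative."""
    INFO = 0
    NOTICE = 1
    WARNING = 2
    ERROR = 3
    FATAL = 4

def needs_followup_info(level: Severity) -> bool:
    # Per the notes, fatal, error, and warning should be followed
    # by an additional informational message.
    return level >= Severity.WARNING
```

Using an `IntEnum` makes severity thresholds (e.g., "report warning and above") a single comparison.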
- Please read through the rules before the meeting. We don't want to have to explicitly mention every rule during the meeting; that wouldn't be a good use of our time. Be prepared to talk about:
  - Any issues with the current list of rules.
    No issues were brought up, but few have had a chance to look through the errors before the meeting. Even fewer have added the set of errors that their own institution looks for.
  - What severity should be issued for each failed check.
    Many of the errors can stem from the limited precision of ENDF data; the severity for this kind of error can probably be adjusted (bumped).
  - Prioritization of the current rules.
    Need to balance the importance of an error against the simplicity of checking for it. May need to separate the checking into different regions:
    - Resolved
    - Unresolved
    - Fast
    - etc.
    The physics checking may be different for each region. Could have one command that issues separate sub-commands.
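The "one command that issues separate sub-commands" idea could be sketched with Python's `argparse` subparsers, one per region. The program name (`physcheck`), the region names, and the positional argument are all hypothetical:

```python
import argparse

def build_parser():
    # Hypothetical top-level checker command with one sub-command per energy region.
    parser = argparse.ArgumentParser(prog="physcheck")
    sub = parser.add_subparsers(dest="region", required=True)
    for region in ("resolved", "unresolved", "fast"):
        p = sub.add_parser(region, help=f"run the {region}-region physics checks")
        p.add_argument("evaluation", help="evaluation file to check")
    return parser

# Example invocation: check the resolved region of a (hypothetical) file.
args = build_parser().parse_args(["resolved", "example.endf"])
```

Each sub-command could then dispatch to the region-specific set of checking rules.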
JUnit is the standard unit testing framework for Java, and similar frameworks exist for many other languages. Python's `unittest` module is the implementation used in FUDGE. It is very simple to set up a unit test:
```java
import org.junit.*;

public class FoobarTest {
    @Before
    public void setUp() throws Exception {
        // Code executed before each test
    }

    @Test
    public void testOneThing() {
        // Code that tests one thing
    }

    @After
    public void tearDown() throws Exception {
        // Code executed after each test
    }
}
```
JUnit XML is a simple format for storing/reporting test results:
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<testsuites id="20140612_170519" name="New_configuration (14/06/12 17:05:19)"
            tests="225" failures="1262" time="0.001">
  <testsuite id="codereview.cobol.analysisProvider"
             name="COBOL Code Review" tests="45" failures="17" time="0.001">
    <testcase id="codereview.cobol.rules.ProgramIdRule"
              name="Use a program name that matches the source file name" time="0.001">
      <failure message="PROGRAM.cbl:2 Use a program name that matches the source file name"
               type="WARNING">
WARNING: Use a program name that matches the source file name
Category: COBOL Code Review – Naming Conventions
File: /project/PROGRAM.cbl
Line: 2
      </failure>
    </testcase>
  </testsuite>
</testsuites>
```
The `"WARNING"` value of the `type` attribute on the `<failure>` node is where our severity levels would go. The body of the `<failure>` node is where the error message will be written. See the JUnit XML Format for specifications.
Need to come up with a standard message for each failed check. That way, if the checking code is compliant, the failure message is known. Leave the body of the nodes for arbitrary messages that can be code dependent; each code/institution can decide what goes in the body. This goes for all the nodes, i.e., the `<testsuites>`, `<testsuite>`, `<testcase>`, and `<failure>` nodes.
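A sketch of how a checking code might emit this format with Python's standard-library `xml.etree.ElementTree`. The suite/case ids, the rule name, and all messages below are hypothetical; the severity level goes in the `type` attribute of the `<failure>` node as discussed above:

```python
import xml.etree.ElementTree as ET

# All ids, names, and messages below are hypothetical examples.
testsuites = ET.Element("testsuites", id="20170720_090000",
                        name="physics_checks", tests="1", failures="1")
suite = ET.SubElement(testsuites, "testsuite", id="endf.physics.resolved",
                      name="Resolved-region physics checks",
                      tests="1", failures="1")
case = ET.SubElement(suite, "testcase",
                     id="endf.physics.rules.NegativeCrossSection",
                     name="Cross sections must be non-negative")
failure = ET.SubElement(case, "failure",
                        # the agreed-upon standard message for this check
                        message="Cross sections must be non-negative",
                        # our severity level goes in the type attribute
                        type="error")
# Arbitrary, code-dependent detail goes in the node body (invented here).
failure.text = "MT=102: negative cross section found near threshold"

print(ET.tostring(testsuites, encoding="unicode"))
```

Any code that follows the standard-message convention would produce a `message` attribute that downstream tools can match on, while keeping the node body free-form.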
There are many tools built to make JUnit XML more useful:
- Python:
  - junit-xml 1.7
  - xmlrunner
- Jenkins and Hudson CI systems automatically display JUnit XML results
- junit-viewer
Attendees:
- Jeremy Conlin
- Doro Wiarda
- Kenichi Tada
- Still need contributions to the list of errors. (Only LANL has contributed at this point.)
- Jeremy will create a simple example of what JUnit output may look like for our kind of tests.
- Each institution is responsible for providing Jeremy their prioritized list of rules.
Next meeting: August 29, 2017 at 9:00 AM MDT.