DISP‐S1 Gamma Acceptance Testing Instructions - nasa/opera-sds-pge GitHub Wiki
This page contains instructions for performing Acceptance Testing for the DISP-S1 Gamma delivery from the OPERA-ADT team. These instructions pertain to the latest version of the Gamma release, currently v4.2. These instructions assume the user has access to the JPL FN-Artifactory, and has Docker installed on their local machine.
The image is currently hosted on JPL FN-Artifactory, which requires JPL VPN access and JPL credentials. You may also need to be added to the gov.nasa.jpl.opera.adt organization.
Once you have access, the container tarball delivery is available under `general/gov/nasa/jpl/opera/adt/disp_s1/r4.2/gamma/dockerimg_disp_s1_gamma.tar`. Sample inputs and outputs are also available under `general/gov/nasa/jpl/opera/adt/disp_s1/r4.2/gamma/delivery_data_small.tar`. (Note: `delivery_data_full.tar` should not be used for the AT due to its long runtime.)
Documentation for the delivery is under `general/gov/nasa/jpl/opera/adt/disp_s1/r4.2/gamma/documents/`.
Download both files (the Docker image tarball and the sample data) to a location on your local machine. This location will be referred to throughout these instructions as <DISP_S1_DIR>.
Note that the sample data is almost 4 gigabytes, so the download from AF may take some time.
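If you prefer the command line to the Artifactory web UI, a download sketch like the following may work; the hostname, the `/artifactory/` prefix, and the authentication method are assumptions here, so adjust them to however you normally pull artifacts from FN-Artifactory:
# Hypothetical command-line download; the base URL and auth method are assumptions.
cd <DISP_S1_DIR>
curl -fL -O -u "<JPL_USERNAME>" "https://artifactory-fn.jpl.nasa.gov/artifactory/general/gov/nasa/jpl/opera/adt/disp_s1/r4.2/gamma/dockerimg_disp_s1_gamma.tar"
curl -fL -O -u "<JPL_USERNAME>" "https://artifactory-fn.jpl.nasa.gov/artifactory/general/gov/nasa/jpl/opera/adt/disp_s1/r4.2/gamma/delivery_data_small.tar"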
The first step in running the DISP-S1 image is to load it into Docker via the following command:
docker load -i <DISP_S1_DIR>/dockerimg_disp_s1_gamma.tar
This should add the Docker image to your local repository with the name `opera/disp-s1` and the tag `0.3.2`.
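You can confirm the load succeeded by listing the image; the repository and tag shown should match the name and tag above:
docker images opera/disp-s1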
Once the `delivery_data_small.tar` file is downloaded to your local machine, unpack it to <DISP_S1_DIR>:
tar -xvf delivery_data_small.tar
This will create a `delivery_data_small` directory within <DISP_S1_DIR> containing the following contents:
- config_files/
- dynamic_ancillary_files/
  - ionosphere_files/
  - ps_files/
  - static_layers/
  - troposphere_files/
  - dem.tif
  - watermask.tif
- golden_output/
  - forward/
    - compressed_slcs/
  - historical/
    - compressed_slcs/
- forward/
- input_slcs/
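As a quick sanity check after unpacking, the directory layout can be compared against the listing above with standard shell tools, e.g.:
# List directories two levels deep under the extracted delivery for comparison with the layout above.
find <DISP_S1_DIR>/delivery_data_small -maxdepth 2 -type d | sort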
In order to execute the SAS, the input file directory, runconfig, and an output location will be mounted into the container instance as Docker volumes. To help streamline this process, we recommend making the following changes to the `delivery_data_small` directory (a scripted sketch of these edits follows the list):
- Create a directory named `runconfig` just under `delivery_data_small`, and move the existing runconfig YAML files into it:
mkdir -p <DISP_S1_DIR>/delivery_data_small/runconfig
mv <DISP_S1_DIR>/delivery_data_small/config_files/*.yaml <DISP_S1_DIR>/delivery_data_small/runconfig/
NOTE: There will be two sets of runconfig files, one `historical` and one `forward`. These will be used as inputs for separate runs of the container.
- Update the `algorithm_parameters_file` setting in `runconfig_historical.yaml` to point at the version we just moved:
# REQUIRED: Path to file containing SAS algorithm parameters.
# Type: string.
algorithm_parameters_file: runconfig/algorithm_parameters_historical.yaml
- Update the `algorithm_parameters_file` setting in `runconfig_forward.yaml` to point at the version we just moved:
# REQUIRED: Path to file containing SAS algorithm parameters.
# Type: string.
algorithm_parameters_file: runconfig/algorithm_parameters_forward.yaml
- In both the moved `runconfig_forward.yaml` and `runconfig_historical.yaml`, configure the Ionosphere/Troposphere ancillary files:
# List of paths to TEC files (1 per date) in IONEX format for ionosphere correction. If none
# provided, ionosphere corrections are skipped.
# Type: array | null.
ionosphere_files:
- dynamic_ancillary_files/ionosphere_files/jplg0060.23i
- dynamic_ancillary_files/ionosphere_files/jplg1860.23i
- dynamic_ancillary_files/ionosphere_files/jplg3110.22i
- dynamic_ancillary_files/ionosphere_files/jplg3230.22i
- dynamic_ancillary_files/ionosphere_files/jplg3350.22i
- dynamic_ancillary_files/ionosphere_files/jplg3470.22i
# List of paths to troposphere weather model files (1 per date). If none provided,
# troposphere corrections are skipped.
# Type: array | null.
troposphere_files:
- dynamic_ancillary_files/troposphere_files/ERA5_N36_N41_W124_W118_20221107_14.grb
- dynamic_ancillary_files/troposphere_files/ERA5_N36_N41_W124_W118_20221201_14.grb
- dynamic_ancillary_files/troposphere_files/ERA5_N36_N41_W124_W118_20230506_14.grb
- dynamic_ancillary_files/troposphere_files/ERA5_N36_N41_W124_W118_20221119_14.grb
- dynamic_ancillary_files/troposphere_files/ERA5_N36_N41_W124_W118_20221213_14.grb
- Edit the moved versions of `runconfig_forward.yaml` and `runconfig_historical.yaml` to change the `threads_per_worker` setting from 16 to 1:
# Number of threads to use per worker. This sets the OMP_NUM_THREADS environment variable in
# each python process.
# Type: integer.
threads_per_worker: 1
- Also change the `n_parallel_bursts` setting from 9 to 1:
# If processing separate spatial bursts, number of bursts to run in parallel for wrapped-
# phase-estimation.
# Type: integer.
n_parallel_bursts: 1
- Edit the moved `algorithm_parameters_historical.yaml` to change the `n_parallel_jobs` setting from 4 to 1:
# Number of interferograms to unwrap in parallel.
# Type: integer.
n_parallel_jobs: 1
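If you would rather script the edits above than make them by hand, a sketch along these lines should work. It is only a convenience, not part of the delivery; it assumes GNU sed and that the YAML keys keep the default names and layout shown above, so spot-check the files afterwards:
# Hypothetical convenience script for the runconfig edits above (GNU sed; on macOS use sed -i '').
D=<DISP_S1_DIR>/delivery_data_small
# Point each runconfig at the algorithm parameters file now under runconfig/.
sed -i 's|^\( *algorithm_parameters_file:\).*|\1 runconfig/algorithm_parameters_historical.yaml|' "$D/runconfig/runconfig_historical.yaml"
sed -i 's|^\( *algorithm_parameters_file:\).*|\1 runconfig/algorithm_parameters_forward.yaml|' "$D/runconfig/runconfig_forward.yaml"
# Reduce the parallelism settings for the small test case.
sed -i 's|^\( *threads_per_worker:\).*|\1 1|;s|^\( *n_parallel_bursts:\).*|\1 1|' "$D/runconfig/runconfig_forward.yaml" "$D/runconfig/runconfig_historical.yaml"
sed -i 's|^\( *n_parallel_jobs:\).*|\1 1|' "$D/runconfig/algorithm_parameters_historical.yaml"
# The ionosphere_files/troposphere_files lists span multiple lines, so edit those by hand as shown above.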
Change directory into the `delivery_data_small/` directory:
cd <DISP_S1_DIR>/delivery_data_small
We're now ready to execute the 'forward' configuration of the DISP-S1 Gamma delivery. Run the following command to kick off execution with the test assets:
NOTE: the relative path to the runconfig file must be specified in the docker run command
docker run --rm --user $(id -u):$(id -g) \
--volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 disp-s1 run runconfig/runconfig_forward.yaml
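Optionally, the same command can be piped through tee to keep a log of the console output for later review (the log file name here is just an example):
docker run --rm --user $(id -u):$(id -g) \
--volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 disp-s1 run runconfig/runconfig_forward.yaml 2>&1 | tee forward_run.log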
The docker container will output progress messages as it runs, e.g.:
2024-04-15T15:55:48 - dolphin - INFO - Found SLC files from 2 bursts
...
Execution time for the small test case on opera-dev-pge was about 10 minutes.
When the docker run is finished, `scratch/forward/` and `output/forward/` directories will be created. The `output/forward` directory will contain the product file:
-rw-r--r-- 1 collinss cloud-user 67108864 May 20 19:10 20221107_20221213.unw.nc
There will also be several .png files and a `compressed_slcs/` directory:
-rw-r--r-- 1 collinss cloud-user 106318 May 20 19:10 20221107_20221213.unw.connected_component_labels.png
-rw-r--r-- 1 collinss cloud-user 331363 May 20 19:10 20221107_20221213.unw.interferometric_correlation.png
-rw-r--r-- 1 collinss cloud-user 107216 May 20 19:10 20221107_20221213.unw.persistent_scatterer_mask.png
-rw-r--r-- 1 collinss cloud-user 318023 May 20 19:10 20221107_20221213.unw.temporal_coherence.png
-rw-r--r-- 1 collinss cloud-user 320306 May 20 19:10 20221107_20221213.unw.unwrapped_phase.png
drwxr-xr-x 2 collinss cloud-user 134 May 20 18:54 compressed_slcs
The `compressed_slcs/` directory contains compressed .h5 files:
-rw-r--r-- 1 collinss cloud-user 137036879 May 20 18:54 compressed_t042_088905_iw1_20221107_20221119_20221213.h5
-rw-r--r-- 1 collinss cloud-user 137037418 May 20 18:54 compressed_t042_088906_iw1_20221107_20221119_20221213.h5
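For a quick look inside the generated product, the container's own Python environment can be used. This is a sketch, assuming h5py is installed in the image (the validation tool compares HDF5 contents, so it should be) and that the image accepts arbitrary commands the same way it accepts the disp-s1 invocations above:
# Print every group/dataset name in the forward product (h5py availability is an assumption).
docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 \
python -c "import h5py; f = h5py.File('output/forward/20221107_20221213.unw.nc', 'r'); f.visit(print)"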
If you are not already there, change directory into the `delivery_data_small/` directory:
cd <DISP_S1_DIR>/delivery_data_small
We're now ready to execute the 'historical' configuration of the DISP-S1 Gamma delivery.
Note: the relative path to the runconfig file must be specified in the docker run command
docker run --rm --user $(id -u):$(id -g) \
--volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 disp-s1 run runconfig/runconfig_historical.yaml
The docker container will output progress messages as it runs, e.g.:
2024-04-15T16:56:50 - dolphin - INFO - Found SLC files from 2 bursts
...
Execution time for the small test case on opera-dev-pge was about 15 minutes.
When the docker run is finished, `scratch/historical/` and `output/historical/` directories will be created. The `output/historical` directory will contain the following product files:
-rw-r--r-- 1 collinss cloud-user 100050 May 20 19:44 20221107_20221119.unw.connected_component_labels.png
-rw-r--r-- 1 collinss cloud-user 307122 May 20 19:44 20221107_20221119.unw.interferometric_correlation.png
-rw-r--r-- 1 collinss cloud-user 67108864 May 20 19:44 20221107_20221119.unw.nc
-rw-r--r-- 1 collinss cloud-user 107216 May 20 19:44 20221107_20221119.unw.persistent_scatterer_mask.png
-rw-r--r-- 1 collinss cloud-user 318023 May 20 19:44 20221107_20221119.unw.temporal_coherence.png
-rw-r--r-- 1 collinss cloud-user 317434 May 20 19:44 20221107_20221119.unw.unwrapped_phase.png
-rw-r--r-- 1 collinss cloud-user 100050 May 20 19:44 20221107_20221201.unw.connected_component_labels.png
-rw-r--r-- 1 collinss cloud-user 297956 May 20 19:44 20221107_20221201.unw.interferometric_correlation.png
-rw-r--r-- 1 collinss cloud-user 67108864 May 20 19:44 20221107_20221201.unw.nc
-rw-r--r-- 1 collinss cloud-user 107216 May 20 19:44 20221107_20221201.unw.persistent_scatterer_mask.png
-rw-r--r-- 1 collinss cloud-user 318023 May 20 19:44 20221107_20221201.unw.temporal_coherence.png
-rw-r--r-- 1 collinss cloud-user 308557 May 20 19:44 20221107_20221201.unw.unwrapped_phase.png
-rw-r--r-- 1 collinss cloud-user 106318 May 20 19:44 20221107_20221213.unw.connected_component_labels.png
-rw-r--r-- 1 collinss cloud-user 331363 May 20 19:44 20221107_20221213.unw.interferometric_correlation.png
-rw-r--r-- 1 collinss cloud-user 67108864 May 20 19:45 20221107_20221213.unw.nc
-rw-r--r-- 1 collinss cloud-user 107216 May 20 19:45 20221107_20221213.unw.persistent_scatterer_mask.png
-rw-r--r-- 1 collinss cloud-user 318023 May 20 19:44 20221107_20221213.unw.temporal_coherence.png
-rw-r--r-- 1 collinss cloud-user 320306 May 20 19:44 20221107_20221213.unw.unwrapped_phase.png
There will also be a `compressed_slcs/` directory containing the following:
-rw-r--r-- 1 collinss cloud-user 137036879 May 20 19:45 compressed_t042_088905_iw1_20221107_20221119_20221213.h5
-rw-r--r-- 1 collinss cloud-user 137037418 May 20 19:45 compressed_t042_088906_iw1_20221107_20221119_20221213.h5
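As a quick completeness check before validation, confirm that three `.unw.nc` products were produced for the historical run, matching the listing above:
# Should print 3, one product per date pair listed above.
ls <DISP_S1_DIR>/delivery_data_small/output/historical/*.unw.nc | wc -l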
Now that we've successfully executed the SAS container and generated outputs, the last step is to perform a QA check against the expected outputs.
A Python program to compare DISP-S1 products generated by the DISP-S1 SAS against expected outputs ("golden datasets") is included in the Docker image. The disp-s1 application can run a "validate" workflow that accepts two input files: the golden dataset and the test dataset.
The docker command to run this is:
docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 \
disp-s1 validate <path to golden dataset> <path to output dataset>
For example, if the SAS was run using the example command above and the result is in the output/ directory, the validation program can be run as follows:
docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
opera/disp-s1:0.3.2 \
disp-s1 validate golden_output/forward/20221119_20221213.unw.nc output/forward/20221119_20221213.unw.nc
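To validate the rest of the historical products as well, a loop along these lines can be used, assuming the golden historical products carry the same file names as the test outputs (run it from `<DISP_S1_DIR>/delivery_data_small`):
# Validate each historical product against its golden counterpart (matching file names assumed).
for nc in output/historical/*.unw.nc; do
  name=$(basename "$nc")
  docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
    opera/disp-s1:0.3.2 \
    disp-s1 validate "golden_output/historical/${name}" "output/historical/${name}"
done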
Currently the small test case does not pass validation due to how embedded strings are compared. However, this error should be considered benign for this release.
Small-size test case sample validation output:
2024-05-20T19:54:41 - disp_s1.validate - INFO - Comparing HDF5 contents...
2024-05-20T19:54:41 - disp_s1.validate - INFO - Checking connected component labels...
2024-05-20T19:54:42 - disp_s1.validate - INFO - Test unwrapped area: 61705513/62045298 (99.452%)
2024-05-20T19:54:42 - disp_s1.validate - INFO - Reference unwrapped area: 61705513/62045298 (99.452%)
2024-05-20T19:54:42 - disp_s1.validate - INFO - Intersection/Reference: 61705513/61705513 (100.000%)
2024-05-20T19:54:42 - disp_s1.validate - INFO - Intersection/Union: 61705513/61705513 (100.000%)
Traceback (most recent call last):
File "/opt/conda/bin/disp-s1", line 8, in <module>
sys.exit(cli_app())
^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/disp_s1/cli/validate.py", line 12, in validate
compare(golden, test, data_dset)
File "/opt/conda/lib/python3.11/site-packages/disp_s1/validate.py", line 485, in compare
compare_groups(hf_g, hf_t)
File "/opt/conda/lib/python3.11/site-packages/disp_s1/validate.py", line 59, in compare_groups
compare_groups(
File "/opt/conda/lib/python3.11/site-packages/disp_s1/validate.py", line 68, in compare_groups
_compare_datasets_attr(golden_dataset, test_dataset)
File "/opt/conda/lib/python3.11/site-packages/disp_s1/validate.py", line 106, in _compare_datasets_attr
raise ComparisonError(
disp_s1.validate.ComparisonError: /metadata/pge_runconfig dtypes do not match: |S7899 vs |S5603