Two Level DBM model in rodents and humans - GarzaLab/Documentation GitHub Wiki
For longitudinal data, we need to use `twolevel_dbm.py`. However, it can be used for cross-sectional data too.
This is based on the CobraLab pipeline but applied on ADA (LAVIS).
Copy all your input files to an input folder. These files should be the preprocessed outputs in `nii.gz` format.

```shell
rsync -avze ssh --progress --ignore-existing *input_files.nii.gz [email protected]:/path/to/input_folder/
```
Clone the pipeline where you want. You will add a path to your script anyway.
```shell
git clone https://github.com/cobralab/twolevel_ants_dbm
```
Make a bash script with all the preprocessing you need. This example is from the CobraLab.
```bash
#!/bin/bash
module load cobralab && module load minc-toolkit ANTs
export PATH=/home/m/mchakrav/egarza/twolevel_ants_dbm:$PATH

twolevel_dbm.py --jacobian-sigmas 0.27 0.14 \
    --modelbuild-command ${QUARANTINE_PATH}/twolevel-dbm/niagara2-antsMultivariateTemplateConstruction2.sh \
    --cluster-type=slurm \
    --rigid-model-target /home/m/mchakrav/egarza/Fischer344_nii_v4/Fischer344_template.nii \
    2level input.csv
```
Importantly, `--jacobian-sigmas` sets the smoothing sigmas applied to the output Jacobians. You need to calculate the sigmas from the desired FWHM, as follows:

```
FWHM  = 2.354820 * sigma = 1.359556 * rms
sigma = 0.57735027 * rms = 0.42466090 * fwhm
```
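The FWHM-to-sigma conversion can be scripted rather than done by hand; a minimal sketch (the 0.5 mm FWHM is only an example value, pick one appropriate for your data):

```shell
# Convert a desired smoothing FWHM into the sigma value expected by
# --jacobian-sigmas, using sigma = 0.42466090 * fwhm.
fwhm=0.5   # example FWHM in mm
sigma=$(awk -v f="$fwhm" 'BEGIN { printf "%.4f", f * 0.42466090 }')
echo "$sigma"   # prints 0.2123
```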
`--modelbuild-command` specifies the model-building script, which must accept the same arguments as `antsMultivariateTemplateConstruction2.sh`.

`--rigid-model-target` is your atlas or template. For rats we currently use the Fischer 344 template.
For this example, the mode is `2level`, meaning longitudinal in this case.

The `input.csv` is simple to build: each row is a subject, and each column is a timepoint (scan session). For example:
```
/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-10_ses-1_pp.nii.gz,/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-10_ses-2_pp.nii.gz
/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-11_ses-1_pp.nii.gz,/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-11_ses-2_pp.nii.gz
/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-5_ses-1_pp.nii.gz,/scratch/m/mchakrav/egarza/projects/DBM_long_test/input/sub-5_ses-2_pp.nii.gz
```
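With a consistent naming scheme, the CSV can be generated rather than typed by hand. A sketch, assuming BIDS-style names `sub-<ID>_ses-<N>_pp.nii.gz` (the `make_input_csv` helper is hypothetical, not part of the pipeline; adjust the pattern to your data):

```shell
# Hypothetical helper: pair each subject's session-1 and session-2
# preprocessed files into the two-column CSV rows the pipeline expects.
make_input_csv() {
    local dir=$1 s1 s2
    for s1 in "$dir"/sub-*_ses-1_pp.nii.gz; do
        s2=${s1/_ses-1_/_ses-2_}
        # Only emit a row when both sessions exist for the subject.
        if [ -f "$s2" ]; then
            echo "$s1,$s2"
        fi
    done
}

# Usage: make_input_csv /absolute/path/to/input > input.csv
```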
To actually run your script, it is recommended to use `tmux`. Run `tmux` in your command line, then launch the script:

```shell
tmux
bash twolevel_dbm.sh
```
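Since the pipeline requires absolute paths and will fail late if a file is missing, a quick sanity check before launching can save a queue cycle. A sketch (the `check_input_csv` helper is hypothetical, not part of the pipeline):

```shell
# Hypothetical sanity check: verify every path in the input CSV is
# absolute and points to an existing file, as twolevel_dbm.py requires.
check_input_csv() {
    local csv=$1 status=0 p
    while IFS=, read -r -a row; do
        for p in "${row[@]}"; do
            case $p in
                /*) ;;                                   # absolute: OK
                *)  echo "not absolute: $p"; status=1 ;;
            esac
            if [ ! -f "$p" ]; then
                echo "missing file: $p"
                status=1
            fi
        done
    done < "$csv"
    return $status
}

# Usage: check_input_csv input.csv && bash twolevel_dbm.sh
```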
Full Help:

```
usage: twolevel_dbm.py [-h]
                       [--jacobian-sigmas JACOBIAN_SIGMAS [JACOBIAN_SIGMAS ...]]
                       [--rigid-model-target RIGID_MODEL_TARGET]
                       [--resample-to-common-space RESAMPLE_TO_COMMON_SPACE]
                       [--skip-dbm] [--dry-run] [--N4] [--metric METRIC]
                       [--transform {SyN,BSplineSyN,TimeVaryingVelocityField,TimeVaryingBSplineVelocityField,Affine,Rigid}]
                       [--reg-iterations REG_ITERATIONS]
                       [--reg-smoothing REG_SMOOTHING]
                       [--reg-shrinks REG_SHRINKS] [--float]
                       [--average-type {mean,normmean,median}]
                       [--gradient-step GRADIENT_STEP]
                       [--model-iterations MODEL_ITERATIONS]
                       [--modelbuild-command MODELBUILD_COMMAND]
                       [--cluster-type {local,sge,pbs,slurm}]
                       [--walltime WALLTIME] [--memory-request MEMORY_REQUEST]
                       [--local-threads LOCAL_THREADS]
                       {1level,2level} input

This pipeline performs one or two level model building on files using
antsMultivariateTemplateConstruction2.sh and generates smoothed jacobian
determinant fields suitable for deformation based morphometry (DBM)
analysis.

positional arguments:
  {1level,2level}       What type of DBM processing to run on input file, see
                        input for details on how to format CSV file for
                        different types.
  input                 Input CSV file for DBM, for 1level mode, a single
                        column, for 2level, each row constructs a first
                        level model followed by a second level model of the
                        resulting first level averages. File paths must be
                        absolute.

optional arguments:
  -h, --help            show this help message and exit
  --jacobian-sigmas JACOBIAN_SIGMAS [JACOBIAN_SIGMAS ...]
                        List of smoothing sigmas used for final output,
                        defaults to 2x finest resolution input file or rigid
                        model target if provided. (default: None)
  --rigid-model-target RIGID_MODEL_TARGET
                        Target file to use for rigid registration of the
                        second level, otherwise unbiased average to start
                        (default: None)
  --resample-to-common-space RESAMPLE_TO_COMMON_SPACE
                        Target nifti file of atlas space to resample the
                        jacobians to after unbiased model build, typically an
                        MNI model, triggers a registration to this target
                        (default: None)
  --skip-dbm            Skip generating DBM outputs (default: False)
  --dry-run             Don't run commands, instead print to stdout (default:
                        False)

advanced options:
  --N4, --no-N4         Run N4BiasFieldCorrection during model build on input
                        files. (default: False)
  --metric METRIC       Specify metric used for non-linear template stages
                        (default: CC[4])
  --transform {SyN,BSplineSyN,TimeVaryingVelocityField,TimeVaryingBSplineVelocityField,Affine,Rigid}
                        Transformation type used during model build (default:
                        SyN)
  --reg-iterations REG_ITERATIONS
                        Max iterations for non-linear stages (default:
                        100x100x70x20)
  --reg-smoothing REG_SMOOTHING
                        Smoothing sigmas for non-linear stages (default:
                        3x2x1x0)
  --reg-shrinks REG_SHRINKS
                        Shrink factors for non-linear stages (default:
                        6x4x2x1)
  --float, --no-float   Run registration with float (32 bit) or double (64
                        bit) values (default: True)
  --average-type {mean,normmean,median}
                        Type of average used during model build, default
                        normalized mean (default: normmean)
  --gradient-step GRADIENT_STEP
                        Gradient step size at each iteration during model
                        build (default: 0.25)
  --model-iterations MODEL_ITERATIONS
                        How many registration and average rounds to do
                        (default: 3)
  --modelbuild-command MODELBUILD_COMMAND
                        Command to use for performing model build, must accept
                        same arguments as
                        antsMultivariateTemplateConstruction2.sh (default:
                        antsMultivariateTemplateConstruction2.sh)

cluster options:
  --cluster-type {local,sge,pbs,slurm}
                        Choose the type of cluster system to submit jobs to
                        (default: local)
  --walltime WALLTIME   Option for job submission specifying requested time
                        per pairwise registration. (default: 20:00:00)
  --memory-request MEMORY_REQUEST
                        Option for job submission specifying requested memory
                        per pairwise registration. (default: 8gb)
  --local-threads LOCAL_THREADS, -j LOCAL_THREADS
                        For local execution, how many subject-wise modelbuilds
                        to run in parallel, defaults to number of CPUs
                        (default: 4)
```
`twolevel_dbm.py` produces three types of log Jacobian determinant images from the model builds: `nlin`, `relative`, and `absolute`. The `nlin` files are the raw registration warp fields converted to Jacobians; the `relative` files have residual affine components of the warp field removed using `ANTSUseDeformationFieldToGetAffineTransform`; and the `absolute` files have the affine Jacobian added to the `nlin` to account for bulk volume changes. The `relative` and `absolute` files are generally the ones used for statistical analysis.
For two-level pipelines, Jacobians are produced for two different transformations: `resampled` files are within-subject Jacobians resampled into the final average anatomical space, while `overall` files are Jacobians encoding all volumetric differences between the individual subject input files and the final anatomical average. For most applications the `resampled` Jacobians are suggested for analysis.
In all cases the values of the log jacobians are to be interpreted as follows:
- positive values indicate that the voxel in template space must be expanded to get to the subject space, i.e. the subject voxel is larger than the template voxel
- negative values indicate that the voxel in template space must be reduced to get to the subject space, i.e. the subject voxel is smaller than the template voxel
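As a quick numeric illustration of this interpretation (the 0.10 value is made up):

```shell
# A log-Jacobian of 0.10 means the subject voxel is exp(0.10) ~ 1.105
# times the template voxel, i.e. about 10.5% larger; negative values
# give factors below 1, i.e. a smaller subject voxel.
logj=0.10
awk -v j="$logj" 'BEGIN { printf "%.1f%%\n", (exp(j) - 1) * 100 }'   # prints 10.5%
```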
First we need to create a mask for the analysis. All `relative` and `absolute` outputs are in common space, which is the average of your images, unless you define `--resample-to-common-space`, which is neither necessary nor recommended here. To create the mask, follow these steps:
You may have noticed in your outputs from the twolevel_ants_dbm that the mask provided is an Otsu mask, which is not appropriate for use in your model. Here we will walk through the process of registering a mask to the template produced by your DBM run. As an example, I will be using mouse brains, but the procedure is the same for rats and humans.
What you need (all files must be NIfTIs):

- a model average, such as the Fischer atlas template, henceforth referred to as "model"
- a mask associated with the model, such as the Fischer atlas mask, henceforth referred to as "mask"
- the template produced from your DBM run, which can be found within your run directory at `./output/secondlevel/secondlevel_template0.nii.gz`, henceforth referred to as "template"
```shell
module load minc-toolkit
```
Step 1: Register your model to your template. Here is the command:

```shell
antsRegistrationSyN.sh -d 3 -f <template> -m <model> -o model_to_template -j 1 -p f
```
- `-d 3` signifies the images are 3-dimensional
- `-f` means your template is the fixed image
- `-m` means your model is the moving image
- `-o` means the outputs will have the prefix "model_to_template"
- `-j 1` means the registration will use histogram matching
- `-p f` means the precision will be float

Bonus mini-lesson: in my case, the model is much higher resolution than my template (40 microns vs 70 microns). By using the lower-resolution image as the fixed image, we conserve computation time, which should be fine for a mask. In theory you could also swap the fixed and moving images and then apply the inverse of the transform in Step 2, but for simplicity and to avoid waste, we will do it this way.
My call:

```shell
antsRegistrationSyN.sh -d 3 -f secondlevel_template0.nii.gz -m /home/egarza/atlases/Rat/Fischer344_nii_v3/Fischer344_nii_v3/Fischer344_template.nii -o Fischer_to_template -j 1 -p f
```
Step 2: Apply the transformations from Step 1 to the mask so that it fits the template. Command:

```shell
antsApplyTransforms -d 3 -i <mask> -o mask_on_template.nii -t model_to_template1Warp.nii.gz -t model_to_template0GenericAffine.mat -n GenericLabel --verbose -r <template>
```
- `-d 3` signifies the images are 3-dimensional
- `-i` is the input mask
- `-o` is the name of the output file, which will be the mask on the template
- `-t` are your transform files
- `-n GenericLabel` selects the GenericLabel interpolator, which is the best interpolator for discrete data (added by the venerable Gabriel A. Devenyi)
- `--verbose` provides terminal outputs
- `-r` is your reference image
My call:

```shell
antsApplyTransforms -d 3 -i /home/egarza/atlases/Rat/Fischer344_nii_v3/Fischer344_nii_v3/Fischer344_mask.nii -o Mask_on_template.nii.gz -t Fischer_to_template1Warp.nii.gz -t Fischer_to_template0GenericAffine.mat -n GenericLabel --verbose -r secondlevel_template0.nii.gz
```
Once you have run both of these, you can convert the output, `mask_on_template.nii`, to a `.mnc` with `nii2mnc` and use it in your statistical analyses.
View your mask on your template average:

```shell
Display <template.mnc> -label mask_on_template.mnc
```
Switch the background image to greyscale.
If your mask seems to fall over the edge a bit, like the example, you may need to erode it.
In the terminal, type:

```shell
mincmorph -erosion mask_on_template.mnc eroded_mask.mnc
```
This will perform a single erosion on the mask and output the file eroded_mask.mnc.
You can then display this the same way and visually compare the results!