CONN Pipeline - mind-lab-bos/MBI_Project_MRI_Analysis GitHub Wiki
How to create a new CONN project with fMRIPrep output
Preparing musbid files
- Drag /Dropbox/Music_MCI/MCI_Testing_Materials/fMRI_Battery/mus-bid/*_mus-bid_biddata.txt to /Dropbox/Music_MCI/NU_MCI_Behavioral_data/MRI_mus-bid_task if you haven’t done so already
- Copy subs’ *mus-bid_biddata.txt files into /Dropbox/Music_MCI/NU_MCI_Behavioral_data/MRI_mus-bid_task/conn/pre or /post
- Run the script that creates the onsets_conn_pre.csv and onset_conn_post.csv files: /Dropbox/Music_MCI/NU_MCI_Behavioral_data/MRI_mus-bid_task/scripts/musbid_automated_for_conn.R (this imports all subs' conditions into CONN from a single .csv file)
- CONN needs the .csv to contain exactly these columns: condition, sub, session (1 for pre, 2 for post), onsets (separated by spaces), and duration (in seconds)
- Upload output onsets_conn_pre.csv and onset_conn_post.csv files to the mci_conn folder you are in on Discovery
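The real onsets files come from musbid_automated_for_conn.R, but the layout CONN expects can be sketched in a few lines. The condition names, subject number, and onset times below are made up for illustration only:

```python
import csv, io

# Illustrative sketch of the CSV layout CONN expects; the actual file is
# produced by musbid_automated_for_conn.R. All values below are made up.
rows = [
    # condition, sub, session (1 = pre, 2 = post), onsets (space-separated, s), duration (s)
    {"condition": "music",   "sub": 1, "session": 1, "onsets": "10 70 130",  "duration": 30},
    {"condition": "silence", "sub": 1, "session": 1, "onsets": "40 100 160", "duration": 30},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["condition", "sub", "session", "onsets", "duration"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

If an import fails, the first thing to check is that the header row contains these five columns and nothing else.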
Open a new MATLAB session on Discovery
Paste the following to MATLAB console:
cd /work/mindlab/Projects/GammaMBI/MRI/Analyses/mci_conn
addpath('/work/mindlab/Programs/spm12')
addpath('/work/mindlab/Programs/conn')
conn
Preparing 1st-level covariates files
- Open /work/mindlab/Projects/GammaMBI/MRI/Analyses/mci_conn/sep_1stlevelcov_musbid.m on MATLAB
- Run the script; if no error, you should be able to find “sub-*_musbid_realignment.tsv”, “sub-*_musbid_QCtimeseries.tsv”, “sub-*_musbid_wcompcor.tsv”, and “sub-*_musbid_ccompcor.tsv” in each subject’s func folder
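The check above can be automated. This is a minimal sketch, not part of the pipeline: the function name and the func-folder path you pass in are placeholders, while the four file-name suffixes come from the script's expected output:

```python
from pathlib import Path

# Suffixes of the four covariate TSVs sep_1stlevelcov_musbid.m should produce.
EXPECTED = ["realignment", "QCtimeseries", "wcompcor", "ccompcor"]

def missing_covariates(func_dir, sub):
    """Return which expected <sub>_musbid_<name>.tsv files are absent from func_dir."""
    func_dir = Path(func_dir)
    return [name for name in EXPECTED
            if not (func_dir / f"{sub}_musbid_{name}.tsv").exists()]
```

For example, `missing_covariates("/path/to/sub-01/func", "sub-01")` returns an empty list when all four files are present.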
Create a “New” project in CONN
Save the conn_*.mat file in the /work/mindlab/Projects/GammaMBI/MRI/Analyses/mci_conn folder you’re in
Good practice: save after every step (MATLAB sessions time out or crash quite often)
Setup
- Tools —> Convert / Import —> from fMRIPrep dataset
- Navigate to /work/mindlab/NUBIC/MCI_Study/dcm2bids/BIDS/derivatives/fmriprep/pre folder in the rightmost window
- Select the subs you want in the leftmost window (only select subs with both pre & post data; use ctrl/shift to multi-select)
- Select “ses-1_task_musbid” in the middle window
- Select “import selected files” on the bottom rather than the default (copy to DERIVATIVES folder and import)
- Click “Import”
Session 1 structural, functional, ROI, and 1st-level covariates files should have been imported automatically. Import session 2 files manually by following these steps. Note that these steps apply to all of the following sections: structural, functional, ROI, and 1st-level covariates (wherever I say "import", this is how you do it).
- Select all subjects in the leftmost window (ctrl+A)
- Select “session 2” in the middle window
- Navigate to /work/mindlab/NUBIC/MCI_Study/dcm2bids/BIDS/derivatives/fmriprep/post folder in the rightmost window
- In “filter” underneath, paste the part of the file name that is consistent across subs, click “Find”, scroll down, select all files (using ctrl/shift)
- Click “Import”
Create a list of subs, labelled with CONN subject number, for your own reference
Basic
- Number of subjects (should be automatically set)
- Number of sessions or runs (per subject) = 2
- Repetition Time (seconds) = 0.475
- Acquisition Type = Continuous (should also be automatically set)
Structural
- Select “Multiple anatomical scans per subject”
- Import session 2 scans by searching for “_ses-3_space-MNI152NLin2009cAsym_desc-preproc_T1w.nii.gz”
Functional
- Import session 2 scans by searching for “_ses-3_task-musbid_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz”
ROI
- Import Grey Matter, White Matter, and CSF by searching for the following, respectively:
- _ses-3_space-MNI152NLin2009cAsym_label-GM_probseg.nii.gz
- _ses-3_space-MNI152NLin2009cAsym_label-WM_probseg.nii.gz
- _ses-3_space-MNI152NLin2009cAsym_label-CSF_probseg.nii.gz
- For all ROIs, select “Compute average BOLD signal within each region” on the top and “from unsmoothed volumes (secondary dataset)” underneath (should be the default; smoothing should only be applied after extracting ROI signals, for the voxel-wise FC analysis)
- For GM, WM, and CSF, check “subject-specific ROI” and “session-specific ROI” on the bottom
- For networks and atlases, select “Atlas file” on the bottom
- Do not check “regress out covariates” before computing the average BOLD signal; covariates are regressed out during denoising, and regressing them here as well would double-regress
- To import your own ROIs in addition to the CONN default atlases:
- Click on “+” on the bottom left, search for your own ROI file in the rightmost window and import your newly created ROI .nii files
- Leave “mask with grey matter” unchecked (not all imported ROIs overlap with CONN’s GM mask; for those that don’t, e.g. subcortical ROIs, CONN would remove the signal from the non-overlapping regions)
Conditions
- Highlight and delete all preexisting conditions
- Condition tools —> Import condition info from text file(s) —> CONN-legacy (single .txt or .csv file) —> select the *pre.csv file for session 1 and *post.csv for session 2
- Choose “Allow missing data” on the bottom right
- Click “+”, highlight all subs, highlight “Session 1”, name it as “pre”, select “condition spans entire selected session(s)”; for “Session 2”, select “condition is not present in selected session(s)”
- Click “+”, highlight all subs, highlight “Session 1”, name it as “post”, select “condition is not present in selected session(s)”; for “Session 2”, select “condition spans entire selected session(s)”
Covariates (1st-level)
- Import realignment, QC_timeseries, QC_w_comp_cor, and QC_c_comp_cor by highlighting the corresponding variable on the left and searching for the following on the right:
- _musbid_realignment.tsv
- _musbid_QCtimeseries.tsv
- _musbid_wcompcor.tsv
- _musbid_ccompcor.tsv
- Click “— covariate tools —“ on the bottom left —> Compute new/derived first-level covariates —> Compute ‘scrubbing’: thresholded list-of-outliers covariate —> leave the first 3 fields unchanged and change the name of the output to “scrubbing_conn” (fMRIPrep’s scrubbing threshold is too stringent and scrubs out too many scans, so we recompute scrubbing using CONN’s threshold)
- Delete everything except realignment, QC_c_comp_cor, QC_w_comp_cor, and scrubbing_conn
Covariates (2nd-level)
- Click “+” to create a variable called “gMBI” and another called “HEI”
- For gMBI, in Values, type in a series of 1s for gMBI subs and 0s for HEI subs
- For HEI, do the reverse - type in 0s for gMBI and 1s for HEI
- The total number of 1s & 0s should match the number of subjects in your project
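The two indicator vectors are just complements of each other, so it can help to generate them from a single ordered group list rather than typing both by hand. A minimal sketch, with a hypothetical group assignment (the list must be in CONN subject-number order):

```python
# Hypothetical group assignment, one entry per CONN subject, in CONN order.
groups = ["gMBI", "HEI", "gMBI", "HEI", "HEI"]

# gMBI gets 1 for gMBI subs and 0 for HEI subs; HEI is the complement.
gMBI = [1 if g == "gMBI" else 0 for g in groups]
HEI  = [1 - v for v in gMBI]

# Each vector's length must equal the number of subjects in the project.
assert len(gMBI) == len(HEI) == len(groups)
print("gMBI:", " ".join(map(str, gMBI)))
print("HEI: ", " ".join(map(str, HEI)))
```

Paste the printed series of 1s and 0s into the Values field for each variable.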
Options
- Enabled analyses: ROI-to-ROI, Seed-to-Voxel (for my analyses, you can check more as you need)
- Analysis space (voxel-level) -> Volume: same as functionals (we use 3mm voxels rather than the default which is 2mm)
- Optional output files -> Create confound-corrected time-series (to output denoised data)
Preprocessing (click on button on the bottom)
- Add -> functional Smoothing (spatial convolution with Gaussian kernel)
- Process functional/structural data; Process all subjects & sessions
- Distributed processing (run on Slurm computer cluster) -> Start
- Number of jobs = num of subs needing to be preprocessed (no more than 50 at a time)
- Smoothing kernel: 8mm
- After these jobs are done, check the subjects’ func folders to confirm the “ssub-*” files have been modified
- Save the project, then click “Done” -> “do not overwrite”, “run on slurm”
Note: Slurm jobs keep running even if the MATLAB session times out. Do NOT click “Done” while data are being preprocessed (we learned that the hard way).
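Checking that the “ssub-*” files were freshly written can be scripted. This sketch is not part of the pipeline; the function name, the func-folder path you pass in, and the 24-hour freshness window are all placeholders:

```python
import time
from pathlib import Path

def recently_smoothed(func_dir, max_age_hours=24):
    """List "ssub-*" files in func_dir modified within the last max_age_hours."""
    now = time.time()
    return [p.name for p in Path(func_dir).glob("ssub-*")
            if now - p.stat().st_mtime < max_age_hours * 3600]
```

An empty result for a subject that should have been smoothed means the Slurm job likely failed or has not finished.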
Denoising
The Confounds window includes all the noise covariates you will regress out during denoising. If you hover over the Confounds window, you will see an “all effects” window, which contains variables you don’t care about. You can move the variables from one window to another using the arrow in between.
- Keep “realignment” and “scrubbing_conn”
- Unselect “White Matter” and “CSF”
- Select QC_w_comp_cor, QC_c_comp_cor
- Select all the conditions of interest (e.g., Effect of bp_pre; do not select conditions you're not comparing in this analysis)
- Select session constant (“Effect of pre” and “Effect of post”)
- Highlight all the condition & session effects, click on “add 1st order derivatives” and change it to “no temporal expansion” (this is because our protocol uses block design so each condition/trial lasts quite long)
- Highlight QC_w/c_comp_cor, under “Confound dimensions”, replace Inf with however many PCs you wish to include (rule of thumb is 5 but you can customize based on your own needs)
- Under Band-pass filter (Hz):
- change bandpass filter to [0.008 Inf] (we only need high-pass filter (lower bound) for task-based fMRI; we need both lower and upper bounds for resting)
- change "After regression (RegBP)" into "Simultaneous (simult)" (so that we bandpass filter while regressing out covariates, rather than after, b/c fmriprep covariates are already bandpass filtered)
- select no linear detrending
- Look at distribution on the right, see if the greenish yellow curve is normally distributed and centered around 0 (if you are only interested in comparing across conditions rather than the absolute correlation values, it's OK if it's not centered around 0; we had trouble centering it around 0 using fmriprep output, but it's all good with conn output - we don't yet know why)
- Regressors you include during denoising are subject to change according to your needs, but whatever you decide, it should be well-documented and justifiable
- CONN default: realignment (12P including 1st-order derivatives), WM (5P: global signal + 4 PCs), CSF (5P: global signal + 4 PCs), scrubbing (varies across participants), session effect (2P including 1st-order derivative), task effect (2P including 1st-order derivative)
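As a back-of-envelope check on your design, the fixed part of that default model can be tallied directly from the counts above (scrubbing is excluded because its dimension varies per participant):

```python
# Parameter counts taken from the CONN default confound list above.
fixed = {
    "realignment (6 params + 1st-order derivatives)": 12,
    "White Matter (global signal + 4 PCs)": 5,
    "CSF (global signal + 4 PCs)": 5,
    "session effect (+ 1st-order derivative)": 2,
    "task effect (+ 1st-order derivative)": 2,
}
print(sum(fixed.values()))  # 26 fixed regressors, plus scrubbing per participant
```

Keep this total in mind relative to your number of scans per session when adding regressors.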
How to add more subjects to an existing CONN project
(Do this only after the new subject has finished post-test)
Preparing musbid files (same as above)
Open a new MATLAB session on Discovery (same as above)
Preparing 1st-level covariates files
- Open /work/mindlab/Projects/GammaMBI/MRI/Analyses/mci_conn/sep_1stlevelcov_musbid.m on MATLAB
- Edit the subList variable to include only the newly added subs; comment out the original subList line of code and uncomment the new subList line (to avoid recomputing for existing subs)
- Run the script; if no error, you should be able to find “sub-*_musbid_realignment.tsv”, “sub-*_musbid_QCtimeseries.tsv”, “sub-*_musbid_wcompcor.tsv”, and “sub-*_musbid_ccompcor.tsv” in each new subject’s func folder
Open an existing project in CONN
Open /work/mindlab/Projects/GammaMBI/MRI/Analyses/mci_conn/conn_*.mat (project file you’ve created)
Again, good practice: save after every step
Setup
- Tools —> Convert / Import —> from fMRIPrep dataset
- Navigate to /work/mindlab/NUBIC/MCI_Study/dcm2bids/BIDS/derivatives/fmriprep/pre folder in the rightmost window
- Select the new subs you want to add in the leftmost window
- Select “ses-1_task_musbid” in the middle window
- Select “import selected files” on the bottom rather than the default (copy to DERIVATIVES folder and import)
- Click “Import”
The number of subjects should increase automatically
Session 1 structural, functional, ROI, and 1st-level covariates files should have already been imported. Manually import session 2 files.
- Select your newly added subjects in the leftmost window (ctrl/shift to multi-select)
- Select “session 2” in the middle window
- Navigate to /work/mindlab/NUBIC/MCI_Study/dcm2bids/BIDS/derivatives/fmriprep/post folder in the rightmost window
- In “filter” underneath, paste the part of the file name that is consistent across subs, click “Find”, scroll down, and select all files (using ctrl/shift)
- Click “Import”
Add the new subs to your subject list, in the same order as in CONN
Basic
- All fields should be set automatically
Structural
- Import session 2 scans by searching for “_ses-3_space-MNI152NLin2009cAsym_desc-preproc_T1w.nii.gz”
Functional
- Import session 2 scans by searching for “_ses-3_task-musbid_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz”
ROI
- Import Grey Matter, White Matter, CSF by searching for the following respectively
- _ses-3_space-MNI152NLin2009cAsym_label-GM_probseg.nii.gz
- _ses-3_space-MNI152NLin2009cAsym_label-WM_probseg.nii.gz
- _ses-3_space-MNI152NLin2009cAsym_label-CSF_probseg.nii.gz
Conditions
- Condition tools —> Import condition info from text file(s) —> CONN-legacy (single .txt or .csv file) —> select the *pre.csv file for session 1 and *post.csv for session 2
Covariates (1st-level)
- Import realignment, QC_timeseries, QC_w_comp_cor, and QC_c_comp_cor by highlighting the corresponding var on the left and searching for the following on the right:
- _musbid_realignment.tsv
- _musbid_QCtimeseries.tsv
- _musbid_wcompcor.tsv
- _musbid_ccompcor.tsv
- _musbid_acompcor.tsv
- Click on “— covariate tools —“ on the bottom left —> Compute new/derived first-level covariates —> Compute ‘scrubbing’: thresholded list-of-outliers covariate —> leave first 3 the same, change name of the output to “scrubbing_conn”
- Delete everything else but realignment, QC_c_comp_cor, QC_w_comp_cor, QC_a_comp_cor, and scrubbing_conn
Covariates (2nd-level)
- Select the gMBI variable, change the newly added subjects’ values into 1 if gMBI, or 0 if HEI
- Select the HEI variable, change the newly added subjects’ values into 0 if gMBI, or 1 if HEI
Preprocessing (click on button on the bottom)
- Keep “functional Smoothing (spatial convolution with Gaussian kernel)” as the only preprocessing step
- Process functional/structural data; Process selected subjects (select your new subs)
- Distributed processing (run on Slurm computer cluster) -> Start
- Number of jobs = num of new subs needing to be preprocessed
- Smoothing kernel: 8mm
- After preprocessing is done, check the new subjects’ func folder, see if “ssub-*” file has been modified
- Click “Done” -> “do not overwrite”, “distributed processing (run on Slurm)”
Denoising
- Check whether the new subs’ distribution is centered around 0
- Click “Done” -> “do not overwrite”, “distributed processing (run on Slurm)”