help > Removing scans after preprocessing
Jan 28, 2015  08:01 PM | Julian Cheng - UCSF
Removing scans after preprocessing
We currently have our analysis pipeline set up this way:

1. Use SPM8 to preprocess the resting state images (slice timing, realign, smooth, resample)
2. Use CONN toolbox to compute single subject connectivity maps
3. Use SPM8 to create a group model

However, we haven't had much luck getting the results we expected, so we want to look at data scrubbing to see if that will help us reduce motion-related noise.
We are interested in implementing the methods detailed by Power et al. (Spurious but systematic correlations in functional connectivity MRI networks arise from subject motion) but have run into a problem.
(The method in the paper is to calculate FD and DVARS and exclude scan volumes that exceed some threshold; the details aren't important, but the end result is that we now have a list of scan volumes with bad data that need to be excluded from analysis.)
We've figured out how to implement the FD and DVARS calculations mentioned in the paper, but don't know how to work them into our processing pipeline.
We know we can't just exclude the scans outright, because the band-pass filter in CONN's preprocessing step won't work on discontinuous data, so the scans somehow need to be removed/excluded after preprocessing (CONN) but before analysis (also CONN).
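(For reference, here is a minimal sketch of the FD side of that calculation as we understand it from the paper; the rp_*.txt filename and the 0.5 mm threshold are just placeholders for our setup:)

```matlab
% Framewise displacement (Power et al., 2012) from SPM realignment
% parameters (rp_*.txt: 3 translations in mm, 3 rotations in radians).
rp = load('rp_subject1_session1.txt');   % [nscans x 6] motion parameters
rp(:,4:6) = rp(:,4:6) * 50;              % rotations -> arc length on a 50 mm sphere
d  = [zeros(1,6); diff(rp)];             % backward differences; FD of first scan = 0
FD = sum(abs(d), 2);                     % [nscans x 1] framewise displacement in mm
bad = find(FD > 0.5);                    % indices of scans exceeding the threshold
```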
Thus, my question is this: is there a way to specify a vector of values indicating scans to omit from analysis in the data structure passed to conn_batch()?

PS. If you're wondering why we use SPM for the group level: we don't normalize during SPM's preprocessing, so we have to do it just before the group-level analysis. Personally, I'm not completely sold on this approach (I inherited the processing scripts), and I welcome any suggestions/comments on modifying the processing pipeline as well.

Thanks!
Jan 29, 2015  02:01 AM | Alfonso Nieto-Castanon - Boston University
RE: Removing scans after preprocessing
Hi Julian,

The simplest way to do this is to define a new first-level covariate (e.g. named 'outliers'), point it to a series of .mat files identifying the outlier scans for each subject and session (see below), and then enter this covariate as an additional 'confounding effect' during the Denoising step. This is exactly the procedure CONN follows when ART is used to generate a similar list of outlier scans and those effects are then removed from consideration during Denoising. Each .mat file (one file per subject and session) should simply contain one variable (its name is not important) dummy-coding the identified outlier scans for that subject/session. For example, if a session has 200 scans and you want to remove scans 10 and 20, create an all-zeros matrix with 200 rows and 2 columns, then set to 1 the 10th row of the first column and the 20th row of the second column, e.g.:

 outliers = [10 20];   % indices of the scans to remove
 nscans   = 200;       % total number of scans in this session
 X = full(sparse(outliers, 1:numel(outliers), 1, nscans, numel(outliers)));  % one dummy-coded column per outlier scan
 save outliers_Subject1_Session1.mat X;
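These files can then be entered from the GUI, or scripted; a sketch of the batch version (field names as in the CONN batch documentation; the project filename, confound list, and subject/session indexing below are placeholders you would adapt to your study) would look something like:

```matlab
% Register the 'outliers' covariate and use it as a Denoising confound.
clear batch;
batch.filename = 'conn_project.mat';                    % your existing CONN project
batch.Setup.covariates.names = {'outliers'};
batch.Setup.covariates.files{1}{1}{1} = 'outliers_Subject1_Session1.mat';  % {ncovariate}{nsubject}{nsession}
batch.Setup.done = 1; batch.Setup.overwrite = 0;        % update Setup without overwriting
batch.Denoising.confounds.names = {'White Matter','CSF','realignment','outliers'};
batch.Denoising.done = 1;
conn_batch(batch);
```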

Hope this helps
Alfonso
Jan 29, 2015  06:01 PM | Julian Cheng - UCSF
RE: Removing scans after preprocessing
Thanks Alfonso, your detailed explanation is very helpful!
Nov 10, 2020  09:11 AM | Zahra Mor
RE: Removing scans after preprocessing
Dear Conn experts,
Excuse my naive question!
How do we calculate DVARS (Power, 2012) in Conn?
I have the mean GSchange as a 2nd-level covariate, but cannot figure out how to step back to each subject's global signal change, so that (maybe) I could use this subject-level GSchange to calculate DVARS?
Thanks in advance!