Sep 26, 2018  07:09 PM | Pravesh Parekh - National Institute of Mental Health and Neurosciences
RE: Potential bugs in preprocessing pipeline
Dear Dr. Alfonso,

Thank you for your reply. It is always a learning experience to read your thoughts. Thank you for that (and, of course, many thanks for Conn!).

Thank you for the link to the work in progress. I have downloaded the experimental release and will try it out (I look forward to Conn 18b!). Apart from the ongoing discussion about slice timing correction, I have also seen some strange behaviour when using the indirect normalization pipeline. I have appended details of that below the slice timing correction discussion.

Regarding slice timing correction:
Say that my data has 42 slices (acquired in an ascending manner). The JSON file associated with the data has the following slice timing:

[2.225; 2.17; 2.1175; 2.0625; 2.0075; 1.9525; 1.9; 1.845; 1.79; 1.7375; 1.6825; 1.6275; 1.5725; 1.52; 1.465; 1.41; 1.3575; 1.3025; 1.2475; 1.1925; 1.14; 1.085; 1.03; 0.9775; 0.9225; 0.8675; 0.8125; 0.76; 0.705; 0.65; 0.5975; 0.5425; 0.4875; 0.4325; 0.38; 0.325; 0.27; 0.2175; 0.1625; 0.1075; 0.0525; 0]

Timings for a few important slices:
first slice = 0;
last slice = 2.225
middle slice (i.e. slice number 21) = 1.1400
mean timing = 1.1121

When I specify the order as ascending in the GUI and check the SPM batch, the reference slice number is 21. However, when I specify BIDS and check the batch, the reference slice (actually a timing) is 1.1121. This corresponds to the mean timing but does not correspond to the timing of any specific slice (it falls between the timings of slices 21 and 22). To get the same result as when specifying slice order (rather than timings), shouldn't the reference timing be the timing of slice number 21 (the middle slice), i.e. 1.1400? Of course, the difference in timing is quite minor.
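
To make the comparison concrete, here is a minimal MATLAB sketch of the two candidate reference timings (this is only my illustration, not CONN code; I have approximated the timings as uniformly spaced, so the numbers differ very slightly from the JSON values above):

nSlices     = 42;
slice_times = linspace(2.225, 0, nSlices);   % approximation of the SliceTiming vector above
ref_bids    = mean(slice_times);             % what the BIDS path appears to use (~1.1125)
ref_middle  = slice_times(nSlices/2);        % timing of slice 21 (~1.14)
fprintf('mean timing = %.4f s, slice-21 timing = %.4f s\n', ref_bids, ref_middle);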

On a related note, I noticed that for the same number of slices (=42), the batch shows the reference slice as 21 for ascending acquisition and 22 for descending acquisition. I guess this must be because Conn is actually calculating the mean of the slice numbers (just as in the case of timings) and then rounding it up in the descending case and rounding it down in the ascending case (so that the resulting slice number is a valid slice). In the same vein, I was wondering if it would be better to specify the timing of a slice which actually exists.
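
If my guess about the slice-number case is right, it would amount to something like the following (again only a sketch of my understanding, not CONN's actual code):

nSlices        = 42;
sliceIndices   = 1:nSlices;
ref_ascending  = floor(mean(sliceIndices));  % gives 21, as seen in the batch
ref_descending = ceil(mean(sliceIndices));   % gives 22, as seen in the batch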


Regarding indirect normalization:
I was trying out different preprocessing options and saw a strange behaviour when running the indirect pipeline (I used the realignment and unwarping option without a phase map). For the test case, the functional data had a couple of cerebellar slices missing while the structural data had a larger field of view and consequently full coverage. After normalization, the resulting structural image gets cropped from the bottom, resulting in an image similar to the functional image. This behaviour does not happen when using the direct normalization pipeline, and I was unable to replicate it by manually preprocessing the same images. I assumed that this was perhaps because of the ART mask being calculated from the functional volume; however, the problem persists even if I skip the ART outlier detection step during preprocessing. I have seen a similar problem with other subjects too. What could be causing this?

I have attached a snapshot showing the native space structural and functional images, the normalized structural and functional images using the direct pipeline, and the normalized structural image using the indirect normalization pipeline.


Regarding inclusion of the CSF segmentation file when calculating the wc0 file:
On a different note, when calculating the wc0 (normalized, skull-stripped) image, Conn includes the GM, WM, and CSF files. It adds them up (resulting in a value of 1 in all brain tissue areas) and then multiplies the sum with the structural image to get the wc0 file. However, this means that areas like the eyeballs get included in the skull-stripped image. Would it be better to add only the c1 and c2 files, threshold (to get binary values), and then multiply with the structural image to get a skull-stripped image without the eyeballs and the "ring" of CSF around the brain? Of course, this does not matter too much as all processing happens using either the c* images or the functional images.
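
What I had in mind is something along these lines (just a rough sketch with hypothetical filenames and an arbitrary 0.5 threshold; I am not claiming this is how Conn implements it):

Vt1 = spm_vol('T1.nii');                     % hypothetical filenames
Vc1 = spm_vol('c1T1.nii');
Vc2 = spm_vol('c2T1.nii');
t1  = spm_read_vols(Vt1);
gm  = spm_read_vols(Vc1);
wm  = spm_read_vols(Vc2);

mask     = (gm + wm) > 0.5;                  % binary GM+WM mask (0.5 chosen arbitrarily)
stripped = t1 .* mask;                       % excludes the eyeballs and the CSF "ring"

Vout       = Vt1;
Vout.fname = 'c0T1_stripped.nii';
spm_write_vol(Vout, stripped);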


Look forward to your thoughts

Best Regards
Pravesh

Originally posted by Alfonso Nieto-Castanon:
Dear Pravesh,

As always, thanks for your comments and feedback. The rationale for selecting the mid-time slice as reference when performing slice-timing correction in CONN is to minimize the average temporal displacement that needs to be corrected across all slices (each slice is corrected, i.e. time-shifted, by an amount equal to its actual acquisition time minus the acquisition time of the reference slice). In any case, perhaps I am missing something here, so please feel free to clarify why you believe that in this case selecting the mid-slice as reference could be more appropriate or preferable.
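
In other words (a toy illustration of that rationale only, not the actual CONN code, with the timings approximated as uniformly spaced):

t        = linspace(2.225, 0, 42);           % approximate slice acquisition times
ref      = mean(t);                          % mid-time reference, per the discussion above
shifts   = t - ref;                          % time-shift applied to each slice
avgShift = mean(abs(shifts));                % average magnitude of the corrections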

Also, regarding your previous message, sorry, the patch I sent you had too many dependencies on other version-18b changes. If you do not mind, please feel free to download the code at https://www.conn-toolbox.org/resources/s... to get the current development version, which already includes the patch that I sent you (the final 18b version should be released in the next few weeks and will become available, as always, here at nitrc.org).
 
Thanks
Alfonso
Originally posted by Pravesh Parekh:
Dear Dr. Alfonso,

I think there may be another bug in the preprocessing pipeline. When performing slice timing correction and selecting BIDS to pick up the slice order (actually timing), the reference slice is specified as half the last slice time, i.e. if the last slice was acquired at 2250 ms, the reference slice is specified as 1125 ms. I am assuming that, instead of this, the timing of the middle slice is what should be picked up.


Regards
Pravesh
