open-discussion > RE: Suggestions for HCP Data
Jan 19, 2021 08:01 PM | Martin Styner
Hi Steven,
Good question. Most of our new studies have HCP-like sequences and thus DTIPrep is insufficient for the full preprocessing pipeline (as it does not provide any susceptibility correction). So, a new version of DTIPrep is needed, one that interfaces with FSL (or other toolboxes) to achieve the susceptibility correction.
And I am really not a fan of the old-FSL-style processing of averaging the AP-PA sequences. The "new" style processing instead corrects susceptibility artifacts in the individual AP & PA volumes, but does not average them. This allows you, for example, to reject a bad volume in AP while keeping the corresponding good volume in PA. It also allows the processing of incompletely acquired data (where the subject had to leave the scanner early).
Furthermore, the newer FSL versions (6.0.1+) have powerful correction methods included in eddy (FSL's eddy current and motion correction tool), particularly the ability to incorporate the computed susceptibility-undistortion field into the eddy current/motion correction, as well as to interpolate bad data. But I think it would be bad practice to use a volume that has significant artifacts throughout the image for the purpose of interpolation (i.e. it's better to just reject that whole bad image).
So, our most up-to-date (script-based) processing starts with a conservative DTIPrep, which removes the really bad volumes and does not do a motion/eddy current correction. The data is then converted to NIFTI, for use with FSL 6.0.3 for a topup-eddy based correction of susceptibility/motion/eddy/remaining-bad-data-interpolation.
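The topup-eddy stage above could be sketched roughly as below. This is a minimal illustration, not Martin's actual script: all file names (dwi_all, b0_AP_PA, brain_mask, etc.), the readout time (0.05 s), and the volume count (60) are placeholder assumptions you would replace with your own acquisition's values. The FSL flags themselves (topup's --imain/--datain/--config, eddy's --topup and --repol for outlier replacement) are standard.

```shell
#!/bin/sh
# Sketch of the FSL 6.x susceptibility/motion/eddy correction stage.
# All file names and numeric values below are illustrative placeholders.

# acqparams.txt: one row per b0; first three numbers give the phase-encode
# direction (A-P, then P-A), the fourth the total readout time in seconds
# (0.05 here is only an example value).
printf '0 -1 0 0.05\n0 1 0 0.05\n' > acqparams.txt

# index.txt maps each volume of the combined 4D series to a row of
# acqparams.txt (assuming, for illustration, 60 volumes all acquired A-P).
for i in $(seq 1 60); do printf '1 '; done > index.txt
printf '\n' >> index.txt

# The actual FSL calls (skipped here if FSL is not on the PATH):
if command -v topup >/dev/null 2>&1; then
    # Estimate the susceptibility field from the paired AP/PA b0 images
    topup --imain=b0_AP_PA --datain=acqparams.txt \
          --config=b02b0.cnf --out=topup_results
    # eddy: motion/eddy-current correction that folds in the topup field;
    # --repol replaces (interpolates) slices flagged as outliers
    eddy --imain=dwi_all --mask=brain_mask --acqp=acqparams.txt \
         --index=index.txt --bvecs=bvecs --bvals=bvals \
         --topup=topup_results --repol --out=dwi_corrected
fi
```

Because the whole-volume rejection is done beforehand (by the conservative DTIPrep pass), eddy's --repol interpolation is then only patching isolated bad slices rather than wholesale-bad images.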
Happy to share my scripts, if interested.
Best
Martin
Threaded View

| Author | Date |
|---|---|
| Steven Meisler | Jan 19, 2021 |
| Martin Styner | Jan 19, 2021 |
| Gustavo Sudre | May 19, 2021 |
| Martin Styner | Nov 9, 2021 |
| neda mohammadi | Oct 23, 2021 |
| Steven Meisler | Jan 19, 2021 |