Including CompCor (CONN toolbox) into SPM batch
Jan 30, 2019  03:01 PM | Sara Calzolari
Including CompCor (CONN toolbox) into SPM batch
Hello everyone,

I am preprocessing some fMRI resting-state data and would like to perform denoising using CompCor (CONN toolbox).
Is there a way to include denoising in my SPM batch? If not, can I just preprocess the data in SPM and then use CONN only for denoising?

Thank you; this is the first time I have analysed fMRI data, so apologies for the basic questions.

Best,
Sara
Jan 31, 2019  07:01 AM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Sara,

I am sure you can combine preprocessing in SPM with the CONN toolbox for denoising.

However, if you want to stay completely within the SPM framework, we have implemented a simple version of aCompCor in the PhysIO Toolbox (https://www.nitrc.org/projects/physio), which is integrated with the SPM Batch Editor. All you need to specify there are dependencies on the warped white-matter/CSF tissue probability maps (wc2*, wc3*) from unified segmentation, which serve as the noise ROIs, and the warped (and realigned, etc.) fMRI time series from which the components shall be extracted.
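For orientation, the relevant part of the batch, written as a script, might look roughly like this (a sketch only; the field names follow the PhysIO batch structure, but the file names and parameter values are placeholders, not taken from this thread):

```matlab
% Sketch of the PhysIO noise-ROI (aCompCor) settings as a matlabbatch script.
% File names are placeholders -- in the Batch Editor you would typically use
% dependencies on the segmentation and normalisation outputs instead.
matlabbatch{1}.spm.tools.physio.model.noise_rois.fmri_files = ...
    {'wrafmri_run01.nii'};               % warped, realigned fMRI time series
matlabbatch{1}.spm.tools.physio.model.noise_rois.roi_files = ...
    {'wc2anat.nii'; 'wc3anat.nii'};      % warped WM and CSF probability maps
matlabbatch{1}.spm.tools.physio.model.noise_rois.thresholds   = 0.9; % prob. cutoff
matlabbatch{1}.spm.tools.physio.model.noise_rois.n_voxel_crop = 0;   % erosion (voxels)
matlabbatch{1}.spm.tools.physio.model.noise_rois.n_components = 3;   % PCs per ROI
```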

The help in the batch editor, after installing the toolbox, should give you more details. PhysIO is part of the TAPAS suite, and more info on it can be found in the README (https://github.com/translationalneuromod...).


All the best,
Lars
Feb 11, 2019  05:02 AM | Sara Calzolari
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

many thanks for your reply; I will then use the PhysIO Toolbox.


So, if I understood correctly, in the SPM batch module "TAPAS PhysIO Toolbox" I will have to focus just on the "model" part (see attached image), as I haven't collected physiological measures and just want to perform aCompCor. Is that correct? Can I just edit the batch script to delete the sections "log_files", "scan_timing" and "preproc"?

Also, my preprocessing pipeline includes realignment (estimate and reslice), slice timing, coregistration (estimate), segmentation, normalisation (write), and smoothing. Is there anything else I would need for successful denoising?

Many thanks for your help.

Best,
Sara
Feb 12, 2019  07:02 PM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Sara,

concerning your first question on how to set up the SPM batch: you can just start with the default batch that comes with the toolbox,

examples/tapas_physio_example_philips_ecg3t_spm_job.m


and apply the following modifications:

  • Set the log file names to empty values (press "Done" without selecting a file in the batch editor).
  • Select model.noise_rois = Yes and RETROICOR = No. In the noise_rois section, specify the fMRI time series file (I would use the unsmoothed but normalized, preprocessed NIfTI file here) as well as the noise-ROI image files (the normalized tissue probability maps of white matter and CSF, wc2* and wc3*, from unified segmentation would be a good choice). ROI thresholding and cropping, as well as the number of PCA components/explained share of variance, can be set to your liking; see the attached examples.
  • You will still have to specify the sqpar values (number of slices, volumes, TR) so that the regressors are created in a meaningful way.
  • Everything else (scan_timing, preproc) can stay at its default values, because those modules are ignored if no log_files are specified.
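In script form, these modifications might look like the following (a sketch assuming a recent PhysIO batch structure; the sqpar values are placeholders for your own acquisition parameters):

```matlab
% Sketch of the batch modifications listed above; adapt values to your data.
matlabbatch{1}.spm.tools.physio.log_files.cardiac     = {''};  % no peripheral
matlabbatch{1}.spm.tools.physio.log_files.respiration = {''};  % recordings
matlabbatch{1}.spm.tools.physio.scan_timing.sqpar.Nslices = 36;   % slices/volume
matlabbatch{1}.spm.tools.physio.scan_timing.sqpar.Nscans  = 200;  % volumes
matlabbatch{1}.spm.tools.physio.scan_timing.sqpar.TR      = 2;    % seconds
matlabbatch{1}.spm.tools.physio.model.retroicor.no = struct([]); % RETROICOR = No
```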


I have attached a batch that includes the PhysIO module, but makes a "refined" choice of the noise_rois parameters (basically adjusting their erosion/thresholding and number of components to robust values after visual inspection of the results when running with default values for my dataset).

Note that the output of the PhysIO module will only be the nuisance regressors for the GLM (as a text file). To remove their variance from the NIfTI file, you will have to specify a GLM with those regressors and then choose "write residuals" in the model estimation module (or spm_write_residuals on the command line). There is one file (Res*) per volume, and the new, denoised time series is the combination of those; you can add a batch utility to create a single 4D file from these residual images. The attached batch file (*noise_regression.m) includes GLM specification, estimation, residual writing and 4D file combination after the PhysIO module. The only deviation from a classical GLM specification is that the model includes only multiple regressors, but no conditions, and that I set the masking threshold to 0.05 (instead of 0.8). This is very liberal, but it avoids holes in the output NIfTI file, which could otherwise occur in low-intensity regions that are still of interest to your resting-state analysis.
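Abbreviated as a script, such a noise-regression batch could be sketched like this (file names are placeholders, and several required fmri_spec fields, e.g. the timing settings, are omitted for brevity; the module fields themselves are standard SPM12 batch fields):

```matlab
% 1) GLM with nuisance regressors only and a liberal masking threshold
%    (timing fields such as .timing.RT omitted here for brevity)
matlabbatch{1}.spm.stats.fmri_spec.dir            = {'glm_denoise'};
matlabbatch{1}.spm.stats.fmri_spec.sess.scans     = {'wrafmri_run01.nii'};
matlabbatch{1}.spm.stats.fmri_spec.sess.multi_reg = {'multiple_regressors.txt'};
matlabbatch{1}.spm.stats.fmri_spec.mthresh        = 0.05;  % instead of default 0.8

% 2) Estimation, writing one Res_*.nii per volume
matlabbatch{2}.spm.stats.fmri_est.spmmat          = {'glm_denoise/SPM.mat'};
matlabbatch{2}.spm.stats.fmri_est.write_residuals = 1;

% 3) Combine the residual images into a single 4D file
matlabbatch{3}.spm.util.cat.vols  = cellstr(spm_select('FPList', ...
    'glm_denoise', '^Res_.*\.nii$'));       % all Res_* volumes
matlabbatch{3}.spm.util.cat.name  = 'denoised_4d.nii';
matlabbatch{3}.spm.util.cat.dtype = 0;      % keep original datatype
```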

Concerning your question about the preprocessing pipeline: these seem to be the right modules in the right order (one could debate whether slice timing or realignment goes first, but that is often an empirical question). I would only advise doing the noise regression on the unsmoothed time series, because otherwise you might smooth gray-matter voxels (and their temporal signature) partly into your noise ROI and consequently regress them out of your data. You can smooth the data after denoising by adding that module at the end of the third batch.

I hope that helps!

All the best,
Lars
Feb 28, 2019  05:02 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

I applied your modifications both in the MATLAB script philips_ecg3t_spm_job located at .../examples/3.0.0/PhysIO/Philips/ECG3T (I could not find a file named tapas_physio_example_philips_ecg3t_spm_job.m) and in the batch you attached to your response to Sara. However, the script crashes with different error messages depending on the defined vendor.

In case of Philips
Error using textread (line 165) File not found.
In file "/opt_prg/spm/spm12/toolbox/tapas_PhysIO/tapas_physio_read_physlogfiles_philips_matrix.m"

In case of Custom
Error using tapas_physio_fill_empty_parameters (line 55)
Please specify sampling interval for custom text data
In file "/opt_prg/spm/spm12/toolbox/tapas_PhysIO/tapas_physio_fill_empty_parameters.m" (v464),

In case of Siemens
Index exceeds matrix dimensions.
In file "/opt_prg/spm/spm12/toolbox/tapas_PhysIO/tapas_physio_read_physlogfiles.m"

The log file names are empty (I pressed "Done" without selecting a file, as you suggested). Are other modifications necessary to run the script? Do I need the initial_pulse_kRpeakfile.mat (as in your batch)? It would be great if I could use your SPM modifications for denoising.
Attached is the error file using Philips as vendor.

Best
Ralf
Attachment: error_comp_cor.mat
Feb 28, 2019  08:02 AM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Ralf,

thank you for trying out this new functionality. It may well be that you are not using the most recent version of PhysIO: older versions still looked for the log files, and the example file you were searching for was missing there. Could you quickly run 'tapas_physio_version' in the command window to see whether you are using R2018.1, R2018.2, or an older version?

If yes, please try out the current development version with your batch:

https://github.com/translationalneuromod...

Thank you and let me know of any progress,

All the best,
Lars
Mar 1, 2019  01:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

I downloaded your development version (tapas_physio_version reports R2018.2.0), and now there is an example file tapas_physio_example_philips_ecg3t_spm_job.m. After modifying the file according to your suggestions, the error is still the same:

Running 'TAPAS PhysIO Toolbox'
Failed 'TAPAS PhysIO Toolbox'
Error using textread (line 165)
File not found.

I again get different errors with different vendors, and there is no option to unselect the vendor. I then tried the following in the script:

matlabbatch{1}.spm.tools.physio.log_files.vendor = {''};

but the program stopped with "SWITCH expression must be a scalar or character vector constant", referring to tapas_physio_fill_empty_parameters.m.

I am using SPM12 (v7129) with MATLAB 2016b. I have one more question regarding the batch you provided in your response to Sara: there seems to be only one file in the fMRI time series option (wuafmri01.nii). Why not the whole series?
I have attached the SPM batch file.

Best
Ralf
Mar 1, 2019  03:03 AM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Ralf,

thank you for trying out the development version! 
Originally posted by Ralf Veit:
Dear Lars,

I downloaded your development version (tapas version is R2018.2.0) and now there is an example file tapas_physio_example_philips_ecg3t_spm_job.m. After modification of the file according to your suggestions, the error is still the same
Running 'TAPAS PhysIO Toolbox'
Failed 'TAPAS PhysIO Toolbox'
Error using textread (line 165)
File not found.


Now, this is really weird. I just tried your attached batch, and for me it runs all the way into the tapas_physio_create_noise_rois_regressors function. You still seem to be stuck at tapas_physio_read_physlogfiles, am I right? Could it be that the other version of PhysIO is still on the path and is accidentally called? Maybe you could set a breakpoint in tapas_physio_main_create_regressors (of the development version) just before the read_physlogfiles call (around line 100) and check whether it indeed uses this version of the main function?
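Checking for a shadowing installation could look like this in the MATLAB command window (standard MATLAB commands; the line number in the breakpoint is approximate, as mentioned above):

```matlab
% List every copy of the relevant PhysIO functions on the current path;
% more than one entry means an old installation may shadow the new one.
which tapas_physio_main_create_regressors -all
which tapas_physio_read_physlogfiles -all

% Stop just before the logfile reading to inspect which version actually runs.
dbstop in tapas_physio_main_create_regressors at 100
```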

and I have again different errors using different vendors. There is no option to unmark vendor. Thereafter I tried the following in the script
matlabbatch{1}.spm.tools.physio.log_files.vendor = {''};
but the program stopped
"SWITCH expression must be a scalar or character vector constant", which referred to tapas_physio_fill_empty_parameters.m


This is a good suggestion; I will include an option to unselect the vendor in a future version. Since PhysIO was originally developed for model-based noise correction using peripheral recordings, having no recordings was not anticipated. So as it stands, this is expected behavior, and leaving any vendor selected (e.g., Philips, as in your batch) should work.


I am using spm12 (v7129) with matlab 2016b. I have one more question regarding your batch you provided in your response to Sara. There seems to be only one file in the fmri time series option (wuafmri01.nii). Why not the whole series?


In my example, 01 refers to the run, so this is indeed a 4D NIfTI file. If you have one NIfTI file per volume, you should include all files of the series, as you correctly did in your batch. Both ways should work with the development version (R2018.1 had a bug for 3D files, I think).

All the best,
Lars
Mar 1, 2019  06:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

thanks for the suggestion. I deleted the old TAPAS folder and restarted MATLAB and SPM; the script was then able to calculate the PCA and showed the first 3 components in a MATLAB figure. After that, a second coregister-and-reslice with the wca* images was performed, but after completion the script stopped and no multiple regressors were created:
Failed 'TAPAS PhysIO Toolbox'
Index exceeds matrix dimensions.
In file "/opt_prg/spm_tool_12/tapas/PhysIO/code/model/tapas_physio_create_noise_rois_regressors.m" (???),

Any idea?

Best Ralf
Mar 1, 2019  07:03 AM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Ralf,

great that the other issue was resolved.

I just ran into the same "Index exceeds matrix dimensions" issue today in a different context. Most likely it arises because not enough voxels survive in that second mask; specifically, there are fewer voxel time series in the original data than PCA components (+1 for the mean) requested. I will catch this error in the next release, but for now you could just try a more liberal mask threshold (e.g., 0.9 or 0.95) and less erosion (if you used more than 1 voxel).

You could check that by computing sum(roi(:)>0) before the PCA computation.
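Using standard SPM12 I/O functions, that check could be sketched like this (file name and threshold are placeholders for your own mask settings):

```matlab
% Sketch: count the voxels surviving the thresholded noise ROI before the PCA.
V   = spm_vol('wc3anat.nii');          % warped CSF probability map (placeholder)
roi = spm_read_vols(V) > 0.95;         % same threshold as in the batch
nVoxels = sum(roi(:) > 0);             % should exceed n_components + 1
fprintf('%d voxels survive the mask\n', nVoxels);
```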

All the best,
Lars
Mar 4, 2019  04:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

thanks for all your help. I modified the script with a higher mask threshold (0.95), but now the program stops with the error

Reference to non-existent field 'censor_unreliable_recording_intervals'.

I copied the following line into the matlabbatch:

matlabbatch{1}.spm.tools.physio.model.censor_unreliable_recording_intervals = false;

and received the message

Item model: No field(s) named censor_unreliable_recording_intervals

and, when the job runs, the same error as above. I was really puzzled, because on Friday I did not have this problem.

Best
Ralf
Mar 5, 2019  03:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

it was my fault. After cleaning up and deleting all old scripts related to TAPAS and PhysIO, the script went through and created the multiple regressors. There were warnings at the PCA step, but the program did not stop:

Warning: Escaped character '\_' is not valid. See 'doc sprintf' for supported special characters.

I now have 8 regressors, corresponding to the two ROIs (white matter/CSF) with 3 PCA components and 1 mean time course each.
Thanks again for your support.

Best
Ralf
Mar 5, 2019  04:03 AM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Ralf,

I am very happy to hear you found the bug and got the extraction to work. Issues with unexpected paths in a MATLAB environment are very common; for each of my own projects, I now have a specific path-initialization function that calls "restoredefaultpath" and afterwards adds only the paths needed for the project.
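Such a path-initialization function can be as small as this (the function name and folder paths are placeholders for your own setup):

```matlab
function init_paths_myproject()
% Reset the MATLAB path to the factory default and add only what this
% project needs, so no stale toolbox copies can shadow the current ones.
restoredefaultpath;
addpath('/opt_prg/spm/spm12');                    % this project's SPM copy
addpath(genpath('/opt_prg/spm/spm12/toolbox/tapas_PhysIO'));
end
```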

Also, I use one separate copy of SPM (and every other tool) per project.

BTW: the warning you got only concerned the printed title, nothing to worry about!

All the best and good luck with the analysis,
Lars
Mar 5, 2019  06:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

just one more question. When using CompCor for task-related designs, Behzadi used a criterion (p = 0.2) to exclude voxels with stimulus-correlated fluctuations. I think there is no such exclusion criterion in your present version. Do you think it is necessary?

Best
Ralf
Mar 9, 2019  12:03 PM | Lars Kasper - Translational Neuromodeling Unit, IBT, University of Zurich and ETH Zurich
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Ralf,

sorry for the delayed response. I think the exclusion of stimulus-correlated fluctuations is a conceptual question, and there is no right or wrong. In short: if you do it, you risk false positives; if you don't, you risk false negatives.

Basically, you will have to ask yourself how a voxel that is not in gray matter can correlate with your task:

Reason 1): There is some physiological change induced by the task; for example, a task-induced change in heart rate creates a pulsatile flow change in the CSF from which you extracted the nuisance regressors. If this is the case, using such correlated CSF voxels to generate nuisance regressors might regress actual neuronally induced signal out of gray-matter voxels, and you end up with false negatives, finding no activation. On the other hand, voxels with CSF/GM partial-volume effects might erroneously be considered "active voxels" if you don't include such nuisance regressors, because the task activation looks just so similar to the physiological changes; so if you omit task-correlated voxels, you might end up with false positives. I would rather err on the side of false negatives (because in most publications strong claims are only made about positive findings) and therefore opt for not excluding any voxels from a well-defined mask because of correlations. But "well-defined" is the crucial part: you have to be sure that no gray-matter voxels end up in your mask for the noise-ROI extraction, otherwise you might regress out neuronally induced signal, and the argument above does not hold.

Reason 2): There is some random correlation of some voxels in the ROI (since we have a lot of them) with your task. In this case you would end up regressing out the correlated part if it ends up in the extracted principal components. I feel that if the number of voxels is large and they are dominated by physiological noise (or rather: not dominated by the random correlation or task-based fluctuation), this effect is small, as long as your time series is not too short compared to the number of nuisance regressors. If it is, i.e., if the degrees of freedom in your residual data become very small (the number of regressors approaches the number of points in the fMRI time series), funny things can happen, and I highly recommend these two papers by, among others, Molly Bright and Kevin Murphy (as first and last authors) for further reading:

Is fMRI "noise" really noise? Resting state nuisance regressors remove variance with network structure
http://www.sciencedirect.com/science/art...

Potential pitfalls when denoising resting state fMRI data using nuisance regression
http://www.sciencedirect.com/science/art...

But I also think their main conclusion supports the idea that, if you have good reasons to include a moderate number of nuisance regressors in your model (relative to the length of your paradigm), you are safe despite the random regressor correlations, too.

All the best,
Lars
Mar 11, 2019  01:03 AM | Ralf Veit
RE: Including CompCor (CONN toolbox) into SPM batch
Dear Lars,

thank you very much for your very detailed response. It helps me a lot.

Best
Ralf