help > Loading conn files in SPM
Jun 12, 2014 11:06 PM | Roger Beaty
Loading conn files in SPM
Hi everyone,
I'm trying to load .mat files that were preprocessed in CONN back into SPM to perform a GLM analysis. In other words, I ran the preprocessing pipeline using the CONN GUI and would like to import the files produced by CONN back into SPM. My guess is that CONN formats .mat files in a toolbox-specific way, and that these files are not easily loadable into the SPM GUI. Is it possible to import CONN files back into SPM and run a traditional GLM?
Many thanks,
Roger
Jun 13, 2014 02:06 AM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Roger,
If you want to use in SPM the BOLD timeseries after removal of confounding effects, you may simply select in Setup->Options the checkbox labeled 'Create confound-corrected time-series'. That will create, as part of the Preprocessing step, a series of .nii files with these BOLD timeseries, which you may then enter into SPM for further analyses (these will appear in your conn_*/results/preprocessing/ folder and the files will be named niftiDATA_Subject###_Condition###.nii).
If you have already run your preprocessing step and want to avoid repeating it, you may instead simply type in the command window:
conn_matc2nii;
and that will create the same files from the already-computed preprocessing-step results.
Hope this helps
Alfonso
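As a minimal sketch of what entering one of these exported files into SPM might look like (assuming SPM12's batch interface; the project path, TR, condition name, and timings below are placeholders for illustration only, not values from this thread):

```matlab
% Sketch only: enter a CONN-exported, confound-corrected 4D file into an
% SPM12 first-level design. Replace the placeholder path, TR, and timings.
spm('defaults','fmri'); spm_jobman('initcfg');

f4d = 'conn_project01/results/preprocessing/niftiDATA_Subject001_Condition000.nii';
matlabbatch{1}.spm.stats.fmri_spec.dir          = {'firstlevel/sub001'};
matlabbatch{1}.spm.stats.fmri_spec.timing.units = 'secs';
matlabbatch{1}.spm.stats.fmri_spec.timing.RT    = 2;    % your TR, in seconds
matlabbatch{1}.spm.stats.fmri_spec.sess.scans   = cellstr(spm_select('expand',f4d));
matlabbatch{1}.spm.stats.fmri_spec.sess.cond(1).name     = 'TaskA';    % placeholder
matlabbatch{1}.spm.stats.fmri_spec.sess.cond(1).onset    = [0 30 60];  % placeholder
matlabbatch{1}.spm.stats.fmri_spec.sess.cond(1).duration = 15;         % placeholder
spm_jobman('run', matlabbatch);
```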
Jun 14, 2014 02:06 PM | Roger Beaty
RE: Loading conn files in SPM
Thanks Alfonso. The conn_matc2nii command worked out great.
Best,
Roger
Jan 26, 2015 06:01 PM | Roger Beaty
RE: Loading conn files in SPM
Hi Alfonso and all,
Following up from my previous question, I'm now trying to load the preprocessed time series from Conn into SPM for first- and second-level analyses. The SPM batch editor requires an SPM.mat file with timing information and a time series in temporal order to specify a model. This information is already implied by the condition files produced by Conn. Is it possible to bypass these steps and simply upload the condition files from Conn to specify a model in SPM?
Thanks,
Roger
Jan 27, 2015 06:01 PM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Roger,
Sorry, but CONN does not create first-level SPM.mat files that you could import into SPM. Could you expand a bit on what exactly you are trying to do, so I can see whether any work-arounds come to mind? For example, are you trying to use the CONN-preprocessed time-series to look at the "activation" (BOLD signal changes, not connectivity) associated with each condition? If so, you could have CONN generate those first-level estimates directly during the denoising step.
Thanks
Alfonso
Jan 27, 2015 09:01 PM | Roger Beaty
RE: Loading conn files in SPM
Hi Alfonso,
Yes, I am trying to look at BOLD activation differences between conditions (i.e., standard SPM GLM analysis), not connectivity differences. The goal is basically to compare these univariate results from SPM with some multivariate connectivity results produced by CONN. If possible, I would like to use the preprocessed images from CONN. Any thoughts on how to do this?
Thanks,
Roger
Jan 29, 2015 01:01 AM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Roger,
If you are already entering your "effect of X" for each condition/task as confounding effects during Denoising, then CONN is already computing those first-level analyses (the effect of each task/condition, controlling for all other confounding effects) during the Denoising step. If you check the 'Create confound effect beta-maps' option in Setup->Options, CONN will generate the corresponding beta volumes (in conn_*/preprocessing/BETA_Subject#.nii) containing the estimated task effects for each subject, which you can then enter into SPM for additional second-level analyses.
These BETA files contain all of the estimated effects for each subject as a 4D volume, so you need to identify which volumes correspond to your tasks/conditions. The design matrix is block-diagonal, with one block per session, and within each session the effects are ordered as:
1) the constant term;
2) all effects entered in the Confounding effects list, in the same order, each expanded if necessary (e.g. when using higher-order derivatives or multiple dimensions);
3) optional additional detrending term(s) (one for linear, two for quadratic, etc.).
If you are unsure how to identify the proper terms, please send me your conn_*.mat file and I will let you know which volumes within these BETA_Subject*.nii files correspond to your task conditions.
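As a worked sketch of that column ordering (all numbers below are hypothetical; substitute the dimensions, list position, and detrending order of your own Denoising setup):

```matlab
% Sketch: locate the beta volume for one task effect inside a
% BETA_Subject*.nii file, following the per-session column order
% [constant, listed confound effects (in order, with their dimensions),
%  detrending terms]. All numbers below are hypothetical.
dims     = [1 1 6];  % columns per listed effect, e.g. taskA(1), taskB(1), motion(6)
taskIdx  = 2;        % position of the effect of interest in the Denoising list
nDetrend = 1;        % linear detrending adds one column per session
session  = 2;        % which session's block to look in

colsPerSession = 1 + sum(dims) + nDetrend;   % constant + effects + detrending
betaVol = (session-1)*colsPerSession ...     % skip earlier sessions' blocks
        + 1 ...                              % skip the constant term
        + sum(dims(1:taskIdx-1)) ...         % skip effects listed before it
        + 1;                                 % first column of the target effect
% betaVol = 13 here: it indexes the 4th dimension of BETA_Subject*.nii,
% e.g. V = spm_vol('BETA_Subject001.nii'); b = spm_read_vols(V(betaVol));
```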
Alternatively, of course, you can enter the preprocessed/denoised timeseries into SPM and perform these first-level analyses there (but in that case keep in mind that you should not have the "effect of X" terms entered as confounding effects during Denoising; otherwise that would remove exactly the effects you later want to estimate). You would still need to define those first-level design matrices within SPM, since unfortunately there is no simple way to transfer that information back from CONN (in case it helps, you can find the task-related regressors that CONN uses in the files conn_*/data/COND_Subject#_Session#.mat, in the variable named 'data').
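For instance, a quick way to inspect those regressors might look like this (the project folder name is a placeholder, and the exact format of 'data' may vary across CONN versions):

```matlab
% Sketch: load the task-related regressors CONN used for one subject/session.
% 'conn_project01' is a placeholder project folder; the variable name 'data'
% is as stated above (its exact shape may depend on the CONN version).
s = load('conn_project01/data/COND_Subject001_Session001.mat');
R = s.data;          % task/condition regressors for this session
disp(size(R))        % inspect: rows should correspond to scans
% These columns could then be re-created as regressors when specifying
% the first-level design in SPM.
```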
Hope this helps
Alfonso
Jul 31, 2015 11:07 AM | Chaleece Sandberg
RE: Loading conn files in SPM
Hi Alfonso,
If you use the confound-corrected time-series .nii files created by CONN after the denoising step for a first-level GLM analysis in SPM, do you still need to include the "outliers and movement" .mat file from ART as a regressor in SPM if it was already entered as a first-level covariate in CONN? My hunch is no, but I just want to verify. Also, if the .nii files are separated by condition, rather than by run/session, do I need to set up the SPM GLM differently?
Thanks!
Chaleece
Aug 3, 2015 05:08 AM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Chaleece,
Your hunch is correct: the BOLD signal will already be orthogonal to any effects entered in the Denoising step, so there is no need to add those covariates again in your SPM model when exporting these data. Regarding the "separated by condition" .nii files: if you are using a relatively recent release of CONN, you will get a series of files associated with "condition 0", named niftiSubject*_Condition0000.nii, which contain all of your data (no separation by condition). If you use these, you may enter the same GLM design you would normally use in SPM. (If you instead enter the condition-specific timeseries into a first-level SPM GLM, then yes, you would typically need to permute your design matrix rows to follow the same order as in your functional data.)
Hope this helps
Alfonso
Originally posted by Chaleece Sandberg:
Hi Alfonso,
If you use the confound-corrected time-series .nii files created by CONN after the denoising step in the SPM GLM for a first level analysis, do you still need to include the "outliers and movement".mat file from art as a regressor in SPM if it was entered as a first-level covariate in CONN? My hunch is no, but I just want to verify. Also, if the .nii files are separated by condition, rather than run/session, do I need to set up the SPM GLM differently?
Thanks!
Chaleece
Originally posted by Alfonso Nieto-Castanon:
Hi Roger,
If you want to use in SPM the BOLD timeseries after removal of confounding effects, you may simply select in Setup->Options the checkbox labeled 'Create confound-corrected time-series', and that will create as part of the Preprocessing step a series of .nii files with these BOLD timeseries which you may then enter into SPM for further analyses (these will appear in your conn_*/results/preprocessing/ folder and the files will be named niftiDATA_Subject###_Condition###.nii).
If you have already run your preprocessing step and want to avoid having to repeat it, you may also simply type in the command window:
conn_matc2nii;
and that will create these same files from the already computed preprocessing-step results.
Hope this helps
Alfonso
Aug 5, 2015 08:08 PM | Chaleece Sandberg
RE: Loading conn files in SPM
Hi Alfonso!
Yes, this is very helpful. I finished the Denoising step and got the niftiDATA_Subject*_Condition000 files, which contain all of the volumes (from what I can tell, I did not actually get separate condition files). I then wanted to enter them as I normally would in SPM, so I tried to convert the 4D file into 3D files in order to select only the volumes belonging to each run, but I got an error saying that there was no header file. Is there any way to split this file? If not, I believe I'll need to recalculate my onsets and durations as if it were one long run (very long - 900 volumes, since I'm including pre and post), and I don't like that prospect, both for the tedium and for possible problems with timing accuracy. Does that make sense?
Any advice would be much appreciated.
Thanks!
Chaleece
Aug 6, 2015 06:08 AM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Chaleece,
You should be able to split that 4d file into multiple 3d files using spm_file_split, for example:
spm_file_split('niftiDATA_Subject001_Condition000.nii');
will create a number of individual 3d files from this 4d file.
You can also split the original file into smaller 4d files (e.g. one per session) using spm_file_merge, for example:
a=spm_vol('niftiDATA_Subject001_Condition000.nii');
spm_file_merge(a(1:20),'mynewfile.nii');
will create a new 4d file containing only the first 20 scans of the original 4d file.
If you want to break all of your preprocessed nifti* files into session-specific files you could use something like the following to do so (assuming you have your project loaded into CONN):
global CONN_x;
for nsub = 1:CONN_x.Setup.nsubjects
    filename = fullfile(CONN_x.folders.preprocessing, sprintf('niftiDATA_Subject%03d_Condition000.nii', nsub));
    a = spm_vol(filename);
    nscans = cumsum([0 cell2mat(CONN_x.Setup.nscans{nsub})]);   % cumulative scan counts across sessions
    for nses = 1:CONN_x.Setup.nsessions(min(numel(CONN_x.Setup.nsessions), nsub))
        scans = nscans(nses)+1 : nscans(nses+1);                % volumes belonging to this session
        spm_file_merge(a(scans), conn_prepend('', filename, sprintf('_Session%d.nii', nses)));
    end
end
That will create niftiDATA_Subject*_Session*.nii files from your niftiDATA_Subject*_Condition000.nii files.
Hope this helps
Alfonso
Aug 6, 2015 03:08 PM | Chaleece Sandberg
RE: Loading conn files in SPM
This. Is. FABULOUS!
The spm_file_merge business is exactly what I needed. Thank you!
p.s. I figured out that the problem with the header is that I was trying to use spm_file_split with a file that I had copied and pasted into a different folder. It ran swimmingly in the same folder.
Aug 6, 2015 04:08 PM | Chaleece Sandberg
RE: Loading conn files in SPM
Hello again,
I hate to be a bother, but I have now run into another problem when I try to actually run the GLM in SPM:
Running 'Model estimation'
SPM12: spm_spm (v6015) 11:58:27 - 06/08/2015
========================================================================
SPM12: spm_est_non_sphericity (v6015) 11:58:30 - 06/08/2015
========================================================================
Chunk 97/97 : ...processing
Failed 'Model estimation'
Error using spm_est_non_sphericity (line 196)
Please check your data: There are no significant voxels.
In file "C:\Users\Public\Documents\spm12\spm_est_non_sphericity.m" (v6015), function "spm_est_non_sphericity" at line 196.
In file "C:\Users\Public\Documents\spm12\spm_spm.m" (v6015), function "spm_spm" at line 418.
In file "C:\Users\Public\Documents\spm12\config\spm_run_fmri_est.m" (v5809), function "spm_run_fmri_est" at line 33.
The following modules did not run:
Failed: Model estimation
Does this mean that the Denoising step somehow reduced any significance? I didn't include the conditions as confounds for denoising. I am attaching my batch for SPM, if that helps. Would you like the conn.mat file?
Aug 6, 2015 05:08 PM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Chaleece,
This indicates that SPM finds no significant "any condition/task effects" in your data. There are several potential reasons; my best guess is that this is related to the implicit masking of the analyses combined with the band-pass filtering of your original data (by default SPM will look for "in-brain" voxels by comparing their average BOLD signal to the average global BOLD signal; since your data are band-pass filtered, the average BOLD signal is always 0). I would suggest changing in your script the line "spm.stats.fmri_spec.mthresh=.8" to "spm.stats.fmri_spec.mthresh=-inf" in order to skip this implicit masking altogether. Please let me know if that does not seem to be the issue here.
Best
Alfonso
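For reference, a minimal sketch of a first-level specification batch with that masking change applied (field names follow the standard SPM12 `spm.stats.fmri_spec` batch structure; the paths, TR, and filenames below are placeholders, and conditions/contrasts are omitted):

```matlab
% Minimal SPM12 first-level specification sketch (placeholder paths/TR).
% Setting mthresh to -Inf disables the implicit relative-threshold mask,
% which would otherwise discard every voxel in band-pass-filtered
% (zero-mean) data such as CONN's confound-corrected time series.
matlabbatch{1}.spm.stats.fmri_spec.dir = {'/path/to/firstlevel'};   % output folder
matlabbatch{1}.spm.stats.fmri_spec.timing.units = 'secs';
matlabbatch{1}.spm.stats.fmri_spec.timing.RT = 2;                   % your TR (s)
matlabbatch{1}.spm.stats.fmri_spec.sess.scans = cellstr( ...
    spm_select('expand', ...                                        % expand 4D file
    '/path/to/niftiDATA_Subject001_Condition000.nii'));             % CONN output
matlabbatch{1}.spm.stats.fmri_spec.mthresh = -Inf;                  % skip implicit masking
spm_jobman('run', matlabbatch);
```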
Oct 2, 2015 07:10 PM | Chaleece Sandberg
RE: Loading conn files in SPM
Hi Alfonso!
I have bad news. I am getting the same error with the threshold change you suggested. Do you know what else the problem may be? Could it be something in the way I'm setting up the Denoising step? I believe I verified all that during the conference call, but I could be missing something.
Thanks!
Chaleece
Oct 8, 2015 01:10 PM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Hi Chaleece,
Sorry about that. If you could please send me your conn*.mat project file and the script that you are using for SPM first-level analyses I will be happy to take a closer look to see if I can figure out the source of this issue.
Best
Alfonso
Nov 10, 2015 06:11 PM | Anila D'Mello
RE: Loading conn files in SPM
Hi Alfonso,
I'm interested in looking at activation differences between my conditions and was hoping to use the preprocessed, denoised files from Conn in SPM. I'm a bit unsure how to identify the proper terms within the BETA_Subject#.nii file. Is there a good way to determine which volumes I need to look at the effect of each task and enter these into SPM for further analysis?
Thanks,
Anila
Mar 11, 2016 05:03 PM | laurel morris
RE: Loading conn files in SPM
Dear Alfonso, in response to this post, if you have just one task/condition (rest), what is the difference between the preprocessing/niftiDATA_Subject* data and the preprocessing/BETA_Subject* data? I'm looking at time course data before and after TMS. Many thanks for all your help,
Laurel
Mar 14, 2016 06:03 PM | Alfonso Nieto-Castanon - Boston University
RE: Loading conn files in SPM
Dear Laurel,
The niftiDATA*.nii files contain the same information as the DATA*.matc files in the same preprocessing directory, only in the more standard NIfTI format. Those nifti* files are only generated by CONN if you select in Setup->Options the option to 'Create confound-corrected time-series', and they are meant to be used when you want to export those confound-corrected time series to a different program/software. The BETA_Subject* files in that same directory contain the beta weights (regression coefficients) associated with the different confounding effects during the Denoising step. Again, these are only generated by CONN if you select in Setup->Options the option to 'create confound-effect beta maps', and they are meant to be used when you want to export those regression coefficients to a different program/software.
Hope this helps
Alfonso
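If it helps, one way to inspect both kinds of file is with the standard SPM12 I/O functions (a sketch; the filenames are examples, and whether each volume's `descrip` field labels the corresponding regressor is an assumption worth checking against your own files):

```matlab
% Inspect CONN's exported niftiDATA / BETA files with standard SPM12 I/O.
V = spm_vol('niftiDATA_Subject001_Condition000.nii');   % one handle per timepoint
fprintf('%d volumes of size %dx%dx%d\n', numel(V), V(1).dim);
Y1 = spm_read_vols(V(1));                               % read the first volume

B = spm_vol('BETA_Subject001.nii');                     % one volume per confound regressor
disp({B.descrip}');                                     % descriptions may identify each regressor
```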
Nov 23, 2016 05:11 PM | Leah Fleming - Yale University
RE: Loading conn files in SPM
Hi all,
I am not sure if anyone here will know the answer to this, but I figured I would give it a try. I am planning to use Dynamic Causal Modeling with SPM12 on some fMRI resting state data and was wondering if there was a way to input the files from my first and/or second level analysis from CONN into SPM so I don't have to start over again? It seems that SPM requires SPM.mat and VOI time series (mat) files.
Thanks for your help!
Leah
Nov 28, 2016 10:11 AM | Isabel Berwian
RE: Loading conn files in SPM
Dear Alfonso
I have a follow-up question on the question from Laurel. I would like to export the beta weights (regression coefficients), ideally as a .txt file, to import them into an SPM GLM. Is there a way to get these beta weights as .txt files? Or do you have another idea how I could do this? The reason I want to do this is to investigate the effect of denoising with aCompCor. Alternatively, do you know of a toolbox that implements aCompCor in the same way as CONN and would allow me to do this?
Thank you very much in advance,
Isabel
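(One possible route, sketched with standard SPM/MATLAB functions, would be to read the BETA_Subject* volumes described earlier in this thread and dump the values to plain text; the filenames and the example voxel coordinate below are placeholders:)

```matlab
% Read CONN's denoising beta maps and write values at one voxel to text.
B = spm_vol('BETA_Subject001.nii');       % one volume per confound regressor
Y = spm_read_vols(B);                     % 4-D array: x * y * z * regressor
betas = squeeze(Y(30, 40, 25, :));        % beta values at an example voxel
dlmwrite('betas_subject001.txt', betas);  % save as plain text
```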
Feb 7, 2017 05:02 PM | kito24 - USC
RE: Loading conn files in SPM
Hi Leah,
Have you heard back about this, or were you able to figure it out? I also ran my analysis in CONN and need to convert it into SPM for DCM. Please let me know!
Thanks,
Kaori
Feb 8, 2017 02:02 PM | Leah Fleming - Yale University
RE: Loading conn files in SPM
Hi Kaori,
I never heard back from anyone about this and am still not sure of a way to do this; I ended up just using the pre-processed images from CONN and starting over in SPM.
If you figure this out, I would love to know how!
Best,
Leah
Feb 9, 2017 01:02 AM | kito24 - USC
RE: Loading conn files in SPM
Hi Alfonso and Chaleece,
Were you able to figure out a solution to this problem? I tried to use preprocessed data from CONN following previous posts in this topic (converting the preprocessed .mat files to NIfTIs and then splitting them by session), but I'm running into a similar error:
Error using spm_est_non_sphericity (line 196)
Please check your data: There are no significant voxels.
Error in spm_spm (line 421)
[xVi, am] = spm_est_non_sphericity(SPM);
Error in spm_getSPM (line 234)
SPM = spm_spm(SPM);
Error in spm_results_ui (line 261)
[SPM,xSPM] = spm_getSPM;
Error while evaluating uicontrol Callback
I also tried running this line but still have the same error.
spm.stats.fmri_spec.mthresh=-inf
Please let me know if you have a solution to this!
Thanks,
Kaori
Nov 3, 2017 06:11 PM | Patrick McConnell - MUSC
"no significant voxels"
I am also having the same "no significant voxels" problem....
Originally posted by kito24:
Originally posted by kito24:
Hi Alfonso and
Chaleece,
Were you able to figure out a solution to this problem? I'm running into a similar bug; I tried to use preprocessed data from Conn following previous posts in the topic (converting the preprocessed .mat file to niftis, and then splitting those by sessions) but I'm also running into a similar error:
Error using spm_est_non_sphericity (line 196)
Please check your data: There are no significant voxels.
Error in spm_spm (line 421)
[xVi, am] = spm_est_non_sphericity(SPM);
Error in spm_getSPM (line 234)
SPM = spm_spm(SPM);
Error in spm_results_ui (line 261)
[SPM,xSPM] = spm_getSPM;
Error while evaluating uicontrol Callback
I also tried running this line but still have the same error.
spm.stats.fmri_spec.mthresh=-inf
Please let me know if you have a solution to this!
Thanks,
Kaori
Originally posted by Alfonso Nieto-Castanon:
Were you able to figure out a solution to this problem? I'm running into a similar bug; I tried to use preprocessed data from Conn following previous posts in the topic (converting the preprocessed .mat file to niftis, and then splitting those by sessions) but I'm also running into a similar error:
Error using spm_est_non_sphericity (line 196)
Please check your data: There are no significant voxels.
Error in spm_spm (line 421)
[xVi, am] = spm_est_non_sphericity(SPM);
Error in spm_getSPM (line 234)
SPM = spm_spm(SPM);
Error in spm_results_ui (line 261)
[SPM,xSPM] = spm_getSPM;
Error while evaluating uicontrol Callback
I also tried running this line but still have the same error.
spm.stats.fmri_spec.mthresh=-inf
Please let me know if you have a solution to this!
Thanks,
Kaori
Originally posted by Alfonso Nieto-Castanon:
Hi
Chaleece,
Sorry about that. If you could please send me your conn*.mat project file and the script that you are using for SPM first-level analyses I will be happy to take a closer look to see if I can figure out the source of this issue.
Best
Alfonso
Originally posted by Chaleece Sandberg:
Sorry about that. If you could please send me your conn*.mat project file and the script that you are using for SPM first-level analyses I will be happy to take a closer look to see if I can figure out the source of this issue.
Best
Alfonso
Originally posted by Chaleece Sandberg:
Hi
Alfonso!
I have bad news. I am getting the same error with the threshold change you suggested. Do you know what else the problem may be? Could it be something in the way I'm setting up the Denoising step? I believe I verified all that during the conference call, but I could be missing something.
Thanks!
Chaleece
Originally posted by Alfonso Nieto-Castanon:
Hi Chaleece,
This indicates that SPM finds no significant "any condition/task effects" in your data. There are several potential causes, but my best guess is that this is related to the implicit masking of the analyses combined with the band-pass filtering of your original data (by default SPM identifies "in-brain" voxels by comparing their average BOLD signal to the average global BOLD signal; since your data are band-pass filtered, the average BOLD signal is always 0). I would suggest changing the line "spm.stats.fmri_spec.mthresh=.8" in your script to "spm.stats.fmri_spec.mthresh=-inf" in order to skip this implicit masking altogether. Please let me know if that does not seem to be the issue here.
Best
Alfonso
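A quick way to check this explanation is to look at the temporal mean of the confound-corrected timeseries directly. The sketch below uses standard SPM I/O routines; the file name is just an example of CONN's output naming:

```matlab
% Check whether the confound-corrected data are mean-centered (as expected
% after band-pass filtering), which would defeat SPM's implicit masking
V = spm_vol('niftiDATA_Subject001_Condition000.nii');   % handles for all volumes
Y = spm_read_vols(V);                                   % 4-D array: x, y, z, time
meanImg = mean(Y, 4);                                   % temporal mean per voxel
fprintf('Mean of voxelwise temporal means: %g\n', mean(meanImg(:), 'omitnan'));
% If this value is ~0, SPM's default mthresh=0.8 (a fraction of the global
% mean) excludes every voxel, producing "There are no significant voxels."
```
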
Originally posted by Chaleece Sandberg:
Hello again,
I hate to be a bother, but I have now run into another problem when I try to actually run the GLM in SPM:
Running 'Model estimation'
SPM12: spm_spm (v6015) 11:58:27 - 06/08/2015
========================================================================
SPM12: spm_est_non_sphericity (v6015) 11:58:30 - 06/08/2015
========================================================================
Chunk 97/97 : ...processing
Failed 'Model estimation'
Error using spm_est_non_sphericity (line 196)
Please check your data: There are no significant voxels.
In file "C:\Users\Public\Documents\spm12\spm_est_non_sphericity.m" (v6015), function "spm_est_non_sphericity" at line 196.
In file "C:\Users\Public\Documents\spm12\spm_spm.m" (v6015), function "spm_spm" at line 418.
In file "C:\Users\Public\Documents\spm12\config\spm_run_fmri_est.m" (v5809), function "spm_run_fmri_est" at line 33.
The following modules did not run:
Failed: Model estimation
Does this mean that the Denoising step somehow removed any significant effects? I didn't include the conditions as confounds for denoising. I am attaching my batch for SPM, in case that helps. Would you like the conn*.mat file?