Aug 31, 2014 07:08 PM | Alfonso Nieto-Castanon - Boston University
RE: Crashing with Large Data-set during Denoising
Hi Tom,
When you cancelled the process (e.g. using Ctrl-C during Denoising), could you by any chance send me the message that you got on the Matlab command line or in the CONN error GUI? (this would help me find out which step was taking so long). Alternatively, if you could tell me the name of the last progress bar (e.g. 'Denoising functional data', 'Denoising ROI data', 'Preprocessing voxel-to-voxel covariance', etc.), that might help as well.
My guess would be that the voxel-to-voxel characterization occurring at the end of Denoising is the step most sensitive to very large sessions. If that is the case, you should be able to simply uncheck the corresponding voxel-to-voxel option (either in Setup.Options.EnabledAnalyses or in the prompt shown right after pressing 'Done' during the Preprocessing step) and run through the rest of the steps without problems. Alternatively, if the issue is occurring during the 'Denoising functional data' step, perhaps I could point you towards some easy-to-change flags in the conn_process.m code that implement an alternative 'low-memory-load' version of this step.
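(For reference, if you prefer scripting over the GUI, disabling the voxel-to-voxel analyses could look roughly like the sketch below. This is only a minimal sketch assuming the conn_batch conventions; field names may differ across CONN versions, so please check them against the conn_batch help text.)

% Sketch: disable voxel-to-voxel analyses before re-running Denoising.
% Assumes conn_batch's Setup.analyses convention
% (1: ROI-to-ROI, 2: seed-to-voxel, 3: voxel-to-voxel).
clear batch;
batch.filename = fullfile(pwd, 'conn_project.mat');  % hypothetical project file name
batch.Setup.analyses = [1 2];    % omit 3 to skip the voxel-to-voxel step
batch.Setup.done = 1;
batch.Setup.overwrite = 'No';    % keep the existing Setup results
batch.Denoising.done = 1;        % then re-run the Denoising step
conn_batch(batch);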
Let me know
Best
Alfonso
Originally posted by Tom Mole:
We have unusually long EPI fMRI data files, each lasting just under 1 hour, for a relatively rare condition in only 3 subjects. It is a block design, with each subject having 4 ~1hr EPI runs. As population inferences are underpowered with only 3 subjects, we are combining all runs into a single first-level analysis with 12 sessions and not performing a second-level analysis. Due to the large file sizes, we previously ran into difficulties specifying and estimating the SPM.mat files for whole-brain contrasts; this was resolved by using the v7.3 switch in multiple .m file scripts, using SPM12b and Matlab R2014a (https://www.jiscmail.ac.uk/cgi-bin/webad...). In case the .mat files in conn similarly needed to be saved in the v7.3 format, I also did a search-and-replace across all .m files containing the string 'save(' and amended these commands to save in the v7.3 format. This didn't appear to help, though, so I reverted the changes and reinstalled conn.
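(For context, the edit described above amounts to adding the '-v7.3' flag to Matlab's save calls, along these lines. The file and variable names here are purely illustrative, not the actual names used in the conn code:)

% Original form: default v7 format, which cannot store variables over 2GB.
save('conn_project.mat', 'projectdata');
% Amended form: HDF5-based v7.3 format, which removes the 2GB-per-variable limit.
save('conn_project.mat', 'projectdata', '-v7.3');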
When using conn 13 to import settings from our previously estimated SPM.mat file, we got an error message in Matlab stating 'Index exceeds matrix dimensions'. When trying conn 14k, importing the design data resulted in a message stating that the SPM.mat file was of an 'unrecognised' format. After manually specifying the design, conn proceeded until the end of 'Denoising the functional data'. At this point the green progress bar appeared complete, with an ETA of 0 seconds, but never progressed and hung. No Matlab error was produced, and even after 16 hrs there was no progress, so the process needed cancelling. Looking at system performance stats, I note that Matlab was still consuming increasing amounts of RAM throughout this time.
Given the long scans, I assume it may be a file-size issue. Any advice on this would be really appreciated.
Many thanks,
Tom