Nov 15, 2016  10:11 AM | Neir Eshel
Preprocessing error, functional normalization
I'm getting the following error when preprocessing functional scans, during the functional normalization step. I'm trying to import raw functional data from a few dozen subjects, all with essentially identical scans, and this error pops up for about 1/3 of them. I haven't noticed any pattern that distinguishes this third from the other subjects. Occasionally the error stops the whole stream, whereas other times, if I wait longer, it will continue to work, eventually telling me that preprocessing has finished correctly. Anyone have any ideas? Is there a place to see the file names or extensions that each processing step is supposed to create? I'm not sure where the "meanesting_state" name is supposed to come from. Thank you so much!

------------------------------------------------
Error using conn_setup_preproc (line 979)
Error preparing files for normalization. Mean functional file C:\...\fMRI DATA\TimePoint_1\dn_p004_tp1_meanesting_state.nii not found

Error in conn (line 806)
ok=conn_setup_preproc('',varargin{2:end});
Error in conn_menumanager (line 119)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.17.a
SPM12 + DEM FieldMap MEEGtools
Matlab v.2015b
storage: 732.9Gb available
Nov 15, 2016  10:11 AM | Pravesh Parekh - National Institute of Mental Health and Neurosciences
RE: Preprocessing error, functional normalization
Hello,

Typically, the mean image is created during the preprocessing steps and "mean" is appended at the beginning of the file name (SPM style), not at the end, and certainly never in the middle! So in that sense, "meanesting_state.nii" is a strange thing to see. My first instinct was to assume that for this particular subject the "r" of "resting_state.nii" had gone missing, hence the whole problem, but since you say that it sometimes runs successfully, that gives me pause.

Another odd thing that I notice is that it is referring to the image dn_p004...nii. This is odd because you have reached the normalization stage; before that, other files would have been created (during, say, slice-timing correction, segmentation, etc.).

What preprocessing steps are you running? Are you running the Conn default pipeline or something different? Also, you mention that you are trying to import raw functional data from a few dozen subjects. Can you clarify how you are importing these data into Conn? Perhaps you are running a script?

Finally, it might also be a good idea to quickly check that there are no errors in the file names (in case you are using a script to load files). What is the name of your .nii file to start with?

Best
Pravesh
Nov 15, 2016  10:11 AM | Pravesh Parekh - National Institute of Mental Health and Neurosciences
RE: Preprocessing error, functional normalization
Also, you can find a list of most of the files that are created during preprocessing and other steps here: https://www.nitrc.org/forum/forum.php?thread_id=7219&forum_id=1144

Best
Pravesh


Nov 15, 2016  11:11 AM | Neir Eshel
RE: Preprocessing error, functional normalization
Thanks for such a quick response! I'm running the Conn default preprocessing pipeline. The functional input files are all named "dana_pXXX_tp1_resting_state.nii.gz", where XXX is a number from 001 to 999. I'm selecting the files directly in the Conn GUI; I just downloaded Conn, so I wanted to try everything manually first, before writing scripts. It seems like the program occasionally gets confused about the prefixes. I just ran it again for one of the subjects that failed before, and this time it was able to complete the full preprocessing pipeline, creating "...resting_state_uw", then "centering_meanu...", then "meanu...", then "u...", then "au...", then "y_meanu...", then "wmeanu...", then "wau...", then "swau...". Is there something wrong with the names? Thank you!
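The prefix chain listed above matches the usual SPM convention, in which each preprocessing step prepends a one-letter prefix to the file name. A minimal sketch in Python (for illustration only; the prefix meanings below are the standard SPM ones, and intermediate names such as "centering_meanu..." are CONN's own additions):

```python
# Standard SPM one-letter prefixes (illustration; CONN adds a few of its own):
steps = {
    'u': 'realign & unwarp',
    'a': 'slice-timing correction',
    'w': 'normalization (warp)',
    's': 'smoothing',
}

# Prefixes accumulate as the pipeline runs, newest on the left:
name = 'dana_p004_tp1_resting_state.nii'
for prefix in ['u', 'a', 'w', 's']:   # default pipeline order
    name = prefix + name
print(name)  # swaudana_p004_tp1_resting_state.nii
```

The final "swau..." name produced by this sketch matches the last file Neir reports seeing.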


Nov 15, 2016  11:11 AM | Pravesh Parekh - National Institute of Mental Health and Neurosciences
RE: Preprocessing error, functional normalization
Hi Eshel,

It is possible that in the latest Conn release the explicit naming of the files has been changed and made clearer. That would also explain why some names are appended as suffixes rather than prefixes (for some of the steps that Conn performs outside of SPM). So far, so good (I have not yet had the opportunity to try the latest Conn release!).

Now, as far as subject 004 goes (as in your previous post), there is clearly something going wrong. As you can see, "dn_p004_tp1_meanesting_state.nii" does not match any pattern at all! Your file name should be "dana_p004_tp1_resting_state.nii", not "dn*_meanesting*.nii". This looks like a bug (unless there was a problem in the name of the file itself).

The mean functional file (which Conn was unable to find in your previous post) should (most likely) have been named "meanaudana_p004_tp1_resting_state.nii". This would reflect the fact that the functional volume was subjected to "realign and unwarp" and "slice timing correction" (in that order), subsequent to which the mean functional image was created.
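The expected name Pravesh describes can be reconstructed mechanically: SPM prepends one prefix per step, and "mean" is prepended last when the mean image is written. A quick sketch in Python (for illustration; the file name follows the thread, and the prefix order is the standard SPM convention):

```python
# Reconstruct the expected mean-functional file name by stacking
# SPM-style prefixes (standard SPM convention; illustration only).
original = 'dana_p004_tp1_resting_state.nii'
after_unwarp = 'u' + original            # realign & unwarp  -> 'u' prefix
after_slicetiming = 'a' + after_unwarp   # slice-timing      -> 'a' prefix
mean_name = 'mean' + after_slicetiming   # mean image        -> 'mean' prefix
print(mean_name)  # meanaudana_p004_tp1_resting_state.nii
```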

I think you might have to wait for Dr. Alfonso to have a look at what's going wrong. In the meantime, you can try working with an older release (I assume you are using release 17?) to explore the features of Conn.


Best
Pravesh




Nov 15, 2016  06:11 PM | Alfonso Nieto-Castanon - McGovern Institute for Brain Research. MIT
RE: Preprocessing error, functional normalization
Hi Eshel and Pravesh

If you want to apply CONN's default preprocessing pipeline, first check in Setup.functional and Setup.structural that you have your raw/original functional (e.g. dana_p004_tp1_resting_state.nii) and anatomical volumes, respectively, and that you are including all steps in this pipeline (from realignment to smoothing). The error message that you describe could be due simply to running normalization on a functional dataset that has not yet been realigned (e.g. trying to run "functional normalization" on dana_p004_tp1_resting_state.nii instead of on audana_p004_tp1_resting_state.nii). If you have already performed realignment, slice-timing, etc., and want to continue the default pipeline from the "functional normalization" step onwards, then make sure that in the Setup.functional and Setup.structural tabs you have, respectively, the realigned/slice-timing-corrected functional data (i.e. audana_p004_tp1_resting_state.nii) and the normalized structural data (i.e. wc0*.nii).

Hope this helps
Alfonso

P.S. Regarding the strange/misleading name "dn_p004_tp1_meanesting_state.nii" in the error message: that is just a byproduct of CONN's failed attempts to find a potential match for the (in this case nonexistent) mean functional volume filename. In general, when looking for a potential "mean functional volume" filename, CONN searches for these patterns: for a filename of the form [PREFIX r BASENAME], a valid mean-functional-volume name is [PREFIX(minus 'a' or 's') mean BASENAME], and for a filename of the form [PREFIX u BASENAME], a valid mean-functional-volume name is [PREFIX(minus 'a' or 's') meanu BASENAME]. The "dn_p004_tp1_meanesting_state.nii" filename just represents the last of the candidate "mean functional volume" filenames that CONN tried: it broke down "dana_p004_tp1_resting_state.nii" as [dana_p004_tp1_]r[esting_state.nii] and applied the first rule.
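Alfonso's first rule reproduces the puzzling filename exactly. A small Python sketch of that rule (my reading of the pattern as described, not CONN's actual source code; splitting at the first 'r' is an assumption):

```python
def candidate_mean_name(name):
    """Apply rule 1: [PREFIX]r[BASENAME] -> [PREFIX minus 'a'/'s']mean[BASENAME].

    Sketch of the pattern described in the post, not CONN's actual code.
    """
    i = name.find('r')           # treat everything before the first 'r' as PREFIX
    if i < 0:
        return None
    prefix, basename = name[:i], name[i + 1:]
    # drop the slice-timing ('a') and smoothing ('s') markers from the prefix
    prefix = prefix.replace('a', '').replace('s', '')
    return prefix + 'mean' + basename

print(candidate_mean_name('dana_p004_tp1_resting_state.nii'))
# dn_p004_tp1_meanesting_state.nii -- exactly the name in the error message
```

Stripping the 'a' characters from the prefix "dana_p004_tp1_" is what turns "dana" into the mysterious "dn".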



Nov 18, 2016  07:11 PM | Neir Eshel
RE: Preprocessing error, functional normalization
Thanks for all your help! It turns out that there was nothing wrong with
the raw volumes, and I did include all the default pipeline steps in the
default order. The trick was to run the preprocessing separately for my two
sessions of functional imaging. (I had been attempting to run preprocessing
for both sessions at the same time). When I did it in two batches, there
were no errors. Not quite sure why, but I thought I'd reply to let you know
how it ended up going! Thanks.

Aug 7, 2018  02:08 PM | Gunes Sevinc - Harvard University
RE: Preprocessing error, functional normalization
Thank you for the info! I had the same problem and was able to solve it by running each session separately.