processing-scripts > Not enough data in global.1D
Jan 24, 2012  03:01 AM | Andreas Horn
Not enough data in global.1D
Hello everybody,

First of all, thank you very much for the great work on fcon 1000 and the preprocessing scripts.
I would like to run the scripts on my own, similar dataset so I can compare it with some of your datasets.

The scripts worked fine on some of your datasets, so everything seems to be installed correctly.
But when I try to run them on my own data, an exception is thrown because "global.1D" does not contain enough data.

I checked the forum and have already fixed the 3dSkullStrip Debian incompatibility and told it to use the original voxel size
(in short, I added the flags -no_use_edge and -orig_vol to the 3dSkullStrip call in preprocess_anat.sh).

Still, I get this error. The full log of my run is below.

Thank you so much for any suggestions!

Andy


preprocessing 12
--------------------------------------
!!!! PREPROCESSING ANATOMICAL SCAN!!!!
--------------------------------------
deobliquing 12 anatomical
++ 3drefit: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset mprage.nii.gz
 + loading and re-writing entire dataset mprage.nii.gz
++ 3drefit processed 1 datasets
Reorienting 12 anatomical
skull stripping 12 anatomical
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
++ Output dataset ./mprage_brain.nii.gz
---------------------------------------
!!!! PREPROCESSING FUNCTIONAL SCAN !!!!
---------------------------------------
Dropping first TRs
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
*+ WARNING:   If you are performing spatial transformations on an oblique dset,
  such as rest.nii.gz,
  or viewing/combining it with volumes of differing obliquity,
  you should consider running:
     3dWarp -deoblique
  on this and  other oblique datasets in the same session.
 See 3dWarp -help for details.
++ Oblique dataset:rest.nii.gz is 10.484265 degrees from plumb.
++ Output dataset ./rest_dr.nii.gz
Deobliquing 12
++ 3drefit: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset rest_dr.nii.gz
 + loading and re-writing entire dataset rest_dr.nii.gz
++ 3drefit processed 1 datasets
Reorienting 12
Motion correcting 12
++ 3dTstat: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_ro_mean.nii.gz
++ 3dvolreg: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: RW Cox
++ Coarse del was 10, replaced with 3
++ Max displacement in automask = 0.25 (mm) at sub-brick 371
Skull stripping 12
++ 3dAutomask: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: Emperor Zhark
++ Loading dataset rest_mc.nii.gz
++ Forming automask
 + Fixed clip level = 279.744934
 + Used gradual clip level = 264.299774 .. 296.982086
 + Number voxels above clip level = 44079
 + Clustering voxels ...
 + Largest cluster has 43454 voxels
 + Clustering voxels ...
 + Largest cluster has 42786 voxels
 + Filled   461 voxels in small holes; now have 43247 voxels
 + Clustering voxels ...
 + Largest cluster has 43246 voxels
 + Clustering non-brain voxels ...
 + Clustering voxels ...
 + Largest cluster has 91922 voxels
 + Mask now has 43246 voxels
++ Dilating automask
 + Clustering voxels ...
 + Largest cluster has 85658 voxels
++ 49510 voxels in the mask [out of 135168: 36.63%]
++ first   9 x-planes are zero [from R]
++ last    8 x-planes are zero [from L]
++ first   0 y-planes are zero [from P]
++ last    4 y-planes are zero [from A]
++ first   0 z-planes are zero [from I]
++ last    0 z-planes are zero [from S]
++ Output dataset ./rest_mask.nii.gz
++ CPU time = 0.000000 sec
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_ss.nii.gz
Getting example_func for registration for 12
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
++ Output dataset ./example_func.nii.gz
Smoothing 12
Grand-mean scaling 12
Band-pass filtering 12
++ 3dFourier: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
Removing linear and quadratic trends for 12
++ 3dTstat: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_filt_mean.nii.gz
++ 3dDetrend: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_pp.nii.gz
Generating mask of preprocessed data for 12
------------------------------
!!!! RUNNING REGISTRATION !!!!
------------------------------
/bioimaging/andreash/topography/fcon_scripts/tissuepriors/3mm/
------------------------------
!!!! RUNNING SEGMENTATION !!!!
------------------------------
Segmenting brain for 12
Creating global mask
++ 3dcopy: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
Registering 12 csf to native (functional) space
Smoothing 12 csf
Registering 12 csf to standard space
Finding overlap between 12 csf and prior
Registering 12 csf back to native space
Threshold and binarize 12 csf probability map
Mask csf image by 12 functional
Registering 12 wm to native (functional) space
Smoothing 12 wm
Registering 12 wm to standard space
Finding overlap between 12 wm and prior
Registering 12 wm back to native space
Threshold and binarize 12 wm probability map
Mask wm image by 12 functional
--------------------------------------------
!!!! RUNNING NUISANCE SIGNAL REGRESSION !!!!
--------------------------------------------
Splitting up 12 motion parameters
Extracting global signal for 12
++ 3dmaskave: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
+++ 49510 voxels survive the mask
Extracting signal from csf for 12
++ 3dmaskave: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
+++ 319 voxels survive the mask
Extracting signal from white matter for 12
++ 3dmaskave: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
+++ 2113 voxels survive the mask
Modifying model file
Running feat model
Not enough data in /bioimaging/andreash/topography/12/func/nuisance/global.1D
++ 49510 voxels in mask
Running film to get residuals
Log directory is: /bioimaging/andreash/topography/12/func/nuisance/stats


An exception has been thrown
Unable to open /bioimaging/andreash/topography/12/func/nuisance/nuisance.mat
++ 3dTstat: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: KR Hammett & RW Cox
** FATAL ERROR: Can't open dataset /bioimaging/andreash/topography/12/func/nuisance/stats/res4d.nii.gz
** Program compile date = Jun 22 2011
++ 3dcalc: AFNI version=AFNI_2011_05_26_1457 (Jun 22 2011) [64-bit]
++ Authored by: A cast of thousands
** FATAL ERROR: can't open dataset /bioimaging/andreash/topography/12/func/nuisance/stats/res4d.nii.gz
** Program compile date = Jun 22 2011
** ERROR (nifti_image_read): failed to find header file for '/bioimaging/andreash/topography/12/func/rest_res'
** ERROR: nifti_image_open(/bioimaging/andreash/topography/12/func/rest_res): bad header info
ERROR: failed to open file /bioimaging/andreash/topography/12/func/rest_res
ERROR: Could not open image /bioimaging/andreash/topography/12/func/rest_res
Image Exception : #22 :: Failed to read volume /bioimaging/andreash/topography/12/func/rest_res
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/bioimaging/andreash/topography/fcon_scripts/5_nuisance.sh: line 95: 10368 Aborted                 flirt -ref ${reg_dir}/standard -in ${func_dir}/${rest}_res -out ${func_dir}/${rest}_res2standard -applyxfm -init ${reg_dir}/example_func2standard.mat -interp trilinear
+ + + + +
cat: rest: No such file or directory
Jan 24, 2012  04:01 AM | Maarten Mennes
RE: Not enough data in global.1D
Hi Andreas,

Thank you for using the scripts! Great that you're applying them to your own data.

Re the global.1D file: the problem is likely not your mask, but rather a mismatch between the number of volumes in your timeseries and the number of timepoints found in global.1D. How many lines are in global.1D? How many volumes are in your timeseries? (Most likely you are off by 1 in specifying n_vols; if that's not the problem, something may be going wrong when writing the global.1D file.)
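
In case it helps, here is a minimal sketch of that check. The volume count and the global.1D file below are synthetic so the commands run anywhere; on real data the volume count would instead come from something like AFNI's `3dinfo -nv` on the preprocessed timeseries, and the path would point at your nuisance directory.

```shell
# Sanity check: global.1D should have exactly one row per volume that
# survives TR dropping. A synthetic regressor file stands in for the real one.
n_vols_kept=100                        # hypothetical: volumes after dropping TRs
seq 1 "$n_vols_kept" > /tmp/global.1D  # stand-in for the real global.1D
n_rows=$(wc -l < /tmp/global.1D)
if [ "$n_rows" -ne "$n_vols_kept" ]; then
    echo "mismatch: $n_rows rows in global.1D vs $n_vols_kept volumes" >&2
else
    echo "global.1D OK: $n_rows rows"
fi
```

If the two numbers disagree (or global.1D is empty), feat_model will refuse to build the nuisance design matrix, which is exactly the "Not enough data" failure above.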

Best,
Maarten
Jan 24, 2012  07:01 AM | Andreas Horn
RE: Not enough data in global.1D
Dear Maarten,

thank you very much for your quick response!
This was already really helpful. The global.1D (as well as the csf.1D and wm.1D) files are empty (!).
The mc*.1D files, in contrast, contain 389 lines each. I have 400 timepoints and exclude the first 11 by specifying these variables in the batch_list.txt file:

'11 399 400 2.01'

... so my first timepoint is 12 (counting starts at 0), the last one is 400 -> so it seems there is one extra line in the mc*.1D files?

If I have 400 volumes in my images, should the entry in the batch_list.txt file be, e.g., '0 389 399 TR'?
Or is '0 399 400 TR' correct?

Thanks so much for your support,

Best, Andy
Jan 24, 2012  07:01 AM | Maarten Mennes
RE: Not enough data in global.1D
Hi Andy,

If you have 400 images and are deleting the first 11 your entry in batch_list.txt should be:

'11 399 389 2.01'

i.e., you start with 400 volumes and remove 11, that leaves you with 389 volumes...
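
The arithmetic can be sketched as follows; the field order (first kept index, last index, remaining volume count, TR) is my reading of the example above, not something documented here.

```shell
# Build the batch_list.txt entry from the raw volume count, the number of
# dropped TRs, and the TR (field meanings inferred from the example above).
n_vols=400                    # volumes in the raw timeseries
n_drop=11                     # initial TRs to discard
tr=2.01
first=$n_drop                 # first kept index, 0-based -> 11
last=$((n_vols - 1))          # last index, 0-based -> 399
n_kept=$((n_vols - n_drop))   # volumes remaining -> 389
echo "'$first $last $n_kept $tr'"
```

This prints '11 399 389 2.01', matching the entry above. The mistake in the original entry was putting the raw count (400) in the third field instead of the count after dropping (389).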

It is weird, though, that the global.1D file is completely empty. Is the global_mask.nii.gz file OK? If you open it, does it cover the full brain?

Maarten
Jan 24, 2012  09:01 AM | Andreas Horn
RE: Not enough data in global.1D
Dear Maarten,

This alone fixed it - thank you so much. I am now running the script and it processed the first subject well, so I assume everything will work out now.
I guess it might not even have started writing out the nuisance parameters because of some prior error? Well, most importantly, it works now.

Thank you very much for your help again!

Yours, Andy