Jan 6, 2011  02:01 PM | Carolina Valencia
Starting with this project
Hi everybody:
I´m trying to start with this project but I have many problems:
I´m following these instructions http://fcon_1000.projects.nitrc.org/indi... but I can´t download the VM.000.006
Moreover I´m very lost because I´m following several web pages but I don´t know where to start
http://www.nitrc.org/plugins/mwiki/index...
http://www.xnat.org/XNAT+Virtual+Machine
http://www.nitrc.org/plugins/mwiki/index...

Somebody can guide me please.

Thanks a lot,
Best regards,
Carolina
Jan 6, 2011  03:01 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

Sorry you're having problems getting started with the project. I'm guessing it might take a few steps to get you going. First of all, regarding the VM.000.006 link on http://fcon_1000.projects.nitrc.org/indi...: is it possible that you are not logged in to your NITRC account when you click download? So open your browser, log in to NITRC, and then navigate to the above page and try again.

http://www.nitrc.org/plugins/mwiki/index... --> after you have downloaded the VM, you should be able to follow these instructions. Also, download a dataset of your choice (DICOM or NIFTI) to go along with it.

Keep in mind that the LITE releases do not use XNAT or the VM at all, so if that doesn't work out for you, you can still get the data that way.

http://www.nitrc.org/plugins/mwiki/index... --> this page explains how to use the processing scripts that you can find on the connectomes website. They can be used once you have your data organized.

Hope this already helps a bit, please do not hesitate to come back to the forums if you get stuck somewhere!
Best,
Maarten
Jan 7, 2011  05:01 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Thanks a lot for your support.
Now I successfully downloaded VM.000.006 (in Ubuntu, because in Windows XP, for some reason, the downloaded and unzipped file shows an error saying it was corrupted) and followed the instructions explained on INDI#Getting_INDI_to_work up to the last step, Adding data to your INDI release. I downloaded NKI.archive.1-5.DICOM, so I followed the instructions for INDI-XNAT,
and at this step, ./uploadXmlToXNAT.pl ~/Desktop/host/NKI_archive_1-5_.DICOM, I got an error: No such file or directory at ./uploadXmlToXNAT.pl line 27.
In the host folder I have NKI.archive.1-5.DICOM.tar, which I uncompressed, and the folder NKI_archive_1-5_.DICOM was created.

By the way, I have two doubts: if I'm already using Ubuntu, why do I have to use VirtualBox with Ubuntu again to mount VM.000.006? And if I want to do this in Windows XP, how can I uncompress VM.000.006 without errors?

Best Regards,

Carolina
Jan 11, 2011  03:01 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

I did this step:

Unzip the INDI Virtual Machine (tar -xzf VM.000.006.tgz). Be sure to do this in a folder that will suit your project (e.g. /home/me/INDI) and be aware that this folder will grow considerably as you add data. After unzipping you should see a file called INDI.vdi and a folder called xnat. This is a shared folder between your computer and the virtual machine that contains the configuration files for xnat.

The files resulting from the unzip step were a folder called NKI.001.001.VM, another folder called PAX HEADER, and the file INDI.vdi. I don't find anything called xnat.


Best Regards,

Carolina
Jan 11, 2011  04:01 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

sorry for not getting back earlier, we got caught up in the HBM abstracts.

1) In your previous message you said that this command gave you an error:
./uploadXmlToXNAT.pl ~/Desktop/host/NKI_archive_1-5_.DICOM

It seems you have a typo there; there should be no '.' in front of DICOM. The best way of circumventing such problems is to use 'tab' to autocomplete words for you. Type the beginning of the script or path and then hit 'tab', and it will autocomplete or suggest possible completions. That way you can be sure a directory or file is present (see the example after point 2 below).

2) Hmm, that almost looks like you have an old/incorrect version of the VM. Could you redownload VM.000.006.tgz and retry?
http://www.nitrc.org/frs/downloadlink.ph...
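
On point 1, here is a small illustration of checking the path before running the upload script (this assumes uploadXmlToXNAT.pl sits in your current directory; use whatever folder name 'ls' actually shows):

ls ~/Desktop/host/
# start typing the folder name and hit 'tab' to let the shell complete it, for example:
./uploadXmlToXNAT.pl ~/Desktop/host/NKI_archive_1-5_DICOM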

Sorry you ran into these troubles!
Maarten
Jan 12, 2011  03:01 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Finally I could uncompress the correct files and follow all the steps in http://www.nitrc.org/plugins/mwiki/index...
But again, I'm lost with http://www.nitrc.org/plugins/mwiki/index...
I don't know how to do the batch processing; this step is done with the virtual machine, right?
I already have the scripts and the data releases listed under the column 'DICOM, For use with Virtual Machine', but I don't know exactly where to begin.

Thanks for your patience,

Carolina
Jan 14, 2011  03:01 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

Great that you succeeded with the installation!

The scripts are intended for batch processing the datasets in preparation for functional connectivity analyses (in fact they can also do these for you). Once you have the data set up in XNAT, you can select subjects you want to process (e.g., based on IQ or Age), and export them again. Then you can apply the scripts to these datasets.

For practice, I would suggest downloading one of the Lite releases, as these already come in an easy structure (searching for variables of interest is a bit harder, though, as you have to search through a spreadsheet).

Finally, the scripts are intended for use with NIFTI files, so if you downloaded DICOM (which is perfectly fine), you will have to convert the datasets to .nii first. You can use dcm2nii from mricron (also available on nitrc) for that.

The scripts are bash scripts. You can run them either within the virtual machine or within any UNIX (Mac, Linux, or Cygwin for Windows) environment; you just need access to your data. That is, for instance, why we have you make the host folder during setup, so that data from the virtual machine is also available on your 'actual' computer.
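
For example, a minimal conversion sketch (this assumes dcm2nii is installed and that its -o option points to an output directory; check dcm2nii -help for your version, and note the paths are just placeholders):

# convert one subject's DICOM folder to NIFTI with mricron's dcm2nii
dcm2nii -o /path/to/output/sub001 /path/to/dicom/sub001
# depending on your dcm2nii settings the output may be .nii or .nii.gz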

Hope this helps,
Maarten
Jan 17, 2011  03:01 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Thank you for your help.
As I said before, I downloaded all the datasets from the DICOM column. Within those folders I only find files in xml format, which are not read by the dcm2nii program. Should I do something extra first?
When you refer to Lite releases in the third paragraph, are you talking about the datasets that I have downloaded?

Sorry for my poor knowledge related to the project.

Best,

Carolina
Jan 17, 2011  03:01 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

There are two versions of the project: one that comes with the XNAT databasing system and one that comes without XNAT. The information and datasets included in both are exactly the same; it is the way things are stored and used that is different.

If you downloaded the XNAT version you can use the XNAT system to search through the different datasets you downloaded and look at the various phenotypic information included. To start that you would double-click the INDI icon on the desktop of the VM. If you find datasets that you would like to analyze (e.g., all children between 7-18y), then you could select them and select 'download images' for those. That way you would download their scans to a new folder where you can start analyzing them. This could be done with the fcon_1000 scripts, yet those require nifti images.

The xml files you see in the folders you downloaded from the site are what gets added to XNAT if you run the uploadXML script. That actually adds all of that subject's information and imaging data to XNAT. From within XNAT you would then have to 'download' them again (as explained above) to be able to process the images.

Alternatively you have the LITE releases, which do not require XNAT or the VM. They are just a collection of folders/images and a spreadsheet that contains the phenotypic information. Here, you are responsible for storing them however you'd like, and you would have to use e.g., Excel to search through the spreadsheet. You can, however, immediately start processing the scans with the fcon_1000 scripts, provided you have the nifti version. We also provide the DICOM version for users who prefer DICOM over nifti and maybe already have their own processing path set up. With that said, you don't need to use the fcon_1000 scripts. If you already have a processing path in place, then by all means use that one. We made the scripts available for users who don't have a processing path in place, who would like to see what steps we applied, etc. They are free to use and modify in whatever way you want...

Maarten
Jan 27, 2011  04:01 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Now I'm trying to process my own data. I have the anatomical and functional images, and I edited batch_process.sh and SEED_list.txt without problems (I guess), but I'm stuck editing batch_list.txt. I don't understand what exactly I have to edit there, and how I know the timepoints that it requests.
Do you have an example with images that follows all the steps explained in the wiki?

Thanks a lot,

Best regards,

Carolina
Jan 31, 2011  03:01 PM | Maarten Mennes
RE: Starting with this project
Dear Carolina,

Great that you are also applying the scripts to your own data. There is an explanation on the wiki on how to use the scripts. http://www.nitrc.org/plugins/mwiki/index...


The entries for batch list are:

1. The directory where your data are: e.g., /home/carolina/my_data

2. full path to a subject list that contains 1 line per subject: e.g., /home/carolina/my_data/scripts/subjects.txt
with subject.txt looking like
sub001
sub002

3. what volume to start with: 0 if you do not want to remove volumes from the beginning of your run. Most of the time you want to remove some volumes from the beginning of the timeseries to allow for the magnetic field to stabilize; you would remove 4 or 5 volumes. In the fcon datasets we already did that, so you don't have to remove extra timepoints.

4. what volume to end with: the last volume - 1 (e.g., if you have 200 volumes, this number would be 199).

5. the number of volumes to include = (what volume to end with) - (what volume to start with) + 1; e.g., 199-0+1=200

6. the TR of your functional timeseries

You can see the number of volumes in your timeseries by using e.g., "3dinfo scan.nii.gz". 3dinfo is an AFNI command that gives you some info about your scan.
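
Putting that together, here is a hypothetical example (the paths, the 200-volume run, and the 2.0 s TR are placeholders; substitute your own values):

# check how many volumes and what TR a functional run has (AFNI)
3dinfo /home/carolina/my_data/sub001/func/rest.nii.gz

# corresponding line in batch_list.txt:
# analysis dir, subject list, first vol, last vol, n vols, TR
/home/carolina/my_data /home/carolina/my_data/scripts/subjects.txt 0 199 200 2.0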

Maarten
Feb 1, 2011  02:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Thank you for your support. Now I have doubts about a few details.
My folder layout is:
for the scripts: /home/cvalencia/Documentos/scripts
Subjects: /home/cvalencia/Documentos/Subj001
anatomical folder: /home/cvalencia/Documentos/Subj001/anat
functional folder: /home/cvalencia/Documentos/Subj001/func1

So my batch_list is a line with /home/cvalencia/Documentos /home/cvalencia/Documentos/scripts/subjects.txt 0 196 197 2.7
and my subjects.txt is a line with Subj001

Now, in a terminal in the scripts folder, I try to execute batch_process.sh and it shows the error: order not found

Am I missing something?

Thanks in advance,

Best regards,

Carolina
Feb 1, 2011  05:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

If you did not modify the scripts, it is best to keep the functional folder as

/home/cvalencia/Documentos/Subj001/func

If you need that changed, you will have to go into the scripts, look for func_dir, and adjust accordingly.
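
If you do want to keep a different folder name, one way to find the places to change is a plain grep over the scripts (func_dir is the variable mentioned above):

cd /home/cvalencia/Documentos/scripts
grep -n "func_dir" *.sh    # lists every script and line where func_dir is set or used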

Can you copy and paste the exact error you get? This one is a bit cryptic :)

Maarten
Feb 1, 2011  05:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Sorry, but I don't understand your last email. I edited batch_process.sh (copied and pasted below) according to my data, but I don't find where to edit my func folder.
About the error when trying to execute batch_process.sh:
cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$ batch_process.sh
batch_process.sh: orden no encontrada
In English I think it is 'order not found'.
I know it sounds weird and I have no idea; moreover, I'm a beginning user of Ubuntu too.


####################
## scripts directory
####################
## directory where you put the scripts downloaded from http://www.nitrc.org/projects/fcon_1000
## e.g. /home/fcon_1000/scripts
scripts_dir=/home/cvalencia/Documentos/scripts


###########################
## what do you want to run?
###########################
## 1 - general RSFC preprocessing
## 2 - single-subject RSFC (requires general preprocessing to be completed)
## 3 - ALFF/fALFF (default frequency band of interest is 0.01-0.1Hz)
## 4 - Dual Regression
what_to_do=1


#######################
## some important names
#######################
## anatomical scan you want to use (no extension)
anat_name=20100710_12333209971816fl3D1x1x1sagSUBJECTNC0001s003a001
## resting-state scan you want to use (no extension)
rest_name=20100710_12333209968684ep2dboldDRAANA2700SUBJECTNC0001s004a001


################################################
## Extra parameters needed to run postprocessing
################################################
## image you want to use to calculate RSFC for.
## This is usually the image containing resting-state scan after regressing out the nuisance parameters
postprocessing_image=rest_res.nii.gz
## path to list of seeds you want to calculate RSFC for e.g. /home/fcon_1000/scripts/my_seeds.txt
## DEFAULT path = within the scripts directory specified above
## the list includes the full path for each seed e.g. /home/my_seeds/connectomes_seed_3mm.nii.gz
seed_list=${scripts_dir}/SEED_LIST.txt
## image you want to use for postprocessing in MNI space
## This image is used in the dual regression step. Dual regression is applied to this image.
postprocessing_mni_image=rest_res2standard.nii.gz
## mask used for Dual Regression; default = MNI based mask
## the resolution has to be consistent with the DR templates
mask=${FSLDIR}/data/standard/MNI152_T1_3mm_brain_mask.nii.gz
## template used for Dual Regression
## default = the metaICA image, including 20 ICA components as derived in Biswal et al. 2010, PNAS
## running Dual Regression is tuned to run for 20 components you will have to edit 8_singlesubjectDR.sh if you want to use a different template
DR_template=${scripts_dir}/templates/metaICA.nii.gz


#######################################
## Standard brain used for registration
## include the /full/path/to/it
#######################################
standard_brain=${FSLDIR}/data/standard/MNI152_T1_3mm_brain.nii.gz


#####################################################################################
## batch processing parameter list
## DO NOT FORGET TO EDIT batch_list.txt itself to include the appropriate directories
#####################################################################################
## path to batch_list.txt, e.g., /home/fcon_1000/scripts/my_batch_list.txt.
## DEFAULT = within the scripts directory specified above
## This file contains the sites (and their parameters) you want to process.
## batch_list.txt contains 1 line per site listing:
## - analysis directory, i.e. full/path/to/site
## - subjectlist, i.e. full/path/to/site/site_subjects.txt
## - first timepoint, default = 0 (start with the 1st volume)
## - last timepoint = number of timepoints - 1 (since count starts at 0)
## - number of timepoints in the timeseries
## - TR
batch_list=${scripts_dir}/batch_list_cv.txt
Feb 1, 2011  06:02 PM | Maarten Mennes
RE: Starting with this project
Dear Carolina,

With editing func1 to func I meant that you would rename your func1 folder to func. That will be easiest for now.

Since you are an Ubuntu/Linux beginner, I would suggest that you go through some UNIX/Linux tutorials first. This will make understanding the scripts and how they work much easier. You can find some through Google. Here is a good example:

http://www.ee.surrey.ac.uk/Teaching/Unix...

The error you are running into means "command not found" and is caused by the fact that you just typed "batch_process.sh". Instead you will have to type "./batch_process.sh". That is because the command line needs to know where the script you want to run is located. In this case it is where you are, which in UNIX is indicated by the '.'

I would also suggest renaming your long scan names (e.g., 20100710_12333209971816fl3D1x1x1sagSUBJECTNC0001s003a001) to mprage and rest or so. This will make things a lot easier later on.
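
For example (folder and scan names copied from your earlier messages; I'm assuming the files have a .nii.gz extension, so adjust if yours differ):

cd /home/cvalencia/Documentos/Subj001
mv func1 func    # the scripts expect a folder called 'func'
mv anat/20100710_12333209971816fl3D1x1x1sagSUBJECTNC0001s003a001.nii.gz anat/mprage.nii.gz
mv func/20100710_12333209968684ep2dboldDRAANA2700SUBJECTNC0001s004a001.nii.gz func/rest.nii.gz
# then set anat_name=mprage and rest_name=rest in batch_process.sh and run it from the scripts folder:
cd /home/cvalencia/Documentos/scripts
./batch_process.sh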

Hang in there, practice makes perfect!
Maarten
Feb 1, 2011  07:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

I followed all your suggestions; now I can run it, but a lot of errors appear in the terminal... I tried to figure it out by myself but I don't know how to proceed.
First I have a doubt: where are the postprocessing_image=rest_res.nii.gz file and the postprocessing_mni_image=rest_res2standard.nii.gz file? I thought those files were the outputs of the preprocessing step, but now I think this is related to the problems in the logfile.
I attached the logfile; some errors are in Spanish, sorry for that...

Thanks,

Carolina
Attachment: logfile
Feb 2, 2011  05:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

Since you are not getting the output files, something is going wrong during the processing of the images. Can you attach that logfile again? It doesn't seem to be attached... Or copy and paste it?

Maarten
Feb 2, 2011  06:02 PM | Carolina Valencia
RE: Starting with this project
The logfile is copied and pasted below:


cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$ ./batch_process.sh
/home/cvalencia/Documentos + /home/cvalencia/Documentos/scripts/subjects.txt + 0 + 196 + 197 + 2.7
preprocessing Subj001
--------------------------------------
!!!! PREPROCESSING ANATOMICAL SCAN!!!!
--------------------------------------
deobliquing Subj001 anatomical
/home/cvalencia/Documentos/scripts/1_anatpreproc.sh: línea 36: 3drefit: orden no encontrada
Reorienting Subj001 anatomical
/home/cvalencia/Documentos/scripts/1_anatpreproc.sh: línea 40: 3dresample: orden no encontrada
skull stripping Subj001 anatomical
/home/cvalencia/Documentos/scripts/1_anatpreproc.sh: línea 45: 3dSkullStrip: orden no encontrada
/home/cvalencia/Documentos/scripts/1_anatpreproc.sh: línea 46: 3dcalc: orden no encontrada
---------------------------------------
!!!! PREPROCESSING FUNCTIONAL SCAN !!!!
---------------------------------------
Dropping first TRs
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 49: 3dcalc: orden no encontrada
Deobliquing Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 53: 3drefit: orden no encontrada
Reorienting Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 57: 3dresample: orden no encontrada
Motion correcting Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 61: 3dTstat: orden no encontrada
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 62: 3dvolreg: orden no encontrada
Skull stripping Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 66: 3dAutomask: orden no encontrada
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 67: 3dcalc: orden no encontrada
Getting example_func for registration for Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 71: 3dcalc: orden no encontrada
Smoothing Subj001
** ERROR (nifti_image_read): failed to find header file for 'rest_ss'
** ERROR: nifti_image_open(rest_ss): bad header info
Error: failed to open file rest_ss
Cannot open volume rest_ss for reading!
Grand-mean scaling Subj001
** ERROR (nifti_image_read): failed to find header file for 'rest_sm'
** ERROR: nifti_image_open(rest_sm): bad header info
Error: failed to open file rest_sm
Cannot open volume rest_sm for reading!
Band-pass filtering Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 83: 3dFourier: orden no encontrada
Removing linear and quadratic trends for Subj001
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 87: 3dTstat: orden no encontrada
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 88: 3dDetrend: orden no encontrada
/home/cvalencia/Documentos/scripts/2_funcpreproc.sh: línea 89: 3dcalc: orden no encontrada
Generating mask of preprocessed data for Subj001
** ERROR (nifti_image_read): failed to find header file for 'rest_pp'
** ERROR: nifti_image_open(rest_pp): bad header info
Error: failed to open file rest_pp
Cannot open volume rest_pp for reading!
------------------------------
!!!! RUNNING REGISTRATION !!!!
------------------------------
mkdir: no se puede crear el directorio «/home/cvalencia/Documentos/Subj001/reg»: El archivo ya existe
cp: no se puede efectuar «stat» sobre «/home/cvalencia/Documentos/Subj001/anat/mprage_brain.nii.gz»: No existe el fichero o el directorio
cp: no se puede efectuar «stat» sobre «/home/cvalencia/Documentos/Subj001/func/example_func.nii.gz»: No existe el fichero o el directorio
** ERROR (nifti_image_read): failed to find header file for 'highres'
** ERROR: nifti_image_open(highres): bad header info
Error: failed to open file highres
ERROR: Could not open image highres
Image Exception : #22 :: Failed to read volume highres
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/3_registration.sh: línea 57: 5581 Abortado flirt -ref highres -in example_func -out example_func2highres -omat example_func2highres.mat -cost corratio -dof 6 -interp trilinear
Could not open matrix file example_func2highres.mat
Cannot read input-matrix
** ERROR (nifti_image_read): failed to find header file for 'highres'
** ERROR: nifti_image_open(highres): bad header info
Error: failed to open file highres
ERROR: Could not open image highres
Image Exception : #22 :: Failed to read volume highres
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/3_registration.sh: línea 63: 5583 Abortado flirt -ref standard -in highres -out highres2standard -omat highres2standard.mat -cost corratio -searchcost corratio -dof 12 -interp trilinear
Could not open matrix file highres2standard.mat
Cannot read input-matrix
Could not open matrix file example_func2highres.mat
Cannot read input-matrix
** ERROR (nifti_image_read): failed to find header file for 'example_func'
** ERROR: nifti_image_open(example_func): bad header info
Error: failed to open file example_func
ERROR: Could not open image example_func
Image Exception : #22 :: Failed to read volume example_func
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/3_registration.sh: línea 71: 5586 Abortado flirt -ref standard -in example_func -out example_func2standard -applyxfm -init example_func2standard.mat -interp trilinear
Could not open matrix file example_func2standard.mat
Cannot read input-matrix
/home/cvalencia/Documentos/scripts/tissuepriors/3mm/
------------------------------
!!!! RUNNING SEGMENTATION !!!!
------------------------------
mkdir: no se puede crear el directorio «/home/cvalencia/Documentos/Subj001/segment»: El archivo ya existe
Segmenting brain for Subj001
** ERROR (nifti_image_read): failed to find header file for 'mprage_brain'
** ERROR: nifti_image_open(mprage_brain): bad header info
Error: failed to open file mprage_brain
ERROR: Could not open image mprage_brain
Image Exception : #22 :: Failed to read volume mprage_brain.nii.gz
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 55: 5594 Abortado fast -t 1 -g -p -o segment ${anat}_brain.nii.gz
Creating global mask
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 59: 3dcopy: orden no encontrada
Registering Subj001 csf to native (functional) space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/reg/example_func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/reg/example_func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/reg/example_func
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/reg/example_func
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/reg/example_func
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 64: 5596 Abortado flirt -in ${anat_dir}/segment_prob_0 -ref ${reg_dir}/example_func -applyxfm -init ${reg_dir}/highres2example_func.mat -out ${segment_dir}/csf2func
Smoothing Subj001 csf
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/csf2func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/csf2func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/csf2func
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/csf2func for reading!
Registering Subj001 csf to standard space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/csf_sm'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/csf_sm): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/csf_sm
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/segment/csf_sm
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/segment/csf_sm
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 70: 5598 Abortado flirt -in ${segment_dir}/csf_sm -ref ${reg_dir}/standard -applyxfm -init ${reg_dir}/example_func2standard.mat -out ${segment_dir}/csf2standard
Finding overlap between Subj001 csf and prior
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/csf2standard'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/csf2standard): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/csf2standard
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/csf2standard for reading!
Registering Subj001 csf back to native space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/reg/example_func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/reg/example_func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/reg/example_func
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/reg/example_func
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/reg/example_func
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 76: 5600 Abortado flirt -in ${segment_dir}/csf_masked -ref ${reg_dir}/example_func -applyxfm -init ${reg_dir}/standard2example_func.mat -out ${segment_dir}/csf_native
Threshold and binarize Subj001 csf probability map
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/csf_native'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/csf_native): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/csf_native
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/csf_native for reading!
Mask csf image by Subj001 functional
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/csf_bin'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/csf_bin): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/csf_bin
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/csf_bin for reading!
Registering Subj001 wm to native (functional) space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/reg/example_func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/reg/example_func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/reg/example_func
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/reg/example_func
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/reg/example_func
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 87: 5603 Abortado flirt -in ${anat_dir}/segment_prob_2 -ref ${reg_dir}/example_func -applyxfm -init ${reg_dir}/highres2example_func.mat -out ${segment_dir}/wm2func
Smoothing Subj001 wm
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/wm2func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/wm2func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/wm2func
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/wm2func for reading!
Registering Subj001 wm to standard space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/wm_sm'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/wm_sm): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/wm_sm
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/segment/wm_sm
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/segment/wm_sm
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 93: 5605 Abortado flirt -in ${segment_dir}/wm_sm -ref ${reg_dir}/standard -applyxfm -init ${reg_dir}/example_func2standard.mat -out ${segment_dir}/wm2standard
Finding overlap between Subj001 wm and prior
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/wm2standard'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/wm2standard): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/wm2standard
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/wm2standard for reading!
Registering Subj001 wm back to native space
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/reg/example_func'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/reg/example_func): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/reg/example_func
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/reg/example_func
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/reg/example_func
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/4_segment.sh: línea 99: 5607 Abortado flirt -in ${segment_dir}/wm_masked -ref ${reg_dir}/example_func -applyxfm -init ${reg_dir}/standard2example_func.mat -out ${segment_dir}/wm_native
Threshold and binarize Subj001 wm probability map
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/wm_native'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/wm_native): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/wm_native
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/wm_native for reading!
Mask wm image by Subj001 functional
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/segment/wm_bin'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/segment/wm_bin): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/segment/wm_bin
Cannot open volume /home/cvalencia/Documentos/Subj001/segment/wm_bin for reading!
--------------------------------------------
!!!! RUNNING NUISANCE SIGNAL REGRESSION !!!!
--------------------------------------------
Splitting up Subj001 motion parameters
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
awk: cannot open /home/cvalencia/Documentos/Subj001/func/rest_mc.1D (No such file or directory)
Extracting global signal for Subj001
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 60: 3dmaskave: orden no encontrada
Extracting signal from csf for Subj001
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 64: 3dmaskave: orden no encontrada
Extracting signal from white matter for Subj001
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 68: 3dmaskave: orden no encontrada
Modifying model file
Running feat model
Not enough data in /home/cvalencia/Documentos/Subj001/func/nuisance/global.1D
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 84: 3dBrickStat: orden no encontrada
Running film to get residuals
Log directory is: /home/cvalencia/Documentos/Subj001/func/nuisance/stats+++
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/func/rest_pp'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/func/rest_pp): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/func/rest_pp
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/func/rest_pp
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/func/rest_pp.nii.gz


An exception has been thrown
Failed to read volume /home/cvalencia/Documentos/Subj001/func/rest_pp.nii.gzTrace: read_volume4DROI.

/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 91: 3dTstat: orden no encontrada
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 92: 3dcalc: orden no encontrada
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/func/rest_res'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/func/rest_res): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/func/rest_res
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/func/rest_res
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/func/rest_res
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 95: 5644 Abortado flirt -ref ${reg_dir}/standard -in ${func_dir}/${rest}_res -out ${func_dir}/${rest}_res2standard -applyxfm -init ${reg_dir}/example_func2standard.mat -interp trilinear
+ + + + +
cat: rest: No existe el fichero o el directorio
Feb 2, 2011  06:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

It looks like you do not have AFNI installed, or it is not accessible on your general PATH.

Make sure you have the following in your .bashrc (change /Users/Shared/afni to where you installed AFNI):

AFNIDIR=/Users/Shared/afni
PATH=${AFNIDIR}:${PATH}
export AFNIDIR
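
After editing, a quick check that the change took effect (assuming AFNI was indeed installed in the directory you put in AFNIDIR):

source ~/.bashrc    # reload the edited .bashrc in the current terminal
which 3dcalc        # should now print the full path to 3dcalc inside your AFNI folder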

Also see here for more instructions on installing AFNI:
http://afni.nimh.nih.gov/pub/dist/HOWTO/...


Another solution might be to use the Lin4Neuro Ubuntu distribution, which has these tools already installed.
http://www.nemotos.net/lin4neuro/

A good learning idea would also be to open the 1_anatpreproc.sh, 2_funcpreproc.sh etc. and apply the commands manually to a dataset. That way you learn what commands are used and what they do to the data.

Maarten
Feb 3, 2011  01:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

I had AFNI installed but not accessible on the PATH, so I edited the .bashrc.
Then I ran batch_process.sh and it starts processing... but it stops at the nuisance step; I understand a file is missing...

Thanks!


cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$ ./batch_process.sh
/home/cvalencia/Documentos + /home/cvalencia/Documentos/scripts/subjects.txt + 0 + 196 + 197 + 2.7
preprocessing Subj001
--------------------------------------
!!!! PREPROCESSING ANATOMICAL SCAN!!!!
--------------------------------------
deobliquing Subj001 anatomical
++ 3drefit: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset mprage.nii.gz
+ loading and re-writing entire dataset mprage.nii.gz
++ 3drefit processed 1 datasets
Reorienting Subj001 anatomical
skull stripping Subj001 anatomical
The intensity in the output dataset is a modified version
of the intensity in the input volume.
To obtain a masked version of the input with identical values inside
the brain, you can either use 3dSkullStrip's -orig_vol option
or run the following command:
3dcalc -a mprage_RPI.nii.gz -b mprage_surf.nii.gz+orig -expr 'a*step(b)' \
-prefix mprage_surf.nii.gz_orig_vol
to generate a new masked version of the input.
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./mprage_brain.nii.gz
---------------------------------------
!!!! PREPROCESSING FUNCTIONAL SCAN !!!!
---------------------------------------
Dropping first TRs
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
*+ WARNING: If you are performing spatial transformations on an oblique dset,
such as rest.nii.gz,
or viewing/combining it with volumes of differing obliquity,
you should consider running:
3dWarp -deoblique
on this and other oblique datasets in the same session.
See 3dWarp -help for details.
++ Oblique dataset:rest.nii.gz is 5.179751 degrees from plumb.
++ Output dataset ./rest_dr.nii.gz
Deobliquing Subj001
++ 3drefit: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset rest_dr.nii.gz
+ loading and re-writing entire dataset rest_dr.nii.gz
++ 3drefit processed 1 datasets
Reorienting Subj001
Motion correcting Subj001
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_ro_mean.nii.gz
++ 3dvolreg: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Coarse del was 10, replaced with 3
++ Max displacement in automask = 0.40 (mm) at sub-brick 0
Skull stripping Subj001
++ 3dAutomask: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: Emperor Zhark
++ Loading dataset rest_mc.nii.gz
++ Forming automask
+ Fixed clip level = 414.830291
+ Used gradual clip level = 370.560822 .. 455.330688
+ Number voxels above clip level = 31506
+ Clustering voxels ...
+ Largest cluster has 30753 voxels
+ Clustering voxels ...
+ Largest cluster has 30208 voxels
+ Filled 196 voxels in small holes; now have 30404 voxels
+ Clustering voxels ...
+ Largest cluster has 30404 voxels
+ Clustering non-brain voxels ...
+ Clustering voxels ...
+ Largest cluster has 100668 voxels
+ Mask now has 30404 voxels
++ Dilating automask
+ Clustering voxels ...
+ Largest cluster has 96134 voxels
++ 34938 voxels in the mask [out of 131072: 26.66%]
++ first 13 x-planes are zero [from R]
++ last 12 x-planes are zero [from L]
++ first 7 y-planes are zero [from P]
++ last 7 y-planes are zero [from A]
++ first 0 z-planes are zero [from I]
++ last 0 z-planes are zero [from S]
++ Output dataset ./rest_mask.nii.gz
++ CPU time = 1.060000 sec
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_ss.nii.gz
Getting example_func for registration for Subj001
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./example_func.nii.gz
Smoothing Subj001
Grand-mean scaling Subj001
Band-pass filtering Subj001
++ 3dFourier: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
Removing linear and quadratic trends for Subj001
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_filt_mean.nii.gz
++ 3dDetrend: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_pp.nii.gz
Generating mask of preprocessed data for Subj001
------------------------------
!!!! RUNNING REGISTRATION !!!!
------------------------------
mkdir: no se puede crear el directorio «/home/cvalencia/Documentos/Subj001/reg»: El archivo ya existe
/home/cvalencia/Documentos/scripts/tissuepriors/3mm/
------------------------------
!!!! RUNNING SEGMENTATION !!!!
------------------------------
mkdir: no se puede crear el directorio «/home/cvalencia/Documentos/Subj001/segment»: El archivo ya existe
Segmenting brain for Subj001
Creating global mask
++ 3dcopy: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
Registering Subj001 csf to native (functional) space
Smoothing Subj001 csf
Registering Subj001 csf to standard space
Finding overlap between Subj001 csf and prior
Registering Subj001 csf back to native space
Threshold and binarize Subj001 csf probability map
Mask csf image by Subj001 functional
Registering Subj001 wm to native (functional) space
Smoothing Subj001 wm
Registering Subj001 wm to standard space
Finding overlap between Subj001 wm and prior
Registering Subj001 wm back to native space
Threshold and binarize Subj001 wm probability map
Mask wm image by Subj001 functional
--------------------------------------------
!!!! RUNNING NUISANCE SIGNAL REGRESSION !!!!
--------------------------------------------
Splitting up Subj001 motion parameters
Extracting global signal for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 34938 voxels survive the mask
Extracting signal from csf for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 173 voxels survive the mask
Extracting signal from white matter for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 1440 voxels survive the mask
Modifying model file
Running feat model
++ 34938 voxels in mask
Running film to get residuals
Log directory is: /home/cvalencia/Documentos/Subj001/func/nuisance/stats++++
paradigm.getDesignMatrix().Nrows()=197
paradigm.getDesignMatrix().Ncols()=9
sizeTS=197
numTS=34938
Completed
Prewhitening and Computing PEs...
Percentage done:
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,Completed
Saving results...
Completed
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
** FATAL ERROR: Can't open dataset /home/cvalencia/Documentos/Subj001/func/nuisance/stats/res4d.nii.gz
** Program compile date = Jan 7 2011
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
** FATAL ERROR: can't open dataset /home/cvalencia/Documentos/Subj001/func/nuisance/stats/res4d.nii.gz
** Program compile date = Jan 7 2011
** ERROR (nifti_image_read): failed to find header file for '/home/cvalencia/Documentos/Subj001/func/rest_res'
** ERROR: nifti_image_open(/home/cvalencia/Documentos/Subj001/func/rest_res): bad header info
Error: failed to open file /home/cvalencia/Documentos/Subj001/func/rest_res
ERROR: Could not open image /home/cvalencia/Documentos/Subj001/func/rest_res
Image Exception : #22 :: Failed to read volume /home/cvalencia/Documentos/Subj001/func/rest_res
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/home/cvalencia/Documentos/scripts/5_nuisance.sh: línea 95: 24898 Abortado flirt -ref ${reg_dir}/standard -in ${func_dir}/${rest}_res -out ${func_dir}/${rest}_res2standard -applyxfm -init ${reg_dir}/example_func2standard.mat -interp trilinear
+ + + + +
cat: rest: No existe el fichero o el directorio
cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$
Feb 3, 2011  02:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

great!

Do an ls on this directory: /home/cvalencia/Documentos/Subj001/func/nuisance/
I suspect you ran it a couple of times and have stats++++++ directories. Delete all stats directories and run it again...
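
Something along these lines (double-check what ls shows before deleting anything):

cd /home/cvalencia/Documentos/Subj001/func/nuisance
ls -d stats*     # shows stats, stats+, stats++, ... left over from earlier runs
rm -rf stats*    # remove them all, then rerun ./batch_process.sh from the scripts folder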

Maarten
Feb 3, 2011  05:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

Now, finally, I think the processing was successful, but the last line gives me doubts.
What do you think?

Thanks a lot,

Carolina


cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$ ./batch_process.sh
/home/cvalencia/Documentos + /home/cvalencia/Documentos/scripts/subjects.txt + 0 + 196 + 197 + 2.7
preprocessing Subj001
--------------------------------------
!!!! PREPROCESSING ANATOMICAL SCAN!!!!
--------------------------------------
deobliquing Subj001 anatomical
++ 3drefit: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset mprage.nii.gz
+ loading and re-writing entire dataset mprage.nii.gz
++ 3drefit processed 1 datasets
Reorienting Subj001 anatomical
skull stripping Subj001 anatomical
The intensity in the output dataset is a modified version
of the intensity in the input volume.
To obtain a masked version of the input with identical values inside
the brain, you can either use 3dSkullStrip's -orig_vol option
or run the following command:
3dcalc -a mprage_RPI.nii.gz -b mprage_surf.nii.gz+orig -expr 'a*step(b)' \
-prefix mprage_surf.nii.gz_orig_vol
to generate a new masked version of the input.
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./mprage_brain.nii.gz
---------------------------------------
!!!! PREPROCESSING FUNCTIONAL SCAN !!!!
---------------------------------------
Dropping first TRs
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
*+ WARNING: If you are performing spatial transformations on an oblique dset,
such as rest.nii.gz,
or viewing/combining it with volumes of differing obliquity,
you should consider running:
3dWarp -deoblique
on this and other oblique datasets in the same session.
See 3dWarp -help for details.
++ Oblique dataset:rest.nii.gz is 5.179751 degrees from plumb.
++ Output dataset ./rest_dr.nii.gz
Deobliquing Subj001
++ 3drefit: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Processing AFNI dataset rest_dr.nii.gz
+ loading and re-writing entire dataset rest_dr.nii.gz
++ 3drefit processed 1 datasets
Reorienting Subj001
Motion correcting Subj001
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_ro_mean.nii.gz
++ 3dvolreg: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: RW Cox
++ Coarse del was 10, replaced with 3
++ Max displacement in automask = 0.40 (mm) at sub-brick 0
Skull stripping Subj001
++ 3dAutomask: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: Emperor Zhark
++ Loading dataset rest_mc.nii.gz
++ Forming automask
+ Fixed clip level = 414.830291
+ Used gradual clip level = 370.560822 .. 455.330688
+ Number voxels above clip level = 31506
+ Clustering voxels ...
+ Largest cluster has 30753 voxels
+ Clustering voxels ...
+ Largest cluster has 30208 voxels
+ Filled 196 voxels in small holes; now have 30404 voxels
+ Clustering voxels ...
+ Largest cluster has 30404 voxels
+ Clustering non-brain voxels ...
+ Clustering voxels ...
+ Largest cluster has 100668 voxels
+ Mask now has 30404 voxels
++ Dilating automask
+ Clustering voxels ...
+ Largest cluster has 96134 voxels
++ 34938 voxels in the mask [out of 131072: 26.66%]
++ first 13 x-planes are zero [from R]
++ last 12 x-planes are zero [from L]
++ first 7 y-planes are zero [from P]
++ last 7 y-planes are zero [from A]
++ first 0 z-planes are zero [from I]
++ last 0 z-planes are zero [from S]
++ Output dataset ./rest_mask.nii.gz
++ CPU time = 1.040000 sec
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_ss.nii.gz
Getting example_func for registration for Subj001
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./example_func.nii.gz
Smoothing Subj001
Grand-mean scaling Subj001
Band-pass filtering Subj001
++ 3dFourier: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
Removing linear and quadratic trends for Subj001
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset ./rest_filt_mean.nii.gz
++ 3dDetrend: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset ./rest_pp.nii.gz
Generating mask of preprocessed data for Subj001
------------------------------
!!!! RUNNING REGISTRATION !!!!
------------------------------
/home/cvalencia/Documentos/scripts/tissuepriors/3mm/
------------------------------
!!!! RUNNING SEGMENTATION !!!!
------------------------------
Segmenting brain for Subj001

Creating global mask
++ 3dcopy: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
Registering Subj001 csf to native (functional) space
Smoothing Subj001 csf
Registering Subj001 csf to standard space
Finding overlap between Subj001 csf and prior
Registering Subj001 csf back to native space
Threshold and binarize Subj001 csf probability map
Mask csf image by Subj001 functional
Registering Subj001 wm to native (functional) space
Smoothing Subj001 wm
Registering Subj001 wm to standard space
Finding overlap between Subj001 wm and prior
Registering Subj001 wm back to native space
Threshold and binarize Subj001 wm probability map
Mask wm image by Subj001 functional
--------------------------------------------
!!!! RUNNING NUISANCE SIGNAL REGRESSION !!!!
--------------------------------------------
Splitting up Subj001 motion parameters
Extracting global signal for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 34938 voxels survive the mask
Extracting signal from csf for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 173 voxels survive the mask
Extracting signal from white matter for Subj001
++ 3dmaskave: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
+++ 1440 voxels survive the mask
Modifying model file
Running feat model
++ 34938 voxels in mask
Running film to get residuals
Log directory is: /home/cvalencia/Documentos/Subj001/func/nuisance/stats
paradigm.getDesignMatrix().Nrows()=197
paradigm.getDesignMatrix().Ncols()=9
sizeTS=197
numTS=34938
Completed
Prewhitening and Computing PEs...
Percentage done:
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,Completed
Saving results...
Completed
++ 3dTstat: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: KR Hammett & RW Cox
++ Output dataset /home/cvalencia/Documentos/Subj001/func/nuisance/stats/res4d_mean.nii.gz
++ 3dcalc: AFNI version=AFNI_2010_10_19_1028 (Jan 7 2011) [32-bit]
++ Authored by: A cast of thousands
++ Output dataset /home/cvalencia/Documentos/Subj001/func/rest_res.nii.gz
+ + + + +
cat: rest: No existe el fichero o el directorio
cvalencia@cvalencia-Precision-WorkStation-T3400:~/Documentos/scripts$
Feb 3, 2011  06:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

Not sure what is causing that error (cat: rest), but it seems all processes completed as they should! Well done!

Can you copy/paste your batch_process.sh, batch_list and subject_list?

Maarten
Feb 4, 2011  01:02 PM | Carolina Valencia
RE: Starting with this project
Hi Maarten,

I have doubts about some lines in batch_process.sh: is the file rest_res.nii.gz an output? And when is the seed_list.txt used, and where can I get it (in the experiment, I mean)?

Below the files you requested,

Thanks in advance,

Carolina

batch_list_cv

/home/cvalencia/Documentos /home/cvalencia/Documentos/scripts/subjects.txt 0 196 197 2.7

subjects.txt

Subj001

batch_process.sh

#!/usr/bin/env bash

##########################################################################################################################
## SCRIPT TO BATCH PROCESS DATASETS INCLUDED IN THE 1000 FUNCTIONAL CONNECTOMES PROJECT
##
## Written by Maarten Mennes & Michael Milham
## for more information see http://www.nitrc.org/projects/fcon_1000
##
## MAKE SURE TO CHECK/CHANGE ALL PARAMETERS BEFORE RUNNING THE SCRIPT
##
##########################################################################################################################

####################
## scripts directory
####################
## directory where you put the scripts downloaded from http://www.nitrc.org/projects/fcon_1000
## e.g. /home/fcon_1000/scripts
scripts_dir=/home/cvalencia/Documentos/scripts


###########################
## what do you want to run?
###########################
## 1 - general RSFC preprocessing
## 2 - single-subject RSFC (requires general preprocessing to be completed)
## 3 - ALFF/fALFF (default frequency band of interest is 0.01-0.1Hz)
## 4 - Dual Regression
what_to_do=1


#######################
## some important names
#######################
## anatomical scan you want to use (no extension)
anat_name=mprage
## resting-state scan you want to use (no extension)
rest_name=rest


################################################
## Extra parameters needed to run postprocessing
################################################
## image you want to use to calculate RSFC for.
## This is usually the image containing resting-state scan after regressing out the nuisance parameters
postprocessing_image=rest_res.nii.gz
## path to list of seeds you want to calculate RSFC for e.g. /home/fcon_1000/scripts/my_seeds.txt
## DEFAULT path = within the scripts directory specified above
## the list includes the full path for each seed e.g. /home/my_seeds/connectomes_seed_3mm.nii.gz
seed_list=${scripts_dir}/SEED_LIST.txt
## image you want to use for postprocessing in MNI space
## This image is used in the dual regression step. Dual regression is applied to this image.
postprocessing_mni_image=rest_res2standard.nii.gz
## mask used for Dual Regression; default = MNI based mask
## the resolution has to be consistent with the DR templates
mask=/home/cvalencia/Documentos/scripts/templates/MNI152_T1_3mm_brain_mask.nii.gz
## template used for Dual Regression
## default = the metaICA image, including 20 ICA components as derived in Biswal et al. 2010, PNAS
## running Dual Regression is tuned to run for 20 components you will have to edit 8_singlesubjectDR.sh if you want to use a different template
DR_template=${scripts_dir}/templates/metaICA.nii.gz


#######################################
## Standard brain used for registration
## include the /full/path/to/it
#######################################
standard_brain=/home/cvalencia/Documentos/scripts/templates/MNI152_T1_3mm_brain.nii.gz


#####################################################################################
## batch processing parameter list
## DO NOT FORGET TO EDIT batch_list.txt itself to include the appropriate directories
#####################################################################################
## path to batch_list.txt, e.g., /home/fcon_1000/scripts/my_batch_list.txt.
## DEFAULT = within the scripts directory specified above
## This file contains the sites (and their parameters) you want to process.
## batch_list.txt contains 1 line per site listing:
## - analysis directory, i.e. full/path/to/site
## - subjectlist, i.e. full/path/to/site/site_subjects.txt
## - first timepoint, default = 0 (start with the 1st volume)
## - last timepoint = number of timepoints - 1 (since count starts at 0)
## - number of timepoints in the timeseries
## - TR
batch_list=${scripts_dir}/batch_list_cv.txt



##########################################################################################################################
##---START OF SCRIPT---------------------------------------------------------------------------------------------------------------------##
##---NO EDITING BELOW UNLESS YOU KNOW WHAT YOU ARE DOING----------------------------------------------------------------##
##########################################################################################################################

while read line
do

## 1. cut site specific parameters from batch_list
analysisdirectory=$( echo $line | cut -d ' ' -f1 )
subject_list=$( echo $line | cut -d ' ' -f2 )
first_vol=$( echo $line | cut -d ' ' -f3 )
last_vol=$( echo $line | cut -d ' ' -f4 )
n_vols=$( echo $line | cut -d ' ' -f5 )
TR=$( echo $line | cut -d ' ' -f6 )

## 2. do the processing asked for
case ${what_to_do} in
1) echo ${analysisdirectory} + ${subject_list} + ${first_vol} + ${last_vol} + ${n_vols} + ${TR}
${scripts_dir}/0_preprocess.sh ${scripts_dir} ${analysisdirectory} ${subject_list} ${anat_name} ${rest_name} ${first_vol} ${last_vol} ${n_vols} ${TR} ${standard_brain}
;;
2) echo ${analysisdirectory} + ${subject_list} + ${postprocessing_image} + ${rest_name} + ${seed_list} + ${standard_brain}
${scripts_dir}/6_singlesubjectRSFC.sh ${analysisdirectory} ${subject_list} ${postprocessing_image} ${rest_name} ${seed_list} ${standard_brain}
;;
3) echo ${analysisdirectory} + ${subject_list} + ${rest_name} + ${n_vols} + ${TR} + ${standard_brain}
${scripts_dir}/7_singlesubjectfALFF.sh ${analysisdirectory} ${subject_list} ${rest_name} ${n_vols} ${TR} ${standard_brain}
;;
4) echo ${analysisdirectory} + ${subject_list} + ${postprocessing_mni_image} + ${DR_template} + ${mask}
${scripts_dir}/8_singlesubjectDR.sh ${analysisdirectory} ${subject_list} ${postprocessing_mni_image} ${DR_template} ${mask}

;;
esac

done < ${batch_list}
Feb 4, 2011  10:02 PM | Maarten Mennes
RE: Starting with this project
Hi Carolina,

Yes, rest_res.nii.gz is the output (as is rest_res2standard.nii.gz). It contains the residuals of the resting-state preprocessing, which included regressing out signal from white matter, CSF, and the global signal. For the rest of the analyses we will be working with the residuals of that regression.

The seed_list is something you create yourself. It should contain the names of ROIs or seeds that you created and that you want to use to do a functional connectivity analysis. In such an analysis you correlate the timeseries of your seed with the timeseries of all other voxels in the brain, effectively assessing how similar a voxel's fluctuations are to the fluctuations of your seed. High correlations would mean they are "functionally connected", i.e., showing the same signal fluctuations.
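
As a rough sketch of that idea (not the exact commands the fcon_1000 scripts use; my_seed_3mm.nii.gz is a hypothetical seed mask in the same 3mm MNI space as rest_res2standard.nii.gz):

# extract the mean timeseries within the seed mask (FSL)
fslmeants -i rest_res2standard.nii.gz -m my_seed_3mm.nii.gz -o seed_ts.1D
# correlate every voxel's timeseries with the seed timeseries (AFNI)
3dTcorr1D -prefix seed_corr.nii.gz rest_res2standard.nii.gz seed_ts.1D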

Take a look at this paper for a nice example:
http://dx.doi.org/10.1016/j.neuroimage.2...
NeuroImage
Volume 37, Issue 2, 15 August 2007, Pages 579-588
Mapping the functional connectivity of anterior cingulate cortex
Daniel S. Margulies, A.M. Clare Kelly, Lucina Q. Uddin, Bharat B. Biswal, F. Xavier Castellanos, and Michael P. Milham

Maarten
