May 5, 2017  03:05 AM | Clas Linnman
file sizes and storage for big data
Hi, 
I plan on running 750 subjects (1 MPRAGE and 1 BOLD run each) through CONN.
In a test run of 10 subjects, the raw data was 1.1 GB; after preprocessing I am at 17.4 GB of data and 9.2 GB of results.
So, multiply this by 75 and I am at about 2 TB. Is this a reasonable estimate? And is there a way to reduce file sizes as I go along?
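(For reference, the back-of-envelope arithmetic behind that figure, as a quick Python sketch using the pilot numbers above:)

```python
# Extrapolate total storage from the 10-subject pilot run.
pilot_subjects = 10
pilot_preproc_gb = 17.4   # data size after preprocessing
pilot_results_gb = 9.2    # first-level results

total_subjects = 750
scale = total_subjects / pilot_subjects                     # 75x
estimate_gb = scale * (pilot_preproc_gb + pilot_results_gb)
print(f"Estimated total: {estimate_gb / 1000:.1f} TB")      # ~2.0 TB
```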
May 11, 2017  05:05 PM | Shady El Damaty - Georgetown University
RE: file sizes and storage for big data
I would uncheck all of the optional inputs in the setup phase (unless you really need them). You can also delete intermediate files generated during preprocessing (au*.nii, a*.nii, u*.nii).
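For example, a minimal cleanup sketch in Python (the data directory is a hypothetical placeholder, and the exact prefixes depend on which preprocessing steps you ran, so double-check the patterns against your own folders before deleting anything):

```python
import glob
import os

# Delete SPM-style intermediate volumes left over from preprocessing.
data_dir = "/path/to/your/functional/data"   # hypothetical path
patterns = ["au*.nii", "a*.nii", "u*.nii"]   # intermediates mentioned above

for pattern in patterns:
    for f in glob.glob(os.path.join(data_dir, "**", pattern), recursive=True):
        print("deleting", f)
        os.remove(f)
```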

Seed-to-voxel and voxel-to-voxel analyses create many intermediate files that take up lots of storage space; ROI-to-ROI analyses take up the least. Make sure to select ONLY the analyses you will be doing in the setup phase (ROI-to-ROI, seed-to-voxel, voxel-to-voxel). If you're unsure which you'd like to do, I'd proceed incrementally, deleting exploratory analyses after you have finished.

Aside from that, there isn't too much else you can do :(  Support for .gz files would be great, but CONN relies on uncompressed NIfTI files. That is to say, even if you wrote a shell script to compress all your images, CONN would just decompress them when it opened the files.
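If you only want to shrink data you are completely done analyzing, a sketch like the following (standard-library gzip; the archive path is a hypothetical placeholder) converts .nii to .nii.gz for storage, with the caveat above that CONN would need them decompressed again to reopen the project:

```python
import glob
import gzip
import os
import shutil

# Compress finished NIfTI files to .nii.gz for archival storage only.
archive_dir = "/path/to/finished/study"   # hypothetical path

for nii in glob.glob(os.path.join(archive_dir, "**", "*.nii"), recursive=True):
    with open(nii, "rb") as src, gzip.open(nii + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
    os.remove(nii)   # keep only the compressed copy
```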

Alfonso - It would be nice to see an FSL-like feature for working with gzipped NIfTIs in CONN. (Unless I'm mistaken and this already exists!)
Jun 4, 2024  07:06 PM | L F
RE: file sizes and storage for big data

I apologise for committing necromancy on this thread. I would also really like to see automatic file cleanup and compressed-file support. I am working with high-resolution 7T data and my file sizes are huge.