Posted By: Alfonso Nieto-Castanon - Sep 25, 2015
Tool/Resource: CONN : functional connectivity toolbox
 
We are happy to announce the latest release of CONN (release 2015f).

Among several improvements and new features, we would like to highlight the ability to massively parallelize your functional connectivity analyses by running CONN in a distributed computer-cluster environment. Grid and cloud computing environments are becoming increasingly ubiquitous in research, and many institutions already offer researchers ready access to cluster computing resources. CONN lets you take advantage of these resources through a user-friendly interface. All of the functional/anatomical Preprocessing steps (e.g. realignment, slice-timing correction, normalization), as well as the Setup, Denoising, and first-level Analyses steps in CONN, can now be executed in parallel (with up to one separate process per subject), allowing you to analyze hundreds of subjects in the time it would typically take to process just one or a few. CONN offers native support for some of the most common grid schedulers (Grid Engine, PBS/Torque, LSF) and simple options to accommodate other schedulers/environments.

If your institution offers a computer cluster environment, we encourage you to give this new functionality a try. In most cases no configuration is required: simply select your institution's scheduler type and decide how many parallel jobs each processing step should be broken into. Parallelization can be used from the CONN gui as well as through batch scripts. If your institution does not offer Matlab access, or if you are using a third-party cloud computing environment (e.g. AWS, NITRC-CE), we will shortly be offering pre-compiled versions of CONN that do not require a Matlab license/environment. To get started, see the CONN gui Help.Documentation.GridComputing and Tools.GridComputing menus (and for batch scripting see the conn_batch batch.parallel field).
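
For batch scripting, a minimal sketch of a parallel run might look like the one below. The parallel.N and parallel.profile subfields and the specific Setup/Denoising fields shown here are assumptions based on typical conn_batch usage; check "help conn_batch" in your CONN installation for the exact field names and the list of available scheduler profiles.

  clear batch;
  batch.filename = fullfile(pwd,'conn_myproject.mat');  % existing CONN project file

  % Request parallelization: e.g. 20 jobs submitted to a Grid Engine cluster
  batch.parallel.N       = 20;             % number of parallel jobs (up to one per subject)
  batch.parallel.profile = 'Grid Engine';  % or 'PBS/Torque', 'LSF', etc.

  % Run the Setup and Denoising steps; CONN splits the work across the submitted jobs
  batch.Setup.done      = 1;
  batch.Setup.overwrite = 'Yes';
  batch.Denoising.done  = 1;

  conn_batch(batch);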

Hope this helps!
CONN team
