Jul 12, 2019  07:07 PM | Victoria Okuneye - University of Chicago
RE: Conn/Matlab keeps crashing
My group ends up needing a lot of memory whenever we are transitioning between steps (Preprocessing --> Done --> Denoising --> 1st-level --> 2nd-level). When we're just looking at the results within a step, or even running the jobs themselves (aside from ICA), we don't need much memory. It's only after we finish a job and the project tries to save and load the new dmat files that we often find the CONN/Linux session crashing due to lack of memory.
Currently, at ~600 subjects, I find myself having to request 400 GB of memory on our HPC cluster to be safe, and at minimum something over 256 GB.
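
For anyone hitting the same wall, one thing worth trying is CONN's built-in cluster parallelization, which splits a step across several smaller jobs so that no single MATLAB session has to hold the whole project. This is only a minimal sketch: the project path is a placeholder, and the profile name 'Slurm computer cluster' is an assumption based on the profiles conn_jobmanager lists, so check the HPC/cluster settings in your CONN version first.

% Minimal sketch: run a CONN step as N parallel cluster jobs via conn_batch.
% Field names follow the conn_batch help; verify against your CONN version.
batch = struct;
batch.filename = '/path/to/conn_project.mat';      % placeholder project path
batch.parallel.N = 12;                             % split subjects across 12 jobs
batch.parallel.profile = 'Slurm computer cluster'; % assumed scheduler profile name
batch.Setup.done = 1;                              % e.g., run/finish the Setup step
batch.Setup.overwrite = 0;                         % keep already-processed subjects
conn_batch(batch);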

I don't know if there's any potential for future workarounds, like having the CONN project load in discrete chunks for large datasets. It usually takes about an hour for our dataset to load after finishing a step, and it sucks when it gets to around subject 500, realizes there isn't enough memory, crashes, and then we need to start all over again. I'm also concerned that, as our dataset sizes continue to grow, we may hit a ceiling on the memory we can request to support CONN; currently our HPC cluster's bigmem node tops out at 512 GB.
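In case it helps anyone else in the meantime, below is a minimal sketch of a poor-man's version of that chunking idea: processing subjects in smaller groups through conn_batch rather than all ~600 at once. The batch.subjects field ("subset of subjects to run") and the chunk size are assumptions taken from the conn_batch help, so verify them against your CONN version; this also wouldn't change how the GUI loads a finished project, only how many subjects each batch session touches at a time.

% Minimal sketch: process a large CONN project in discrete subject chunks.
% batch.subjects is assumed per the conn_batch help; confirm before relying on it.
nSubjects = 600;   % total subjects in the project
chunkSize = 50;    % hypothetical chunk size; tune to available memory
for first = 1:chunkSize:nSubjects
    batch = struct;
    batch.filename = '/path/to/conn_project.mat';             % placeholder project path
    batch.subjects = first:min(first+chunkSize-1, nSubjects); % subjects in this chunk
    batch.Denoising.done = 1;                                 % e.g., run the Denoising step
    batch.Denoising.overwrite = 0;                            % keep earlier chunks' results
    conn_batch(batch);
end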
