Jul 2, 2019  09:07 AM | newbie
Conn/Matlab keeps crashing
Hello all,

I have a large dataset (n=229), and while CONN is loading the .dmat files it crashes around participant 200. I've tried multiple things: restarting the project from scratch, deleting the higher-numbered .dmat files, etc. The project is stuck at the first-level analysis, and if I delete all the .dmat files and redo the first-level analysis, it crashes again and won't load. Help!
Thanks!
Jul 2, 2019  01:07 PM | Stephen L. - Coma Science Group, GIGA-Consciousness, Hospital & University of Liege
RE: Conn/Matlab keeps crashing
Hello newbie,

I have not yet tried to process that many subjects, but I will soon, so I am interested in a potential resolution :-/

What version of CONN are you using? Do you have the log output of CONN? If you use CONN 18b, CONN creates a log file (which can also be read in a separate window that CONN opens); for older versions, the information is in the MATLAB console output or in the error window when you click "Additional details".

Best regards,
Stephen
Jul 3, 2019  08:07 AM | newbie
RE: Conn/Matlab keeps crashing
Hi Stephen,
So I think I figured out my problem. I haven't tried much more because I got frustrated and was able to get my number crunching done with what I had, but I think I was too conservative with my computing resources. I reran one setup with extra resources and it ran fine. So I guess the moral of the story is: don't be shy with your computing resources.
Jul 4, 2019  01:07 PM | Stephen L. - Coma Science Group, GIGA-Consciousness, Hospital & University of Liege
RE: Conn/Matlab keeps crashing
Dear newbie,

Thank you very much for reporting back on this issue.

Still, this seems weird to me, as it looks like there is a step that needs to store all subjects' data in memory, which I find surprising since I can't see which step would need that. For 200 subjects it might be possible to get enough RAM, but for more like 1K there would surely be an issue (and we don't know what data structure is used during the crash; if the storage requirement is quadratic, for example, then the RAM requirements would also grow quadratically).
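To make the scaling concern concrete, here is a back-of-the-envelope sketch in Python. The per-subject and per-pair footprints below are purely hypothetical placeholders, not measurements of CONN's actual data structures; the point is only how differently linear and quadratic storage grow with the number of subjects.

```python
# Hypothetical RAM estimates: linear (one entry per subject) vs. quadratic
# (one entry per subject pair). The MB figures are made-up placeholders,
# NOT CONN's actual memory usage.

PER_SUBJECT_MB = 500  # hypothetical footprint of one subject's data
PER_PAIR_MB = 1       # hypothetical footprint of one subject-pair entry

def linear_gb(n):
    """RAM if each subject contributes a fixed-size block."""
    return n * PER_SUBJECT_MB / 1024

def quadratic_gb(n):
    """RAM if every subject pair contributes a fixed-size entry."""
    return n * n * PER_PAIR_MB / 1024

for n in (229, 600, 1000):
    print(f"n={n:4d}  linear ~ {linear_gb(n):7.1f} GB  quadratic ~ {quadratic_gb(n):7.1f} GB")
```

Going from 200 to 1000 subjects multiplies the linear estimate by 5 but the quadratic one by 25, which is why a project that barely fits at n=200 could be hopeless at n=1K if any pairwise structure is held in memory.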

Anyway, when I try this myself, I will bring more info if I get a similar issue ;-)

Best regards,
Stephen
Jul 12, 2019  12:07 PM | Victoria Okuneye - University of Chicago
RE: Conn/Matlab keeps crashing
My group ends up needing a lot of memory whenever we are transitioning between steps (Preprocessing --> Done --> Denoising --> 1st level --> 2nd level). When we're just looking at the results within a step, or even running the jobs (aside from ICA), we don't need a lot of memory. It's only after we finish a job, when the project tries to save and load the new .dmat files, that we often find CONN/the Linux session crashing due to lack of memory.
Currently, at about ~600 subjects, I find myself having to request 400 GB of memory on our HPC cluster to be safe, at least over 256 GB.

I don't know if there are any potential future workarounds, like having the CONN project load in discrete chunks for large datasets. It usually takes about an hour for our dataset to load after finishing a step, and it's painful when it gets to around subject 500, realizes there isn't enough memory, crashes, and then we need to start all over again. I'm also concerned that, as our dataset sizes continue to grow, we may reach limits on the memory we can request to support CONN; currently our HPC cluster's big-memory node has a 512 GB capacity.
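The "discrete chunks" idea could look something like the sketch below (hypothetical file layout and loader, not CONN code): process the per-subject .dmat files in bounded batches, so peak memory scales with the chunk size rather than with the full dataset.

```python
# Sketch of bounded-memory, chunked processing of per-subject files.
# load_subject() and the per-chunk summary are hypothetical placeholders;
# the point is only that at most chunk_size subjects are resident at once.

def load_subject(path):
    # Placeholder: in reality this would deserialize one subject's .dmat file.
    return {"path": path, "data": [0.0] * 10}

def process_in_chunks(paths, chunk_size=50):
    """Yield per-chunk summaries instead of holding all subjects in RAM."""
    for start in range(0, len(paths), chunk_size):
        chunk = [load_subject(p) for p in paths[start:start + chunk_size]]
        # Aggregate here, then let the chunk be garbage-collected
        # before the next batch is loaded.
        yield {"n": len(chunk), "first": chunk[0]["path"]}

paths = [f"subject_{i:03d}.dmat" for i in range(229)]
summaries = list(process_in_chunks(paths))
```

With a chunk size of 50, a 229-subject project would be processed as five batches, and a crash at "subject 500" would only cost the current batch rather than the whole load.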
Jul 12, 2019  01:07 PM | Stephen L. - Coma Science Group, GIGA-Consciousness, Hospital & University of Liege
RE: Conn/Matlab keeps crashing
Hello Victoria,

Thank you very much for your detailed feedback, that's very helpful. Someone else reported a similar issue recently. It's good to know this happens during the display functions, not during the analysis; this should be easier to test and fix.