Sep 10, 2019  04:09 PM | Andrew Zalesky
RE: FDR-correction problem
Hi Ruslan,

207 x 207 with 24 subjects should be computationally feasible, and 32 GB should be enough memory. When using FDR, the number of permutations must be very high (>100,000), and this may be causing the memory problem for you. Try using the NBS and reducing the number of permutations to 5000. You do not need as many permutations with the NBS as with FDR.
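To see why high permutation counts are demanding, here is a rough back-of-envelope sketch in Python. It assumes, purely for illustration, that the FDR routine holds a full permutation-by-connection matrix of test statistics in memory as 8-byte doubles; the actual NBS implementation may allocate differently.

```python
# Rough memory estimate for a permutation-by-connection matrix of
# test statistics stored as 8-byte doubles. Whether the FDR code
# actually allocates such a matrix is an assumption for illustration.

n_nodes = 207
n_connections = n_nodes * (n_nodes - 1) // 2   # upper-triangle edges
n_permutations = 100_000                        # typical requirement for FDR
bytes_per_double = 8

total_bytes = n_connections * n_permutations * bytes_per_double
print(f"{n_connections} connections")
print(f"{total_bytes / 1e9:.1f} GB for one permutation matrix")
```

Under this assumption a single such matrix already approaches 17 GB, so a second working copy would exhaust 32 GB, whereas 5000 permutations would need under 1 GB.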

Yes - it is reasonable not to apply the Fisher Z transform, although some other researchers might have stronger opinions on this matter.
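For concreteness, here is a small Python sketch of the difference the question is about: averaging Pearson r values directly versus averaging after the Fisher Z transform (arctanh) and back-transforming. The r values are made up for illustration, not taken from any real dataset.

```python
import numpy as np

# Illustrative Pearson correlation coefficients (not real data).
r_values = np.array([0.2, 0.5, 0.8])

# Direct arithmetic mean of r.
mean_direct = r_values.mean()

# Mean taken in Fisher Z space, then back-transformed to r.
mean_fisher = np.tanh(np.arctanh(r_values).mean())

print(f"direct mean:   {mean_direct:.3f}")
print(f"Fisher-z mean: {mean_fisher:.3f}")
```

The two means differ (here roughly 0.500 versus 0.549) because arctanh is nonlinear, which is the usual argument for transforming before averaging; the permutation test itself, as noted above, does not require the transform.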

Andrew


Originally posted by Ruslan Masharipov:
Dear Dr. Zalesky,

Could you please clarify some questions regarding correction of multiple comparisons implemented in the NBS?
1) When I try to perform FDR correction, MATLAB consumes all memory (32 GB!) and shuts down. The same happens even if I set Permutations = 2. We have 207*206/2 = 21321 connections and N = 24 subjects. However, FWE correction for the one-sample test works perfectly without any problems. Do you have any suggestions for how this issue can be fixed?

2) Is it possible not to apply the Fisher Z transformation to the Pearson R correlation coefficients before submitting the matrices to the NBS toolbox?
Reason not to do it: the NBS uses a nonparametric permutation test.
Reason to do it: the NBS calculates mean values from the correlation matrices. Is it correct to calculate a mean R coefficient for one group?

Thank you in advance!
Sincerely yours,
Masharipov Ruslan
