permeability tutorial - large input

Hello,
I was trying to run the permeability tutorial and encountered a problem when running it with a large input.

I was trying to find an example in the manual of how to load a large input geometry chunk by chunk, so that it fits in a single node's memory before the parallel re-distribution. Is there such an example in the distribution that I have missed? I would be extremely grateful for any help or code snippet regarding this issue.

Cheers,
Aleksandra.

Dear Aleksandra,

I understand from your post that, in the following line of the example code permeability.cpp,


        geometryFile >> geometry;

you assume that the content of the file “geometryFile” is first fully read into the memory of a single node before being dispatched to the other nodes of the parallel machine. This is, however, not the case: Palabos reads the file piecewise to avoid memory issues.
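
For reference, here is a minimal sketch of how that line is typically embedded (the readGeometry wrapper, its parameters and the local geometry field are only illustrative, not the exact code of permeability.cpp):

        #include "palabos3D.h"
        #include "palabos3D.hh"
        #include <cstdlib>
        #include <string>

        using namespace plb;

        // Illustrative wrapper: open the ASCII geometry file through Palabos'
        // parallel input stream and read it into a distributed scalar field.
        // The >> operator fills the multi-block piecewise, so the whole file
        // is never held in the memory of a single node.
        void readGeometry(std::string fNameIn, plint nx, plint ny, plint nz)
        {
            MultiScalarField3D<int> geometry(nx, ny, nz);
            plb_ifstream geometryFile(fNameIn.c_str());
            if (!geometryFile.is_open()) {
                pcout << "Error: could not open geometry file " << fNameIn << std::endl;
                exit(EXIT_FAILURE);
            }
            geometryFile >> geometry;
        }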

Is it possible that the problem you observe when working with big files is of a different nature? A few hypotheses are:

  • Your problem is too large, even for your parallel machine (in that case the code should probably crash even if you don’t read the data file).
  • You are working on a 32-bit machine. Palabos cannot read files larger than 2 Gigabytes (and, more generally, handle data sets larger than 2 Gigabytes) on a 32-bit machine. An exception is the BlueGene/P for which a workaround was implemented.
  • The number of values in the data file does not match the size of the allocated matrix (see the quick check after this list).
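
Regarding the last point, you can verify it independently of Palabos with a small standalone program that counts the values in the file and compares the result with nx*ny*nz. This assumes the file contains nothing but the integer flag values (no header); the file name and domain size below are placeholders:

        #include <cstddef>
        #include <fstream>
        #include <iostream>
        #include <iterator>

        int main()
        {
            // Placeholder file name and domain size; replace with your own.
            std::ifstream file("geometry.dat");
            std::size_t count = std::distance(std::istream_iterator<int>(file),
                                              std::istream_iterator<int>());
            std::size_t nx = 100, ny = 100, nz = 100;
            std::cout << "values in file: " << count
                      << ", expected: " << nx * ny * nz << std::endl;
            return 0;
        }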

At which line does the code fail, and what is the error message?

Cheers,
Jonas

Dear Jonas,
thank you very much for your reply!

Although I had ruled out all the possible issues you pointed out, your response made me realize that the problem came from the Ethernet interconnect I was using and some unusual compiler settings. After recompiling, the problem no longer occurs.

Thank you again,
Cheers,
Aleksandra.