plb_ifstream behaviour dependent on number of nodes with MPI?

Hello everyone,

I am trying to set an external scalar for a MultiBlockLattice by reading an ASCII file. As a test, I run on an 80x80 2D lattice; the input file contains 6400 ones, one per line. I define the scalar field like this:

string epsilonFile = "epsilon";
plb_ifstream epsilonStream(epsilonFile.c_str());
MultiScalarField2D<T> epsilon(parameters.getNx(), parameters.getNy());
epsilonStream >> epsilon;

MultiBlockLattice2D<T, DESCRIPTOR> lattice (
         parameters.getNx(), parameters.getNy(),
         new DYNAMICS<T,DESCRIPTOR>(parameters.getOmega()) );

setExternalScalar(lattice, lattice.getBoundingBox(), 0, epsilon);

I am using the IncomprFlowParam class for the parameters. Then I write a VTK file with:
     VtkImageOutput2D<T> vtkOut(createFileName("zzz", 999, 6), parameters.getDeltaX());
     vtkOut.writeData<T>(*computeExternalScalar(lattice, 0), "epsilon", 1.);

I get two different results depending on whether I run the executable on one core or on four cores with mpirun -np 4. In the one-core case, the field is 0 on the whole domain; in the four-core case, the field is 1 on one half of the domain and 0 on the other.

I obtain the correct result (field equal to 1 on the whole domain) if I fill epsilon directly in code (e.g. with setToConstant) instead of reading it from the file, and then call:

setExternalScalar(lattice, lattice.getBoundingBox(), 0, epsilon);

or if I write the field to a VTK file directly:
    vtkOut.writeData<T>(epsilon, "epsilon", 1.);

Thank you in advance.