Problems with Neumann boundary conditions on a sparse multiblock domain


I experienced the following problem when extending the sparseGeometry example file: I have a blood-vessel geometry similar to the example geometry, and I create a sparse multiblock structure from it. One difference to the example is that I set a velocity-outflow condition on one of the outlets:

OnLatticeBoundaryCondition3D<T, Descriptor>* boundaryCondition =
    createLocalBoundaryCondition3D<T, Descriptor>();
// (illustrative: the direction suffix depends on the outlet orientation)
boundaryCondition->addVelocityBoundary2P(
        someBox3DObject, lattice, boundary::outflow);
I perform the cutoff procedures very similarly to the example, and in most cases my code already works fine, also in parallel execution. However, for some resolutions, the code crashes with a segmentation fault during the first collideAndStream step.

In debug mode, I found that in src/core/cell.h, line 218, in Cell::computeVelocity(…), the PLB_PRECONDITION fails. After some more debugging, I found that this function is called during the first collideAndStream step from the following location:
(gdb) plb::CopyVelocityFunctional3D<double, plb::descriptors::D3Q19Descriptor, 0, 0, 1>::process (this=0x100b6bef0, domain={x0 = 0, x1 = 33, y0 = 0, y1 = 33, z0 = 0, z1 = 0}, lattice=@0x104ac33a0) at neumannCondition3D.hh:117
(gdb) 117 lattice.get(iX-normalX, iY-normalY, iZ-normalZ).computeVelocity(u);
So, in my (rough) understanding, it seems that CopyVelocityFunctional3D::process is being executed on cells inside the block envelope, which should not happen (?).
Then I inspected the block distribution and found that the velocity Neumann condition happened to be placed directly on a BlockLattice boundary, namely on the edge of the block adjacent to the one where the error occurs. This would explain why the code fails only for certain resolutions.

So, could it be that there is a hidden bug in the case where a Neumann condition is placed directly at a BlockLattice boundary?
Currently, I am still using version 0.6, release 1. I will test whether the error is still present in version 0.7, but adapting the code to the new version will probably take some time…

Okay, it seems that the error is fixed in version 0.7r2. So, as a lesson for me and possibly for other users: get version 0.7, especially when working with Neumann BCs… :wink: