Small Re~1/10 - 1/100: Are they suitable for LB?

Dear All,
Another round of LB questions)), and I would be very grateful for your comments.

MY SYSTEM:
I have a small obstacle of radius a ~ 1 mkm in water (I remember Timm’s post about LB for blood cells;
in my case it is an artificial, but also soft, polymeric object). Because of the small radius, the Reynolds number is also small, Re ~ 1/10 - 1/100 (fluid velocity V ~ 1 cm/s, kinematic viscosity of water nu ~ 1/100 cm^2/s at room temperature).

In lattice units this gives a viscosity nu ~ 10-100 and a relaxation time tau ~ 3*nu ~ 30-300,
if we take V ~ 0.01-0.1 and a ~ 10 (both in lattice units). So tau is huge for small Re!
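
For concreteness, here is the little arithmetic behind those numbers (a rough sketch of the unit conversion, assuming the standard BGK relation nu = (tau - 1/2)/3 with c_s^2 = 1/3; the values N = 10 and U = 0.01 are just my choices from above):

    # Rough physical-to-lattice unit conversion (my own back-of-the-envelope numbers).
    a_phys  = 1e-6       # obstacle radius [m]   (~1 mkm)
    V_phys  = 1e-2       # fluid velocity  [m/s] (~1 cm/s)
    nu_phys = 1e-6       # kinematic viscosity of water [m^2/s]

    N    = 10            # lattice nodes across the obstacle radius
    U_lb = 0.01          # chosen lattice velocity

    dx = a_phys / N                # lattice spacing [m]
    dt = U_lb * dx / V_phys        # time step fixed by the velocity choice [s]

    Re    = V_phys * a_phys / nu_phys   # ~ 0.01
    nu_lb = nu_phys * dt / dx**2        # equals N*U_lb/Re
    tau   = 3.0 * nu_lb + 0.5           # BGK: nu = (tau - 1/2)/3

    print(Re, nu_lb, tau)               # 0.01, 10.0, 30.5

With U_lb = 0.1 instead, nu_lb = 100 and tau ~ 300, which is where the range 30-300 comes from.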

PROBLEMS:

  1. From common sense this means that the fluid needs to propagate over about 30-300 lattice spacings
    to come to equilibrium! With a ~ 10 (which is << 300), we are clearly in the Knudsen regime and can forget about
    hydrodynamics (it is kinetics); a rough estimate is sketched after this list. This is bad, because in real water we of course have pure hydrodynamics at mkm scales.

  2. Also, with such a long tau I would need an LB lattice of linear size L > 300 (say, 1000x1000x1000),
    because on a smaller lattice the interaction of the objects through the fluid is strongly non-local,
    which most probably leads to unphysical behavior. But such a lattice looks terribly huge,
    and I expect that computations on it would be extremely lengthy.
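
Here is the rough Knudsen estimate mentioned in point 1 (my own assumption: the effective mean free path in lattice units is of order c_s*tau, with c_s = 1/sqrt(3)):

    import math

    c_s = 1.0 / math.sqrt(3.0)    # lattice speed of sound
    N   = 10                      # lattice nodes across the obstacle radius

    for tau in (30.0, 300.0):
        Kn = c_s * tau / N        # mean free path / obstacle size
        print(tau, Kn)            # Kn ~ 1.7 and ~17, nowhere near Kn << 1

So Kn comes out of order 1-10 instead of << 1, which is why I say we are in the kinetic rather than the hydrodynamic regime.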

My QUESTIONS are:

** Does this all mean that LB is inapplicable/unsuitable for such small Re numbers?

** Or can we somehow tune the LB parameters to make it usable for small objects / small Re?
(I want to use LB because it gives huge advantages and is elegant))).

What do LB gurus think about it? What does the experience of others say in this case?

Best regards,
German

Hi,

I don’t understand everything in your post, because I don’t know, for example, what a “mkm” is. But all in all, I think that you are right when you conclude that the relaxation time in a simulation cannot be arbitrarily large. At some point, a high relaxation time is interpreted as a large Knudsen number, and you lose numerical stability and accuracy.

The parameter with which you can play to keep tau small when Re decreases is the velocity, measured in lattice units. Indeed, the relation

nu = N*U / Re

expressed in lattice units (nu~tau, N~resolution, U~Mach number) shows that tau decreases when you decrease U. Of course, decreasing U means that you slow down your simulation, but that’s life. If your simulation is inherently multi-scale (slow and fast physics are observed simultaneously), you have to pay the cost of resolving the different scales at the same time, unless you use a particularly sophisticated approach to decouple the scales.
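
To make the trade-off concrete, here is a quick sketch (my own numbers, not taken from any particular code; I assume BGK with nu = (tau - 1/2)/3 and count N/U time steps per convective time a/V):

    Re = 0.01     # the lower end of your Reynolds range
    N  = 10       # resolution of the obstacle radius

    for U in (0.1, 0.01, 0.001):
        nu_lb = N * U / Re           # lattice viscosity from nu = N*U/Re
        tau   = 3.0 * nu_lb + 0.5    # BGK relaxation time
        steps = N / U                # time steps to advect across the obstacle
        print(U, tau, steps)
    # U = 0.1   -> tau ~ 300.5,   100 steps per convective time
    # U = 0.01  -> tau ~  30.5,  1000 steps
    # U = 0.001 -> tau ~   3.5, 10000 steps

So you can bring tau down to order one, but every factor of ten gained in tau costs you a factor of ten in the number of time steps.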