buoyancy force / natural convection

Dear All,

I want to perform simulations of natural convection, so I'm currently running the well-known buoyancy-driven flow in a square cavity test case with my code. Unfortunately I don't get the expected results, and to be honest I'm not sure 1) whether my implementation is correct, and 2) whether I'm using the correct input values…

For the buoyancy force I use Luo's scheme (introducing a force term in the collision step)…


  (pseudo code only)
  loop over i:
    f_eq = compute_feq(i)
    f_buoyancy = 3. * w[i] * rho * deltaTemp * ( c[i].x * g.x + c[i].y * g.y + c[i].z * g.z )
    f_out[i] = f_in[i] + omega * ( f_eq - f_in[i] ) + f_buoyancy
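For reference, here is how I would write that collision loop in 2-D (D2Q9) as vectorized Python — a sketch only; the function name `collide` and the standard D2Q9 ordering are my own choices, not from my code:

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities (standard ordering assumed)
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def collide(f_in, rho, u, delta_temp, g, omega):
    """One BGK collision at a single node with a Luo-style buoyancy force.

    f_in: (9,) populations; u: (2,) velocity; g: (2,) gravity (times beta).
    """
    cu = c @ u                                   # c_i . u for each direction
    usq = u @ u
    f_eq = w * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    # Luo forcing: F_i = 3 w_i rho deltaT (c_i . g); sums to zero over i,
    # so mass is conserved and only momentum is injected.
    f_buoyancy = 3 * w * rho * delta_temp * (c @ g)
    return f_in + omega * (f_eq - f_in) + f_buoyancy
```

A quick sanity check: at rest with deltaTemp = 0 the populations are already at equilibrium and the collision leaves them unchanged, and the force term never changes the total density.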

My scaling of units is done by…

given values…


Ra = 100000
Pr = 0.71
Re = 375.3 (=sqrt(Ra/Pr))
uMax = 0.01
Resolution N = 100

from this I derive…


DeltaX = 1/N = 0.01
DeltaT = uMax/N = 0.0001
Lattice viscosity = 0.002664583 (=DeltaT/(DeltaX^2*Re))
Lattice thermal diffusivity = 0.003752933 (=viscosity/Pr)
Lattice gravity = 0.000001 (=DeltaT^2/DeltaX)
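Recomputing these derived quantities from the given inputs (just the arithmetic from the list above, nothing new):

```python
from math import sqrt

# Given values
Ra, Pr, u_max, N = 1e5, 0.71, 0.01, 100
Re = sqrt(Ra / Pr)             # ~375.3

# Derived lattice quantities, as in the list above
dx = 1.0 / N                   # 0.01
dt = u_max / N                 # 1e-4
nu_lb    = dt / (dx**2 * Re)   # lattice viscosity
alpha_lb = nu_lb / Pr          # lattice thermal diffusivity
g_lb     = dt**2 / dx          # lattice gravity
```

The numbers come out as listed: nu ≈ 0.002665, alpha ≈ 0.003753, g = 1e-6, so the arithmetic itself is consistent.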

Well, I don't see the forest for the trees…

With this I expect to get streamlines similar to Fig. 7 in [Mohamad and Kuzmin (2010), "A critical evaluation of force term in lattice Boltzmann method, natural convection problem", International Journal of Heat and Mass Transfer], but I get a fully symmetrical circular flow.

Help is highly appreciated,
Thanks in advance,

Francois

Dear All,

I'm caught in a "unit conversion" problem and could use some help. I was able to compute the above-mentioned well-known buoyancy flow benchmark using the following unit conversion…

starting from given Ra- and Pr-numbers…


define resolution N
define viscosity nu
thermal diffusivity alpha= nu/Pr
gravity times thermal expansion g*beta = Ra*nu*alpha / N^3
dx = 1
dt = 1
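The first conversion above as a numeric sketch (the value nu = 0.05 is just an example choice of mine, not a value from the benchmark):

```python
# First conversion: choose N and nu, derive the rest; dx = dt = 1 in lattice units
Ra, Pr = 1e5, 0.71
N = 100
nu = 0.05                       # chosen lattice viscosity (example value)
alpha = nu / Pr                 # thermal diffusivity from the Prandtl number
g_beta = Ra * nu * alpha / N**3 # gravity times thermal expansion coefficient
```

By construction, plugging the derived values back into the Rayleigh number, Ra = g_beta * N^3 / (nu * alpha), recovers the prescribed Ra exactly, which is why this recipe is self-consistent for any choice of nu.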

Using LBGK for the flow and an FD scheme for the temperature, this works, even if it's not yet clear to me why.

I would prefer to use the unit conversion given in the tech report by J. Lätt on lbmethod.org


define resolution N
define gravity times thermal expansion g*beta
deltax = 1/N
deltat = sqrt( g*beta*deltax)
viscosity nu = sqrt( Pr/Ra ) * deltat/deltax^2
thermal diffusivity alpha = sqrt ( 1/(Ra*Pr)) * deltat/deltax^2
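The same formulas written out numerically (g*beta = 0.001 is an example choice of mine; a useful consistency check is that nu/alpha must equal Pr):

```python
from math import sqrt

# Lätt-style conversion: choose N and g*beta, derive dt, nu, alpha
Ra, Pr = 1e5, 0.71
N = 100
g_beta = 0.001                           # example choice
dx = 1.0 / N
dt = sqrt(g_beta * dx)                   # time step from the gravity scaling
nu    = sqrt(Pr / Ra) * dt / dx**2       # viscosity in these units
alpha = sqrt(1.0 / (Ra * Pr)) * dt / dx**2
```

Note that the ratio nu/alpha = sqrt(Pr/Ra) / sqrt(1/(Ra*Pr)) = Pr drops out independently of dt and dx, so the Prandtl number is always honored by this recipe.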

but with this I can't get the right results… I have already run benchmarks for the pure flow simulation, and my code seems to work properly, but I'm stuck with this unit conversion and presently can't see the problem. To me, the unit conversion according to J. Lätt is clear and comprehensible, but using it I can't get the correct results.

Please, could anyone give me some hint or feedback on this?

Kind Regards,

Francois

Dear All,

I think I have it…

Since I use an FD scheme for the temperature, it's a matter of "paying attention" ;)… The values of deltaX and deltaT do not appear directly in the LBM formulation; they only act as scaling factors for nu and the other quantities. That's why, for a pure flow simulation, these values have no impact as long as everything is scaled consistently. In a coupled thermal/flow simulation, however, I have to keep in mind that the LBM runs in so-called lattice units, whereas my FD scheme runs in dimensionless units, so the values of deltaX and deltaT derived for lattice units won't lead to success there. Using dimensionless values for deltaX and deltaT, everything works well!
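To make the point concrete, here is a small numeric sketch of the two unit systems (g*beta = 0.001 is an example value, not from the benchmark):

```python
from math import sqrt

# The same physical setup expressed in both unit systems
Ra, Pr, N = 1e5, 0.71, 100
g_beta = 0.001                             # example choice

# Dimensionless units: what the FD temperature solver must use
dx_nd = 1.0 / N
dt_nd = sqrt(g_beta * dx_nd)
alpha_nd = sqrt(1.0 / (Ra * Pr))           # dimensionless thermal diffusivity

# Lattice units: the value the LBM side effectively sees (dx = dt = 1)
alpha_lb = alpha_nd * dt_nd / dx_nd**2
```

The explicit FD diffusion factor dt_nd * alpha_nd / dx_nd**2 equals alpha_lb, so both formulations advance the temperature by the same amount per step — but only if each solver consistently uses the deltaX and deltaT of its own unit system, which was exactly my mistake.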

Ok, no one has replied so far, but maybe this "monologue" will be useful for someone searching for unit-conversion-related issues in the future.

Kind Regards,
Francois

Maybe this is of help: click here

Timm