Data Driven Modelling of Turbulent Flows Using Artificial Neural Networks
Thesis posted on 2020-05-01, authored by Luca Lamberti.
Numerical simulations based on Reynolds-averaged Navier–Stokes (RANS) models remain the workhorse tool in engineering design involving turbulent flows. Two decades ago, when large-eddy simulation (LES) started gaining popularity thanks to the increasing availability of computational resources, it was widely expected that it would gradually replace RANS methods in industrial CFD. In the past two decades, however, while LES-based methods have found widespread application and the earlier hope has not diminished, the predicted time when LES would replace RANS has been significantly delayed. Most industrial users are probably decades away from any routine use of scale-resolving simulations, not to mention the cost, time, and user skill that such computations require. In brief, RANS solvers, particularly those based on standard eddy-viscosity models (e.g., k-ε, k-ω, Spalart–Allmaras, and k-ω SST), are expected to remain the workhorse of CFD for high-Reynolds-number flows for decades. However, predictions from RANS simulations are known to show large discrepancies in many flows of engineering relevance, including those with swirl, pressure gradients, or mean streamline curvature. There is a consensus that the dominant cause of such discrepancies is the RANS-modeled Reynolds stresses. In light of the long stagnation in traditional turbulence modeling, researchers have explored machine learning as an alternative route to improving RANS modeling by leveraging data from high-fidelity simulations. The goal is to combine vast amounts of turbulent-flow data, machine learning techniques, and the current understanding of turbulence physics to develop models with better predictive capabilities in the context of RANS simulations. Recently, in a seminal work, Ling et al. (2016) developed a neural network architecture capable of embedding invariance properties into the predicted Reynolds stress tensor.
Such a network, named the tensor basis neural network (TBNN), was applied to a variety of flow fields with encouraging results compared to both classical turbulence models and neural networks that do not preserve Galilean invariance. Yet, as in most data-driven turbulence modelling approaches, the TBNN was used as a post-processing tool to correct the Reynolds stress tensor field predicted by a RANS simulation run with standard closure models. This means that, in principle, the network can only correct the Reynolds stress tensor for the same RANS model on which it was trained, since different turbulence models generally yield different results depending on the flow type. Moreover, there is no physical insight suggesting a relation between the RANS velocity gradients, used as inputs to the machine learning model, and the true Reynolds stress tensor. In contrast, in this work a network with an architecture similar to that of Ling et al. was trained and tested on a database of high-fidelity data from eight different flows, in order to learn a functional mapping between the inputs of Pope's general eddy viscosity model and the anisotropic part of the Reynolds stress tensor. The network was then embedded into a RANS CFD solver as a replacement for the standard closure model, and is therefore called at every solver iteration. Lastly, the RANS solver with the embedded TBNN was tested on a canonical flow case, turbulent channel flow, to evaluate its performance. As for the organization of this work: Chapter 1 gives further details on data-driven turbulence modelling, introduces the RANS models and equations, and presents an introduction to neural networks. Chapter 2 gives a detailed explanation of the RANS CFD solver and the neural network's implementation. In Chapter 3, the method is tested on a turbulent channel flow case and the results are discussed.
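The mapping described above can be illustrated with a minimal sketch. In Pope's general eddy viscosity model, the anisotropy tensor is expanded as b = Σₙ g⁽ⁿ⁾(λ₁,…,λ₅) T⁽ⁿ⁾, where the T⁽ⁿ⁾ are tensor bases built from the normalized mean strain-rate and rotation-rate tensors and the λᵢ are their scalar invariants; a TBNN learns the coefficients g⁽ⁿ⁾ from the invariants. The NumPy sketch below, with untrained random weights and only the first four of Pope's ten bases, is purely illustrative of the architecture, not the implementation used in this thesis.

```python
import numpy as np

def invariants(S, R):
    """The five independent scalar invariants of (S, R)."""
    return np.array([
        np.trace(S @ S),
        np.trace(R @ R),
        np.trace(S @ S @ S),
        np.trace(R @ R @ S),
        np.trace(R @ R @ S @ S),
    ])

def tensor_basis(S, R):
    """First four of Pope's ten tensor bases (illustrative subset)."""
    I = np.eye(3)
    return [
        S,
        S @ R - R @ S,
        S @ S - np.trace(S @ S) / 3.0 * I,
        R @ R - np.trace(R @ R) / 3.0 * I,
    ]

class TinyTBNN:
    """Toy two-layer MLP mapping invariants to basis coefficients g_n.
    Weights are random (untrained); a real TBNN would be fitted to
    high-fidelity anisotropy data."""

    def __init__(self, n_basis=4, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.1, size=(5, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.1, size=(hidden, n_basis))
        self.b2 = np.zeros(n_basis)

    def coefficients(self, lam):
        h = np.tanh(lam @ self.W1 + self.b1)
        return h @ self.W2 + self.b2

    def anisotropy(self, S, R):
        """b = sum_n g_n(lambda) * T_n  (Pope's expansion, truncated)."""
        g = self.coefficients(invariants(S, R))
        return sum(gi * Ti for gi, Ti in zip(g, tensor_basis(S, R)))

# Usage: normalized velocity gradient for a simple shear flow.
grad_u = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
S = 0.5 * (grad_u + grad_u.T)   # strain-rate tensor
R = 0.5 * (grad_u - grad_u.T)   # rotation-rate tensor
b = TinyTBNN().anisotropy(S, R)
```

By construction every basis tensor in this subset is symmetric and traceless, so the predicted anisotropy b inherits those realizability-related properties regardless of the network weights, which is the point of the tensor-basis formulation.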
Lastly, in Chapter 4, conclusions are drawn.