FLOW Community

Research strategy for FLOW e-Science Community
Numerics in Fluid Mechanics

 

(Strategy Document in pdf format V3.4)

 

Background and Motivation

 

The intricate behaviour of flowing fluids in both nature and technical applications has fascinated scientists and laymen alike for centuries. Complex phenomena such as turbulence are ubiquitous in fluid flows, and studying them is of major relevance not only for industrial applications but also, e.g., for a better understanding of the Earth's climate system. Indeed, turbulence has been described as “the last unsolved problem of classical physics”. At least under some simplifying approximations, the governing equations of fluid flow, the Navier-Stokes equations, have been known for almost two centuries. A closed-form solution, however, has not been achieved despite major efforts. Advances in fluid dynamics were therefore dominated for many years by experimental studies, for example in wind tunnels. During the last 30 years, the field has been transformed by the introduction of the computer: numerical simulations of initially laminar, but also turbulent, flows have dramatically increased our knowledge. Turbulence models of varying complexity and accuracy have been developed over the years, with profound impact on the daily engineering workflow. The first fully resolved, so-called direct numerical simulation (DNS) of a turbulent channel flow in 1987 by a NASA group (Kim, Moin & Moser, JFM 177, 1987) can surely be called a landmark in the application of simulations to advancing physics. Since then, computer power has increased dramatically, as have the numerical algorithms for fluid simulations: computational fluid dynamics (CFD) has matured into a very active research field.

The simulation of fluid flows usually requires a large number of degrees of freedom. For instance, in turbulent flow the range of excited scales is very wide, extending from the integral scales (of the size of the considered domain, such as an airplane wing or even a typical atmospheric eddy) down to small viscous scales on the order of a few micrometers. CFD applications have therefore always been among the largest customers of computer centres. Nowadays, DNS using up to order 10 billion grid points are feasible, and a few million grid points are considered a small simulation. For instance, the visualisation above shows the vortical structures in a turbulent boundary layer at Reθ = 4300. Only about 7% of the streamwise extent of the domain is shown. The simulation was performed using 7.5 billion grid points on the Ekman cluster at PDC and NSC (Reference: Schlatter et al., JFM 659, 2010).
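The scale-separation argument above can be made concrete with a standard back-of-the-envelope estimate (classical Kolmogorov scaling, not specific to this document): the ratio of integral to dissipative scales grows as Re^(3/4), so a three-dimensional grid resolving both ends of the spectrum needs on the order of Re^(9/4) points.

```python
# Rough estimate of DNS resolution requirements from Kolmogorov scaling.
# The ratio of the integral scale L to the dissipative scale eta grows
# as Re^(3/4), so a 3D grid resolving both scales needs on the order of
# Re^(9/4) points in total.

def dns_grid_points(reynolds: float) -> float:
    """Order-of-magnitude number of grid points for a 3D DNS."""
    scale_separation = reynolds ** 0.75  # L / eta ~ Re^(3/4)
    return scale_separation ** 3         # points per direction, cubed

for re in (1e3, 1e4, 1e5):
    print(f"Re = {re:.0e}: ~{dns_grid_points(re):.1e} grid points")
```

At Re = 10^4 this already gives on the order of a billion grid points, which illustrates why DNS of high-Reynolds-number flows remains a supercomputing problem.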

However, the growing capability of computer systems also makes researchers increasingly dependent on the employed simulation methodology, as the whole chain of tools is becoming more and more complex, to the point that a complete overview of the process is impractical for an individual scientist. Pre-processing (grid generation, initial data), simulation runs (high-performance computing on parallel systems) and post-processing (data storage, mining and visualisation) have become specialised areas of expertise that can hardly be mastered by a single researcher. Rather, the processes leading up to a successful simulation of a problem at hand have to be considered aspects of e-Science, involving knowledge of a broad range of areas including physics, numerical analysis, computer science and software engineering. It is the purpose of the present initiative to enable and improve these e-Science capabilities among Swedish researchers, in the present context the fluid dynamics community.

 

Research environment

 

Research in computational fluid dynamics is mainly performed in the groups at KTH Mechanics, KTH Numerical Analysis and KTH Aeronautical and Vehicle Engineering, which are the traditional member departments of the Linné FLOW Centre. However, within the Swedish e-Science Research Centre (SeRC), related departments with similar research interests are also included, namely NORDITA, Stockholm University (Bert Bolin Centre) and LiU Management & Engineering/Mathematics. These groups are very diverse in terms of the computational methods applied and the physical problems addressed. Their involvement in e-Science also differs: some groups focus on code development and the related software-engineering aspects, whereas other groups mainly apply commercial codes and direct their research activities towards a deeper understanding of complex physical phenomena. The strength of the present Community lies directly in the variety of backgrounds of its members, uniquely combining strong expertise in areas such as large-scale computations, advanced numerical methods, modelling of complex flows and innovative visualisation.

Due to the close collaboration between the named departments, and the co-existence of the FLOW e-Science area within SeRC and the Linné FLOW Centre, the research goals and strategy for the combined activity are listed in the following, including the partners from Linköping University, Stockholm University and NORDITA.

 

Research Areas:

The large number of active researchers and involved departments within the SeRC-FLOW Community directly leads to a wide scope of the performed research, ranging from nano-fluidics applications to large-scale simulations of turbulence in the Earth's atmosphere. A wide range of numerical methods is employed, including small in-house test codes, larger in-house software suites and commercial codes. In the following, two classifications are given, based on the physical problem and the numerical approach:

 

Physical aspects: Turbulence and Modelling, Climate and Geophysical Flows, Transition and Control, Complex and Biological Flows, Aeroacoustics.

Numerical aspects: Large-Scale Simulations, Parallelisation / Optimisation / Adaptivity, Method Development, Visualisation, Applied CFD.

 

Interaction Core and Application e-Science

A more abstract view of the ongoing research can be obtained by combining the above lists. A schematic view of the complex interdependence of the various areas of expertise involved in e-Science for fluid problems is shown in Fig. 2; the content of the individual bullet points should be considered a general guideline. A clear vertical division of the workflow into three main parts can be seen: “Application Area”, “Computational Tools” and “Core Expertise”. Only when these three aspects are combined in the best possible way can optimal results be obtained. In recent years, computational work has become increasingly difficult and specialised, widening the gap between these three areas. The efforts of the FLOW e-Science Community should therefore be focused on closing this gap, in order to deliver the best results.

 

Fig. 2: Schematic sketch of the interaction between application areas, computational tools and core expertise.

 

Various computational methods are in active use and development within the FLOW e-Science Community. The software situation is more heterogeneous than in other areas of computational science and engineering, as there are only a few “general-purpose” methods. Members of the Community act as main developers of several of the software projects listed below (e.g. SIMSON, FEMLEGO), or as active contributors to international software projects (FEniCS, Nek5000). The SeRC Community can also act as a meeting place for developers, sharing ideas and techniques for software development and maintenance; a common platform including a forum and documentation for the software packages can be established.

 

List of Codes

 

Within the SeRC FLOW community, expertise for the following numerical codes/methods, which are regularly employed, is available:

  • FEniCS: Free software for automated solution of differential equations, including solvers for incompressible and compressible Navier-Stokes equations. Support for parallelism, adaptive grids, FSI, etc. License: GPL (www.fenicsproject.org)
  • SIMSON: Package containing solver and extensive pre-/postprocessing tools for incompressible channel and boundary layer geometries. Hybrid OpenMP/MPI parallelisation, tested up to 16384 processors. License: GPL (http://tinca.pdc.kth.se/sites/default/files/simson-user-guide-v4.0.pdf)
  • NEK5000: Spectral-element solver with pre- and postprocessing software for incompressible flow in moderately complex geometries. Capability for large-scale parallelism with up to O(100,000) processors. License: GPL (http://nek5000.mcs.anl.gov)
  • CESM1: Community climate system model from the National Center for Atmospheric Research (NCAR) in the USA. Consists of models of atmospheric, oceanic and sea-ice dynamics and the land surface.
  • LOVECLIM: Earth system model of intermediate complexity developed at Université catholique de Louvain in Belgium. Consists of models for atmospheric (quasi-geostrophic), oceanic and sea ice dynamics, land surface and vegetation.
  • EDGE: CFD flow solver for unstructured grids of arbitrary elements. Edge is available as a complete source package, subject to the FOI license agreement (www.foi.se)
  • COMSOL Multiphysics: Finite-element analysis, solver and simulation software package for various physics and engineering applications, especially coupled phenomena. License: Commercial (www.comsol.com)
  • ANSYS/FLUENT/ICEM: Commercial integrated flow solver with extensive pre- and postprocessing capabilities. Parallelisation using MPI. License: Commercial (www.ansys.com)
  • Star-CD: Commercial integrated flow solver with extensive pre- and postprocessing capabilities. Parallelisation using MPI. License: Commercial (www.cd-adapco.com)
  • Palabos: Open-source “Parallel Lattice Boltzmann Solver”, with active community support. License: GPL (http://www.lbmethod.org/palabos/)
  • FEMLEGO: Set of Maple procedures and Fortran subroutines that can be used to build complete Fortran simulation codes for partial differential equations, with the entire problem definition done in Maple. Parallelism using MPI. License: GPL (http://www.mech.kth.se/~gustava/femLego)
  • OpenFOAM: Free, open source CFD software package. Parallelisation using MPI. License: GPL (www.openfoam.com)

 

In addition to the codes listed here, numerous specific methods of varying complexity and size are of course in use. For data analysis and visualisation, the following tools are employed:

  • ParaView: open-source, multi-platform data analysis and visualization application. License: GPL (www.paraview.org)
  • VisIt: Free interactive parallel visualization and graphical analysis tool for viewing scientific data. License: BSD (https://wci.llnl.gov/codes/visit/)  

Research Projects

 

A list of ongoing and concluded research projects within the FLOW Community can be found here.

 

Coordination and Application Experts

 

The activities of the SeRC FLOW Community are coordinated at KTH Mechanics. To further emphasise the close collaboration between PDC, NSC, SeRC and SNIC, an application expert with a focus on CFD applications is also associated with the FLOW Community.

Coordination SeRC-FLOW:

Philipp Schlatter, KTH Mechanics, pschlatt@mech.kth.se, 08 790 71 76

Application Expert SeRC-FLOW:

Mattias Chevalier, KTH Mechanics / PDC, mattias@mech.kth.se, 790 89 13

 

Strategic Goals

 

  • To develop efficient and accurate numerical methods for the simulation of complex flows such as high-Reynolds-number transitional and turbulent flows, flows involving chemical reactions, geophysical flows, flows around wind turbines, and two-phase and biological flows.
  • To use high-performance computing to enhance the understanding of different physical phenomena in complex flows and fluid-structure interaction.
  • To use and develop advanced tools for postprocessing of simulation data, such as visualisation and long-term storage and access to data.
  • To act as a coordinator for FLOW e-Science research, both internally and externally; including international HPC initiatives within FLOW research areas (PRACE etc.), and outside visibility of FLOW research.

 

Roadmap

 

Method development:

Categorize the employed numerical methods within the FLOW community, leading to efficient use of the expertise within the community by exchanging knowledge on available numerical methods. Specific projects include:

  • To develop numerical algorithms based on mathematical formulations that include dissolution of the air in an air pocket into the liquid, or evaporation and condensation of the liquid vapour. This can be naturally included in diffuse interface models. This should then allow prediction of the size and appearance or disappearance of air/vapour pockets, depending on the ambient pressure, the properties of the liquid, dissolved gases, the electric field, etc.
  • To investigate the possibility to predict sound that is generated and scattered by fluid-structure interaction via experiments and advanced FEM simulations, using so called unified continuum description for robust fluid-structure coupling and adaptive mesh refinement with error control.
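As a minimal illustration of the diffuse-interface idea mentioned in the first item (the profile and the thickness parameter below are textbook Cahn-Hilliard material, not taken from this document), the phase field varies smoothly between the two phases across a thin layer instead of jumping discontinuously:

```python
import math

# Sketch of a diffuse-interface representation: the 1D equilibrium
# profile of a Cahn-Hilliard type model is phi(x) = tanh(x / (sqrt(2)*eps)),
# where eps sets the interface thickness (a model parameter chosen here
# for illustration). phi = -1 marks one phase (e.g. gas), phi = +1 the
# other (e.g. liquid); the interface is the thin region around x = 0.

def phase_field(x: float, eps: float = 0.05) -> float:
    """Equilibrium diffuse-interface profile, -1 in gas, +1 in liquid."""
    return math.tanh(x / (math.sqrt(2.0) * eps))

# Sample the profile on [-0.5, 0.5]; only a narrow band near x = 0
# takes intermediate values.
profile = [phase_field(x / 100.0) for x in range(-50, 51)]
```

Because the interface has a finite thickness, quantities such as dissolved-gas concentration or phase change can be coupled to the smooth field rather than to an explicitly tracked sharp boundary, which is what makes the extensions described above natural in this framework.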

 

High-performance computing:

Act as a coordinator for ongoing and future activities using large-scale computer resources. Sample projects include:

  • To numerically study the tip-vortex breakdown generated by wind-turbine rotor blades and the interaction of wind-turbine wakes. These issues are of particular relevance when studying the layout of wind farms, in which the mutual interaction between individual turbines strongly affects the total energy production. Collaboration with J. Sørensen and S. Ivanell.
  • First simulations of Lagrangian particles in complex geometries using the spectral-element method. These studies would eventually allow a better prediction of, e.g., particle transport in blood vessels. Collaboration with P. Fischer and C. Casciola.
  • To extend the Reynolds-number range of turbulent boundary layers using large-scale DNS, contributing to the fundamental discussion of the near-wall regeneration cycle and the appearance of coherent structures in the boundary layer. Such projects are directed towards a deeper understanding of the ubiquitous turbulence near solid walls, eventually allowing a reduction of drag.
  • To study energy-cascade processes in wall-bounded stratified turbulence by means of large-scale DNS, and to develop suitable subgrid-scale models for the efficient simulation of atmospheric boundary-layer flows. Improved models for climate prediction are potential outcomes of these projects.
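A classic starting point for the subgrid-scale modelling mentioned in the last item is the Smagorinsky eddy-viscosity closure; the sketch below (the model choice and the constant are standard literature values, not prescribed by this document) evaluates nu_t = (C_s Δ)^2 |S| from a resolved strain-rate tensor:

```python
import math

# Sketch of the Smagorinsky subgrid-scale (SGS) closure for LES:
# the unresolved scales are modelled through an eddy viscosity
#   nu_t = (C_s * Delta)^2 * |S|,
# where Delta is the filter (grid) width, C_s the Smagorinsky constant
# (a commonly quoted literature value is used here), and
# |S| = sqrt(2 * S_ij * S_ij) the norm of the resolved strain rate.

def smagorinsky_nu_t(strain_rate, delta: float, c_s: float = 0.17) -> float:
    """Eddy viscosity from a 3x3 resolved strain-rate tensor S_ij."""
    s_norm = math.sqrt(2.0 * sum(s * s for row in strain_rate for s in row))
    return (c_s * delta) ** 2 * s_norm

# Example: pure shear du/dy = 1 gives S_12 = S_21 = 0.5 and |S| = 1.
S = [[0.0, 0.5, 0.0],
     [0.5, 0.0, 0.0],
     [0.0, 0.0, 0.0]]
nu_t = smagorinsky_nu_t(S, delta=0.01)
```

Developing improved SGS models for stratified, wall-bounded flows, as proposed above, essentially means replacing or augmenting this simple algebraic closure with formulations that account for buoyancy and near-wall effects.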

 

Postprocessing:

Provide and collect experience in using advanced methods for handling of large-scale simulation data. Act as general landing point for interested (internal/external) researchers. Specific aspects include:

  • Develop and contribute to large-scale visualisation efforts.
  • Provide and encourage a centralized, well-maintained database for simulation data in Sweden, with open access for other researchers.

 

Coordination:

The community acts as a coordinator for the various discussed aspects. Specific action items include:

  • Coordination of applications to DEISA/PRACE within the Community. A coordinated approach to such pan-European applications is crucial for successful and continuing participation.
  • Initiate regular meetings within the community to discuss current issues; organize tutorials on commonly used software such as IDEs, debuggers, etc., in an effort to cross-fertilize efficient working environments. Share knowledge on core e-Science aspects such as visualization tools, optimization for new computer architectures, etc.
  • Initiate and define the collaboration with application experts and researchers from core areas, in collaboration with PDC and NSC.
  • Extend the available course curriculum at the host universities in the area of e-Science and – together with the established graduate school – suggest possible extensions/improvements.
  • Act as a contact point for commercial software distributors if requested by members.