In fluid mechanics, advances in knowledge and modelling are required to understand the multi-scale, often multi-physics flows found in industrial systems (turbomachines, heat exchangers, propulsion systems, chemical reactor-mixers) as well as in the environment (atmosphere, oceans, pollutant dispersion, sediment transport). These advances rely heavily on the experimental exploration of controlled flows, and the constraints are severe: access to fields of variables such as velocity, acceleration, and particle or scalar concentration, in two- or three-dimensional domains; dynamic ranges of the order of 1 000 to 10 000 in time and space, needed to resolve all the physical scales present in turbulent or multiphase flows; and acquisition times long enough to ensure good statistical convergence and the detection of rare events.
Over the past 15 years, imaging has undergone a technological revolution with the advent of high-resolution, high-capacity cameras that make it possible to approach these objectives. Today's cameras can capture around 20 000 frames per second at 1 megapixel resolution over very long periods. Depending on the target variables, an experiment may require 1 to 4 cameras. The latest camera models generate 25 GB every 20 seconds, resulting in raw data volumes of several terabytes and ultimately yielding millions to billions of useful physical data points per experimental flow condition.
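A back-of-envelope estimate makes these volumes concrete. The figures below are illustrative only (an assumed 8-bit monochrome sensor and a hypothetical 100-second acquisition; only the frame rate, resolution, and camera count come from the text):

```python
# Illustrative data-rate estimate for a high-speed imaging experiment.
# Assumptions (not from the source): 8-bit monochrome pixels, 100 s run.
FRAME_PIXELS = 1_000_000      # 1 megapixel per frame (from the text)
BYTES_PER_PIXEL = 1           # assumed 8-bit monochrome sensor
FPS = 20_000                  # frames per second (from the text)
N_CAMERAS = 4                 # upper end of the 1-4 camera range
DURATION_S = 100              # hypothetical acquisition duration

rate_gb_per_s = FRAME_PIXELS * BYTES_PER_PIXEL * FPS / 1e9   # per camera
total_tb = rate_gb_per_s * DURATION_S * N_CAMERAS / 1000     # whole run

print(f"{rate_gb_per_s:.0f} GB/s per camera, {total_tb:.0f} TB per run")
```

Even under these conservative assumptions a single run reaches several terabytes, consistent with the volumes quoted above (the sustained 25 GB / 20 s figure corresponds to the camera's effective output rate rather than the raw sensor rate).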
The current practice in large international laboratories is to collect these raw data, then transfer and process them on powerful clusters: this second phase can last several weeks or even months! To overcome this limitation, much faster access to the useful information is highly desirable.
The objective here is a paradigm shift: instead of storing large volumes of raw data, the idea is to develop data processing routines targeted on predefined variables, and directly linked to acquisition in order to filter the data and limit the storage to useful information only.
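This "process at acquisition" idea can be sketched as an online reduction loop: each raw frame is consumed from the camera stream, reduced to the predefined target variable, and then discarded, so only the useful information is kept. The sketch below is purely illustrative; `frame_stream` and the mean intensity stand in for a real camera driver and a real derived quantity (e.g. a velocity or concentration field):

```python
# Hypothetical sketch of processing directly linked to acquisition:
# reduce each frame to a targeted quantity and never store the raw data.
import random
from statistics import mean

def frame_stream(n_frames, n_pixels=10_000, seed=0):
    """Stand-in for a camera driver yielding raw 8-bit frames."""
    rng = random.Random(seed)
    for _ in range(n_frames):
        yield rng.randbytes(n_pixels)  # one synthetic frame

def reduce_online(frames):
    """Keep only the derived variable per frame; raw frames are discarded."""
    results = []
    for frame in frames:
        results.append(mean(frame))  # stand-in for a real PIV/scalar field
        # the raw frame goes out of scope here: no bulk storage needed
    return results

stats = reduce_online(frame_stream(5))
print(len(stats))  # 5 scalar results instead of 5 full raw frames
```

The storage saving comes from the loop structure itself: memory holds at most one raw frame at a time, and only the reduced values are retained.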
This project exploits both Cloud and Big Data technologies: (i) the Cloud to provide virtualized resources on demand, and (ii) Big Data techniques to rapidly analyze large volumes of complex, heterogeneous data.
The challenge lies in developing original IT solutions adapted to the fast imaging techniques in which the LEGI is particularly active and recognized, in order to significantly extend our measurement capabilities.
This project involves a collaboration between the LEGI and the Laboratoire d'Informatique de Grenoble (ERODS team, Efficient and Robust Distributed Systems).
PI: Alain Cartellier; Co-PI: Noël De Palma, Post-doc: Thomas Calmant