Research

Results


We have worked on three main problems.

  • Computation of low probabilities

    The first research axis concerns the computation of low probabilities. The calculation of tail probabilities is of fundamental importance in several domains, such as risk assessment. One major challenge is the computation of low failure probabilities and of multiple failure regions, especially when an unbiased estimation of the error is required. Methods in the literature rely mostly on the construction of an adaptive surrogate, addressing issues such as the metamodel building criterion and the global computational cost, at the price of a generally biased estimation of the failure probability. We have proposed a novel algorithm that both builds an accurate metamodel and provides a statistically consistent error estimate. It relies on a new metamodel building strategy that refines the limit-state region equally in all its branches, even in the case of multiple failure regions, with a robust stopping criterion. Two "quasi-optimal" importance sampling techniques then exploit the accurate knowledge of the metamodel to provide an unbiased estimation of the failure probability, even if the metamodel is not fully accurate. As a consequence, the proposed method provides a very accurate unbiased estimation even for low failure probabilities or multiple failure regions. Several numerical examples show that the method performs very well with respect to the state of the art in terms of accuracy and computational cost. Additionally, we have proposed another importance sampling technique that drastically reduces the computational cost when estimating reference values, or when a very rare failure event must be computed directly from the metamodel.
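    As a minimal illustration of the importance sampling ingredient, the sketch below estimates a small failure probability with a shifted sampling density. In this analytic toy case the shift is placed at the known design point, standing in for what the metamodel-based step would provide; the key point is that the failure indicator is evaluated on the true limit-state function, which is what keeps the estimator unbiased. All functions and numbers are illustrative, not the actual "quasi-optimal" schemes.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # True (expensive) limit-state function: failure when g(x) < 0.
    # Here P(failure) = P(x1 + x2 > 4), about 2.3e-3 for standard-normal inputs.
    return 4.0 - x[:, 0] - x[:, 1]

# A metamodel would locate the most probable failure point; for this toy
# case we shift the sampling density to the known design point (2, 2).
mu_q = np.array([2.0, 2.0])

n = 100_000
x = rng.standard_normal((n, 2)) + mu_q  # draws from q = N(mu_q, I)

# Importance weights w = f(x) / q(x) for the standard-normal input density f.
log_w = -0.5 * np.sum(x**2, axis=1) + 0.5 * np.sum((x - mu_q)**2, axis=1)
w = np.exp(log_w)

# Unbiased estimator: the indicator uses the true g, so the estimate stays
# consistent even if the surrogate that suggested mu_q were imperfect.
p_hat = np.mean((g(x) < 0) * w)
```

    A crude Monte Carlo estimator would need millions of true-model calls to resolve a probability of this order; the shifted density concentrates the samples where failure actually occurs.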

  • Propagation of uncertainties through systems of codes

    Secondly, we have worked on the propagation of uncertainties through systems of codes. Uncertainty propagation in complex industrial solvers demands efficient surrogate-model construction methods. The surrogate model replaces a computationally expensive solver in order to propagate the input distributions at minimal computational cost. The surrogate's performance is therefore closely related to the input distribution of interest. When the solver output characteristics are a priori unknown, the selection of the training points is crucial and should rely solely on the input distribution. Besides collocation strategies, the most classical one, Latin Hypercube Sampling (LHS), is defined for uniform, independent input distributions. Adapting this method to non-uniform, dependent distributions, which arise in problems involving a sequence of solvers or a prior Bayesian calibration of parameters, is non-trivial, especially if only samples from the input distribution are available. We have proposed efficient training sampling methods for industrial problems where the input distribution is defined solely by a large set of samples that have to be propagated through an expensive solver. Clustering and random-variable discretization methods are investigated for their good coverage properties. We are currently coupling them with Gaussian process methods to compare their efficiency on several test cases of assorted complexity and input dimensionality.
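    A minimal sketch of the clustering idea, assuming the input distribution is known only through a large sample: the cluster centres (plain Lloyd's k-means here, purely illustrative) give a small, well-spread training design that follows a dependent, non-uniform sample, where a standard LHS design on the unit hypercube would not.

```python
import numpy as np

rng = np.random.default_rng(1)

# Large input sample with dependence and non-uniform shape: a stand-in
# for points produced by an upstream solver or a Bayesian calibration.
z = rng.standard_normal((5000, 2))
samples = np.column_stack([z[:, 0], 0.5 * z[:, 0] + 0.3 * z[:, 1]])

def kmeans_centers(pts, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm; the centers serve as training points."""
    r = np.random.default_rng(seed)
    centers = pts[r.choice(len(pts), size=k, replace=False)].copy()
    for _ in range(n_iter):
        dists = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # keep the old center if a cluster empties
                centers[j] = pts[labels == j].mean(axis=0)
    return centers

design = kmeans_centers(samples, k=20)  # budget of 20 expensive solver runs
```

    The expensive solver is then evaluated only at the 20 centres, and a surrogate fitted on those runs propagates the full sample.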

  • Robust optimization

    We have worked on advanced uncertainty quantification and robust optimization methodologies to be used during the ORC turbine design process in order to account for multiple uncertainties. Typical energy sources for ORC power systems, such as waste heat recovery or biomass, geothermal, and solar energy, typically feature variable heat load and turbine-inlet thermodynamic conditions. We performed an Uncertainty Quantification (UQ) study of a typical supersonic nozzle cascade for ORC applications, considering a two-dimensional high-fidelity turbulent Computational Fluid Dynamics (CFD) model. Kriging-based techniques are used in order to take into account, at a low computational cost, the combined effect of uncertainties associated with operating conditions, fluid parameters, and geometric tolerances. The geometric variability is described by a finite Karhunen-Loève expansion representing a non-stationary Gaussian random field, entirely defined by a zero mean and its autocorrelation function. Results include the ANOVA decomposition of several quantities of interest for different operating conditions, showing the importance of geometric uncertainties for the turbine performance.
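    The geometric-variability model can be sketched as follows: a discrete Karhunen-Loève expansion of a zero-mean Gaussian field, obtained by eigendecomposition of the discretized autocorrelation matrix and truncation to the dominant modes. The squared-exponential kernel and all numerical parameters below are illustrative placeholders, not the blade-tolerance field actually used.

```python
import numpy as np

# Discrete Karhunen-Loève expansion of a zero-mean Gaussian random field
# on [0, 1]; kernel and parameters are illustrative.
n_pts, corr_len, sigma = 200, 0.2, 1.0
s = np.linspace(0.0, 1.0, n_pts)
C = sigma**2 * np.exp(-0.5 * (s[:, None] - s[None, :])**2 / corr_len**2)

vals, vecs = np.linalg.eigh(C)            # eigh returns ascending eigenvalues
vals, vecs = vals[::-1], vecs[:, ::-1]    # reorder: dominant modes first

m = 10                                    # truncation order of the expansion
rng = np.random.default_rng(2)
xi = rng.standard_normal(m)               # independent N(0, 1) coefficients
field = vecs[:, :m] @ (np.sqrt(np.maximum(vals[:m], 0.0)) * xi)
```

    Each draw of the coefficient vector yields one perturbed geometry, so the m coefficients become a small set of additional uncertain inputs for the Kriging-based UQ study.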

    Moreover, we have developed an original and fast robust shape-optimization approach to overcome the limitations of a deterministic optimization that neglects operating-condition variability, applied to a typical 2D ORC turbine cascade (Biere). The flow around the blade is solved by means of inviscid simulations with the open-source SU2 code, taking non-ideal gas effects into account through the Peng-Robinson-Stryjek-Vera equation of state, from which a Quantity of Interest (QoI) is recovered. We proposed a mono-objective formulation consisting in minimizing the α-quantile of the QoI under a constraint, at a low computational cost. This is achieved with an efficient robust-optimization approach coupling a state-of-the-art quantile estimator with a classical Bayesian optimization method. First, the advantages of a quantile-based formulation are illustrated with respect to a classical mean-based robust optimization. Secondly, we demonstrated the effectiveness of applying this robust-optimization framework with a low-fidelity inviscid solver by comparing the resulting optimal design with those obtained by a deterministic optimization using a high-fidelity turbulent solver.
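    The quantile-based formulation can be illustrated on a toy problem; the QoI and the operating-condition distribution below are hypothetical stand-ins for the SU2-based chain. For each candidate design, the α-quantile of the QoI under operating-condition uncertainty is estimated by Monte Carlo, and the design minimizing that quantile is retained: a quantile objective penalizes tail degradation that a mean-based objective would ignore.

```python
import numpy as np

rng = np.random.default_rng(3)

def qoi(design, theta):
    # Hypothetical stand-in for the CFD-based quantity of interest: a loss
    # depending on a scalar design variable and an uncertain condition theta.
    return (design - 1.0)**2 + 0.5 * design * theta

def robust_objective(design, alpha=0.95, n=20_000):
    """Monte Carlo estimate of the alpha-quantile of the QoI."""
    theta = rng.normal(0.0, 0.3, size=n)  # assumed inlet-condition scatter
    return np.quantile(qoi(design, theta), alpha)

# Coarse design sweep (a Bayesian optimization loop would replace this):
# the quantile objective penalizes designs whose performance degrades in
# the tail, shifting the optimum away from the deterministic one (1.0).
designs = np.linspace(0.0, 2.0, 41)
best = designs[np.argmin([robust_objective(d) for d in designs])]
```

    In this toy case the mean-based optimum coincides with the deterministic one, while the 95%-quantile optimum moves to a slightly smaller design value that trades nominal performance for tail robustness.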
