ABSTRACT

Complex computer codes are often too time-expensive to be used directly for uncertainty propagation or sensitivity analysis. A solution to this problem consists in replacing the CPU-time-expensive computer model with an inexpensive mathematical function, called a metamodel. Among the metamodels classically used in computer experiments, the Gaussian process (Gp) model has shown strong capabilities for solving practical problems. However, in the case of high-dimensional experiments (typically with several tens of inputs), building the Gp metamodel remains difficult. To address this limitation, we propose a general methodology which combines several advanced statistical tools. First, an initial space-filling design is built to provide full coverage of the high-dimensional input space (a Latin hypercube sample with an optimal discrepancy property). From this design, a screening based on dependence measures is performed. More specifically, we use the Hilbert-Schmidt independence criterion, which builds upon kernel-based approaches for detecting dependence. It allows the inputs to be ordered by decreasing primary influence for the purpose of metamodeling. Furthermore, significance tests based either on asymptotic theory or on a permutation technique are performed to identify a group of potentially non-influential inputs. A joint Gp metamodel is then sequentially built with the group of influential inputs as explanatory variables, while the residual effect of the group of non-influential inputs is captured by the dispersion part of the joint metamodel. Finally, a sensitivity analysis based on variance decomposition can be performed through the joint Gp metamodel. The efficiency of the methodology is illustrated on a thermal-hydraulic calculation case simulating an accidental scenario in a Pressurized Water Reactor.
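
As a rough illustration of the screening step described above, the sketch below ranks the inputs of a toy function by their HSIC with the output and attaches permutation-based p-values. The Gaussian kernels, the median-heuristic bandwidths, the plain (non-discrepancy-optimized) Latin hypercube, the number of permutations, and the toy simulator are all assumptions made for this example; they stand in for, but do not reproduce, the paper's exact choices.

```python
# Minimal sketch of HSIC-based screening on a space-filling design.
# Assumptions (not from the paper): Gaussian kernels with median-heuristic
# bandwidths, the biased V-statistic HSIC estimator, a plain LHS, and a
# toy function standing in for the expensive simulator.
import numpy as np
from scipy.stats import qmc

def gaussian_gram(x, bandwidth):
    """Gram matrix of a 1-D sample under a Gaussian kernel."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def hsic(x, y):
    """Biased V-statistic estimator of HSIC between two 1-D samples."""
    n = len(x)
    bx = np.median(np.abs(x[:, None] - x[None, :])) or 1.0  # median heuristic
    by = np.median(np.abs(y[:, None] - y[None, :])) or 1.0
    K, L = gaussian_gram(x, bx), gaussian_gram(y, by)
    H = np.eye(n) - np.ones((n, n)) / n                      # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

def hsic_perm_pvalue(x, y, n_perm=200, seed=0):
    """Permutation p-value for the null hypothesis 'x and y are independent'."""
    rng = np.random.default_rng(seed)
    h0 = hsic(x, y)
    perms = [hsic(x, rng.permutation(y)) for _ in range(n_perm)]
    return (1 + sum(h >= h0 for h in perms)) / (1 + n_perm)

# Space-filling design over d inputs on [0, 1]^d.
d, n = 10, 200
X = qmc.LatinHypercube(d=d, seed=0).random(n)

# Hypothetical expensive simulator: only the first three inputs matter.
y = np.sin(np.pi * X[:, 0]) + 2.0 * X[:, 1] ** 2 + X[:, 2]

# Screening: rank inputs by decreasing HSIC and flag likely non-influential ones.
scores = np.array([hsic(X[:, j], y) for j in range(d)])
pvals = np.array([hsic_perm_pvalue(X[:, j], y) for j in range(d)])
print("ranking:", np.argsort(scores)[::-1])   # influential inputs should come first
print("p-values:", np.round(pvals, 3))        # large p-values suggest non-influence
```

In the full methodology, the inputs retained by such a screening would become the explanatory variables of the joint Gp metamodel, while the inputs flagged as non-influential would be relegated to its dispersion component.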