Journal article in Annals of Nuclear Energy, 2023

An EIM-based compression-extrapolation tool for efficient treatment of homogenized cross-section data

Abstract

Nuclear reactor simulators implementing the widespread two-step deterministic calculation scheme tend to produce a large volume of intermediate data at the interface of their two subcodes – up to dozens or even hundreds of gigabytes – which can be so cumbersome that it hinders the overall performance of the code. The vast majority of this data consists of ‘‘few-group homogenized cross-sections’’, nuclear quantities stored in the form of tabulated multivariate functions which can be precomputed to a large extent. It was noticed in Tomatis (2021) that few-group homogenized cross-sections are highly redundant — that is, they exhibit strong correlations — which paves the way for the use of compression techniques. We pursue this line of work by introducing a new coupled compression/surrogate-modeling tool based on the Empirical Interpolation Method (EIM), an algorithm originally developed in the framework of partial differential equations (Barrault et al., 2004). This EIM compression method is based on the infinite norm ∥ ⋅ ∥∞ and proceeds in a greedy manner: it iteratively approximates the data and incorporates the chunks of information that cause the largest error. In the process, it generates a vector basis and a set of interpolation points, which together provide an elementary surrogate model able to approximate future data from little information. The algorithm is also well suited to parallelization and out-of-core computation (processing of data too large for the computer's RAM), and it is easy to understand and implement. This method enables us both to compress cross-sections efficiently and to spare a large fraction of the required lattice calculations. We investigate its performance on large, realistic nuclear data replicating the well-known VERA benchmark (Godfrey, 2014) (20 energy groups, pin-by-pin homogenization, 10 particularized isotopes). Compression loss, memory savings and speed are analyzed both from a data-centric point of view, with applications in neutronics in mind, and by comparison with an existing and widely used method – stochastic truncated SVD – to assess mathematical efficiency. We discuss the usage of our surrogate model and its sensitivity to the choice of the training set. The method is shown to be competitive in terms of accuracy and speed, to provide substantial memory savings, and to spare a large amount of physics-code computation; all of this could facilitate the adoption of fine-grained modeling schemes (pin-by-pin and many-group homogenization, particularization of many isotopes) in industrial setups. A GitHub repository containing all the methods used for the article is available.
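To make the greedy procedure concrete, below is a minimal NumPy sketch of a generic, textbook-style EIM compression loop applied to a snapshot matrix (columns = cross-section snapshots). The function name eim_compress and all implementation details are illustrative assumptions, not the authors' code; their actual implementation lives in the GitHub repository mentioned above.

import numpy as np

def eim_compress(A, rank, tol=1e-8):
    # Greedy EIM on a data matrix A (rows = entries, columns = snapshots).
    # Returns a basis Q (n x r) and interpolation indices pts (length r)
    # such that a column a of A is approximated by
    #     a_hat = Q @ solve(Q[pts, :], a[pts]),
    # i.e. the interpolant matches a exactly at the selected rows.
    n, m = A.shape
    Q = np.zeros((n, 0))
    pts = []
    R = A.copy()                                # current residuals of all snapshots
    for _ in range(rank):
        j = np.argmax(np.abs(R).max(axis=0))    # snapshot with worst sup-norm error
        r = R[:, j]
        if np.abs(r).max() < tol:               # all snapshots already well approximated
            break
        i = int(np.argmax(np.abs(r)))           # new interpolation ("magic") point
        Q = np.column_stack([Q, r / r[i]])      # new basis vector, normalized at point i
        pts.append(i)
        # Update residuals: interpolate every snapshot at the points, subtract.
        # Q[pts, :] is unit lower triangular by construction, so the solve is cheap.
        coeffs = np.linalg.solve(Q[pts, :], A[pts, :])
        R = A - Q @ coeffs
    return Q, np.array(pts)

The surrogate-model use follows directly: for a new configuration, one only computes the few entries a_pts at the interpolation points (e.g. a handful of lattice calculations) and reconstructs the full vector as Q @ np.linalg.solve(Q[pts, :], a_pts), instead of computing every entry.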
No file deposited

Dates and versions

cea-04216802, version 1 (25-09-2023)

Identifiers

Cite

Olivier Truffinet, Karim Ammar, Nicolas Gérard Castaing, Jean-Philippe Argaud, Bertrand Bouriquet. An EIM-based compression-extrapolation tool for efficient treatment of homogenized cross-section data. Annals of Nuclear Energy, 2023, 185, pp.109705. ⟨10.1016/j.anucene.2023.109705⟩. ⟨cea-04216802⟩