
International Conference on Computational Science (ICCS) 2020, Date: 2020/06/03 - 2020/06/05, Location: Amsterdam, NL

Publication date: 2020-06-15
Volume: 12142 Pages: 374–388
ISBN: 978-3-030-50432-8
Publisher: Springer

Lecture Notes in Computer Science (LNCS)

Authors:

Loevbak, Emil Andre; Mortier, Bert; Samaey, Giovanni; Vandewalle, Stefan

Editors:

Krzhizhanovskaya, Valeria V.; Závodszky, Gábor; Lees, Michael H.; Dongarra, Jack J.; Sloot, Peter M.A.; Brissos, Sérgio; Teixeira, João

Keywords:

Multilevel Monte Carlo; Kinetic equations; Diffusive scaling; Particle methods

Abstract:

In many applications, it is necessary to compute the time-dependent distribution of an ensemble of particles subject to transport and collision phenomena. Kinetic equations are PDEs that model such particles in a position-velocity phase space. In the low-collision regime, explicit particle-based Monte Carlo methods simulate these high-dimensional equations efficiently but, as the collision rate increases, they suffer from severe time-step constraints. In the high-collision regime, asymptotic-preserving particle schemes are able to produce stable results. However, this stability comes at the cost of a bias in the computed results. In earlier work, the multilevel Monte Carlo method was used to reduce this bias by combining simulations with large and small time steps. This multilevel scheme, however, still exhibits large variances when correlating fine and coarse simulations, which leads to sub-optimal multilevel performance. In this work, we present an improved correlation approach that decreases the variance when bridging the gap from large time steps to time steps comparable to the collision rate. We further demonstrate that this reduced variance results in a sharply reduced simulation cost, at the expense of a small bias.
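The multilevel idea the abstract relies on, combining many cheap coarse-time-step samples with a few correlated fine-time-step corrections, can be illustrated on a toy problem. The sketch below is not the paper's kinetic particle scheme: it applies the standard multilevel Monte Carlo telescoping estimator to an Ornstein-Uhlenbeck SDE, with all function names, parameters, and the choice of payoff being illustrative assumptions.

```python
import numpy as np

def mlmc_level(level, n_samples, T=1.0, dt0=0.5, rng=None):
    """Estimate one term of the MLMC telescoping sum for E[X(T)^2],
    where X solves the Ornstein-Uhlenbeck SDE dX = -X dt + dW, X(0) = 0.

    Level l uses time step dt0 / 2**l. For l > 0 the coarse path is
    driven by the *same* Brownian increments as the fine path, so the
    difference of the two payoffs has small variance -- the coarse/fine
    correlation that multilevel Monte Carlo depends on.
    """
    if rng is None:
        rng = np.random.default_rng()
    dt_f = dt0 / 2**level          # fine time step on this level
    n_f = round(T / dt_f)          # number of fine steps (even for l > 0)
    X_f = np.zeros(n_samples)      # fine-path ensemble
    X_c = np.zeros(n_samples)      # coarse-path ensemble (dt = 2 * dt_f)
    for _ in range(0, n_f, 2):
        dW1 = rng.normal(0.0, np.sqrt(dt_f), n_samples)
        dW2 = rng.normal(0.0, np.sqrt(dt_f), n_samples)
        # two fine Euler-Maruyama steps
        X_f += -X_f * dt_f + dW1
        X_f += -X_f * dt_f + dW2
        if level > 0:
            # one coarse step driven by the summed increments
            X_c += -X_c * (2.0 * dt_f) + (dW1 + dW2)
    if level == 0:
        return np.mean(X_f**2)
    return np.mean(X_f**2 - X_c**2)

def mlmc_estimate(max_level, n_samples, **kw):
    # Telescoping sum: E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]
    return sum(mlmc_level(l, n_samples, **kw) for l in range(max_level + 1))
```

Because the correction terms have small variance, few samples suffice on the expensive fine levels, which is the source of the multilevel cost gain. The paper addresses the harder question of keeping this coarse/fine correlation strong for asymptotic-preserving kinetic particle schemes across the diffusive scaling.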