A Local Approach to Parameter Space Reduction for Regression and Classification Tasks
F. Romor, M. Tezzele, and G. Rozza. Journal of Scientific Computing, vol. 99, no. 3, p. 83, 2024.
Abstract: Parameter space reduction has proven to be a crucial tool for speeding up the execution of many numerical tasks such as optimization, inverse problems, sensitivity analysis, and surrogate model design, especially in the presence of high-dimensional parametrized systems. In this work we propose a new method called local active subspaces (LAS), which exploits the synergies between active subspaces and supervised clustering techniques in order to carry out a more efficient dimension reduction in the parameter space. The clustering is performed without losing the input–output relations by introducing a distance metric induced by the global active subspace. We present two possible clustering algorithms: K-medoids and a hierarchical top-down approach, which can impose a variety of subdivision criteria specifically tailored to parameter space reduction tasks. This method is particularly useful for the community working on surrogate modelling. Frequently, the parameter space presents subdomains where the objective function of interest varies less on average along different directions, so it can be approximated more accurately when restricted to those subdomains and studied separately. We test the new method on several numerical experiments of increasing complexity; we also show how to deal with vectorial outputs and how to classify the different regions with respect to the LAS dimension. Employing this classification technique as a preprocessing step in the parameter space, or in the output space in the case of vectorial outputs, brings remarkable results for the purpose of surrogate modelling.
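As a rough illustration of the LAS pipeline described in the abstract, the sketch below estimates a global active subspace from gradient samples and then clusters the inputs with a plain K-medoids loop. It assumes the AS-induced metric is the Euclidean distance between projections onto the global active subspace; all function names, the quadratic toy model, and the hyperparameters are illustrative, not taken from the paper's implementation.

```python
# Minimal LAS-style sketch: global active subspace + K-medoids clustering
# under a distance induced by the AS projection (an assumption, see above).
import numpy as np

def global_active_subspace(gradients, n_active):
    """Eigendecompose C = E[grad f grad f^T], estimated by Monte Carlo."""
    C = gradients.T @ gradients / gradients.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]            # descending eigenvalues
    return eigvecs[:, order[:n_active]]          # active directions W1

def as_distance(a, b, W1):
    """Assumed metric: Euclidean distance between AS projections."""
    return np.linalg.norm(W1.T @ (a - b))

def k_medoids(X, W1, k, n_iter=50, seed=0):
    """Plain K-medoids under as_distance (a simple PAM-style loop)."""
    rng = np.random.default_rng(seed)
    D = np.array([[as_distance(a, b, W1) for b in X] for a in X])
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size:  # medoid = member minimizing total in-cluster distance
                new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    labels = np.argmin(D[:, medoids], axis=1)
    return labels, medoids

# Toy usage: f(x) = (w @ x)^2 has a one-dimensional active subspace along w.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 5))
w = np.array([1.0, 0.5, 0.0, 0.0, 0.0])
grads = 2 * (X @ w)[:, None] * w                 # analytic gradient of f
W1 = global_active_subspace(grads, n_active=1)
labels, medoids = k_medoids(X, W1, k=3)
```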
A Supervised Learning Approach Involving Active Subspaces for an Efficient Genetic Algorithm in High-Dimensional Optimization Problems
N. Demo, M. Tezzele, and G. Rozza. SIAM Journal on Scientific Computing, vol. 43, no. 3, pp. B831–B853, 2021.
Abstract: In this work, we present an extension of the genetic algorithm (GA) which exploits the supervised learning technique called active subspaces (AS) to evolve the individuals in a lower-dimensional space. In many cases, GA in fact requires more function evaluations than other optimization methods to converge to the global optimum, so complex and high-dimensional functions can be extremely demanding, from the computational point of view, to optimize with the standard algorithm. To address this issue, we propose to linearly map the input parameter space of the original function onto its AS before the evolution, performing the mutation and mating processes in a lower-dimensional space. In this contribution, we describe the novel method, called ASGA, presenting its differences from and similarities with the standard GA method. We test the proposed method on n-dimensional benchmark functions (Rosenbrock, Ackley, Bohachevsky, Rastrigin, Schaffer N. 7, and Zakharov), and finally we apply it to an aeronautical shape optimization problem.
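The sketch below illustrates the ASGA idea under stated assumptions: selection happens in the full space, blend crossover and Gaussian mutation act on the active variables y = W1^T x, and offspring are mapped back to the full space by sampling the inactive coordinates z in x = W1 y + W2 z. The Gaussian back-mapping, the truncation selection, and all hyperparameters are illustrative choices, not the paper's exact scheme.

```python
# Minimal ASGA-style sketch: evolve a population on the active subspace
# (assumptions noted above; not the paper's reference implementation).
import numpy as np

def evolve_on_active_subspace(f, X, W1, W2, n_gen=50, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = X.copy()
    for _ in range(n_gen):
        fit = np.array([f(x) for x in pop])
        parents = pop[np.argsort(fit)[: len(pop) // 2]]  # truncation selection
        Y = parents @ W1                                 # project to active variables
        kids = []
        while len(kids) < len(pop) - len(parents):
            i, j = rng.choice(len(Y), size=2, replace=False)
            alpha = rng.random()
            y = alpha * Y[i] + (1 - alpha) * Y[j]        # blend crossover, reduced space
            y += sigma * rng.standard_normal(y.shape)    # Gaussian mutation, reduced space
            z = rng.standard_normal(W2.shape[1])         # sample inactive coordinates
            kids.append(W1 @ y + W2 @ z)                 # back-map to full space
        pop = np.vstack([parents, np.array(kids)])
    fit = np.array([f(x) for x in pop])
    return pop[np.argmin(fit)]

# Toy usage: a 10-dimensional objective that varies only on a 2-dimensional
# active subspace, so the evolution can happen effectively in 2 variables.
rng = np.random.default_rng(2)
d = 10
W, _ = np.linalg.qr(rng.standard_normal((d, d)))         # orthonormal basis
W1, W2 = W[:, :2], W[:, 2:]                              # active / inactive split
f = lambda x: np.sum((W1.T @ x) ** 2)
x0 = rng.standard_normal((20, d))
best = evolve_on_active_subspace(f, x0, W1, W2)
```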
F. Romor, M. Tezzele, M. Mrosek, C. Othmer, and G. Rozza, “Multi-fidelity data fusion through parameter space reduction with applications to automotive engineering”, International Journal for Numerical Methods in Engineering, vol. 124, no. 23, pp. 5293–5311, 2023. doi: 10.1002/nme.7349.
F. Romor, M. Tezzele, A. Lario, and G. Rozza, “Kernel-based active subspaces with application to computational fluid dynamics parametric problems using discontinuous Galerkin method”, International Journal for Numerical Methods in Engineering, vol. 123, no. 23, pp. 6000–6027, 2022. doi: 10.1002/nme.7099.
M. Tezzele, F. Salmoiraghi, A. Mola, and G. Rozza, “Dimension reduction in heterogeneous parametric spaces with application to naval engineering shape design problems”, Advanced Modeling and Simulation in Engineering Sciences, vol. 5, no. 1, p. 25, Sep. 2018, ISSN: 2213-7467. doi: 10.1186/s40323-018-0118-3.