Weekly Seminar
Incremental low-rank semismooth Newton in samplet-compressed kernel approximation
Chiara Segala (USI, Lugano, Switzerland)
Thu, 12 Mar 2026
• 10:15 coffee/tea in EDDy's room 256, bldg 1953 // 10:45–11:45 talk in room 008
• Pontdriesch 14, room 008 SeMath (host: Michael Herty)
Abstract
We propose an incremental low-rank semismooth Newton method for efficient $\ell^1$-regularized kernel approximation in a samplet-compressed framework. The approach applies a normal-map-based semismooth Newton iteration that exploits second-order information while preserving sparsity. To ensure scalability, we introduce an incremental singular value decomposition (SVD) that updates the Newton system efficiently as the active set evolves, avoiding costly recomputation involving the full kernel matrix. Combined with the samplet basis, the proposed scheme yields compact representations and accelerated convergence. Numerical experiments demonstrate the accuracy and computational advantages of the method.
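To illustrate the kind of incremental SVD update mentioned in the abstract, here is a minimal sketch of a Brand-style rank-one column-append update in NumPy. This is a generic textbook construction, not the speaker's actual algorithm; the function name and the assumption that the appended column has a component outside the current range are ours.

```python
import numpy as np

def svd_append_column(U, s, Vt, c):
    """Given a thin SVD A = U @ diag(s) @ Vt, return the thin SVD of
    [A, c] without recomputing the SVD of the full matrix.
    Assumes c is not (numerically) contained in the range of U."""
    k = s.size
    m = U.T @ c                      # component of c in span(U)
    p = c - U @ m                    # orthogonal residual
    p_norm = np.linalg.norm(p)       # assumed > 0 in this sketch
    P = (p / p_norm)[:, None]
    # Small (k+1) x (k+1) core matrix: [A, c] = [U, P] @ K @ blkdiag(Vt, 1)
    K = np.zeros((k + 1, k + 1))
    K[:k, :k] = np.diag(s)
    K[:k, -1] = m
    K[-1, -1] = p_norm
    Uk, sk, Vtk = np.linalg.svd(K)   # cheap: size depends only on the rank
    U_new = np.hstack([U, P]) @ Uk
    V_ext = np.zeros((Vt.shape[1] + 1, k + 1))
    V_ext[:Vt.shape[1], :k] = Vt.T
    V_ext[-1, -1] = 1.0
    Vt_new = (V_ext @ Vtk.T).T
    return U_new, sk, Vt_new

# Check the update against a direct reconstruction of [A, c].
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
c = rng.standard_normal(6)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
U2, s2, Vt2 = svd_append_column(U, s, Vt, c)
err = np.linalg.norm(U2 @ np.diag(s2) @ Vt2 - np.hstack([A, c[:, None]]))
```

The cost of each update is dominated by the SVD of the small core matrix `K`, which is why such updates scale well when the active set grows one column at a time.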