KamNet: an integrated spatiotemporal deep neural network for rare event searches in KamLAND-Zen

Files
PhysRevC.107.014323.pdf (1.62 MB)
Published version
Date
2023-01-30
Authors
Li, A.
Fu, Z.
Grant, Christopher
Ozaki, H.
Shimizu, I.
Song, H.
Takeuchi, A.
Winslow, L.A.
Version
Published version
Citation
A. Li, Z. Fu, C. Grant, H. Ozaki, I. Shimizu, H. Song, A. Takeuchi, L.A. Winslow. 2023. "KamNet: An integrated spatiotemporal deep neural network for rare event searches in KamLAND-Zen" Physical Review C, Volume 107, Issue 1. https://doi.org/10.1103/physrevc.107.014323
Abstract
Rare event searches allow us to probe new physics at energy scales inaccessible by other means by leveraging specialized large-mass detectors. Machine learning provides a new tool to maximize the information provided by these detectors. The information is sparse, which forces these algorithms to start from the lowest-level data and exploit all symmetries in the detector to produce results. In this work we present KamNet, which harnesses breakthroughs in geometric deep learning and spatiotemporal data analysis to maximize the physics reach of KamLAND-Zen, a kiloton-scale spherical liquid scintillator detector searching for neutrinoless double-beta decay (0νββ). Using a simplified background model for KamLAND, we show that KamNet outperforms a conventional convolutional neural network (CNN) on benchmark Monte Carlo simulations with an increasing level of robustness. Using simulated data, we then demonstrate KamNet's ability to increase KamLAND-Zen's sensitivity to 0νββ and 2νββ decay to excited states. A key component of this work is the addition of an attention mechanism to elucidate the underlying physics that KamNet uses for background rejection.
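The abstract describes a spatial-encoder -> temporal-model -> attention -> classifier pipeline. Below is a minimal, hypothetical PyTorch sketch of that flow, not the authors' implementation: it stands in for the geometric (spherical) convolutions with ordinary 2D convolutions on a (theta, phi) hit-pattern grid, uses an LSTM over time slices, and applies a simple attention over time steps whose weights indicate which parts of an event's time evolution drive the signal-versus-background score. All class and layer sizes here are illustrative assumptions.

# Hypothetical sketch only; replaces spherical convolutions with plain 2D
# convolutions and keeps just the spatiotemporal-plus-attention structure.
import torch
import torch.nn as nn

class ToySpatioTemporalNet(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Per-time-slice spatial encoder (stand-in for geometric convolutions).
        self.spatial = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),   # -> 32*4*4 features
        )
        # Temporal model over the sequence of encoded time slices.
        self.temporal = nn.LSTM(input_size=32 * 4 * 4,
                                hidden_size=hidden, batch_first=True)
        # Attention over time steps: which slices matter for the decision.
        self.attn = nn.Linear(hidden, 1)
        # Binary signal-vs-background score.
        self.classifier = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, time, 1, theta, phi) hit-pattern "movie" of an event.
        b, t = x.shape[:2]
        feats = self.spatial(x.flatten(0, 1)).view(b, t, -1)
        seq, _ = self.temporal(feats)                      # (batch, time, hidden)
        weights = torch.softmax(self.attn(seq), dim=1)     # (batch, time, 1)
        context = (weights * seq).sum(dim=1)               # attention-weighted summary
        return self.classifier(context).squeeze(-1), weights.squeeze(-1)

if __name__ == "__main__":
    model = ToySpatioTemporalNet()
    events = torch.randn(2, 8, 1, 24, 48)   # 2 fake events, 8 time slices each
    score, attn = model(events)
    print(score.shape, attn.shape)          # torch.Size([2]) torch.Size([2, 8])

Returning the attention weights alongside the score mirrors the interpretability goal stated in the abstract: the per-time-step weights can be inspected to see which portion of an event's evolution the network relies on for background rejection.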
License
©2023 American Physical Society. Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.