Title: Data Cleaning with Traditional and Artificial Intelligence Methods for the LEGEND Experiment
Abstract: Neutrinoless double beta decay (0νββ) is a theorized radioactive disintegration process in which two neutrons decay into two protons and two electrons, without emitting any neutrinos. This process, if observed, would violate the conservation of lepton number and indicate that neutrinos are their own antiparticles. Detecting this decay would provide key insights into the nature of neutrinos and physics beyond the Standard Model. The Large Enriched Germanium Experiment for Neutrinoless ββ Decay (LEGEND) is an international collaboration aiming to detect 0νββ using high-purity germanium (HPGe) detectors enriched in 76Ge. LEGEND will operate in two phases, with the first (second) phase deploying up to 200 (1000) kg of HPGe detectors to set a lower limit on the 0νββ half-life of 10^27 (10^28) years.
Signals captured by HPGe detectors, also referred to as waveforms, are created by physical particle interactions. Digitized signals from HPGe detectors also contain non-physical events, such as transient anomalies due to injected noise or crosstalk in the electronics chain. Data cleaning aims to distinguish physical from anomalous events, with the ultimate goal of removing non-physical signals from datasets. Traditional data cleaning consists of creating and fine-tuning parameters based on digital signal processing (DSP). A hierarchical data cleaning method based on aggregations of DSP cuts was employed in LEGEND-200. Data cleaning cuts were studied and validated prior to the unblinding of LEGEND-200 data for its first 0νββ result. Traditional methods yielded data cleaning efficiencies ranging from 95.3–98.6%.
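To make the idea of DSP-based cuts concrete, the following is a minimal sketch of how a few waveform parameters might be computed and then aggregated into a single pass/fail data-cleaning decision. The parameter names, thresholds, and toy waveform are illustrative assumptions, not the actual LEGEND-200 pipeline.

```python
# Illustrative sketch only: hypothetical DSP parameters and thresholds,
# not the LEGEND-200 data-cleaning implementation.
import numpy as np

def dsp_parameters(waveform, baseline_samples=1000):
    """Compute simple DSP quantities from a digitized waveform."""
    baseline = waveform[:baseline_samples]
    bl_mean = baseline.mean()
    bl_std = baseline.std()
    corrected = waveform - bl_mean
    max_amp = corrected.max()
    # Crude tail slope: linear fit over the last quarter of the trace
    tail = corrected[-len(waveform) // 4:]
    slope = np.polyfit(np.arange(tail.size), tail, 1)[0]
    return {"bl_std": bl_std, "max_amp": max_amp, "tail_slope": slope}

def passes_cleaning(params,
                    max_bl_std=5.0,       # hypothetical baseline-noise cut
                    min_amp=50.0,         # hypothetical minimum amplitude
                    max_tail_slope=0.1):  # hypothetical tail-slope cut
    """Aggregate individual DSP cuts into a single pass/fail decision."""
    cuts = {
        "quiet_baseline": params["bl_std"] < max_bl_std,
        "physical_amplitude": params["max_amp"] > min_amp,
        "decaying_tail": params["tail_slope"] < max_tail_slope,
    }
    return all(cuts.values()), cuts

# Toy example: flat baseline plus a pulse with an exponential-decay tail
rng = np.random.default_rng(0)
t = np.arange(4000)
wf = 20 + rng.normal(0, 2, t.size)
wf[1500:] += 300 * np.exp(-(t[1500:] - 1500) / 800)

ok, cut_flags = passes_cleaning(dsp_parameters(wf))
print(ok, cut_flags)
```

In a hierarchical scheme of this kind, events failing any individual cut can be inspected by category, while the aggregated flag is what removes non-physical signals from the analysis dataset.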
Traditional data cleaning takes significant time and effort, scaling up with the number of deployed detectors in LEGEND…
2:00-4:00pm, Phillips 277; or by Zoom, https://unc.zoom.us/j/94604230816