Presentation
Enhancing High Energy Physics Analysis: Advancements in Computing Infrastructure and Software for the LHC and Future
Presenter
Description
High Energy Physics (HEP) is fundamentally statistical, relying on the Standard Model (SM) hypothesis, which encapsulates entities like the Higgs Boson, Quarks, Leptons, and force-mediating Bosons. Despite its comprehensive framework, the SM has limitations and is unable to explain several phenomena. Particle accelerators such as the LHC serve as tools for investigating the SM's potential inadequacies, offering clues that might lead to beyond Standard Model (BSM) physics. A significant challenge in HEP is handling enormous data volumes in order to search for new particles or to scrutinize exceptionally rare SM processes, where any enhancement in event rates may come from BSM physics. Since the beginning, the development of robust computing infrastructure and software has been crucial for effectively managing and analyzing this data. This includes leveraging heterogeneous computing, harnessing the power of GPUs or FPGAs, and integrating machine learning and AI into analysis workflows to handle data more efficiently. With the LHC set to evolve into the High Luminosity LHC, significantly increasing data volumes, it is essential to fortify our computational capabilities. This presentation will discuss the current developments, highlighting the integration of innovative tools that empower physicists to analyze data more proficiently and pave the way for the future of HEP.
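As a minimal illustration of this statistical framing, the sketch below computes the approximate discovery significance of an excess of events over an expected SM background using the standard asymptotic counting-experiment formula Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)); the event counts s and b here are assumed values for illustration only, not results from the presentation.

import numpy as np

def asimov_significance(s, b):
    # Approximate discovery significance (in sigma) for s expected signal events
    # on top of b expected background events, using the asymptotic formula
    # Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)).
    s, b = float(s), float(b)
    return np.sqrt(2.0 * ((s + b) * np.log(1.0 + s / b) - s))

# Hypothetical counts: 50 signal-like events over 1000 expected SM background events.
print(round(asimov_significance(50, 1000), 2))  # -> 1.57 (sigma)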
Time
Monday, June 3, 13:00 - 13:30 CEST
Location
HG E 1.2
Session Chair
Event Type
Minisymposium
Engineering
Physics
Computational Methods and Applied Mathematics