<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/"
           xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Horzela, Maximilian Maria</dc:creator>
  <dc:date>2023-11-24</dc:date>
  <dc:description>The achievable precision of predictions for observables measured at the LHC experiments depends on the amount of invested computing power and on the precision of the input parameters entering the calculation. Currently, no theory exists that can derive the values of the input parameters of perturbative calculations from first principles. Instead, they have to be extracted from dedicated analyses that measure observables sensitive to these parameters with high precision. Such an analysis is presented: it measures the production cross section of oppositely charged muon pairs with an invariant mass close to the mass of the $\mathrm{Z}$ boson, produced in association with jets, in a phase space divided into bins of the transverse momentum of the dimuon system $p_T^\text{Z}$ and of two observables, $y^*$ and $y_b$, constructed from the rapidities of the dimuon system and of the jet with the highest momentum. To achieve the highest statistical precision in this triple-differential measurement, the full dataset recorded by the CMS experiment at a collision energy of $\sqrt{s}=13\,\mathrm{TeV}$ in the years 2016 to 2018 is combined. The measured cross sections are compared to theoretical predictions approximating full NNLO accuracy in perturbative QCD. Deviations from these predictions are observed, rendering further studies at full NNLO accuracy necessary. To obtain the measured results, large amounts of data are processed and analysed on distributed computing infrastructures. Theoretical calculations pose similar computing demands. Consequently, the LHC collaborations require substantial amounts of storage and processing resources. These requirements are met in large part by the resources of the WLCG, a complex federation of globally distributed computer centres. With the upgrade of the LHC and of the experiments for the HL-LHC era, the computing demands are expected to increase substantially. Therefore, the prevailing computing models need to be updated to cope with these unprecedented demands. To support the design of future adaptations of HEP workflow execution on such infrastructures, a simulation model is developed, and an implementation is tested on infrastructure design candidates inspired by a proposal of the German HEP computing community.
The presented study of these infrastructure candidates showcases the applicability of the simulation tool to the strategic development of a future HEP computing infrastructure in the HL-LHC context.</dc:description>
  <dc:identifier>https://publish.etp.kit.edu/record/22205</dc:identifier>
  <dc:identifier>oai:publish.etp.kit.edu:22205</dc:identifier>
  <dc:language>eng</dc:language>
  <dc:relation>url:https://publish.etp.kit.edu/communities/cms</dc:relation>
  <dc:relation>url:https://publish.etp.kit.edu/communities/computing</dc:relation>
  <dc:relation>url:https://publish.etp.kit.edu/communities/etp</dc:relation>
  <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
  <dc:subject>LHC Run 2</dc:subject>
  <dc:subject>Standard Model</dc:subject>
  <dc:subject>Muons</dc:subject>
  <dc:subject>Jets</dc:subject>
  <dc:subject>Non-perturbative QCD</dc:subject>
  <dc:subject>Distributed Computing</dc:subject>
  <dc:subject>Grid Computing</dc:subject>
  <dc:subject>Simulation</dc:subject>
  <dc:subject>Infrastructure Design</dc:subject>
  <dc:title>Measurement of Triple-Differential Z+Jet Cross Sections with the CMS Detector at 13 TeV and Modelling of Large-Scale Distributed Computing Systems</dc:title>
  <dc:type>info:eu-repo/semantics/doctoralThesis</dc:type>
  <dc:type>thesis-phd-thesis</dc:type>
</oai_dc:dc>
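
The description references the observables $y^*$ and $y_b$ without defining them. Assuming the conventional definitions used in triple-differential dijet and Z+jet analyses (an assumption; the thesis itself defines the exact construction and binning), they are built from the rapidity $y_\mathrm{Z}$ of the dimuon system and the rapidity $y_\mathrm{jet}$ of the leading jet:

$$
y^* = \frac{1}{2}\,\lvert y_\mathrm{Z} - y_\mathrm{jet} \rvert\,, \qquad
y_b = \frac{1}{2}\,\lvert y_\mathrm{Z} + y_\mathrm{jet} \rvert\,.
$$

Under these definitions, $y^*$ measures the rapidity separation of the two objects, while $y_b$ measures the boost of the Z+jet system along the beam axis and is thereby sensitive to the momentum fractions carried by the incoming partons.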
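
The description also mentions a simulation model for HEP workflow execution on distributed infrastructures but names no tool. The following is a hypothetical minimal sketch, not the thesis's actual model, of the kind of discrete-event simulation such a study involves, written in Python with the simpy library; all site names, bandwidths, slot counts, and job parameters are invented for illustration.

```python
# Hypothetical sketch of a discrete-event model for HEP job execution on a
# distributed computing infrastructure, using the simpy library. This is
# NOT the simulation model developed in the thesis; every site name,
# bandwidth, slot count, and job parameter below is an invented example.
import random
import simpy

SITES = {
    # site name: (CPU slots, stage-in bandwidth in MB/s) -- assumed values
    "Tier1": (50, 1000.0),
    "Tier2": (20, 250.0),
}


def job(env, name, site, slots, bandwidth_mbps, input_mb, cpu_time_s):
    """Wait for a free CPU slot, stage in the input data, then compute."""
    with slots.request() as slot:
        yield slot                                    # queue for a CPU slot
        yield env.timeout(input_mb / bandwidth_mbps)  # stage-in over the link
        yield env.timeout(cpu_time_s)                 # run the payload
        print(f"t={env.now:9.1f}s  {name} finished on {site}")


def submit(env, pools, n_jobs=100):
    """Submit a Poisson stream of jobs to randomly chosen sites."""
    for i in range(n_jobs):
        site = random.choice(list(pools))
        slots, bandwidth = pools[site]
        env.process(job(env, f"job{i:03d}", site, slots, bandwidth,
                        input_mb=random.uniform(500, 2000),
                        cpu_time_s=random.uniform(600, 3600)))
        yield env.timeout(random.expovariate(1 / 10.0))  # mean 10 s gap


env = simpy.Environment()
pools = {name: (simpy.Resource(env, capacity=slots), bw)
         for name, (slots, bw) in SITES.items()}
env.process(submit(env, pools))
env.run()  # run until all submitted jobs have finished
```

Varying the resource and bandwidth parameters of such a model allows throughput to be compared across infrastructure configurations, which is the kind of question the study of design candidates described above addresses.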