Friday, May 24, 2002, 16:00
WHGA Auditorium
T. Wenaus, CERN
Abstract:
The Large Hadron Collider (LHC) at CERN will come into operation in
2007, opening a new frontier in particle physics with its
unprecedented collision energy and luminosity. Its general purpose
detectors ATLAS and CMS are optimised to maximise the discovery
potential for new physics such as Higgs bosons and supersymmetric
particles, while specialised detectors address heavy-ion physics (ALICE) and B physics (LHCb).
The LHC experiments will generate petabytes of data per year from
highly complex detectors, to be processed in analysis systems capable
of extracting physics signals at very small signal-to-background
ratios. Analysis will be performed by the roughly 2000 globally
distributed physicists in each of the large experiments. The
computing and software systems required for the LHC physics program
present challenges that demand the development and application of new
technologies and approaches in data management and processing,
software, and collaborative science. Innovation must come particularly
in the area of distributed analysis, and LHC scientists are working
in close collaboration with computer scientists to develop new
'grid computing' technologies. This talk will survey the challenges of
software and computing at the LHC and how they are being addressed.