This choice has multiple motivations, the most important being that 2008
was the turning-point year for these experiments (ALICE, ATLAS, CMS, and
LHCb), which after many years of preparation were essentially ready to
start (the first proton beams were circulated in the LHC collider in
September 2008, and the full start of data taking is scheduled for spring
2009). These experiments have played a crucial role in the evolution
of grid technologies in the last several years and notably in connection
with grid infrastructure projects. The most important projects are EGEE
(Enabling Grid for E-sciencE) in Europe [2], OSG (Open Science Grid) in the
US [3], and NDGF (Nordic Data Grid Facility) in the Nordic countries [4].
The HEP community and the HEP experiments have played a decisive role
in the evolution of grid technology. Their essential contribution was the
enthusiastic promotion of the idea of grid computing, formalized and
popularized by I. Foster and C. Kesselman in the late 1990s [5]: HEP
adopted grid technology as the foundation of its entire computing model
for the LHC era. This led to the creation of the first production
infrastructures based on grid technology.
The importance of the HEP role can be judged by the following facts:
1. The HEP community already had, at that time, established
experience in creating long-lived collaborations across different
and geographically distributed entities (universities, laboratories,
etc.), funded by the coherent effort of several funding agencies.
The HEP experiments already exceeded several hundred
collaborators from several tens of universities in the early 1990s
(e.g., the CDF experiment at Fermilab, USA). At the same time, while
still in their preparation phase, the LHC experiments were reach-
ing an even larger scale (the largest LHC experiment, ATLAS,
exceeds 2,100 physicists from 167 institutes in 37 countries). In a
sense, the HEP world was proving that the collaboration scale the
grid was suggesting was attainable, and even desirable, when
excellence and the optimization of resources require crossing exist-
ing borders (national, institutional, etc.).
2. The HEP community had already started a deep reflection on
the way to provide the necessary computing power and data-han-
dling capabilities for the LHC research program. The experience
of the CERN LEP experiments (active between 1989 and 2000 at
CERN) and of several other HEP experiments, such as CDF and
D0 (Fermilab), BaBar (SLAC), and NA48 and COMPASS (CERN), made
very clear the importance of computing for handling very
large data samples (in the 1 PB range). This was not new; from the
very beginning, nuclear and particle physics were early adopters
of new computing technologies. The new point was the observa-
tion that the computing infrastructure (software and hardware)