With the automated data movement in Lustre 2.5, the striped transfer mode will provide a significant performance improvement.
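Striping parameters on a Lustre file system are set per file or per directory with the standard lfs setstripe utility, so a project can match a directory's layout to its transfer pattern before writing data. The following Python sketch wraps that utility; the directory path, stripe count, and stripe size shown are illustrative assumptions, not Blue Waters defaults.

import subprocess

def set_stripe(path, count, size):
    # Apply a Lustre striping layout to a directory so that files
    # created in it are striped across `count` OSTs in `size` chunks.
    # A count of -1 would stripe across all available OSTs.
    subprocess.run(
        ["lfs", "setstripe", "-c", str(count), "-S", size, path],
        check=True,
    )

# Illustrative values only: stripe files created under
# /scratch/project/output across 16 OSTs in 4 MiB chunks.
set_stripe("/scratch/project/output", 16, "4m")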
3.4 Blue Waters Applications
The allocation of resources on the Blue Waters system is handled by several allocation bodies; the National Science Foundation (NSF) allocates 80% of available resources, with the remaining 20% allocated by the University of Illinois at Urbana-Champaign (UIUC), the Great Lakes Consortium for Petascale Computation (GLCPC), and the Blue Waters Project for Industrial, Educational and Innovation opportunities. During the first year of production, there were 30 NSF projects, 29 UIUC projects, 10 GLCPC projects, and a handful of Industry and Education projects. Collectively, the project teams are called Science and Engineering Teams (SETs). An up-to-date list of allocated projects is available from the Blue Waters portal.⁹ The 70 SET projects from the first year of production used a wide variety of applications with a comparably wide range of I/O requirements.
The SETs typically have more than one primary application. A partial list of applications used by the SETs during the first year of operations contains over 42 applications. The areas of research range from astrophysics (ENZO, pGADGET, PPM, MAESTRO, CASTRO), biochemistry (AMBER, Gromacs, NAMD), and climate science (CCCM, CESM), to geophysics (CyberShake, AWP-ODC), heliophysics (H3D, VPIC, OSIRIS), lattice quantum chromodynamics (MILC, Chroma), and turbulence (PSDNS, DISTUF).
The NSF SETs account for the largest portion of the time available on Blue Waters; many of these teams were involved with the Blue Waters project during the co-design and deployment phase. An understanding of the SETs' I/O methodologies and requirements came from analysis of their proposals, responses to questionnaires, and interviews with project team members. The information provided by the teams was used to evaluate the effectiveness of the Blue Waters file system design, its configuration, and the policies and practices used to manage the file system.
Blue Waters' focus is to enable investigations not possible elsewhere and to turn petascale computation and analysis at very large scales from a heroic event into an everyday one. Figure 3.6 shows a typical job mix for Blue Waters. The figure shows the job layout across the torus for the 10 largest running jobs, as displayed on the Blue Waters portal. Each circle represents one Gemini (two nodes) in the torus. As shown in the figure, 99.8% of the nodes were in use; a good day, but not far from typical. The layout also reflects the special attention paid to topology-aware scheduling to reduce torus contention. The largest job displayed uses 13,851 nodes (443,232 integer cores) and ran for 8 hours and 27 minutes.
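The core count follows directly from the node count: each Cray XE node on Blue Waters has two 16-core AMD Interlagos processors, or 32 integer cores per node. A minimal sketch of that arithmetic, in Python:

# Integer cores per Blue Waters XE node: two 16-core AMD
# Interlagos processors, i.e. 32 integer cores per node.
CORES_PER_NODE = 32

nodes = 13_851
print(nodes * CORES_PER_NODE)  # -> 443232, matching the figure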
⁹ bluewaters.ncsa.illinois.edu
 