• What is the degree of agyrotropy in the spatial vicinity of the X-line? In other words, is the density plot of the $U_{\perp,1}$ vs. $U_{\perp,2}$ components highly asymmetrical?
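As an illustration of this last question, the following is a minimal sketch (not the analysis code used in the study) of how such asymmetry might be quantified, assuming the perpendicular momentum components of the selected particles have already been loaded into NumPy arrays u_perp1 and u_perp2:

```python
import numpy as np

def perp_anisotropy(u_perp1, u_perp2, bins=256):
    """Density histogram and a simple agyrotropy diagnostic."""
    # 2D density plot of U_perp,1 vs. U_perp,2; for a gyrotropic
    # distribution this density is rotationally symmetric.
    hist, xedges, yedges = np.histogram2d(u_perp1, u_perp2, bins=bins)
    # Eigenvalues of the 2x2 covariance matrix are equal for a
    # perfectly gyrotropic distribution, so a ratio far from 1
    # flags a highly asymmetric (agyrotropic) distribution.
    eigvals = np.linalg.eigvalsh(np.cov(np.vstack([u_perp1, u_perp2])))
    return hist, eigvals.max() / eigvals.min()
```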
While these questions can be addressed to some extent for smaller-scale 2D and 3D simulations involving millions or billions of particles, it becomes challenging when the number of particles reaches hundreds of billions or trillions. Hampered by the lack of scalable tools, physicists have largely ignored the particle data, used some form of sub-sampling, or relied on coarser gridded data for their analysis. The two-trillion-particle study was the first to offer flexible technical capabilities for analyzing trillion-particle datasets.
19.3 I/O Challenges
The main challenge with the trillion-particle VPIC simulation is managing the sheer volume of data. The simulation writes a significant
amount of data at a user-prescribed interval. In the simulation of two trillion
particles (including one trillion ions and one trillion electrons), three different
datasets were stored: particle datasets, field and hydro datasets, and check-
point datasets. The particle dataset for a given timestep comprises about one
trillion electrons. The data size of each electron particle is 32 bytes, repre-
senting various particle properties. The number of particles increases as the
simulation progresses. In our simulation, the datasets written in each timestep
varied from 30 TB to 42 TB. The simulation wrote the particle
dataset at 10 timesteps and the total particle data size was about 335 TB. The
field dataset includes information such as electric and magnetic field strength,
and the particle dataset includes information about the particle's position,
momentum, and energy. The field dataset per timestep is relatively small, on
the order of tens of gigabytes. Overall, the total amount of data after finishing
the simulation was approximately 490 TB, including field data and checkpoint
data. Another challenge, discussed later in this chapter, is devising a scalable strategy for storing the terabytes of data produced by a simulation running on hundreds of thousands of cores.
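As a rough sanity check on these figures, here is a back-of-the-envelope sketch (using decimal terabytes) of the sizes quoted above:

```python
electrons_per_dump = 1e12        # one trillion electrons per particle dump
bytes_per_particle = 32          # per-particle record size quoted above
dump_tb = electrons_per_dump * bytes_per_particle / 1e12
print(f"nominal dump size: {dump_tb:.0f} TB")  # 32 TB, within the 30-42 TB range
# Particle counts grow as the run progresses, so ten dumps total
# roughly 335 TB rather than exactly 10 * 32 TB = 320 TB.
print(f"ten dumps: >= {10 * dump_tb:.0f} TB")
```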
19.4 Software and Hardware
19.4.1 Hardware Platform
Hopper is a Cray XE6 system located at NERSC, consisting of 6,384 compute nodes, each containing two 12-core 2.1-GHz AMD Magny-Cours processors.
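From these figures, a quick computation of Hopper's aggregate core count:

```python
nodes = 6384
cores_per_node = 2 * 12          # two 12-core Magny-Cours sockets per node
print(nodes * cores_per_node)    # 153,216 cores in total
```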
 