A Flexible Strategy for Distributed and Parallel
Execution of a Monolithic Large-Scale
Sequential Application
Felipe Navarro 1, Carlos Gonzalez 1, Oscar Peredo 1, Gerson Morales 1,
Alvaro Egana 1, and Julian M. Ortiz 1,2
1 ALGES Laboratory, Advanced Mining Technology Center (AMTC),
University of Chile, Chile
2 Department of Mining Engineering, University of Chile, Chile
Abstract. A wide range of scientific computing applications still rely on
algorithms implemented in large legacy codes or libraries that rarely
exploit multi-core architectures and are hardly ever distributed. In this
paper we propose a flexible strategy for the execution of such legacy
codes, identifying the main modules involved in the process. The key
technologies involved and a tentative implementation are described,
allowing the reader to understand the challenges and limitations that
surround this problem. Finally, a case study is presented for a
large-scale, single-threaded, stochastic geostatistical simulation in the
context of mining and geological modeling applications. A successful
execution, together with running time and speedup results, is shown on a
workstation cluster of up to eleven nodes.
Keywords: HPC, parallel computing, distributed system, workload mod-
eling, gslib.
1 Introduction
The development of scientific computing applications has benefited from new
hardware technologies and software frameworks, allowing new applications to
reach faster execution times while following better programming practices.
Despite these advances, many fields in science and engineering still use
algorithms and methods implemented in large monolithic applications, in the
sense that they have single-tiered, self-contained software designs, contrary
to the current trend toward modular and flexible designs. Of those monolithic
applications, only a portion were designed to efficiently use multi-core
architectures, and even fewer can be executed in distributed environments.
Nowadays, many monolithic sequential applications are still actively used,
taking several minutes, hours or days to compute.
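The execution pattern described in the abstract, running many independent
realizations of a single-threaded stochastic simulation concurrently, can be
sketched as follows. This is a minimal illustration, not the paper's actual
implementation: `run_realization` is a hypothetical stand-in for the legacy
sequential binary, which in a real deployment would be invoked through
`subprocess.run` on local cores or remote cluster nodes.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_realization(seed):
    # Hypothetical stand-in for one run of the legacy sequential
    # simulator: deterministic given its seed and independent of all
    # other runs. A real worker would launch the external executable,
    # passing the seed through its parameter file.
    rng = random.Random(seed)
    return seed, sum(rng.random() for _ in range(1000))

def run_all(seeds, max_workers=4):
    # Each seed yields an independent stochastic realization, so the
    # runs can be dispatched concurrently without modifying the legacy
    # code itself. Threads suffice here because a real worker would
    # mostly wait on an external process.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(run_realization, seeds))
```

Because the realizations share no state, the same dispatch logic scales from
a multi-core workstation to a cluster simply by swapping the executor for a
distributed one.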
Many scientists are not parallel computing users and, in some cases, have
only basic programming skills. Most of the time they use large legacy codes
or libraries designed for single-core workstations, mostly because their
research priority is