Keywords: Simulation · Real-time visualization · Game engine architecture · Task distribution
2.1 Introduction
Increasing the level of realism in virtual simulations depends not only on enhancing modeling and rendering effects, but also on improving other aspects such as animation, character artificial intelligence, and physics simulation.
Real-time systems are defined as systems that must execute their tasks under time constraints: if the system is unable to complete its work within a given time threshold, it fails. To meet such constraints, the main loop has to be carefully implemented; the main loop is the central design pattern of this kind of application.
Real-time simulators are applications that draw on knowledge from many different fields, such as computer graphics, artificial intelligence, physics, and computer networks. While these are typical requirements found in games, simulations usually require these features with much greater accuracy. Moreover, computer simulators are also interactive applications that exhibit three general classes of tasks: data acquisition, data processing, and data presentation. Data acquisition gathers data from input devices such as keyboards, mice, and dedicated interfaces, depending on the simulator. Data processing tasks consist of applying logic rules, responding to user commands, simulating physics, and evaluating artificial intelligence behaviors. Data presentation tasks provide feedback to the user about the current simulation state, usually through images and audio. Many simulators run in multiuser environments, which requires distribution and logical partitioning of the scene [1].
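As an illustration, the following minimal sketch shows how a coupled main loop can organize the three task classes. The function names and stub bodies are assumptions made for this sketch and do not belong to any particular engine:

#include <chrono>

// Illustrative stubs for the three task classes; the names are assumptions
// made for this sketch, not part of the architecture described in the text.
void acquireInput() { /* data acquisition: keyboard, mouse, dedicated devices */ }
void processSimulation(double dt) { (void)dt; /* data processing: logic, physics, AI */ }
void presentFrame() { /* data presentation: images and audio */ }

int main() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    bool running = true;

    // Simple coupled main loop: each iteration runs the three task classes once.
    while (running) {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        acquireInput();         // gather data from input devices
        processSimulation(dt);  // advance logic, physics, and AI by the elapsed time
        presentFrame();         // give visual/audio feedback on the current state

        running = false;        // stub exit condition so the sketch terminates
    }
    return 0;
}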
Simulators are interactive real-time systems and have time constraints to execute all of their processes and present the results to the user. If the system is unable to do its work in real time, it loses its interactivity and consequently fails. A common parameter for measuring performance is frames per second (FPS). The generally accepted lower bound for a game is 16 FPS. There is no upper bound on FPS, but when the refresh rate of the video output (e.g., a computer monitor) is lower than the application's refresh rate, some generated frames will not be presented to the user (they are lost). One motivation for designing loop optimizations is to reach an optimal FPS rate for the application. By doing so, it becomes possible to spend more time on higher-precision physics calculations or on more complex logic behaviors.
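The following sketch illustrates this use of spare frame time, assuming a hypothetical 60 FPS target; the workload stubs and the fraction of the budget reserved for extra physics iterations are illustrative choices only:

#include <chrono>
#include <thread>

// Hypothetical workload stubs used only for this sketch.
void updateLogicAndRender() { /* mandatory per-frame work */ }
void physicsSubStep()       { /* one extra high-precision physics iteration */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(16667); // ~60 FPS target (assumption)

    for (int frame = 0; frame < 600; ++frame) {  // fixed frame count so the sketch terminates
        const auto frameStart = clock::now();

        updateLogicAndRender();

        // Spend part of the remaining budget on extra physics precision
        // instead of idling; the 3/4 threshold is an arbitrary safety reserve.
        while (clock::now() - frameStart < frameBudget * 3 / 4) {
            physicsSubStep();
        }

        // Sleep away the rest of the budget so frames are not produced
        // faster than the display can present them.
        std::this_thread::sleep_until(frameStart + frameBudget);
    }
    return 0;
}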
The architecture that we present in this paper follows a concept similar to cloud and distributed computing, in which machines across the Internet share resources, software, and information, and in which the user's computer can draw on other resources available on the network to help process the application. With this approach, a computer with less computing power can join a simulation session by delegating part of the processing effort to the network cloud.
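The sketch below conveys this general idea in schematic form. The task type, the even/odd offloading rule, and the use of std::async as a stand-in for a network round trip are assumptions for illustration only and do not describe the architecture detailed later:

#include <future>
#include <iostream>
#include <vector>

// Schematic sketch of offloading work units to another machine. In a real
// deployment the "remote" branch would serialize the task and send it over
// the network to a helper node in the simulation session.
struct WorkUnit { int id; };

int processLocally(WorkUnit w)  { return w.id * w.id; }  // local fallback
int processRemotely(WorkUnit w) { return w.id * w.id; }  // placeholder for a network call

int main() {
    std::vector<WorkUnit> tasks = {{1}, {2}, {3}, {4}, {5}, {6}};
    std::vector<std::future<int>> results;

    bool remoteAvailable = true;  // assume a helper node has joined the session
    for (const auto& t : tasks) {
        if (remoteAvailable && t.id % 2 == 0) {
            // Offload part of the workload; std::async stands in for the
            // serialization and network round trip to another machine.
            results.push_back(std::async(std::launch::async, processRemotely, t));
        } else {
            results.push_back(std::async(std::launch::deferred, processLocally, t));
        }
    }

    for (auto& r : results) std::cout << r.get() << ' ';
    std::cout << '\n';
    return 0;
}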