At the dawn of real-time multiplayer network programming, network engineers established peer-to-peer
connections among players, whereby all players (peers) are connected to each other and can send their actions to
each other directly. Although this reduces the overall network traffic, connecting all peers is problematic, owing to
one-way firewalls and routers, which have become commonplace among today's Internet users. As a result, there
will be a subset of peers between whom a connection cannot be made, and the server will have to route messages
for these peers, regardless. Thus, most modern games use a client-server model, whereby all traffic passes through
a central server. This server is typically a dedicated machine with low packet latency and high bandwidth. Because
data are flowing through the server, clients need to send their actions to only one destination, dramatically reducing
upload bandwidth, but at the expense of increased latency (see Figure 12-1).
Figure 12-1. Graphical model of peer-to-peer and client-server network topologies (courtesy of Wikimedia)
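The bandwidth trade-off between the two topologies follows directly from counting connections. A rough sketch, using illustrative function names, for a game of n players:

```typescript
// Number of distinct links each topology requires for n players.
// In peer-to-peer every pair of peers connects directly; with a
// central server each client needs only one connection.
function peerToPeerLinks(n: number): number {
  return (n * (n - 1)) / 2; // one link per pair of peers
}

function clientServerLinks(n: number): number {
  return n; // one link per client, all to the server
}

// For an 8-player game:
console.log(peerToPeerLinks(8));   // 28 direct connections
console.log(clientServerLinks(8)); // 8 connections
```

Each peer in the first topology also uploads every action n - 1 times, once per remote peer, whereas a client in the second uploads it once, to the server; this is the upload-bandwidth saving described above.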
Latency is the measure of time delay experienced by a system. Typically, two types of latency concern network programmers.
The first is input latency, the time between when the user requests an action (e.g., by pressing a button) and
when that action appears to take place. In games such as WarCraft III , an early real-time strategy (RTS) game
designed when many of the players were on high-latency Internet connections, the game plays an acknowledgment
sound immediately after the user takes an action. This audio cue indicates to the user that his or her action has been
received, even though the input has not taken effect. This trick and other visual and audio illusions can simulate lower
levels of input latency, without making changes to the game engine.
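The acknowledgment trick amounts to giving feedback before the network round trip completes. A minimal sketch, in which the audio and networking calls are hypothetical stand-ins for a game's real subsystems, and the log array exists only to make the ordering visible:

```typescript
// The ordering is the point: the audio cue fires immediately,
// before the action has traveled to the server and back.
type Action = { kind: string };

const log: string[] = [];

function playAckSound(action: Action): void {
  log.push(`ack:${action.kind}`); // in a real game: play a short audio cue
}

function sendToServer(action: Action): void {
  log.push(`sent:${action.kind}`); // in a real game: write to the socket
}

function onPlayerInput(action: Action): void {
  playAckSound(action); // immediate feedback: the input was registered...
  sendToServer(action); // ...even though its effect waits on the server
}

onPlayerInput({ kind: "move" });
console.log(log.join(",")); // ack:move,sent:move
```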
The second form of latency is state latency (also called simply latency), which measures the time between when a
local action is taken and when that action is received by all the remote clients. This is the true measure of latency in a
system, and there are few ways to reduce it. However, it is possible to hide state latency through client-side prediction
(for more information, see the section “Client-Side Prediction”).
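Client-side prediction can be sketched for a one-dimensional position as follows. This assumes the server echoes back the sequence number of the last input it processed; all class and field names here are illustrative, not from any particular engine:

```typescript
// Predict locally, remember unconfirmed inputs, and when an
// authoritative snapshot arrives, rewind to it and replay the
// inputs the server has not yet processed.
type Input = { seq: number; dx: number };

class PredictedClient {
  position = 0;                  // predicted (displayed) position
  private pending: Input[] = []; // inputs not yet confirmed by the server

  applyInput(input: Input): void {
    this.position += input.dx; // predict immediately, no waiting
    this.pending.push(input);  // remember it for later reconciliation
  }

  // Called when an authoritative snapshot arrives from the server.
  reconcile(serverPosition: number, lastProcessedSeq: number): void {
    this.position = serverPosition; // rewind to the server's truth...
    this.pending = this.pending.filter(i => i.seq > lastProcessedSeq);
    for (const input of this.pending) {
      this.position += input.dx;    // ...then replay unconfirmed inputs
    }
  }
}

const client = new PredictedClient();
client.applyInput({ seq: 1, dx: 5 });
client.applyInput({ seq: 2, dx: 3 });
console.log(client.position); // 8, shown before any server reply
client.reconcile(5, 1);       // server has applied input 1 only
console.log(client.position); // 8 again: rewound to 5, replayed +3
```

Because the replayed inputs produce the same result the server will eventually compute, the player sees their own actions take effect instantly while the state latency is hidden.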
Synchronization is the most challenging problem facing network programmers. When a new client joins a game in
progress, or a new game begins, the server must perform an initial sync, whereby the complete state of the game,
including any custom assets or server-specific settings, is sent to the new client. After the initial sync, the server can
route only client actions and assume that all clients executing the same actions will maintain exactly the same game
state. This is known as the lockstep method. The server can also continue to send the complete game state at regular
intervals, along with all client actions. This is known as the state broadcast method. When two clients are playing on the
same server but contain different game states because of a problem in the network code, they are said to be out of sync.
Players on two out-of-sync clients may each perceive that they are winning the game, when in fact each opponent is
moving based on its own divergent game state. When two clients experience a different game because of a
permanent out-of-sync condition, this is called a desync and results in frustrated players and a bad player experience.
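The lockstep assumption, that identical ordered actions yield identical states, can be checked by comparing state checksums across clients. A toy sketch, with a deliberately simple game state and hash (real games hash the full serialized state):

```typescript
// Both clients apply the same ordered action list to the same
// initial state; matching checksums mean they are in sync, and a
// mismatch would reveal a desync.
type GameState = { gold: number; units: number };
type GameAction = { kind: "mine" } | { kind: "train" };

function applyAction(state: GameState, action: GameAction): GameState {
  switch (action.kind) {
    case "mine":  return { ...state, gold: state.gold + 10 };
    case "train": return { gold: state.gold - 50, units: state.units + 1 };
  }
}

// A toy checksum over the two state fields.
function checksum(state: GameState): number {
  return state.gold * 31 + state.units;
}

// The server routes the same ordered actions to every client.
const actions: GameAction[] = [
  { kind: "mine" }, { kind: "mine" }, { kind: "train" },
];

const initial: GameState = { gold: 100, units: 0 };
const clientA = actions.reduce(applyAction, initial);
const clientB = actions.reduce(applyAction, initial);

// In lockstep, identical inputs must yield identical states.
console.log(checksum(clientA) === checksum(clientB)); // true
```

If any client applied an action out of order, or used non-deterministic logic, the checksums would diverge; many lockstep games exchange such checksums periodically precisely to detect a desync early rather than let players drift into different games.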