4. Then we enter an infinite loop where we call the Env_Sleep() function, whose source code is explained below, to avoid using 100 percent of the CPU:
FPendingExit = false;
while ( !IsPendingExit() ) { Env_Sleep( 100 ); }
In this example, we used a fixed value of 100 milliseconds to put the thread into sleep mode. When processing audio, it is useful to calculate sleep delays based on the buffer size and sampling rate. For example, a buffer of 65535 bytes that contains 16-bit mono samples at a sampling rate of 44100 Hz gives us approximately 65535 / (44100 × 16 / 8) ≈ 0.7 seconds of audio playback. Stereo playback cuts this time in half.
5. Finally, we release the OpenAL objects:
alcDestroyContext( FContext );
alcCloseDevice( FDevice );
UnloadAL();
}
6. The rest of the declaration simply contains all the required fields and the initialization flag:
bool FInitialized;
private:
ALCdevice* FDevice;
ALCcontext* FContext;
};
7. The Env_Sleep() function used in the code simply makes the thread inactive for a given number of milliseconds. It is implemented using the Sleep() system call on Windows and the usleep() function on Android:
void Env_Sleep( int Milliseconds )
{
#if defined _WIN32
	Sleep( Milliseconds );
#else
	// usleep() expects microseconds, so convert from milliseconds
	usleep( static_cast<useconds_t>( Milliseconds ) * 1000 );
#endif
}
8. Playing .wav files is not enough for us, since we want to support different audio formats. So, we have to split the audio playback and the actual decoding of file formats into two separate entities. We are ready to introduce the iWaveDataProvider class, whose subclasses serve as data sources for our audio playback classes: