Now we can create the audio source:
clPtr<AudioSource> Src = new AudioSource();
Finally, we need to create a WavProvider object, which decodes audio files, attach it to the
Src source, start playback, and wait for its completion:
clPtr<Blob> Data = LoadFileAsBlob("test.wav");
Src->BindWaveform( new WavProvider( Data ) );
Src->Play();
while ( Src->IsPlaying() ) {}
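The empty loop above works, but it keeps one CPU core fully busy while waiting. If such a blocking wait is really needed, a short sleep inside the loop keeps the processor mostly idle. The following sketch is not part of the book's sources; the WaitForCompletion() name is made up here, and it assumes a C++11 compiler for std::this_thread:
#include <chrono>
#include <thread>

// Block until the source finishes playing, yielding the CPU between polls.
void WaitForCompletion( const clPtr<AudioSource>& Src )
{
    while ( Src->IsPlaying() )
    {
        // 10 ms is an arbitrary polling interval.
        std::this_thread::sleep_for( std::chrono::milliseconds( 10 ) );
    }
}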
After the sound playback is finished, we reset the Src pointer to NULL and send the
termination signal to the g_Audio thread:
Src = NULL;
g_Audio.Exit(true);
To obtain the Data object, we have to implement the following function, which reads the file
contents into a memory block:
clPtr<Blob> LoadFileAsBlob( const std::string& FName )
{
    clPtr<iIStream> input = g_FS->CreateReader( FName );
    clPtr<Blob> Res = new Blob();

    // Copy the memory-mapped stream contents into the blob.
    Res->CopyMemoryBlock( input->MapStream(), input->GetSize() );

    return Res;
}
We use the globally initialized FileSystem instance, the g_FS object. Please note that
on the Android OS, we cannot use the standard file paths, and therefore we resort to our
virtual file system implementation.
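To illustrate why such a layer is needed: files packaged into the .apk are not visible through fopen() or std::ifstream and have to be read with the NDK asset API instead. The sketch below shows the underlying idea only; it is not how our FileSystem class is implemented, and the AAssetManager pointer is assumed to be obtained elsewhere (for example, from ANativeActivity):
#include <android/asset_manager.h>
#include <vector>

// Read an asset packed inside the .apk into memory using the NDK asset API.
std::vector<char> ReadAsset( AAssetManager* Mgr, const char* FileName )
{
    std::vector<char> Result;

    AAsset* Asset = AAssetManager_open( Mgr, FileName, AASSET_MODE_BUFFER );
    if ( !Asset ) { return Result; }

    off_t Size = AAsset_getLength( Asset );
    if ( Size > 0 )
    {
        Result.resize( (size_t)Size );
        AAsset_read( Asset, &Result[0], Result.size() );
    }

    AAsset_close( Asset );
    return Result;
}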
There's more…
We can implement a number of helper routines to ease the use of the AudioSource class.
The first useful routine is source pausing. OpenAL provides the alSourcePause() function,
but calling it alone is not enough, since we also have to take control of the buffers queued
on the source. This unqueuing is not important at this point, as we have only one buffer, but
when we get to streaming sound, we will have to take care of the buffer queue. The following
code should be added to the AudioSource class to implement pausing:
void Pause()
{
    alSourcePause( FSourceID );
    UnqueueAll();
}
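The Pause() method relies on the UnqueueAll() helper. One possible shape of it is sketched below; this is a sketch rather than the exact code from the book's sources. It asks OpenAL how many buffers are still attached to the source's queue and detaches them one by one:
void UnqueueAll()
{
    // Ask OpenAL how many buffers are currently queued on this source.
    ALint Queued = 0;
    alGetSourcei( FSourceID, AL_BUFFERS_QUEUED, &Queued );

    // Detach them one at a time.
    while ( Queued-- > 0 )
    {
        ALuint Buffer = 0;
        alSourceUnqueueBuffers( FSourceID, 1, &Buffer );
    }
}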
 