/**
 * Fired when the native engine initializes the graphics (video) surface.
 *
 * @param w width of the video surface, in pixels
 * @param h height of the video surface, in pixels
 */
private static void OnInitVideo(int w, int h) {
    if (listener != null)
        listener.OnInitVideo(w, h);
}
/**
 * Fired when the C library calls SysError().
 *
 * @param message the error message reported by the native layer
 */
private static void OnSysError(final String message) {
    Log.e(TAG, "Natives::OnSysError " + message);
    if (listener != null)
        listener.OnSysError(message);
}
}
To listen for events from the native engine, QuakeView implements Natives.EventListener
and calls
// Listen for JNI events
Natives.setListener(this);
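The listener contract itself is not shown in this excerpt. Here is a minimal sketch of what Natives.EventListener might look like, assuming only the two callbacks shown above; the book's actual interface may declare more methods:

public class Natives {
    public interface EventListener {
        void OnInitVideo(int w, int h);
        void OnSysError(String message);
    }

    // The native callbacks shown earlier dispatch to this listener.
    private static EventListener listener;

    public static void setListener(EventListener l) {
        listener = l;
    }
}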
An important step is to load the native library using the System class.
// Load native lib
System.loadLibrary("quake");
This loads the library libquake.so from the project's libs/armeabi folder. Loading the library takes care of the renderer, but you must still handle audio, keyboard, and touch events.
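Because the load can fail at runtime (for example, if the .so is missing or was built for a different ABI), it is common to perform it in a static initializer and catch UnsatisfiedLinkError. A minimal sketch; the NativeLoader class name here is illustrative, not part of the book's code:

import android.util.Log;

public final class NativeLoader {
    private static final String TAG = "NativeLoader";

    static {
        try {
            // Loads libquake.so from the APK's native library directory.
            System.loadLibrary("quake");
        } catch (UnsatisfiedLinkError e) {
            // Thrown when the library is missing or built for the wrong ABI.
            Log.e(TAG, "Unable to load libquake.so", e);
        }
    }

    private NativeLoader() {
    }
}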
Handling Audio Independently of the Format
One of the most frustrating aspects of working with native code on Android is audio handling. There are few options outside the Java realm, which makes native audio development tough. In the early days, Google used the obscure Enhanced Audio System (EAS) API to provide audio; I had never heard of it, nor seen any game engine use it. Progress has since been made, and open APIs such as the Open Audio Library (OpenAL) are now supported. OpenAL is used by modern engines, but unfortunately Quake does not use it.
Luckily, JNI provides a neat feature for accessing the memory address of a Java ByteBuffer from C (via GetDirectBufferAddress). This lets the native code simply write audio bytes to that address, and the Java code then plays them using the Android AudioTrack API.
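Here is a minimal Java-side sketch of this technique. The native method name paintAudio, the sample rate, and the buffer size are illustrative assumptions, not the book's exact code; the C side would obtain the buffer's address with GetDirectBufferAddress and fill it with PCM samples:

import java.nio.ByteBuffer;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class NativeAudio {
    // Hypothetical native hook: C writes PCM bytes into the direct
    // buffer's memory and returns how many bytes it wrote.
    private static native int paintAudio(ByteBuffer buf);

    private static final int SAMPLE_RATE = 22050; // assumed engine rate
    private static final int BUF_SIZE = 4096;     // assumed chunk size

    public static void run() {
        // A direct buffer lives outside the Java heap, so native code
        // can write straight to its address without copying on the C side.
        ByteBuffer audioBuffer = ByteBuffer.allocateDirect(BUF_SIZE);
        byte[] chunk = new byte[BUF_SIZE];

        int minSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                Math.max(BUF_SIZE, minSize), AudioTrack.MODE_STREAM);
        track.play();

        while (true) { // a real loop would check a stop flag
            int written = paintAudio(audioBuffer); // C fills the buffer
            audioBuffer.position(0);
            audioBuffer.get(chunk, 0, written); // copy into a Java array
            track.write(chunk, 0, written);     // blocks until consumed
        }
    }
}

Note that in MODE_STREAM, AudioTrack.write() blocks when its internal buffer is full, which naturally paces the native mixer without any extra synchronization.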
 