public void onSensorChanged(SensorEvent event) {
    ...
    mOverlay.update(CalculationsHandler.doCalc(
        CalculationsHandler.calculateDeviceOrientation(average)));
    ...
}
Since these updates arrive very rapidly and the device's CPU would not be able to
keep up with all of the calculations, the handler computes the average of a number
of updates and passes that value on to the orientation calculation. The number of
updates averaged may be tuned to the device's capabilities. Once both the magnetic
field and accelerometer values have been passed to the CalculationsHandler class,
it does the following.
...
if (SensorManager.getRotationMatrix(
        inR, null, accelerometerData, magneticFieldData)) {
    if (SensorManager.remapCoordinateSystem(
            inR, SensorManager.AXIS_X,
            SensorManager.AXIS_Z, outR)) {
        SensorManager.getOrientation(outR, orientation);
    }
}
...
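The averaging step described above can be sketched as a plain helper class. This is a minimal illustration, not the original code: the class name `SensorAverager`, the sliding-window approach, and the window size are all assumptions.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical helper: averages the last N sensor readings so that the
// orientation calculation runs on a smoothed value rather than on every
// raw SensorEvent update.
public class SensorAverager {
    private final int windowSize;
    private final Deque<float[]> window = new ArrayDeque<>();

    public SensorAverager(int windowSize) {
        this.windowSize = windowSize;
    }

    // Adds a reading and returns the component-wise average
    // of the readings currently in the window.
    public float[] add(float[] values) {
        window.addLast(values.clone());
        if (window.size() > windowSize) {
            window.removeFirst();
        }
        float[] avg = new float[values.length];
        for (float[] v : window) {
            for (int i = 0; i < avg.length; i++) {
                avg[i] += v[i];
            }
        }
        for (int i = 0; i < avg.length; i++) {
            avg[i] /= window.size();
        }
        return avg;
    }
}
```

Inside `onSensorChanged`, such a helper would be fed each event's `values` array, and its result would play the role of the `average` variable shown above.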
7.7 Pointing in the Right Direction?
This section deals with the problem of determining whether the camera phone is
pointing in the right direction. Essentially, we have two location tuples and need to
find out whether one is in the line of sight of the other. We also have the device's
orientation (the azimuth value), expressed as a bearing ranging from 0° (N) through
90° (E), 180° (S), and 270° (W) back to north.
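One way to make this concrete is to compute the initial great-circle bearing from the device's location to the target and compare it with the azimuth. The sketch below is an assumption about how this check might look; the class and method names and the tolerance value are illustrative, not taken from the original implementation.

```java
// Hypothetical sketch: decide whether the device's azimuth points
// (within a tolerance) at a target location, using the initial
// great-circle bearing between the two coordinate tuples.
public class BearingCheck {
    // Initial bearing in degrees (0 = N, 90 = E) from point 1 to point 2.
    public static double bearingTo(double lat1, double lon1,
                                   double lat2, double lon2) {
        double phi1 = Math.toRadians(lat1);
        double phi2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
        double bearing = Math.toDegrees(Math.atan2(y, x));
        return (bearing + 360.0) % 360.0;   // normalise to [0, 360)
    }

    // True when the azimuth lies within toleranceDeg of the bearing,
    // taking the wrap-around at 0/360 degrees into account.
    public static boolean isPointingAt(double azimuth, double bearing,
                                       double toleranceDeg) {
        double diff = Math.abs(azimuth - bearing) % 360.0;
        if (diff > 180.0) {
            diff = 360.0 - diff;
        }
        return diff <= toleranceDeg;
    }
}
```

The wrap-around handling matters: an azimuth of 359° and a bearing of 1° differ by only 2°, so a naive absolute difference would wrongly reject targets just east of due north.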
7.8 The End Result
Lastly, we look at a scenario in which all of the components work together.
Figure 12 shows an example of the camera phone pointing at virtual objects; the
same building is displayed from two different angles.