There are other differences between these libraries as well. JMF, for example, is a powerful but complex tool designed to process virtually any sort of media. Audio visualizations have surely been built with JMF, but Java Sound has a simpler, more modern API, which makes for clearer example code. The AudioClip class is part of the Applet API; it provides only the most basic playback functionality, so it is not suitable for our purposes.
To use the Java Sound API, we have to do several things in our code: prepare the audio file for playback, buffer the song, create a thread that reads and writes the audio data, and write code that analyzes the audio data as it plays.
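As a rough sketch of those steps, the following uses the standard javax.sound.sampled classes to open an audio stream, write it to a playback line, and hand each buffer of raw bytes to an analysis routine. The class and method names here are illustrative, not the book's actual code, and the analysis shown (peak detection over 16-bit little-endian samples) is just a placeholder for whatever the visualization needs.

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
import java.io.File;

// Illustrative sketch of the four steps; not the book's implementation.
class AudioPlayback {

    static void play(File songFile) throws Exception {
        // 1. Prepare the audio file for playback.
        AudioInputStream in = AudioSystem.getAudioInputStream(songFile);
        AudioFormat format = in.getFormat();

        // 2. Open a data line buffered for the song's format.
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(
                new DataLine.Info(SourceDataLine.class, format));
        line.open(format);
        line.start();

        // 3. Read/write loop; in the real application this runs on
        //    its own thread so it does not block the UI.
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            analyze(buffer, read);      // 4. expose raw bytes to analysis
            line.write(buffer, 0, read);
        }
        line.drain();
        line.close();
        in.close();
    }

    // Placeholder analysis: peak of 16-bit little-endian samples.
    static int analyze(byte[] buf, int len) {
        int peak = 0;
        for (int i = 0; i + 1 < len; i += 2) {
            int sample = (buf[i + 1] << 8) | (buf[i] & 0xFF);
            peak = Math.max(peak, Math.abs(sample));
        }
        return peak;
    }
}
```

Note that SourceDataLine.write blocks when its internal buffer is full, which is what paces the loop to the speed of playback.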
Figure 9-2 is a graphical representation of the classes and threads required to sample the audio as it plays and to expose the audio stream to JavaFX. As we can see, three threads are involved in making this work, but only the Audio thread and the Accumulate thread are defined by our code. The JavaFX rendering thread, which is responsible for drawing the scene, is created implicitly whenever a JavaFX application starts.
Figure 9-2. Interaction between classes
The Audio thread reads from the audio source and uses Java Sound to play it through the speakers. The Accumulate thread samples the sound data as it is being played and simplifies it so that it is more useful to our application. The data must be simplified because it is hard to create an interesting visualization from what is effectively a stream of random bytes. The Accumulate thread informs the JavaFX thread that the data has changed through the Observable/Observer pattern. Lastly, the scene is updated based on the simplified audio data. The following sections explain how this is implemented in code.
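The notification step can be sketched with java.util.Observable, which is one common way to implement the Observable/Observer pattern in Java. The class and field names below are illustrative assumptions, not the book's actual code; in the real application the observer runs on the Accumulate thread, so it would hand the value to the rendering thread (for example via Platform.runLater in JavaFX) rather than touch the scene directly.

```java
import java.util.Observable;

// Illustrative model published by the Accumulate thread.
// Names are assumptions for this sketch, not the book's code.
class SongModel extends Observable {
    private float level; // simplified audio level, e.g. 0.0 to 1.0

    // Called by the Accumulate thread after each batch of samples
    // has been reduced to a single, easy-to-visualize value.
    void publish(float newLevel) {
        level = newLevel;
        setChanged();           // mark this Observable as dirty
        notifyObservers(level); // push the new value to all observers
    }

    float getLevel() {
        return level;
    }
}
```

A listener registers with addObserver and receives each published value; the JavaFX side would use that value to resize or recolor nodes in the scene.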