How it works...
The technical principles behind the phonemes are not that different from animating other parts of the character. We create AnimChannels, each of which handles a different set of bones. The first tricky bit is organizing the channels so that different parts of the body can be controlled at the same time.
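As a rough illustration, a channel can be thought of as a named list of bones plus a blend weight; the AnimChannel layout below is only a minimal sketch, and the BoneId and blendWeight names are assumptions rather than the engine's actual API.

// Hypothetical sketch of an animation channel that owns a subset of bones.
#include <string>
#include <vector>

using BoneId = int;

struct AnimChannel
{
    std::string name;             // e.g. "Face", "UpperBody", "Legs"
    std::vector<BoneId> bones;    // bones this channel is allowed to drive
    float blendWeight = 1.0f;     // how strongly this channel overrides others

    bool Controls(BoneId bone) const
    {
        for (BoneId b : bones)
            if (b == bone)
                return true;
        return false;
    }
};

// A face channel and a body channel can play different clips at the same time
// because they never write to the same bones.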
The pipeline for applying the phonemes can also be tricky. The first decision is not to set them directly in code. Changing the character's expression straight from code on certain events is plausible, but doing so for every phoneme in a sentence would be very cumbersome. Using the cinematics system is a good start, as it is relatively simple to write a piece of code that parses a text file and creates a cinematic sequence from it. Timing is crucial, and it can take a lot of time to get the mouth movements synced with the sound, so working in a format that allows quick iteration is important.
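Assuming a plain text format with one "time phoneme" pair per line, a parser for such a file could look like the following sketch; the file format and the PhonemeKey struct are illustrative assumptions, not the engine's actual data types.

// Minimal sketch of parsing a "time phoneme" text file into keyframes that a
// cinematic sequence could then play back.
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct PhonemeKey
{
    float time;           // seconds from the start of the spoken line
    std::string phoneme;  // e.g. "AA", "EH", "M"
};

std::vector<PhonemeKey> LoadPhonemeTrack(const std::string& path)
{
    std::vector<PhonemeKey> track;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line))
    {
        std::istringstream in(line);
        PhonemeKey key;
        if (in >> key.time >> key.phoneme)   // skip malformed lines
            track.push_back(key);
    }
    return track;
}

Because the data lives in a plain text file, re-syncing the mouth to the recorded audio is just a matter of tweaking timestamps and reloading, which keeps iteration quick.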
Another, more complex approach is to build a database that maps words to phonemes and automatically applies them in sequence.
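A very rough sketch of such a database is shown below; the tiny hard-coded dictionary and the function name are assumptions made for illustration, and a real pronunciation dictionary would be far larger.

// Word-to-phoneme lookup sketch: split a sentence into words and concatenate
// each word's phoneme sequence.
#include <cctype>
#include <string>
#include <unordered_map>
#include <vector>

using PhonemeSequence = std::vector<std::string>;

std::unordered_map<std::string, PhonemeSequence> g_dictionary = {
    { "hello", { "HH", "AH", "L", "OW" } },
    { "world", { "W", "ER", "L", "D" } },
};

PhonemeSequence SentenceToPhonemes(const std::string& sentence)
{
    PhonemeSequence result;
    std::string word;
    for (char c : sentence + ' ')   // trailing space flushes the last word
    {
        if (c == ' ')
        {
            auto it = g_dictionary.find(word);
            if (it != g_dictionary.end())
                result.insert(result.end(), it->second.begin(), it->second.end());
            word.clear();
        }
        else
        {
            word += static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        }
    }
    return result;
}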
The simplest approach of all is to not really care about lip syncing and just play a moving mouth animation whenever the character speaks.
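That fallback can be as little as looping a generic mouth-flap clip while a voice line is audible; the Character interface and its members in this sketch are hypothetical and only stand in for whatever the engine actually provides.

// "Talking head" fallback: loop a generic mouth clip while the voice plays.
#include <iostream>
#include <string>

// Hypothetical character interface; the real engine API will differ.
struct Character
{
    bool voicePlaying = false;  // set by the audio system while a line plays

    void PlayLoopingAnim(const std::string& anim) { std::cout << "loop " << anim << '\n'; }
    void StopAnim(const std::string& anim)        { std::cout << "stop " << anim << '\n'; }
};

void UpdateMouth(Character& character)
{
    if (character.voicePlaying)
        character.PlayLoopingAnim("talk_generic");
    else
        character.StopAnim("talk_generic");
}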