So I've been experimenting with a texture atlas XML and a sprite, and I think I can feasibly build an XML file that maps a character's mouth movement in sync with audio. I acknowledge this will be tedious, but I have a plan to build a C# app that will make the process a lot easier.

If you've already tried this, how did it go? What were the biggest problems? Do you have any examples you can show? That last question is perhaps the most important. Are there any examples online? I've already seen the following posts, but they don't really seem to reach much of a resolution. Am I blazing a new trail here?

http://www.html5gamedevs.com/topic/6547-dealing-with-audio-latency-on-some-platforms/
http://www.html5gamedevs.com/topic/12082-sync-animations-to-sound/
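To make the idea concrete, here's a minimal sketch of the lookup that the atlas XML data would drive. The keyframe times and frame names (`mouth_closed`, etc.) are made-up placeholders, and the Phaser wiring at the bottom is just one possible way to hook it up, not tested code:

```javascript
// Hypothetical mouth track: each entry says which atlas frame to show
// starting at a given time (seconds) into the audio clip. In practice this
// would be generated from the XML by the C# tool.
var mouthTrack = [
  { t: 0.00, frame: 'mouth_closed' },
  { t: 0.12, frame: 'mouth_open' },
  { t: 0.30, frame: 'mouth_wide' },
  { t: 0.45, frame: 'mouth_closed' }
];

// Binary search: return the frame active at time t (entries sorted by t).
function frameAt(track, t) {
  var lo = 0, hi = track.length - 1, best = track[0].frame;
  while (lo <= hi) {
    var mid = (lo + hi) >> 1;
    if (track[mid].t <= t) {
      best = track[mid].frame; // this keyframe has started; remember it
      lo = mid + 1;            // a later one may also have started
    } else {
      hi = mid - 1;
    }
  }
  return best;
}

// In a Phaser update loop, something like (sketch, names assumed):
//   mouthSprite.frameName = frameAt(mouthTrack, sound.currentTime / 1000);
// Driving the frame from the sound's current position, rather than from
// elapsed game time, should keep the mouth from drifting if audio stutters.
```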
Hello. I'm a complete newb to Phaser; in fact, this is my first post. I have lots of experience working with animation on various platforms, but never in Phaser.

My goal: I want to make a character and animate its mouth in sync with a long audio clip. Basically, this character will be talking for 1 to 2 minutes at a time. The character will move and do other animations as well, but the most important thing, and probably the most difficult, will be animating the mouth for the whole sequence.

My question: what Phaser tutorials would you suggest to get me pointed in the right direction? I couldn't find anything that exactly matches what I need, but I'm sure I can figure it out by working through others.