Snouto

Animation and Spatial Audio


Does anyone know if there's a reason why spatial audio doesn't work when animating a camera towards a mesh that has an audio source either attached or placed in the same position?

My project has an ArcRotateCamera parented to a mesh, which is then animated around a scene. I've set up end points that the camera's parent animates to, the idea being that as soon as it approaches the centre of the target position, the spatial audio will kick in. In practice, however, this never seems to happen; the only way I can get it to work at all is by manually zooming or moving the arc camera past these audio zones.

Any thoughts anyone?


As far as I remember, it's a limitation of my code not handling the ArcRotateCamera direction update. I had really only thought about it being used in an FPS way, thus with a FreeCamera and all its derivatives (WebVR and so on). To make it work, I would need to update this part: https://github.com/BabylonJS/Babylon.js/blob/master/src/babylon.scene.ts#L4075

6 hours ago, davrous said:


Thanks guys. So I've made the following change in my Babylon.max.js code (from line 4066 in your source code linked above):

var matrix = listeningCamera.getWorldMatrix();
// transform the camera's local position by its world matrix to get a world-space position
var world_position = BABYLON.Vector3.TransformCoordinates(listeningCamera.position, matrix);
audioEngine.audioContext.listener.setPosition(world_position.x, world_position.y, world_position.z);

and it seems to work fine, but can you see any issues with this patch? I'm wondering what the cost of calculating the world matrix and then transforming the camera's position by it will be on the rendering pipeline. I need my scene to run smoothly on all devices, including mobile, so if this is likely to cause a performance hit I'll have to think of a different approach.
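On the performance side, one common mitigation (a sketch only, with made-up names, not anything Babylon does today) is to cache the last position pushed to the listener and skip redundant setPosition calls when the camera hasn't actually moved:

```javascript
// Sketch: call setPosition() only when the world position has actually changed.
// makeListenerUpdater and its epsilon threshold are illustrative, not Babylon API.
function makeListenerUpdater(audioListener, epsilon = 0.001) {
  let last = null; // last world position pushed to the listener
  return function updateListener(p) {
    const moved =
      last === null ||
      Math.abs(p.x - last.x) > epsilon ||
      Math.abs(p.y - last.y) > epsilon ||
      Math.abs(p.z - last.z) > epsilon;
    if (moved) {
      audioListener.setPosition(p.x, p.y, p.z);
      last = { x: p.x, y: p.y, z: p.z };
    }
    return moved; // true when an update was actually issued
  };
}
```

The matrix transform itself is a handful of multiplications per frame, so a dirty check like this mostly matters if the listener update turns out to be the expensive part on a given device.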


Snouto gets his "I Contributed to BJS Core" certificate and shoulder patch!  YAY!!!  Congrats, Snouto!  (also thx to core team who does the feature-add!)  YAY!  Party!!!

(there's not really a certificate or shoulder patch, but... let's pretend-up some BEAUTIFUL ONES just the same)  :)


By the way, maybe I'm missing something here, but as I mentioned in my first post, spatial audio appeared to work fine (at least in my limited tests) using the ArcRotateCamera, so long as one moves the camera manually. It's only when the camera is parented and the parent is animated that the spatial audio has issues. If the point is that this setup would work fine with any other camera, then fine; I just wanted to be absolutely clear here.

p.s. do you need my home address for the contributor patches @Wingnut mentioned? ;)


An additional question: 

When we create spatial audio, we define it like so:

new BABYLON.Sound("Music", "sounds/sound.wav", scene, null, { loop: true, autoplay: true, spatialSound: true, maxDistance: 1.5, volume: 0.3 });

The docs say the value of maxDistance (1.5 here) is also in scene units.

How does this distance value relate to, say, the diameter of a sphere? If the sphere has a diameter of 5, what would maxDistance need to be to ensure the audio starts to play the moment the edge of the sphere is reached?
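For reference, here is a sketch of the Web Audio "linear" distance model, which (assuming the docs are right that it's Babylon's default for spatial sounds) maps distance to gain: 1 at refDistance, falling to 0 at maxDistance. The default parameter values are illustrative:

```javascript
// Web Audio "linear" distance model: gain = 1 at refDistance, 0 at maxDistance.
function linearGain(distance, refDistance = 1, maxDistance = 1.5, rolloffFactor = 1) {
  // distance is clamped into [refDistance, maxDistance] before attenuating
  const d = Math.min(Math.max(distance, refDistance), maxDistance);
  return 1 - rolloffFactor * (d - refDistance) / (maxDistance - refDistance);
}

linearGain(1.5);  // 0   -> silent at exactly maxDistance
linearGain(1.25); // 0.5 -> halfway between refDistance and maxDistance
linearGain(1.0);  // 1   -> full volume at refDistance
```

Under this model, a sound at the centre of a sphere of diameter 5 with maxDistance set to the radius (2.5) would start to become audible right at the sphere's surface, growing louder as the listener approaches the centre.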


So by your example, maxDistance would equal the radius of the spheres in the scene, i.e. half the sphere diameter? If I had a cube with height, width and depth of 5, would maxDistance then be 2.5, assuming the sound is attached to the mesh? (Does attaching a sound to a mesh centre it in the mesh?)
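One wrinkle worth noting (plain geometry, not Babylon API): attenuation is measured as straight-line distance from the sound's position, so if the sound sits at the centre of a cube of side 5, its faces and corners are at different distances. A maxDistance of 2.5 defines an audible sphere that touches the faces but falls short of the corners:

```javascript
// Hypothetical helper: straight-line distance from the listener to the
// sound's position (the mesh centre, if attaching a sound centres it there).
function distanceFromCentre(listener, centre) {
  const dx = listener.x - centre.x;
  const dy = listener.y - centre.y;
  const dz = listener.z - centre.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

const centre = { x: 0, y: 0, z: 0 };      // cube of side 5 centred at origin
const faceMidpoint = { x: 2.5, y: 0, z: 0 };
const corner = { x: 2.5, y: 2.5, z: 2.5 };

distanceFromCentre(faceMidpoint, centre); // 2.5   -> audible right at the face
distanceFromCentre(corner, centre);       // ~4.33 -> still outside a 2.5 range at the corner
```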


Is there any way the arc camera spatial audio update could be added sooner rather than later? I've just noticed the 3.3 timeline runs until September 2018, which is well beyond my needs.

@Deltakosh, when you said earlier that support for the ArcRotateCamera was needed, what exactly is missing at the moment? As I mentioned in a subsequent comment, I can hear the audio kicking in while using my arc camera. I do have reservations about whether the audio is playing correctly: whether it starts when it should, whether the volume increases as the listener approaches the source, and whether the range of the audio is accurate. I'm sure I've heard other audio playing at the same time that shouldn't be within earshot, and I'm not at all convinced the sound starts, stops and adjusts its volume as it should. Are these the issues you're thinking of?


Cheers :)


Indeed, you are correct that that works well, but if you recall, the original purpose of this thread was to discuss spatial audio in the context of an arc camera attached to a mesh, with the mesh then animated. You seemed to acknowledge that using spatial audio in this way would require changes to the arc camera, and you added it to your 3.3 milestone.

I then made a change in the code on my side, but I'm not sure whether that is the change you've added to your 3.3 milestone or whether you intended to do something more, and that's really what I'm following up on here. As for my observations of my scene, it could simply be my crappy ears not hearing things properly, but I just wanted to be sure the custom implementation I have now is in fact what you intend to add later (i.e. that I'm not missing something).



On a separate note, but still related to spatial audio: I've just noticed that the audio analyser only works with spatial audio if the audio can actually be heard. For example, if the spatial audio is at a distance at which it cannot be heard, the soundtrack to which the sound is attached produces no data for the analyser to use. Is something going on under the hood with spatial audio, whereby audio.pause is perhaps called when out of range to save on resources?

On 12/04/2018 at 5:02 PM, Snouto said:


Any chance of a follow up on this stuff @davrous? Cheers :)


For the audio analyzer, I'm just binding to the Web Audio API. By default, I attach the audio analyzer to the output of the soundtrack's gain node, so if the sound can't be heard because of the distance, the analyzer will indeed show nothing. I assumed that would be the most common use case. To have the analyzer always display data, it would need to be plugged in before the panner node (which Web Audio uses to create the spatialization).

Why do you need such a feature?
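The routing described above can be modelled with a toy example (plain JS standing in for Web Audio nodes, not actual API code): an analyser tapped after the gain stage sees silence once distance attenuation drives the gain to 0, while one tapped before it would still see the signal.

```javascript
// Toy stand-in for an analyser: peak amplitude of a buffer of samples.
function analyse(samples) {
  return Math.max(...samples.map(Math.abs));
}

const source = [0.5, -0.8, 0.3];        // the raw audio signal
const gain = 0;                         // listener beyond maxDistance
const afterGain = source.map(s => s * gain);

analyse(source);     // 0.8 -> a tap before the gain still shows activity
analyse(afterGain);  // 0   -> a tap after the gain shows nothing
```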

10 hours ago, davrous said:


Why do you need such a feature?

Thanks for the information @davrous. My reasons for needing this behaviour are quite specific to my project and probably not needed elsewhere, but to answer your question: I wanted a visual representation in my scene of audio playing at a particular location. To do this I'm simply updating the Y scale of a mesh in time with one of the frequency channels, but since the analyser only works when the audio can be heard, from afar my mesh does nothing and the effect is lost. It's not vitally important, though.

I'm still interested in understanding what updates to the Arc camera you were planning to make for the spatial audio setup (added to the roadmap here: https://github.com/BabylonJS/Babylon.js/issues/4001). I feel like I might be missing an important update for spatial audio to work exactly as intended in my use case, but it's equally possible you had other updates in mind that won't make much difference to me. If they are significant and important updates, would it be possible to expedite and release that work sooner?

Cheers :)


Hi @Snouto, OK, I get it for the analyzer. I'll see how to update the audio engine to let people connect the analyzer directly to the audio source rather than after specific nodes like the panner.

For the Arc camera, it would really save us both time if you could create a Playground sample demonstrating what you're doing, the current audio behavior and what you think it should do instead. I'm currently busy with various conferences (such as //BUILD) and don't have a lot of bandwidth to look into this point.

Thanks! :)

