Search the Community

Showing results for tags 'audio'.




Found 185 results

  1. Hello guys, my name is JK. I'm a professional video game and theme music composer, and I've written music for video game developers for many years. Below are my areas of work: 1. A wide range of games: action, adventure, casual, puzzle, 8-bit/chiptune, arcade, sci-fi, fantasy. 2. Production & media: video channels/presentations, podcasts, cartoons, movies, anime, corporate, business, inspirational themes, advertising. If you're interested in hiring me, you can listen to my music and check out my gig on this site: https://www.fiverr.com/jaksanapong/create-original-music-loop-for-your-game-or-video If you have any questions or concerns, feel free to shoot me a message. Thank you, and I'm looking forward to working with you. JK
  2. Hi folks, I want to buy a mic for recording some sound FX and voices for our characters. Budget is under $300. What do you recommend? I know there are a bunch of videos on YouTube about this, but I need it specifically for game sound FX, so I think this is the right place to ask. Thanks!🎤
  3. Hunt prehistoric creatures with your friends online. Build your own base, craft tools and weapons, and survive in a large-scale, truly cross-platform open-world game. This game is an application of the engine I've built, to prove a statement: it is POSSIBLE to build a 3D version of the Internet, where instead of browsing through websites, we could jump from one 3D space to the next. I "invite" everyone to make this happen. I want you to build your own 3D spaces, implementing your own ideas of what the web should look like in the future. We could then link them all together and make this Interconnected Virtual Space happen - yeah, the Metaverse, for the Snow Crash fans out there. Tech details that I hope provoke further questions:

Loading Assets on Demand is even more important in the browser than on PC or console. Internet speed is only a fraction of the speed of a hard drive, so it is essential to load only what's visible if you want to provide an open-world environment for users visiting your world for the first time.

LOD - Level of Detail allows web browsers to show something immediately, just like an ordinary website. It may look poor at first, and users can see objects improving as they load; still, I think it's a good trade-off. Users get a good-enough view of their environment instantly and can start interacting with it immediately.

Terrain and the Grid System: I created the terrain in Blender, then split it up like a chessboard automatically using JavaScript. It is easy/cheap to determine which cell contains a given coordinate, and every single cell in the Grid has a reference to its neighbors. This is the basis of several layers of optimization when it comes to rendering, AI, collision detection, etc. A recursive search is very easy to do using those links to neighboring cells.
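The coordinate-to-cell lookup and neighbor links described above can be sketched roughly like this. The cell size, field names, and 8-neighbor linking are my own assumptions for illustration, not the engine's actual code:

```javascript
// Assumed cell size in world units (the post doesn't give the real value).
const CELL_SIZE = 10;

// Build a chessboard-style grid where every cell links to its neighbors,
// so a recursive/outward search never has to recompute indices.
function createGrid(cols, rows) {
  const cells = [];
  for (let y = 0; y < rows; y++) {
    for (let x = 0; x < cols; x++) {
      cells.push({ x, y, neighbors: [] });
    }
  }
  const at = (x, y) =>
    (x >= 0 && x < cols && y >= 0 && y < rows) ? cells[y * cols + x] : null;
  for (const c of cells) {
    for (let dy = -1; dy <= 1; dy++) {
      for (let dx = -1; dx <= 1; dx++) {
        if (dx === 0 && dy === 0) continue;
        const n = at(c.x + dx, c.y + dy);
        if (n) c.neighbors.push(n); // corner cells get 3, interior cells get 8
      }
    }
  }
  return { cells, at };
}

// Cheap containment test: a division and a floor, no searching.
function cellForPosition(grid, worldX, worldZ) {
  return grid.at(Math.floor(worldX / CELL_SIZE), Math.floor(worldZ / CELL_SIZE));
}
```

The constant-time lookup is what makes it cheap to run per-frame for hundreds of NPCs, and the stored neighbor references are what make the recursive searches mentioned above trivial.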
Lighting: I've implemented basic directional and ambient lighting to support day & night cycles, and point lighting for individual light sources like a campfire, a torch, etc. To my surprise, the difficult part was getting good-looking flames, thanks to the lack of alpha sorting in WebGL, which I had to implement in JavaScript instead.

Animation: I animate my models in Blender and export them to ".dae". The file format comes with a serious limitation: you can't define multiple animations, and it only maintains a single list of keyframes. So I maintain a JSON file per ".dae" to define multiple animations as sets of keyframes, e.g.: "{running: [0, 4], jump: [5, 7], ...}". But I kept it simple and didn't take it to the realm of animations per body part.

Physics: In short, I was stupid enough to write my own... on the other hand, I have a fine level of control over how much I allow it to run. Again, on mobile it is crucial to have that level of control to navigate 200+ creatures in real time. I have two different types of collision detection: collision with the terrain and collision with other model surfaces. Terrain collision is very cheap, which allows me to move so many NPCs at once. Collision with other models is heavier, but that is what allows me to climb random-looking rocks. I optimized it enough to make it feasible on mobile devices: I use a low-poly version of the models to determine collisions, and I only run it for models near the player, utilizing the Grid System I mentioned above.

AI: NPCs can navigate random terrain, avoid obstacles, and "see" and "hear" other NPCs, even behind them. At the moment they move rather robotically, but this allows me to calculate where they can move next without hitting any obstacles, and how long it will take to get there. I only run the AI right before they reach the target location. Basically, 200+ NPCs make only 40-100 AI calls per second.
...I certainly have room for improvement here.

Multithreading in the browser is difficult but necessary to achieve a good frame rate. Ideally, nothing but the rendering should be on the main thread - good luck achieving that, though. I've managed to push most of the heavy logic into a separate thread, but the AI is still running on the main one. In a thread you have very limited access to important functionalities of the browser, so there is only so much you can do. Also, only specific objects can be passed by reference between threads; everything else has to be serialized on one end and deserialized on the other, so you want to be careful how often you do that.

Audio: I use the Web Audio API, which works as expected. On top of that, I implemented audio sprites: I compile all related sounds into a single mp3 file, accompanied by a JSON object defining where certain audio snippets can be found. It's a fast, accurate, and reliable solution, unlike using the HTML audio tag - though that one has its own use cases as well, e.g. streaming an mp3 file comes for free, like streaming an internet radio station.

Multiplayer: I use WebRTC and not WebSockets - I know, I know, hear me out. The idea was that co-op is a very likely scenario, and players may prefer only the company of their friends. In that case, they don't have to purchase access to a private server, as long as they are happy to let their world go dormant between gaming sessions. Plus, all the logic is implemented for single-player mode on the client side, and that logic would have to be duplicated on the servers if I went down the WebSockets route - just think about where the AI logic should live, server-side or client-side. I expect this to be a controversial decision; sometimes even I'm not sure whether it was the "right" one. There is a whole lot more to this, though.
Resources could also be distributed between players when it comes to AI, to ease the load on the host - I know it is a potential security issue, but there is a use case for it, like AI for distant NPCs in co-op, as long as you have no hacker friends... and this could be crucial on mobile devices.

Controller Support: The Gamepad API provides raw access to every button and joystick, and you will certainly have to implement your own layer on top of that. Events for pressing/holding down buttons don't come out of the box, dead-zone handling for joysticks is missing, and it is inconsistent how you access different types of controllers through the API - even the same controller behaves differently on different devices. In the end, you will have to provide a controller-mapping implementation in your settings as well. But it's totally doable and works like a charm.

Touchscreen Support: It's a tricky one. As I mentioned, on iPhones it's completely useless until Apple decides to comply with web standards. Even on Android it is a bit tricky: for the UI you probably want to use HTML. That makes sense because it comes with all the neat features that make our lives easier - until you realize that you can't switch weapons while running. Wait, what? You see, while you are holding down the left side of the screen to maintain speed and try to click a button (or any HTML element) to switch weapons, the click event won't fire. No click event fires while multi-touching. As a result, HTML - and any fancy framework, for that matter - is no longer a good enough solution on its own.

UI: When it comes to games, we expect a whole lot more from the UI than on a website. Multi-touch support, as discussed above, and even controller support are not as straightforward as you might think. You have raw access to the controller, so when a button is selected and the user pushes the joystick diagonally upward, you have no idea what lies in that direction, and therefore what should be selected next.
You have to implement controller navigation of your UI yourself. And you need controller support, because that's the only way to move around in VR and on consoles. Speaking of VR: you want to render your UI twice in VR - once for the right eye and once for the left - and only once when not in VR; just something to keep in mind.

Also, rendering the UI can be heavy. This might be a surprise, but if you think about it, on a website you don't do anything but the UI, so you have a lot more leeway to work with, whereas in a game you don't want the UI to impact the frame rate. It has to be very lightweight, and you probably want scheduling to have the final say on what gets refreshed. Taking all this into account, I really don't see how any UI framework could be a good option - they were simply designed with different requirements in mind, and there is more downside to using any of them than upside.

Precomputed Scene Occlusion Culling using the Grid System: Most of the optimization happens in real time or is triggered on a regular basis while running the game, with one exception: I render every cell in the Grid System from the point of view of every single other cell. E.g.: cell A can see cells B and C but not D. I literally diff two images with JavaScript to determine whether a given cube can be seen or not, then record the results in a JSON file, which is used for rendering. This reduces the number of cubes to be rendered significantly, but it takes about 40 hours to run this optimization for the whole terrain.

Running the game on Mobile Devices: iPhone runs WebGL significantly better than top-end Android devices, but it is practically useless because Apple ignores important web standards. E.g.: pinch zoom can't be blocked, so when you use your left thumb to move around and your right thumb to look around, you instead end up zooming the screen back and forth. It also doesn't support fullscreen mode - video does, but not the canvas element.
Another interesting limitation on iPhone is that you can only have 25 elements in an array in GLSL, which translates to having only 25 bones in an animated model. This doesn't make animation impossible, but you can't use most animated models that you buy in the stores; you have to redo them with only 25 bones.

Profiling: "What gets measured gets managed." The built-in profiler in Chrome certainly has its use cases, but it's not good enough for what we want, so you will probably have to build your own at some point, with specific features. You want to know how long a certain section of your game runs per frame - e.g. rendering, AI, physics, etc. - and these sections probably won't run sequentially; they are interrupted by other processes that you don't want to include in a given measurement. One thing is for sure: you can't optimize without identifying the source of the lag - I've certainly wasted enough time trying.

Scheduling: As long as you are pushing the limits of the devices, it is always a battle for a smooth frame rate. Therefore, you have to implement a scheduling system to manage what is allowed to run and for how long. E.g.: whatever is loaded and processed in the background will affect the frame rate even if it is running on a different thread, so you want to throttle that. You don't want to set variables through the WebGL API unnecessarily. The AI always varies in how much calculation it has to do, depending on the number of unique encounters among 200+ NPCs in a random environment. Basically, you have to limit what runs and for how long, manage which calculations are nice-to-have and which are game-breaking, and try to make it all seamless for the user.
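The kind of sectioned profiler argued for in the Profiling paragraph can be sketched as follows. All names are invented, and the injectable clock is only there so the sketch can run outside a browser; a real one would default to performance.now():

```javascript
// Per-section frame profiler: accumulate time into named buckets so a
// section that runs in several interrupted slices per frame is still
// measured as one total, without counting the interruptions.
function createProfiler(nowFn) {
  const totals = {}; // accumulated ms per section, this frame
  const open = {};   // start timestamps of currently running sections
  return {
    begin(section) { open[section] = nowFn(); },
    end(section) {
      totals[section] = (totals[section] || 0) + (nowFn() - open[section]);
      delete open[section];
    },
    // Snapshot the frame's totals and reset for the next frame.
    frame() {
      const snapshot = { ...totals };
      for (const k in totals) delete totals[k];
      return snapshot;
    },
  };
}
```

Usage would be wrapping each subsystem call with begin('ai') / end('ai') and logging frame() once per render loop, so a slow frame can be attributed to a specific section rather than guessed at.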
Probably every single topic above deserves a dedicated post, so please feel free to ask anything - there are no stupid questions. I would then like to use those questions to write an in-depth post on each topic to help fellow devs overcome similar obstacles - and no doubt I will learn a thing or two in the process. A live tech demo is available at https://plainsofvr.com
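For what it's worth, the audio-sprite scheme from the Audio section of the post above can be sketched like this. The snippet names, timings, and function names are invented; only the pure lookup runs outside a browser, while playback relies on the standard Web Audio AudioBufferSourceNode offset/duration parameters:

```javascript
// Hypothetical JSON map shipped alongside the single compiled mp3:
// each entry says where a named snippet lives inside the file (seconds).
const spriteMap = {
  footstep: { start: 0.0, duration: 0.4 },
  roar:     { start: 0.5, duration: 2.1 },
};

// Pure lookup, so the mapping can be validated without an AudioContext.
function snippet(map, name) {
  const s = map[name];
  if (!s) throw new Error(`unknown audio sprite: ${name}`);
  return s;
}

// Browser-only playback: slice the shared decoded buffer.
// `ctx` is an AudioContext and `buffer` the decoded mp3 (AudioBuffer).
function playSprite(ctx, buffer, map, name) {
  const { start, duration } = snippet(map, name);
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.connect(ctx.destination);
  src.start(0, start, duration); // play only this snippet's slice
  return src;
}
```

Because every snippet comes from one decoded buffer, start times are sample-accurate, which is the reliability advantage over the HTML audio tag that the post describes.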
  4. My question is in reference to audio tied to a mesh, so that volume and panning are handled by the Babylon engine. This works great for wav and mp3 audio files, but lately I've wanted to use generators like tone.js to generate tones with JavaScript. It seems that Babylon audio requires a URL to insert audio into the audio chain - is there any way to use the Babylon audio engine with tones generated by tone.js and the Web Audio API, without rendering them to wav or mp3?
  5. Doug

    Hi Rich. @rgk mentioned that you might be able to add a "patron" badge to my forum profile, please? Thanks very much!

  6. isekream

    Audio Rhythm Syncing

    I'm trying to develop an audio-rhythm-based 2D HTML5 game and need some advice, specifically on the logic to sync the audio with the game's animation. I don't need a constant animated sequence, nor do I need to sync user input, but I do need the animation to play once per selected beat, on time/on beat, and to be able to increase the number of beats, eventually animating on almost all notes per bar for the duration of the audio. Now, I've researched various methods of compensating FPS for BPS when trying to sync audio with visual/input tasks, but I'm wondering if this is overkill for me. I have access to the audio's time-update method and the BPM of each track. I also have the time position of each beat (via http://sonicapi.com). Having all these factors, I'm asking the following questions: Should I sync the BPS with the FPS to get audio sync? Should I update animations based on the audio's current playhead? Should I try to correlate a current timestamp with the audio playhead position and use the audio beat time positions for accuracy? As a side note: has anyone ever tried to fiddle with detecting "voice notes" in a song? I'm looking at PulseJS and ClubberJS, but I'm not sure how to devise a detection formula.
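One hedged sketch of the playhead-driven option asked about above: derive the beat index directly from the audio's current time and BPM, so the animation fires once per beat and drift cannot accumulate the way frame counting would. The BPM value and function names are illustrative:

```javascript
// Beat index is a pure function of the audio playhead (seconds) and BPM,
// so it can never drift away from the audio, unlike a frame counter.
function beatIndex(currentTime, bpm) {
  return Math.floor(currentTime * bpm / 60);
}

// In the render loop, trigger the animation only when the index advances.
// `onBeat` would start the one-shot animation for that beat.
function makeBeatTrigger(bpm, onBeat) {
  let last = -1;
  return function update(currentTime) {
    const idx = beatIndex(currentTime, bpm);
    if (idx !== last) {
      last = idx;
      onBeat(idx);
    }
  };
}
```

If per-beat timestamps from sonicapi.com are available, the same trigger shape works with a sorted array of beat times instead of the BPM formula, which also handles tempo changes.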
  7. Does anyone know if there's a reason why spatial audio doesn't work when animating a camera towards a mesh that has an audio source attached or placed in the same position? My project has an arc camera parented to a mesh, which is then animated around a scene. I've set up end points that the camera parent animates to, the idea being that as soon as it approaches the centre of the target position, the spatial audio will kick in. In practice, however, this never seems to happen; in fact, the only way I can get it to work at all is if I manually zoom or move the arc camera past these audio zones. Any thoughts, anyone?
  8. Gerente

    Audio on Mobile

    Hello, any idea if there is a problem loading .ogg files in mobile Chrome? I'm trying:

    PIXI.loader.add('menu', "assets/audio/menu.ogg");
    PIXI.loader.load(function (loader, resources) {
        console.log(resources);
        resources['menu'].data.play();
    });

    It works perfectly on PC, but on mobile (Chrome on iPhone) it doesn't even reach the console.log line. Any idea? Thanks
  9. lpbr

    Cannot STOP sounds...

    I have an array with a list of sounds, and I loop over it basically like this to add them to the game:

    for (var i = 0; i <= aSounds.length - 1; i++)
        window[aSounds[i].name + '_snd'] = game.add.audio(aSounds[i].name);

    To play a sound at the desired points in my code I just do:

    window[me.name + '_snd'].play(); // works fine

    However, I am not managing to STOP the sound with:

    window[me.name + '_snd'].stop(); // instead of stopping, the sound just restarts

    What am I doing wrong? Thanks!
  10. Audio middleware company CRI Middleware recently announced that their flagship product ADX2, which already powers the audio of more than 4,000 games, is now available for HTML5. ADX2 provides all the features necessary for fully interactive audio (randomization, complex sound-effect behaviors, real-time parameter control, auto-ducking, DSP effects, and more) with a DAW-like intuitive interface and a highly optimized sound engine. For more information about ADX2, you can check the CRIWARE website: http://www.criware.com CRIWARE also offers ADX2UP, a plug-in version of ADX2 for Unity, available to all developers for US $99: http://unityplugin.crimiddleware.com/?lang=en
  11. Hello, possible bug in the Assets Manager audio. QUESTIONS: Is this a bug? Is it related to M64 and M65? Does anyone have any insight on M64 and M65? STEPS: 1) Run the audio test (https://www.babylonjs-playground.com/#PCY1J#8) from these docs (https://doc.babylonjs.com/how_to/playing_sounds_and_music): no audio. Reproduced in my dev environment, along with interesting deprecation messages in the console (maybe not related?):

    [Deprecation] GainNode.gain.value setter smoothing is deprecated and will be removed in M64, around January 2018. Please use setTargetAtTime() instead if smoothing is needed.
    [Deprecation] AudioParam value setter will become equivalent to AudioParam.setValueAtTime() in M65, around March 2018. See https://webaudio.github.io/web-audio-api/#dom-audioparam-value for more details.

    Interesting. NOTE: The possible bug behavior seems limited to AssetsManager, as I can still load a sound directly and it works. Ping @davrous... : ) Thanks,
  12. Hello, my name is Esteban Tamashiro. I am a freelance composer with experience in several kinds of games. I am particularly interested in RPGs and visual novels. I have a degree in Music Composition, and I am a Piano Teacher Grade 5 from the Yamaha Music System. You can listen to my music here: Memory Trees OST (RPG): Dawn Drop OST (Visual Novel): More of my work here: https://soundcloud.com/esteban-tamashiro https://vimeo.com/estebantamashiro Contact: estebantamashiro@gmail.com
  13. joe2movies

    Menu state audio issue

    I have an issue with the audio when first starting the game in the menu state: the audio doesn't play. But after I start playing the game from another state and return to the menu state, it works fine. When I test on an Android phone it doesn't work properly, but in the emulator it seems to work fine. I use PhoneGap to build the app. Below is the code for my menu state:

    var menuAudio = null;
    var menuState = {
        create: function () {
            this.menuAudio = game.add.audio('cheering');
            this.menuAudio.loop = false;
            this.menuAudio.volume = 1;
            this.menuAudio.play();
            var favButton = this.game.add.button(370, 780, "favFolder", this.favMovies, this);
            favButton.anchor.setTo(0.5, 0.5);
        },
        favMovies: function () {
            if (this.menuAudio != null) {
                this.menuAudio.destroy();
            }
            game.state.start("difficulty");
        }
    };
  14. Looking for a cinematic theme for your game? Listen to a preview of some of my latest music: Hero Fantasy Pack Vol 1
  15. Hello, I'm able to attach spatial audio to an object in my scene. But now that I'm creating many objects and pushing them into an array, I cannot assign the audio to a mesh in the array. Here's the basic code:

    for (let b = 0; b <= 20; b++) {
        boxRotArray[b] = BABYLON.MeshBuilder.CreateBox("box_rot" + b, { size: 0.1 }, scene);
    }
    const bgm = new BABYLON.Sound('backgroundMusic', './Demos/sounds/WubbaWubbaSound.mp3', scene, null, {
        spatial: true,
        maxDistance: 20,
        loop: true,
        autoplay: true
    });
    bgm.attachToMesh(boxRotArray[0]);

    I've tried using the name of the object in the array, and I've also tried assigning a new variable to equal boxRotArray[0]. When I console.log(boxRotArray[0]); it is a mesh, and I'm using it in many other assignments. Any thoughts? Thanks, DB
  16. Fenopiù

    sound.isPlaying always returns true

    Good morning, everyone! I'm using Phaser 2.6.2 and working on playing a sound sequence. I didn't find a queue option in the Phaser sound manager, so I tried to build a basic one myself. I have 1 to 5 sounds to play one after another, to make a single sound if the player has reached 1 to 5 goals. I have all the sounds preloaded in an array. When I call console.log(sound[i].isPlaying); it is always true. When I call console.log(sound[i].currentTime); it is always 0. Why? How can I make it work?
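Since the post describes building a basic sound queue, here is one framework-agnostic sketch. All names are invented, and wiring `playFn` to Phaser's actual sound-completion signal is left as an assumption rather than shown:

```javascript
// Play a list of named sounds one after another. `playFn(name, onDone)` is
// an injected adapter that starts a sound and calls onDone when it finishes
// (e.g. by hooking the framework's "sound stopped" callback), so the queue
// itself stays framework-free and testable.
function playSequence(names, playFn, onAllDone) {
  let i = 0;
  function next() {
    if (i >= names.length) {
      if (onAllDone) onAllDone();
      return;
    }
    const name = names[i++];
    playFn(name, next); // advance only when the current sound reports done
  }
  next();
}
```

Keeping the advance logic driven by a completion callback, rather than polling isPlaying, sidesteps exactly the kind of stale isPlaying/currentTime readings the post describes.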
  17. Fenopiù

    Preloading audio from JSON

    Hi everybody! I'm trying to preload my audio from the same JSON where I preload the images. This is my JSON file:

    {
        "preloading": [
            { "type": "image", "key": "image1", "url": "Images/image1.png", "overwrite": false },
            { "type": "audio", "key": "audio1", "urls": [ "Audio/audio1.wav" ], "autoDecode": true }
        ],
        "meta": {
            "generated": "1401380327373",
            "app": "Phaser Asset Packer",
            "url": "http://phaser.io",
            "version": "1.0",
            "copyright": "Photon Storm Ltd. 2014"
        }
    }

    This is where I read the file:

    function preload(): void {
        game.load.removeAll();
        game.load.pack('preloading', 'myjson.json', null, this);
        game.load.audio('audioN', 'Audio/audioN.wav', true);
    }

    function startOnClick() {
        audio = game.sound.play('audioN');
    }

    If I launch my game, the images are preloaded successfully, as is audioN.wav, but the audio in the JSON gives me this error:

    phaser.js:72849 Phaser.Loader: No URL given for file type: audio key: audio1

    for every audio file in my JSON file. The JSON file, the images folder, and the audio folder are located in the root of the project. The audioN entry is a test I added to see whether the problem was a typo in the JSON or not; removing it from my code doesn't change anything. Any idea why I receive this error? Maybe @Arian Fornaris or anyone else could help me understand where my code is wrong? And, partly off-topic: why can't I change the meta tags (except "copyright")? If I do, it doesn't load anything anymore.
  18. I'm trying to port my game to mobile using Ludei's CocoonJS. When playing my game in a browser (Chrome, Edge, FF), the audio works fine. Opening the game in the Cocoon Developer App (using the Canvas+ option), none of the audio is found. I get errors such as: Phaser.Cache.getSound: Key "Orcs" not found in Cache. All the audio files are in my preload state, i.e. game.load.audio("Orcs", ["assets/sounds/Orcs.mp3"]); And again, all the audio works fine in a regular browser. Has anyone else run into this issue and found a solution? One more note - the audio does work when using the WebView option in the Cocoon Dev App. However, I'd rather use Canvas+, since everything I've read says that's the better option for HTML5 games contained entirely within a canvas. Thanks!
  20. jonwiesman

    Playing .m4a files

    Hi, I'm trying to play a simple .m4a file when the user collides with an object. I create it like this:

    game.load.audio("dot0", "sounds/dot0.m4a");
    sfxDot = game.add.audio("dot0");

    Then later, I try to play it like this:

    sfxDot.play();

    But I get the following error:

    Cannot read property 'createBufferSource' of null
        at c.Sound.play (phaser.min.js:3)
        at Object.collectDot (

    The sfxDot object seems like a valid sound, from the F12 console: c.Sound {game: c.Game, name: "dot0", key: "dot0", loop: false, markers: {…}, … I'm sure I'm doing something stupid; this is my first attempt to get a sound to play, so I'm probably just missing a setup call or something. Also, I'm not even sure that .m4a files are supported, but they were listed in the docs. I looked through the audio samples but couldn't find one that does exactly what I'm trying to do here (open a .m4a and play it).
  21. Hi! You can call me Kit. I make music, hence this being a music station. I've been composing for about two years for a project I've been working on with a couple of friends. I realized I could raise some extra money for it by composing for other people, so hopefully we can work something out. You can see my portfolio here: It's right here! It's got an extremely heavy focus on electronic, jazzy stuff, but I will gleefully do whatever you want me to do. Do you want a country-style bluegrass jam over a screaming acid house bassline? Can do. Do you want a bombastic orchestral arrangement where every instrument is a bike horn? Glad to oblige. Do you want to go avant-garde and make your game's OST completely silent? Works for me. If this sounds good to you, let's start talking about prices. PRICING: You can pay in two different ways. I either work for $10 an hour (rounded to the nearest hour) or $20 per minute of music (rounded to the nearest thirty seconds). If you want to pay per minute, keep in mind that revisions cost extra. If you give me a good idea of what you want, I can crank something out for you lasting about two minutes in about three to five hours. Basically, paying per minute of music or per hour of work will come out to about the same price, so it's really up to you. BUDGETING: I'm admittedly new to this, so if you need to figure out whether you can afford me or how my prices compare to other people's, I'm happy to give you an estimate if you give me a budget, a tracklist, and a general style. YouTube links as references are a good idea - these help a lot when it comes to figuring out what I can do for you. I'm also down to give you a quick snippet of music lasting between thirty and forty-five seconds if you want to know what to expect when you work with me. Keep in mind that it won't be a complete track, but it'll probably help us reach an understanding regarding workflow, communication, etc.
If this sounds good to you, feel free to drop me a line at magiccirclemusicstation@gmail.com. I'll be waiting for you. Cheers!
  22. I have a game I am working on and want to bring audio into the background. One of the key features of my game is that it is based on live data, and one of those live data components is live audio. I have the ability to receive live audio on my server (a server written in C, by the way), encode it how I see fit (to MP3, for example), and then package it and send it somewhere. Where I am totally lost is exactly how I am supposed to package and send this audio data to the browser. I do know that there are at least three protocols - HTTP, RTMP, and RTSP. I think I want to stick with HTTP. Suppose I create an <audio> element in the browser. What does this element expect in terms of "here is the live stream for you to connect to"? And how is my server supposed to deliver this audio data? Do I need to open a web socket? Does the audio need to be saved to disk (like a spool or scratch file)? I am pretty lost here after many days of research...
  23. boyofgreen

    Audio blobURL broken in v3

    Hello friends, I recently upgraded to the RC version of BJS 3 and am now seeing an error that isn't present in v2. I am creating an audio file and setting the src to a blob URL that I have just created with getUserMedia (it's a wave):

    var newUrl = (window.URL || window.webkitURL).createObjectURL(blob);
    var audioEl = new Audio();
    audioEl.controls = true;
    audioEl.src = newUrl;
    document.getElementById('curTrackContainer').appendChild(audioEl);
    audioObj[i].sound.dispose();
    audioObj[i].sound = new BABYLON.Sound(newName + i, newUrl, scene, null, { loop: false, autoplay: false });

    The audio file plays okay from the audio element, but when I try to have Babylon play it, I get this error:

    BJS - [21:52:42]: Error while trying to play audio: Sampled15010375539220, TypeMismatchError babylon.customRC.js (6025,13)

    This method works fine in version 2.5 but breaks in 3.0. Any help would be appreciated if we can figure out what has changed in that space. Thanks, Jeff
  24. Zampano

    Audio loop duplicating - Bug?

    Hi there, I've stumbled across some very weird audio behavior and can't seem to avoid it. I'm trying to loop a marker of my audio file, which works well on its own. However, once I pause and resume the audio loop at least once, then as soon as the next loop point is reached it does not stop the currently running audio and play from the marker; it just adds a new layer of audio playing from the marker, while the old one continues. And it keeps doing that for every loop.

    // Create
    {
        this.sfx_music = this.sound.add("key");
        this.sfx_music.allowMultiple = false;
        this.sfx_music.addMarker('turbo', 116.8696, 7.3846, 1, true);
        this.sfx_music.play('turbo');
    }
    unpauseGame: function () {
        if (this.game.paused) {
            this.game.paused = false;
            this.sfx_music.resume();
        }
    },
    pauseGame: function () {
        this.sfx_music.pause();
        this.game.paused = true;
    }

    Pausing the game without pausing the music doesn't cause this problem, but it's mandatory for me to pause the music when pausing the game. Could this be a bug in sound.pause() or sound.resume()? I'm using Chrome, but I see the same behavior in Firefox as well. Please help! Best, Zampano
  25. swpowe

    Recording Audio

    Hello! I'm brand new to Phaser and fairly new to programming. I'm trying to set up a basic game, and I'm wondering if it's possible to record audio using Phaser. I've got objects that I'm dragging around, and when I place an object, I'd like to be able to record the player's audio. Not sure if this is possible or how I'd even go about it. Any help would be much appreciated! Thanks!