Showing results for tags 'webm'.



Found 4 results

  1. JCPalmer

    Status of Camera rig VR

    I am not sure there is really a need to be able to record (webm) when using the VR rig. One of the few 3D videos on YouTube has gotten 2.8 million views, so I would definitely want to get some of that. I have not seen any videos with a barrel distortion, and I do not have the hardware. Can you even watch stuff with a "phone on your head"? I recall something during the Winter Olympics. If trying to record with the VR rig is a waste of time, please let me know.

    If this is desirable output, basically nothing comes out when canvas.toDataURL('image/webp', quality); is called on Chrome while this rig is in use. I suspect this is because each of the sub-cameras writes to its own viewport, and that is not taken into account. I am really glad this is not how the stereo rigs I added work, or I might have screwed myself.

    When I tried the newer WEBVR rig, it did produce output, but only a single screen. Is the VR rig just around for backward compatibility? Might it be OK to mess with the old VR rig, maybe add another post process, if I can think of a way?

    FYI, @Wingnut & @Deltakosh, I cringe every time RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_PARALLEL is mentioned as a solution when some device does not support WebVR. The stereo rigs are interleaving, meaning that in the doubled dimension only every other pixel is written. That gives each side a squished look, but on a 3D TV it widens back out to the original dimensions. Something for these devices would need to crop the left & right side of each image to really work.
  2. I have made a form for inputting parameters for webm audio/video recordings. I have broken the top-level StackPanel out from the full-screen AdvancedDynamicTexture. When either the "Rehearse" or "Record" button is pressed, the form must not still be visible, or it ends up in the capture. Clearly that is not acceptable in the video itself, & a little annoying for rehearsing.

    The only way I found to temporarily get rid of it was to dispose the texture. After the scene is done, I take the StackPanel and put it in a new AdvancedDynamicTexture, just 2 lines. The buttons are connected to NOTHING now, though. Using a Mesh for the texture would allow it to be hidden, but then I need a second camera in ortho mode & to set up layer masks. Is there any way to avoid going that route?
  3. I was attempting to use MediaRecorder to mix all the BABYLON.Sounds that called play() into a single Opus audio track, at the relative time each started playing. I am starting from this stackoverflow topic. I am writing in TypeScript, and found that MediaRecorder is not yet defined there. I can live with that. The bigger problem is that AudioContext is defined very inconsistently. I will need createMediaStreamDestination(), which Mozilla documents, but which is not a method in the TypeScript definitions. I thought, OK, maybe do a PR of the missing things, but I had better check what the spec says first. WTF, not one of them matches another! What am I missing?

    This is part of a rewrite / TypeScript conversion of Whammy, which I am calling Double Whammy. I know Whammy only works on Chrome, so I have to test that this works there before proceeding with the larger project. I have verified that Opus is a valid WebM audio format, though, and think the video should play everywhere.
  4. Hello, In using various file formats for a videoTexture, I have an .mp4 file at 3896 KB, an .ogv file at 4421 KB, and a .webm file at 4421 KB. On my PC they all load and play within 1 second, but on my Android devices the .mp4 file causes my mesh object (a plane) to take approximately 15 - 30 seconds to display anything other than a black surface, as if it could not find the texture; and when I press my play button (a simple GUI element which calls the play() function), it takes approximately another 15 or more seconds to begin playing the .mp4 video file. However, when the source is an .ogv or a .webm file, both at a larger file size, they load in approximately 1 second and play approximately 1 second after pressing my play button.

    I have tried practically every setting available when formatting the .mp4 video file to improve performance on Android devices; however, nothing has helped. Any thoughts on such poor performance of .mp4 files on Android would be appreciated - as would a way to identify the Android OS and avoid loading .mp4 files on Android, but continue to load .mp4 on PC and Mac OS for improved quality per file size.

    Also, I've tried every possible way to stop autoplay of a videoTexture on Windows, but nothing appears to work. I thought that Wingnut had posted a playground scene which paused autoplay on PC, although I can only find a couple that change playback speed and pause on a pointerUp scene event. However, what I'd truly like to be able to do is pause every device at frame 0 or frame 1 (in ms, of course) using some function utilizing (htmlVideo.currentTime). As always, thank you for your help with any of these questions. Cheers, DB
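
The cropping step mentioned at the end of the first post can be sketched as plain pixel-buffer math, independent of Babylon.js. This is only an illustration of one reading of "crop the left & right side of each image": `cropSides` is a made-up name, and it simply removes a fixed number of columns from each edge of an RGBA frame.

```javascript
// Sketch: trim cropPx columns off the left and right edges of an RGBA frame.
// rgba is a flat Uint8ClampedArray of width * height * 4 bytes (row-major).
function cropSides(rgba, width, height, cropPx) {
  const newWidth = width - 2 * cropPx;
  const out = new Uint8ClampedArray(newWidth * height * 4);
  for (let y = 0; y < height; y++) {
    // Copy one row, skipping cropPx pixels at the start and end.
    const srcStart = (y * width + cropPx) * 4;
    out.set(rgba.subarray(srcStart, srcStart + newWidth * 4), y * newWidth * 4);
  }
  return out;
}
```

A buffer like this could come from a 2D context's getImageData() on the recording canvas; the same loop works per eye if each half is cropped separately.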
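
For the second post, one possible alternative to disposing the texture is toggling visibility: Babylon GUI controls expose an `isVisible` property, and a hidden control stays parented, so its buttons remain wired up. The helper below is my own sketch (the `withHidden` name and shape are assumptions, not Babylon API), written against any object with an `isVisible` flag:

```javascript
// Sketch: hide a GUI control (e.g. the StackPanel) for the duration of an
// action such as rehearsing or recording, then restore its prior visibility.
function withHidden(control, action) {
  const previous = control.isVisible;
  control.isVisible = false;      // hidden, but still attached and wired up
  try {
    return action();
  } finally {
    control.isVisible = previous; // restore even if the action throws
  }
}
```

For an asynchronous recording you would instead set `isVisible = false` before starting and flip it back in the recorder's completion callback; the point is the same: no dispose, no re-parenting, no lost button handlers.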
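
The third post's plan — routing several Web Audio source nodes into one createMediaStreamDestination() and recording the resulting stream as WebM/Opus — can be wired up roughly as below. This is my own wrapper, not anything from Babylon or Whammy: `recordMixedAudio` and its parameters are invented names, and the AudioContext and MediaRecorder constructors are passed in so the wiring can be exercised with stubs outside a browser.

```javascript
// Sketch: mix several audio source nodes into one recorded WebM/Opus track.
// audioCtx: an AudioContext (or stub); sourceNodes: nodes with connect();
// MediaRecorderImpl: the MediaRecorder constructor (or stub).
function recordMixedAudio(audioCtx, sourceNodes, MediaRecorderImpl, mimeType) {
  mimeType = mimeType || 'audio/webm;codecs=opus';
  const dest = audioCtx.createMediaStreamDestination();
  sourceNodes.forEach((node) => node.connect(dest)); // fan-in = mixing
  const chunks = [];
  const recorder = new MediaRecorderImpl(dest.stream, { mimeType });
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();
  return {
    recorder,
    // Resolves with the recorded chunks once the recorder has stopped.
    stop: () => new Promise((resolve) => {
      recorder.onstop = () => resolve(chunks);
      recorder.stop();
    }),
  };
}
```

In a browser, each BABYLON.Sound's underlying node would be connected as it calls play(); the relative start times then fall out naturally, since the destination stream is recorded in real time.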
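
Both halves of the fourth post can be sketched with two small helpers. These are assumptions of mine, not Babylon.js API: `pickVideoSource` does a simple user-agent check (on the premise, from the post, that this .mp4 performs badly on Android), and `holdOnFirstFrame` uses the htmlVideo.currentTime approach the post mentions, seeking to 0 and pausing once the first frame's data is available.

```javascript
// Sketch: prefer the .webm source on Android, .mp4 elsewhere.
// sources is assumed to look like { mp4: 'clip.mp4', webm: 'clip.webm' }.
function pickVideoSource(userAgent, sources) {
  const isAndroid = /Android/i.test(userAgent);
  return isAndroid ? sources.webm : sources.mp4;
}

// Sketch: pause an HTML video element on its first frame.
// currentTime is in seconds; 'loadeddata' fires once the first frame is ready.
function holdOnFirstFrame(video) {
  const onLoaded = () => {
    video.currentTime = 0;
    video.pause();
  };
  video.addEventListener('loadeddata', onLoaded, { once: true });
  return onLoaded; // returned so a caller can remove or trigger it manually
}
```

With a Babylon videoTexture, `holdOnFirstFrame` would be handed the texture's underlying video element; playback then starts only when the play button calls play().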