
Streaming Video as a Texture


dbawel


Hello,

 

Does anyone have an example of how to stream video into a scene and apply it to an object as part of a dynamic texture?

 

Also, in a recent post I had discovered that when loading many textures into an array, the last texture is often not loaded, and sometimes more than one texture is not loaded.  My solution was to load a small texture that isn't used in the scene, after which all textures except the last "dummy" texture almost always load correctly.  Wingnut and others tried to duplicate this in the playground as I did, but were unable to reproduce it.

However, I recall using a Babylon.js function which waits for all textures to load before continuing the script.  I can't locate the scripts where I used this, so if anyone can point me to these functions, I would be grateful.
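For reference, two Babylon.js mechanisms that may be what is being recalled here are scene.executeWhenReady and the AssetsManager.  A minimal sketch of both (the texture path is a placeholder, not from this thread):

// Possibility 1: run code only once the scene reports ready (all pending resources loaded).
scene.executeWhenReady(function () {
    // safe to continue the script here
});

// Possibility 2: queue textures through the AssetsManager and wait for onFinish.
var assetsManager = new BABYLON.AssetsManager(scene);
var textureTask = assetsManager.addTextureTask("tex1", "textures/myTexture.png"); // placeholder path
textureTask.onSuccess = function (task) {
    // task.texture is loaded and usable here
};
assetsManager.onFinish = function (tasks) {
    engine.runRenderLoop(function () { scene.render(); });
};
assetsManager.load();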

 

However, streaming video onto an object as a texture (dynamic texture) is most important, as I'll find my old scripts on disk when I really put in the effort.  It's simply that I need streaming video on an object as soon as I can figure it out, or as soon as one of the geniuses on this forum provides an example.

 

As always, thanks for any help you might provide.

 

Cheers,

 

DB


 

 
deltakosh

Posted 09 September 2015 - 06:50 PM

Hello,

 

We support VideoTexture, so if your video is a streamed video, I think this should work.

 

Example here: https://github.com/B...r/index.js#L147
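For illustration, a minimal sketch of applying a VideoTexture to a mesh, assuming the 2.x-era API from the linked example; the video URLs and the plane are placeholders, not taken from that file:

// Create a plane and give it a material whose diffuse texture is a VideoTexture.
var plane = BABYLON.Mesh.CreatePlane("screen", 8, scene);
var videoMat = new BABYLON.StandardMaterial("videoMat", scene);
var videoTexture = new BABYLON.VideoTexture("video", ["movie.mp4", "movie.webm"], scene, true);
videoMat.diffuseTexture = videoTexture;
videoMat.emissiveColor = new BABYLON.Color3(1, 1, 1); // keep the video visible without scene lights
plane.material = videoMat;

// The underlying HTML <video> element is exposed, so playback can be controlled directly.
videoTexture.video.play();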

 

 

 

Hey DK and Hello All,

 

This resembles the babylonjs.com page, which would make sense.  These are streaming video files; however, is there a method I can apply to any one of these streaming videos to define it as a texture on a Babylon.js 3D scene object, and as a MultiMaterial on this object with a dynamic texture also assigned, so that I can paint on the object with the streaming video texture while it is playing or paused on a single frame?

I will begin testing with some of the functions DK exposed, as well as some of the cases, to see where I'm able to integrate our node.js server to stream a video as a texture on a 3D object to multiple users, so they can use my real-time multi-user drawing application to paint directly onto an object with a streaming video as a texture - painting on a paused video frame which is captured and then applied to a mesh is fine for our needs.  This I have yet to figure out, so I really bow down to the big brains on this forum to open up their Mensa minds for the immense task of painting on streaming video - paused or not.  Even if I must capture a single frame from the video and map it onto a single plane, applied as a MultiMaterial along with a dynamic texture, to be able to paint on the streaming video or on a single captured frame.  Those of you who know what I'm asking understand that this is certainly a challenge.

I'm very concerned the Babylon.js framework may not currently be capable of accomplishing this, but I certainly hope that if not now, then soon - as once I have this in place, it is a custom app I'm writing as a favor for Peter Jackson and Richard Taylor, who own Weta - and they will be happy to send a personal letter (I'm speaking for them as friends) to ANYONE who finds a solution to this challenge.  I will also invite you to tour Weta Film Studios and Weta Workshop with Sir Richard Taylor and Sir Peter Jackson, as they will extend their gratitude to any of the Mensa brains who might find the solution to this before I do; and I'm not a Mensa member (never applied, as I don't do well with rejection).

 

So if anyone is able to find the solution for this, and you are able to book a ticket to NZ, I will ensure that you receive a tour from Peter Jackson personally of his Stone Street Studios, as well as the fantastic tour of Weta Workshop from Richard Taylor - which will absolutely change your outlook on what is really doable in current feature film and other production, as these places rival NASA - really - and you'll find feature film history that you will absolutely appreciate, and everyone you know will be amazed by your stories for years to come.

 

So I've promised a lot here, which is most likely not necessary; however, I'm under a deadline, and will ask my good friends who own Weta to help with the incentive to accomplish the key function of this app, which we want to begin beta testing in 30 days - with additional features, of course.  I'll post the entire real-time multi-user app on the forum once it's ready for beta.  Users such as Temechon have a link to an alpha of our real-time multi-user creative drawing app now, but we'll see if he believes this app is as impressive as I am alluding.  It's certainly fun to use, but Pete is counting the days until it is ready for use in his next production.  I can explain why at a later date, if it's not already obvious - but this type of knowledge would only be understood from working in feature film dailies.

 

Oh, and I'll try to be there too, to make sure that all of the "juicy" stories and pictures are told and shown to make your tours a VERY special event for whoever might get there.

 

And as always, I appreciate any help from anyone who might assist in providing a solution to help me reach the end goal of painting directly on streaming video in real time.

 

Cheers,

 

DB


Wow.  Thanks for that, DB!  That sounds great! 

 

Ok, just a quick opinion, here.  You are dealing with an area that is NOT part of BJS... and that's getting a frame of video... into a context2D object.  Once you have that... your interaction with BJS... comes alive.  And what you are talking about here... is continuously "broadcasting" AND "receiving"... a "live" context2D image buffer.  Essentially, this would be a networked object... with a networked onChange event... which would trigger an updating of ALL other (remote) context2D objects.

 

On the BJS side... you are watching for changes to your local networked context2D object, and re-painting your dynamicTexture every time it changes.  Also, every time YOU change your texture with a brush stroke, you "submit" your change to the networked context2D "thing".
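To make that idea concrete, here is a rough sketch of one way it could look, assuming a socket.io connection named socket; the "drawStroke" event name and the stroke fields are made up for illustration:

// Shared dynamic texture that everyone paints into.
var drawTexture = new BABYLON.DynamicTexture("draw", 1024, scene, true);
var ctx = drawTexture.getContext();

// Paint a single stroke into the local canvas and re-upload it to the GPU.
function applyStroke(stroke) {
    ctx.strokeStyle = stroke.color;
    ctx.lineWidth = stroke.size;
    ctx.beginPath();
    ctx.moveTo(stroke.fromX, stroke.fromY);
    ctx.lineTo(stroke.toX, stroke.toY);
    ctx.stroke();
    drawTexture.update();
}

// Local brush stroke: draw it, then broadcast it to the other users.
function onLocalStroke(stroke) {
    applyStroke(stroke);
    socket.emit("drawStroke", stroke);
}

// Remote brush stroke: just repaint the local dynamic texture.
socket.on("drawStroke", applyStroke);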

 

I'm sure there's lots of other cool state-management things to consider... but... hmm... I still thought I'd make a little noise here... what the heck.  Good luck.


Hey Wingnut, DK, and All,

 

Please prepare yourselves for a long read, as I have much to say.  I just hope it's interesting enough to hold your attention.  <_<

 

I already have the multi-user video streaming function for both user and server working on a very basic level (for simple testing and proof of concept).  If you wish to test, just launch the link below on two different systems or in two different browsers, and the video can be controlled by each separate user who has the scene loaded.  As I didn't add any functions for frame sync in this test app, the sync isn't really locking, but it will be once I integrate the code to continuously sync; the sync functions for most media formats were written for a separate app years ago.  I'll integrate the sync functions last in my list of priorities, as my main concern currently is that I'm not yet able to take the video from this multi-user streaming app and apply it as one of two materials which need to be pushed into a single MultiMaterial and applied to a mesh - so that when the video is paused, I can use the current drawing function I've written for the dynamic texture to paint (draw) over the top of the mesh with the MultiMaterial applied to it.
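For what it's worth, here is a rough sketch of how a MultiMaterial is assembled in Babylon.js, assuming videoMat and paintMat already exist (both names are placeholders).  One caveat: a MultiMaterial assigns different materials to different subMeshes (regions of the mesh) rather than layering them on the same faces, so painting over the video on the same surface may still come down to a single dynamic texture.

var multiMat = new BABYLON.MultiMaterial("multi", scene);
multiMat.subMaterials.push(videoMat);  // material index 0: the streaming video
multiMat.subMaterials.push(paintMat);  // material index 1: the dynamic (paint) texture
mesh.material = multiMat;

// Split the mesh into two subMeshes, one per material index.
// The halfway split below is purely illustrative.
mesh.subMeshes = [];
var totalVertices = mesh.getTotalVertices();
var totalIndices = mesh.getIndices().length;
new BABYLON.SubMesh(0, 0, totalVertices, 0, totalIndices / 2, mesh);
new BABYLON.SubMesh(1, 0, totalVertices, totalIndices / 2, totalIndices / 2, mesh);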

 

The link to the multi-user video streaming test app is:

https://warm-beyond-8271.herokuapp.com/Public/playerMU1.html

 

The video needs to be started in each browser, as I set it up this way for each user to initiate their session with a press of the Play/Pause button.  Then to sync (sort of) the videos in separate browsers and/or systems, just hit the Restart button and this will restart the video in any browser which has initiated the video streaming.  All other buttons work correctly except for "FWD +10sec" and "Back -10sec".  I haven't looked to see why these aren't functioning correctly; however, it's not in any way important to the test scene, so these will remain broken for now and forever.  But at least you can see that I have all functions and elements for completing the app, with the exception of pushing the streaming video (or a single paused frame of the video) to a MultiMaterial which also has the dynamic texture pushed to it.
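(In case it's ever worth fixing, seeking on a plain HTML5 video is usually just a matter of nudging currentTime; the element and button ids below are made up, not taken from the test page.)

var video = document.getElementById("player"); // placeholder id

document.getElementById("fwdBtn").onclick = function () {
    video.currentTime = Math.min(video.duration, video.currentTime + 10);
};
document.getElementById("backBtn").onclick = function () {
    video.currentTime = Math.max(0, video.currentTime - 10);
};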

 

Once I accomplish this, the app will be ready for Pete, Richard, and their teams to begin beta testing - which they are really looking forward to, as this will save them considerable time in reviewing dailies (shots from previous days of shooting, as well as shots with new VFX and other post processes recently applied).

And for the design team (the best design team in the world - by far) which Richard manages, this app will provide their design artists with the tools to quickly construct and/or alter illustrations with clients in any location worldwide from practically any mobile device.  This multi-user drawing app is working now, as it doesn't require any video components or a MultiMaterial with dynamic textures, but Richard would like to have a single application to begin using company-wide for both Pete's and Richard's needs, as opposed to a separate app for each task - dailies vs. design illustration.

 

And FYI - I'm building this for them and giving it to them as a free tool, and will never charge them for the use of this application, as I owe them both a great deal of gratitude for first giving me the opportunity to be entrusted with several key roles in production, and for giving me their full trust and complete freedom to work with my teams without any constraints or interference - regardless of some odd requests from time to time.  And since I left Weta, they have remained good friends, as opposed to other directors who will go un-named.  And I only mention this because it's important to me - since each of us together as a core team bled out our hearts and souls in order to make challenging films such as the Lord of the Rings trilogy and others.  Things never seen or done before - so you become quite good friends over years of working long hours 7 days a week - and Richard and Pete bled more than anyone in making their films.  So it's an honor to remain good friends after going through so much together.

 

Anyway, I thought I'd mention this as I didn't want anyone to think I might be making money from this with a company which can certainly afford to pay well for it.  Not that I feel it's in any way wrong to charge for apps, as I must earn an income at some point, or my wife will eventually hand me divorce papers.  But for now, any help given will be very much appreciated, which is why I offered tours of the studios and of Weta Workshop, as Pete and Richard would be happy to share their "worlds" with anyone who helps their friend in creating and delivering a tool which will help them save time and improve quality in their daily tasks.

 

As for the link to the code that DK provided, the function DK pointed me to is definitely a component that I did not know about and will need to integrate in any version of applying video as a texture - as referenced below:

"new BABYLON.VideoTexture" 

But I suppose the only way for this to be applied as a texture is to stream the video and cache it into memory for every user, and then apply it as a material - as I don't see any way to apply it as a texture unless it is loaded into memory.  DK, if I'm wrong on this assumption, please let me know.
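One possible route for the paused-frame case, sketched under the assumption that videoTexture is a BABYLON.VideoTexture and mesh is the target object: pause the stream, copy the current frame into a DynamicTexture's 2D canvas, and then paint into that same canvas.

var paintTexture = new BABYLON.DynamicTexture("paint", 1024, scene, true);
var ctx = paintTexture.getContext();

// Copy the current frame of the underlying <video> element into the dynamic
// texture's canvas, then upload it to the GPU.
function captureCurrentFrame() {
    ctx.drawImage(videoTexture.video, 0, 0, 1024, 1024);
    paintTexture.update();
}

videoTexture.video.pause();
captureCurrentFrame();

// Switch the mesh to a material that uses the paintable texture.
var paintMat = new BABYLON.StandardMaterial("paintMat", scene);
paintMat.diffuseTexture = paintTexture;
paintMat.emissiveColor = new BABYLON.Color3(1, 1, 1);
mesh.material = paintMat;

// Brush strokes then draw into ctx and call paintTexture.update() after each stroke.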

 

So thank you very much for this.  Of course, as DK is one of the few people giving their valuable time to writing the Babylon.js framework, he and any one of the small group of BJS soldiers (and those of you who also contribute extensions and other valuable tools regularly) are certainly welcome to a guided tour of the Weta Workshop from Sir Richard Taylor - which is by far one of the greatest places on earth.  It's much better than a trip to Disneyland, and Disneyland is the only experience I can think of that might be comparable in any way - but Workshop is far better.  I can perhaps get Pete to commit to giving a tour of Stone Street Studios, but I'm only cautious because Pete's time is incredibly difficult to get - he works harder than ANYONE in film and entertainment (perhaps only Richard's time commitment and level of work can compare to Pete's), and although they both have families, they both have to be a father every day, as they aren't Hollywood types with nannies and servants.  They both care for their children as any normal parent, drive them to school, do the cooking, cleaning, laundry, etc., and don't live any part of the lifestyle you would expect of someone worth almost a billion dollars.  And Richard is my best friend, so he would make a tour I request a priority, where Pete would choose time with his kids first.  And Richard does also, but he always finds time for my ridiculous requests.

So if you fit the profile of anyone mentioned above, and get to Wellington (let me know in advance to make certain they are at home and not shooting abroad), you are welcome to a personal tour by Sir Richard Taylor, and Peter Jackson - if I can lock him down in Wellington for a few hours.  But for anyone who provides me with a working example of painting on a streaming video, I'll do my best to make certain that you receive tours from both Richard and Pete.  Although, your tour of Workshop will be the one which will live in your memory as one of the best days ever.  And you can also get some "real dirt" on me from Richard if you ask him - that is, if any of my dirty secrets have value to anyone.  But Richard and I have been to hell and back, so you'd hear a lot anyway.

 

And Wingnut, thanks for your "analysis" of some of the elements and operations required to make this all come together.  The good news is that I already have all of the functions and elements you mention working now - and with Temechon's assistance with the use of bGUI, I have a very nice real-time multi-user app and interface which allows many users to simultaneously draw on any 3D object in their own color and brush size, and all of these attributes unique to each user are drawn on every participating local user's screen in real time, maintaining all unique attributes when drawn - so it's working now and is a lot of fun to use.  I sent Temechon the first alpha version of this for him to see and to test, and haven't checked my email yet today to get his feedback.  But once I finish version 1 of the app (no video drawing in version 1), I'll post it on this forum for everyone to use and to provide feedback if they would be so kind.

 

I still have much to do to solve painting on streaming video, but I'm so very close.  I expect version 1 will be finished this weekend (finally), and version 2 should follow within a week later - I hope.  Again, thanks for reading through my exhaustive posts (I'm guessing some of you can tell that I've written a couple of novels previously).  Please keep any ideas coming, and I look forward to calling the guys at Weta to schedule a tour - whether it's for a user who provides me with a working example of all that I need (which I pretty much have already, and will be posting soon), or most likely, all of you mavericks who are spending countless hours writing this framework, which gives us an open source framework to produce work within - and hopefully will generate revenue for many of us who are spending day after day building great games and apps.  I hope we can all meet as a community soon.

 

Cheers,

 

DB


Lol, if you manage to get a mail from someone at Weta that says "We are using babylon.js and we like it", you can ask me WHATEVER you want :)

 

For your question, I think you do not need anything else than what you have... because what you are doing with your video tag can be REPLICATED with a VideoTexture:

http://www.babylonjs-playground.com/#256QWU#2

 

 

So as you can see, the videoTexture object exposes the .video HTML object, which in turn can be used to control the video :)

 

Isn't that cool?
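(For reference, controlling the exposed element looks like this; videoTexture stands for any BABYLON.VideoTexture instance.)

videoTexture.video.pause();          // freeze on the current frame
videoTexture.video.currentTime = 12; // seek to a time in seconds
videoTexture.video.play();           // resume the stream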

