
Babylon.js support for Samsung Galaxy S6 and Gear VR


dbawel


Hello,

I have a project to build on a short timeline which must support stereoscopic display on a Samsung Galaxy S6 phone paired with the Samsung Gear VR. I recall that last year I was in a dialog with several developers on this forum, including @JCPalmer and @Deltakosh, where it appeared we were considering dropping full support of the Oculus camera specifically and moving to a more generic stereoscopic camera which could be modified to support most any VR headset. I don't recall where we ended up, or if/how BJS continued support of the Oculus camera, but I quickly need to build a reasonably simple VFX project for multiple mobile devices (including Sony and other Android tablets, as well as iOS and iPad), and also to identify the Samsung S6 and switch to a stereoscopic camera which supports the Oculus stereo format. I'm also fine with building two different scenes: one for most mobile devices, and a separate scene for the Galaxy S6 paired with the Gear VR. Can anyone provide a code example which renders and displays stereo video on the S6 attached to the Gear VR, as well as the camera properties available for delivering the best stereo imagery to the Gear VR? I know our past discussions covered the essential settings for a generic stereo camera, such as convergence, divergence, parallax, and interaxial separation, but I don't recall defining any settings specific to the Oculus camera or the rendering settings needed for their stereo format.
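Since I won't have hardware until Thursday, here is the rough device-detection approach I have in mind for switching cameras. The "SM-G92x" model prefix for the Galaxy S6 family is an assumption on my part and needs verifying against a real device's user agent:

```javascript
// Hypothetical helper: decide which camera to build from the user agent.
// The Galaxy S6 family reports model numbers like "SM-G920F"; treating
// "SM-G92x" as the S6 marker is an assumption to verify on real hardware.
function isGalaxyS6(userAgent) {
  return /SM-G92\d/i.test(userAgent);
}

function pickCameraMode(userAgent) {
  // Stereo rendering for the S6 + Gear VR, a plain mono camera elsewhere.
  return isGalaxyS6(userAgent) ? "stereoscopic" : "mono";
}
```
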

If anyone has a sample scene and/or camera code which has been tested and is working with the Gear VR, it would assist me a great deal, as I won't need to spend time rediscovering what works best for the Oculus (Gear VR) stereo display and can focus all of my time on the scene. I have a tight schedule to produce a series of stereo effects and controls as a proof of concept, to show that Babylon.js is capable of rendering everything the client has spec'd out for the test and to demonstrate that Babylon.js is the best choice of framework to support both 2D and stereo cameras for future projects; specifically Oculus at this time, since they are currently working directly with Samsung.

Also, any assistance with supporting the Samsung Gear VR Bluetooth controller would be highly appreciated, as this is the other spec which I must deliver in a scene which works on most any mobile device (no problem there) plus the Oculus camera for the Gear VR and its controller. I have no previous experience using the Gear VR controller, and won't receive the S6, Gear VR, and controller until Thursday, with a presentation scheduled for this Monday. Not my call, but I don't see where there should be any problems other than my inexperience with actually rendering to the Gear VR and using their Bluetooth controller.
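For the controller, my current plan is simply to poll the standard Gamepad API each frame. The button and axis indices below are placeholders, since mappings vary per controller; the actual Gear VR pad layout is something I'll have to confirm on-device:

```javascript
// Sketch of per-frame gamepad polling. Button/axis indices are assumptions;
// the real Gear VR / Moga mapping must be checked on actual hardware.
function readPad(pad) {
  if (!pad) return null;
  return {
    select: !!(pad.buttons[0] && pad.buttons[0].pressed), // primary button
    x: pad.axes[0] || 0, // horizontal axis
    y: pad.axes[1] || 0  // vertical axis
  };
}

function pollGamepads(nav) {
  // nav is the browser's `navigator`; passed in so the logic is testable.
  var pads = nav.getGamepads ? nav.getGamepads() : [];
  for (var i = 0; i < pads.length; i++) {
    var state = readPad(pads[i]);
    if (state) return state; // first connected pad wins in this sketch
  }
  return null;
}
```
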

As always, thanks for any help and/or examples you might provide. Oh yeah - and please "wish me luck." :unsure:

DB


Hey David!

Good news: we support both WebVR devices (like the Oculus Rift) and device-orientation VR (like the Gear VR).

Here are the related cameras:

http://doc.babylonjs.com/classes/2.3/VRDeviceOrientationFreeCamera

http://doc.babylonjs.com/classes/2.3/WebVRFreeCamera

 

Regarding live demos, I think this one could be great: http://leblon-delienne.com/astroboy/index.html

Here are some playgrounds (PGs) done with the VR device camera:

http://doc.babylonjs.com/playground?q=VRDeviceOrientationFreeCamera
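A minimal setup sketch, assuming babylon.js 2.3 loaded as the global BABYLON; the camera name and starting position are arbitrary:

```javascript
// VRDeviceOrientationFreeCamera renders the scene side by side and steers
// from the phone's gyroscope -- no Oculus runtime needed on a Gear VR phone.
function setUpGearVrCamera(scene, canvas) {
  var camera = new BABYLON.VRDeviceOrientationFreeCamera(
      "vrCam",                          // arbitrary camera name
      new BABYLON.Vector3(0, 1.7, -5),  // roughly standing eye height
      scene);
  camera.attachControl(canvas, true);   // hook up device-orientation input
  return camera;
}
```
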

 


Hey @Deltakosh,

Thank you for the links! :) I haven't looked at this at all since our intense discussions concerning the VR camera late last year, and I'm not familiar with the Gear VR at all. I'm still a little worried about the auto-launch of the Oculus app when the phone is plugged into the Gear VR via the USB connection, which is necessary for the Gear VR's on-board controls, so I hope simply pairing a Bluetooth controller with the S6 will suffice. I don't want, and hopefully don't need, the Oculus app loaded at all when rendering using a Babylon.js stereoscopic camera. I believe I can mount a Samsung S6 without plugging in the USB connector; at least, this is what I've read online. I hope this won't be a problem when developing for the Gear VR outside of the Oculus API, and I'll let the community know what I discover specifically in my development using the Gear VR.

Again, thank you as always. FYI, we go into beta testing with Weta in early May - finally! At that time, I'll personally send you a link so you can see how they are using the app, as well as any feedback on how pleased they are that we were able to develop the app for them utilizing Babylon.js, and for other companies to use following the beta release. I know they will be happy to express their opinions about the power and flexibility within the BJS framework, and how the use of this app will save them considerable time and resources.

Cheers,

DB

And if anyone in the community has experience specifically with the Samsung Gear VR, and has any advice concerning development within WebGL for this device using Babylon.js, it is valued and appreciated. It appears to be simple, but nothing ever really is, is it? :huh:


Hey @Deltakosh

Do you see any problems with using the BABYLON.OculusCamera() from an older master JS script prior to 2.1? Perhaps this might allow me to spend less time modifying the current WebVR cameras to produce a simple demo in the few days I have to produce several demo / proof-of-concept scenes on multiple devices. However, I receive an "Uncaught TypeError" in my JS console when I try to reference any babylon-master.js files prior to version 2.1. Would I be able to avoid "launching" the Babylon.js framework when calling an older master file, and how might I do this? Or do you know how I might separate these functions into a separate script to reference, from which to call these functions?

Thanks,

DB


Hey! I have the Galaxy S6 and the Gear VR. I haven't focused on Babylon.js and the Gear VR too much, since I decided to use Unity for the VR experience, but I might be able to answer a few things.

So when you plug your phone into the Gear VR headset, it automatically launches the Oculus app. There is a trick that I have been using to avoid that; I do have the developer edition, so it might not work on the consumer version of the Gear VR. I basically flip the flap with the USB connection and squeeze the phone in so it doesn't use the USB (big pain in the butt).

BUT Samsung released an update not too long ago that makes it so much easier, which basically allows you to browse the web using the Oculus app. It's a lot of fun. So you will have access to the button controls on the Gear VR and won't have to do any of that nonsense ^^.

http://developer.samsung.com/technical-doc/view.do?v=T000000270L

Since you don't have the Gear VR yet, here is what browsing the internet on your Gear VR looks like:

Hello @jessepmason,

I had found this method online recently as well, and it appears to be a potential method I might deploy to bypass the auto-launch of the Oculus application when using my S6 with the Gear VR. So thank you for pointing this out and for sharing your experience with the Gear VR and Galaxy S6. If you have any further information concerning these devices and the use of Babylon.js for WebGL media development, it would be helpful, as I expect to receive my equipment today and am truly pushing the envelope in accepting a hard deadline of Monday to prove the BJS framework over all others as the very best choice for my client's pipeline.

And if you have ANY code which you might share here on this forum or in a personal message, it would help in so many ways for my task ahead, as the Gear VR and controller are the only two variables which may put me to the test at this time. Since the client is quite aware of the tight schedule for completing the presentation, success will place BJS as the best choice, above and beyond the many other reasons why Babylon.js is the undisputed best framework for what this company has planned for their media.

And @Deltakosh - I know that backwards compatibility has always been a key factor in the evolution of Babylon.js across the board; however, it appears that the only case where this is broken is in the support of scenes containing the BABYLON.OculusCamera(). Please let me know if there are any other legacy issues with older scenes in more recent babylon-master.js engines which may have either been overlooked or consciously chosen for discontinued support.

Otherwise, you know my personal view of Babylon.js as the very best development framework for WebGL, hands down, and we all thank you, @davrous, and the many others who have freely given their time and experience to create a tool which we all benefit from. :)

Cheers,

DB


That's pretty cool that Weta is looking to view all their 3D assets through the browser; if any framework, it's definitely Babylon! :P

I prefer plugging the phone into the USB now; that way you can take advantage of the picker in the center of the display to select things, etc. As for code, it's rather simple with Babylon, so I don't think there is really anything to point out to get you started. But there is probably someone with more experience using Babylon who might say differently.

I find mobile WebVR is great for a few things: watching 360 movies, panoramas; even browsing the internet is awesome! But using it with Babylon just isn't there yet; I haven't seen a Babylon or a three.js scene where I don't feel sick using it. So just note your VR experience isn't going to be perfect compared to making an .apk.

definitely would be interested in seeing whatever you come up with!


I made the stereoscopic camera / post-process. It took more time than I had wanted due to me breaking my clavicle (not during coding :)). @Vousk-prod. did not share his 3D quality-checking trick in time for me to really do anything with it. Not sure how much you could do to change the fragment shader anyway. If interested in trying to improve the results, you do not have to have the Gear; I was using a 3D TV, but you can test right from the desktop. My turnaround time was very poor. It did not help that I had to work out the process of using the TV at the same time as making the camera and test scene.

The other thing that came up afterward was a conflict with other post-processes. The way a 3D-rig post-process was initiated made it impossible to add others. This is probably not major to fix. Those with the toys might like to address this as well.


42 minutes ago, jessepmason said:

But using it with babylon just isnt there yet, I haven't seen a babylon or a three js scene where I dont feel sick using it.

Hi!

I'm curious about this one: what makes a VR experience sickening or not?
Could you expand on why WebGL doesn't work well yet? Is it a matter of FPS, or other sorts of issues?

I found this article interesting:
http://arstechnica.com/gaming/2015/09/valve-blames-developers-for-lingering-vr-nasuea-issues/

There must be more in-depth guides out there, though everything is still very new, I haven't had the chance to try a GearVR yet :-/


Note that this is just my experience with mobile WebVR; using it with an HTC Vive or Oculus Rift is more than likely different, since those are tethered to a PC with a lot more power. In my particular Babylon case, even with a simple scene like a cube, it still lags behind; I don't know the exact issue :(

What makes a VR experience sickening or not? There are a lot of factors, but basically any rotation not initiated by the user rotating their head will make you sick; and yeah, FPS.

I would recommend getting a gear vr and trying it out, you will know what I am talking about right away.


I'm an Apple boy, but this Gear VR thing makes me wanna switch >_<

I've read articles about how those VR sets require huge computers; Oculus hasn't released a devkit for Mac yet because "none of the existing Macs can handle it right now".

And I was thinking: are you serious? Just use BABYLON.js, apply a video texture inside a sphere mesh, add a VR camera, and you're done with the 360° vid!
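That recipe can be sketched in a few lines. This is only a sketch, assuming babylon.js 2.x as the global BABYLON; `videoUrl` is a placeholder for an equirectangular video file:

```javascript
// Build an inward-facing sphere and map a 360° video onto it.
function makeVideoDome(scene, videoUrl) {
  // BACKSIDE flips the normals so the texture is visible from inside.
  var dome = BABYLON.Mesh.CreateSphere(
      "videoDome", 32, 100, scene, false, BABYLON.Mesh.BACKSIDE);
  var mat = new BABYLON.StandardMaterial("videoMat", scene);
  mat.diffuseTexture = new BABYLON.VideoTexture("video", [videoUrl], scene, true);
  mat.emissiveColor = new BABYLON.Color3(1, 1, 1); // ignore scene lighting
  dome.material = mat;
  return dome;
}
```
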

But things might be a little more complicated, I guess... I'll have to try my own nausea soon, to get the idea :-)


Hey @meteoritool,

I've had the very good fortune of working with VR and AR for entertainment, the military, and even a medical-industry application for virtual surgery out of UCSD here in CA, which developed the Da Vinci robotic remote surgical device (which I did not personally contribute to). My good fortune came from my "expertise" in developing some of the first real-time motion-capture systems (I don't want to sound or appear to place myself as an expert in my own mind, but only say this from how others have referred to me in past publications). This foray into real-time mocap directly led to work with AR, VR, and haptics at a time when these technologies were in their infancy - to the point where I was often criticized for my strong belief that motion capture would become standard practice in entertainment as well as many other industries. But the intense laughter I experienced in lectures long ago began to change tone as the audiences grew.

There are many factors which cause individuals who are prone to nausea to experience equilibrium illness when using current VR/AR technologies. Current AR devices are found to lessen negative reactions in such individuals; however, it is the physical construct of the current devices which causes nausea, and not anything specific to the VR genre, generally speaking. This is because we are literally "tricking" our brains into believing we are immersed in a three-dimensional world, convincing them we are in three-dimensional spaces using mostly two-dimensional elements. Thus there is a great amount of sensory data missing when using current 2D VR and AR devices. Although our brains are quite good at interpreting two-dimensional stereoscopic imagery, somewhat mimicking the light our eyes project onto the retina, the brain is left to make the best sense it can of what appears to be placement and action within three-dimensional space.

However, as this is a very unnatural process which only somewhat mimics three-dimensional space by using 2D displays to trick our brains, the lack of additional information from an object and its light sources, along with many other factors, such as how our brains construct a representation of our periphery and the speed at which this is processed, contributes to our feelings of nausea. There is a ton of information online which thoroughly explains why current VR technology is a "trick" on the brain and can be unhealthy for our eyes and brains in many ways. If you are developing any VR and/or AR applications, familiarizing yourself with the fundamentals of how our eyes and brains process eyesight to represent our physical world spatially, as well as how our brains process color information, will definitely help you build better VR and AR experiences, especially given the limitations of the devices available today and in the near future.

However, everything changes within the next 5 years, as the next generation of VR/AR devices project the equivalent of natural light, almost to a precise representation of light as it behaves in the real world. If you take the time to search through several other posts I've written on this forum over the past 3 years or so, you'll find that I cover such devices, and how they differ from what is currently in production for the consumer markets, in far more detail. And you might possibly understand why a major leader in feature-film production, whom all of you would know - I'm not here to throw names around, so I won't - told me directly, and I quote (almost precisely), "this new device will change the world in the very same way the invention of radio did." At first I didn't truly believe him, even though he has always been ahead of the rest of the world in adopting successful technologies, and this person rarely makes such a bold statement. Also, having previously worked with Ray Kurzweil (there, now I'm throwing a name around, but this one's an important figure), who as a "true" futurist is always making predictions, and most of the time is entirely correct, with the possible exception of the time required for them to become reality - as a very intelligent person, he also agrees with my colleague in feature film.

But as I state in other posts, I'm legally not permitted to share how this technology works. However, as one of the very fortunate few individuals who have been offered the opportunity to use one of the only two existing prototypes of this "light field" projection technology, I can say with certainty that this device specifically will change the world as we know it, in a very similar if not more profound way than the invention of radio did.

And so, to summarize an answer to your question concerning nausea or any illness from using VR devices: this is a very real "reality" with current technology and the devices available now and in the near future, including the Microsoft HoloLens, and it will remain an underlying and sometimes serious side effect for many users of VR and AR until the newer "light field" projection technology is released. Once this sea change in display technology arrives, there will be no further physical reason for nausea or illness to occur, as your equilibrium and other sensory nerves will be naturally stimulated, as will spatial processing in the brain. And the human brain will not be capable of differentiating between reality and simulation - which opens up whole new areas for discussion, as I'm sure all who read this can imagine. But what an amazing world it will be. ;)


Hi @JCPalmer,

I recall when you were working on the new camera late last year, and we got into a few really good discussions. I also remember your mention of breaking your clavicle, which I hope has now healed correctly. I've had my share of breaks due to riding motorcycles, and the clavicle was one of the most painful and difficult to heal without any change in its physiology once healed. Of course, I was only 15 years old at the time, so it was much easier for me to heal - and I might assume you're a bit older than 15, judging from your expertise in development languages.

We all thank you for the hard work and time you put into building the VR and WebVR cameras (I believe you built both). I doubt many people could have accomplished this with the functionality you provide, as I know my abilities aren't anywhere near what was required to implement this as a useful tool, along with all of the other dependencies which must be considered.

Thanks for the advice - I listened and have been working on the Oculus camera already today, since, as you mentioned, much of the work can be done on a desktop, as we're only talking about stereo cameras. And as you also mentioned, I'm most concerned about conflicts with other post-processes. I only have a couple of days to implement and showcase scenes, effects, and elements which are either difficult to implement in 3JS, or not possible with the desired flexibility and/or quality. But I'm well on my way, and FedEx should be ringing my doorbell any time now :rolleyes: - me listening for the bell to ring, just like Pavlov's dog.

And now good news for me, as I just received an email letting me know that a key member of my client's team may not be able to make the presentation Monday, so a postponement of a few days is likely :D This will assure that I will convince them to choose Babylon over 3JS. I certainly have my preference, but am working with my development partner to present an unbiased view of both frameworks so the client's "tech guys" can decide which they feel is best for their needs. We already know the answer, but will give 3JS a fair chance as well. However, BJS wins hands down - and for more reasons than I could possibly list in this post, even using the entire rest of the day to do so.

Again, thank you for your advice, and all your work in assisting with the development of the framework as well. I am still amazed at the speed at which BJS is still evolving, how well it is designed and built, and how easy it is to modify many functions when needed.

Cheers,

DB


12 hours ago, dbawel said:

your equilibrium and other sensory nerves will be naturally stimulated as well as spatial processing by the brain. And the human brain will not be capable of differentiating between reality and simulation

The secret technology sounds like LSD, lol! :lol:
Now I'm trying to guess what it could be ;-) A screen that is not flat, maybe? Instead of a "plane" screen, have a "cube" screen?
Will it change the way we develop 3D worlds with BABYLON.js or other 3D game engines?

Thanks for your interesting answer! I like the sci-fi feeling in all that stuff :-) Plus my imagination now runs wild >_<
A movie worth seeing about that "technology": http://www.imdb.com/title/tt1821641/


@Deltakosh - Yes, I understood why it was removed; however, do you see any issues with making use of it for a quick demo, when there is little time to get everything working?

@meteoritool - I can tell you a bit more if I'm careful, which may make this "sci-fi" and magical-sounding display device make better mental sense, although even this limited info will propel the device into a world which is difficult to conceive. It is a "series" of multiple projectors within a single headset, so small that the human eye cannot natively see many of its components without the aid of magnification. And these project natural light waves directly onto the retina of the eye - not to simulate natural light, but to produce natural light. This is why your brain will be practically unable to distinguish light from the device(s) from light in the natural world. So get ready, as the world is truly about to change - just imagine what this will mean.

And if you want to watch an incredibly funny movie which was more absurd than any other at the time of its release, take a look at "Idiocracy." As I mentioned, it was absurd at the time of release, but pay attention when watching again, as it is beginning to look more like a documentary these days - from the devices people use to display information, to the Taco Bell vending machines (which are already being tested in some cities - not Taco Bell, but a large fast-food chain), all the way to the current state of politics. What I used to find funny is now far more than simply frightening :huh: But at least the display technology will be awesome!

DB


@dbawel

- First, I did not create those cameras. I am actually a destroyer of cameras. When the Oculus stuff was being changed over to VR, there were all these new cameras being created. I refactored anaglyph and all the VR cameras right into the Camera base class. I also added the 2 stereo rigs. This means that every camera can be any type of 3D rig, with the possible exception of virtual joystick. This is what I would stress to the Magic Leap people, I assume. There are still a lot of camera classes, but many are just wrappers. Even more rigs could be added, and all cameras would get them.
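In other words, any camera can be put into a rig with one call. A sketch; the mode constant and parameter name are from the 2.x API as I understand it, and the interaxial value is just an example to tune:

```javascript
// With rigs on the Camera base class, any camera can become a stereo rig.
function enableSideBySide(camera) {
  camera.setCameraRigMode(
      BABYLON.Camera.RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_PARALLEL,
      { interaxialDistance: 0.0637 }); // eye separation in world units; tune it
  return camera;
}
```
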

- 2nd, I started to update my rig test scene. I was tired of that flying-carpet thing. Moving images actually make it more difficult to use Vousk's method of evaluating 3D quality. The test is done by going into side-by-side mode and putting a horizontal ruler across (using another window). Stuff should line up if it's good. In my tests this seems really close, so I think the lack of working with other post-processes is all that I need to fix.

[Screenshot: Selection_182.png]

- 3rd, in addition to failing the newly added post-process test, the buttons are very hard to click in 2.4. I went back to 2.3, and things work fine. It is a pain: in 2.4 clicking only works sporadically. You get the hand cursor, but the click does not register. Any ideas, @Deltakosh?


Yeah, but not always. Others fail at different times too. I have a dedicated Dialog test scene. I plan on trying to switch that to 2.4. It has an arc-rotate "scene" camera too, but no sub-cameras for the 3D rigs.

Right now I am running experiments trying to support post-processes with rigs by adding the "user" post-process to the sub-cameras before the "3D" internal ones in the chain. VR and anaglyph work, but the stereo modes do not. The first camera is not being passed to the interlacer camera. The first camera uses passPostProcess. BTW, what is displayPassPostProcess?
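The experiment roughly looks like this. A sketch only: `_rigCameras` is an internal field (the assumption is that it stays populated once a rig mode is set), and black-and-white stands in for whatever "user" post-process you actually want:

```javascript
// Attach a user post-process to each rig sub-camera so it runs before the
// internal "3D" rig pass. _rigCameras is internal API and may change.
function addUserPostProcess(camera) {
  var targets = (camera._rigCameras && camera._rigCameras.length)
      ? camera._rigCameras
      : [camera]; // no rig active: attach to the camera itself
  return targets.map(function (sub) {
    return new BABYLON.BlackAndWhitePostProcess("bw", 1.0, sub);
  });
}
```
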

None can be removed correctly yet.


@Deltakosh - Thanks for the optimism, as I believe I can get a scene working before Monday. 

@JCPalmer - As always, your posts are to the point and generally answer what I require to move into areas of Babylon.js I've yet to develop in. As for supporting Magic Leap, this will require a camera which has really never been developed before, as the rendering process is completely the reverse of raytracing. What I'm currently attempting to show them is that, in my opinion, Babylon.js is an ideal framework for most every aspect of 3D scene development, animation, etc., up to the point of rendering - excluding lights, which will require completely different calculations from how BJS currently computes surface illumination. The required rendering can be generally described as the reverse of firing rays from a camera plane (I'm not saying that BJS uses raytracing): supporting Magic Leap display technology will require a process where (in a very simplistic description) rays are fired from each light source and are absorbed and reflected as they are gathered from each surface area of a scene object.

And thank you for changing to the current VR camera, as even though I opted for the Gear VR headset, if I don't plug my S6 into the USB port of the headset, I'm not locked into the Oculus format - so I have options to develop for quality alone, and not for any specific licensed format.

As Magic Leap will most likely be using the Android OS as well as Qualcomm processors - which is what many Android tablets and phones currently use - and they will require developers to produce loads of content, an open-source development framework such as Babylon.js is ideal; and this framework is the most flexible and adaptable, in my opinion and from personal experience. Just to qualify the above: it is strictly my assumption that the Android OS powered by Qualcomm processors will ultimately be the hardware and OS they ship with their devices, along with support for existing Android devices in the initial stages of the release of their first AR/VR headset display - as Google and Qualcomm are the largest shareholders of Magic Leap aside from founder Rony Abovitz and a few members of the company's board.

And a question for both of you: are there any issues with the latest version of the babylon-master.js file(s) when developing with the VR camera? Also, when I test the "3D Formats" rig test scene by @JCPalmer on my devices, across several OSes, I find that I can always choose any camera as a first choice in the dialog GUI. However, once a choice is made, I'm only able to choose between "OverUnder", "VR", and "None" as rig types, with a very rare exception of "Anaglyph" on occasion - with no pattern I can find as to when this might be selectable. I don't know if this will help, but I had the exact same issue using the bGUI extension for an interface, and the solution was to create a different camera and camera type for the GUI elements - that change fixed the problem for me in every case.

One more thing - I've now received all of the components for displaying VR scenes, having had my choice of practically all available headsets and hardware. I chose the Samsung Gear VR, the Samsung S6 phone, and the Moga Pro Bluetooth controller. I almost bought the Samsung S7, but after considerable research, I found that the S6's WebGL graphics performance is ideal for most any scene, even though the S7 has a considerably faster CPU. In benchmark testing, graphics processing and display performance do not improve in direct relation to the improved processor speed, as most every other component, including RAM, is exactly the same. Today I used several benchmark applications, including the AnTuTu Benchmark app which I generally rely on for mobile device benchmarks, and the S6's performance benchmarks were almost the same as my Sony Xperia tablet's; in several of the 3D and HTML5 benchmark tests it outperformed my tablet, which is still rated as one of the highest on the market. So I personally highly recommend the devices I chose for VR development at this time, as I spent countless hours researching configurations, specs, and sellers to find what I believe works best for VR development in BJS currently. The S6 will clip into the Gear VR headset without having to plug into the USB port, which would force the phone to run the Oculus app and make development a whole lot more difficult for me. The Moga controller functions perfectly, as does the native Samsung controller, when set to the "B" pairing setting. And if you want to use the Oculus app and the items from their online store, then simply plug the phone into the Gear VR USB port.

Also, if you purchase any Samsung or other smartphone through a service provider, or, as in my case, through an online seller because you require an unlocked phone, make certain the seller is completely reputable. In searching for the best prices online, I found more fake S6 and S7 phones than real ones, as the market is flooded with S4 phones which fit nicely into S6 and S7 cases. And although there are many good videos online showing how to determine whether your phone is a fake, I found several sources for phones that now don't fail any of those tests - and these changes have only appeared in the last few months. So no matter where you purchase your phone, always use one of the various apps to verify that it is not a fake. The AnTuTu Benchmark app will do this for practically any smartphone, as will the "Genuine Galaxy" app - and there are others as well; however, there are also apps produced by the fake-phone manufacturers themselves. I can guarantee from experience that the two apps listed above are genuine. This has been a HUGE problem for almost 2 years now, and millions - yes, millions - of people worldwide have been burned by these companies, all out of China, where the real phones are also made. My new S6 was manufactured in Korea, and I'm not aware of any fakes from Korea yet, but I don't trust any smartphone seller these days.

I'm not a fan of current VR displays, but this is an area which is exploding right now, with lots of jobs soon to be available for developers with VR experience, as well as a huge initial market with a current void of games and content - so sales will certainly be higher for practically any game released around or after Christmas this year, 2016. It's well worth learning all of the differences you need to consider in developing for VR, even if it's simply to gain the experience, if you intend to work as a game developer in the future; at least in my opinion. And right now, you can purchase all of the equipment I'm using for about $400, if you are OK with a used S6. Just be very careful, and purchase through Amazon, as you'll find good prices, and they will refund if it's a fake knock-off phone.

DB

