Search the Community

Showing results for tags 'texture'.



Found 181 results

  1. I am having a really weird issue. I am creating a sprite from an image as follows:

         var grid1_texture = new PIXI.Texture.fromImage("whiteSquare.jpg");

     This works fine and I am able to add it to the stage. I basically want to keep all my images in a folder called resources, so I changed the code to this:

         var grid1_texture = new PIXI.Texture.fromImage("/resources/whiteSquare.jpg");

     When I do this, I can see in the Brackets editor that the link is being made; however, the sprite does not get rendered on the stage. Please help!
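     A hedged sketch of a likely fix: a leading slash resolves against the server root, so if the page is served from a subdirectory the request can fail silently. Preloading with PIXI.loader (Pixi v3/v4) and a relative path also guarantees the texture is decoded before it reaches the stage:

         // Minimal sketch, assuming Pixi v3/v4 and the file at resources/whiteSquare.jpg
         PIXI.loader
             .add("whiteSquare", "resources/whiteSquare.jpg") // relative path, no leading slash
             .load(function (loader, resources) {
                 var sprite = new PIXI.Sprite(resources.whiteSquare.texture);
                 stage.addChild(sprite); // "stage" is whatever container you render
             });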
  2. Hello BJS devs, what are BabylonJS's plans for WebGL 2 support? Namely, how do you envision supporting WebGL 2 alongside backwards compatibility? I'm not super experienced with game development, but I've been playing around with a toy project and I need to use binary 3D textures. I've made some changes to BJS to support 3D textures, but I'm not familiar enough with the codebase to be sure this is the correct way of doing it. https://github.com/BabylonJS/Babylon.js/compare/master...wongwill86:texture3d?expand=1 Any feedback is much appreciated, and I would be happy to make changes / open a PR if this seems like the correct approach. Also, if there is already a branch that supports this correctly, even better. Thanks!
  3. Hi, I'm having a serious problem with memory usage: sometimes it hits 2.6 GB and never goes lower. I have to load textures that vary from 10 MB to 40 MB for each map, and I'm sure that when I switch between maps these textures are not being removed from memory, so the problem grows every time you switch maps. To test the issue, I created a 186 MB image and loaded it into PIXI; I got a big black texture throwing a lot of WebGL errors, maybe because it's too big? Anyway, I noticed in the task manager that RAM consumption grows a lot, so I started my attempts to remove the texture from memory, but I could not do it. I tried the following:

         texture.destroy(true);
         texture = null;

     But I got nothing, so I did this (to throw everything away):

         for (key in PIXI.utils.TextureCache) {
             PIXI.utils.TextureCache[key].destroy(true);
         }

     But the memory still remains the same size. After calling destroy(true) I ran the GC; the RAM lowered a bit, but I could still see the 186 MB texture loaded. What must I do to remove it from there?
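     A hedged sketch of a fuller cleanup (Pixi v4 naming; older builds differ): destroy(true) releases the GPU side, but the cache entries themselves also need deleting, and Pixi keeps a separate BaseTextureCache. Note the browser may hold the decoded image a while longer, so task-manager RAM lags behind the actual release.

         function unloadMapTextures(keys) {
             keys.forEach(function (key) {
                 var tex = PIXI.utils.TextureCache[key];
                 if (tex) {
                     tex.destroy(true); // true also destroys the BaseTexture (GPU memory)
                     delete PIXI.utils.TextureCache[key];
                 }
                 delete PIXI.utils.BaseTextureCache[key]; // cached separately from textures
             });
         }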
  4. Hi, I'm a noob. I made a texture creator tool using this tutorial ( http://phaser.io/examples/v2/create/gen-paint ). It's cool, but I don't know how to set an animation on a texture generated that way. Could you please let me know how to set an animation on a texture built from an array of strings?
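     A hedged sketch (Phaser 2): game.create.texture writes each pixel-art frame into the cache under its own key, and a timer can then cycle the sprite through those keys with loadTexture. The frame data arrays below are placeholders for your own strings.

         var frames = [frameData0, frameData1, frameData2]; // arrays of palette-index strings
         frames.forEach(function (data, i) {
             game.create.texture("anim" + i, data, 6, 6, 0); // 6x6 pixels, palette 0
         });
         var sprite = game.add.sprite(100, 100, "anim0");
         var frame = 0;
         game.time.events.loop(150, function () {
             frame = (frame + 1) % frames.length;
             sprite.loadTexture("anim" + frame); // swap to the next generated texture
         });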
  5. Hey, I just upgraded to OSX Sierra and this bit of code does not work anymore in Chrome:

         /* --- Skybox --- */
         var skyboxObject = BABYLON.Mesh.CreateBox("skyBox", 10000.0, scene);
         var skyboxMaterial = new BABYLON.StandardMaterial("skyBox" + sceneKey, scene);
         skyboxMaterial.backFaceCulling = false;
         skyboxMaterial.reflectionTexture = new BABYLON.CubeTexture(taskObject.ASSETS_FOLDER + "/textures/fantasy/Sky", scene, ["_px.png", "_py.png", "_pz.png", "_nx.png", "_ny.png", "_nz.png"]);
         skyboxMaterial.reflectionTexture.coordinatesMode = BABYLON.Texture.SKYBOX_MODE;
         skyboxMaterial.diffuseColor = new BABYLON.Color3(0, 0, 0);
         skyboxMaterial.specularColor = new BABYLON.Color3(0, 0, 0);
         skyboxObject.material = skyboxMaterial;
         skyboxObject.rotation.x = Math.PI;

     I get only one side of the cube, and no error in the console. I will try to get more information on this; I have been swamped these past few days, sorry. This bug is not present in Safari.
  6. Hi there, I'm fairly new to Babylon.js, having previously worked mostly with OGL and Apple's SceneKit (similar to Babylon) outside the browser. As such, I'm probably overlooking something absolutely basic, so please bear with me if I ask a stupid question. I'm tasked with bringing a 3D visualisation app to the web that until now only runs natively on a workstation. One task that I assumed should be simple involves materials. I can, and did, assign file-based images to materials (diffuse and specular maps), and this works really nicely. But after looking through the extensive (thank you!) documentation, I'm slightly at a loss as to how to achieve the following: the original (non-web) application usually takes a client-provided image (file, usually TIFF or PDF) and then applies a filter to that image to generate the specular map. This generation process can be quite involved and may include importing other client-provided imagery. I have written some JavaScript code that imports the file into an HTML5 canvas, runs the filter, and then creates a JavaScript Image object from the result as follows:

         function convertCanvasToImage(canvas) {
             var image = new Image();
             image.src = canvas.toDataURL("image/png");
             return image;
         }

     So far, so good. Now, for the life of me, I don't know how to get this image into a material property; at least not short of writing it to a file somewhere and then loading it - that can create a ton of issues and I'd like to avoid it. There is brilliant support for a file-based image that takes a string as the name, e.g.:

         materialMain.diffuseTexture = new BABYLON.Texture("textures/owl90.png", scene);

     But I have found no way to do the same with a JavaScript object of type Image. I have looked at procedural textures that can be based on image files, but their setup appears a bit too inflexible (relying on config files) and heavy-calibre (animations, shaders) for something that I assume to be basic. So, how can I load an image / HTML5 canvas (basically an RGBA raster image) into a Babylon texture? I can't be the first poor sod to try this - what am I overlooking? Thanks for any help, -ch
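     A hedged sketch: BABYLON.DynamicTexture wraps a canvas of its own, so the filtered result can be drawn straight into it and uploaded with update() - no intermediate file or Image object needed. The {width, height} options form is assumed to be available in your Babylon version.

         function textureFromCanvas(name, sourceCanvas, scene) {
             var tex = new BABYLON.DynamicTexture(name,
                 { width: sourceCanvas.width, height: sourceCanvas.height }, scene, false);
             tex.getContext().drawImage(sourceCanvas, 0, 0); // copy the filtered pixels in
             tex.update(); // push the canvas contents to the GPU
             return tex;
         }
         materialMain.specularTexture = textureFromCanvas("specMap", filteredCanvas, scene);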
  7. I need to put thousands of shapes onto the stage, and currently I am using PIXI.Texture. I draw a PIXI.Graphics object and then convert it into a PIXI.Texture using RenderTexture.create(). Then I get a sprite from the texture. But when I draw 3000+ objects on my canvas, the FPS drops from 60 to 30; with 6000+ objects everything slows down and my laptop's fan is roaring. I found an article at https://stackoverflow.com/questions/23468218/draw-10-000-objects-on-canvas-javascript saying that in pure canvas JavaScript you can stack several layers/canvases on the screen and draw, say, 1000 objects per layer. But what is the correct way to do this in PIXI?
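     A hedged sketch (Pixi v4 naming; in v3 the class lives at PIXI.ParticleContainer): a ParticleContainer batches thousands of sprites that share one BaseTexture far more cheaply than an ordinary Container, which is usually the first fix for this kind of FPS drop.

         // shapeTexture is the texture you already build with RenderTexture.create()
         var container = new PIXI.particles.ParticleContainer(10000, { position: true, rotation: true });
         for (var i = 0; i < 6000; i++) {
             var s = new PIXI.Sprite(shapeTexture);
             s.x = Math.random() * 800;
             s.y = Math.random() * 600;
             container.addChild(s);
         }
         stage.addChild(container);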
  8. WebGL supports textures with base64 data URLs, following the format "data:[mediatype];base64,[base64string]". However, when creating a texture with a data URL in Babylon.js:

         var material = new BABYLON.StandardMaterial("0", scene);
         material.diffuseTexture = new BABYLON.Texture(dataUrl, scene, false, false, BABYLON.Texture.CUBIC_MODE);

     the browser console shows the message:

         Uncaught TypeError: Cannot read property 'replace' of null

     Is there something I am doing wrong or is it a Babylon bug? I attached my full test code. Thanks! index.html
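     A hedged sketch: one documented route for base64 data is Texture.CreateFromBase64String, which takes an explicit non-null name (the "replace of null" error suggests a null name/url is being processed somewhere).

         var material = new BABYLON.StandardMaterial("mat0", scene);
         material.diffuseTexture = BABYLON.Texture.CreateFromBase64String(
             dataUrl,     // the full "data:image/png;base64,..." string
             "myTexture", // a name is required and must not be null
             scene
         );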
  9. Hello everybody! I am trying to make a fade animation for a text plane, with the text drawn on a texture. I created a dynamic texture, then created a plane, and set the texture as textPlane.material.diffuseTexture. The texture has hasAlpha set to true.

         this.textPlaneTexture.drawText(text, null, 50, 'bold 100px Roboto Mono', 'pink', 'transparent');

     So the text is drawn on the screen with a transparent background. Next I created the fade animation like this:

         var fading = new BABYLON.Animation.CreateAndStartAnimation('fade' + this.text, this.textPlane.material, 'alpha', 30, 30, 1, 0, 0, null, () => {
             this.textPlane.isVisible = false;
         });

     But here's the problem: while the alpha is changing, the background also goes from black to transparent. Why is that? How do I prevent the background alpha from changing? Only the text's alpha is expected to change. Here's the playground: http://babylonjs-playground.azurewebsites.net/#28LOAX
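     A hedged sketch: on a StandardMaterial, an opacityTexture multiplies with material.alpha, so pointing the opacity slot at the same dynamic texture should keep the background at alpha 0 the whole time while the text pixels fade.

         // before starting the animation
         this.textPlane.material.opacityTexture = this.textPlaneTexture; // alpha follows the text pixels
         // material.alpha (animated 1 -> 0) now multiplies the texture's own alpha,
         // so the transparent background stays transparent throughout the fade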
  10. Is there any way to load textures in pixi synchronously? Something like:

         var texture = PIXI.Texture.fromImage("bunny.png", onComplete);
         function onComplete() {
             var bunny = new PIXI.Sprite(texture);
         }

      I tried the following:

         var img = new Image();
         img.onload = onComplete;
         img.src = 'bunny.png';
         function onComplete() {
             var texture = new PIXI.Texture(new PIXI.BaseTexture(img));
         }

      But the code above gives me an error. Please let me know how I can make pixi work synchronously. Thank you in advance. EDIT: I also tried http://jsfiddle.net/8MawM/ but it also gives me an error.
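      A hedged sketch: image decoding can't be made synchronous in the browser, but the BaseTexture emits a 'loaded' event (Pixi v3/v4) that serves the same purpose as a completion callback.

         var texture = PIXI.Texture.fromImage("bunny.png");
         if (texture.baseTexture.hasLoaded) {
             onComplete(); // already cached and decoded
         } else {
             texture.baseTexture.on("loaded", onComplete);
         }
         function onComplete() {
             var bunny = new PIXI.Sprite(texture);
             stage.addChild(bunny);
         }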
  11. I am trying to create a fragment shader via a PIXI.AbstractFilter to create a wave rippling effect to be applied to a background texture. I have already worked out the algorithm for the wave effect in JavaScript. What I am having difficulty doing is getting the data I need into the shader through PIXI. For my effect to work, I need a large Float32Array to keep track of wave heights, and a texture containing the original, unaltered contents of the background image to read from, in order to apply the pixel-displacement effect (light refraction). I've been doing a lot of searching and have come up with some partial solutions. I attempt to load my large Float32Array into the shader as a texture with type GL.FLOAT (via the OES_texture_float extension) and an internal format of GL.LUMINANCE, and read from it. From what I can tell, my shader isn't receiving my data the way I need it to. Just as a test, I set gl_FragColor to read from my data texture, and instead of the solid black that should have appeared, it rendered a color from either the source texture or the texture of the sprite that the filter is applied to. If I weren't using PIXI, what I would try next is gl.getUniformLocation, but it takes the current program as its first parameter, and I don't know of a way to access that. The basic flow of my shader needs to be: read from array -> calculate displacement based on value -> render the current fragment as the color at (x+displacement, y+displacement) -> get updated version of array. This is my code in the constructor for my shader:

         ws.Shader = function(tex) {
             // GLSL Fragment Shader for Wave Rendering
             ws.gl = game.renderer.gl;
             ws.flExt = ws.gl.getExtension("OES_texture_float");
             var unis = {
                 dataTex: { type: "sampler2D", value: ws.gl.TEXTURE1 },
                 canvasTex: { type: "sampler2D", value: ws.gl.TEXTURE2 },
                 mapSize: { type: "2f", value: [ws.width+2, ws.height+2] },
                 dispFactor: { type: "1f", value: 20.0 },
                 lumFactor: { type: "1f", value: 0.35 }
             };
             var fragSrc = [
                 "precision mediump float;",
                 "varying vec2 vTextureCoord;",
                 "varying vec4 vColor;",
                 "uniform sampler2D uSampler;",
                 "uniform sampler2D dataTex;",
                 "uniform sampler2D canvasTex;",
                 "uniform vec2 mapSize;",
                 "uniform float dispFactor;",
                 "uniform float lumFactor;",
                 "void main(void) {",
                 "    vec2 imgSize = vec2(mapSize.x-2.0, mapSize.y-2.0);",
                 "    vec2 mapCoord = vec2((vTextureCoord.x*imgSize.x)+1.5, (vTextureCoord.y*imgSize.y)+1.5);",
                 "    float wave = texture2D(dataTex, mapCoord).r;",
                 "    float displace = wave*dispFactor;",
                 "    if (displace < 0.0) {",
                 "        displace = displace+1.0;",
                 "    }",
                 "    vec2 srcCoord = vec2((vTextureCoord.x*imgSize.x)+displace, (vTextureCoord.y*imgSize.y)+displace);",
                 "    if (srcCoord.x < 0.0) {",
                 "        srcCoord.x = 0.0;",
                 "    }",
                 "    else if (srcCoord.x > mapSize.x-2.0) {",
                 "        srcCoord.x = mapSize.x-2.0;",
                 "    }",
                 "    if (srcCoord.y < 0.0) {",
                 "        srcCoord.y = 0.0;",
                 "    }",
                 "    else if (srcCoord.y > mapSize.y-2.0) {",
                 "        srcCoord.y = mapSize.y-2.0;",
                 "    }",
                 /*"    srcCoord.x = srcCoord.x/imgSize.x;",
                 "    srcCoord.y = srcCoord.y/imgSize.y;",*/
                 "    float lum = wave*lumFactor;",
                 "    if (lum > 40.0) { lum = 40.0; }",
                 "    else if (lum < -40.0) { lum = -40.0; }",
                 "    gl_FragColor = texture2D(canvasTex, vec2(0.0, 0.0));",
                 "    gl_FragColor.r = gl_FragColor.r + lum;",
                 "    gl_FragColor.g = gl_FragColor.g + lum;",
                 "    gl_FragColor.b = gl_FragColor.b + lum;",
                 "}"];
             ws.shader = new PIXI.AbstractFilter(fragSrc, unis);

             // Send empty wave map to WebGL
             ws.activeWaveMap = new Float32Array((ws.width+2)*(ws.height+2));
             ws.dataPointerGL = ws.gl.createTexture();
             ws.gl.activeTexture(ws.gl.TEXTURE1);
             ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
             // Non-Power-of-Two Texture Dimensions
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
             ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width+2, ws.height+2, 0, ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

             // Send texture data from canvas to WebGL
             var canvasTex = ws.gl.createTexture();
             ws.gl.activeTexture(ws.gl.TEXTURE2);
             ws.gl.bindTexture(ws.gl.TEXTURE_2D, canvasTex);
             // Non-Power-of-Two Texture Dimensions
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
             ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
             ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.RGBA, ws.gl.RGBA, ws.gl.UNSIGNED_BYTE, tex.imageData);
         }

      I then attempt to update dataTex in the ws object's update loop:

         ws.activeWaveMap.set(ws.outgoingWaveMap);
         // WebGL Update
         ws.gl.activeTexture(ws.gl.TEXTURE1);
         ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
         /* // Non-Power-of-Two Texture Dimensions
         ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
         ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
         ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE); */
         ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width+2, ws.height+2, 0, ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

      I'm sure that plenty of this isn't right, but I believe I can sort things out once I get to the point where I can actually access my data. Can anyone point me in the right direction? This is central enough to my project that I am willing to discard PIXI altogether if there isn't a way to implement what I am trying to do. Also, I am using PIXI via Phaser, if that makes a difference. Thanks!
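      A hedged sketch of the likely missing piece: in the Pixi v2 that Phaser bundles, a sampler2D uniform takes a PIXI.Texture as its value and Pixi binds the unit itself; raw gl.TEXTURE1 enums are not what its uniform-sync code expects. A canvas-backed texture only gives 8-bit RGBA, so float wave heights would need packing into bytes - this shows the wiring only, not the byte-packing. Verify baseTexture.dirty() as the re-upload flag against your Pixi build.

         var dataCanvas = document.createElement("canvas");
         dataCanvas.width = ws.width + 2;
         dataCanvas.height = ws.height + 2;
         var dataTexture = PIXI.Texture.fromCanvas(dataCanvas);

         var unis = {
             dataTex: { type: "sampler2D", value: dataTexture }, // a Texture, not a unit enum
             mapSize: { type: "2f", value: [ws.width + 2, ws.height + 2] }
         };

         // each frame, after writing new wave data into dataCanvas:
         dataTexture.baseTexture.dirty(); // flag the canvas for re-upload to the GPU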
  12. What if I only want to optimize the textures in a scene?
  13. I want to process images on the client side with JIMP (an awesome library for image processing). I read a PNG file with Jimp and modify it with Jimp, and then I got stuck, because I don't know how to display it in pixi.js. The image read by Jimp has a bitmap property, but when I try to create a texture from it (with PIXI.Texture.from, for example) I get errors. Thanks in advance!
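      A hedged sketch: Jimp's raw bitmap buffer isn't a format Pixi understands directly, but Jimp can emit a base64 data URL, which Texture.from accepts (the file name and processing step are placeholders).

         Jimp.read("input.png", function (err, image) {
             if (err) throw err;
             image.invert(); // any Jimp processing here
             image.getBase64(Jimp.MIME_PNG, function (err2, dataUrl) {
                 if (err2) throw err2;
                 var sprite = new PIXI.Sprite(PIXI.Texture.from(dataUrl));
                 app.stage.addChild(sprite);
             });
         });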
  14. Hi everyone. I'm developing a 2D MMORPG, and images are currently JPG or PNG, but this is slow and needs a lot of memory, which causes many problems on mobile through overflowed memory. So I'm considering texture compression, but this is not easy because our game needs to support mobile (iOS, Android):
      DXT: supported by all desktop devices and some Android devices
      PVR: supported by all iOS devices and some Android devices
      ETC1: supported by most Android devices
      So I need an auto-generator, like this one: https://blog.playcanvas.com/webgl-texture-compression-made-easy/ Does anybody know of one? Thanks, everybody.
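      A hedged sketch of the runtime half of the problem: once per-format files exist, the device's supported format can be detected through standard WebGL extension queries and the matching file requested. The file layout is an assumption.

         var gl = renderer.gl;
         var ext = gl.getExtension("WEBGL_compressed_texture_s3tc")  ? ".dds" // DXT
                 : gl.getExtension("WEBGL_compressed_texture_pvrtc") ? ".pvr" // PVR
                 : gl.getExtension("WEBGL_compressed_texture_etc1")  ? ".ktx" // ETC1
                 : ".png"; // uncompressed fallback
         var texturePath = "assets/atlas" + ext;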
  15. Hello, does anyone have an example of how to stream video into a scene and apply it onto an object as part of a dynamic texture? Also, in a recent post I had discovered that when loading many textures into an array, the last texture is often not loaded, and sometimes more than one texture is not loaded. My workaround was to load a small texture that isn't used in the scene; then all textures except the last "dummy" texture almost always load correctly. Wingnut and others tried to duplicate this in the playground as I did, and were unable to reproduce it. However, I recall using a babylon.js function which waits for all textures to load before continuing the script. I can't locate the scripts where I used this, so if anyone can point me to these functions, I would be grateful. Still, streaming video onto an object as a texture (dynamic texture) is most important - I'll find my LOD scripts on disk when I really put in the effort. I simply need streaming video on an object as soon as I can figure it out, or as soon as one of the geniuses on this forum provides an example. As always, thanks for any help you might provide. Cheers, DB
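      A hedged sketch, using the VideoTexture signature that appears elsewhere in these posts (name, urls, size, scene, generateMipMaps); the URLs are placeholders. For the "wait until everything is loaded" part, scene.executeWhenReady runs a callback once the scene's resources are ready.

         var screen = BABYLON.Mesh.CreatePlane("screen", 4, scene);
         var mat = new BABYLON.StandardMaterial("screenMat", scene);
         mat.diffuseTexture = new BABYLON.VideoTexture("video", ["stream.mp4", "stream.webm"], 256, scene, true);
         mat.emissiveColor = new BABYLON.Color3(1, 1, 1); // make the screen self-lit
         screen.material = mat;

         scene.executeWhenReady(function () {
             engine.runRenderLoop(function () { scene.render(); });
         });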
  16. Hi All, I would like to replicate what I've seen referred to in some other engines as ambient light; the definition being a light that distributes evenly across the whole scene, i.e. it doesn't cast shadows, and no part of the scene looks darker or lighter than any other. My main reason for this is that some of my 3D content will need light maps baked into the texture images (for other, non-Babylon-related reasons) and I don't want the default Babylon lighting to affect this light mapping. I noticed a topic asking a similar question; however, the answer there didn't satisfy my requirements. So far, the closest solution I've come up with is:

         for (i = 0; i < newScene.lights.length; i++) {
             newScene.lights[i].setEnabled(false);
         }
         for (i = 0; i < newScene.materials.length; i++) {
             newScene.materials[i].emissiveColor = newScene.materials[i].diffuseColor;
         }

      However, this assumes that all my materials are solid colours, which they won't all be; some of them will have image textures. Any advice would be greatly appreciated. Perhaps a shader effect might help? Thanks!
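      A hedged sketch extending the same idea to textured materials: mirroring the diffuse map into the emissive slot makes textured surfaces render at full brightness, and StandardMaterial's disableLighting flag removes any remaining lighting influence.

         for (var i = 0; i < newScene.materials.length; i++) {
             var m = newScene.materials[i];
             if (m.diffuseTexture) {
                 m.emissiveTexture = m.diffuseTexture; // textured surfaces
             } else {
                 m.emissiveColor = m.diffuseColor;     // flat-colour surfaces
             }
             m.disableLighting = true; // ignore scene lights entirely
         }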
  17. Hi, I want to know if it's possible to create a mesh using Blender and then define its properties in the BabylonJS code. For example, I want to create a wall using the cube in Blender and name it WALL1. Then, after loading the .babylon file generated by Blender, I want to define the colour, texture and materials of the wall using its name, WALL1. Thanks, Raghavender Mylagary
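      A hedged sketch (file names and paths are placeholders): SceneLoader loads the exported .babylon file, getMeshByName finds the mesh by the name given in Blender, and a material built in code is then assigned to it.

         BABYLON.SceneLoader.Load("scenes/", "room.babylon", engine, function (scene) {
             var wall = scene.getMeshByName("WALL1");
             var mat = new BABYLON.StandardMaterial("wallMat", scene);
             mat.diffuseColor = new BABYLON.Color3(0.8, 0.2, 0.2);
             mat.diffuseTexture = new BABYLON.Texture("textures/brick.jpg", scene);
             wall.material = mat;
         });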
  18. I'm changing the texture of a model at runtime by creating a material with a diffuseTexture and setting it as the material of all meshes related to the model. I have two questions about that:
      1) mesh.material is null before I set it to the material I created (even when I test it seconds after initialization), yet the model's texture is still rendered. This leads me to believe the model's texture/material is set somewhere else, and mesh.material simply overrides the original. What is the parameter that contains the texture/material of the mesh? I wasn't able to find it when I went over a mesh's properties in the dev tools.
      2) The material with diffuseTexture I create at runtime is "shiny" (reflects light more, I guess?). The puppy on the right is the result of the original material before my intervention, as specified in the .babylon file - this is the result I'd like to achieve. The puppy on the left is the result after I set the material of all meshes to the one I create at runtime - the "shiny" one. Here's the relevant info from the .babylon file:

         "materials":[{"name":"puppy.puppy_mat","id":"puppy.puppy_mat","ambient":[1,1,1],"diffuse":[0.8,0.8,0.8],"specular":[0,0,0],"emissive":[0,0,0],"specularPower":12,"alpha":1,"backFaceCulling":true,"checkReadyOnlyOnce":false,
         "diffuseTexture":{"name":"puppy_01.jpg","level":1,"hasAlpha":1,"coordinatesMode":0,"uOffset":0,"vOffset":0,"uScale":1,"vScale":1,"uAng":0,"vAng":0,"wAng":0,"wrapU":1,"wrapV":1,"coordinatesIndex":0}}]

      Which parameter is responsible for what I'd like to achieve (making the material not "shiny"), and how/where do I set it at runtime?
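      A hedged sketch for question 2: the exported material sets "specular":[0,0,0], so mirroring that on the runtime material via specularColor should remove the shine; specularPower matches the exported value for completeness.

         var mat = new BABYLON.StandardMaterial("puppyMat", scene);
         mat.diffuseTexture = new BABYLON.Texture("puppy_01.jpg", scene);
         mat.specularColor = new BABYLON.Color3(0, 0, 0); // kills the specular highlight
         mat.specularPower = 12; // matches the .babylon export
         meshes.forEach(function (m) { m.material = mat; });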
  19. Hi everyone! Is there any way in Babylon.js to combine multiple textures, or multiple materials, on a single mesh? I know that it's possible to apply a diffuseTexture and an ambientTexture on a material at once, but what if I have more than two textures? For example, this could be used for maps: a satellite image as a base texture, then layers for ambient pressure, UV index, wind direction, and whatnot.
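      A hedged sketch of the built-in option: MultiMaterial assigns different materials per SubMesh. Note it partitions faces rather than blending layers, so true stacked overlays would need a custom shader or stacked transparent meshes. The fifty-fifty index split below is illustrative only.

         var multi = new BABYLON.MultiMaterial("multi", scene);
         multi.subMaterials.push(baseMat);    // e.g. satellite imagery
         multi.subMaterials.push(overlayMat); // e.g. wind layer
         mesh.subMeshes = [];
         var half = mesh.getTotalIndices() / 2;
         mesh.subMeshes.push(new BABYLON.SubMesh(0, 0, mesh.getTotalVertices(), 0, half, mesh));
         mesh.subMeshes.push(new BABYLON.SubMesh(1, 0, mesh.getTotalVertices(), half, half, mesh));
         mesh.material = multi;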
  20. Hi all, I'm wondering: what is the largest texture size BabylonJS supports, for JPG or PNG texture files? As a result of my experiments I think it is 16384 x 16384 pixels; is that right? I'm trying to make a carousel-like slideshow on a mesh. I made it with the uScale and uOffset method and it's working well, but is there another way or idea for that? Which is the best route? I look forward to your precious thoughts. Thank you all, with my respects.
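      A hedged sketch: the ceiling is set by the GPU/driver rather than by Babylon, so it's better queried than assumed; 16384 is common on desktop GPUs, while many mobile GPUs stop at 4096.

         var caps = engine.getCaps();      // Babylon surfaces the driver capabilities
         console.log(caps.maxTextureSize); // e.g. 4096, 8192 or 16384
         // or straight from a raw WebGL context:
         var gl = document.createElement("canvas").getContext("webgl");
         console.log(gl.getParameter(gl.MAX_TEXTURE_SIZE));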
  21. Hi folks! New to the forum, but looking forward to sharing and consuming knowledge and info with you all! I have a question that has bothered me since I started writing shader-based GL code: how to optimize the rendering pipeline in the best way. I've read a ton of GL books and gone through countless tutorials on the subject, but each one just touches on the basics of how to get things working, not on how to actually set up an optimized and clean rendering pipeline for a working graphics engine. The parts that stand out in my case are how to handle textures and shader programs well, and what standard I should follow for these precious resources. The basic question is: how many times in a single render cycle should I be allowed to change 1) the shader program, 2) the texture?
      To take a real-life example from a game I'm working on at the moment: I first do my general rendering, which uses two textures, one for sprites and one for the font sprite. The textures I use are quite big, 2048^2, as I'm working on an HD version of a game. Here I'm pondering perhaps using a 4096^2 texture as well; it looks like most devices can handle these sizes, and I can cram pretty much all the gfx assets I need onto one of those babies. But is it good practice? Do I win anything in rendering speed, and is the win big enough to justify handling large, complex sprite maps like that? Or can I have 50-100 different texture images that I pick from during a single cycle?
      The second parameter is the shader program. Same goes here, really: I have a shader for general sprite handling and a shader for the font renderer, but I also have a special shader for post FX that I use for an FBO that becomes the final scene in the pipeline. I think there will be more programs involved here, and I might need to switch between them during a rendering cycle. Is it too much to switch shader programs 10-20 times in a single cycle, or is it within acceptable limits?
      The engine setup I have today works quite well and I can't really find any problems with rendering speed, but I want to push this a bit. I'm working on a particle engine that will bring an additional shader into the equation, and probably additional sprite maps that need loading. What I really want to know is whether there's some kind of standard to comply with; it would be great to have some frames to work within when it comes to the rendering pipeline. Looking forward to hearing your input on these questions!
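      A hedged sketch of the usual pattern behind "minimize state changes": sort the frame's draw calls by program, then by texture, so each switch happens once per group instead of once per object. The drawCalls structure and its fields are hypothetical.

         drawCalls.sort(function (a, b) {
             return a.program !== b.program ? a.program.id - b.program.id
                                            : a.texture.id - b.texture.id;
         });
         var boundProgram = null, boundTexture = null;
         drawCalls.forEach(function (call) {
             if (call.program !== boundProgram) {
                 gl.useProgram(call.program.glProgram);
                 boundProgram = call.program;
             }
             if (call.texture !== boundTexture) {
                 gl.bindTexture(gl.TEXTURE_2D, call.texture.glTexture);
                 boundTexture = call.texture;
             }
             call.draw(); // issues the actual drawArrays/drawElements
         });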
  22. I'm loading textures via AssetsManager and saving references to them before the scene is rendered. When creating a CubeTexture, I'd like to pass it references to the already-loaded textures (via the constructor or any other way) instead of the textures' URLs. If I pass URLs to CubeTexture, the textures are loaded all over again before being displayed, which I'd like to avoid. For instance, instead of this constructor:

         constructor(rootUrl: string, scene: Scene, extensions?: string[], noMipmap?: boolean, files?: string[]);

      it'd be nice to have something like this:

         constructor(textures: BABYLON.Texture[], scene: Scene, noMipmap?: boolean);

      Does such a thing exist in one way or another, or is it a feature request? Are the textures supposed to be loaded all over again when their URLs are passed to CubeTexture after the textures have already been loaded via AssetsManager from the same URLs? Is that a bug? Am I doing something wrong?
  23. Hi everybody! This week I faced a strange behaviour. I generally lose time looking for a solution on my own, and I did it again, but today I have the unpleasant feeling of being powerless. I'm trying to precompute a kind of PCSS map (soft shadows) to get nice-looking shadows in static scenes. For this purpose, I use renderTargetTextures with refreshRate = RENDER_ONCE, and three shaders, called in this order:
      1. The shadowGenerator's, which gives me the shadowMap of the blockers.
      2. The PCSS one, which uses the shadowMap and gives me a PCSSMap.
      3. The material's shader, which simply displays the PCSSMap in real time.
      After creating each renderTargetTexture, I call scene.customTargets() to order the calls. Okay: this works great! Now I would like to correct small artifacts, and I need to repeat steps 1 & 2 for each blocker separately. And here comes the drama. Let's take a look at the new call order:
      1. ShadowGenerator creates the blockerMap with the first blocker.
      2. PCSSGenerator creates the PCSSMap for the first blocker.
      1bis. ShadowGenerator creates the blockerMap with the second blocker.
      2bis. PCSSGenerator creates the PCSSMap for the second blocker AND mixes it with the last PCSSMap (the first one).
      1ter. ShadowGenerator creates the blockerMap with the third blocker.
      2ter. PCSSGenerator creates the PCSSMap for the third blocker AND mixes it with the last PCSSMap (the second one).
      3. Display the result.
      My issue is: I can't grab the last PCSSMap. After a few tests, I narrowed down when the issue appears and when it doesn't. Here is a big simplification of my PCSS fragment shader (it only outputs one of the two textures it takes as uniforms):

         uniform samplerCube blockerMap;
         uniform samplerCube previousPCSSMap;

         // This function samples the blockerMap. We don't mind about the result.
         float sampleFunction() {
             for (int i = 0; i < POISSON_COUNT; i++) {
                 vec4 sample = textureCube(blockerMap, direction);
             }
             return 1.0;
         }

         void main(void) {
             // To fill
         }

      And here are four use cases:
      1. Returns the blockerMap; everything is OK.

         void main(void) {
             //sampleFunction();
             gl_FragColor = textureCube(blockerMap, direction);
             //gl_FragColor = textureCube(previousPCSSMap, direction);
         }

      2. Returns the previous PCSSMap; everything is OK.

         void main(void) {
             //sampleFunction();
             //gl_FragColor = textureCube(blockerMap, direction);
             gl_FragColor = textureCube(previousPCSSMap, direction);
         }

      3. Returns the blockerMap; everything is OK.

         void main(void) {
             sampleFunction();
             gl_FragColor = textureCube(blockerMap, direction);
             //gl_FragColor = textureCube(previousPCSSMap, direction);
         }

      4. Returns the blockerMap instead of the previous PCSSMap - what's wrong?

         void main(void) {
             sampleFunction();
             //gl_FragColor = textureCube(blockerMap, direction);
             gl_FragColor = textureCube(previousPCSSMap, direction);
         }

      As you can see, sampleFunction() works with the blockerMap and has absolutely no contact with the previousPCSSMap. However, the previousPCSSMap seems to be replaced by the blockerMap, and I have absolutely no idea how that's possible. As it's nonsense, I dare come here begging for your help... Some more info:
      - I use Babylon's shadowGenerator.
      - I use my own PCSSGenerator, but it is a carbon copy of shadowGenerator; the unique difference is the shader which is called.
      - The last shader (the material's) only displays the result; the issue should not come from there.
      - I verified 1000 times: I don't send blockerMap into previousPCSSMap in my code. Maybe it happens in a dark corner of the library, but I don't think so.
      - I systematically empty my cache between each shader modification.
      - Of course, my PCSS shader contains a lot of calculations and uniforms I didn't show here, but I really commented my code heavily to obtain something really close to the use case above. I'm working on isolating the issue in a new project. Thanks! PeapBoy
  24. Hi! In my Babylon scene I have a room with a plane representing a screen. My idea is to display a video on this screen when it's clicked, and turn it off when it's clicked again. Code:

         var screen1 = scene.getMeshByName("Lerret");
         activateVideoTexture(screen1);
         function activateVideoTexture(mesh) {
             mesh.actionManager = new BABYLON.ActionManager(scene);
             var action = new BABYLON.ExecuteCodeAction(BABYLON.ActionManager.OnPickTrigger, function () {
                 mesh.material.diffuseTexture = new BABYLON.VideoTexture("video", ["01_Tumor-Glioblastoma.mp4"], 640, scene, true);
                 mesh.material.emissiveColor = new BABYLON.Color3(1, 1, 1);
             });
             var action2 = new BABYLON.ExecuteCodeAction(BABYLON.ActionManager.OnPickTrigger, function () {
                 mesh.material.diffuseTexture.dispose();
                 mesh.material = new BABYLON.StandardMaterial("texture1", scene);
             });
             mesh.actionManager.registerAction(action).then(action2);
         }

      I've got two major problems:
      1. When the VideoTexture is activated/applied, the screen only shows the very top-left corner of the video (I think). The screen is all brown, which is the colour of the top-left corner of the video; it seems like the plane just projects the colour of the top-left pixel. I also get this warning in the console: "[.WebGLRenderingContext-05B3F9D8]RENDER WARNING: there is no texture bound to the unit 1". So how do I make the video fit/stretch to the size of the plane it is applied to? I've set the size parameter to 640, which is the pixel width of the video; I don't know if that is what "size" refers to.
      2. On the second click I want the VideoTexture to disappear, which actually works as it is now, except that the sound continues. How can I handle this?
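      A hedged sketch for the sound problem: VideoTexture exposes the underlying HTML video element as .video, so pausing it before disposing the texture stops playback (and therefore the audio).

         var action2 = new BABYLON.ExecuteCodeAction(BABYLON.ActionManager.OnPickTrigger, function () {
             var tex = mesh.material.diffuseTexture;
             if (tex && tex.video) {
                 tex.video.pause(); // stop the <video> element so the audio stops too
             }
             tex.dispose();
             mesh.material = new BABYLON.StandardMaterial("texture1", scene);
         });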
  25. Hello, I made a custom ring and I want to add some textures to that ring, but it's not working properly. Here's my scene: http://www.babylonjs-playground.com/#1FKMAH#0 I'm a complete beginner with Babylon.js. index.html .DS_Store ring.babylon