Search the Community

Showing results for tags 'texture'.

Found 263 results

  1. Hello! As WebGL2 comes with new texture formats, I decided to play a bit with them, and it seems to work well in pure WebGL2 (if it prints red, that means the RGB texture worked 🙂). I saw that a texture format parameter has been added to the createRenderTargetTexture function, so I wanted to try it out. But whatever I do, I never manage to create an RGB render target texture. 😥 This code works to create an RGBA RenderTarget: This code fails to create an RGB RenderTarget: the framebuffer is incomplete. I already pulled the latest version of BJS and added gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) everywhere, but it doesn't help much. I'm struggling with this; I don't understand where it differs from the pure WebGL2 version. I verified InternalSizedFormat, InternalFormat and TextureType and they're OK. If anybody has an idea... Thanks in advance 😊 PeapBoy
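A side note on the UNPACK_ALIGNMENT workaround mentioned in the post: with 3-byte RGB pixels and the default alignment of 4, WebGL expects each pixel row to be padded to a 4-byte boundary, which is a common source of broken RGB uploads. A small illustrative helper (not Babylon.js or WebGL API, just the arithmetic) shows when padding kicks in:

```javascript
// Byte stride of one pixel row as WebGL's pixel unpacker sees it for a
// given UNPACK_ALIGNMENT. With RGB (3 bytes/pixel) and the default
// alignment of 4, rows get padded unless width * 3 is a multiple of 4,
// which is why gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) is often needed.
function paddedRowStride(width, bytesPerPixel, alignment) {
  const raw = width * bytesPerPixel;
  return Math.ceil(raw / alignment) * alignment;
}

// A 5px-wide RGB row is 15 bytes raw, but 16 bytes under alignment 4:
console.log(paddedRowStride(5, 3, 4)); // 16
console.log(paddedRowStride(5, 3, 1)); // 15
```

If the source data has no padding, setting the alignment to 1 makes the raw and padded strides match, so uploads of arbitrary-width RGB data stop going out of sync row by row.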
  2. Hi, First of all, thank you for Babylon.JS; it is a wonderful library and a pleasure to code with. I am working on a game project and I need to display textures with a 'nearest neighbour' sampling mode (pixelated effect). Babylon.JS theoretically offers this possibility, but I could not get it working. Here is a playground illustrating my modest struggle: the left cube's texture is created with the NEAREST_SAMPLINGMODE parameter, while the right one uses TRILINEAR_SAMPLINGMODE. In my browser (Chrome v38), there is absolutely no visible difference. I have seen a couple of threads about this on the forum, but nothing that provides a real solution; hopefully this thread will. Of course, there is always the possibility of scaling up the texture with nearest-neighbour interpolation in an image manipulation tool, but I'd prefer not to, as it adds steps to the workflow and prevents the use of big texture atlases. Thank you very much in advance
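For readers unsure what the two sampling modes actually do: nearest-neighbour copies the single closest source texel into each output pixel, keeping edges hard, while (tri)linear filtering blends neighbours. A tiny illustrative sketch on a plain array (not Babylon.js or PIXI API) makes the difference concrete:

```javascript
// Nearest-neighbour upscaling of a 1D row of pixel values: each output
// pixel copies the closest source pixel, so hard edges stay hard. This
// is the behaviour NEAREST_SAMPLINGMODE requests from the GPU sampler.
function nearestUpscale(src, factor) {
  const out = new Array(src.length * factor);
  for (let i = 0; i < out.length; i++) {
    out[i] = src[Math.floor(i / factor)];
  }
  return out;
}

// A black/white edge upscaled 3x stays a sharp step, not a gradient:
console.log(nearestUpscale([0, 255], 3)); // [ 0, 0, 0, 255, 255, 255 ]
```

Linear filtering would instead produce intermediate values across the boundary, which is exactly the blur the post is trying to avoid.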
  3. ArcadixInfotech

    Texture in box2d rope/distance joint

    Hello, is it possible to give a texture to Box2D's rope or distance joint? I can see the joints in debug draw mode, but I need something like an actual rope texture. Thanks
  4. yasuhiko

    basic demo "Video" for HLS source

    Hi, I am trying to draw an HLS video stream onto a PIXI video texture in Safari on iOS (11.2), referring to the basic "Video" demo, but have not succeeded. (Sorry for my poor English; I am Japanese.) When I set an mp4 video as the source, the demo code worked on iPhone + mobile Safari (iOS 11.2). But when I set the URL of an HLS (m3u8) stream and tapped the play button, the video was not drawn. I tried some changes but did not succeed in playing the HLS stream on a PIXI video texture. Below is my code, the modified part of the demo:

        function onPlayVideo() {
            // Don't need the button anymore
            button.destroy();
            /// modify start
            // mp4
            // (1) mp4 OK: video/audio played ( is my own server)
            // var texture = PIXI.Texture.fromVideo('');
            // (2) mp4 OK: video/audio played
            // var texture = PIXI.Texture.fromVideoUrl('');
            // HLS
            // (3) Does not work: when the play button is pressed, loading the m3u8 does not start.
            // #http://184.72 ... is an effective m3u8 stream
            // var texture = PIXI.Texture.fromVideo('');
            // (4) Does not work: when the play button is pressed, loading the m3u8 does not start.
            // var texture = PIXI.Texture.fromVideoUrl('');
            // (5) Does not work: when the play button is pressed, loading the m3u8 starts and
            //     audio plays, but the video is not drawn on the canvas.
            let baseVideoTexture = PIXI.VideoBaseTexture.fromUrl({ src: '', mime: 'application/' });
            var texture = PIXI.Texture.from(baseVideoTexture);
            /// modify end
            // create a new Sprite using the video texture (yes it's that easy)
            var videoSprite = new PIXI.Sprite(texture);
            ...

    Please guide me on the right way to play an HLS stream on a PIXI video texture (i.e. how to fix the code above). The entire HTML I modified is attached (pixi_video_hls.html). If more information is needed, let me know. Thank you in advance. pixi_video_hls.html
  5. Doug

    Hi Rich. @rgk mentioned that you might be able to add a "patron" badge to my forum profile? Thanks very much!

  6. Hello to all of you! I want to thank everyone for the support on my previous issue, especially @JohnK. At that time I had (and still have, because of trouble applying his suggestion) one structural performance problem that I have been carrying for a long time. But that is out of scope for the moment, because I have one little problem which troubles me a lot. Essentially, from time to time when zooming out (at the maximum level) and panning the camera, I get horrendous flickering, which is even more obvious than in the demo due to the fact that I also add a transparent hole (I have not added it here for the sake of code complexity). The issue can be seen here: From my understanding, the issue is some sort of weird collision between two textures, the ground one and the location one. But I really cannot afford any positioning change. Is there another, more elegant way to resolve this problem? Probably I am missing something really trivial. Thanks to all
  7. simple_life

    Scintillating texture

    The loaded texture is sometimes absent as the camera moves
  8. On desktop I have a clear view of how it works, so I will start by asking how textures are managed on mobile hardware (tablet or smartphone), but of course if someone wants to ask something about desktop, this topic is open. We assume all texture sizes are powers of two. If I have textureOne, a jpg file, and textureTwo, the same image but as a png, do they take the same amount of RAM once loaded? If yes, does using, for example, tga change only the download time and not the performance? Could decompressing a texture cost some FPS (example: jpg file to RAM vs. png file to RAM)? Does mobile hardware have VRAM, or only RAM shared between the CPU and GPU? I suppose it depends on the device? Is there a way to use mipmaps, or is that specific to the dds format? Thanks
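As the post suspects: jpg, png and tga are decoded to raw pixels before upload, so once on the GPU they occupy the same memory; the file format only changes download size and decode time (unlike GPU-compressed formats such as those in dds containers, which stay compressed in VRAM). A rough illustrative calculation for an uncompressed RGBA texture, including the mip chain:

```javascript
// Approximate GPU memory for an uncompressed RGBA texture: 4 bytes per
// pixel per mip level. Each mip level halves both dimensions, so the
// full mip chain adds roughly one third on top of the base level.
function textureBytes(width, height, withMipmaps) {
  let total = 0;
  let w = width, h = height;
  while (true) {
    total += w * h * 4;
    if (!withMipmaps || (w === 1 && h === 1)) break;
    w = Math.max(1, w >> 1);
    h = Math.max(1, h >> 1);
  }
  return total;
}

// A 1024x1024 RGBA texture is 4 MiB flat, ~5.33 MiB with mipmaps,
// whether it came from a 100 KB jpg or a 2 MB png:
console.log(textureBytes(1024, 1024, false)); // 4194304
console.log(textureBytes(1024, 1024, true));  // 5592404
```

This is a sketch of the usual rule of thumb, not a statement about any particular driver, which may add its own padding or keep a CPU-side copy.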
  9. waverider

    Texture flickers [solved]

    Hey y'all, the texture of the box flickers when I move the camera around, but I still want the box to overlap. How can I fix this?
  10. I need to generate some textures on the fly; basically, when the map is loaded I generate several dozen simple gradient textures. I found here (last comments) that it is possible to use just one helper canvas for this, but after reading some other sources I am a little confused about how to handle destroying these textures. At the moment I have a helperCanvas with a 2D context to generate my textures, and then I do:

        // This creates a texture from the helper canvas, as I understand it
        var texture1 = new PIXI.Texture(new PIXI.BaseTexture(helperCanvas));
        // This one "saves" the texture in GPU memory, if I understood correctly
        renderer.textureManager.updateTexture(texture1.baseTexture);
        // Create a new sprite from this texture
        var sprite = new Sprite(texture1);

    I do this in a loop to create all the needed sprites, but I am not sure what I should do when the map is unloaded and I no longer need these textures. Is any action needed, or can the GC handle this? Should I remove them manually from the textureManager with some method, or call destroy on the sprite?
  11. I'm at my wit's end. I've tried several game engines and drawing programs now, tried every little setting, and have really read everything there is to read. I just cannot get a sharp appearance on any asset as soon as it is rendered in an HTML canvas or WebGL. In desktop applications they look fine. Here, for example, is a good asset: it's sharp and clear, and as it is a PNG it should scale down nicely (I need it at about 25% of this size). As soon as I load it as a sprite via Phaser, I get a result like this (don't mind the money-man): it's blurry, and it's even worse when scaled down (here scaled to 0.1x, 0.1y). I'm on an iMac, so maybe it has to do with the retina display, but I am getting similar results on lower-quality monitors as well. What am I doing wrong? When I load other assets in, such as the ones used in the Phaser 2 tutorial, they look crystal clear. I feel like there's something really simple that I am missing. :-( My old (finished) build was in Unity 2D, and I was having the exact same issue; this is why I swapped to Phaser instead of using the Unity WebGL build. I've tested on both engines, and some assets render perfectly (in particular, pixel art renders perfectly; other, more vector-like assets render poorly). Really hope there's somebody here who can help. :-( Yolanda
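Since the post mentions an iMac: a common cause of exactly this blur on retina displays is a canvas whose drawing buffer matches its CSS size, so the browser upscales every frame by the device pixel ratio. The usual remedy is to size the backing store by devicePixelRatio and let CSS scale it back down. An illustrative sketch of the arithmetic (hypothetical helper, not Phaser API):

```javascript
// Backing-store size a canvas needs to stay sharp on a high-DPI screen:
// render at cssSize * devicePixelRatio, then display at cssSize via CSS.
function backingStoreSize(cssWidth, cssHeight, devicePixelRatio) {
  return {
    width: Math.round(cssWidth * devicePixelRatio),
    height: Math.round(cssHeight * devicePixelRatio),
  };
}

// A 400x300 CSS canvas on a 2x retina display needs an 800x600 buffer:
console.log(backingStoreSize(400, 300, 2)); // { width: 800, height: 600 }
```

This would also explain why pixel art looks fine (it is often rendered with nearest filtering and integer scales) while smooth vector-like assets show the resampling blur.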
  12. Hi guys, I am using the latest version of PIXI.js (1.5.1) and we are dealing with some problems with a particular texture (hosted on another domain): Uncaught SecurityError: Failed to execute 'texImage2D' on 'WebGLRenderingContext': The cross-origin image at XXXXXX may not be loaded. To load the texture I'm using the asset loader and setting the crossorigin property to true:

        // assetsToInitloader has the url to the texture
        var preloader = new PIXI.AssetLoader(assetsToInitloader);
        preloader.onComplete = onAssetsInitloaded;
        preloader.crossorigin = true;
        preloader.load();

    The same code works fine on Safari but has problems on Firefox and Chrome. Thanks in advance, cheers!
  13. josescxavier

    Newly set texture doesn't show up

    Hi, I'm having a problem after setting a texture: it doesn't show up. It only shows the new texture after I modify it. Here is the code where I create the objects:

        this.pixicanvas = new PIXI.Application({width: 600, height: 800});
        this.rx_image = null;
        this.zoom_min = 1;
        this.aim_tmp = 0;
        this.state = {
            pixi_height: 0,
            pixi_width: 0,
        };
        this.handleZoom = this.handleZoom.bind(this);
        this.rx_image = new PIXI.Sprite(PIXI.Texture.EMPTY);
        this.container_points = new PIXI.Container();

    Here is where I update the texture:

        updateRxSprite(new_img){
            PIXI.loader.reset();
            PIXI.loader.add('resource-key', new_img).load((loader, resources) => {
                this.rx_image.texture = PIXI.Texture.fromImage('resource-key');
                var scale = 1;
                var scale_screen = this.pixicanvas.screen.width;
                var scale_rx_image = this.rx_image.width;
                var aspect_ratio_sprite = this.rx_image.width / this.rx_image.height;
                var aspect_ratio_screen = this.pixicanvas.screen.width / this.pixicanvas.screen.height;
                if (aspect_ratio_sprite <= aspect_ratio_screen) {
                    scale_screen = this.pixicanvas.screen.height;
                    scale_rx_image = this.rx_image.height;
                }
                if (scale_screen > 0 && scale_rx_image > 0) {
                    scale = scale_screen / scale_rx_image;
                }
                this.rx_image.scale.x = scale;
                this.rx_image.scale.y = scale;
                this.zoom_min = scale;
                this.rx_image.x = (this.pixicanvas.screen.width - this.rx_image.width) / 2;
                this.rx_image.y = (this.pixicanvas.screen.height - this.rx_image.height) / 2;
                this.rx_image.on('pointerdown', this.props.onClick);
                this.d3zoom_element.on(".zoom", null);
                this.d3zoom_behavior = D3ZOOM.zoom().scaleExtent([this.zoom_min, 5]).on("zoom", this.handleZoom);
                var new_transform = D3ZOOM.zoomIdentity;
                new_transform.k = this.rx_image.scale.x;
                new_transform.x = this.rx_image.position.x;
                new_transform.y = this.rx_image.position.y;
                , new_transform);
            });
        }

    After this code runs, the new texture doesn't show up until I modify it, for example by setting a new position on the sprite. Before it stopped working, I added the sprite to the stage after setting the new texture. Now, because I need the sprite at a specific Z order, I add the sprite with an EMPTY texture and only set the new texture after the user uploads it. What am I doing wrong?
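The scaling logic in updateRxSprite can be read more easily as a pure function: pick the axis on which the image is relatively larger and scale so the whole image fits on screen ("contain"-style fit). An extracted sketch of that arithmetic, with the post's variable roles kept:

```javascript
// "Contain" fit: if the image is relatively taller than the screen
// (smaller aspect ratio), fit by height; otherwise fit by width. The
// result is the uniform scale applied to both sprite axes.
function fitScale(screenW, screenH, imgW, imgH) {
  const imgAspect = imgW / imgH;
  const screenAspect = screenW / screenH;
  return imgAspect <= screenAspect ? screenH / imgH : screenW / imgW;
}

// On a 600x800 screen, a square 300x300 image is fitted by width:
console.log(fitScale(600, 800, 300, 300)); // 2
// A tall 100x400 image is fitted by height instead:
console.log(fitScale(600, 800, 100, 400)); // 2
```

Note that this only reproduces the math from the post; it does not address the texture-not-updating issue itself.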
  14. I'm trying to get video textures to work on mobile devices and I seem to be running into an issue where the video won't autoplay because it doesn't meet the requirements that Apple has laid out here: This can be seen in this playground: Is there a way to pass custom attributes to the video element that is contained within the video texture? My hypothesis is the absence of the playsinline data attribute on the video element is preventing this from working correctly. @Deltakosh any insight here?
  15. Gerente

    Dynamic Simple Texture

    Hello, is there a way to create, by code, a texture of size W and H? Texture.WHITE creates a 10x10 texture, but if you change its size after adding it to a PIXI.Sprite, it resizes back to 10x10. If you resize it after assigning it to the sprite, it changes the scale instead. I need this basically because the "containsPoint" function only works based on the texture size of the sprite.
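Why the texture size matters here: a sprite's containsPoint-style hit test checks the point against a rectangle derived from the texture's dimensions and the sprite's anchor, which is why a 10x10 Texture.WHITE gives a 10x10 hit area regardless of visual scale. A standalone illustrative sketch of that test (not the actual PIXI source):

```javascript
// Point-in-sprite test in local coordinates: the hit rectangle spans
// from -anchor * size to (1 - anchor) * size on each axis, mirroring
// how a texture's width/height define a sprite's clickable area.
function containsPoint(localX, localY, texWidth, texHeight, anchorX, anchorY) {
  const x1 = -texWidth * anchorX;
  const y1 = -texHeight * anchorY;
  return localX >= x1 && localX < x1 + texWidth &&
         localY >= y1 && localY < y1 + texHeight;
}

console.log(containsPoint(5, 5, 10, 10, 0, 0));  // true
console.log(containsPoint(15, 5, 10, 10, 0, 0)); // false
```

This is why scaling the sprite (rather than resizing the texture) still yields a hit area based on the original 10x10 texture in local space.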
  16. V!nc3r

    Lightmap texture channel broken?

    Hi! As we can see in my Cornell box scene, something is now wrong with the lightmap texture. Everything was working fine the last time I played with it. If, on line 7, you change the mode to ambientTexture instead of lightmapTexture... var mode = lightmapModes.AMBIENT; ...you will see that the textures are correctly assigned, so I don't think this bug is due to my code (or maybe I'm wrong?)
  17. Hi! I am trying to set 2 different textures _by code_ on a loaded mesh exported from Blender. There is no problem creating a mesh from MeshBuilder and setting its faceUV with options, but with a "loaded" mesh (created in Blender, without any texture) I can't manage to do it. I give this playground as a start (my code is too complicated to put in a playground): The goal: set any texture (from the internet) on any face of the loaded Dude.babylon mesh. I tried a few things like VertexBuffer.UVKind, but... it does not seem to be the way to go. More details: my Blender object is a single plane, 6 vertices and 2 faces, and my goal is to dynamically set a different texture on each face. The wireframe shows me the 2 faces, but I have no idea how to set the textures on them. Thanks for any help
  18. No matter what I try or how I implement things, I keep getting some jittery scroll movement. I was using the <canvas> tag before this, without PixiJS, and there was a lot of jittery movement; just one drawImage call per rAF call would take far more than 16.6 ms. I used the <canvas> for drawing frames, but I also tried the transform CSS property, with and without CSS transitions. Currently I'm using PixiJS with a RenderTexture, and the scrolling still seems somewhat jittery to me, though maybe less so. I'm not drawing vector graphics; I'm drawing images (PNG files, actually). When an image has loaded, I add it to a somewhat large RenderTexture (4096x4096). Because the images are no wider than 1024 pixels, I store them inside four columns of 1024 by 4096 pixels. I then have a sprite for which I create a Texture (which I recently found out is just a reference to a BaseTexture combined with a certain frame). Each time I scroll, I create a new Texture pointing to the same RenderTexture but using a different frame. At a certain point, though, I need two sprites with textures both pointing to the same RenderTexture but with different frames: when, say, the first frame starts at 4000 and I need about 800 pixels of the image (e.g. 800 could be the window or screen height), I need one frame pointing at the last 96 pixels of the first column within the RenderTexture and another frame pointing at the next column for the remaining 704 pixels. Is this a proper way of handling it, or is there a faster way? Maybe I could make sure the height of the columns within the RenderTexture is divisible by the current window height, though then some column height would go unused (but that would probably be true for all columns, so maybe not such a big deal), and this reordering would need to happen on every resize.
    I guess a large screen height would not work very well with how I'm handling this now? Any advice would be much appreciated. By the way, I'm also using a small easing function which I call via setTimeout when there is movement, but the actual drawing currently takes place in the ticker function; the easing just calculates the current scrolling speed and does not draw anything.
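The column/frame bookkeeping described above can be sketched as a pure function: given a vertical scroll offset into content laid out in fixed-height columns, compute which column slices (possibly two) cover the viewport. Names are illustrative, not PIXI API:

```javascript
// Content is stored in vertical columns of COLUMN_HEIGHT pixels inside
// one big RenderTexture. For a scroll position and viewport height,
// return the list of { column, offset, height } slices needed; a
// viewport straddling a column boundary yields two slices.
const COLUMN_HEIGHT = 4096;

function framesForScroll(scrollY, viewportHeight) {
  const frames = [];
  let y = scrollY;
  let remaining = viewportHeight;
  while (remaining > 0) {
    const column = Math.floor(y / COLUMN_HEIGHT);
    const offset = y % COLUMN_HEIGHT;
    const take = Math.min(remaining, COLUMN_HEIGHT - offset);
    frames.push({ column, offset, height: take });
    y += take;
    remaining -= take;
  }
  return frames;
}

// The post's example: starting at y=4000 with an 800px viewport needs
// the last 96px of column 0 plus the first 704px of column 1:
console.log(framesForScroll(4000, 800));
// [ { column: 0, offset: 4000, height: 96 },
//   { column: 1, offset: 0, height: 704 } ]
```

Each slice would then become a Texture frame over the same RenderTexture, one sprite per slice, matching the two-sprite case the post describes.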
  19. DoraemonEXA

    Loading rotated texture atlas

    I have a texture atlas generated by Texture Packer with the "rotate" option enabled. In order to load and render all the frames correctly, I currently have to use a simple script to pre-process the json file and swap the "w" and "h" properties of any frame marked "rotated". Is it possible to make a custom loader that loads such a texture atlas and stores the restored frames in the cache?
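The pre-processing step the post describes can be sketched as a small function over the parsed atlas JSON: walk the frames and swap "w" and "h" wherever "rotated" is set. This mirrors the script-based workaround, not a custom PIXI loader, and assumes the hash-style (object) frames layout Texture Packer can emit:

```javascript
// Swap back the frame dimensions of rotated entries in a Texture Packer
// atlas. Rotated frames are stored sideways in the sheet, so their
// packed "w"/"h" are transposed relative to the original sprite.
function fixRotatedFrames(atlasJson) {
  for (const key of Object.keys(atlasJson.frames)) {
    const entry = atlasJson.frames[key];
    if (entry.rotated) {
      const { w, h } = entry.frame;
      entry.frame.w = h;
      entry.frame.h = w;
    }
  }
  return atlasJson;
}

const atlas = { frames: { a: { rotated: true, frame: { x: 0, y: 0, w: 20, h: 10 } } } };
console.log(fixRotatedFrames(atlas).frames.a.frame); // { x: 0, y: 0, w: 10, h: 20 }
```

Running this once after fetching the JSON, before handing it to the framework's spritesheet parser, achieves the same result as the external script without an extra build step.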
  20. Hey guys, I want to create a custom RenderTargetTexture of type BABYLON.Engine.TEXTUREFORMAT_LUMINANCE, but it doesn't work: textureFormat is always 5 (probably TEXTUREFORMAT_RGBA), regardless of what I set it to. Here's my code:

        // Render target
        var renderTarget = new BABYLON.RenderTargetTexture(
            "depth",
            128, // res
            scene,
            false, // generateMipMaps
            true, // doNotChangeAspectRatio
            BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT, // type
            false, // isCube
            BABYLON.Texture.BILINEAR_SAMPLINGMODE, // samplingMode
            false, // generateDepthBuffer
            false, // generateStencilBuffer
            false); // isMulti
        renderTarget.noMipmap = true;
        renderTarget.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.wrapR = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.textureFormat = BABYLON.Engine.TEXTUREFORMAT_LUMINANCE;
        renderTarget.hasAlpha = false;
        renderTarget.gammaSpace = false;

    My wild guess is that the constructor doesn't pass textureFormat to _renderTargetOptions:

        this._renderTargetOptions = {
            generateMipMaps: generateMipMaps,
            type: type,
            samplingMode: samplingMode,
            generateDepthBuffer: generateDepthBuffer,
            generateStencilBuffer: generateStencilBuffer
        };
  21. AlbertTJames

    MacOS Sierra and Skybox / Cube texture bug

    Hey, I just upgraded to macOS Sierra and this bit of code does not work anymore in Chrome:

        /* --- Skybox --- */
        var skyboxObject = BABYLON.Mesh.CreateBox("skyBox", 10000.0, scene);
        var skyboxMaterial = new BABYLON.StandardMaterial("skyBox" + sceneKey, scene);
        skyboxMaterial.backFaceCulling = false;
        skyboxMaterial.reflectionTexture = new BABYLON.CubeTexture(
            taskObject.ASSETS_FOLDER + "/textures/fantasy/Sky", scene,
            ["_px.png", "_py.png", "_pz.png", "_nx.png", "_ny.png", "_nz.png"]);
        skyboxMaterial.reflectionTexture.coordinatesMode = BABYLON.Texture.SKYBOX_MODE;
        skyboxMaterial.diffuseColor = new BABYLON.Color3(0, 0, 0);
        skyboxMaterial.specularColor = new BABYLON.Color3(0, 0, 0);
        skyboxObject.material = skyboxMaterial;
        skyboxObject.rotation.x = Math.PI;

    I get only one side of the cube, and no error in the console. I will try to get more information on this; I have been swamped these past few days... sorry. This bug is not present in Safari.
  22. negrant

    Particles dispose

    Hi! I have a problem: I create many particle systems in my game and very often delete them after they first play, but if I use the same texture in several particle systems, then after disposing of the first particle system the texture is disposed too. So I have to clone a new texture for each one, which is bad for performance. How can I create particle systems that share the same texture without the texture being disposed along with a particle system? Thanks!
  23. Art Vandelay

    Tile Offset

    Hi everyone, I am trying to tile a floor but I would like to create a sort of "offset" in the tiling. I added pictures to best explain what I am trying to do. The first is what I am doing now. The second is what I would like. How can I do this? Thanks!
  24. Hi, I'm new to BabylonJS and 3D. I created a scene in Blender with a character. It has a single material, UV map and texture. It works fine in BabylonJS, but when I replace the texture with a new BABYLON.Texture or new BABYLON.DynamicTexture (in my case), the new texture does not take the UV map into account. I think I forgot something, but I don't know where to look. (Sorry for my English ^^')
  25. Hello, Would it be possible to somehow have textures mapped onto meshes and the direction / orientation of these textures dictated by the vertex colour? Assuming I have a piece of furniture: X = Red (the top & the long rails between legs) Y = Green (the short rails between legs) Z = Blue (the legs) I would then specify how the texture could be applied by using something (which doesn't yet exist) like: coordinatesIndex = vertexColor I wouldn't have a clue where to start and am happy to pay to have someone do this for me. Are offers of paid work allowed here? If not, I will amend this post. Thank you.