Search the Community

Showing results for tags 'texture'.





Found 257 results

  1. Building an RGB RawTexture

    Hello, I've been messing around with some colormaps I generate. I have a ShaderMaterial and would like to send some colormaps to the shader using ShaderMaterial.setTexture(). My colormaps are dynamically generated, resulting in an RGB image stored as a Uint8Array. To illustrate the test, I am using a 1x1 texture. I have no problem using an RGBA image as a uniform on the shader, like this:

        // this works well
        let cmTexture = new BABYLON.RawTexture(
            new Uint8Array([128, 128, 128, 255]),    // data
            1,                                       // width
            1,                                       // height
            BABYLON.Engine.TEXTUREFORMAT_RGBA,       // format
            this._scene,                             // scene
            false,                                   // gen mipmaps
            false,                                   // invertY
            BABYLON.Texture.TRILINEAR_SAMPLINGMODE,
            BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT
        );

    But if I want to create an RGB texture (no alpha channel), it does not work:

        // JS lets me do it, but WebGL yells at me
        let cmTexture = new BABYLON.RawTexture(
            new Uint8Array([128, 128, 128]),         // data
            1,                                       // width
            1,                                       // height
            BABYLON.Engine.TEXTUREFORMAT_RGB,        // format
            this._scene,                             // scene
            false,                                   // gen mipmaps
            false,                                   // invertY
            BABYLON.Texture.TRILINEAR_SAMPLINGMODE,
            BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT
        );

    I then get two errors: the first warning ("WebGL: ...") occurs at the creation of the RawTexture, while the second occurs when calling shaderMaterial.setTexture(...). I can still use RGBA, so it's not a super big deal, but there is probably a little bug somewhere... Cheers.
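One likely culprit for the failing RGB upload is WebGL's default UNPACK_ALIGNMENT of 4: a tightly packed RGB row whose byte length is not a multiple of 4 (3 bytes for a 1x1 texture) cannot be uploaded without changing the alignment. Independent of where the bug actually lives, a workaround is to pad the generated colormap to RGBA on the CPU and keep using the format that already works. This helper is not part of Babylon.js, just a minimal sketch:

```javascript
// Hypothetical helper (not a Babylon.js API): expand tightly packed RGB
// data to RGBA so the working TEXTUREFORMAT_RGBA path can be used.
function rgbToRgba(rgb, width, height, alpha = 255) {
  if (rgb.length !== width * height * 3) {
    throw new Error("expected width * height * 3 bytes of RGB data");
  }
  const rgba = new Uint8Array(width * height * 4);
  for (let i = 0; i < width * height; i++) {
    rgba[i * 4]     = rgb[i * 3];     // R
    rgba[i * 4 + 1] = rgb[i * 3 + 1]; // G
    rgba[i * 4 + 2] = rgb[i * 3 + 2]; // B
    rgba[i * 4 + 3] = alpha;          // A, opaque by default
  }
  return rgba;
}

const rgba = rgbToRgba(new Uint8Array([128, 128, 128]), 1, 1);
// rgba now holds the bytes 128, 128, 128, 255
```

The padded buffer can then be handed to the RGBA RawTexture constructor shown above, e.g. `new BABYLON.RawTexture(rgbToRgba(data, w, h), w, h, BABYLON.Engine.TEXTUREFORMAT_RGBA, scene, ...)`.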
  2. I'm at my wit's end. I've tried several game engines and drawing programs now, tried every little setting, and have really read everything there is to read. I just cannot get a sharp appearance on any asset as soon as it is rendered in an HTML canvas or WebGL. In desktop applications, they look fine. Here, for example, is a good asset: it's sharp and clear, and as it is a PNG it should scale down nicely (I need it at about 25% of this size). As soon as I load it as a sprite via Phaser, I get a blurry result (don't mind the money-man), and it's even worse when scaled down (here scaled to 0.1x, 0.1y).

    I'm on an iMac, so maybe it has to do with the retina display, but I am getting similar results on lower-quality monitors as well. What am I doing wrong? When I load other assets, such as the ones used in the Phaser 2 tutorial, they look crystal clear. I feel like there's something really simple that I am missing. :-(

    My old (finished) build was in Unity 2D, and I was having the exact same issue. This is why I swapped to Phaser instead of using the Unity WebGL build. I've tested on both engines, and some assets render perfectly (in particular, pixel art renders perfectly; other, more vector-like assets render poorly). Really hope there's somebody here who can help. :-( Yolanda
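For what it's worth, blurriness that shows up only in canvas/WebGL on a retina display is commonly a devicePixelRatio problem: the canvas backing store is sized in CSS pixels and then stretched by the browser, so every asset is upscaled before it is ever drawn. A framework-agnostic sketch of the sizing rule (all names here are mine, not a Phaser API):

```javascript
// Size the canvas backing store in physical device pixels so that
// 1 canvas pixel maps to 1 device pixel instead of being stretched.
function backingStoreSize(cssWidth, cssHeight, devicePixelRatio) {
  return {
    width: Math.round(cssWidth * devicePixelRatio),   // physical pixels
    height: Math.round(cssHeight * devicePixelRatio),
  };
}

// e.g. a 400x300 CSS-pixel canvas on a 2x retina display:
const size = backingStoreSize(400, 300, 2); // { width: 800, height: 600 }

// In the browser you would then do (sketch):
//   canvas.width = size.width; canvas.height = size.height;
//   canvas.style.width = "400px"; canvas.style.height = "300px";
```

In Phaser the related switches are, as far as I know, the `resolution` setting in the game config and `pixelArt`/`roundPixels` (which disable texture smoothing, explaining why pixel art looks crisp). Note that downscaling a large PNG to 10% will still soften it unless the source asset is exported close to its on-screen size.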
  3. Newly set texture doesn't show up

    Hi, I'm having a problem after setting a texture: it doesn't show up. The sprite only shows the new texture after I modify it. Here is the code where I create the objects:

        this.pixicanvas = new PIXI.Application({width: 600, height: 800});
        this.rx_image = null;
        this.zoom_min = 1;
        this.aim_tmp = 0;
        this.state = {
            pixi_height: 0,
            pixi_width: 0,
        };
        this.handleZoom = this.handleZoom.bind(this);
        this.rx_image = new PIXI.Sprite(PIXI.Texture.EMPTY);
        this.container_points = new PIXI.Container();

    Here is where I update the texture:

        updateRxSprite(new_img) {
            PIXI.loader.reset();
            PIXI.loader.add('resource-key', new_img).load((loader, resources) => {
                this.rx_image.texture = PIXI.Texture.fromImage('resource-key');

                var scale = 1;
                var scale_screen = this.pixicanvas.screen.width;
                var scale_rx_image = this.rx_image.width;
                var aspect_ratio_sprite = this.rx_image.width / this.rx_image.height;
                var aspect_ratio_screen = this.pixicanvas.screen.width / this.pixicanvas.screen.height;
                if (aspect_ratio_sprite <= aspect_ratio_screen) {
                    scale_screen = this.pixicanvas.screen.height;
                    scale_rx_image = this.rx_image.height;
                }
                if (scale_screen > 0 && scale_rx_image > 0) {
                    scale = scale_screen / scale_rx_image;
                }
                this.rx_image.scale.x = scale;
                this.rx_image.scale.y = scale;
                this.zoom_min = scale;

                this.rx_image.x = (this.pixicanvas.screen.width - this.rx_image.width) / 2;
                this.rx_image.y = (this.pixicanvas.screen.height - this.rx_image.height) / 2;
                this.rx_image.on('pointerdown', this.props.onClick);

                this.d3zoom_element.on(".zoom", null);
                this.d3zoom_behavior = D3ZOOM.zoom().scaleExtent([this.zoom_min, 5]).on("zoom", this.handleZoom);
                this.d3zoom_element.call(this.d3zoom_behavior);

                var new_transform = D3ZOOM.zoomIdentity;
                new_transform.k = this.rx_image.scale.x;
                new_transform.x = this.rx_image.position.x;
                new_transform.y = this.rx_image.position.y;
                this.d3zoom_element.call(D3ZOOM.zoom().transform, new_transform);
            });
        }

    After this code runs, the new texture doesn't show up until I modify the sprite, for example by setting a new position. Before it stopped working, I added the sprite to the stage after setting the new texture. Now, because I need the sprite at a specific Z order, I add the sprite with an EMPTY texture and only set the new texture after the user uploads it. What am I doing wrong?
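Two things may be worth checking in code like the above. First, inside the loader callback the decoded texture is normally available as `resources['resource-key'].texture`; `PIXI.Texture.fromImage('resource-key')` treats its argument as a URL, so the sprite may end up holding a texture with no valid frame until something later forces an update. Second, the fit-and-center math can be pulled out into a pure function and verified on known numbers, independent of texture-loading timing (names below are mine, a sketch rather than the poster's code):

```javascript
// "Contain" fit: scale an image to fit inside the screen while keeping
// its aspect ratio, then center it. Wide images are limited by width,
// tall images by height, mirroring the aspect-ratio branch in the post.
function containFit(screenW, screenH, imageW, imageH) {
  const imageRatio = imageW / imageH;
  const screenRatio = screenW / screenH;
  const scale = imageRatio > screenRatio ? screenW / imageW : screenH / imageH;
  return {
    scale,
    x: (screenW - imageW * scale) / 2, // centered position
    y: (screenH - imageH * scale) / 2,
  };
}

// A 300x400 image in the post's 600x800 screen scales by exactly 2:
const fit = containFit(600, 800, 300, 400); // { scale: 2, x: 0, y: 0 }
```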
  4. I'm trying to get video textures to work on mobile devices and I seem to be running into an issue where the video won't autoplay because it doesn't meet the requirements that Apple has laid out here: https://webkit.org/blog/6784/new-video-policies-for-ios/ This can be seen in this playground: http://www.babylonjs-playground.com/#1X8NRY Is there a way to pass custom attributes to the video element that is contained within the video texture? My hypothesis is the absence of the playsinline data attribute on the video element is preventing this from working correctly. @Deltakosh any insight here?
  5. Dynamic Simple Texture

    Hello, is there a way to create, by code, a texture of size W and H? Texture.WHITE creates a 10x10 texture, but if you change its size after adding it to a PIXI.Sprite it will resize back to 10x10. If you resize it after assigning it to the Sprite, it will change the scale instead. I need this basically because the "containsPoint" function only works based on the texture size of the Sprite.
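If the W×H texture is only needed so that hit-testing covers the right region, Pixi's `hitArea` property may be the simpler route: setting `sprite.hitArea = new PIXI.Rectangle(0, 0, W, H)` makes interaction use that rectangle instead of the texture frame. The rectangle test itself is just the following, a plain-JS mirror of what `PIXI.Rectangle.contains` does, shown here as a sketch:

```javascript
// Point-in-rectangle test in local sprite coordinates: left/top edges
// inclusive, right/bottom edges exclusive, empty rectangles never hit.
function rectContains(rect, x, y) {
  if (rect.width <= 0 || rect.height <= 0) return false;
  return x >= rect.x && x < rect.x + rect.width &&
         y >= rect.y && y < rect.y + rect.height;
}

// Usage sketch in Pixi (browser-side, not run here):
//   sprite.hitArea = new PIXI.Rectangle(0, 0, W, H);
```

For an actual blank texture of a given size, `PIXI.RenderTexture.create(W, H)` is probably the intended API, assuming a v4-era Pixi.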
  6. Lightmap texture channel broken?

    Hi! As you can see in my Cornell box scene, something is now wrong with the lightmap texture: https://www.babylonjs-playground.com/#J5E230#11 Everything was working fine the last time I played with it. If, on line 7, you change the mode to ambientTexture instead of lightmapTexture...

        var mode = lightmapModes.AMBIENT;

    ...you will see that the textures are correctly assigned, so I don't think this bug is due to my code (or maybe I'm wrong?)
  7. No matter what I try or how I implement things, I keep getting jittery scroll movement. I was using the <canvas> tag before this, without PixiJS, and there was a lot of jittery movement: just one drawImage call per rAF call would take far more than 16.6 ms. I used the <canvas> for drawing frames, but I also tried the transform CSS property, with and without CSS transitions. Currently I'm using PixiJS with a RenderTexture, and the scrolling still seems somewhat jittery to me, though maybe less so.

    I'm not drawing vector graphics; I'm drawing images (PNG files, actually). When an image has loaded, I add it to a somewhat large RenderTexture (4096x4096). Because the widths of the images don't exceed 1024 pixels, I store the images inside four columns of 1024 by 4096 pixels. I then have a sprite for which I create a Texture (which I recently found out is just a reference to a BaseTexture combined with a certain frame). Each time I scroll, I create a new Texture pointing to the same RenderTexture but using a different frame.

    At a certain point, though, it seems I need two sprites with textures both pointing to the same RenderTexture but with different frames. Let's say the first frame starts at 4000 and I need about 800 pixels from the image (800 could be the window height or screen height): I then need one frame pointing at the last 96 pixels of the first column within the RenderTexture and another frame pointing at the next column, getting the remaining 704 pixels.

    Is this a proper way of handling this, or is there a faster way? Maybe I could make sure the height of the columns within the RenderTexture is divisible by the current window height, though then some column height would go unused (but that would probably be true for all columns, so maybe not such a big deal), and the reordering would need to happen on every resize. I guess a large screen height would not work very well with how I'm handling this now? Any advice would be much appreciated.

    By the way, I'm also using a small easing function which I call via setTimeout when there is movement; it just calculates the current scrolling speed and does not draw anything. The actual drawing currently takes place in the ticker function.
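The frame arithmetic described above can be isolated into a pure function, which also makes the resize question easier to reason about: at most two frames are ever needed as long as the view height never exceeds a column height. All names here are mine, a sketch rather than a PixiJS API:

```javascript
// Given a scroll offset into one logical vertical strip, return the one
// or two frames needed from a RenderTexture whose content is stored in
// columns of columnW x columnH pixels, laid out left to right.
function framesForScroll(offset, viewH, columnW, columnH) {
  const frames = [];
  let remaining = viewH;
  let pos = offset;
  while (remaining > 0) {
    const col = Math.floor(pos / columnH);           // which column
    const y = pos % columnH;                         // y inside that column
    const take = Math.min(columnH - y, remaining);   // pixels until column end
    frames.push({ x: col * columnW, y, width: columnW, height: take });
    pos += take;
    remaining -= take;
  }
  return frames;
}

// The example from the post: offset 4000, 800 visible pixels, 4096-tall
// columns -> 96 px from column 0 plus 704 px from column 1.
const frames = framesForScroll(4000, 800, 1024, 4096);
```

Each returned rectangle could become the `frame` of a Texture over the shared BaseTexture, one sprite per rectangle.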
  8. Hi! I am trying to set 2 different textures _by code_ on a loaded mesh exported from Blender. There is no problem creating a mesh from MeshBuilder and setting its faceUV with options, but with a "loaded" mesh (created in Blender, without any texture) I don't manage to do it. I give this playground as a start (my code is too complicated to put in a playground): https://www.babylonjs-playground.com/index.html#YYQ1LC#39 The goal: set any texture (from the internet) on any face of the loaded Dude.babylon mesh. I tried a few things like VertexBuffer.UVKind, but... that should not be the way to go. More details: my Blender object is a single plane, 6 vertices and 2 faces, and my goal is to dynamically set a different texture on each face. The wireframe shows me the 2 faces, but I have no idea how to set their textures. Thanks for any help!
  9. Loading rotated texture atlas

    I have a texture atlas generated by Texture Packer with the "rotate" option enabled. In order to load & render all the frames correctly, I currently have to use a simple script to pre-process the JSON file, swapping the "w" and "h" properties if a frame is marked "rotated". Is it possible to make a custom loader that loads such a texture atlas and stores the restored frames in the cache?
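The pre-processing step can stay tiny if it is written as a pure function over the Texture Packer JSON (hash format assumed); in a v4-era Pixi it could then be attached as loader middleware via `loader.use(...)` so atlases are fixed up as they load, rather than in a separate script. A sketch:

```javascript
// For every frame marked "rotated", swap the frame's "w" and "h" so
// downstream code sees the unrotated dimensions. Mutates and returns
// the atlas object. (Hash-format Texture Packer JSON assumed.)
function fixRotatedFrames(atlas) {
  for (const name of Object.keys(atlas.frames)) {
    const entry = atlas.frames[name];
    if (entry.rotated) {
      const f = entry.frame;
      [f.w, f.h] = [f.h, f.w]; // swap width and height
    }
  }
  return atlas;
}
```

As middleware the same function would run against `resource.data` for JSON resources before the spritesheet parser sees them; the exact hook depends on the Pixi/resource-loader version in use.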
  10. Hey guys, I want to create a custom RenderTargetTexture of type BABYLON.Engine.TEXTUREFORMAT_LUMINANCE, but it doesn't work: textureFormat is always 5 (probably TEXTUREFORMAT_RGBA), regardless of what I set it to. Here's my code:

        // Render target
        var renderTarget = new BABYLON.RenderTargetTexture(
            "depth",
            128,                                       // res
            scene,
            false,                                     // generateMipMaps
            true,                                      // doNotChangeAspectRatio
            BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT,   // type
            false,                                     // isCube
            BABYLON.Texture.BILINEAR_SAMPLINGMODE,     // samplingMode
            false,                                     // generateDepthBuffer
            false,                                     // generateStencilBuffer
            false);                                    // isMulti
        renderTarget.noMipmap = true;
        renderTarget.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.wrapR = BABYLON.Texture.CLAMP_ADDRESSMODE;
        renderTarget.textureFormat = BABYLON.Engine.TEXTUREFORMAT_LUMINANCE;
        renderTarget.hasAlpha = false;
        renderTarget.gammaSpace = false;

    My wild guess is that the constructor doesn't pass textureFormat to _renderTargetOptions:

        this._renderTargetOptions = {
            generateMipMaps: generateMipMaps,
            type: type,
            samplingMode: samplingMode,
            generateDepthBuffer: generateDepthBuffer,
            generateStencilBuffer: generateStencilBuffer
        };
  11. Particles dispose

    Hi! I have a problem: I create many particle systems in my game and often delete them after they play once, but if I use the same texture in several particle systems, then after the first particle system is disposed the texture is disposed too. So I have to clone a new texture for each one, which is bad for performance. How can I create particle systems that share a texture, without the texture being disposed along with a particle system? Thanks!
  12. Tile Offset

    Hi everyone, I am trying to tile a floor but I would like to create a sort of "offset" in the tiling. I added pictures to best explain what I am trying to do. The first is what I am doing now. The second is what I would like. How can I do this? Thanks!
  13. Hey, I've never used any procedural textures. Where do I place the procedural texture and .fx files to reference? Thanks - I'm sure it's obvious. DB
  14. Hello, would it be possible to somehow have textures mapped onto meshes, with the direction / orientation of these textures dictated by the vertex colour? Assuming I have a piece of furniture:

        X = Red   (the top & the long rails between legs)
        Y = Green (the short rails between legs)
        Z = Blue  (the legs)

    I would then specify how the texture should be applied by using something (which doesn't yet exist) like:

        coordinatesIndex = vertexColor

    I wouldn't have a clue where to start and am happy to pay to have someone do this for me. Are offers of paid work allowed here? If not, I will amend this post. Thank you.
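Nothing like this exists in Babylon.js out of the box, as the post says, but the selection rule itself is simple enough to prototype on the CPU: pick a projection axis per vertex from the dominant vertex-color channel, then emit planar UVs for that axis, triplanar-style. A custom shader or a per-vertex UV rewrite would then consume the result. A sketch in plain JS, all names mine:

```javascript
// Map a vertex color to a projection axis: R -> X, G -> Y, B -> Z.
// Red wins ties, then green, matching the furniture example above.
function axisFromVertexColor(r, g, b) {
  if (r >= g && r >= b) return "x";
  if (g >= b) return "y";
  return "z";
}

// Planar UVs for the chosen axis (object-space positions assumed):
// project the position onto the plane perpendicular to that axis.
function planarUV(axis, x, y, z) {
  if (axis === "x") return [y, z]; // project along X
  if (axis === "y") return [x, z]; // project along Y
  return [x, y];                   // project along Z
}
```

In practice this loop would run over the mesh's color and position vertex buffers and write a fresh UV buffer, or the same two decisions could be made per-fragment in a shader.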
  15. Is it possible to make the environment texture follow the rotation of the camera?
  16. Mask issue with WebGL renderer

    Hi all, I'm relatively new to PixiJS and I'm facing an issue that I don't understand.

    # What I'm trying to accomplish

    Basically I create a sprite based on an image and then display a sort of black overlay over it. The idea is to let the user see through the black overlay thanks to a mask which moves according to the user's mouse movement. The mask texture is created using multiple graphics and filters rendered by a second WebGL renderer.

    # The issue I'm facing

    Sometimes (~50% of the time) when I reload the page online, the mask is invisible. It's as if there isn't any mask applied to the black overlay, whereas locally it always works. Attached to this post you will find two pictures: the first one shows the black overlay with the mask not working; the second one shows it working. I'm using the latest release of PixiJS, and the source code for this example can be found here: https://jsfiddle.net/vhynmdxh/ If someone has already solved this type of issue, I would be happy to discuss it with them :). Thanks.
  17. I'm using Blender exporter 5.5 I have a mesh that has a bunch of materials. I exported to Babylon.js format but all the textures got merged into one texture. I researched further and discovered that I had 5 materials but only 2 "UV Map" entries. I added a new "UV Map" entry for each material and assigned their textures to use that UV map, but it didn't make a difference. Is there something I'm forgetting to do?
  18. I was wondering whether or not it's possible to get the world/3D positions of different textures using the TerrainMaterial. Let's use the TerrainMaterial PG as an example: https://www.babylonjs-playground.com/#E6OZX#7 Now, let's say I want to place meshes wherever the rock texture is used on the ground mesh, programmatically and automatically. I don't seem to find any relevant references or functions in the object, and can't think of a proper way without loading the image separately and somehow map those colors to the ground positions.
  19. Is it possible to morph textures along with the mesh? In my video I'm using blend shapes from one shape to another, but also morphing between textures. Just wondering if this is possible with Babylon.js.
  20. Hi, I'm exploring the picking options of babylon.js. The docs' example shows how to place the image of a shot (a first plane) on a second plane. However, if I shoot near the edge of the mesh, the gunshot texture is partly in the air. Simple example (with white color instead of a shot texture): https://www.babylonjs-playground.com/#YHGBXI#2 This looks like the air is damaged, which is impossible. So I am thinking about replacing the additional plane with an additional texture, which would sit on the existing one. I've found a way to get the picked face ID, the coordinates of the globally picked point, and the coordinates of the picked point on the specific face/texture (lines 26-28). But I can't find a way to place a non-repeatable secondary texture (at the image's original size) at that point. By "non-repeatable" I mean drawing it only once on the mesh, in that specific place, not 100 times all around it. Is that possible? That way, if I shoot near the edge of an object, I'll see only part of the damage image, which is correct.
  21. I am attempting to apply a texture of a diffuse image of the Earth to a sphere and have the scene render with the Earth's north pole aligned with the body/local +Z axis. However, I can't seem to implement this. Below is a playground scenario I put together for the scene. I've attempted to achieve this with the following lines (26-33 of the playground scene):

        earthSurface.diffuseTexture.uAng = 0;
        earthSurface.diffuseTexture.vAng = 0;
        earthSurface.diffuseTexture.wAng = 0;
        earthSurface.diffuseTexture.uScale = 1.0;
        earthSurface.diffuseTexture.vScale = 1.0;
        earthSurface.diffuseTexture.wScale = 1.0;

    http://www.babylonjs-playground.com/#634KR1
  22. basic demo "Video" for HLS source

    Hi, I am trying to draw an HLS video stream with a PIXI video texture on the iOS Safari browser (iOS 11.2), referring to http://pixijs.io/examples/?v=v4.6.2#/basics/video.js , but have not succeeded. (Sorry for my poor English, I am Japanese.) When I set an mp4 video as the source, the demo code worked on iPhone + mobile Safari (OS: 11.2). But when I set the URL of an HLS (m3u8) stream and tapped the play button, the video was not drawn. I tried some changes but did not succeed in playing the HLS stream with a PIXI video texture. Below is my code, a modified part of http://pixijs.io/examples/?v=v4.6.2#/basics/video.js :

        ...
        function onPlayVideo() {
            // Don't need the button anymore
            button.destroy();

            /// modify start
            // mp4
            // (1) mp4 OK : video/audio played (fvlivedemo.gnzo.com is my own server)
            // var texture = PIXI.Texture.fromVideo('http://fvlivedemo.gnzo.com/testVideo.mp4');
            // (2) mp4 OK : video/audio played
            // var texture = PIXI.Texture.fromVideoUrl('http://fvlivedemo.gnzo.com/testVideo.mp4');

            // HLS
            // (3) Does not work : when the play button is pressed, loading the m3u8 does not start.
            //     (http://184.72... is a working m3u8 stream)
            // var texture = PIXI.Texture.fromVideo('http://184.72.239.149/vod/smil:BigBuckBunny.smil/playlist.m3u8');
            // (4) Does not work : when the play button is pressed, loading the m3u8 does not start.
            // var texture = PIXI.Texture.fromVideoUrl('http://184.72.239.149/vod/smil:BigBuckBunny.smil/playlist.m3u8');
            // (5) Does not work : when the play button is pressed, loading the m3u8 starts and audio
            //     plays, but the video is not drawn on the canvas.
            let baseVideoTexture = PIXI.VideoBaseTexture.fromUrl({
                src: 'http://184.72.239.149/vod/smil:BigBuckBunny.smil/playlist.m3u8',
                mime: 'application/vnd.apple.mpegurl'
            });
            var texture = PIXI.Texture.from(baseVideoTexture);
            /// modify end

            // create a new Sprite using the video texture (yes it's that easy)
            var videoSprite = new PIXI.Sprite(texture);
        ...

    Please help/guide me regarding the right way to play an HLS stream with a PIXI video texture (i.e. how to fix the above code). The entire HTML that I modified is attached (pixi_video_hls.html). If more information is needed for an answer, let me know. Thank you in advance.
  23. How to cache Texture or BitmapData

    Hello, can anyone share information on how I can do manual caching of sprites, or speed up rendering in general? If I have a group with lots of small sprites, it seems wise to convert it into a single BitmapData and use it as a texture to prevent recalculations. However, I have no clue how to do that, or what other methods Phaser may offer. Thanks everyone in advance for the help.
  24. Hi, I want to create a PIXI.Texture from a WebGL texture. Unfortunately, I have not managed this yet. My WebGL texture is created in the same WebGL rendering context as Pixi with "app.renderer.gl.createTexture()"; this works. Then I tried to attach the texture to an existing PIXI.Texture:

        sprite.texture.baseTexture._glTextures[app.renderer.CONTEXT_UID].texture = myWebglTexture

    Even without the above line of code, my WebGL texture is inserted into the canvas element and always swaps the texture of the last sprite loaded in Pixi (Change-Texture.mp4). Unfortunately, I cannot swap my WebGL texture with the texture of a specific sprite :-( This approach has unfortunately not worked. Does anyone have an idea? Thank you for your support!
  25. Hi, I'm new to Babylon.js and trying to use it to create geometry that has animated vertices and an animated procedural texture. I'm animating the vertices in the vertex shader. For the procedural texture, I tried to follow the instructions at https://doc.babylonjs.com/how_to/how_to_use_procedural_textures as well as checked the playground example: https://www.babylonjs-playground.com/#24C4KC#17 The problem with the example is that I can't really find a complete implementation with the shader/config.json files. And I have a couple of basic questions as well. When creating a custom procedural texture with an external folder holding the config.json and custom.fragment.fx files, is that the only fragment shader that can be used in the scene, or can a BABYLON.ShaderMaterial be used additionally? I'm having a hard time grasping the concept of a 'fragment shader' procedural texture vs. a fragment shader as the last step of the WebGL pipeline. Thanks.