Siberia

Members
  • Content Count: 11
  • Joined
  • Last visited

About Siberia

  • Rank
    Member


  1. Yeah, it works well 😅 but with some meshes I need to apply specific effects, like waving the sampled texture from the center of the mesh. So in this specific case I just need to map the mesh UV coords to the sampler coords. I'm preparing a demo 😀
  2. It's me again, with a little question. 😅 I'm trying to map my UV coords to the sampler coords in my fragment shader, but I have a problem when I move the mesh or zoom in/out. I tried this:
     - Pass meshDimensions as a uniform to the fragment shader
     - Pass translationMatrix as a varying to the fragment shader
     To map my UV coords to the sampler coords, I'm doing this in the fragment shader: vec2 mappedCoord = (vec3(uv * meshDimensions, 1.0) * translationMatrix).xy / canvasDimensions; But it doesn't work... I know how to do it in a custom shader for filters, but not for a mesh.
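A hedged, CPU-side sketch of the mapping this post is after, in plain JS so the arithmetic is easy to check. The names `meshDimensions`, `translationMatrix`, and `canvasDimensions` mirror the uniforms mentioned above; everything else is illustrative. One detail worth noting: in GLSL, `vec * mat` treats the vector as a row vector (it multiplies by the transpose), so the sketch below uses the matrix-first order instead.

```javascript
// Apply a 3x3 matrix (row-major m[row][col]) to a 2D point with w = 1.
function applyMat3(m, x, y) {
  return {
    x: m[0][0] * x + m[0][1] * y + m[0][2],
    y: m[1][0] * x + m[1][1] * y + m[1][2],
  };
}

// uv in [0,1] -> local mesh pixels -> screen pixels -> [0,1] of the canvas.
function uvToSamplerCoord(uv, meshDimensions, translationMatrix, canvasDimensions) {
  const local = { x: uv.x * meshDimensions.w, y: uv.y * meshDimensions.h };
  const world = applyMat3(translationMatrix, local.x, local.y);
  return { x: world.x / canvasDimensions.w, y: world.y / canvasDimensions.h };
}

// Example: a 100x50 mesh translated to (200, 100) on an 800x600 canvas.
const T = [[1, 0, 200], [0, 1, 100], [0, 0, 1]];
const c = uvToSamplerCoord({ x: 0.5, y: 0.5 }, { w: 100, h: 50 },
                           T, { w: 800, h: 600 });
// c.x = (0.5*100 + 200) / 800 = 0.3125; c.y = (0.5*50 + 100) / 600 ≈ 0.2083
```

Doing this per-fragment also explains the zoom/move breakage: the translation matrix must be re-uploaded whenever the mesh or the stage transform changes.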
  3. The PIXI shaman spoke, we listen. 🙌 It works!! Thanks Ivan!
  4. Ok, so, I think I have a brain lock. This is often the case when I'm working with matrices and projections 😄 Here's the vertex shader:

     precision mediump float;

     attribute vec2 aVertexPosition;
     attribute vec2 aUvs;

     uniform mat3 translationMatrix;
     uniform mat3 projectionMatrix;
     uniform vec2 canvasDimensions;

     varying vec2 vUvs;
     varying vec2 vSamplerUvs;

     void main() {
         vUvs = aUvs;
         vSamplerUvs = ((translationMatrix * vec3(aVertexPosition, 1.0)).xy - (mesh position?)) / (canvasDimensions?);
         gl_Position = vec4((projectionMatrix * translationMatrix * vec3(aVertexPosition, 1.0)).xy, 0.0, 1.0);
     }
  5. Here's an example. The background is a container holding n containers, which all have a zIndex:
     - the background, natural elements, objects, etc.
     - n characters
     - n lines of sight (a PIXI mesh, bound to the characters)
     Basically, I need to get a render texture from the background and pass it to the meshes (n meshes). The meshes are in fact quads, with specific shaders to render the line of sight. I just need to pass a portion of the render texture to each mesh, where the texture will be rendered in grayscale.
  6. Hey, it's working fine! I just need to calculate a matrix and pass it to the vertex shader to position the texture correctly. By the way, is there a method to calculate that matrix automatically based on the properties of the target container? Another point: you have to make two render calls to display the layer on the screen (in addition to generating the cache). I was wondering if it would be better to use the rendered texture in a sprite rather than calling render twice? And thanks again Ivan!
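On the "calculate a matrix automatically" question: PixiJS maintains `container.worldTransform` (a `PIXI.Matrix` with fields `a, b, c, d, tx, ty`), so there is usually no need to compose it by hand. A plain-JS sketch of how such a matrix maps to the column-major mat3 layout a GLSL uniform expects; `toMat3Array` is a hypothetical helper mirroring what `PIXI.Matrix#toArray(true)` produces:

```javascript
// Flatten an affine {a, b, c, d, tx, ty} matrix into a column-major
// 9-element array suitable for a GLSL mat3 uniform.
// Columns are: [a, b, 0], [c, d, 0], [tx, ty, 1].
function toMat3Array(m) {
  return [m.a, m.b, 0, m.c, m.d, 0, m.tx, m.ty, 1];
}

// A stand-in for container.worldTransform of a container
// scaled by 2 and moved to (30, 40):
const world = { a: 2, b: 0, c: 0, d: 2, tx: 30, ty: 40 };
const mat3 = toMat3Array(world);
```

In Pixi itself you would read the real values from the target container after a render pass (the world transform is only up to date once the scene graph has been updated).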
  7. Oh, thanks Ivan! I could steal some code from pixi-layers; we just need the getRenderTexture method. I need:
     - LayerTextureCache (without double-buffer support)
     - LayerTextureCache handling in our own PIXI.Container subclass, especially in the render method
     Did I miss anything?
  8. Hello! I have questions concerning performance and "RenderTexture and/or filter" for a specific case. The context: our canvas is a big container with a lot of layers. Here is the rendering order:
     1. Background image layer (a huge texture)
     2. A tile layer: a container that holds x sprites (furniture, etc.)
     3. A character layer: a container that holds x sprites controlled by players
     4. A lighting layer: a container that holds individual animated light sources AND vision sources for characters (PIXI meshes and custom shaders)
     5. A controls layer that holds x PIXI.Graphics objects
     Some characters have night vision; it wouldn't be a problem if their night vision weren't grayscale. To handle the grayscale, we need to turn the background layer, the tile layer, and the character layer gray inside the field of vision. The options we have retained:
     1. Create a RenderTexture from layers 1/2/3 (only when 1/2/3 have changed), and process the texture in layer 5 in a PIXI.Mesh with a custom fragment and vertex shader.
     2. Create a RenderTexture from layer 1 only (only when 1 has changed), and use filters on individual sprites in 2/3, only when necessary. Often it would be fewer than 15 sprites, but sometimes more than 15.
     Above all, we are looking for the best performance. Option 1 has big advantages, but an acquaintance tells me that the option with the filters would undoubtedly be more efficient. You see, the "probably" is a problem. But it is true that layers 1/2/3 can be particularly heavy, with huge textures and a lot of sprites. Do you have any advice on which option to choose? Thanks.
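The "only when 1/2/3 have changed" part of option 1 is a dirty-flag cache. A minimal sketch of that pattern, independent of PixiJS: the expensive redraw callback (which in Pixi would be something like `renderer.render(layerContainer, { renderTexture })`) only runs when the cache has been invalidated. All names here are illustrative; the callback is a counter so the behaviour is verifiable.

```javascript
// Cache an expensive render result behind a dirty flag.
class CachedTexture {
  constructor(render) {
    this.render = render;   // expensive redraw, e.g. renderer.render(...)
    this.dirty = true;      // start dirty so the first get() renders
    this.texture = null;
  }
  invalidate() { this.dirty = true; }   // call when layers 1/2/3 change
  get() {
    if (this.dirty) {
      this.texture = this.render();
      this.dirty = false;
    }
    return this.texture;
  }
}

// Demo: the render callback runs once until invalidate() is called.
let renders = 0;
const cache = new CachedTexture(() => ({ id: ++renders }));
cache.get();
cache.get();        // still the cached texture; renders === 1
cache.invalidate(); // e.g. a sprite moved in layer 2
cache.get();        // re-renders; renders === 2
```

The trade-off against option 2 is then explicit: option 1 pays one extra full-scene render per invalidation, while filters pay a per-sprite framebuffer pass every frame they are active.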
  9. Thank you, Ivan. By the way, is it possible to import 2D/3D models from Blender and create a Geometry object from them?
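One plausible route from Blender to a mesh geometry, sketched under assumptions: export Wavefront OBJ and parse it into flat position/index buffers, which could then feed something like `PIXI.Geometry#addAttribute` / `#addIndex`. This minimal parser is illustrative only — it handles just `v` records and triangular `f` records (no normals, UVs, quads, or negative indices).

```javascript
// Parse a tiny subset of Wavefront OBJ into geometry buffers.
function parseObj(text) {
  const positions = [];
  const indices = [];
  for (const line of text.split("\n")) {
    const parts = line.trim().split(/\s+/);
    if (parts[0] === "v") {
      // Vertex position: "v x y z"
      positions.push(+parts[1], +parts[2], +parts[3]);
    } else if (parts[0] === "f") {
      // Face: OBJ indices are 1-based and may look like "3/1/2";
      // keep only the vertex index and convert to 0-based.
      for (let i = 1; i <= 3; i++) {
        indices.push(parseInt(parts[i].split("/")[0], 10) - 1);
      }
    }
  }
  return { positions, indices };
}

// A single triangle exported from Blender:
const obj = parseObj("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n");
// obj.positions -> [0,0,0, 1,0,0, 0,1,0]; obj.indices -> [0,1,2]
```

For real assets a maintained loader (glTF, or a full OBJ parser) is the safer choice; the point is only that any format reducible to position + index arrays can become a Geometry.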
  10. Thank you Ivan! I updated the playground by putting the Graphics geometry into the mesh, and I updated the vertex shader with a quick hack for the UVs. It works! But... I can't specify the width and height of the mesh; it is tied to the graphics object. Another thing: I want my graphics object to be hidden (it only serves for primitive drawing), but setting grapher.visible = false makes the mesh not render.
  11. Hello! My objective is to create geometry for meshes. Until now I used quads, so it wasn't too hard to model. Now I want to model more complex 2D shapes: polygons, cones, etc. Since I'm familiar with the tools exposed by the PIXI.Graphics object, I thought to myself, "Hey! I'm going to draw a shape and reuse the geometry contained in the Graphics object." Here is my playground: https://www.pixiplayground.com/#/edit/jGD8JtmKkkbseor6FklBg But after a few tests, it doesn't look so easy. What did I miss? Thanks for your help!
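An alternative to reusing Graphics geometry, assuming the shapes are convex (regular polygons, cones/arcs drawn from a center point): triangulate with a simple fan and build the buffers directly. All names below are illustrative; the output is the flat position array plus index list that mesh geometry APIs generally expect.

```javascript
// Fan-triangulate a convex polygon given as [{x, y}, ...] in winding order.
function fanTriangulate(points) {
  const positions = [];
  const indices = [];
  for (const p of points) positions.push(p.x, p.y);
  for (let i = 1; i + 1 < points.length; i++) {
    indices.push(0, i, i + 1);  // every triangle is rooted at vertex 0
  }
  return { positions, indices };
}

// A unit square becomes two triangles.
const quad = fanTriangulate([
  { x: 0, y: 0 }, { x: 1, y: 0 }, { x: 1, y: 1 }, { x: 0, y: 1 },
]);
// quad.positions -> [0,0, 1,0, 1,1, 0,1]; quad.indices -> [0,1,2, 0,2,3]
```

Concave shapes would need a real triangulator (e.g. an ear-clipping library such as earcut, which Pixi's Graphics uses internally); the fan is only valid when every triangle stays inside the polygon.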