Search the Community

Showing results for tags 'Texture'.



Found 276 results

  1. After wrestling countless hours with a problem (as you do), I came to the conclusion that ShaderMaterial is broken in 3.0. More accurately, I can't get texture lookups to work in a vertex shader, but they do work in Babylon 2.5. Add this line to any vertex shader and it won't compile:

          vec4 test = texture2D(textureSampler, uv);
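     A diagnostic sketch that may be worth running first (an assumption, not a confirmed cause: vertex texture fetch is optional in WebGL 1, and some GPUs expose zero vertex texture units, in which case no engine can make this lookup compile):

          // Probe how many texture units the vertex shader stage actually has
          var gl = canvas.getContext("webgl");
          var units = gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS);
          if (units === 0) {
              console.log("This device cannot sample textures in vertex shaders at all.");
          }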
  2. Hi guys! I found a very useful feature in three.js: the ability to set a texture as the scene background (https://threejs.org/docs/index.html#Reference/Scenes/Scene). Is there something like that in Babylon? Of course we could use two scenes, put the texture on a plane, set up an orthographic camera... but that's very uncomfortable.
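     A minimal sketch of one possible route (assuming a Babylon.js Layer is acceptable; "textures/sky.jpg" is a placeholder path):

          // A Layer with isBackground = true renders behind everything in the scene
          var background = new BABYLON.Layer("bg", "textures/sky.jpg", scene, true);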
  3.      var sprites = {};
          var loader = PIXI.loader
              .add('cloudstars', "imgs/cloudstars.jpg")
              .load(function (loader, resources) { sprites.cloudstars = new PIXI.TilingSprite(resources.cloudstars.texture); })
              .add('star1', "imgs/star1.png")
              .load(function (loader, resources) { sprites.star1 = new PIXI.TilingSprite(resources.star1.texture); })
              .add('star2', "imgs/star2.png")
              .load(function (loader, resources) { sprites.star2 = new PIXI.TilingSprite(resources.star2.texture); })
              .add('star3', "imgs/star3.png")
              .load(function (loader, resources) { sprites.star3 = new PIXI.TilingSprite(resources.star3.texture); })
              .add('star4', "imgs/star4.png")
              .load(function (loader, resources) { sprites.star4 = new PIXI.TilingSprite(resources.star4.texture); })
              .add('ship', "imgs/ship_blue.png")
              .load(function (loader, resources) { sprites.ship = new PIXI.Sprite(resources.ship.texture); })
              .add('shield_straight', "imgs/shield_straight.png")
              .load(function (loader, resources) { sprites.shield_straight = new PIXI.Sprite(resources.shield_straight.texture); })
              .add('shield_edge', "imgs/shield_edge.png")
              .load(function (loader, resources) { sprites.shield_edge = new PIXI.Sprite(resources.shield_edge.texture); })
              .add('title_ship', "imgs/title_ship.png")
              .load(function (loader, resources) { sprites.title_ship = new PIXI.Sprite(resources.title_ship.texture); })
              .once('complete', function () { var ready = setTimeout(accountSetup, 3000); })
              .load();

      This seems to work, but I figure it's not the correct way to do this, as I kinda guessed my way through some of it. Will this cause a problem if I use this method to load all of my sprites into a standard JavaScript array for use later on? Talking about maybe 100 sprites, including some tiling sprites for backgrounds. Also, do I really need to use the "loader" in the "function(loader, resources)" part? I don't seem to use it inside the function.
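     For reference, a sketch of the more conventional pattern (same asset paths assumed): queue everything with add(), call load() once, and build all sprites in a single complete callback. The loader parameter can be dropped if unused:

          PIXI.loader
              .add('cloudstars', 'imgs/cloudstars.jpg')
              .add('ship', 'imgs/ship_blue.png')
              // ...queue the remaining images the same way...
              .load(function (loader, resources) {
                  sprites.cloudstars = new PIXI.TilingSprite(resources.cloudstars.texture);
                  sprites.ship = new PIXI.Sprite(resources.ship.texture);
                  setTimeout(accountSetup, 3000); // everything is loaded at this point
              });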
  4. I have an image with 10 different animations and 10 frames for each. On stage I have, for example, a set of 15 different objects with animation. From time to time I have to change one set of animations to another. One way to do this is to delete the objects and create them once more with the needed sprite animation:

          var moveClip1 = createClip(5);

          createClip: function (symbolIndex) {
              var image = this.game.getImage('images.symbols');
              var base = new PIXI.BaseTexture(image);
              var textures = [];
              for (var j = 0; j < steps; j++) {
                  var tempTexture = new PIXI.Texture(base, new PIXI.Rectangle(
                      this.game.symbolHeight * j,
                      this.game.symbolWidth * symbolIndex,
                      this.game.symbolHeight,
                      this.game.symbolWidth));
                  textures.push(tempTexture);
              }
              var symbolClip = new PIXI.extras.AnimatedSprite(textures);
              return symbolClip;
          }

      Is it possible to change the texture in the existing object moveClip1 without recreating the object?
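     A hedged sketch of swapping in place (in PIXI v4, AnimatedSprite exposes a textures setter; buildTextures is a hypothetical helper returning the frame array built above):

          moveClip1.textures = buildTextures(newSymbolIndex); // hypothetical helper
          moveClip1.gotoAndPlay(0);                           // restart on the new frames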
  5. Hi all! I made a map with buildings and trees. The map uses 15 textures at 512x512. What is better for performance: using 15 textures at 512x512, or combining all the map textures into one big 2048x2048 texture? And what is the maximum texture size Babylon supports? Thanks!
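     The hard ceiling is a GPU limit rather than a Babylon one; a sketch of how to query it at runtime:

          // Babylon exposes the device's maximum texture dimension through the engine caps
          var maxSize = scene.getEngine().getCaps().maxTextureSize;
          console.log('Largest texture this device accepts: ' + maxSize + 'px');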
  6. Hello Pixi developers and users! I would like first to introduce myself as a newcomer on the forum. I'm the developer of the Sprite Basic Compiler game engine (https://spritebasic.com). It is a cloud-based game framework with Basic scripts that uses PixiJS as its WebGL renderer, as well as PhysicsJS as the physics engine and other libraries needed to create games, like the great Kittykatattack libraries. For a demo and a future game I wish to write, I would like to create a TilingSprite for an endless scrolling background, made of an array of textures. What would be the best strategy to concatenate an array of textures, which might be of different dimensions, into a large tiling sprite, either in portrait or landscape? Would creating a large tiling sprite make performance suffer, or would the part of the tiling sprite that is not currently displayed be correctly cropped from screen display/calculations? Many thanks if you can enlighten me on this. Benoit
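     One hedged approach (assuming PIXI v4 and that the source textures are already loaded): bake the array into a single RenderTexture once, then tile that.

          // Bake a horizontal strip of textures into one RenderTexture, then tile it
          function bakeStrip(renderer, textures) {
              var width = 0, height = 0;
              textures.forEach(function (t) {
                  width += t.width;                      // landscape strip; swap axes for portrait
                  height = Math.max(height, t.height);
              });
              var strip = PIXI.RenderTexture.create(width, height);
              var container = new PIXI.Container();
              var x = 0;
              textures.forEach(function (t) {
                  var s = new PIXI.Sprite(t);
                  s.x = x;
                  x += t.width;
                  container.addChild(s);
              });
              renderer.render(container, strip);
              return new PIXI.TilingSprite(strip, renderer.width, renderer.height);
          }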
  7. I am having a really weird issue. I am creating a sprite from an image as follows:

          var grid1_texture = PIXI.Texture.fromImage("whiteSquare.jpg");

      This works fine and I am able to add it to the stage. I basically want to put all my images in a folder called resources, so I changed the code to this:

          var grid1_texture = PIXI.Texture.fromImage("/resources/whiteSquare.jpg");

      When I do this, I can see in the Brackets editor that the link is being made; however, the sprites do not get rendered on the stage. Please help!
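     A hedged guess at the fix: a leading slash resolves against the server root rather than the project folder, so a relative path may be what was intended:

          var grid1_texture = PIXI.Texture.fromImage("resources/whiteSquare.jpg"); // no leading slash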
  8. Hi, I'm having a serious problem with memory usage: sometimes it hits 2.6 GB and never goes lower. I have to load textures that vary from 10 MB to 40 MB for each map, and I'm sure that when I'm switching between the maps these textures are not being removed from memory, so the problem grows every time you switch maps. To test the issue, I created a 186 MB image, then loaded it into PIXI and got a big black texture throwing a lot of WebGL errors, maybe because it's too big? Anyway, I noticed that the RAM consumption grows a lot in the task manager, so I started my attempts to remove it from there, but I could not do it. I tried the following:

          texture.destroy(true);
          texture = null;

      But I got nothing, so I did this (to throw everything away):

          for (key in PIXI.utils.TextureCache) {
              PIXI.utils.TextureCache[key].destroy(true);
          }

      But the memory still remains the same size. So I tried destroy(true) and then ran the GC; the RAM lowered a bit, but I was still able to see the 186 MB texture loaded. What must I do to remove it from there?
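     A hedged cleanup sketch (PIXI v4 assumed): destroy(true) releases the GPU side, but the cache entries themselves also need deleting, or the lingering JS references keep the decoded image alive:

          function purgeTexture(name) {
              var tex = PIXI.utils.TextureCache[name];
              if (tex) {
                  tex.destroy(true);                      // true also destroys the BaseTexture
                  delete PIXI.utils.TextureCache[name];   // drop the cached references
                  delete PIXI.utils.BaseTextureCache[name];
              }
          }

      Any sprites still holding the texture must be destroyed or re-pointed first, and the browser may keep the decoded image around for a while even after that.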
  9. Hi, I'm a noob. I made a texture creator tool using this tutorial (http://phaser.io/examples/v2/create/gen-paint). It's cool, but I don't know how to set an animation on a texture made this way. Could you please let me know how to set an animation on a texture created from an array of strings?
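     A hedged sketch (Phaser 2; frameData is a hypothetical array holding one array-of-strings pixel grid per frame): generate one cached texture per frame, then cycle them on a timer.

          frameData.forEach(function (data, i) {
              game.create.texture('gen' + i, data); // default pixel size, as in gen-paint
          });
          var frame = 0;
          game.time.events.loop(100, function () {  // advance every 100 ms
              frame = (frame + 1) % frameData.length;
              sprite.loadTexture('gen' + frame);
          });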
  10. Hi there, I'm fairly new to Babylon.js, having previously worked mostly on OpenGL and Apple's SceneKit (similar to Babylon) outside the browser. As such, I'm probably overlooking something absolutely basic, so please bear with me if I ask a stupid question. I'm tasked with bringing a 3D visualisation app to the web that until now only runs natively on a workstation. One task that I assumed should be simple involves materials. I can, and did, assign file-based images to materials (diffuse and specular maps), and this works really nicely. But after looking through the extensive (thank you!) documentation, I'm slightly at a loss as to how to achieve the following. The original (non-web) application usually takes a client-provided image (file, usually TIFF or PDF) and then applies a filter to that image to generate the specular map. This generation process can be quite involved, and may include importing other client-provided imagery. I have written some JavaScript code that imports the file to an HTML5 canvas, runs the filter, and then creates the JavaScript image object from the result as follows:

          function convertCanvasToImage(canvas) {
              var image = new Image();
              image.src = canvas.toDataURL("image/png");
              return image;
          }

      So far, so good. Now, for the life of me, I don't know how to get this image into a material property; at least not short of writing it to a file somewhere and then loading it - that would create a ton of issues and I'd like to avoid it. There is brilliant support for file-based images that take a string as the name, e.g.:

          materialMain.diffuseTexture = new BABYLON.Texture("textures/owl90.png", scene);

      But I have found no way to do the same with a JavaScript object of type Image. I have looked at procedural textures that can be based on image files, but their setup appears a bit too inflexible (relying on config files) and too heavy-calibre (animations, shaders) for something I assume to be basic. So, how can I load an image / HTML5 canvas (basically an RGBA raster image) into a Babylon texture? I can't be the first poor sod to try this - what am I overlooking? Thanks for any help, -ch
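     A hedged sketch that skips the Image object entirely: DynamicTexture is Babylon's canvas-backed texture, so the filtered canvas can be drawn straight into it without any file round-trip:

          var dt = new BABYLON.DynamicTexture("specMap",
              { width: canvas.width, height: canvas.height }, scene, false);
          dt.getContext().drawImage(canvas, 0, 0); // copy the filtered canvas in
          dt.update();                             // push the pixels to the GPU
          materialMain.specularTexture = dt;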
  11. I need to put thousands of shapes onto the stage, and currently I am using PIXI.Texture. I draw a PIXI.Graphics object and then convert it into a PIXI.Texture using `RenderTexture.create()`. Then I get a sprite from the texture. But when I draw 3000+ objects on my canvas, the FPS drops from 60 to 30. If I draw 6000+ objects it slows everything down and my laptop's fan is roaring. I found an article on https://stackoverflow.com/questions/23468218/draw-10-000-objects-on-canvas-javascript saying that in pure canvas JavaScript we can put different layers/canvases on the screen and draw, say, 1000 objects per layer. But what is the correct way to do this in PIXI?
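     A hedged sketch (PIXI v4; renderer, graphics, and stage assumed from the existing setup): since every sprite shares one baked texture, a ParticleContainer can batch them far more cheaply than a plain Container:

          // Bake the Graphics once, then spray thousands of sprites from the shared texture
          var shapeTexture = PIXI.RenderTexture.create(32, 32); // size of one shape, assumed
          renderer.render(graphics, shapeTexture);
          var layer = new PIXI.particles.ParticleContainer(10000, { position: true });
          for (var i = 0; i < 6000; i++) {
              var s = new PIXI.Sprite(shapeTexture);
              s.x = Math.random() * 800;
              s.y = Math.random() * 600;
              layer.addChild(s);
          }
          stage.addChild(layer);

      The trade-off: ParticleContainer children skip some features (masks, filters, nested children).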
  12. WebGL supports textures with base64 data URLs, following the format "data:[mediatype];base64,[base64string]". However, when creating a texture with a data URL in Babylon.js:

          var material = new BABYLON.StandardMaterial("0", scene);
          material.diffuseTexture = new BABYLON.Texture(dataUrl, scene, false, false, BABYLON.Texture.CUBIC_MODE);

      the browser console shows the message:

          Uncaught TypeError: Cannot read property 'replace' of null

      Is there something I am doing wrong, or is it a Babylon bug? I attached my full test code. Thanks! index.html
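     A hedged alternative worth trying: Babylon has a dedicated factory for base64 payloads, which sidesteps the URL parsing entirely:

          var tex = BABYLON.Texture.CreateFromBase64String(dataUrl, "myTex", scene);
          material.diffuseTexture = tex;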
  13. Hello everybody! I am trying to make a text plane fade animation, with the text drawn on a texture. I created a dynamic texture, then created a plane. Next I set the texture as textPlane.material.diffuseTexture. The texture has hasAlpha set to true.

          this.textPlaneTexture.drawText(text, null, 50, 'bold 100px Roboto Mono', 'pink', 'transparent');

      So the text is drawn on the screen with a transparent background. Next I created a fade animation like:

          var fading = new BABYLON.Animation.CreateAndStartAnimation('fade' + this.text,
              this.textPlane.material, 'alpha', 30, 30, 1, 0, 0, null, () => {
                  this.textPlane.isVisible = false;
              });

      But here's the problem: while the alpha is changing, the background also goes from black to transparent. Why is that? How do I prevent the background's alpha from changing? Only the text's alpha is expected to change. Here's the playground: http://playground.babylonjs.com/#28LOAX
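     A hedged workaround sketch: animate the mesh's visibility instead of the material's alpha, so the texture's own alpha channel (transparent background, opaque text) is left untouched:

          BABYLON.Animation.CreateAndStartAnimation('fade' + this.text,
              this.textPlane, 'visibility', 30, 30, 1, 0, 0, null, () => {
                  this.textPlane.isVisible = false;
              });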
  14. Is there any way to load textures in PIXI synchronously? Something like:

          var texture = PIXI.Texture.fromImage("bunny.png", onComplete);
          function onComplete() {
              var bunny = new PIXI.Sprite(texture);
          }

      I tried the following:

          var img = new Image();
          img.onload = onComplete;
          img.src = 'bunny.png';
          function onComplete() {
              var texture = new PIXI.Texture(new PIXI.BaseTexture(img));
          }

      But the code above gives me an error. Please let me know how I can make PIXI work synchronously. Thank you in advance. EDIT: I also tried http://jsfiddle.net/8MawM/, but it also gives me an error.
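     A hedged note: browsers decode images asynchronously, so a truly synchronous load isn't possible; the idiomatic PIXI route is the loader's completion callback (stage assumed from the existing setup):

          PIXI.loader.add('bunny', 'bunny.png').load(function (loader, resources) {
              var bunny = new PIXI.Sprite(resources.bunny.texture);
              stage.addChild(bunny); // safe: the texture is fully loaded here
          });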
  15. I am trying to create a fragment shader via a PIXI.AbstractFilter to create a wave rippling effect to be applied to a background texture. I have already worked out the algorithm for the wave effect in JavaScript. What I am having difficulty doing is getting the data I need into the shader through PIXI. For my effect to work, I need a large Float32Array to keep track of wave heights, and a texture containing the original, unaltered contents of the background image to read from in order to apply the effect of pixel displacement (light refraction). I've been doing a lot of searching and have come up with some partial solutions. I attempt to load my large Float32Array into the shader as a texture with type GL.FLOAT (with the OES_texture_float extension) and an internal format of GL.LUMINANCE, and read from it. From what I can tell, my shader isn't receiving my data the way I need it to. Just as a test, I set gl_FragColor to read from my data texture, and instead of the solid black that should have appeared, it rendered a color from either the source texture or the texture of the sprite that the filter is applied to. If I weren't using PIXI, what I would try next is gl.getUniformLocation, but it takes the current program as its first parameter, and I don't know of a way to access that. The basic flow of my shader needs to be: read from array -> calculate displacement based on value -> render the current fragment as the color at x+displacement, y+displacement -> get updated version of array. This is my code in the constructor for my shader:

          ws.Shader = function (tex) {
              // GLSL Fragment Shader for Wave Rendering
              ws.gl = game.renderer.gl;
              ws.flExt = ws.gl.getExtension("OES_texture_float");
              var unis = {
                  dataTex: { type: "sampler2D", value: ws.gl.TEXTURE1 },
                  canvasTex: { type: "sampler2D", value: ws.gl.TEXTURE2 },
                  mapSize: { type: "2f", value: [ws.width + 2, ws.height + 2] },
                  dispFactor: { type: "1f", value: 20.0 },
                  lumFactor: { type: "1f", value: 0.35 }
              };
              var fragSrc = [
                  "precision mediump float;",
                  "varying vec2 vTextureCoord;",
                  "varying vec4 vColor;",
                  "uniform sampler2D uSampler;",
                  "uniform sampler2D dataTex;",
                  "uniform sampler2D canvasTex;",
                  "uniform vec2 mapSize;",
                  "uniform float dispFactor;",
                  "uniform float lumFactor;",
                  "void main(void) {",
                  "vec2 imgSize = vec2(mapSize.x-2.0,mapSize.y-2.0);",
                  "vec2 mapCoord = vec2((vTextureCoord.x*imgSize.x)+1.5,(vTextureCoord.y*imgSize.y)+1.5);",
                  "float wave = texture2D(dataTex, mapCoord).r;",
                  "float displace = wave*dispFactor;",
                  "if (displace < 0.0) {",
                  "displace = displace+1.0;",
                  "}",
                  "vec2 srcCoord = vec2((vTextureCoord.x*imgSize.x)+displace,(vTextureCoord.y*imgSize.y)+displace);",
                  "if (srcCoord.x < 0.0) {",
                  "srcCoord.x = 0.0;",
                  "}",
                  "else if (srcCoord.x > mapSize.x-2.0) {",
                  "srcCoord.x = mapSize.x-2.0;",
                  "}",
                  "if (srcCoord.y < 0.0) {",
                  "srcCoord.y = 0.0;",
                  "}",
                  "else if (srcCoord.y > mapSize.y-2.0) {",
                  "srcCoord.y = mapSize.y-2.0;",
                  "}",
                  /*"srcCoord.x = srcCoord.x/imgSize.x;",
                  "srcCoord.y = srcCoord.y/imgSize.y;",*/
                  "float lum = wave*lumFactor;",
                  "if (lum > 40.0) { lum = 40.0; }",
                  "else if (lum < -40.0) { lum = -40.0; }",
                  "gl_FragColor = texture2D(canvasTex, vec2(0.0,0.0));",
                  "gl_FragColor.r = gl_FragColor.r + lum;",
                  "gl_FragColor.g = gl_FragColor.g + lum;",
                  "gl_FragColor.b = gl_FragColor.b + lum;",
                  "}"];
              ws.shader = new PIXI.AbstractFilter(fragSrc, unis);

              // Send empty wave map to WebGL
              ws.activeWaveMap = new Float32Array((ws.width + 2) * (ws.height + 2));
              ws.dataPointerGL = ws.gl.createTexture();
              ws.gl.activeTexture(ws.gl.TEXTURE1);
              ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
              // Non-Power-of-Two Texture Dimensions
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
              ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width + 2, ws.height + 2, 0,
                  ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

              // Send texture data from canvas to WebGL
              var canvasTex = ws.gl.createTexture();
              ws.gl.activeTexture(ws.gl.TEXTURE2);
              ws.gl.bindTexture(ws.gl.TEXTURE_2D, canvasTex);
              // Non-Power-of-Two Texture Dimensions
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
              ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
              ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.RGBA, ws.gl.RGBA, ws.gl.UNSIGNED_BYTE, tex.imageData);
          }

      I then attempt to update dataTex in the ws object's update loop:

          ws.activeWaveMap.set(ws.outgoingWaveMap);
          // WebGL Update
          ws.gl.activeTexture(ws.gl.TEXTURE1);
          ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
          /* // Non-Power-of-Two Texture Dimensions
          ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
          ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
          ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE); */
          ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width + 2, ws.height + 2, 0,
              ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

      I'm sure that plenty of this isn't right, but I believe I can sort things out once I can actually access my data. Can anyone point me in the right direction? This is central enough to my project that I am willing to discard PIXI altogether if there isn't a way to implement what I am trying to do. Also, I am using PIXI via Phaser, if that makes a difference. Thanks!
  16. Adem: What if I only want to optimize the textures in a scene?
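     A hedged sketch (assuming this refers to Babylon's SceneOptimizer): register only the texture pass, so nothing else in the scene is degraded:

          // Only TextureOptimization is registered: the optimizer may scale
          // textures down (here to at most 512px) and touches nothing else
          var options = new BABYLON.SceneOptimizerOptions();
          options.optimizations.push(new BABYLON.TextureOptimization(0, 512));
          BABYLON.SceneOptimizer.OptimizeAsync(scene, options);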
  17. I want to process images on the client side with JIMP (an awesome library for image processing). I read a PNG file with Jimp and modify it with Jimp, and then I got stuck, because I don't know how to display it in pixi.js. The image read by Jimp has a bitmap property, and when I try to create a texture from it (with PIXI.Texture.from, for example) I get errors. Thanks in advance!
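     A hedged sketch: Jimp's raw .bitmap buffer isn't a source PIXI understands, but Jimp can re-encode to a data URL, which PIXI loads directly (`app` as a PIXI.Application is assumed):

          image.getBase64(Jimp.MIME_PNG, function (err, dataUrl) {
              if (err) throw err;
              var sprite = new PIXI.Sprite(PIXI.Texture.from(dataUrl));
              app.stage.addChild(sprite);
          });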
  18. Hi everyone. I'm developing a 2D MMORPG, and the images use JPG or PNG, but this is slow and needs more memory, so it causes many problems on mobile through overflowed memory. So I'm considering texture compression, but this is not easy, because our game needs to support mobile (iOS, Android):
      - DXT: supported by all desktop devices and some Android devices
      - PVR: supported by all iOS devices and some Android devices
      - ETC1: supported by most Android devices
      So I need an auto-generator. Does anybody know of one? Something like this: https://blog.playcanvas.com/webgl-texture-compression-made-easy/ Thanks, everybody!
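     On the runtime side, a hedged sketch of how the right format is usually picked per device, by probing the WebGL extensions (the per-format asset naming at the end is a hypothetical server convention, not a real API):

          var gl = canvas.getContext('webgl');
          var supports = {
              dxt: !!gl.getExtension('WEBGL_compressed_texture_s3tc'),
              pvr: !!gl.getExtension('WEBGL_compressed_texture_pvrtc'),
              etc1: !!gl.getExtension('WEBGL_compressed_texture_etc1')
          };
          var format = supports.dxt ? 'dxt' : supports.pvr ? 'pvr' :
                       supports.etc1 ? 'etc1' : 'png';
          var atlasUrl = 'atlas-' + format + '.ktx'; // hypothetical naming scheme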
  19. Hello, does anyone have an example of how to stream video into a scene and apply it onto an object as part of a dynamic texture? Also, in a recent post I had discovered that when loading many textures into an array, the last texture is often not loaded, and sometimes more than one texture is not loaded. My solution was to load a small texture that isn't used in the scene, and then all textures except the last "dummy" texture almost always load correctly. Wingnut and others tried to duplicate this in the playground as I did, but were unable to reproduce it. However, I recall using a babylon.js function which waits for all textures to load before continuing the script. I can't locate the scripts where I used this, so if anyone can point me to these functions, I would be grateful. Still, streaming video onto an object as a texture (dynamic texture) is most important - I'll find my LOD scripts on disk when I really put in the effort. It's simply that I need streaming video on an object as soon as I can figure it out, or as soon as one of the geniuses on this forum provides an example. As always, thanks for any help you might provide. Cheers, DB
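     A hedged sketch covering both halves of the question (the video paths are placeholders):

          // VideoTexture streams an HTML <video> element onto whatever mesh uses it
          var videoMat = new BABYLON.StandardMaterial("videoMat", scene);
          videoMat.diffuseTexture = new BABYLON.VideoTexture("video",
              ["textures/stream.mp4", "textures/stream.webm"], scene, true);
          plane.material = videoMat;

          // The "wait until everything is loaded" helper may be this one:
          scene.executeWhenReady(function () {
              // all meshes, materials, and textures are ready here
          });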
  20. Hi all, I would like to replicate what I've seen referred to in some other engines as ambient light; the definition being a light that distributes evenly across the whole scene, i.e. doesn't cast shadows, and no part of the scene looks darker or lighter than any other part. My main reason for this is that some of my 3D content will need to have light maps baked into the texture images (for other, non-Babylon-related reasons) and I don't want the default Babylon lighting to affect this light mapping. I noticed a topic that asked a similar question; however, the answer they got didn't seem to satisfy my requirements. So far, the closest solution I've come up with is:

          for (i = 0; i < newScene.lights.length; i++) {
              newScene.lights[i].setEnabled(false);
          }
          for (i = 0; i < newScene.materials.length; i++) {
              newScene.materials[i].emissiveColor = newScene.materials[i].diffuseColor;
          }

      However, this assumes that all my materials are solid colours, which they won't all be; some of them will have image textures. Any advice would be greatly appreciated. Perhaps a shader effect might help? Thanks!
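     A hedged extension of the same trick to textured materials: route the diffuse map through the emissive channel, so it renders at full, even brightness regardless of lights:

          newScene.materials.forEach(function (mat) {
              if (mat.diffuseTexture) {
                  mat.emissiveTexture = mat.diffuseTexture;
                  mat.diffuseColor = BABYLON.Color3.Black(); // avoid double-lighting
              } else {
                  mat.emissiveColor = mat.diffuseColor;
              }
          });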
  21. Hi, I want to know if it's possible to create a mesh using Blender and then define its properties in the Babylon.js code. For example, I want to create a wall using the cube in Blender and name it WALL1. Then, after loading the .babylon file generated by Blender, I want to define the colour, texture and materials of the wall using the name WALL1. Thanks, Raghavender Mylagary
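     A hedged sketch (file and texture names are placeholders): meshes keep their Blender names in the .babylon file, so they can be looked up and re-styled after loading:

          BABYLON.SceneLoader.Append("scenes/", "house.babylon", scene, function (scene) {
              var wall = scene.getMeshByName("WALL1"); // the name given in Blender
              var mat = new BABYLON.StandardMaterial("wallMat", scene);
              mat.diffuseColor = new BABYLON.Color3(0.8, 0.8, 0.8);
              mat.diffuseTexture = new BABYLON.Texture("textures/brick.jpg", scene);
              wall.material = mat;
          });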
  22. I'm changing the texture of a model at runtime by creating a material with a diffuseTexture and setting it as the material of all meshes related to the model. I have 2 questions about that: 1) mesh.material is null before I set it to the material I created (even when I test it seconds after initialization), and the model's texture is still rendered. This leads me to believe the model's texture/material is set somewhere else, and mesh.material simply overrides the original place. What is the parameter that contains the texture/material of the mesh? I wasn't able to find it when I went over a mesh's properties in the dev tools. 2) The material with diffuseTexture I create at runtime is "shiny" (reflects light more, I guess?). The puppy on the right is the result of the original material before my intervention, as specified in the .babylon file - this is the result I'd like to achieve. The puppy on the left is the result after I set the material of all meshes to the one I create at runtime - the "shiny" one. Here's some relevant info from the .babylon file:

          "materials": [{
              "name": "puppy.puppy_mat", "id": "puppy.puppy_mat",
              "ambient": [1,1,1], "diffuse": [0.8,0.8,0.8],
              "specular": [0,0,0], "emissive": [0,0,0],
              "specularPower": 12, "alpha": 1,
              "backFaceCulling": true, "checkReadyOnlyOnce": false,
              "diffuseTexture": {
                  "name": "puppy_01.jpg", "level": 1, "hasAlpha": 1,
                  "coordinatesMode": 0, "uOffset": 0, "vOffset": 0,
                  "uScale": 1, "vScale": 1, "uAng": 0, "vAng": 0, "wAng": 0,
                  "wrapU": 1, "wrapV": 1, "coordinatesIndex": 0
              }
          }]

      Which parameter is responsible for what I'd like to achieve (making the material not "shiny"), and how/where do I set it at runtime?
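     A hedged sketch: the "shine" is specular reflection, and the .babylon file disables it with "specular":[0,0,0], which corresponds to specularColor on a runtime StandardMaterial:

          var mat = new BABYLON.StandardMaterial("puppyMat", scene);
          mat.diffuseTexture = new BABYLON.Texture("puppy_01.jpg", scene);
          mat.specularColor = new BABYLON.Color3(0, 0, 0); // matches "specular":[0,0,0]
          mesh.material = mat;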
  23. Hi everyone! Is there any way in Babylon.js to combine multiple textures, or multiple materials, on a single mesh? I know that it's possible to apply a diffuseTexture and an ambientTexture on a material at once, but what if I have more than 2 textures? For example, it could be used for maps: having a satellite image as a base texture and then applying layers with ambient pressure, UV index, wind direction, and whatnot.
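     Beyond the standard material's fixed slots, a hedged sketch of blending arbitrary samplers with a custom ShaderMaterial (texture paths are placeholders; more overlay samplers can be added the same way):

          BABYLON.Effect.ShadersStore["layersVertexShader"] =
              "precision highp float;" +
              "attribute vec3 position; attribute vec2 uv;" +
              "uniform mat4 worldViewProjection; varying vec2 vUV;" +
              "void main() { vUV = uv; gl_Position = worldViewProjection * vec4(position, 1.0); }";
          BABYLON.Effect.ShadersStore["layersFragmentShader"] =
              "precision highp float; varying vec2 vUV;" +
              "uniform sampler2D baseTex; uniform sampler2D overlayTex;" +
              "void main() { gl_FragColor = mix(texture2D(baseTex, vUV)," +
              " texture2D(overlayTex, vUV), 0.5); }";
          var mat = new BABYLON.ShaderMaterial("layers", scene, "layers", {
              attributes: ["position", "uv"],
              uniforms: ["worldViewProjection"],
              samplers: ["baseTex", "overlayTex"]
          });
          mat.setTexture("baseTex", new BABYLON.Texture("satellite.jpg", scene));
          mat.setTexture("overlayTex", new BABYLON.Texture("wind.png", scene));
          mesh.material = mat;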
  24. Hi all, I'm wondering what the largest texture size supported by Babylon.js is, for JPG or PNG texture files. As a result of my experiments I think it's 16384x16384 pixels - is that right? I'm trying to make a carousel-like slideshow on a mesh; I made it with the uScale and uOffset method and it's working well, but is there another way or idea for that? Which is the best route? I look forward to your precious thoughts. Thank you all, with my respects.
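     Two hedged notes: the 16384 figure is a GPU limit rather than a Babylon one (queryable via scene.getEngine().getCaps().maxTextureSize), and the uOffset approach may be drivable by the animation system instead of manual updates (slideCount is a hypothetical variable for the number of slides in the strip):

          // Slide one frame of the carousel strip by animating the texture's uOffset
          BABYLON.Animation.CreateAndStartAnimation("slide", mat.diffuseTexture, "uOffset",
              30, 30, 0, 1 / slideCount, BABYLON.Animation.ANIMATIONLOOPMODE_CONSTANT);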
  25. Hi folks! New to the forum, but looking forward to sharing and consuming knowledge and info with you all! I have a question that has bothered me since I started writing shader-based GL code: how to optimize the rendering pipeline in the best way. I've read a ton of GL books and gone through countless tutorials on the subject, but each one just touches on the basics of how to get things working, not on how to actually set up an optimized and clean rendering pipeline for a working graphics engine. The parts that stand out in my case are how to handle textures and shader programs in a good way, and what standard I should follow when it comes to handling these precious resources. The basic question is: how many times in a single render cycle should I be allowed to change 1: the shader program, 2: the texture? To take a real-life example from a game I'm working on at the moment: I first do my general rendering, which uses 2 textures, one for sprites and one for the font sprite. The textures I use are quite big, 2048^2, as I'm working on an HD version of a game. Here I'm pondering perhaps using a 4096^2 texture as well; it looks like most devices can handle these sizes, and I could cram pretty much all the gfx assets I need onto one of those babies. But is it good practice? Do I win anything when it comes to rendering speed, and is the win big enough to justify handling large, complex sprite maps like that? Or can I have 50-100 different texture images that I pick from during a single cycle? The second parameter is the shader program. The same goes here, really: I have a shader for general sprite handling and a shader for the font renderer, but I also have a special shader for post FX that I use for an FBO that becomes the final scene in the pipeline. I think there will be more programs involved here, and I might need to switch between them during a rendering cycle. Is it too much to switch shader programs 10-20 times in a single cycle, or is it within acceptable limits? The engine setup I have today works quite well and I can't really find any problems with rendering speed, but I want to push this a bit: I'm working on a particle engine that I want to use, and that will bring an additional shader into the equation, and probably additional sprite maps that need loading. What I really want to know is whether there's some kind of standard to comply with; it would be great to have some frames to work within when it comes to the rendering pipeline. Looking forward to hearing your input on these questions!