Search the Community

Showing results for tags 'webgl'.



Found 322 results

  1. According to http://webglstats.com/ more than 90% of all devices now support WebGL! Does that mean that it's (finally) time to make 3D games? Has anyone had experience selling/distributing WebGL-based games? And what do you think about 3D web games in general?
  2. Job posting: Studio FOW. Position: Senior Game Software Developer. We are looking for a senior game software developer who can deliver web-based HTML5 game applications. The role includes overseeing the full development life cycle, including identifying the correct technology, architecting the application, security, testing, and deployment. This is not a project management role; you will be solely responsible for implementation. The candidate must have a proven track record in developing web-based game applications using WebGL and JavaScript. We expect the developer to be able to deliver a modular, scalable and testable application. You will also be working to integrate existing pre-rendered video content into the game, which forms an integral part of the player experience. Experience with enterprise-grade applications is necessary so that the design can address high-availability requirements (i.e. load balancing and clustering). Experience with WebSockets and relational databases such as MySQL, PostgreSQL or MS SQL is required. Experience in ECMAScript, NodeJS and node-webkit would be a plus. Knowledge of other programming languages (Python, C++) is also a plus. Please note that this game will be of the adult (18+) variety, and therefore anyone uncomfortable with such content need not apply. A portfolio and references will be required during the interview process.
  3. Hi guys, how can I connect all spheres to each other with tubes, dynamically depending on the number of spheres, and always from all to all like the example below? Could someone send me an example or a tutorial? Please!
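A minimal sketch of one way to do this, assuming Babylon.js (the post does not name an engine, so the API here is an assumption): loop over every pair of spheres and create a tube between their positions.

// Hedged sketch, assuming Babylon.js: connect every sphere to every other
// sphere with a tube (all-to-all), for however many spheres exist.
function connectAllSpheres(spheres, scene) {
    for (var i = 0; i < spheres.length; i++) {
        for (var j = i + 1; j < spheres.length; j++) {
            BABYLON.MeshBuilder.CreateTube("tube_" + i + "_" + j, {
                path: [spheres[i].position, spheres[j].position],
                radius: 0.05
            }, scene);
        }
    }
}

Note that an all-to-all connection of n spheres produces n(n-1)/2 tubes, so this grows quickly with the sphere count.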
  4. Just a new model (no new code this time, only an updated model).
  5. Made anything cool with pixi.js? Post it up here and share it with the world! Whether it's a cool game demo, a fully fledged website or some tripped-out crazy experiment, we would all LOVE to see it! To get the ball rolling, here are some pixi.js projects that exist out on the internets:
     Games:
     http://www.goodboydigital.com/runpixierun/
     http://www.theleisuresociety.co.uk/fightforeveryone/
     http://flashvhtml.com/
     Experiments:
     http://gametest.mobi/pixi/morph/
     http://gametest.mobi/pixi/balls/
     http://www.goodboydigital.com/pixijs/bunnymark/
  6. Hello there! I have created a game with this amazing engine and the Tiled app. I've added a scrolling background image layer to the map in Tiled, but when I change the renderer to me.video.WEBGL or me.video.AUTO in the me.video.init function in the JS file, the background image layer doesn't show up. If I use the canvas renderer, it works fine. I've tested this with three different applications using the latest melonJS and boilerplate. Is this a known issue? It would be great if someone could give me an answer. Thanks!
  7. First release of the full game Perplexus Shadow Open, where everybody can build their own Perplexus, upload it, and then everybody can play it (in any web browser with WebGL), also on Android phones and iPhones. Added physics ("Physics step"), where 10 or 15 works as fast as 50 on a desktop with an NVIDIA 960... You can continue playing from the checkpoints you pass through... https://ajna4taiga.no-ip.org/PerplexusShadowOpen/PerplexusProd.html or the updated https://ajna4taiga.no-ip.org/PerplexusShadowOpen/Home.html I will soon release videos on how to build your own Perplexus like Lego... and videos on how to play each level from start to end.
  8. Hi All, Half a year ago I started doing game development during my lunch breaks! It's a side-scrolling shooter using Crafty.js. It's still work in progress, and the code (and demo) can be found here: https://github.com/matthijsgroen/game-play I also did a presentation about it at work, which can be found here: (at the 10 minute mark the demo begins, which shows the development process in-game!) At the moment I'm still in the process of creating graphics, deciding which resolution it will use, and picking between Canvas or WebGL.
  9. Hey there, I've recently started to dig my way deeper into three.js in order to build my own image-viewer app as my first three.js project. I'm using three.js r83 and both the EffectComposer as well as the Shader/RenderPass from the three.js examples. (View on github) Since I'm familiar with other programming languages I was able to figure out a lot of stuff on my own, but currently I'm struggling with this specific problem: my app should be able to add post-processing effects to the currently viewed image. The post-processing part already works like a charm, but I would like to add more effects, as I want to experiment with some new possibilities for an image viewer. Since I'm obsessed with performance, I came up with some ideas on how to split the post-processing across different EffectComposers, in order to keep the weight (number of shaders to render) on each composer low and therefore its performance high.

What I did: after debugging both the EffectComposer and Shader/RenderPass from the three.js examples, I came up with the idea of rendering to a texture that I can re-use as a uniform in another composer later on. This would let me encapsulate and precompute whole post-processing chains and re-use them in another composer. While I was debugging through the ShaderPass, I found what I think is the key element to get this to work. I won't post the code here as it's accessible via github, but if you look into ShaderPass.js at line 61 you can see the class's render function. The parameter writeBuffer is a WebGLRenderTarget and, afaik, it is used to store what the composer/renderer would usually put out to the screen. I've created two identical composers using the following code:

var txt = testTexture;
var scndRenderer = new THREE.WebGLRenderer({
    canvas: document.getElementById("CanvasTwo"),
    preserveDrawingBuffer: true
});
scndRenderer.setPixelRatio(window.devicePixelRatio);
var containerTwo = $("#ContainerTwo")[0];
scndRenderer.setSize(containerTwo.offsetWidth, containerTwo.offsetHeight);

console.log("Creating Second Composer.");
console.log("Texture used:");
console.log(txt);

var aspect = txt.image.width / txt.image.height;
var fov = 60;
var dist = 450;
// Convert camera fov degrees to radians
fov = 2 * Math.atan((txt.image.width / aspect) / (2 * dist)) * (180 / Math.PI);
var scndCam = new THREE.PerspectiveCamera(fov, aspect, 1, 10000);
scndCam.position.z = dist;

var scndScene = new THREE.Scene();
var scndObj = new THREE.Object3D();
scndScene.add(scndObj);

var scndGeo = new THREE.PlaneGeometry(txt.image.width, txt.image.height);
var scndMat = new THREE.MeshBasicMaterial({ color: 0xFFFFFF, map: txt });
var scndMesh = new THREE.Mesh(scndGeo, scndMat);
scndMesh.position.set(0, 0, 0);
scndObj.add(scndMesh);
scndScene.add(new THREE.AmbientLight(0xFFFFFF));

// Post-processing
scndComposer = new THREE.EffectComposer(scndRenderer);
scndComposer.addPass(new THREE.RenderPass(scndScene, scndCam));

var effect = new THREE.ShaderPass(MyShader);
effect.renderToScreen = false; // set to false in order to use the writeBuffer
scndComposer.addPass(effect);
scndComposer.render();

I then modified three's ShaderPass to access the writeBuffer directly. I added a needsExport property to the ShaderPass and some logic to actually export the writeBuffer's texture:

renderer.render(this.scene, this.camera, writeBuffer, this.clear);
// New code
if (this.needsExport) {
    return writeBuffer.texture;
}

I then simply set needsExport to true for the last pass. After rendering this pass, the texture stored in the writeBuffer is returned to the EffectComposer. I then created another function inside the EffectComposer to just return the writeBuffer.texture, nothing too fancy.

The issue: I'm trying to use the writeBuffer's texture (which should hold the image that would have been rendered to screen if I had set renderToScreen to true) as a uniform in another EffectComposer. As you can see in code block 1, the texture itself isn't resized or anything. The texture has the right dimensions to fit into a uniform for my second composer; however, I'm constantly receiving a black image from the second composer no matter what I do. This is the code I'm using:

function Transition(composerOne, composerTwo) {
    if (typeof composerOne !== "undefined" && typeof composerTwo !== "undefined") {
        var tmp = composerOne.export();

        // Clone the shader's uniforms
        shader = THREE.ColorLookupShader;
        shader.uniforms = THREE.UniformsUtils.clone(shader.uniforms);

        var effect = new THREE.ShaderPass(shader);
        // Add the shader-specific uniforms
        effect.uniforms['tColorCube1'].value = tmp; // set the exported texture as a uniform
        composerTwo.passes[composerTwo.passes.length - 1] = effect; // overwrite the last pass

        var displayEffect = new THREE.ShaderPass(THREE.CopyShader);
        displayEffect.renderToScreen = true; // add the CopyShader as the last pass so the image is displayed with all shaders active
        composerTwo.insertPass(displayEffect, composerTwo.passes.length);
        composerTwo.render();
    }
}

Conclusion: to be completely honest, I don't have a clue what I'm doing wrong. From what I've read, learned while debugging, and figured out so far, I would argue that this is a bug. I would be really glad if someone could prove me wrong or suggest another idea on how to achieve what I'm trying to do. If any more information is needed to answer this question, please let me know! Regards, Michael
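One hedged observation on the setup above: the two composers are driven by two different WebGLRenderers, and WebGL textures cannot be shared across GL contexts, so a writeBuffer texture rendered by scndRenderer would read as black in a composer backed by another renderer. Under the assumption that both composers share a single renderer, the r83-era examples EffectComposer already exposes the last completed pass output without patching ShaderPass, because it swaps its read/write buffers after each pass:

// Hedged sketch, assuming three.js r83 examples and one shared renderer.
composerOne.render();
// After render(), the output of the last (non-screen) pass sits in the
// composer's readBuffer, since the examples' EffectComposer swaps buffers
// after every pass that has needsSwap set.
var resultTexture = composerOne.readBuffer.texture;

// Feed it to the second composer as an ordinary texture uniform.
effect.uniforms['tColorCube1'].value = resultTexture;
composerTwo.render();

This is a sketch, not the poster's export() method; the names effect and tColorCube1 are taken from the code above for illustration.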
  10. I am trying to create a fragment shader via a PIXI.AbstractFilter to create a wave rippling effect to be applied to a background texture. I have already worked out the algorithm for the wave effect in JavaScript. What I am having difficulty doing is getting the data I need into the shader through PIXI. For my effect to work, I need a large Float32Array to keep track of wave heights, and a texture containing the original, unaltered contents of the background image to read from in order to apply the pixel-displacement (light refraction) effect. I've been doing a lot of searching and have come up with some partial solutions. I attempt to load my large Float32Array into the shader as a texture with type GL.FLOAT (via the OES_texture_float extension) and an internal format of GL.LUMINANCE and read from it. From what I can tell, my shader isn't receiving my data the way I need it to. Just as a test, I set gl_FragColor to read from my data texture, and instead of the solid black that should have appeared, it rendered a color from either the source texture or the texture of the sprite that the filter is applied to. If I weren't using PIXI, what I would try next is gl.getUniformLocation, but it takes the current program as its first parameter, and I don't know of a way to access that. The basic flow of my shader needs to be: read from array -> calculate displacement based on value -> render the current fragment as the color at (x+displacement, y+displacement) -> get the updated version of the array. This is the code in the constructor for my shader:

ws.Shader = function(tex) {
    // GLSL fragment shader for wave rendering
    ws.gl = game.renderer.gl;
    ws.flExt = ws.gl.getExtension("OES_texture_float");

    var unis = {
        dataTex:    { type: "sampler2D", value: ws.gl.TEXTURE1 },
        canvasTex:  { type: "sampler2D", value: ws.gl.TEXTURE2 },
        mapSize:    { type: "2f", value: [ws.width + 2, ws.height + 2] },
        dispFactor: { type: "1f", value: 20.0 },
        lumFactor:  { type: "1f", value: 0.35 }
    };

    var fragSrc = [
        "precision mediump float;",
        "varying vec2 vTextureCoord;",
        "varying vec4 vColor;",
        "uniform sampler2D uSampler;",
        "uniform sampler2D dataTex;",
        "uniform sampler2D canvasTex;",
        "uniform vec2 mapSize;",
        "uniform float dispFactor;",
        "uniform float lumFactor;",
        "void main(void) {",
        "    vec2 imgSize = vec2(mapSize.x-2.0, mapSize.y-2.0);",
        "    vec2 mapCoord = vec2((vTextureCoord.x*imgSize.x)+1.5, (vTextureCoord.y*imgSize.y)+1.5);",
        "    float wave = texture2D(dataTex, mapCoord).r;",
        "    float displace = wave*dispFactor;",
        "    if (displace < 0.0) {",
        "        displace = displace+1.0;",
        "    }",
        "    vec2 srcCoord = vec2((vTextureCoord.x*imgSize.x)+displace, (vTextureCoord.y*imgSize.y)+displace);",
        "    if (srcCoord.x < 0.0) {",
        "        srcCoord.x = 0.0;",
        "    }",
        "    else if (srcCoord.x > mapSize.x-2.0) {",
        "        srcCoord.x = mapSize.x-2.0;",
        "    }",
        "    if (srcCoord.y < 0.0) {",
        "        srcCoord.y = 0.0;",
        "    }",
        "    else if (srcCoord.y > mapSize.y-2.0) {",
        "        srcCoord.y = mapSize.y-2.0;",
        "    }",
        /*"    srcCoord.x = srcCoord.x/imgSize.x;",
        "    srcCoord.y = srcCoord.y/imgSize.y;",*/
        "    float lum = wave*lumFactor;",
        "    if (lum > 40.0) { lum = 40.0; }",
        "    else if (lum < -40.0) { lum = -40.0; }",
        "    gl_FragColor = texture2D(canvasTex, vec2(0.0, 0.0));",
        "    gl_FragColor.r = gl_FragColor.r + lum;",
        "    gl_FragColor.g = gl_FragColor.g + lum;",
        "    gl_FragColor.b = gl_FragColor.b + lum;",
        "}"
    ];

    ws.shader = new PIXI.AbstractFilter(fragSrc, unis);

    // Send empty wave map to WebGL
    ws.activeWaveMap = new Float32Array((ws.width + 2) * (ws.height + 2));
    ws.dataPointerGL = ws.gl.createTexture();
    ws.gl.activeTexture(ws.gl.TEXTURE1);
    ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
    // Non-power-of-two texture dimensions
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
    ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width + 2, ws.height + 2, 0,
        ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

    // Send texture data from canvas to WebGL
    var canvasTex = ws.gl.createTexture();
    ws.gl.activeTexture(ws.gl.TEXTURE2);
    ws.gl.bindTexture(ws.gl.TEXTURE_2D, canvasTex);
    // Non-power-of-two texture dimensions
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
    ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.RGBA, ws.gl.RGBA, ws.gl.UNSIGNED_BYTE, tex.imageData);
}

I then attempt to update dataTex in the ws object's update loop:

ws.activeWaveMap.set(ws.outgoingWaveMap);

// WebGL update
ws.gl.activeTexture(ws.gl.TEXTURE1);
ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
/*
// Non-power-of-two texture dimensions
ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
*/
ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width + 2, ws.height + 2, 0,
    ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

I'm sure that plenty of this isn't right, but I believe I can sort things out once I get to the point where I can actually access my data. Can anyone point me in the right direction? This is central enough to my project that I am willing to discard PIXI altogether if there isn't a way to implement what I am trying to do. Also, I am using PIXI via Phaser, if that makes a difference. Thanks!
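One hedged direction on the above: the Pixi filter system bundled with Phaser 2 manages texture binding itself, so a raw gl.TEXTUREn unit id stored in a uniform is unlikely to survive Pixi's own activeTexture/bindTexture calls during the render pass. In that API, sampler2D uniforms are normally given a PIXI.Texture and Pixi binds it when the filter runs; float textures are not supported through that path, so the wave heights may need packing into byte RGBA. A minimal sketch, with all names illustrative:

// Hedged sketch, assuming the Pixi v2-era filter API bundled with Phaser 2:
// sampler2D uniforms take a PIXI.Texture, not a raw GL texture unit id.
var dataCanvas = document.createElement("canvas");
dataCanvas.width = ws.width + 2;
dataCanvas.height = ws.height + 2;
// ...each frame, pack wave heights into the canvas pixels (bytes, not floats)...

var unis = {
    dataTex:   { type: "sampler2D", value: PIXI.Texture.fromCanvas(dataCanvas) },
    canvasTex: { type: "sampler2D", value: PIXI.Texture.fromCanvas(backgroundCanvas) },
    mapSize:   { type: "2f", value: [ws.width + 2, ws.height + 2] }
};

// After redrawing dataCanvas, mark the base texture so Pixi re-uploads it
// (dirty() is the method exposed by the Pixi build Phaser 2 bundles):
unis.dataTex.value.baseTexture.dirty();

This trades float precision for compatibility; whether packed bytes are precise enough for the wave heights is something to verify against the original algorithm.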
  11. This article was originally published on Habrahabr, a popular Russian website for IT professionals. Its topical theme sparked the interest of thousands of readers who left dozens of comments. We are glad to present the translation of this highly intriguing research on the performance of Unity WebGL and Blend4Web, with reported issues taken into account as well as benchmarks updated for the latest builds of both engines. Article in English: https://www.blend4web.com/en/community/article/280/ Here are some of the test results. So, if you are interested in the full article, just follow the link and read it.
  12. Hi all, I notice that I can only apply masks up to two levels deep. If you apply a third-level mask, it removes the one before it. I'm using 4.2.2. Any ideas? Thanks! (See the note after the code.)

var main = new PIXI.Container();
stage.addChild(main);

// works
var mask1 = new PIXI.Graphics();
stage.addChild(mask1);
main.mask = mask1;

var child1 = new PIXI.Container();
main.addChild(child1);

// works
var mask2 = new PIXI.Graphics();
main.addChild(mask1);
child1.mask = mask2;

var child2 = new PIXI.Container();
child1.addChild(child2);

// does not work
var mask3 = new PIXI.Graphics();
child1.addChild(mask3);
child2.mask = mask3;
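A hedged note on the snippet above: at the second level the code calls main.addChild(mask1) a second time, so mask2 itself never enters the display list. This may or may not be related to the third level failing, but it is worth ruling out first:

// Possible fix (hedged): add mask2, not mask1, to the display list.
var mask2 = new PIXI.Graphics();
main.addChild(mask2);   // the original snippet re-added mask1 here
child1.mask = mask2;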
  13. Hi. I'm fairly new to Pixi and I'm trying to do something with multiple renderers. I know I could add multiple canvases instead; however, I need a dedicated WebGL renderer to manipulate the transform and try to produce some trapezoid forms. I also need both renderers to work on the same canvas to avoid creating multiple layers on the document.body. My approach was: 1. Have a main renderer and a main stage. 2. Have a sideRenderer that will be affected by different transforms (using gl.uniformMatrix4fv to change the shape of the whole renderer and achieve different shapes) and a sideStage that will hold any content (in this example, a simple sprite). 3. Make the sideRenderer render to a RenderTexture, which will be the source of a Sprite, which will be added to the main stage. So in theory, anything the side renderer renders to the RenderTexture should appear on the sprite on the main stage; if I somehow modify the side renderer, the transformed output should show up on the RenderTexture, if that makes any sense. I tried this with the example below, and it doesn't work. If I append sideRenderer.view to document.body, it renders as expected, but that's not what I want, as I need it to be part of more complex logic. At some point this made me realize that I may not be able to mix renderers like this (maybe the sideRenderer is still working in the background while the mainRenderer is trying to render an incomplete RenderTexture?), and that one renderer cannot render something for another renderer (sideRenderer to mainRenderer or vice versa), so I would like to know if there is any workaround or any way to override this behavior? Thanks for the help.

var renderer = null;
var sideRenderer = null;
var stage = null;
var sideStage = null;
var WIDTH = 1000;
var HEIGHT = 500;
var rt = new PIXI.RenderTexture(1000, 500);
var spriteRt = new PIXI.Sprite(rt);

init();

function init() {
    var rendererOptions = {
        backgroundColor: 0xffffff,
        transparent: true
    };

    // Create the renderers
    renderer = PIXI.autoDetectRenderer(WIDTH, HEIGHT, rendererOptions);
    sideRenderer = PIXI.autoDetectRenderer(WIDTH, HEIGHT, rendererOptions);

    // Add the canvas to the HTML document
    document.body.appendChild(renderer.view);

    // Create a container object called the `stage`
    stage = new PIXI.Container();
    sideStage = new PIXI.Container();
    stage.addChild(spriteRt);

    var loader = PIXI.loader;
    loader.add('texture', './media/crate.png');
    loader.once('complete', onLoadedAsset);
    loader.load();
}

function onLoadedAsset() {
    var texture = PIXI.Texture.fromFrame('./media/crate.png');
    var sprite = new PIXI.Sprite(texture);
    sideStage.addChild(sprite);
    update();
}

function update() {
    sideRenderer.render(sideStage, rt);
    renderer.render(stage);
    requestAnimationFrame(update);
}
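A hedged note on the above: each PIXI renderer owns its own WebGL context, and WebGL resources (including the render target behind a RenderTexture) cannot be shared between contexts, so a texture filled by sideRenderer will come up empty when sampled by the main renderer. Under that assumption, one workaround is a single renderer doing two passes:

// Hedged sketch, assuming Pixi v4: one renderer (one GL context) renders the
// side content into a RenderTexture, then renders the main stage that shows it.
var renderer = PIXI.autoDetectRenderer(WIDTH, HEIGHT, rendererOptions);
document.body.appendChild(renderer.view);

var stage = new PIXI.Container();
var sideStage = new PIXI.Container();

var rt = PIXI.RenderTexture.create(WIDTH, HEIGHT);
stage.addChild(new PIXI.Sprite(rt));

function update() {
    renderer.render(sideStage, rt); // side pass into the texture
    renderer.render(stage);         // main pass to the screen
    requestAnimationFrame(update);
}
update();

The trapezoid transform would then have to be applied per pass (for example via a projection tweak, a mesh, or a filter on the side pass) rather than per renderer.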
  14. Hi, I'm very glad to greet all of you. I'm new here. I'm not a programmer; I work on the artistic side as a 3D designer. A few months ago I joined a project in which we have a problem, and I hope you can give me some of your time and provide answers that help dispel my doubts. The direct question: is it possible to export an object (OBJ, Three.js JSON, or some other format), load it in code, and apply the cloth function found in physics engines, whether Babylon.js, cannon.js, oimo.js, etc.? I hope you understand my question. Thank you.
  15. Hi everyone, Famobi has been around for more than two years already, but somehow we haven't actively taken part in this wonderful forum during this period. Many of you know us already and have published your wonderful games in our network. So first of all we want to say THANK YOU! Thank you for the fantastic games you create, thanks for making the HTML5 games industry the next big thing and thanks for just being really great people. After all, you and your games are our daily business. And we have lots of fun with them. But even more important, our clients love them. We have spread your games to many portals, companies and brands and given them the attention they deserve. Since your games have been the foundation of our company and its ongoing success, it's only fair that we share with you our current state and upcoming projects. Of course we continue and steadily improve our daily work as a distributor of your games. We place them on all the biggest and best-known portals around the globe. And new portals, big and small, are registering for an account at Famobi every day. Another focus right now is Facebook Instant Games. Shortly we will begin placing games in the Facebook Messenger. So if you have amazing high-score games with quick and easy gameplay, let us know anytime. One can never have enough of those. And in general, please continue to send us your games. There are no restrictions on genre or age. However, we have a few requirements based on the needs of our clients and partners that have proven to be crucial for maximum success. The games must be:
      • Fully responsive. Games must work in portrait and landscape mode.
      • Without text. No text means anyone will understand your game regardless of language.
      • Small. Preferably the file size should not exceed 3 MB.
      • Smooth performance. Even on lower-end devices. We test our games from iPhone 4S and Samsung Galaxy S4 mini upwards.
    These are a few examples of games that fulfill these requirements and that we really love: Solitaire Classic, Street Race Fury, 4 in a Row Classic, Bottle Flip Challenge, Mandala Coloring Book, Backgammon Classic, Kids Color Book 2. But before our post reaches the dimensions of a novel, let's come to an end for now. For all those who didn't know or contact us yet, you can reach us anytime at these addresses: Game submission: please use our submit form right here: https://famobi.com/#contact General questions: info@famobi.com Purchase of games: sales@famobi.com Thanks so much and let's continue to shape the industry! Cheers from the whole team!
  16. function QuantizeFilter() {
    var vertexShader = null;

    // Doesn't actually quantize, just testing.
    var fragmentShader = [
        'precision mediump float;',
        '',
        'varying vec2 vTextureCoord;',
        '',
        'uniform vec4 dimensions;', // this is the variable
        'uniform vec3 palette[3];',
        'uniform sampler2D uSampler;',
        '',
        'void main(void)',
        '{',
        '    gl_FragColor = vec4(0.5, 0.5, 0.5, 1.0);',
        '}'
    ].join('\n');

    var uniforms = {
        dimensions: {
            type: '4fv',
            value: new Float32Array([0, 0, 0, 0])
        },
        palette: {
            type: '3fv',
            value: [
                [255.0, 255.0, 255.0],
                [200.0, 200.0, 200.0],
                [0.0, 0.0, 0.0]
            ]
        }
    };

    PIXI.AbstractFilter.call(this, vertexShader, fragmentShader, uniforms);
}

QuantizeFilter.prototype = Object.create(PIXI.AbstractFilter.prototype);
QuantizeFilter.prototype.constructor = QuantizeFilter;

Object.defineProperties(QuantizeFilter.prototype, {
    palette: {
        get: function() {
            return this.uniforms.palette.value;
        },
        set: function(value) {
            this.uniforms.palette.value = value;
        }
    }
});

This is a custom (test) filter for Pixi.js v4. I'd like the 'uniform vec3 palette[3];' array size to match the size of the 'palette' uniform input, so I could set palette to 256 or so colors and the uniform would size appropriately: 'uniform vec3 palette[256];'. Hypothetically, I've thought of just building the string in JavaScript and prepending it to the fragment shader text, but I don't know of a way to do that. Any help is appreciated, ty.
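Since the post asks how to build the shader string dynamically, here is a minimal hedged sketch. GLSL ES array sizes must be compile-time constants, so the usual approach is to regenerate the source with the palette length baked in before constructing the filter:

// Hedged sketch: bake the palette size into the GLSL source at construction.
function makeFragmentShader(paletteSize) {
    return [
        'precision mediump float;',
        'varying vec2 vTextureCoord;',
        'uniform vec4 dimensions;',
        'uniform vec3 palette[' + paletteSize + '];', // size injected here
        'uniform sampler2D uSampler;',
        'void main(void)',
        '{',
        '    gl_FragColor = vec4(0.5, 0.5, 0.5, 1.0);',
        '}'
    ].join('\n');
}

// Usage: size the declaration to whatever palette is passed in.
var palette = [[255, 255, 255], [200, 200, 200], [0, 0, 0]];
var fragmentShader = makeFragmentShader(palette.length);

One caveat: since the GL program is compiled from this source, changing the palette size later means constructing a new filter, not just assigning a bigger uniform value.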
  17. Update: Good news everyone! The Atomic Game Engine is now Open Source under the permissive MIT license! We made a blog post with the announcement: http://atomicgameengine.com/blog/announcement-2/ The Atomic Game Engine features Windows and Mac editors, 2D & 3D rendering and physics, Tiled and Spriter support, JavaScript/TypeScript/C# scripting, full Editor and Player source code hosted on GitHub, and deploys natively to Windows, OSX, Android, iOS, and HTML5.
      • Atomic Game Engine 2016 Feature Reel
      • Atomic Editor Build Settings
      • Atomic on Mobile
      • Atomic Examples
      • 3D WebView Scene Example
      • Google Maps in the UIWebView
      • Multi-tab browser example
      • New code editor with syntax coloring and autocomplete for JavaScript and TypeScript
    Cheers! Josh, Technical Director, Co-founder, www.AtomicGameEngine.com
  18. Some time ago, we launched what turned out to be a really popular browser game: TANX. It's an online tank battle game and it's designed to be all about instant mayhem and fun. But we always felt as though it wasn't pushing WebGL hard enough. So we've spent the last few months revamping it. Here's the result: It's now using the PBR (physically based rendering) support in PlayCanvas. The level, tanks and power ups have all been rebuilt from scratch. So, it's the same great gameplay but with fancy new graphics. Read more about it here. And if you want to play, head to: https://tanx.io Please send us your feedback and suggestions. Want to help us out? We'd really appreciate a retweet: https://twitter.com/playcanvas/status/798871630323843072 See you on the battlefield.
  19. The above screenshot is from one of Deepnight's games, Delicious Cortex. I've attached it to show the effect I'm trying to achieve. I've figured out how to properly scale my game to 3x with crisp rendering, and how to apply a filter to the whole game world. I've hit a snag with the filter step, though; Sebastian applies a 3px by 3px mosaic grid in overlay mode to a game that's scaled 3x. When I apply a shader to the game world it's being rendered before it's scaled (as expected), but the effect I'm going for would require access to the screen post-scaling. Is this possible? I've successfully re-created the effect in my js13k entry Super Glitch Box, but that was a canvas-rendered game, not Phaser. I would like to avoid simply scaling up all the assets in-game and leaving the game scale at 1.0, to preserve the appearance of pixel-perfect movement.
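A hedged sketch of one possible workaround (all names illustrative, not from the post): keep the world at 1x, draw it into a RenderTexture each frame, and display that texture through a 3x-scaled sprite that carries the filter. Filters in the Pixi build Phaser 2 uses process an object's on-screen bounds, so the mosaic would then see post-scale pixels:

// Hedged sketch, assuming Phaser 2.x; 'mosaicFilter' stands in for the
// poster's own 3x3 overlay shader, and 'gameLayer' holds all game objects.
var gameLayer = game.make.group(null);  // detached group, not on the display list
// ...add all game objects to gameLayer instead of the world...

var worldTexture = new Phaser.RenderTexture(game, game.width / 3, game.height / 3);
var screenSprite = game.add.sprite(0, 0, worldTexture);
screenSprite.scale.set(3);
screenSprite.filters = [mosaicFilter];  // filter runs at screen resolution

function update() {
    worldTexture.renderXY(gameLayer, 0, 0, true); // capture the 1x frame
}

Whether this keeps the pixel-perfect movement intact would need testing; the render-to-texture step is the part doing the scaling here, not game.scale.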
  20. Hi guys, We developed a game last year with Phaser 2.3.0 and WebGL, and it worked fine on desktop browsers and mobile devices. But since the last update of Chrome for Android (v53.0.2785.97), the screen flickers until it becomes totally black, without any error or warning. The issue is present on mobile only; it worked on Chrome for Android v52, but not v53. We tried upgrading to Phaser 2.6.2: same issue on Chrome for Android v53. If we use Phaser.CANVAS instead of WebGL, it works fine, but we prefer WebGL for performance. Has anyone encountered the same problem? Thanks for your help. Joe K.
  21. Hi everyone! I have one question... Do you like tanks?! Recently we released the beta version of our small multiplayer game called DatTank! http://dattank.com And you are right, it's all about tanks. Come and check it out! Tell us what you think about it.
  22. http://data.cyberlympics.com/html/game.html?param=Devader_1 Controls: WASD for movement, mouse to target and shoot, E/right click releases a large rocket at the target, Q/middle mouse button places turrets. Aim of the game: don't die, and kill those pesky spiders that are eating the hexas. If you die, or all the hexas are gone, you lose. The game was created for a small fun contest with the theme "mining and herds"; this was my take on it. I'm still missing a start/end screen with a high-score list. Not quite sure if I should try to use the Facebook API for this.
  23. My friend and I have been developing a 3D multiplayer eat-em-up game called Zor.bio. It's a WebGL game written in three.js. We would love any feedback you could give us. Game info: Game: http://zor.bio Multiplayer eat-em-up: eat food to get bigger and eat other spheres to become the biggest sphere. Currently in alpha development. Looking for feedback on:
      • Abilities: we have a lot of ideas about what kinds of abilities to add, but knowing which are the right ones is hard.
      • Multiplayer performance, specifically dealing with lag over WebSockets.
      • How to improve jitter: we are using lerp to smooth out the sphere positions, but sometimes other clients lag and jump around, and we're trying to figure out why (a sketch of the technique follows below).
      • Any other feedback.
    Thank you so much for taking a look!
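For readers unfamiliar with the smoothing mentioned in the jitter point, a minimal hedged sketch of the technique (names illustrative, not Zor.bio's actual code):

// Lerp each sphere toward its last known server position every frame.
// 'alpha' in (0, 1] trades responsiveness against smoothness.
function lerp(a, b, t) {
    return a + (b - a) * t;
}

function smoothTowardServer(sphere, serverPos, alpha) {
    sphere.position.x = lerp(sphere.position.x, serverPos.x, alpha);
    sphere.position.y = lerp(sphere.position.y, serverPos.y, alpha);
    sphere.position.z = lerp(sphere.position.z, serverPos.z, alpha);
}
// (three.js users can also call THREE.Vector3's built-in lerp(target, alpha).)

Jumping usually appears when server updates arrive late and the target position itself leaps; interpolating against a slightly delayed buffer of snapshots, rather than the freshest one, is the common refinement.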
  24. I am working for a wearable computing and augmented reality startup in Bremen, Germany: http://www.ubimax.de For improvements to our (PIXI.js-powered) web editor that configures our augmented reality solutions, we are looking for a Web Application Developer (m/f) to join our team in Bremen (a job permit for the EU is required). It says full-time in the job description, but students looking for an internship are also very welcome! We are a team of people from all over the globe, so everyone on our team speaks English fluently, but German is a big plus. The job description (in German) is attached to this post. Please apply with your full resume, including school and other certificates as well as code examples (e.g. GitHub links) and references, to career@ubimax.de. Feel free to ask me for further details on the job. 162810_Ubimax_Stellenauschreibung_WebApplicationDeveloper.pdf