Showing results for tags 'polygons'.

Found 12 results

  1. Hi all, I want to render a very large number of polygons. Currently, broadly speaking, I am drawing each polygon in a manner similar to this:

         var shape = new PIXI.Polygon(poly[0].map(function (point) {
             var proj = project([point[1], point[0]]);
             return new PIXI.Point(proj.x, proj.y);
         }));
         mycontainer.beginFill(color, alpha);
         mycontainer.drawShape(shape);

     where mycontainer = new PIXI.Graphics(); is the object that holds all my polygons. These polygons come in different irregular shapes and sizes, i.e. they do not share the same rectangular shape. My browser seems happy with something like 100,000 of these polygons, but 200,000 polygons slow it down: panning, zooming, etc. come with noticeable lag. Is there something like the PIXI.particles.ParticleContainer object that I could attach my polygons to (or some other approach I might be missing)? To my understanding I cannot use the ParticleContainer here because my polygons have different (and to some extent random) shapes. Any ideas highly appreciated. Regards
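     One mitigation sometimes suggested for this situation (a sketch, not from the thread, and the helper name is hypothetical): split the polygons across several Graphics objects instead of one, so each batch can be culled or redrawn independently. The batching logic itself is plain JavaScript:

     ```javascript
     // Split a large polygon list into fixed-size batches; each batch would
     // then be drawn into its own PIXI.Graphics object so that a redraw or
     // an off-screen cull only touches one batch, not all 200,000 polygons.
     function chunkPolygons(polygons, batchSize) {
       var batches = [];
       for (var i = 0; i < polygons.length; i += batchSize) {
         batches.push(polygons.slice(i, i + batchSize));
       }
       return batches;
     }

     // Example: 10 polygons in batches of 4 -> 3 batches of sizes 4, 4, 2.
     var polys = [];
     for (var n = 0; n < 10; n++) polys.push([n, n]);
     var batches = chunkPolygons(polys, 4);
     ```

     Each batch would get its own `new PIXI.Graphics()` added to the container; whether that is enough at 200,000 polygons depends on the fill rate and geometry upload costs, so treat it as a starting point rather than a guaranteed fix.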
  2. Hello, I found a bug in the Matter physics engine's collision detector; I'm working with polygonal colliders. When an object collides with another and passes through it, the "collision end" event doesn't trigger until the object is outside the real shape of the sprite (the image size). I'm attaching a small video in which I'm logging the collision between the small vertex of one triangle and the body of the other triangle; you can see how "collision end" triggers when the vertex enters and leaves from the same side, but not when it goes through (until it leaves the image size). Hope this helps fix it! And if anyone knows a workaround, it would really help me. Regards. Edit: here's a small c9 demo (event trigger notification on console). https://ide.c9.io/mmolina01/phaser-matter-demo https://phaser-matter-demo-mmolina01.c9users.io/ collisions.mov
  3. Hi all, I have a problem with Phaser and Box2D physics. I created a game where I load sprites and apply a polygonal body to them through a series of coordinates. When I resize the sprite, I send the coordinates to a function that recalculates them based on the scaling factor. The coordinate array returned to me is correct, but when I redraw the body I get the effect shown in the attached image: a square is always drawn in the same position and with the same dimensions. The coordinates of the square are NOT present in my array. I realized that this occurs when there are very close coordinates, because if the scaling factor isn't so small the problem doesn't occur. Can someone help me? Thanks in advance, Vittorio
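     If the stray square really is triggered by near-coincident vertices after downscaling, one workaround (a sketch under that assumption; the epsilon value is arbitrary and the function name is hypothetical) is to drop vertices that collapse onto their neighbour once scaled, before handing the polygon to the physics body:

     ```javascript
     // Scale polygon vertices by a factor and remove points that end up
     // closer than `eps` to the previously kept point; nearly coincident
     // vertices can confuse polygon decomposition in physics engines.
     function scaleAndDedupe(points, factor, eps) {
       var out = [];
       for (var i = 0; i < points.length; i++) {
         var p = { x: points[i].x * factor, y: points[i].y * factor };
         var prev = out[out.length - 1];
         if (!prev || Math.hypot(p.x - prev.x, p.y - prev.y) > eps) {
           out.push(p);
         }
       }
       return out;
     }

     // At factor 0.1 the second and third points land ~0.04 apart and merge.
     var scaled = scaleAndDedupe(
       [{ x: 0, y: 0 }, { x: 10, y: 0 }, { x: 10.4, y: 0 }, { x: 10, y: 20 }],
       0.1,
       0.05
     );
     ```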
  4. Hello to all of you. First of all, let me explain what I do. I build a location by passing vertices and the location's name (PolygonMeshBuilder for the location, plus a material for it). Then I add this specific location to an array, remove it from the scene at the end, and shortly afterwards merge it. With these I have no problems whatsoever. However, I do have horrendous performance issues with the location names. There I use the following steps: new DynamicTexture → drawText on it → new StandardMaterial whose diffuseTexture and opacityTexture take the previously created DynamicTexture. Shortly afterwards I apply it to a new mesh built with PolygonMeshBuilder. I was hoping to mitigate this too by merging meshes, but I was wrong: the same text is replicated all over again on various axes. Where is my probable mistake? Thanks
  5. Hi, I'm new to Pixi.js and the forums. I'm trying to figure out how to drag complex shapes made of polygons. I used PhysicsEditor to create the polygons. I know a hitbox can be assigned to a sprite, but that seems to only work for a single simple polygon. So my next approach was to create a bunch of Graphics shapes from the polygons and add them to a container. This works, but I don't know if it's the proper way to do it. Here is the code:

         let polygons = new Polygons().getPolygons();
         let container = new PIXI.Container();
         polygons.forEach((data: any) => {
             var graphic = new PIXI.Graphics();
             graphic.beginFill(0x00dd00, 1);
             graphic.drawPolygon(data.shape);
             graphic.endFill();
             graphic.scale.x = scale;
             graphic.scale.y = scale;
             graphic.interactive = true;
             graphic.buttonMode = true;
             graphic.alpha = 0;
             container.addChild(graphic);
             graphic
                 .on('pointerdown', this.onDragStart)
                 .on('pointerup', this.onDragEnd)
                 .on('pointerupoutside', this.onDragEnd)
                 .on('pointermove', this.onDragMove);
         });
         container.addChild(sprite);
         container.x = event.data.global.x - container.width / 2;
         container.y = event.data.global.y - container.height / 2;
         this.app.stage.addChild(container);

     My mouse-up creates the sprite by clicking on the screen. This is my drag function:

         onDragMove = (event: any): void => {
             if (this.dragging) {
                 console.log(event.currentTarget);
                 event.currentTarget.parent.alpha = 0.5;
                 let newPosition = event.data.global;
                 let parent = event.currentTarget.parent;
                 parent.x = newPosition.x - parent.width / 2;
                 parent.y = newPosition.y - parent.height / 2;
                 this.wasDragging = true;
             }
         }

     The weird thing with this code is that all my sprites jump on top of each other while dragging. If I use this exact same code in my onDragEnd function and comment out onDragMove, it works as expected, but obviously I can't see the drag happening. So my questions are: is this the proper way to do this? If it is, why is the drag function not working? Is there a better way to do this?
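     A common cause of objects "jumping" during a drag is recentering the container on the pointer every move instead of preserving the point where the user grabbed it. The usual fix (a sketch; the function names are hypothetical, and wiring them into the pointerdown/pointermove handlers is left out) is to record the pointer-to-container offset once on pointerdown and reuse it on every move. The offset math is plain arithmetic:

     ```javascript
     // On pointerdown: remember where inside the container the user grabbed it.
     function dragStartOffset(pointer, container) {
       return { dx: pointer.x - container.x, dy: pointer.y - container.y };
     }

     // On pointermove: keep that grab point under the pointer, rather than
     // recentering the container (which makes shapes snap on top of each other).
     function dragMovePosition(pointer, offset) {
       return { x: pointer.x - offset.dx, y: pointer.y - offset.dy };
     }

     var offset = dragStartOffset({ x: 120, y: 80 }, { x: 100, y: 50 });
     var pos = dragMovePosition({ x: 150, y: 90 }, offset);
     // The container moves by the same delta as the pointer: pos is { x: 130, y: 60 }.
     ```

     This also sidesteps a subtlety in the original code: `parent.width` can change while children move or fade, so centering on it mid-drag shifts the container unpredictably.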
  6. Hi guys! How can I remove polygons from my model, or remove some vertices or edges? In short, how do I cut off part of a model? Or do Babylon and WebGL not have such a capability? Thanks!
  7. I have a problem. In Blender I have a model with smoothed polygons, but when I export the model to Babylon, some polygons do not look smoothed, on the hood and the neck. In Blender everything looks right; in Babylon some polygons do not look smoothed.
  8. Hi, I have some meshes created with PolygonMeshBuilder. They're not overlapping, but their bounding boxes are, as you can see in the attached screenshot. I want to achieve pixel-perfect picking for these. Using scene.pick without fastCheck for some reason still picks those meshes by bounding box (in the attached screenshot, the green mesh is picked instead of the purple one). Any ideas? Thanks
  9. Hello! I'm having trouble figuring out how to map specific parts of a larger image to specific polygons forming a complex shape created through the PolygonMeshBuilder. After hours of researching the problem and finding only similar answers that either don't apply, or that I can't figure out how to apply, to my specific situation, I'm asking directly for help. I was given code that generates a map of the US, where each state is a separate object generated from an XML file using the PolygonMeshBuilder class. The assignment is to take a provided image displaying a map of the entire US and have each state object pull its specific portion of the map image as its texture.

         $.each(polys, function (index, value) {
             var pts = $(this).find('outerBoundaryIs').find('LinearRing')
                 .find('coordinates').text().replace(/,0 /g, " ").replace(/,/g, " ");
             var groundMat = new BABYLON.StandardMaterial("red", scene);
             groundMat.diffuseTexture = new BABYLON.Texture("usa-physical-map2.jpg", scene);
             groundMat.specularColor = BABYLON.Color3.Black();
             var ground = new BABYLON.PolygonMeshBuilder(stName + "_" + index,
                 BABYLON.Polygon.Parse(pts), scene).build();
             ground.parent = state;
             ground.material = groundMat;
             //ground.material = new BABYLON.StandardMaterial("red", scene);
             //ground.material.diffuseTexture = new BABYLON.Texture("usa-physical-map2.jpg", scene);
             var ptCoords = ground.getVerticesData(BABYLON.VertexBuffer.PositionKind);
             console.log("ptCoords: " + ptCoords);
             var tCoords = ground.getVerticesData(BABYLON.VertexBuffer.UVKind);
             console.log("tCoords: " + tCoords);

     This is the relevant code that generates the ground of each state, which is what is drawn to the scene. I've added some bits in an effort to complete the task, but so far all I can achieve is displaying the whole map (or as much of it as fits) on each state. I am able to pull the coordinates of where each map cutout should be, but I cannot figure out how to take that information and actually apply it to a polygon in the object. Thank you in advance for your assistance.
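     One way to make each state sample only its region of the shared image (a sketch, not the assignment's intended answer; it assumes the image spans the bounding box of all states together, and that PolygonMeshBuilder vertices lie in the x/z plane) is to derive each vertex's UV from its position normalized against that overall bounding box:

     ```javascript
     // Map packed vertex positions [x, y, z, x, y, z, ...] onto UVs in [0, 1]
     // relative to the bounding box of the whole map, so each state's mesh
     // samples its own slice of the shared texture.
     function uvsFromPositions(positions, bounds) {
       var uvs = [];
       for (var i = 0; i < positions.length; i += 3) {
         var x = positions[i];
         var z = positions[i + 2];
         uvs.push((x - bounds.minX) / (bounds.maxX - bounds.minX));
         uvs.push((z - bounds.minZ) / (bounds.maxZ - bounds.minZ));
       }
       return uvs;
     }

     // A vertex at the centre of a 0..100 by 0..50 map gets UV (0.5, 0.5).
     var uvs = uvsFromPositions([50, 0, 25], { minX: 0, maxX: 100, minZ: 0, maxZ: 50 });
     ```

     The result could then be written back with `ground.setVerticesData(BABYLON.VertexBuffer.UVKind, uvs)`; whether the V axis needs flipping depends on how the source image is oriented, so check one state visually first.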
  10. Hello all, hope you are all well and settling into 2015. I've just started working with Pixi. I have a few questions before I start, to get an understanding of things:
      - What is the difference between the use of PIXI.Graphics and PIXI.Polygon? Can both be used to render to the screen? (.Graphics using canvas-like lineTo and moveTo methods, whereas .Polygon takes vertices?)
      - Is it necessary to have a DisplayObjectContainer in order to render interactive graphics? What is the purpose / benefit of it?
      - Does Box2D play nicely with Pixi for collision tests on complex 2D polygons?
      Thanks in advance! Jordan
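      For context on the first question: PIXI.Polygon is a plain geometry object (a vertex list plus a contains(x, y) hit test), while Graphics does the actual drawing. The kind of even-odd point-in-polygon test that such a contains method implements can be sketched in plain JavaScript (this is an illustration of the algorithm, not Pixi's exact source):

      ```javascript
      // Even-odd rule: cast a horizontal ray from (x, y) and count how many
      // polygon edges it crosses; an odd count means the point is inside.
      function pointInPolygon(points, x, y) {
        var inside = false;
        for (var i = 0, j = points.length - 1; i < points.length; j = i++) {
          var xi = points[i].x, yi = points[i].y;
          var xj = points[j].x, yj = points[j].y;
          var crosses = (yi > y) !== (yj > y) &&
            x < ((xj - xi) * (y - yi)) / (yj - yi) + xi;
          if (crosses) inside = !inside;
        }
        return inside;
      }

      var square = [{ x: 0, y: 0 }, { x: 10, y: 0 }, { x: 10, y: 10 }, { x: 0, y: 10 }];
      // (5, 5) is inside the square; (15, 5) is not.
      ```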
  11. Hi! I'm David, co-founder and front-end developer at Arpic Games, a startup in Valencia (Spain). We are making Starriser, a real-time space-strategy MMO. We are talking hundreds of thousands of players in real time in a giant world, steering huge fleets of ships, managing planets, waging wars and trading with other players. A snapshot of Starriser's galaxy (a small one): https://www.dropbox.com/s/sqro8ylbdkzqgfk/Starriser%20Galaxy.png?dl=0 Basically, the map is a graph. Starriser currently uses Canvas 2D natively. I've been researching graphics libraries, and Pixi seems to be the best. The only problem is the primitives drawing system. Pixi builds polygons from lines to get nice corners, and that computation is slow. Well, not slow in general, but slow if you have 5,000 lines and need to change their thickness every time the user zooms, and even more so if you smooth the zoom. Basically, I need to change the thickness of 5,000 lines without hurting performance; recomputing every line's polygons on every frame of a zoom animation is too much. A possible solution I thought of is to use native WebGL primitives. They don't have nice corners, but I don't need them. Is there any way to use native WebGL lines in Pixi? Finally, in my research I've noticed that Pixi is a great, solid library. Great work! Thanks! David.
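      Two notes on this (both hedged, not from the thread): Pixi has historically exposed a nativeLines flag on Graphics that draws with GL line primitives and skips corner tessellation, though GL lines are typically limited to 1px width on many platforms, so check support for the widths you need. Separately, if the goal is constant on-screen thickness, a common trick is to draw the lines once and counter-scale the stroke width against the zoom instead of retessellating:

      ```javascript
      // Given the current zoom factor, return the world-space stroke width
      // that renders `screenWidth` pixels wide on screen after scaling.
      function strokeWidthForZoom(screenWidth, zoom) {
        return screenWidth / zoom;
      }

      // At 2x zoom, a line meant to appear 2px wide is drawn 1 unit wide.
      var w = strokeWidthForZoom(2, 2);
      ```

      With this scheme only one number per redraw changes, rather than 5,000 line polygons per frame; whether a full redraw per zoom step is cheap enough still has to be measured.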
  12. Hi, I have a sprite and a corresponding JSON physics file with collision polygons. I know how to scale a sprite, but how can I scale the polygons at the same time? I read http://www.html5gamedevs.com/topic/4795-it-is-possible-to-scale-the-polygon-with-p2-physics/ but that was for polygons created in code. Thx
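      One straightforward approach (a sketch; it assumes the JSON stores each fixture as a flat [x0, y0, x1, y1, ...] array, as PhysicsEditor-style exports commonly do) is to multiply every coordinate by the sprite's scale before handing the shape to the physics body:

      ```javascript
      // Scale a flat [x0, y0, x1, y1, ...] polygon array by (sx, sy):
      // even indices are x coordinates, odd indices are y coordinates.
      function scaleFlatPolygon(flat, sx, sy) {
        return flat.map(function (v, i) {
          return i % 2 === 0 ? v * sx : v * sy;
        });
      }

      var scaled = scaleFlatPolygon([0, 0, 100, 0, 100, 50], 0.5, 0.5);
      // -> [0, 0, 50, 0, 50, 25]
      ```

      The scaled array then replaces the original when (re)building the body, so the collision shape tracks the sprite's visual size.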