Search the Community

Showing results for tags 'vertexdata'.



Found 8 results

  1. I am getting the error "Cannot merge vertex data that do not have the same set of attributes." when I try to merge a BABYLON.Mesh created from VertexData with ones created with MeshBuilder. Which attributes are different, and which am I missing? https://playground.babylonjs.com/#AGL702 (line 26) Thanks
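On the merge error above: in Babylon you can list which attribute buffers each mesh actually carries with `mesh.getVerticesDataKinds()`. MeshBuilder meshes typically carry position, normal, and uv buffers, while a hand-built VertexData mesh often has only positions and indices, which makes the merge fail. A minimal plain-JS sketch of that comparison (the object shapes here are illustrative, not Babylon's API):

```javascript
// Sketch (plain JS, shapes hypothetical): find attribute kinds that one
// "mesh" has and the other lacks. In real Babylon code you would compare
// the arrays returned by meshA.getVerticesDataKinds() and
// meshB.getVerticesDataKinds() instead.
function missingKinds(meshA, meshB, kinds) {
  // kinds: attribute names to check, e.g. ['position', 'normal', 'uv']
  return kinds.filter(k => Boolean(meshA[k]) !== Boolean(meshB[k]));
}

const handMade = { position: [0, 0, 0], normal: null, uv: null };
const builderMade = { position: [0, 0, 0], normal: [0, 1, 0], uv: [0, 0] };
console.log(missingKinds(handMade, builderMade, ['position', 'normal', 'uv']));
// → ['normal', 'uv']
```

Once you know which kinds are missing, you can either add them to the hand-built mesh (e.g. via `VertexData.ComputeNormals`) or strip them from the other before merging.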
  2. Good day everyone. I am working on my own large project, which uses a lot of mathematical models. The problem is this: I have a mathematical model of a hexasphere from https://github.com/arscan/hexasphere.js/blob/master/src/face.js and I need to create meshes from the vertex data of this model. I'm using this:

```javascript
// hexaSphere is the object created with the library above
for (let i = 0; i < hexaSphere.tiles.length; i++) {
    let data = hexaSphere.tiles[i]; // note: the original snippet used `data` without defining it; presumably the current tile
    let centerPoint = data.centerPoint;
    let spawnVec = new BABYLON.Vector3(centerPoint.x, centerPoint.y, centerPoint.z); // center of the tile

    let buildPoints = [];
    for (let j = 0; j < data.boundary.length; j++) {
        let b = data.boundary[j];
        buildPoints.push(new BABYLON.Vector3(parseFloat(b.x), parseFloat(b.y), parseFloat(b.z)));
    }
    let polygonalMesh = BABYLON.MeshBuilder.CreatePolygon("" + i, { shape: buildPoints }, scene);
}
```

(Please ignore the colored spheres and sticks in the screenshots; they come from another, isolated part of my project and do not touch this code.) The result is not 3D, but the 2D projection of each hexagon or pentagon onto Babylon's XoZ plane. Following the "How to" docs I tried setting the position, but `polygonalMesh.position = spawnVec;` gives the same kind of result. How can I create meshes from vertex data in 3D space? I cannot test rotation yet, because only once the positions are correct will the borders of neighbouring meshes visibly match. Using a loaded mesh is not an option, because every hexagon is different: this mathematical model does not use regular hexagons, since those cannot be connected seamlessly (a known mathematical result).

I am confident the library itself is correct: I tested it with a regular hexagon (a mesh made in Blender and imported with babylon-loaders.js) and matched the rotations by hand (the green dots are surface normals used for rotation; ignore them). The math checked out: the blue dots are 2 of the 5 or 6 border points, the red ones are the center points. I don't have a screenshot of the whole sphere right now, but it did form a sphere, and it was not built with MeshBuilder; I just placed a copy of the IMPORTED model at each center and rotated it with some linear algebra to verify everything before posting here. So my question is: how can I create a polygonal mesh in 3D space, not its 2D projection, using BABYLON.MeshBuilder (or another Babylon solution)? The displacement between regular hexagons and this mathematical model is too large to accept (unplayable, looks ugly), so I need dynamic creation. P.S. A proper playground example is impossible because of the library, but http://playground.babylonjs.com/#4G18GY#67 shows the same problem: the arguments are all full 3D points, yet only a 2D projection is created.
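One way around this (CreatePolygon triangulates in the XoZ plane by design) is to build the VertexData yourself from each tile's 3D boundary points. For convex hexagon/pentagon tiles a simple triangle fan is enough; this is a plain-JS sketch, and the resulting arrays could be assigned to a `BABYLON.VertexData` (positions, indices), with normals then filled in by `VertexData.ComputeNormals`:

```javascript
// Sketch (plain JS): fan-triangulate one convex tile given its ordered 3D
// boundary points, producing flat position and index arrays of the kind
// BABYLON.VertexData expects.
function fanTriangulate(boundary) {
  // boundary: array of {x, y, z} points of one convex tile, in order
  const positions = [];
  const indices = [];
  for (const p of boundary) positions.push(p.x, p.y, p.z);
  for (let i = 1; i < boundary.length - 1; i++) {
    indices.push(0, i, i + 1); // triangle fan anchored at vertex 0
  }
  return { positions, indices };
}
```

Because the boundary points are used as-is, the mesh lives at its true 3D location on the sphere, so no separate positioning or rotation step is needed.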
  3. I'm completely in the dark when it comes to creating meshes from scratch. I've figured out positions, indices, and applying vertex data. I have this basic example, creating a barn-wall-style pentagon from 5 primitive x,y,z vertex arrays:

```javascript
var v2pent = (v3s, mesh) => {
  // clockwise from top-left: triangles 0 1 4, 1 3 4, 1 2 3
  var vtxdt = new BABYLON.VertexData();
  vtxdt.positions = _.flatten(_.map([0, 1, 4, 1, 3, 4, 1, 2, 3], i => v3s[i]));
  vtxdt.indices = [0, 1, 2, 3, 4, 5, 6, 7, 8];
  // these two lines were added to try to debug, no help:
  // vtxdt.normals = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1];
  // vtxdt.uvs = _.flatten(_.map([0, 1, 4, 1, 3, 4, 1, 2, 3], i => [v3s[i][0], v3s[i][1]]));
  mesh = mesh || new BABYLON.Mesh('t1', scene);
  vtxdt.applyToMesh(mesh, true);
  return mesh;
};
```

It sort of works, but the meshes are always dark gray to black (not picking up light?), and, more importantly right now, they are not returned by scene.pick even when they are the foreground-most mesh: the pentagon wall behind will be picked first! Help very much appreciated; I've read pretty much all the relevant docs and am totally confused. I don't know if it's the UV and/or normal calculation, or whether there's some helper function in the library to generate those for me (besides just ComputeNormals). Is it something I could solve with BABYLON.Mesh.DOUBLESIDE, or an equivalent? For reference, the material used in this case is:

```javascript
var wallMaterial = new BABYLON.TriPlanarMaterial("metal", scene);
wallMaterial.tileSize = 12;
wallMaterial.diffuseTextureX = metalTexture;
wallMaterial.diffuseTextureZ = metalTexture;
wallMaterial.backFaceCulling = false;
```

(Copied from my issue here: https://github.com/BabylonJS/Babylon.js/issues/2735)
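On the dark shading above: a hand-built mesh has no normals unless you supply them, so it cannot react to light; `BABYLON.VertexData.ComputeNormals(positions, indices, normals)` fills them in before `applyToMesh`. As an illustration of what that computation does for an indexed triangle list, here is a minimal plain-JS sketch (flat normals only; vertices shared between triangles simply keep the last face's normal, unlike Babylon's averaged version):

```javascript
// Sketch (plain JS): per-face normals for an indexed triangle list.
// positions: flat [x, y, z, ...] array; indices: 3 entries per triangle.
function computeFlatNormals(positions, indices) {
  const normals = new Array(positions.length).fill(0);
  for (let t = 0; t < indices.length; t += 3) {
    const a = indices[t] * 3, b = indices[t + 1] * 3, c = indices[t + 2] * 3;
    // two edge vectors of the triangle
    const ux = positions[b] - positions[a], uy = positions[b + 1] - positions[a + 1], uz = positions[b + 2] - positions[a + 2];
    const vx = positions[c] - positions[a], vy = positions[c + 1] - positions[a + 1], vz = positions[c + 2] - positions[a + 2];
    // face normal = u × v, then normalized
    let nx = uy * vz - uz * vy, ny = uz * vx - ux * vz, nz = ux * vy - uy * vx;
    const len = Math.hypot(nx, ny, nz) || 1;
    nx /= len; ny /= len; nz /= len;
    for (const i of [a, b, c]) {
      normals[i] = nx; normals[i + 1] = ny; normals[i + 2] = nz;
    }
  }
  return normals;
}
```

Note the cross-product direction depends on winding order: if the normals point away from the camera the face is lit from behind (or culled), which is also why flipping winding or using DOUBLESIDE can appear to "fix" lighting.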
  4. Hi everyone! I'm making a game and need a little help with transparency. In the game I have one material with a texture (a PNG image that contains transparent parts). The meshes are made of planes merged together, and each plane shows a part of the texture (think Minecraft). I can set the colors of each plane in the mesh with the colorKind vertex data, except the alpha part. I searched for topics about it and found a tip to set hasVertexAlpha on the mesh to true and assign the texture to opacityTexture. That worked: now I can set the RGBA on any part of the mesh, but the rendering order goes wrong. Here's an image: the red and the green are one mesh (two planes, with the green above the blue); the blue is the other mesh. The green is half transparent via the colorKind alpha. It's cool, and this is the result I'm looking for! But when I rotate the camera, the blue is displayed above the green. And here is a playground link reproducing what I'm experiencing in the game: https://playground.babylonjs.com/index2_5.html#8XDI7F#2 In short, I want a material with a diffuseTexture that has transparent parts, and to use colorKind RGBA on every plane in every mesh. What am I doing wrong and how can I fix it? Thanks for the help!
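For context on the flip described above: Babylon draws alpha-blended meshes back-to-front, sorted by distance to the camera, so which of two overlapping transparent meshes ends up "on top" can change as the camera moves. A plain-JS sketch of that sorting step (the mesh/camera shapes here are illustrative, not Babylon's internals):

```javascript
// Sketch (plain JS, shapes hypothetical): back-to-front ordering of
// transparent meshes by squared distance to the camera, the mechanism
// behind the blue/green swap when the camera rotates.
function dist2(a, b) {
  return (a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2;
}

function sortBackToFront(meshes, cameraPos) {
  // farthest first, so nearer transparent meshes blend over farther ones
  return meshes.slice().sort(
    (a, b) => dist2(b.position, cameraPos) - dist2(a.position, cameraPos));
}
```

When this distance-based sort gives the wrong result for your scene, Babylon's `mesh.alphaIndex` lets you impose a manual ordering among transparent meshes.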
  5. Hello! I am working on importing an animation from http://voxelbuilder.com. There the animations are simply a set of frames which are shown in order. So, to create an animation with 3 meshes I should: 1. Show the 1st mesh for 1/3 of a second. 2. Show the 2nd mesh for 1/3 of a second. 3. Show the 3rd mesh for 1/3 of a second. 4. Show the 1st mesh again. Logically that is fine, but how can I do this properly? For each mesh I have the following information:

```javascript
vertexData.positions = positions;
vertexData.indices = indices;
vertexData.normals = normals;
vertexData.colors = colorlist;
```

Is it possible to create an animation by just setting the different indices/positions/normals/colors for each frame? It would be sweet if, after `vertexData.applyToMesh(myMesh, true)` and `myMesh.bakeCurrentTransformIntoVertices()`, I could start the animation with `scene.beginAnimation(myMesh, 0, 3, true, 1)`. Thanks! EDIT: To add more information: the number of vertices, their positions, and their UVs can change drastically from one frame to another, so the system should handle ANY mesh sequence, not just the same vertices in different positions.
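Since each frame can have a completely different vertex count, Babylon's keyframe `Animation` on a single mesh is not a natural fit; one alternative is to re-apply a whole VertexData per frame from the render loop. A plain-JS sketch of the frame-selection part (the names `frames`, `frameDurationMs`, and the render-loop line are hypothetical, not from the original post):

```javascript
// Sketch (plain JS): pick which frame of a fixed-rate, looping mesh
// sequence should be visible at a given elapsed time.
function frameAt(elapsedMs, frameCount, frameDurationMs) {
  return Math.floor(elapsedMs / frameDurationMs) % frameCount;
}

// In a Babylon render loop you might then do something like (hypothetical):
//   const f = frameAt(performance.now() - t0, frames.length, 333);
//   frames[f].applyToMesh(myMesh, true); // frames[f] is a BABYLON.VertexData
```

Re-applying VertexData rebuilds the mesh's buffers, which is what allows the vertex count to differ between frames, at the cost of more work per switch than a keyframed animation.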
  6. Hi! I'm using pixi.js to distort a sprite by manipulating its vertexData (like Photoshop's perspective transform). It's working! But I'm using a canvas texture as the source, and as soon as I render the scene after updating the texture, the vertexData resets itself! I need to update the canvas because I'm rendering some animation on it. Some of the code I'm using:

```javascript
this.image = new PIXI.Sprite(PIXI.Texture.fromCanvas(canvas));

function render() {
  var self = this;
  this.image.texture.update(); // everything works fine if I comment this line
  // the handlers are some other sprites I'm using as control points
  for (var i = 0; i < this.handlers.length; ++i) {
    this.image.vertexData[i * 2] = this.handlers[i].position.x;
    this.image.vertexData[(i * 2) + 1] = this.handlers[i].position.y;
  }
  this.renderer.render(this.stage);
  requestAnimationFrame(render);
}
render();
```

Is there any way to prevent the vertexData from resetting? Or is there a better way to achieve the same effect? Thank you so much!
  7. Can you NOT replace the VertexData on a mesh? I create an empty Mesh in C#/Unity (I save some metadata about a box size). Then in client code I create box VertexData and apply it to the mesh... but I see nothing. When I try the same "ApplyBoxCollider" code on a new BABYLON.Mesh, it works fine. My apply-vertex function:

```typescript
private static applyBoxCollider(box: BABYLON.Mesh, options: { size?: number, width?: number, height?: number, depth?: number, faceUV?: Vector4[], faceColors?: Color4[], sideOrientation?: number, updatable?: boolean }, scene: Scene): void {
    options.sideOrientation = this.updateSideOrientation(options.sideOrientation, scene);
    box.sideOrientation = options.sideOrientation;
    var vertexData = VertexData.CreateBox(options);
    vertexData.applyToMesh(box, options.updatable);
}
```

This WORKS:

```typescript
var box: BABYLON.Mesh = new BABYLON.Mesh("SHIT", this._scene);
BABYLON.SceneManager.applyBoxCollider(box, boxOptions, this._scene);
```

This does NOT:

```typescript
BABYLON.SceneManager.applyBoxCollider(existingBox, boxOptions, this._scene);
```

I was hoping to create an empty mesh on the C# side, then create the actual collision mesh on the client side and simply RESET the vertex data on an already parented mesh with all its checkCollision stuff set up... Is that bad? Should I just create the whole mesh on the client side and use a new Mesh instead of trying to replace the underlying vertex data? Thoughts, anyone?
  8. Never very confident in these things, but in BJS 2.5, shouldn't line 33527

```javascript
uvs.push(col / subdivisionsX, 1.0 - row / subdivisionsX);
```

in the VertexData.CreateGround function below be

```javascript
uvs.push(col / subdivisionsX, 1.0 - row / subdivisionsY);
```

Fairly minor, as I expect most grounds will be square. Then again, the code may be correct and I may have misunderstood something.

```javascript
VertexData.CreateGround = function (options) {
    var indices = [];
    var positions = [];
    var normals = [];
    var uvs = [];
    var row, col;
    var width = options.width || 1;
    var height = options.height || 1;
    var subdivisionsX = options.subdivisionsX || options.subdivisions || 1;
    var subdivisionsY = options.subdivisionsY || options.subdivisions || 1;
    for (row = 0; row <= subdivisionsY; row++) {
        for (col = 0; col <= subdivisionsX; col++) {
            var position = new BABYLON.Vector3((col * width) / subdivisionsX - (width / 2.0), 0, ((subdivisionsY - row) * height) / subdivisionsY - (height / 2.0));
            var normal = new BABYLON.Vector3(0, 1.0, 0);
            positions.push(position.x, position.y, position.z);
            normals.push(normal.x, normal.y, normal.z);
            uvs.push(col / subdivisionsX, 1.0 - row / subdivisionsX);
        }
    }
```
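The suspected typo is easy to check in isolation. Below is a plain-JS sketch of just the UV loop with the row term divided by subdivisionsY as proposed; for square grounds (subdivisionsX === subdivisionsY) the two versions produce identical output, which is why the bug would go unnoticed, while for a non-square subdivision count only the corrected version reaches v = 0 on the last row:

```javascript
// Sketch (plain JS): the UV generation from CreateGround with the
// proposed fix applied. u runs 0..1 across columns, v runs 1..0 down rows.
function groundUVs(subdivisionsX, subdivisionsY) {
  const uvs = [];
  for (let row = 0; row <= subdivisionsY; row++) {
    for (let col = 0; col <= subdivisionsX; col++) {
      uvs.push(col / subdivisionsX, 1.0 - row / subdivisionsY); // was: row / subdivisionsX
    }
  }
  return uvs;
}

console.log(groundUVs(1, 2));
// → [0, 1, 1, 1, 0, 0.5, 1, 0.5, 0, 0, 1, 0]
```

With the original `row / subdivisionsX`, the same call would end its last row at v = 1 - 2/1 = -1, stretching the texture outside the 0..1 range on non-square grounds.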