mobileben

Members • Advanced Member • Content Count: 34 • Days Won: 3 • last won the day on September 11
  1. By cutting do you mean generating new UVs? Or really generating a new texture? Since you are not fully describing things, it does make it more difficult for people to help you.
  2. What are you trying to do? When you say a texture needs to be cut into several ones, is how it is cut determined at runtime?
  3. Since you want to control your drawing of the layers, you may want to create a class which uses a container as a parent for the sprites. The reason for the class is you could use that as the interface to control the layers (ie. the sprites). If you know that the textures are inter-related and you have not put them in the same atlas, you should. This will eliminate the need for the underlying renderer to change the texture when rendering (ie. a higher likelihood, depending on what the renderer is doing, that they can be drawn with the same draw call). A sketch of what I mean follows.
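     Here is a minimal sketch of that kind of wrapper, assuming Typescript and pixi.js v5; `LayeredSprite` and its method names are just illustrative:

     ```ts
     import * as PIXI from 'pixi.js';

     // Illustrative wrapper: owns a container and acts as the single
     // interface for controlling layers, so callers never touch the sprites.
     class LayeredSprite {
         readonly container = new PIXI.Container();
         private layers: PIXI.Sprite[] = [];

         // Textures should ideally come from the same atlas so the
         // renderer has a chance to batch the layers into one draw call.
         constructor(textures: PIXI.Texture[]) {
             for (const tex of textures) {
                 const sprite = new PIXI.Sprite(tex);
                 this.container.addChild(sprite);
                 this.layers.push(sprite);
             }
         }

         setLayerVisible(index: number, visible: boolean): void {
             this.layers[index].visible = visible;
         }

         setLayerTexture(index: number, tex: PIXI.Texture): void {
             this.layers[index].texture = tex;
         }
     }
     ```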
  4. @bruno_, thanks so much for the information. It was very helpful. When you say you use "dummy files", do you mean wrapper files for Cordova, or do you create dummy files for the ones that Cordova will supply?

     @mattstyles, thanks as well. And thanks for the note about testing often on the device. My idealized plan is to be able to switch easily between the browser and mobile, which is why I was wondering about the approach. From my early guesses, the part that will be more challenging is anything that has to go through the JS bridge, such as IAP, mainly because I'm using Typescript. What I think I may have to do is write those parts in Javascript, to avoid having to debug transpiled code, and use Safari developer mode to debug through the simulator.

     As both your approaches are more of a two-step approach, I assume then that your index.html files, as well as possibly how the app starts (since Cordova fires `deviceready`, which we should be using to launch the game), are handled as one-offs? I did find I can use the merges directory to use custom index.html files per platform. Right now I have what may be a workable model to develop on which would allow for dev browser/mobile from the onset; a sketch of the startup handling is at the end of this post.
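     Roughly, the startup shim I'm experimenting with looks like this (a sketch; the `startGame` entry point is mine, and sniffing `window.cordova` is just one common way of detecting a Cordova build):

     ```ts
     // Boot in both environments: wait for Cordova's native bridge when
     // present, otherwise start on plain DOM ready in the browser.
     function boot(startGame: () => void): void {
         if ((window as any).cordova) {
             // Cordova: the JS bridge is only safe to use after deviceready.
             document.addEventListener('deviceready', () => startGame(), false);
         } else if (document.readyState === 'loading') {
             document.addEventListener('DOMContentLoaded', () => startGame(), false);
         } else {
             startGame(); // browser, DOM already parsed
         }
     }
     ```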
  5. I currently have a setup where I use Webpack + Typescript for game dev. This is experimental, but it seems to be working well. I wanted to add Cordova to the mix to see how hard it would be to add support for mobile. One thing that strikes me is that Cordova also becomes a bit of a build system, which seems cumbersome for just HTML5 dev. My plan would be to do most dev in the browser, then move to mobile.

     One takeaway is that it does seem a bit easier to start off with Cordova, only from the perspective that it seems easier to drop a project into Cordova versus dropping Cordova into your project. This really just means using the directory structure and some of the config that a Cordova project likes, and then putting my code and required packages, etc, around it (a rough layout is at the end of this post).

     For people that are using Cordova, what is your workflow? Meaning, are you doing most dev in the browser in the directories and then doing Cordova builds as needed (ie. ignoring Cordova until needed)? When you do an HTML5 build for the browser, is it completely devoid of Cordova? Or are you including Cordova and just going through a slightly different startup routine? I suppose this leads to the question of whether you are doing only one distribution that runs on everything, or creating different distributions based on platform. Also, are there any viable contenders out there to use instead of Cordova?
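     For reference, the "drop your project into Cordova" layout I'm describing looks roughly like this (only config.xml, www/, merges/, and platforms/ are Cordova's; the rest of the structure is my own):

     ```
     cordova-project/
     ├── config.xml          # Cordova config
     ├── package.json        # my packages + Cordova's
     ├── webpack.config.js   # bundles src/ into www/
     ├── src/                # Typescript game source
     ├── merges/             # per-platform overrides (eg. index.html)
     ├── www/                # webpack output; what Cordova packages
     └── platforms/          # generated iOS/Android projects
     ```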
  6. I assume when you ask "render it as canvas" you are referring to rendering the `PIXI.Graphics` as is? So in other words, something like:

     ```js
     const gfx = new PIXI.Graphics();
     // Do stuff to make a graphic
     gfx.endFill();
     app.stage.addChild(gfx);
     ```

     Yes, you are better off converting to a texture and then creating a sprite (see the sketch at the end of this post). The actual `_render` for a graphic does a bunch of work to display the parts of the graphic. Simplified graphic shapes like rectangles should be faster to draw; just how much work is done is related to complexity and whether or not the graphic is batchable. Sprites, on the other hand, just update vertices (presumably only if needed, although looking at the code it doesn't look like it has any dirty bits) and then render.

     Hmm, wondering. I'm not really a JS guy, but I've done some reading that suggests you can get some async stuff running. Has anyone dabbled with that? It would be super helpful if it is real async, since then things like texture creation could be done off the main thread. The only caveat, though (which I don't know if pixi can handle), is the need for some locks. I'm more used to multi-threaded game engines where one has those features to help better hide latency.
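     ie. the conversion would be something along these lines (a sketch, assuming a `PIXI.Application` named `app`):

     ```ts
     // Render the graphic once into a texture, then draw a cheap sprite.
     const tex = app.renderer.generateTexture(gfx);
     const sprite = new PIXI.Sprite(tex);
     app.stage.addChild(sprite);
     gfx.destroy(); // the graphic itself is no longer needed
     ```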
  7. Also, thinking about this some more, it seems to me your main problem really is the lines. For the rects (points), you really just need to create a rect, convert it to a texture, and then create a pool of sprites. Alternatively you could create that sprite based on a texture image. I'd recommend a white rect that you then tint to the color you need. You can also scale it as you need. Just create a pool of sprites and use them as you need (and hide what you don't). This should give you good performance regarding the drawing of your points (a sketch of such a pool is at the end of this post).

     The line is a bit more problematic. You probably need to define "real-time". Depending on your application, real-time isn't always real-time, meaning at times you can actually eat up more frames doing something. For example, is it still usable at 30fps or 20fps? For the line, when you zoom and scale, rather than invalidate, why not build the newer line "offline"? Then, when it's done, show that line and hide and either destroy or invalidate the other. So perhaps a good solution is a pool of sprites for points, and, rather than clearing, creating a new line and then hiding and invalidating/destroying the old one. This is one of those ideal cases where being multi-threaded is helpful, since you can offload both the new line render and the destruction onto another thread.
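     Here is a rough sketch of the point pool I'm describing, assuming pixi.js v5 and Typescript; `PointPool` and its names are just illustrative:

     ```ts
     import * as PIXI from 'pixi.js';

     // A pool of point sprites sharing one white rect texture.
     // Tint (and scale) each sprite instead of baking color into textures.
     class PointPool {
         private pool: PIXI.Sprite[] = [];
         private texture: PIXI.Texture;

         constructor(app: PIXI.Application, size = 4) {
             const g = new PIXI.Graphics();
             g.beginFill(0xffffff);      // white, so tint produces the final color
             g.drawRect(0, 0, size, size);
             g.endFill();
             this.texture = app.renderer.generateTexture(g);
             g.destroy();
         }

         acquire(color: number): PIXI.Sprite {
             const sprite = this.pool.pop() ?? new PIXI.Sprite(this.texture);
             sprite.tint = color;
             sprite.visible = true;
             return sprite;
         }

         release(sprite: PIXI.Sprite): void {
             sprite.visible = false;     // hide rather than destroy
             this.pool.push(sprite);
         }
     }
     ```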
  8. I think for generating a texture from the line graphic you would do something like this; however, I will be quick to add that in my test it comes out a bit jaggy. The actual sine wave as a graphic is slightly jaggy as well, just not as bad. I also notice clipping. You should be able to draw the two and compare to see what I mean.

     ```js
     const graphic = new PIXI.Graphics();

     graphic.lineStyle(2, 0xff0000, 1);

     const startX = 0, startY = 0;
     const increment = 0.1;

     graphic.moveTo(startX, startY);

     for (let x = increment; x < 100; x += increment) {
         const y = Math.sin(x) * 20;
         graphic.lineTo(startX + x * 10, startY + y);
     }

     graphic.endFill();

     let sineTex = app.renderer.generateTexture(graphic, PIXI.SCALE_MODES.LINEAR, window.devicePixelRatio);
     let lineSprite = new PIXI.Sprite(sineTex);
     ```
  9. Ahh, okay, I saw that. I misunderstood. You talked about zooming/scaling, so I wanted to see how it would differ. Essentially the rough views are the same (rectangles and lines)?
  10. Why don't you send a few examples of what a graph looks like at different scales, so we can see the spectrum of visuals you are trying to achieve and how they interrelate?
  11. Good question. Curious, what are you trying to do (ie. why don't you want to extend from PIXI.Point)? Are you using Typescript? If so, I notice that the objects are not created with prototypes.
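     For what it's worth, extending it in Typescript is straightforward; a sketch (`TimedPoint` is a made-up example):

     ```ts
     import * as PIXI from 'pixi.js';

     // The class syntax sets up the prototype chain for you.
     class TimedPoint extends PIXI.Point {
         constructor(x: number, y: number, public timestamp = Date.now()) {
             super(x, y);
         }
     }

     const p = new TimedPoint(10, 20);
     console.log(p instanceof PIXI.Point); // true
     ```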
  12. When scaling anything, you will potentially get artifacts. The severity depends on the direction (up or down) as well as how much and what you are scaling by. As you are doing an affine scale, you won't suffer any effects due to a change in aspect ratio. Are all those images the same size, BTW?
  13. There is one outstanding issue that I know of right now: webfonts. When I create a PIXI.Text, it's still drawing larger based on the px I set. Not double, but about 25% larger. I noticed changing the value changes the level of blurriness, but too high and the font breaks down. The overall size remains the same. Is this expected?

     Another oddity I noticed: there seems to be a bug with setting `resolution`. The call to updateText/updateTexture never occurs; you need to do something else to trigger it. So in my case, I called text.height afterwards. This would cause updateText/updateTexture to be called (a sketch of the workaround is below).
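     For anyone hitting the same thing, the workaround looks like this (a sketch; reading `height` is just a side-effecting way to force the texture to update):

     ```ts
     // 'MyWebFont' is a placeholder webfont name.
     const text = new PIXI.Text('Hello', { fontFamily: 'MyWebFont', fontSize: 24 });
     text.resolution = window.devicePixelRatio; // updateText/updateTexture does NOT fire here
     void text.height;                          // ...but reading height triggers it
     app.stage.addChild(text);
     ```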
  14. I have something that works. I'll put the solution here, as it isn't large, to get your feedback. I can create an issue and file a PR if you think it is worthwhile. I haven't yet committed the branch to my fork.

     As a recap as to the why: I want to use native resolution for everything. Other OpenGL I've worked with is easy, since it is always native. Hence there is no meaning to @#x type files, for example. Everything is a pixel, and the layers above handle any form of device-pixel-ratio type scaling. I wanted a good way to immunize against android as well as use the small/med/large type of asset approach rather than retina-suffix named files. This means my app is initialized like the following:

     ```ts
     export const app = new PIXI.Application({
         width: gameArea.width,
         height: gameArea.height,
         view: <HTMLCanvasElement> document.getElementById('game-canvas'),
         antialias: true,
         transparent: false,
         backgroundColor: 0xFFFFFF,
         forceCanvas: canvasType == "canvas",
         resolution: 1
     });
     ```

     What I have done is introduce the concept of a fallback. The idea is that if prefixing is used, then it uses the prefix. If not, it will check for a fallback. If no fallback value exists, then it will use the default. I chose this approach because it is naturally handled by the code. Moreover, I needed to add an extra value (the fallback) because I have no control over changing the default. So while the concept of the defaultValue is nice from an API perspective, it is only explicitly set by BitmapText, and it takes the renderer resolution value. So we have no way of controlling it otherwise. Code is as follows; I'm happy to make any changes/renames, etc.

     ```js
     import { settings } from '../settings';

     let FALLBACK_RESOLUTION_FOR_URL = 0;

     /**
      * get the fallback resolution / device pixel ratio of an asset. This is used
      * in conjunction with getResolutionOfUrl
      *
      * @memberof PIXI.utils
      * @function getFallbackResolutionOfUrl
      * @return {number} fallback resolution / device pixel ratio. If 0, then ignore.
      */
     export function getFallbackResolutionOfUrl()
     {
         return FALLBACK_RESOLUTION_FOR_URL;
     }

     /**
      * set the fallback resolution / device pixel ratio of an asset.
      *
      * @memberof PIXI.utils
      * @function setFallbackResolutionOfUrl
      * @param {number} value - the fallback value if no filename prefix is set.
      */
     export function setFallbackResolutionOfUrl(value)
     {
         FALLBACK_RESOLUTION_FOR_URL = value;
     }

     /**
      * get the resolution / device pixel ratio of an asset by looking for the prefix
      * used by spritesheets and image urls
      *
      * @memberof PIXI.utils
      * @function getResolutionOfUrl
      * @param {string} url - the image path
      * @param {number} [defaultValue=1] - the defaultValue if no filename prefix is set.
      * @return {number} resolution / device pixel ratio of an asset
      */
     export function getResolutionOfUrl(url, defaultValue)
     {
         const resolution = settings.RETINA_PREFIX.exec(url);

         if (resolution)
         {
             return parseFloat(resolution[1]);
         }

         const fallback = getFallbackResolutionOfUrl();

         if (fallback)
         {
             return fallback;
         }

         return defaultValue !== undefined ? defaultValue : 1;
     }
     ```

     To use, I'm simply doing the following after I create the Application:

     ```js
     PIXI.utils.setFallbackResolutionOfUrl(window.devicePixelRatio);
     ```
  15. Okay, I believe I have figured this out. While I eventually sort of got `npm link` working, it generates a bunch of errors.

     The first trick is that you need to link bundles/pixi.js, as this is where the actual standalone exists. The next "trick" deals with linking. I discovered that running `npm link` also installs dependencies under bundles/pixi.js/node_modules. When it does this, it actually copies in the actual @pixi dependencies, whereas when `npm run build` is executed, it creates symbolic links to the generated node_modules under packages. This really threw me for a loop when figuring this out, since I had to determine why any changes I made were not showing up. Previously I thought that I needed to run `npm link` and then install; however, this was incorrect. If you run `npm link` first, the symlinks are not created. Also note that when running `npm link` I had to use sudo, as on my machine my global node_modules is owned by root.

     That all being said, in the end I decided to forgo using `npm link` and instead manually created symbolic links. This leads to the following steps:

     1. Clone the repo (either the actual one or your fork).

     2. Create a symbolic link. This replaces `npm link`:

        ```
        ln -s REPO_DIRECTORY/bundles/pixi.js /usr/local/lib/node_modules/pixi.js
        ```

        Where REPO_DIRECTORY is your repo directory. YMMV depending on where node is placed on your machine. If you're on Windows and don't have unix commands, you'll need to create the appropriate shortcut.

     3. In your project directory, in the node_modules directory:

        ```
        ln -s /usr/local/lib/node_modules/pixi.js pixi.js
        ```

        If you had pixi.js already installed via npm, you will need to uninstall it first.

     4. You can now install and build (you could have done this earlier as well). So, go back to REPO_DIRECTORY and:

        ```
        npm install
        npm run build
        ```

     5. Back in your project directory, if you're running Webpack, you need to make sure your resolve in webpack.config.js has symlinks set to true. Note that I just copied and pasted what I had:

        ```js
        resolve: {
            symlinks: true,
            modules: ['src', 'node_modules'],
            extensions: ['.ts', '.tsx', '.js']
        }
        ```

     6. Profit.

     Another interesting quirk I noticed from this whole thing: to generate a proper .d.ts file, you need to make sure anything you export is commented properly. Else you will find the code is generated (eg. pixi.js) but the declaration file will not have it defined.
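     ie. an export needs a doc block for it to land in the declaration file; something like this (`myNewHelper` is a made-up example):

     ```ts
     /**
      * Without a comment block like this, the function still ends up in the
      * generated pixi.js bundle, but the .d.ts omits it.
      *
      * @memberof PIXI.utils
      * @function myNewHelper
      * @return {number} some value
      */
     export function myNewHelper() {
         return 42;
     }
     ```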