zeh

Members
  • Content Count

    15

About zeh

  • Rank
    Member

Contact Methods

  • Website URL
    http://zehfernando.com
  • Twitter
    zeh

Profile Information

  • Gender
    Not Telling
  1. Thanks, good to know, I'll take a look at textureCoord. I updated to RC 2 a couple of days ago. Things seem to be progressing well.
  2. A bit late, but thanks again @ivan.popelyshev, I was able to make it work with the above code and examples. I had previously tried outputFrame/inputSize/resolution but kept getting errors that the uniforms were not matching; by checking the filter displacement example I was able to see I had missed the highp precision:

```glsl
uniform highp vec4 outputFrame;
uniform highp vec4 inputSize;
uniform highp vec4 resolution;
```

So the above works. For the record, I can now get the UV simply as:

```glsl
uniform vec4 inputPixel;
uniform highp vec4 outputFrame;

vec2 getUV(vec2 coord) {
    return coord * inputPixel.xy / outputFrame.zw;
}
```

Without having to pass the dimensions manually.
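As a sanity check on that math, the same normalization can be mirrored in plain JS: multiplying the coordinate by the input size in pixels gives pixel coordinates, and dividing by the filter area's width/height brings it back to 0...1. This is just a sketch for checking values by hand; the argument names and the example numbers are mine, not Pixi's.

```javascript
// Plain-JS mirror of the getUV() shader math: normalize a texture
// coordinate against the filter area. inputPixelXY is the input texture
// size in pixels; outputFrameZW is the filter area's width/height.
function getUV(coord, inputPixelXY, outputFrameZW) {
    return [
        coord[0] * inputPixelXY[0] / outputFrameZW[0],
        coord[1] * inputPixelXY[1] / outputFrameZW[1],
    ];
}
```

For example, with a 512x512 input texture whose filter area is only 256x256 (a common case, since Pixi pads filter textures), a vTextureCoord of (0.5, 0.5) maps to UV (1, 1), the bottom-right of the visible area.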
  3. Edit: I got it to work by replacing "filterArea" with "inputPixel" in the original workaround. Unfortunately we still need to know the original resolution via a custom uniform, but at least it works the same as the original workaround. I would love to know of a better solution that doesn't require a u_resolution to be passed. The documentation mentions outputFrame (among others), but I haven't been able to make that work: just declaring outputFrame as a "uniform vec4" prevents the fragment shader from compiling.
  4. Sure. Here it is. You can see how the "diamond" is not correctly placed within the size of the graphics element. If that wasn't clear, a similar example on ShaderToy shows the expected result. I'm not expecting anyone to fix my code; I just want some documentation on how to get a value from 0,0 (top left) to 1,1 (bottom right) in a fragment shader, since vTextureCoord is not normalized and the previous workaround stopped working. Or to know whether something replaced the built-in "filterArea" between beta-3 and rc-0.
  5. I have a few shaders I've created using Pixi 5.0.0-beta.3. To get the correct 0...1 UV for a position I'd normally do this, since vTextureCoord is not normalized:

```glsl
uniform vec2 u_resolution; // Manually passed
uniform vec4 filterArea;   // Automatically passed

vec2 getUV(vec2 coord) {
    // Return correct UV for sprite coordinates
    return coord * filterArea.xy / u_resolution;
}

void main() {
    vec2 uv = getUV(vTextureCoord);
    gl_FragColor = vec4(uv.x, uv.y, 0.0, 1.0);
}
```

This doesn't seem to work anymore, though: filterArea is always empty. Unfortunately vTextureCoord is also not in the 0...1 range. If I try drawing a diamond...

```glsl
gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
vec2 uv = vTextureCoord;
if (uv.x + uv.y > 0.5 && abs(uv.x - uv.y) < 0.5 && uv.x + uv.y <= 1.5) {
    // If within the diamond, turn purple
    gl_FragColor = vec4(1.0, 0.0, 1.0, 1.0);
}
```

...I get this (notice the incorrect scale): So, what's the way to calculate a 0...1 UV in Pixi 5.0? I've tried checking the diff between versions, but it's not very clear what has changed there. I've tried using outputFrame, inputSize, and inputPixel (I've seen those in some other examples) but to no avail. Any help in the right direction is appreciated!
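For reference, the conditional in that fragment shader is just a point-in-diamond predicate on the normalized UV: a centered square rotated 45 degrees, with vertices at the midpoints of the unit square's edges. A direct plain-JS transcription (for checking values by hand; the function name is mine):

```javascript
// True when a normalized (0..1) UV falls inside the centered diamond,
// i.e. the region bounded by u+v > 0.5, u+v <= 1.5, and |u-v| < 0.5.
function inDiamond(u, v) {
    return u + v > 0.5 && Math.abs(u - v) < 0.5 && u + v <= 1.5;
}
```

So (0.5, 0.5), the center, is inside, while the corners of the unit square are outside. If vTextureCoord is not actually in the 0...1 range, the diamond drawn by this test lands at the wrong scale, which is the bug shown in the screenshot.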
  6. Cool, thanks again. I'm reading through it and it seems fairly approachable. Just as a heads up, right now a first roadblock seems to be that the TypeScript definitions for Pixi v5 are not up-to-date (e.g. resources.VideoResource is not present, even though it exists; other resources.* are properly defined), so there's a little bit to fix until I get there. For reference, after a ~30 min test, this seems to work as a new MediaStreamResource, simply by extending VideoResource:

```javascript
import { resources } from 'pixi.js';

/**
 * Resource type for MediaStream sources.
 * @class
 * @extends PIXI.resources.VideoResource
 * @memberof PIXI.resources
 * @param {MediaStream} source - Media stream to use
 * @param {object} [options] - Options to use
 * @param {number} [options.updateFPS=0] - How many times a second to update the texture
 * from the video. Leave at 0 to update at every render.
 */
export default class MediaStreamResource extends resources.VideoResource {
    constructor(source, options) {
        options = options || {};
        const videoElement = document.createElement('video');
        videoElement.srcObject = source;
        videoElement.play();
        super(videoElement, {
            autoPlay: true,
            autoLoad: true,
            updateFPS: options.updateFPS,
        });
    }
}
```

And the usage:

```javascript
const stream = ...; // MediaStream
const res = new MediaStreamResource(stream, {});
const baseTexture = new BaseTexture(res, { mipmap: false });
const mediaTexture = new Texture(baseTexture);
const mediaSprite = new Sprite(mediaTexture);
this.addChild(mediaSprite);
```

But there are still some errors to deal with (a weird DOMException that doesn't break anything, some Video-specific methods that are useless for MediaStreams, etc.). Maybe a better solution would be to copy & paste VideoResource (extending BaseImageResource instead of VideoResource) and remove unneeded behavior. It'd be a bit of duplication, but probably the correct solution. One alternative is just detecting a MediaStream "source" given to VideoResource and using that instead, in the same way you can pass a source URL. Anyway, I'll take a more serious stab at it soon.
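One JavaScript gotcha worth flagging around option handling like the above: object spread copies enumerable own properties, so spreading a primitive (e.g. a bare number such as an updateFPS value) into an object literal contributes nothing, and the value is silently lost. A quick plain-JS check (the names here are illustrative):

```javascript
// Object spread copies enumerable own properties. Spreading a primitive
// number wraps it in a Number object with no own properties: a no-op.
const options = { updateFPS: 30 };

const dropped = { autoPlay: true, ...options.updateFPS }; // spreads the number 30: adds nothing
const forwarded = { autoPlay: true, ...options };          // copies updateFPS across
```

So to forward a single documented option, either spread the whole options object or assign the property explicitly (`updateFPS: options.updateFPS`).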
  7. Thanks @ivan.popelyshev. Right now I'm using Pixi v5 via pixi.js@5.0.0-alpha.3 from npm, and it seems to be working fine with the same workarounds... My current "high-level" solution (just a Sprite with a bunch of events) seems to be working well, so for now the solution above suffices, as I'm just trying to get this work moving... If I do have time later, I'll try creating a new MediaStreamTextureResource or some such and contribute it to v5. I have no problem dealing with pure WebGL, but I'd like to understand the Pixi internals first before I put something together, to ensure it follows the standard architecture and can be cleanly merged.
  8. Thanks for the suggestion @ivan.popelyshev. Unfortunately "texture.baseTexture.mipmap = false" (right after calling Texture.fromVideo()) didn't do much: the error is still there if I don't call video.play() beforehand. I'll keep it, though; hopefully it'll make the whole endeavor more stable. I do wonder if there's a more elegant overall solution, however.
  9. Update with some progress: somehow, I need to call video.play() before I attempt to create the texture (despite autoplay):

```javascript
video.play();
```

This seems to make it work for the first update. Then we have to update on every frame...

```javascript
video.addEventListener("canplay", () => {
    const texture = Texture.fromVideo(video, SCALE_MODES.LINEAR, true, false);
    const sprite = new Sprite(texture);
    this.addChild(sprite);

    // Wait a frame before starting to update, otherwise we still get the WebGL error
    requestAnimationFrame(() => {
        ticker.shared.add(() => {
            texture.update();
        });
    });
});
```

But it still seems to break pretty easily. I could wrap all of that up and make it cleaner to reuse, but I'm still curious whether there's a better solution.
  10. I'm trying to create a Sprite using PixiJS v4 (or v5?) but feeding it a texture from a webcam (a MediaStream). It doesn't seem to be possible. An old 2017 thread has a link to an example that should work. Unfortunately, 1) the example doesn't work anymore, likely because of HTML changes since then, and 2) even if it worked, the solution seems pretty convoluted (copying frame by frame?), so I'd like to avoid something like that if possible. In my case, I've tried creating a <video> element and setting a source on that...

```javascript
let stream; // This is a MediaStream that comes from navigator.mediaDevices.getUserMedia()
const video = document.createElement("video");
video.autoplay = true;
video.srcObject = stream;
document.documentElement.appendChild(video);
```

This works well, since we can set the srcObject (not the src) of a <video> to a stream and it works magically. But the problem arises when I try adding that to Pixi. I can do this:

```javascript
const texture = Texture.fromVideo(video, SCALE_MODES.LINEAR, true, false);
const sprite = new Sprite(texture);
```

That is, I pass the <video> instance (since I don't have a URL) to the texture. But it doesn't work; Pixi itself is kinda silent, but then I get an initial Chrome render error:

[.WebGL-0000017A07139680]GL ERROR :GL_INVALID_OPERATION : glGenerateMipmap: Can not generate mips

And additional errors on every frame:

[.WebGL-0000017A07139680]RENDER WARNING: texture bound to texture unit 8 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering.

Despite what the error says, I don't think it has anything to do with the bounds of the texture. Even if I grab a 256x256 texture from the camera, it stops working. For comparison, if I just set the <video> src to a normal file...

```javascript
const video = document.createElement("video");
video.src = "myvideo.mp4";
document.documentElement.appendChild(video);
```

...then the above code works fine and I'm able to create a Texture from a <video> tag.

I've also tried a different approach, creating a video URL out of a media stream (which, technically, should work):

```javascript
let stream; // This is a MediaStream that comes from navigator.mediaDevices.getUserMedia()
const videoURL = URL.createObjectURL(stream);
const texture = Texture.fromVideo(videoURL, SCALE_MODES.LINEAR, true, false);
const sprite = new Sprite(texture);
this.addChild(sprite);
```

But I get a nasty error on the URL.createObjectURL line:

Uncaught TypeError: Failed to execute 'createObjectURL' on 'URL': No function was found that matched the signature provided.

So, anyway. Does anyone know of a way to create a texture from a webcam input in PixiJS?
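For what it's worth, the src-vs-srcObject distinction above can be captured in a tiny dispatcher, shown here as a sketch (the helper name is mine): URL strings go to video.src, while stream objects go to video.srcObject, since current browsers no longer accept a MediaStream in URL.createObjectURL.

```javascript
// Pick which <video> property to assign for a given source:
// plain URL strings use .src, anything else (e.g. a MediaStream) uses .srcObject.
function videoPropertyFor(source) {
    return typeof source === "string" ? "src" : "srcObject";
}
```

Usage would then be something like `video[videoPropertyFor(source)] = source;` regardless of where the source came from.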
  11. I know this is an old topic, but for future reference: I managed to solve this in a fairly simple alternative way, useful if you're bundling your project together and don't want to include another .js file from a CDN in your HTML (in my case, my whole app is a single .js file, for optimization).

1. Install webfontloader from npm (this is the same as Google's WebFont):

```shell
npm install webfontloader --save
```

1.1. If you're using TypeScript, you can also install its types:

```shell
npm install @types/webfontloader --save
```

2. Add your fonts to your HTML wrapper. In my case I had custom .woff fonts, so I copied them to my destination asset folder and referenced them straight in the HTML:

```html
<!doctype html>
<html lang="en">
<head>
    <meta charset="utf-8">

    <!-- Metadata: main -->
    <title><%= htmlWebpackPlugin.options.title %></title>

    <!-- Styles -->
    <style>
        body { margin: 0; padding: 0; background: #999999; overflow: scroll; }
        /* n2 */
        @font-face {
            font-family: 'My Font Name';
            font-style: normal;
            font-weight: 200;
            src: local('My Font Name Light'), local('My-Font-Name-Light'), url(fonts/MyFontName-Light.woff) format('woff');
        }
        /* n4 */
        @font-face {
            font-family: 'My Font Name';
            font-style: normal;
            font-weight: 400;
            src: local('My Font Name Regular'), local('My-Font-Name-Regular'), url(fonts/MyFontName-Regular.woff) format('woff');
        }
        /* n7 */
        @font-face {
            font-family: 'My Font Name';
            font-style: normal;
            font-weight: 700;
            src: local('My Font Name Bold'), local('My-Font-Name-Bold'), url(fonts/MyFontName-Bold.woff) format('woff');
        }
    </style>

    <!-- Metadata: mobile devices -->
    <meta name="viewport" content="width=device-width, height=device-height, initial-scale=1.0, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
</head>
<body>
    <div id="app-container"></div>
    <% for (var chunk in htmlWebpackPlugin.files.chunks) { %>
    <script defer type="text/javascript" src="<%= htmlWebpackPlugin.files.chunks[chunk].entry %>"></script>
    <% } %>
</body>
</html>
```

3. Load them inside the index.js file that starts Pixi. This function does it:

```javascript
import * as WebFont from "webfontloader";

// ...

WebFont.load({
    custom: {
        families: ["My Font Name:n2,n4,n7"],
    },
    fontactive: (name, fvd) => {
        console.log(`Loaded font file ${name}:${fvd}`);
    },
    active: () => {
        console.log("All font files loaded.");
        // CONTINUE here
    },
    inactive: () => {
        console.error("Error loading fonts");
        // REPORT ERROR here
    },
});
```

In the above code, "active" is the key, as it's the callback for when all fonts have loaded. No need for interval checks. In my own case the above code was part of a promise that resolve()d inside "active" and reject()ed inside "inactive". Check webfontloader for more details on loading fonts from vendors, or the "n2/n4/n7" nomenclature standard (again, same as "WebFont").
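The promise wrapper mentioned there might look something like this sketch. The function name and shape are mine; the loader object is passed in as a parameter (rather than imported) so it can be swapped for a stub, but in real use it would be the webfontloader module, which exposes load(config) with active/inactive callbacks as shown above.

```javascript
// Wrap a webfontloader-style load() call in a Promise:
// resolves when `active` fires (all fonts loaded), rejects on `inactive`.
function loadFonts(webFont, families) {
    return new Promise((resolve, reject) => {
        webFont.load({
            custom: { families },
            active: () => resolve(families),
            inactive: () => reject(new Error("Error loading fonts")),
        });
    });
}
```

Usage would then be along the lines of `loadFonts(WebFont, ["My Font Name:n2,n4,n7"]).then(startApp)`, where startApp is whatever kicks off the Pixi application.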