Search the Community
Showing results for tags 'Renderer'.
-
Hi, I am making a lighting filter for PixiJS 4 in RPG Maker MV, but I've encountered some strange behavior regarding render textures. For some reason, rendering the scene to a render texture causes the position of the lights to mirror on the Y axis, and I have no idea why. Here are some photos of what I mean. Below is a screenshot of the scene in real time, and this is the resulting render texture. As you can see, the blue light close to the center barely moved, while the yellow one moved from the bottom of the screen to the top. Neither light changed its X value. The lights themselves do not move; this only appears in the render texture.

The render texture is generated using this code in one of the RPG Maker event script calls; however, the effect seems to affect all PIXI render textures, including those used by the default engine. Here is the code that generates the render texture and makes a sprite; the sprite is rendered to the scene.

rt = PIXI.RenderTexture.create(1280, 720);
renderer = Graphics._renderer;
renderer.render(SceneManager._scene, rt);
sprt = new PIXI.Sprite(rt);
SceneManager._scene.addChild(sprt);

At first I thought it might have to do with transformation matrices within the shader, but as far as I can tell there is nothing obvious. Here's the test shader I prepared:

varying vec2 vTextureCoord;
uniform sampler2D uSampler;

vec4 lightDiffuse(vec2 lposition, vec4 ldiffuse, float lquadratic, float llinear) {
    float distance = length(lposition - gl_FragCoord.xy);
    float attenuation = 1. / (1. + llinear * distance + lquadratic * (distance * distance));
    return ldiffuse * attenuation;
}

void main() {
    vec4 result = vec4(0., 0., 0., 0.);
    vec4 ambience = vec4(.1, .1, .1, 1.);

    // Hard coded light 2
    vec2 lPositionA = vec2(720., 200.);
    vec4 lDiffuseA = vec4(0., 1., 1., 1.);

    // Hard coded light 1
    vec2 lPositionB = vec2(100., 125.);
    vec4 lDiffuseB = vec4(1., 1., 0., 1.);

    result += lightDiffuse(lPositionA, lDiffuseA, .07, .05);
    result += lightDiffuse(lPositionB, lDiffuseB, .007, .005);

    gl_FragColor = vec4(ambience.rgb + result.rgb, 1) * texture2D(uSampler, vTextureCoord);
}

I've also prepared a test plugin for RPG Maker that loads this shader and applies it as a filter to the scene's Spriteset (essentially a container holding sprites and tilemaps, but no UI or system sprites). Even with those hard-coded values the issue still arises, which is why I thought it might be a matrix issue. The plugin loads the shader on start-up and stores it globally; then, when you enter a map scene, the map spriteset creates a filter for itself that contains the shader code. I can provide a build if it's necessary. The code looks for the shader in "/js/Shaders/"; the Shaders folder doesn't normally exist in RPG Maker projects.
var DSE = DSE || {};

DSE.Lighting = function (_export) {

    _export.shader = null;

    function loadShader(name, type) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/js/Shaders/' + name + type);
        xhr.onreadystatechange = _onShaderLoad.bind(this, xhr, type);
        xhr.send();
    }

    function _onShaderLoad(xhr, type) {
        console.log("shader loaded?");
        if (type == ".frag") {
            _export.shader = xhr.responseText;
        }
    }

    loadShader("LightTest", ".frag");

    /**
     * @override
     */
    Spriteset_Map.prototype.createLowerLayer = function () {
        Spriteset_Base.prototype.createLowerLayer.call(this);
        this.createParallax();
        this.createTilemap();
        this.createCharacters();
        this.createShadow();
        this.createDestination();
        this.createLightLayers();
        this.createWeather();
    };

    Spriteset_Map.prototype.createLightLayers = function () {
        console.log(_export.shader);
        this._filters = [new PIXI.Filter('', _export.shader)];
    };

    return _export;
}({});

Using both of those files and the script call in a new project, the result is the same. I've yet to look in the PIXI JS file itself, but I figured I'd start with either the PIXI.Filter or RenderTexture classes. I'm not sure exactly how they work, but I hope it's simple enough. I couldn't find anything on Google about this, nor on these forums, aside from this: https://github.com/pixijs/pixijs/issues/2074 which, upon reading, seems to be an entirely different issue. Anyway, I'm posting this here to make sure it isn't something silly I've done rather than a bug with PixiJS. Any advice would be greatly appreciated.
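A likely culprit, assuming nothing else in the pipeline flips the scene: gl_FragCoord has its origin at the bottom-left of the current framebuffer, and PIXI compensates differently when rendering to the screen versus to a render texture, so light positions expressed in gl_FragCoord space mirror vertically in render textures. One way around it is to stop relying on gl_FragCoord and derive pixel positions from vTextureCoord plus a size uniform. This is only a sketch, not a verified fix: uLightPos is a made-up custom uniform, and filterArea is the built-in vec4 that PIXI v4 fills in for filters that declare it.

// Sketch: compute pixel coordinates from vTextureCoord so the result does not
// depend on the framebuffer's Y origin.
var frag = [
    'varying vec2 vTextureCoord;',
    'uniform sampler2D uSampler;',
    'uniform vec4 filterArea;',
    'uniform vec2 uLightPos;',
    'void main() {',
    '    vec2 pixelCoord = vTextureCoord * filterArea.xy;   // same orientation on screen and in RTs',
    '    float d = length(uLightPos - pixelCoord);',
    '    float atten = 1.0 / (1.0 + 0.005 * d + 0.0007 * d * d);',
    '    gl_FragColor = texture2D(uSampler, vTextureCoord) * (0.1 + atten);',
    '}'
].join('\n');

var filter = new PIXI.Filter(null, frag);
filter.uniforms.uLightPos = [720.0, 200.0];   // light position in filter-area pixels

With this, a light's Y is measured from the top of the filtered area in both cases, so it should land in the same place on screen and in the render texture.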
-
In many of my games I implement logic to render only when dirty. My UI code sets a dirty flag whenever anything changes, rather than rendering every possible frame. Most of my games are card games for mobile, and there's no constant animation, only when moving a card. I have done this with the goal of keeping the device cool and not burning the battery, since players can be on for long periods of time. Is this solution overkill, or do PixiJS or Phaser already optimize non-changing frames?
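For reference, a minimal dirty-flag loop with plain PIXI might look like the sketch below; names like markDirty are my own, and it assumes an existing renderer and stage.

// Sketch: only hit the GPU when something actually changed.
let dirty = true;
function markDirty() { dirty = true; }     // call this from UI / game state changes

function loop() {
    if (dirty) {
        dirty = false;
        renderer.render(stage);            // skip the draw entirely on quiet frames
    }
    requestAnimationFrame(loop);           // rAF itself is cheap when nothing is drawn
}
requestAnimationFrame(loop);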
-
Hi, I am trying to take a snapshot of the main container (the stage) of my application, which I render on every frame (customRenderer.render(stage)), and paste that snapshot onto the topmost child container of the stage. The code looks like:

const snapshot = this._customRenderer.generateTexture(this._stage);
const sprite = new Sprite(snapshot);
this._stage.getChildByName("snapshotHolder").addChild(sprite);

It takes the snapshot all right, but if the stage is scaled down, then even though the sprite is the actual size of the stage (let's say 1000x1000), the area covered by the snapshot is much smaller and the rest of the sprite is transparent. I'm not able to understand the logic behind this. I want to take the snapshot of the stage as it is visible (scaled or otherwise). Thanks for your help. -Arin
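If the goal is a snapshot at the stage's unscaled size, one workaround (a sketch, assuming a standard PIXI renderer and that nothing else reads the scale mid-frame) is to reset the scale just for the capture:

// Sketch: capture at scale 1, then restore whatever scale the stage had.
const sx = this._stage.scale.x, sy = this._stage.scale.y;
this._stage.scale.set(1, 1);
const snapshot = this._customRenderer.generateTexture(this._stage);
this._stage.scale.set(sx, sy);

generateTexture measures the object's current bounds, so a scaled-down stage produces a proportionally smaller texture; pinning the scale to 1 for the call sidesteps that.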
- 3 replies
- Tagged with: pixi, rendertexture (and 1 more)
-
Hey guys, I'm brand new to PixiJS and have an issue that I can't seem to figure out. I'm working on a real-time multiplayer game where there is a larger map (~3200 x 3200) and players are free to roam. However, I want each player's "camera" to only show a certain number of pixels, so that someone with a large screen won't have an advantage over someone with a small screen. All the game simulation is handled on the server side. I know how to set up an onresize handler; I'm just struggling with which values to update. I was able to fudge this with plain JavaScript like so (I only ever want to show 640 x 360 pixels):

const setCanvasSize = () => {
    console.log('Resizing canvas');
    CANVAS_WIDTH = window.innerWidth;
    CANVAS_HEIGHT = window.innerHeight;
    const ratio = 16 / 9;
    if (CANVAS_HEIGHT < CANVAS_WIDTH / ratio) {
        CANVAS_WIDTH = CANVAS_HEIGHT * ratio;
    } else {
        CANVAS_HEIGHT = CANVAS_WIDTH / ratio;
    }
    canvas.width = 640;
    canvas.height = 360;
    canvas.style.width = `${CANVAS_WIDTH}px`;
    canvas.style.height = `${CANVAS_HEIGHT}px`;
};

And now I need to replicate a similar behavior but am having some trouble. The desired effect is like so; this is a screen from my hacky version.
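The same idea carries over: keep the renderer's internal resolution fixed at 640 x 360 and only stretch the canvas with CSS. A sketch, assuming a PIXI Application created with an options object:

// Sketch: fixed 640x360 logical resolution, CSS-scaled to fill the window at 16:9.
const app = new PIXI.Application({ width: 640, height: 360 });
document.body.appendChild(app.view);

function fitCanvas() {
    const ratio = 16 / 9;
    let w = window.innerWidth;
    let h = window.innerHeight;
    if (h < w / ratio) { w = h * ratio; } else { h = w / ratio; }
    app.view.style.width = `${w}px`;     // stretch the element...
    app.view.style.height = `${h}px`;    // ...while the drawing buffer stays 640x360
}
window.addEventListener('resize', fitCanvas);
fitCanvas();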
-
Hey, I want to ask if there's a way to create render textures of display objects which aren't rendered/shown on screen. For example, let's say I have a container which holds the sprites of normal maps; I want to create a render texture of this container which I could then use in a shader. Now, I obviously wouldn't want this container to be visible at all, but if I make it invisible, the render texture is going to be empty as well. So the use case is that I have a container which is just used to create textures for a shader, and I don't want this container to be visible in the scene. Is there a way to achieve this?
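One approach that should work (a sketch, using the PIXI v4-style API): the container never has to be on the stage at all, because renderer.render can draw any detached display object into a RenderTexture.

// Sketch: render a detached, never-visible container into a RenderTexture.
const normalMapContainer = new PIXI.Container();             // not added to the stage
normalMapContainer.addChild(new PIXI.Sprite(normalTexture)); // "normalTexture" is assumed to exist

const rt = PIXI.RenderTexture.create(1024, 1024);
renderer.render(normalMapContainer, rt);                     // draws into rt, nothing hits the screen

// rt can then be handed to a filter/shader, e.g. as a custom uniform:
// myFilter.uniforms.uNormalMap = rt;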
- 4 replies
- Tagged with: rendertexure, container (and 1 more)
-
I am creating a game where a lot of tiles need to be rendered. My goal is to render at least 500k tiles with this method. Since the tiles won't be redrawn every time but only moved, this method should work out. I am using the pixi-tilemap library to create a fast and simple renderer for a dynamic tilemap. The renderer itself proves to work, but .position.set seems to pull the tilemap away from the interface. I know the .position.set call does not take the same parameters as the one in the demo; however, it also does not work with the demo's parameters, so some parameters must be missing. The tutorial I have been following is this classic demo: https://github.com/Alan01252/pixi-tilemap-tutorial
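For what it's worth, here is how I understand the coordinate spaces involved; a sketch based on the tutorial's CompositeRectTileLayer usage, with tileTexture, cameraX and cameraY as placeholders:

// Sketch: tiles are placed in the layer's LOCAL space via addFrame; position.set
// then offsets the whole layer in screen pixels, e.g. for camera scrolling.
var tilemap = new PIXI.tilemap.CompositeRectTileLayer(0, [tileTexture], true);
app.stage.addChild(tilemap);

tilemap.addFrame(tileTexture, 32, 64);        // a tile at local (32, 64)
tilemap.position.set(-cameraX, -cameraY);     // scroll by moving the layer, not the tiles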
-
Hi, I want to do something like this in Phaser: I've been learning WebGL because I think it must be done with WebGL, but if someone can tell me how I can do this effect in Phaser 3, that would be cool ^^
-
Hi all, I'm currently reading Rich's tutorial on improving performance by batching textures. When I use the Firefox dev tools canvas capture described in the article https://phaser.io/tutorials/advanced-rendering-tutorial/part5 it works with the Canvas renderer, but when I switch to WebGL it cannot capture the analysis; Firefox shows an error and breaks the app:

TypeError: methodSignatureEnums is not a function ----------------- call-watcher.js:168:34

I tried many versions of Firefox and Phaser, and none of them work. Has anyone tried this before and gotten it working? If so, please tell me a workable Firefox version. Many thanks.
-
Not sure if here's the best place for this, but since it is not a bug as such I avoided opening an issue on GitHub. I stumbled upon some odd rotation code in Phaser while reading https://phaser.io/tutorials/advanced-rendering-tutorial/part7 From src/pixi/renderers/webgl/utils/WebGLSpriteBatch.js and src/pixi/display/Sprite.js:

// Rotate matrix by 90 degrees
// We use precalculated values for sine and cosine of rad(90)
a = a0 * 6.123233995736766e-17 + -c0;
b = b0 * 6.123233995736766e-17 + -d0;
c = a0 + c0 * 6.123233995736766e-17;
d = b0 + d0 * 6.123233995736766e-17;

OK, some rotation code with fancy precalculated values. But the value given in e-notation is 0.00000000000000006123233995736766 and therefore effectively zero, which makes sense, since this was meant to be the precalculation of cos(rad(90)), which equals exactly zero. I suppose the literal resulted from the floating-point rounding error you get when evaluating Math.cos(Math.PI / 2) in a console. So a better version would be:

// Rotate matrix by 90 degrees
a = -c0;
b = -d0;
c = a0;
d = b0;

Maybe someone sees this and is willing to fix it on the side...
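For anyone who wants to convince themselves, the constant really is just cos(90°) noise (quick console check):

// The "precalculated" sine and cosine of 90 degrees:
console.log(Math.cos(Math.PI / 2));   // 6.123233995736766e-17  (mathematically 0)
console.log(Math.sin(Math.PI / 2));   // 1
// so  a = a0 * cos90 - c0 * sin90  collapses to  a = -c0, and likewise for b, c, d.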
-
Morning! I'm just working on a bit of a debug console for my game and was wondering if it's at all possible to switch the renderers at runtime? I have tried a couple of methods, but it just halts the code from executing, I think (no error). I'm guessing it would be a no-go?
-
Hi guys, I tested my game on an iPad 4, and sometimes a few sprites are not displayed (most of the time it's the bg image) while the rest render normally. Since it does not happen every time, it only happens on the iPad 4, and it does not pop up any error, it's really hard to find the cause. Does anyone know what could be the possible cause of this sort of partial rendering problem?
-
Hi All, TL;DR: I really just want a JavaScript sprite rendering engine. Is Phaser right for me, and/or with 3.0 will I be able to use just that part of the engine?

I am about halfway through my first Phaser game and am loving it. Because of the nature of the game, I am using very little of the engine. All I am currently using it for is input buttons and sprite rendering: no physics, movement, world updates, etc. I am using all my own code for movement, collision, updates, and so on. In the current Phaser version, I am starting to see some slowdown on mobile browsers. I still have some obvious optimization to do which will help, but I am at a point now where I could move things around easily, since basically everything is pure JavaScript and the hooks into Phaser are not deep at all. I read that with 3.0 we would be able to add and remove things from Phaser to match what we need; is this still the plan? Would I be able to use just the sprite renderer and button/input parts? Is there another engine that might be better (maybe just straight PIXI)? Or should I just use Canvas and write my own simple sprite management? Any advice from you guys with experience would be much appreciated.
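For scale, here is roughly what "just straight PIXI" looks like for sprite rendering plus basic input; a sketch using the v4-era loader API, with the file name and update code as placeholders:

// Sketch: plain PIXI sprite rendering with your own update loop, no game framework.
const renderer = PIXI.autoDetectRenderer(800, 600);
document.body.appendChild(renderer.view);
const stage = new PIXI.Container();

PIXI.loader.add('hero', 'img/hero.png').load(() => {
    const hero = new PIXI.Sprite(PIXI.loader.resources.hero.texture);
    hero.interactive = true;                       // basic tap/click input
    hero.on('pointerdown', () => { /* handle input */ });
    stage.addChild(hero);

    requestAnimationFrame(function tick() {
        // your own movement / collision / world update code goes here
        renderer.render(stage);
        requestAnimationFrame(tick);
    });
});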
- 2 replies
- Tagged with: deveopment, phaser (and 6 more)
-
Hello there! I have created a game with this amazing engine and the Tiled app. I've added a scrolling background image layer to the map in Tiled, but when I set the renderer to me.video.WEBGL or me.video.AUTO in the me.video.init function in the JS file, the background image layer doesn't show up. If I use the canvas renderer, it works fine. I've tested this in three different applications with the latest melonJS and boilerplate. Is it a known issue? It would be great if someone could give me an answer. Thanks!
- 6 replies
- Tagged with: renderer, imagelayer (and 2 more)
-
Hi. I'm fairly new to Pixi and I'm trying to do something with multiple renderers. I know I could add multiple canvases instead; however, I need a dedicated WebGL renderer so I can manipulate the transform and try to produce some trapezoid shapes. I also need both renderers to work on the same canvas to avoid creating multiple layers on document.body. My approach was:

1. Have a main renderer and a main stage.
2. Have a sideRenderer that will be affected by different transforms (using gl.uniformMatrix4fv to change the shape of the whole renderer output and achieve different shapes) and a sideStage that will hold any content (in this example, a simple sprite).
3. Make the sideRenderer render to a RenderTexture, which will be the source of a Sprite, which will be added to the main stage.

So in theory, anything that the side renderer renders to the RenderTexture should appear on the sprite on the main stage. If I somehow modify the side renderer, the transformed output should show up in the RenderTexture, if that makes any sense. I tried this with the example below, and it doesn't work. If I append sideRenderer.view to document.body, it renders as expected, but that's not what I want, as I need it to be part of a more complex setup. At some point this made me suspect that I cannot mix renderers like this (maybe the sideRenderer is still working in the background while the mainRenderer is trying to render an incomplete RenderTexture?) and cannot make one renderer render something for another renderer (sideRenderer to mainRenderer or vice versa), so I would like to know if there is any workaround or any way to override this behavior? Thanks for the help.

var renderer = null;
var sideRenderer = null;
var stage = null;
var sideStage = null;
var WIDTH = 1000;
var HEIGHT = 500;
var rt = new PIXI.RenderTexture( 1000, 500 );
var spriteRt = new PIXI.Sprite( rt );

init();

function init() {
    var rendererOptions = {
        backgroundColor: 0xffffff,
        transparent: true
    };

    // Create the renderers
    renderer = PIXI.autoDetectRenderer( WIDTH, HEIGHT, rendererOptions );
    sideRenderer = PIXI.autoDetectRenderer( WIDTH, HEIGHT, rendererOptions );

    // Add the canvas to the HTML document
    document.body.appendChild( renderer.view );

    // Create a container object called the `stage`
    stage = new PIXI.Container();
    sideStage = new PIXI.Container();

    stage.addChild( spriteRt );

    var loader = PIXI.loader;
    loader.add( 'texture', './media/crate.png' );
    loader.once( 'complete', onLoadedAsset );
    loader.load();
}

function onLoadedAsset() {
    var texture = PIXI.Texture.fromFrame( './media/crate.png' );
    var sprite = new PIXI.Sprite( texture );
    sideStage.addChild( sprite );
    update();
}

function update() {
    sideRenderer.render( sideStage, rt );
    renderer.render( stage );
    requestAnimationFrame( update );
}
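For context, each WebGL renderer owns its own GL context, and a texture created by one context can't be sampled by another, which is consistent with the behavior described. A workaround sketch with a single renderer follows; the trapezoid distortion would then come from something like a mesh or the pixi-projection plugin rather than from a second renderer's matrix.

// Sketch: one renderer, two passes — render sideStage into a RenderTexture,
// then render the main stage that displays it as a sprite.
var rt = PIXI.RenderTexture.create(WIDTH, HEIGHT);   // v4 style; v3 used new PIXI.RenderTexture(renderer, w, h)
var spriteRt = new PIXI.Sprite(rt);
stage.addChild(spriteRt);

function update() {
    renderer.render(sideStage, rt);   // off-screen pass into the texture
    renderer.render(stage);           // main pass; spriteRt now shows sideStage's content
    requestAnimationFrame(update);
}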
-
Hi there, on some low-performance devices Phaser's WebGL renderer doesn't work, but when I try Pixi v3 it runs well, so I am wondering: is there any chance of using the Pixi v3 renderer in Phaser?
-
Hi all, I'm using Spine, and for that I'm using spine-runtime, their own JS library (the Phaser plugins were clearly not as complete or up to date). To use this lib I had to create a custom object (a SpineObject) and implement its _renderCanvas and _renderWebgl methods. For canvas, I managed to render the animation properly; works like a charm. For WebGL, however, I'm struggling a lot. I don't know WebGL, which doesn't help. What happens is that I can't manage to make both renderers (Phaser's and Spine's) coexist: I can display my SpineObject OR the Phaser scene, but not both. Do you guys have any tips for coding such features? Any good advice on how to use two WebGL renderers, what to do and what to avoid? Thanks a lot.
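Not a full answer, but the usual issue when two libraries draw through the same WebGL context is leftover GL state. A generic, library-agnostic sketch of isolating the Spine draw call, using only raw WebGL calls (the function name and drawFn are my own):

// Sketch: save the pieces of GL state the foreign draw call is likely to change,
// run it, then restore them so Pixi/Phaser's assumptions still hold next frame.
function withIsolatedGLState(gl, drawFn) {
    const prevProgram = gl.getParameter(gl.CURRENT_PROGRAM);
    const prevArrayBuffer = gl.getParameter(gl.ARRAY_BUFFER_BINDING);
    const prevBlend = gl.isEnabled(gl.BLEND);

    drawFn();                                    // e.g. your SpineObject's WebGL draw

    gl.useProgram(prevProgram);
    gl.bindBuffer(gl.ARRAY_BUFFER, prevArrayBuffer);
    prevBlend ? gl.enable(gl.BLEND) : gl.disable(gl.BLEND);
}

Depending on what the Spine renderer touches, texture bindings, the active texture unit and vertex attribute state may also need restoring.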
-
We're coming to a point where most recent mobile browsers support WebGL, and I'm thinking about axing the canvas renderer to further minimize the Phaser build size. I don't see a way to exclude the canvas renderer in a custom build. I realize it's part of Pixi and maybe that's why it can't be separated out, but I just wanted to know for sure whether it's possible. Thanks.
-
Hello, after using CSG there are two edge lines that shouldn't be there, I think. http://www.babylonjs-playground.com/#1MH4BF Can this be corrected somehow? Thanks!
-
Hello everyone! I need to know if there is a way in Pixi to add width to the renderer view without scaling the content. I already tried using renderer.autoResize = false with no success. For example:

// First I set the height of the view
var h = window.innerHeight - 100;
renderer.view.style.height = h + "px";

// Then, I set the width with the right factor to keep everything in good shape
renderer.view.style.width = (h * 2.666) + "px";

// I would like to be able to add width afterwards without affecting the shape of the sprites
renderer.view.style.width += [missingWidthToFillTheScreen];

Do you think it's possible?
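One direction to try (a sketch, assuming a standard Pixi renderer): grow the drawing buffer itself with renderer.resize instead of stretching the CSS size, so the extra width becomes new empty space rather than a scale factor. extraWidth below is a placeholder for whatever is missing to fill the screen.

// Sketch: add horizontal space without distorting existing sprites.
var extraWidth = 200;   // placeholder value
renderer.resize(renderer.width + extraWidth, renderer.height);
// Existing content keeps its pixel size; the new pixels on the right are simply empty.

If view.style.width is also being set by hand, it has to track the new buffer width (or be left unset) so the canvas isn't stretched back out of shape.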
-
Hi, I just started with Phaser and thought it would be cool to mix in some 3D overlays for certain elements. Is it possible to render UV-mapped meshes using Phaser? I tried to pass the GL context (game.renderer.gl) to a function that does some raw WebGL operations, but it bails out with the message "INVALID_OPERATION: uniform2f: location not for current program", which seems Pixi related (https://github.com/pixijs/pixi.js/issues/181). Thanks! Karl
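That particular error usually means a uniform location is being set while some other program (here, Pixi's last-used shader) is still bound. A raw-WebGL sketch of the fix; myProgram, myResolutionLoc, viewWidth and viewHeight stand in for your own objects:

// Sketch: bind YOUR program before touching its uniform locations.
gl.useProgram(myProgram);
gl.uniform2f(myResolutionLoc, viewWidth, viewHeight);
// ... issue your own draw calls for the mesh ...
// Phaser/Pixi will rebind its own program on its next render pass,
// but restoring any buffers or state you changed is still a good idea.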
-
Hi everybody, I hope you will forgive and understand my bad English (from France here). I'm a very occasional coder; most of the time I don't know what I'm doing, but I never stop until it works. I am currently working on a project for a digital comic I am drawing. As the main purpose is to be able to set up graphic scenes, panels, pages and so on in a visual environment, I chose to build my digital comic in Flash Pro CC, which is a very simple tool for building 2D scenes with animations and sound. Then I use the flwebgl JavaScript tool made by Adobe to export Flash scenes into WebGL canvases (they also use the CreateJS sound library), and then use the flwebgl API to add some interactivity to my HTML5 page. But both flwebgl and its API are very basic, especially the renderer; I miss a lot of features included in Pixi. So my question is: is there a way to use Pixi as the final renderer on top of another WebGL library, like the one used in Flash Pro CC? To be honest, I have no idea how their library works; my knowledge of coding is definitely too limited. You can have a look at flwebgl here: https://github.com/claus/flwebgl.ts/blob/master/lib/flwebgl-0.2.js (it's all Greek to me). For those interested in using Flash animation with WebGL, I recommend giving OpenFL a try. It seems more complete than the Adobe library, but also much more complex. Too much for me, I must say... and the assembling process is too involved for a simple project like mine. But maybe there is another way to use SWF Flash animations with Pixi?
-
I need to do an update on a container of sprites, so I use container.removeChildren() to clean it out first, then I just add in the new sprites using container.addChild(). The container is inside another container. Sometimes when I render after doing this, I get the following error:

DisplayObject.prototype.updateTransform = function() {
    // create some matrix refs for easy access
    var pt = this.parent.worldTransform; // Uncaught TypeError: Cannot read property 'worldTransform' of null
    var wt = this.worldTransform;

Uncaught TypeError: Cannot read property 'worldTransform' of null
    24.DisplayObject.updateTransform @ pixi.js:7698
    23.Container.updateTransform @ pixi.js:7198
    23.Container.updateTransform @ pixi.js:7201
    48.WebGLRenderer.render @ pixi.js:13790
    Renderer.draw @ index.html:626
    tick @ index.html:236
    event @ d3.min.js:549
    tick @ d3.min.js:6651
    d3_timer_mark @ d3.min.js:2486
    d3_timer_step @ d3.min.js:2466

The `this` context is the sprite, so it seems like a deleted sprite is being rendered, and because removeChildren sets the parent of the removed children to null, this error happens. It happens seldom when I have the dev tools open (I'm working in Chrome), but it happens every time when the dev tools are closed (and the loop is cycling much faster). Is it possible that the event emitted by removeChildren is not propagating fast enough? Is this a known issue with removeChildren? Is there a better way to manage the process of updating a container of sprites with a dynamic population? I tried hacking around it by only removing the excess sprites and then doing this:

circles.removeChildAt(i);
circles.addChildAt(circle, i);

but I still got the same error.
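Not a diagnosis of the root cause, but a pattern that avoids the situation entirely: reuse sprites in place instead of destroying and recreating the population every update. A sketch; syncSprites, makeSprite and the x/y fields on the data items are my own invention:

// Sketch: keep a dynamic population of sprites in sync by reusing children,
// so nothing that might still be referenced ends up with a null parent.
function syncSprites(container, dataItems, makeSprite) {
    // grow: create sprites only for data we don't have children for yet
    while (container.children.length < dataItems.length) {
        container.addChild(makeSprite());
    }
    // update and show the sprites backed by data
    dataItems.forEach(function (d, i) {
        var s = container.children[i];
        s.visible = true;
        s.position.set(d.x, d.y);
    });
    // hide (rather than remove) the surplus
    for (var i = dataItems.length; i < container.children.length; i++) {
        container.children[i].visible = false;
    }
}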