hexus Posted June 16, 2016

Yo! I've started developing a game and I've been learning a hell of a lot about WebGL for lighting and shadows. I'm hoping to implement some sort of lighting pipeline, following a technique like this one: http://ahamnett.blogspot.co.uk/2013/05/2d-shadows-shader.html

This means I'll need to process textures in a fragment shader, and then process the scene using the resulting bitmap in another shader.

Just to test, I tried rendering part of the game world to a render texture. This worked fine without any shaders (Phaser/Pixi filters) applied. The render texture is displayed using a Sprite on the right-hand side, just so I could see the result. Cool.

However, if I add filters to what I'm rendering, the perspective gets all messed up, for example with my lighting fragment shader applied.

So instead of scratching my head over this, having done quite a bit of Googling over the past couple of days, I thought I'd ask for help here. Is there a way to render to a render texture with shaders applied correctly? Is there a typical approach to setting up a pipeline like this? Essentially I'm looking to perform transformations as described in the above article so I can start casting some sweet shadows, but I can't quite see how yet. The perspective is fine if I apply shaders to the stage instead, but of course then I have no shader processing for the resulting render texture.
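For anyone unfamiliar with the linked technique: its first pass renders occluders into a polar "distance map", one texel per angle holding the normalised distance from the light to the nearest occluder. A rough CPU-side sketch in plain JavaScript (the real thing runs as a fragment shader; the function name and raymarching loop here are mine, not from the article):

```javascript
// Sketch of the linked technique's first pass, done on the CPU for clarity.
// `occluder` is a size*size grid of 0/1 values with the light at the centre.
// For each angle, march outward from the light and record the normalised
// distance to the first occluding cell; 1.0 means nothing was hit.
function buildDistanceMap(occluder, size, angleCount) {
  const map = new Float32Array(angleCount);
  const half = size / 2;
  for (let a = 0; a < angleCount; a++) {
    const theta = (a / angleCount) * 2 * Math.PI;
    let d = 1.0; // default: no occluder along this ray
    for (let r = 0; r < half; r++) {
      const x = Math.floor(half + Math.cos(theta) * r);
      const y = Math.floor(half + Math.sin(theta) * r);
      if (occluder[y * size + x]) {
        d = r / half;
        break;
      }
    }
    map[a] = d;
  }
  return map;
}
```

The second pass then compares each pixel's distance from the light against the stored value for its angle to decide whether it is in shadow.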
Fatalist Posted June 17, 2016

Does your sprite have a position (other than 0,0) or scale (other than 1,1) when you're rendering it to the render texture? RenderTexture ignores the sprite's position/rotation/scale.

I've been working on a similar method recently, using pixi.js: http://light.netlify.com/ (the code is not for the faint of heart; it's a more complicated method and needs some cleanup).
hexus Posted June 17, 2016 (Author)

Nice! That's exactly the sort of effect I'm going for, though I've noticed the shadows flicker around surfaces when the light moves.

I'm actually drawing part of the entire game world, and no sprites have been scaled. What doesn't make sense to me is why the perspective only gets thrown off when rendering with a filter, when the filter renders fine normally (not to a render texture).

```javascript
let matrix = new PIXI.Matrix(
    1, 0,
    0, 1,
    -object.x - object.width / 2 + this.distanceSprite.width / 2,
    -object.y - object.height / 2 + this.distanceSprite.height / 2
);

this.distanceTexture.render(this.world, matrix, true);
```

That's how I'm rendering to the texture. Perhaps this is just a bug in Pixi v2, or I'm doing something wrong in my shader, which can be found on Shadertoy: https://www.shadertoy.com/view/MsyXz3
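As an aside, the two translation terms in that matrix just centre the object in the distance texture. A plain restatement with a hypothetical helper name, in case it helps sanity-check the numbers:

```javascript
// Hypothetical helper restating the matrix's tx/ty terms above: translate the
// world so that `object`'s centre lands on the centre of the target texture.
function centeringOffset(object, texture) {
  return {
    tx: -object.x - object.width / 2 + texture.width / 2,
    ty: -object.y - object.height / 2 + texture.height / 2
  };
}
```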
Fatalist Posted June 17, 2016 Share Posted June 17, 2016 16 minutes ago, hexus said: I've noticed the shadows flicker around surfaces when the light moves. Yeah, that's because of the soft edges, my current method turned out very unreliable, I'll have to do that differently. 19 minutes ago, hexus said: Perhaps this is just a bug in Pixi v2, or I'm doing something wrong in my shader, which can be found on Shadertoy: What values do you pass for iResolution? [renderTexture.width, renderTexture.height]? Are you assigning your filter to sprite.shader or sprite.filters ? Link to comment Share on other sites More sharing options...
hexus Posted June 17, 2016 (Author)

21 minutes ago, Fatalist said:
> What values do you pass for iResolution? [renderTexture.width, renderTexture.height]? Are you assigning your filter to sprite.shader or sprite.filters?

I apply the shader to the game world at the moment, using world.filters. The variables I use in my setup differ from those on Shadertoy, but it's the resolution uniform passed in by Phaser/Pixi that I'm using.

21 minutes ago, Fatalist said:
> Yeah, that's because of the soft edges.

Oh right. For the soft edges I was planning a gradual Gaussian blur based on distance, though I haven't thought about how I'll do that in practice yet.
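That plan could be sketched roughly like this (hypothetical, 1D, and CPU-side for brevity; a real version would be a blur shader sampling a 2D shadow mask with a kernel whose radius grows with distance from the occluder):

```javascript
// Distance-driven box blur over a 1D shadow mask, as a stand-in for the
// "gradual blur based on distance" idea. `mask` holds shadow values (0..1),
// `distance` holds a normalised distance from the occluder per sample, and
// the blur radius scales with that distance, so edges near the occluder stay
// hard while far-away edges soften.
function distanceBlur(mask, distance, maxRadius) {
  return mask.map((_, i) => {
    const radius = Math.max(1, Math.round(distance[i] * maxRadius));
    let sum = 0, count = 0;
    for (let j = i - radius; j <= i + radius; j++) {
      if (j >= 0 && j < mask.length) {
        sum += mask[j];
        count++;
      }
    }
    return sum / count;
  });
}
```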
hexus Posted June 17, 2016 (Author)

Ah, actually I'm passing the screen resolution as the resolution uniform, so that the result fits the screen. Maybe that's where this is going wrong: the screen resolution works fine for the game canvas, but when rendering to a render texture the resolution is going to be different, so perhaps the transform matrix doesn't work the way I'd expect it to here. Later on I'll tinker with setting the resolution to the render texture size before calling the above .render().
Fatalist Posted June 17, 2016 Share Posted June 17, 2016 2 hours ago, hexus said: For the soft edges I was planning on a gradual gaussian blur based on the distance That's the only reliable way I guess. Currently I do it like this: in the 1d height texture, for each angle, I store the position of the obstacle that casts shadow on this column(along with the column height), obviously there are many such obstacles so I just choose the first obstacle, and in the final shader, I calculate what part of the light disc is covered by this obstacle from current pixel. I thought it would still look fine and I'd avoid multiple texture reads in the final shader. But it seams blurring is needed anyway... 1 hour ago, hexus said: Later on I'll tinker with setting the resolution to the render texture size before calling the above .render(). Yeah it has to be something with the resolution. Link to comment Share on other sites More sharing options...
hexus Posted June 18, 2016 (Author)

21 hours ago, Fatalist said:
> Yeah, it has to be something with the resolution.

Seems like it was! I played around, modified the filter/shader resolution to that of the render texture (and catered for positioning too), then rendered the render texture. Then the uniforms get set back to screen space for Phaser's own rendering, and voilà, job done. It makes a lot of sense now; it seems like it should have been obvious.

Thanks for the help! Now I can start properly messing around with a lighting pipeline.
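For anyone following along, a minimal sketch of that sequence with a hypothetical helper name (Phaser 2 filters expose a default `resolution` uniform, but check the uniform names in your own shader):

```javascript
// Sketch of the fix described above: point the filter's resolution uniform at
// the render texture's size for the off-screen pass, then restore screen
// space so Phaser's own rendering is unaffected. `doRender` would wrap the
// renderTexture.render(...) call.
function renderWithFilterResolution(filter, renderTexture, game, doRender) {
  filter.uniforms.resolution.value = { x: renderTexture.width, y: renderTexture.height };
  doRender();
  filter.uniforms.resolution.value = { x: game.width, y: game.height };
}
```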