Search the Community
Showing results for tags 'glsl'.
-
I think I have seen this question asked before, but I have not been able to implement a working solution in pixi.js v4.5.0. I am new to this, so bear with me. I have a custom GLSL filter (the Game of Life, for now) and I want to continually apply it to a sprite in a feedback loop. That is:

1. Apply the filter to the sprite image
2. Capture the result
3. Set the sprite image to the result
4. Loop back to step 1

I have attempted this with PIXI.Texture.fromCanvas(app.view) -> texture.update(), but that resulted in a black screen. I think using RenderTexture buffers with app.renderer.render may be on the right track, but I'm not sure how to apply them: setting my sprite.texture to a RenderTexture resulted in an error, and I've had various "Feedback loop detected" warnings in the console when I passed render textures as filter uniforms and read from them. Is there a way to set the sprite texture to the filtered sprite?
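For reference, this is the ping-pong arrangement I am trying to get working (a minimal sketch only; lifeFilter stands in for my Game of Life filter, and I would seed rtA with the initial board first):

    var rtA = PIXI.RenderTexture.create(app.renderer.width, app.renderer.height);
    var rtB = PIXI.RenderTexture.create(app.renderer.width, app.renderer.height);
    var sprite = new PIXI.Sprite(rtA);
    sprite.filters = [lifeFilter];
    app.stage.addChild(sprite);

    app.ticker.add(function () {
        // render the filtered sprite into the back buffer
        // (reads rtA, writes rtB, so there is no same-texture feedback)
        app.renderer.render(sprite, rtB);
        // swap so this frame's output becomes next frame's input
        var tmp = rtA; rtA = rtB; rtB = tmp;
        sprite.texture = rtA;
    });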
-
Hello, friends. I have the following task: send a single wave over a sphere that serves as a map of the planet. The map, in turn, is generated from PlaneBufferGeometry pieces, which are then combined using THREE.BufferGeometryUtils.mergeBufferGeometries(geometries, false). Code of the vertex shader:

    varying vec2 vUv;
    uniform float time;
    void main() {
        vUv = uv;
        vec3 newposition = position + position * sin(position.z * 12.) * 0.03;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(newposition, 1.);
    }

The question is how to make only this one "wave", i.e. so that the sine does not run over the entire sphere, but only over a limited region, for example in the middle. And further: how can I change the direction of this "wave"? Right now it is driven by the Z coordinate of the sphere; I need it to emanate from a position I define (for example, the approximate coordinates of London).
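For reference, this is the direction I am experimenting with (a rough, unverified sketch; waveCenter is a uniform I would add myself, a unit vector pointing at London's position on the sphere, and the 0.6 / 40.0 / 4.0 constants are arbitrary):

    varying vec2 vUv;
    uniform float time;
    uniform vec3 waveCenter; // unit vector toward the wave origin (e.g. London)
    void main() {
        vUv = uv;
        // angular distance from the wave origin on the sphere
        float d = acos(clamp(dot(normalize(position), normalize(waveCenter)), -1.0, 1.0));
        // confine the wave to a region around the origin
        float falloff = 1.0 - smoothstep(0.0, 0.6, d);
        // ripple travels outward from the origin instead of along Z
        vec3 newposition = position + normal * sin(d * 40.0 - time * 4.0) * 0.03 * falloff;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(newposition, 1.0);
    }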
-
Our game has mostly interior environments, and we need reasonably correct reflections for the floor. We're already using box-projected cubemaps (i.e. parallax envMap), but since each mesh can only have one envMap, we have to split the floor into multiple parts according to the local cubemap position, which is unworkable for our use case. We need someone to implement the POI-based cubemap blending method described in great detail here: https://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/ References: https://seblagarde.files.wordpress.com/2012/08/parallax_corrected_cubemap-siggraph2012.pdf https://docs.godotengine.org/en/3.1/tutorials/3d/reflection_probes.html#blending https://docs.unity3d.com/Manual/UsingReflectionProbes.html We're really looking forward to a long-term collaboration. The budget is to be negotiated. Please send your quote to: [email protected]
-
A part of my game's post-process render pipeline:

1. Downscale the render to 25% size
2. Do some post-processing on the downscaled image
3. Pass both the image before step 1 and the image after step 2 into a GLSL fragment shader with effect.setTextureFromPostProcessOutput(...)
4. The fragment shader outputs the low-res processed image overlaid on top of the original high-res render

Problem: the final render is pixelated. My guess is that the initial downscale made the shader use the low-res image, rather than the higher-res input texture, as the "base resolution". What's going on here? How do I properly set fragment shader input textures of different resolutions in a post-process? (A sketch of my setup follows.)
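Here is roughly what I have, reduced to a sketch (the shader names are placeholders; I am assuming the usual BABYLON.PostProcess argument order of name, fragment URL, uniforms, samplers, ratio, camera, and that setTextureFromPostProcess binds a pass's input while setTextureFromPostProcessOutput binds its output):

    // quarter-res passes
    var downscale = new BABYLON.PostProcess("down", "downShader", [], null, 0.25, camera);
    var process = new BABYLON.PostProcess("proc", "procShader", [], null, 0.25, camera);
    // final combine pass back at full resolution (ratio 1.0), so the low-res
    // result gets upsampled into it instead of the whole chain staying at 25%
    var combine = new BABYLON.PostProcess("combine", "combineShader", [], ["fullRes"], 1.0, camera);
    combine.onApply = function (effect) {
        // textureSampler already receives the previous (low-res) pass's output;
        // separately bind the original full-res render, i.e. the input of the first pass
        effect.setTextureFromPostProcess("fullRes", downscale);
    };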
-
So I figured, with a few people making some cool shaders now and the proposed improvements to the CYOS, that we should have a thread for shader development to showcase what people are making and talk about different methods and concepts. To kick things off, I figured I'd post a procedural skymap... this is a cleaned-up version of the first one I posted last night and is based on a standard box element. I have not tested it in a scene yet, but the CYOS output is promising. I'll be looking to add volumetric weather soon and will be making the sun's position dependent on a light in the scene. Anyway, feel free to comment; it is pretty much a direct port of an atmospheric GLSL process I found on GitHub. http://www.babylonjs.com/cyos/#14WKFU#1 Does anyone have any good resources for volumetric cloud rendering with a light source? I'm reading up on this first: http://www-evasion.imag.fr/Publications/2008/BNMBC08/clouds.pdf
-
I am trying to implement an "additive shader" (from a space shooter tutorial) where BLACK pixels are transparent (i.e. do NOT add) and the rest of the colors add on top... Does the BabylonJS community have a shader already that does something like that? If not, I will have to make one... I tried to start off by just returning a transparent color:

    void main(void) {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
    }

I have "needsAlphaBlending = true" on the shader material options object, BUT I STILL SEE A BLACK SQUARE (a little less bright, but still there)... I would assume that setting a color rgba(0,0,0,0) would make EVERY pixel transparent, but it does not. Any help or info would be very kool.
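For context, this is the kind of setup I am experimenting with (a sketch only; "spaceGlow" is my own shader name, and I am assuming the material's alphaMode can simply be set to Babylon's additive blending constant):

    var mat = new BABYLON.ShaderMaterial("additive", scene, "spaceGlow", {
        attributes: ["position", "uv"],
        uniforms: ["worldViewProjection"],
        needAlphaBlending: true
    });
    // with additive blending (src + dest), black pixels contribute nothing,
    // so no alpha keying should be necessary
    mat.alphaMode = BABYLON.Engine.ALPHA_ADD;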
-
I took a quick look at the source code, and it seems that we have no way to update only one element of a uniform array? In raw WebGL it would look like this:

    // in JavaScript at init time
    var someVec2Element0Loc = gl.getUniformLocation(someProgram, "u_someVec2[0]");
    var someVec2Element1Loc = gl.getUniformLocation(someProgram, "u_someVec2[1]");
    var someVec2Element2Loc = gl.getUniformLocation(someProgram, "u_someVec2[2]");

    // at render time
    gl.uniform2fv(someVec2Element0Loc, [1, 2]); // set element 0
    gl.uniform2fv(someVec2Element1Loc, [3, 4]); // set element 1
    gl.uniform2fv(someVec2Element2Loc, [5, 6]); // set element 2

For now I need to hack around it like this:

    var locs = engine.getUniforms(material.getEffect()._program, ['test1[0]']);
    engine.setFloat3(locs[0], 0.0, 1.0, 0.0);
-
Hey all, can I use any GLSL fragment or vertex shader (including 3D raymarching stuff) as a texture in Babylon.js, including animated ones? I've done some Google searches and I know you can use some, but what are the limitations? For example, could I put any animated texture from GLSL Sandbox (http://glslsandbox.com/) onto a Babylon.js plane mesh? Do I need to update the uniform variables in the render loop for the animation to work (see the sketch below for what I mean)? Super hoping the answer is yes, but any and all info will be helpful.
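This is roughly what I imagine, as a sketch (assuming a ShaderMaterial wrapping the Sandbox code, registered under "mySandboxShader", a name of my own choosing):

    var mat = new BABYLON.ShaderMaterial("sandbox", scene, "mySandboxShader", {
        attributes: ["position", "uv"],
        uniforms: ["worldViewProjection", "time", "resolution"]
    });
    plane.material = mat;

    var t = 0;
    scene.registerBeforeRender(function () {
        // drive the Sandbox-style `time` uniform every frame
        t += scene.getEngine().getDeltaTime() / 1000;
        mat.setFloat("time", t);
    });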
-
Hey folks, first of all: thank you for this great forum. It has come to the rescue a few times now, so thanks to everyone who's participating. I'm currently working on an idea where I would like to project a spherical panorama texture onto a mesh from inside (meaning from the viewpoint). Similar to a standard VR viewer, where the texture is mapped onto a sphere from inside, but in my case I would like to map it onto the actual scene mesh, which I get from 3dsmax. Now, I know that I could create the UVs or bake the texture in 3dsmax, but I want to switch between two camera positions and therefore change the projected texture and the center of the spherical projection. I already got camera mapping to work with a planar image like this: http://www.babylonjs-playground.com/#203BJM#2 but that's not exactly what I need. My image has to get spherically wrapped around the camera AND stay in position while the camera moves around. Is this understandable? I could provide a small scene, which I'd need to create first, as my actual scene is way too big and consists of too many elements. But maybe someone already has a solution or an idea. I'm not even sure whether this is possible with babylonjs coordinate modes or has to be solved with a custom shader. Thanks!
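In case a custom shader is the way to go, this is the kind of fragment logic I have in mind (only a sketch; projCenter and panoTex are uniforms I would supply myself, and vWorldPos would come from my vertex shader):

    varying vec3 vWorldPos;    // world-space position from the vertex shader
    uniform vec3 projCenter;   // center of the spherical projection (a camera position)
    uniform sampler2D panoTex; // equirectangular panorama

    void main() {
        // direction from the projection center to this fragment;
        // fixed projCenter = the image "stays in position" as the camera moves
        vec3 dir = normalize(vWorldPos - projCenter);
        // convert the direction to equirectangular UVs
        float u = atan(dir.z, dir.x) / (2.0 * 3.14159265) + 0.5;
        float v = asin(clamp(dir.y, -1.0, 1.0)) / 3.14159265 + 0.5;
        gl_FragColor = texture2D(panoTex, vec2(u, v));
    }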
-
Do we have now, or are we going to support, the WebGL 2 sampler2DArray? I know we have the sampler2D[x] approach that actually takes an array of separate Babylon textures... but each of those textures STILL counts against MAX_TEXTURE_IMAGE_UNITS, whereas a sampler2DArray counts as a single texture unit and also allows tiling in your texture atlas... Here is sample WebGL texture array code:

    // GLSL ES 3.00 (#version 300 es); note sampler2DArray is sampled with
    // texture(), and an integer varying must be declared flat
    precision highp float;
    precision highp sampler2DArray;
    uniform sampler2DArray myTextureSampler;
    in vec2 UV;
    flat in int index;
    out vec3 out_Color;
    void main(void) {
        // use the texture coordinates as usual; the individual textures are
        // indexed by the third component of the vec3
        out_Color = texture(myTextureSampler, vec3(UV, float(index))).rgb;
    }

Can we support this now, or in the near future? @Deltakosh @NasimiAsl @Sebavan @RaananW and anybody else.
-
I am trying to create a fragment shader via a PIXI.AbstractFilter to create a wave rippling effect to be applied to a background texture. I have already worked out the algorithm for the wave effect in JavaScript. What I am having difficulty doing is getting the data I need into the shader through PIXI. For my effect to work, I need a large Float32Array to keep track of wave heights, and a texture containing the original, unaltered contents of the background image to read from, in order to apply the pixel-displacement (light refraction) effect. I've been doing a lot of searching and have come up with some partial solutions. I attempt to load my large Float32Array into the shader as a texture with type GL.FLOAT (via the OES_texture_float extension) and an internal format of GL.LUMINANCE, and read from it. From what I can tell, my shader isn't receiving my data the way I need it to. Just as a test, I set gl_FragColor to read from my data texture, and instead of the solid black that should have appeared, it rendered a color from either the source texture or the texture of the sprite that the filter is applied to. If I weren't using PIXI, what I would try next is gl.getUniformLocation, but it takes the current program as its first parameter, and I don't know of a way to access that. The basic flow of my shader needs to be:

Read from array -> calculate displacement based on value -> render the current fragment as the color at (x+displacement, y+displacement) -> get updated version of array

This is the code in the constructor for my shader:

    ws.Shader = function(tex) {
        // GLSL Fragment Shader for Wave Rendering
        ws.gl = game.renderer.gl;
        ws.flExt = ws.gl.getExtension("OES_texture_float");

        var unis = {
            dataTex:    { type: "sampler2D", value: ws.gl.TEXTURE1 },
            canvasTex:  { type: "sampler2D", value: ws.gl.TEXTURE2 },
            mapSize:    { type: "2f", value: [ws.width+2, ws.height+2] },
            dispFactor: { type: "1f", value: 20.0 },
            lumFactor:  { type: "1f", value: 0.35 }
        };

        var fragSrc = [
            "precision mediump float;",
            "varying vec2 vTextureCoord;",
            "varying vec4 vColor;",
            "uniform sampler2D uSampler;",
            "uniform sampler2D dataTex;",
            "uniform sampler2D canvasTex;",
            "uniform vec2 mapSize;",
            "uniform float dispFactor;",
            "uniform float lumFactor;",
            "void main(void) {",
            "    vec2 imgSize = vec2(mapSize.x-2.0, mapSize.y-2.0);",
            "    vec2 mapCoord = vec2((vTextureCoord.x*imgSize.x)+1.5, (vTextureCoord.y*imgSize.y)+1.5);",
            "    float wave = texture2D(dataTex, mapCoord).r;",
            "    float displace = wave*dispFactor;",
            "    if (displace < 0.0) {",
            "        displace = displace+1.0;",
            "    }",
            "    vec2 srcCoord = vec2((vTextureCoord.x*imgSize.x)+displace, (vTextureCoord.y*imgSize.y)+displace);",
            "    if (srcCoord.x < 0.0) {",
            "        srcCoord.x = 0.0;",
            "    }",
            "    else if (srcCoord.x > mapSize.x-2.0) {",
            "        srcCoord.x = mapSize.x-2.0;",
            "    }",
            "    if (srcCoord.y < 0.0) {",
            "        srcCoord.y = 0.0;",
            "    }",
            "    else if (srcCoord.y > mapSize.y-2.0) {",
            "        srcCoord.y = mapSize.y-2.0;",
            "    }",
            /*"    srcCoord.x = srcCoord.x/imgSize.x;",
            "    srcCoord.y = srcCoord.y/imgSize.y;",*/
            "    float lum = wave*lumFactor;",
            "    if (lum > 40.0) { lum = 40.0; }",
            "    else if (lum < -40.0) { lum = -40.0; }",
            "    gl_FragColor = texture2D(canvasTex, vec2(0.0,0.0));",
            "    gl_FragColor.r = gl_FragColor.r + lum;",
            "    gl_FragColor.g = gl_FragColor.g + lum;",
            "    gl_FragColor.b = gl_FragColor.b + lum;",
            "}"];

        ws.shader = new PIXI.AbstractFilter(fragSrc, unis);

        // Send empty wave map to WebGL
        ws.activeWaveMap = new Float32Array((ws.width+2)*(ws.height+2));
        ws.dataPointerGL = ws.gl.createTexture();
        ws.gl.activeTexture(ws.gl.TEXTURE1);
        ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
        // Non-Power-of-Two Texture Dimensions
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
        ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width+2, ws.height+2, 0,
                         ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

        // Send texture data from canvas to WebGL
        var canvasTex = ws.gl.createTexture();
        ws.gl.activeTexture(ws.gl.TEXTURE2);
        ws.gl.bindTexture(ws.gl.TEXTURE_2D, canvasTex);
        // Non-Power-of-Two Texture Dimensions
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
        ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE);
        ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.RGBA, ws.gl.RGBA, ws.gl.UNSIGNED_BYTE, tex.imageData);
    }

I then attempt to update dataTex in the ws object's update loop:

    ws.activeWaveMap.set(ws.outgoingWaveMap);

    // WebGL Update
    ws.gl.activeTexture(ws.gl.TEXTURE1);
    ws.gl.bindTexture(ws.gl.TEXTURE_2D, ws.dataPointerGL);
    /* // Non-Power-of-Two Texture Dimensions
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_MIN_FILTER, ws.gl.NEAREST);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_S, ws.gl.CLAMP_TO_EDGE);
    ws.gl.texParameteri(ws.gl.TEXTURE_2D, ws.gl.TEXTURE_WRAP_T, ws.gl.CLAMP_TO_EDGE); */
    ws.gl.texImage2D(ws.gl.TEXTURE_2D, 0, ws.gl.LUMINANCE, ws.width+2, ws.height+2, 0,
                     ws.gl.LUMINANCE, ws.gl.FLOAT, ws.activeWaveMap);

I'm sure that plenty of this isn't right, but I believe I can sort things out once I can actually access my data. Can anyone point me in the right direction? This is central enough to my project that I am willing to discard PIXI altogether if there isn't a way to implement what I am trying to do. Also, I am using PIXI via Phaser, if that makes a difference. Thanks!
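(A hunch I haven't verified, noted in case it's relevant: texture2D() samples with normalized coordinates in [0, 1], so the pixel-space mapCoord above would presumably need to be divided by the texture size before the lookup, something like this:)

    "    vec2 mapCoord = vec2((vTextureCoord.x*imgSize.x)+1.5, (vTextureCoord.y*imgSize.y)+1.5) / mapSize;",
    "    float wave = texture2D(dataTex, mapCoord).r;",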
-
Updating shadow based on vertex shader displacements
I'm morphing an object's vertices using a vertex shader in Babylon.js. The morphed object looks great, but I can't figure out a way to make the object's shadow update as well. I know that in Three.js there is a customDepthMaterial for a mesh, where you can pass in the same custom vertex shader to correctly update the object's shadow, but is there something similar in Babylon.js? Thanks!
-
While trying to make my own shader, I seem to be stuck getting shadows to work for a directional light. I copy/pasted what I believe are all the relevant parts from the Babylon standard shaders into my own vertex and fragment shaders, and while it compiles and lights/textures work just fine, shadows do not appear. Could anyone point me to the issue? GL throws a warning about a lack of textures, but I don't believe that affects the outcome, as my complete shader with textures has the same issue. http://www.babylonjs-playground.com/#1JFVDG#0 Uncomment line 124 to apply the shader to the ground mesh. Thanks again as usual. PS: the playground seems to hang quite a lot trying to run this code; outside the playground it works fine.
-
I am not very familiar with GLSL. I am trying to integrate an ambient occlusion shader with Babylon using ShaderMaterial. The shader source can be found at https://github.com/mikolalysenko/ao-shader (vertex shader: https://github.com/mikolalysenko/ao-shader/blob/master/lib/ao.vsh, fragment shader: https://github.com/mikolalysenko/ao-shader/blob/master/lib/ao.fsh). I've set up the shader in CYOS at http://www.babylonjs.com/cyos/#1F1POU. CYOS is throwing some errors:

    [.Offscreen-For-WebGL-0x7f9aaa872c00]PERFORMANCE WARNING: Attribute 0 is disabled. This has significant performance penalty
    /cyos/#1F1POU:1 [.Offscreen-For-WebGL-0x7f9aaa872c00]RENDER WARNING: there is no texture bound to the unit 0

When using the shader with shaderMaterial like:

    var aoShader = new BABYLON.ShaderMaterial("AO", scene, {
        vertexElement: "vertexShaderCode",
        fragmentElement: "fragmentShaderCode"
    }, {
        attributes: ["attrib0", "attrib1"],
        uniforms: ["projection", "view", "model", "tileCount", "tileSize", "tileMap"]
    });
    aoShader.setFloat('tileSize', 0.16);

it produces this error:

    babylon.js:4 WebGL: INVALID_OPERATION: drawElements: no buffer is bound to enabled attribute

I believe the shader is dependent on some external mesh data, and I'm not sure how to pass it.
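(For what it's worth, my current guess at the missing piece, assuming Babylon's Mesh.setVerticesData accepts custom attribute names; attrib0Data, attrib1Data, and the stride of 4 are placeholders for whatever the ao-shader actually packs per vertex:)

    // upload the shader's packed custom vertex buffers under matching names
    mesh.setVerticesData("attrib0", attrib0Data, false, 4);
    mesh.setVerticesData("attrib1", attrib1Data, false, 4);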
-
I really need help understanding the GLSL equivalents of the provided BABYLON uniforms. I have seen them called so many different names depending on whose docs you're reading. The ONLY ONE I know for sure is that 'worldViewProjection' in Babylon.js is the GLSL computation 'gl_ProjectionMatrix * gl_ModelViewMatrix', or you can use the built-in shortcut 'gl_ModelViewProjectionMatrix'. I need someone who knows the BABYLON JS GLSL stuff to PLEASE PLEASE PLEASE tell me what the others equal in GLSL terms:

    view                = gl_???
    projection          = gl_???  (maybe it's gl_ProjectionMatrix)
    viewProjection      = gl_??? * gl_???
    world               = gl_???
    worldView           = gl_???
    worldViewProjection = gl_ProjectionMatrix * gl_ModelViewMatrix, or simply gl_ModelViewProjectionMatrix

I am sorry I don't know the GLSL stuff, but if I can get someone who does to simply fill out the five 'gl_???' placeholders above... I'll be your friend for life. Note: I am trying to create a block of code that will run both in Unity, using the GLSL stuff, and in BABYLON JS, using the native uniforms. My Babylon macros:

    //BABYLON-VERTEX-MACROS-START
    attribute vec3 position;
    vec4 GL_POSITION_ATTRIBUTE() { return vec4(position, 1.0); }
    attribute vec3 normal;
    vec3 GL_NORMAL_ATTRIBUTE() { return normal; }
    attribute vec2 uv;
    vec2 GL_UV_ATTRIBUTE() { return uv; }
    attribute vec2 uv2;
    vec2 GL_UV2_ATTRIBUTE() { return uv2; }
    attribute vec2 uv3;
    vec2 GL_UV3_ATTRIBUTE() { return uv3; }
    attribute vec2 uv4;
    vec2 GL_UV4_ATTRIBUTE() { return uv4; }
    attribute vec2 uv5;
    vec2 GL_UV5_ATTRIBUTE() { return uv5; }
    attribute vec2 uv6;
    vec2 GL_UV6_ATTRIBUTE() { return uv6; }
    attribute vec4 color;
    vec4 GL_COLOR_ATTRIBUTE() { return color; }
    uniform mat4 view;
    mat4 GL_VIEW_UNIFORM() { return view; }
    uniform mat4 projection;
    mat4 GL_PROJECTION_UNIFORM() { return projection; }
    uniform mat4 viewProjection;
    mat4 GL_VIEWPROJECTION_UNIFORM() { return viewProjection; }
    uniform mat4 world;
    mat4 GL_WORLD_UNIFORM() { return world; }
    uniform mat4 worldView;
    mat4 GL_WORLDVIEW_UNIFORM() { return worldView; }
    uniform mat4 worldViewProjection;
    mat4 GL_WORLDVIEWPROJECTION_UNIFORM() { return worldViewProjection; }
    //BABYLON-VERTEX-MACROS-END

My GLSL equivalent macros (as you can see, I still need the gl_??? parts):

    //BABYLON-VERTEX-MACROS-START
    vec4 GL_POSITION_ATTRIBUTE() { return gl_Vertex; }
    vec3 GL_NORMAL_ATTRIBUTE() { return gl_Normal; }
    vec2 GL_UV_ATTRIBUTE() { return vec2(gl_MultiTexCoord0.xy); }
    vec2 GL_UV2_ATTRIBUTE() { return vec2(gl_MultiTexCoord1.xy); }
    vec2 GL_UV3_ATTRIBUTE() { return vec2(gl_MultiTexCoord2.xy); }
    vec2 GL_UV4_ATTRIBUTE() { return vec2(gl_MultiTexCoord3.xy); }
    vec2 GL_UV5_ATTRIBUTE() { return vec2(gl_MultiTexCoord4.xy); }
    vec2 GL_UV6_ATTRIBUTE() { return vec2(gl_MultiTexCoord5.xy); }
    vec4 GL_COLOR_ATTRIBUTE() { return gl_Color; }
    mat4 GL_VIEW_UNIFORM() { return gl_???; }
    mat4 GL_PROJECTION_UNIFORM() { return gl_???; }
    mat4 GL_VIEWPROJECTION_UNIFORM() { return gl_???; }
    mat4 GL_WORLD_UNIFORM() { return gl_???; }
    mat4 GL_WORLDVIEW_UNIFORM() { return gl_???; }
    mat4 GL_WORLDVIEWPROJECTION_UNIFORM() { return gl_ModelViewProjectionMatrix; }
    //BABYLON-VERTEX-MACROS-END

I know some GLSL guy will look at that GLSL block, see those gl_??? and know exactly what they should be... no problem for him. If you are that guy, please help me.
-
Hi! I'm trying to apply one filter to a large number (~256) of small (32x32 px) sprites. Within the filter, I'm using vTextureCoord to get the current sprite's coordinates, to draw borders on it. vTextureCoord breaks, apparently referring to the containing canvas's coordinates instead of the individual sprites' coordinates. BUT if I apply that same filter twice (two elements in the .filters[] array), in one of the copies vTextureCoord actually does point to the sprite coordinates, and borders are drawn correctly. The other copy still points to the canvas coordinates, the whole thing becomes laggy for some reason, and filters don't get removed when they should, either. Live version (hover over the individual sprites to apply the filter a second time): http://uoowuo.github.io/cellulata/ All the sprites are white-tinted because of the first filter invocation, with incorrect vTextureCoord coordinates. Code: Shader https://github.com/uoowuo/cellulata/blob/master/src/classes/shaders.js#L48 First filter application https://github.com/uoowuo/cellulata/blob/master/src/classes/cell.js#L42 Second filter application https://github.com/uoowuo/cellulata/blob/master/src/classes/cell.js#L83 The hover is just for the sake of illustration; if I apply the filter twice statically to all sprites, the behavior is the same. Thanks for your time!
-
Hi guys. I was working on some shading and discovered a strange behavior. I'm new to all this, so maybe I just don't understand something; perhaps you could help. I localized the behaviour to this shader:

    fragmentSrc = [
        'precision mediump float;',
        'varying vec2 vTextureCoord;',
        'uniform sampler2D texture;',
        'void main(void) {',
        '    vec4 ownColor = texture2D(texture, vTextureCoord);',
        '    gl_FragColor = vec4(1.0, 0.0, 0.0, ownColor.a);',
        '}'
    ];

Now, it looks to me like my shader should tint the sprite red, and it sort of does, but it also snaps into screen blending mode whenever alpha is less than 1. I've created this codepen with some colors where you can clearly see screen blending taking place: http://codepen.io/waterplea/pen/dMzXje Can anybody explain to me why it behaves not as I expected? The way I see it, it should output fully transparent pixels where the original sprite had 0 alpha, but instead it outputs 100% opaque red pixels in screen blending mode.
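(Could it be a premultiplied-alpha thing? From what I've read, PIXI renders with premultiplied alpha, so a filter presumably has to output rgb already multiplied by the alpha value. If so, I guess the fix would look like:)

    '    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0) * ownColor.a;',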
-
Hi! I have a small quest for the bjs-jedi-knights. If we need to get the screen position of a point in 3D, we do something like this:

    gl_Position = worldViewProjection * vec4(vPosition, 1.0);

But what do we do if we need the reverse operation (getting the 3D point back), when we know the screen coords and have the 3D object's geometry? In BABYLON the Scene.pick() method helps here, but what does its analog look like in GLSL? What is the algorithm? p.s. I could do Scene.pick and pass the result into the shader, but that would not be as fast as computing it in GLSL, I think. p.p.s. May the BJS-Force be with you.
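(To make the question concrete, here is a sketch of the inverse step as I understand it, assuming the screen coordinates are already converted to normalized device coordinates ndc in [-1, 1] with a known depth; note inverse() needs GLSL ES 3.00, so under WebGL1 the inverse matrix would be computed in JS and passed in as a uniform:)

    mat4 invWVP = inverse(worldViewProjection); // or a precomputed uniform under WebGL1
    vec4 p = invWVP * vec4(ndc, 1.0);           // ndc is a vec3: x, y, depth
    vec3 worldPos = p.xyz / p.w;                // undo the perspective divide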
-
Hello, I'm having some issues working with aTextureCoord when writing custom filters for pixiJS (https://github.com/pixijs/pixi.js/issues/2142). Do any of you know, by any chance, how I can set up the transforms so that I can work with the texture coordinates as in a clean WebGL environment?
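(In case it helps to show what I mean: my understanding, which may well be wrong, is that PIXI filters expose a filterArea uniform, and that something along these lines converts the filter's normalized vTextureCoord back into pixel coordinates:)

    uniform vec4 filterArea;    // xy = size of the filter region in pixels (as I understand it)
    varying vec2 vTextureCoord;
    // ...
    vec2 pixelCoord = vTextureCoord * filterArea.xy; // back to pixel space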
-
Hey people! I want to implement this shader (https://www.shadertoy.com/view/MslGWN#) in a Phaser state. So I transformed the shader to WebGL style and added a patch to Phaser to update the iChannel uniforms on update:

    Phaser.Filter.prototype.update = function (pointer, uniforms) {
        if (typeof pointer !== 'undefined') {
            var x = pointer.x / this.game.width;
            var y = 1 - pointer.y / this.game.height;
            if (x !== this.prevPoint.x || y !== this.prevPoint.y) {
                this.uniforms.mouse.value.x = x.toFixed(2);
                this.uniforms.mouse.value.y = y.toFixed(2);
                this.prevPoint.set(x, y);
            }
        }
        if (typeof uniforms !== 'undefined') {
            for (var i in uniforms) {
                this.uniforms[i].value = uniforms[i];
            }
        }
        this.uniforms.time.value = this.game.time.totalElapsedSeconds();
    };

But it hasn't helped: the music has no effect on the fractal generation, as if I were passing nulls. An example can be found here: https://timopheym.me/phaser/shader_music.html (sorry, I don't have an SSL certificate). I have no idea where I went wrong; I've been trying to fight this for about two days... thanks!
-
Hello! I'm currently working on a "fog of war" material: the standard material plus a texture that keeps track of once-lit areas and displays them later, even if they are no longer illuminated (by the standard lighting model or a ShadowGenerator). It would produce roughly the same effect as the FoW in classical real-time strategy games (with a moving light revealing the model), but based on actual lighting and on arbitrary UV-unwrapped models. However, I can't figure out how to write to a texture: whenever the shader's gl_FragColor rgb component would be different from (0,0,0), it must modify the corresponding point on the texture. From what I found, Babylon.js' DynamicTexture can be altered from the JS side (as an HTML canvas), and I should rather use a framebuffer object, but I don't understand how to create and manage one from Babylon. Sorry if this is a really noob question; any clue or thought is welcome. Thanks for reading this :-)
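(The direction I'm currently imagining, as a sketch only: I'm assuming RenderTargetTexture's onClear override can be used to skip clearing, so that whatever a lighting-term material writes into it accumulates across frames:)

    var fowTex = new BABYLON.RenderTargetTexture("fow", 1024, scene, false);
    fowTex.renderList = [levelMesh];        // meshes whose lighting should be recorded
    fowTex.onClear = function (engine) { }; // skip clearing so lit areas accumulate
    scene.customRenderTargets.push(fowTex);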
-
Hey there, I'm just getting started with Phaser, and am looking at GLSL shaders (also for the first time) to see how they can be applied to sprites for effects on characters, backgrounds, etc. I ran into behaviour today that's likely a function of how GLSL works rather than Phaser itself, and probably not the ideal approach. In any case, I'm hoping that someone can confirm one way or the other. Currently, my game has:

- A TileMap background using two layers with different values for scrollFactorX/Y
- A Sprite for the player with a filter applied, which is being rendered correctly (although the sprite graphic itself has disappeared; more on this later)

What's odd is that when navigating through the map and the filtered sprite reaches a viewport boundary, it remains "stuck" there rather than moving offscreen. Since the shader example I'm using has no knowledge of the game, and it looks like the sprite is simply unmasking the shader output, I'm wondering a few things:

- Am I right to assume that this doesn't work for good reasons? (If so, what are they?)
- Assuming that I need shaders (versus a sprite sheet) for elements that may venture offscreen, should I be manually passing rect data to the filter that tracks the shapes I want filtered, separately from the sprite itself?

I'm looking forward to any insight that folks can share, so let me know if I can provide more details. Thanks!
-
Hi, I'm trying to create a shader and I'm having a really hard time finding a good tutorial on GLSL shaders for WebGL. I have never written a shader before, so I really need to learn the basics. Does anyone know a good resource for this, or could anyone offer some help with it? Here are some of my more basic questions:

- What editor would you recommend for shader language?
- Where can I look up types and functions (documentation)?
- Any tutorials for WebGL?

Thanks for any help and recommendations, Dinkelborg
-
Hello there! I'm writing a custom lighting shader that takes a low-frequency lightmap and applies it to a sprite/image with a 'hard light' algorithm. Here is the relevant code:

    "void main (void)",
    "{",
    // sampling sprite albedo
    "    vec4 albedo = texture2D (uAlbedo, vTextureCoord.xy);",
    // calculating current fragment's position in lightmap space
    // ................... (irrelevant code)
    // sampling lightmap with calculated coords
    "    vec4 lightmap = texture2D (uLightmap, lightmapCoord.xy);",
    // per-component 'hard light' blending of albedo with lightmap
    "    vec3 A = step (vec3 (0.5, 0.5, 0.5), lightmap.rgb);",
    "    gl_FragColor.rgb = (1.0 - A) * 2.0 * albedo.rgb * lightmap.rgb;",
    "    gl_FragColor.rgb += A * (1.0 - 2.0 * (1.0 - albedo.rgb) * (1.0 - lightmap.rgb));",
    "    gl_FragColor.a = albedo.a;",
    "}"

The problem is that even though I deliberately set gl_FragColor.a to albedo.a at the end, I still get strange artifacts: glowing transparent corners. The artifacts persist even if I completely set gl_FragColor.a to 0 for debugging purposes. By commenting lines out, I found that the glowing corners are specifically caused by this line:

    "    gl_FragColor.rgb += A * (1.0 - 2.0 * (1.0 - albedo.rgb) * (1.0 - lightmap.rgb));",

It seems like something inside PIXI's WebGL rendering code makes these alphas glow. Any help would be appreciated.
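(A guess, based on what I've read about PIXI expecting premultiplied alpha from shaders: maybe the rgb needs to be multiplied by alpha before output, i.e. adding something like this as the last line before the closing brace?)

    "    gl_FragColor.rgb *= albedo.a;",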
-
I'm working on a soft-particle shader, which of course needs a depth texture of the scene. Is there an easy way to generate one with Babylon so it can be passed as a texture to a shader?
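(The direction I'm looking at, assuming Babylon's built-in depth renderer works the way the docs suggest; particleMaterial stands in for my own ShaderMaterial and "depthSampler" for whatever I name the sampler:)

    var depthRenderer = scene.enableDepthRenderer();        // renders scene depth to a render target
    var depthTex = depthRenderer.getDepthMap();             // RenderTargetTexture holding the depth
    particleMaterial.setTexture("depthSampler", depthTex);  // feed it to the soft-particle shader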