Hello, I've been messing around with some colormaps I generate. I have a ShaderMaterial and would like to send some colormaps to the shader using ShaderMaterial.setTexture(). My colormaps are dynamically generated and result in an RGB image stored as a Uint8Array. To illustrate the test, I am using a 1x1 texture. I have no problem using an RGBA image as a uniform on the shader, like this:

```javascript
// this works well
let cmTexture = new BABYLON.RawTexture(
  new Uint8Array([128, 128, 128, 255]),      // data
  1,                                         // width
  1,                                         // height
  BABYLON.Engine.TEXTUREFORMAT_RGBA,         // format
  this._scene,                               // scene
  false,                                     // gen mipmaps
  false,                                     // invertY
  BABYLON.Texture.TRILINEAR_SAMPLINGMODE,
  BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT
)
```

But if I want to create an RGB texture (no alpha channel), it does not work:

```javascript
// JS lets me do it, but WebGL yells at me
let cmTexture = new BABYLON.RawTexture(
  new Uint8Array([128, 128, 128]),           // data
  1,                                         // width
  1,                                         // height
  BABYLON.Engine.TEXTUREFORMAT_RGB,          // format
  this._scene,                               // scene
  false,                                     // gen mipmaps
  false,                                     // invertY
  BABYLON.Texture.TRILINEAR_SAMPLINGMODE,
  BABYLON.Engine.TEXTURETYPE_UNSIGNED_INT
)
```

Then I get the following error: the first warning ("WebGL: ...") occurs at the creation of the RawTexture, while the second one occurs when calling shaderMaterial.setTexture(...).

I can still use RGBA, so it's not a super big deal, but there is probably a little bug somewhere...

Cheers.