JCPalmer Posted April 20, 2017

I have just reverse engineered what happens to a Hair particle system when it is converted to a mesh. No matter how many "segments" you say to use while it is a particle system, you get a block of 65 vertices (for 64 segments) for each strand. For a system with 2 strands, the output looks like this (already converted to left handed):

vert: 0   location: -.7644,1.2222,-.8473
vert: 1   location: -.7644,1.2549,-.8473
vert: 2   location: -.7644,1.2904,-.8473
vert: 3   location: -.7644,1.3285,-.8473
vert: 4   location: -.7644,1.3692,-.8473
vert: 5   location: -.7644,1.4122,-.8473
...
vert: 64  location: -.7644,4.7778,-.8473
=========================================
vert: 65  location: 1.2222,.1286,.33
vert: 66  location: 1.2549,.1286,.33
vert: 67  location: 1.2904,.1286,.33
vert: 68  location: 1.3285,.1286,.33
vert: 69  location: 1.3692,.1286,.33
vert: 70  location: 1.4122,.1286,.33
...
vert: 129 location: 4.7778,.1286,.33

Clearly, 65 data points per strand for 25,000 hairs is not going to be practical for a line system. In this example, each strand really only needs 2 points. Any ideas on a way to reduce the data points between the first and last, based on some kind of straightness tolerance across sets of 3 points? When the strands are not straight, pulling out data points is going to make them look jagged, but with enough hairs, might that not be obscured? Especially since you are probably not going to get this close. Like most of my topics, I'll probably be talking to myself once again.
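One way to do the reduction asked about above is to walk each strand and drop an interior point whenever the bend through it stays under an angle tolerance. Here is a minimal sketch of that idea; the function names and the 5 degree default are illustrative, not anything from an actual exporter:

```python
import math

def _unit(a, b):
    """Unit vector from point a to point b; points are (x, y, z) tuples."""
    d = [b[0] - a[0], b[1] - a[1], b[2] - a[2]]
    n = math.sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2])
    return [x / n for x in d] if n else [0.0, 0.0, 0.0]

def decimate_strand(points, angle_tolerance_deg=5.0):
    """Keep the end points of a strand and drop interior points whose bend,
    measured from the last kept point, stays under the tolerance."""
    if len(points) < 3:
        return list(points)
    cos_tol = math.cos(math.radians(angle_tolerance_deg))
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        incoming = _unit(kept[-1], points[i])
        outgoing = _unit(points[i], points[i + 1])
        bend = sum(a * b for a, b in zip(incoming, outgoing))  # cosine of the bend angle
        if bend < cos_tol:   # bends more than the tolerance, so keep this point
            kept.append(points[i])
    kept.append(points[-1])
    return kept
```

For the straight strands in the dump above, every interior point passes the straightness test and only the 2 end points survive, which is exactly the target of the question.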
GameMonetize Posted April 20, 2017

You're not; I read all your posts. I only reply when I can bring something constructive to the discussion.
JCPalmer Posted April 21, 2017

I think the Blender Limited Dissolve operation on the converted mesh might do a nice job. I am starting to work on a LinesMesh sub-class called Hair. It does not strictly need to be a sub-class, but it is cleaner. In order to build a LinesMesh with multiple lines, you need to know how many vertices are in each line to build the vertexData.indices. I only know where each strand begins when it is first converted (every 65 points). Limited Dissolve might do all the "mathy stuff" to remove many points and still look good, but then I will not know how many points are in each strand.

Fortunately, I was already planning to start my export process while it is still a particle system, and do the convert-to-mesh operation right in the script. The original reason for doing that was so that you can still edit / comb the hair after export. All that is required is to push an undo state onto the stack before the conversion, then do an undo after all the data is sucked out:

bpy.ops.ed.undo_push()

# find the modifier name & temporarily convert it
for mod in [m for m in mesh.modifiers if m.type == 'PARTICLE_SYSTEM']:
    bpy.ops.object.modifier_convert(modifier = mod.name)
    break

bmesh.ops.dissolve_limit(args)
bpy.ops.ed.undo()

If I get all the beginning points of each strand BEFORE the dissolve, I can hopefully still find them afterward (and thus know the number of points in each line).
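The dissolve_limit(args) line above is shorthand. A fuller sketch of the same push / convert / dissolve / undo flow, with the arguments filled in, could look like the following; the 5 degree limit, the variable names, and the assumption that the converted strand mesh ends up as the active object are all illustrative, not the exporter's actual settings:

```python
import bpy
import bmesh
import math

emitter = bpy.context.object  # object carrying the hair particle system

bpy.ops.ed.undo_push(message='hair export')

# temporarily convert the particle system modifier to a mesh object
for mod in [m for m in emitter.modifiers if m.type == 'PARTICLE_SYSTEM']:
    bpy.ops.object.modifier_convert(modifier=mod.name)
    break

strands = bpy.context.object  # assumed: the new strand mesh is now the active object

# limited dissolve on the converted strands, 5 degrees as an example angle
bm = bmesh.new()
bm.from_mesh(strands.data)
bmesh.ops.dissolve_limit(bm, angle_limit=math.radians(5.0),
                         verts=bm.verts[:], edges=bm.edges[:])
bm.to_mesh(strands.data)
bm.free()

# ... read positions / edges out of strands.data here ...

bpy.ops.ed.undo()  # restore the particle system so it can still be combed
```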
gryff Posted April 21, 2017

@JCPalmer: I read most of your posts, but I know that if you are asking a question, there is a very good chance it will be beyond my abilities. Not sure what you are planning here, but I have never liked hair creation with the particle system; all that "brushing" seems rather fussy.

cheers, gryff
Temechon Posted April 21, 2017

Same as gryff here. Very interesting stuff each time, but it goes over my head.
JCPalmer Posted April 24, 2017

Well, I'm through. There is a lot of overhead to get through when adding an entire new class of geometry across systems, especially when using source code generation. In this 20-strand example, the 1300 vertices (20 * 65) reduce to 418 after a built-in Limited Dissolve with a 5 degree angle limit (hard coded). That is a 68% reduction, and I am very pleased with the result. Of course, 65 verts per strand is a pretty low bar, so it is easy to improve when you suck. This might actually work.

I am 3 days into a 5 day test. I have already coded a JS routine to generate single-bone skeleton matrix weights. I have not tested that, and will probably spend the rest of the time seeing what it can really do at scale, and how good it looks.

@gryff, I have basically closed code on TOB 5.3, for QI 1.1, and am doing an experiment before moving on. The helmet hair specimens for MakeHuman are limited, and most are not really that good. I saw a tutorial on particle hair that I thought was pretty good. It spent a lot of time on what not to do, perhaps your "fussing". I do not know how far I am going to get. For sure, it might only make it into TOB.
Wingnut Posted April 24, 2017

I dunno, guys. @JohnK's fur is pretty nice. Yeah, I'm sure it is a post-processing effect, so it has limitations, and is a bit off-topic here. Still, I think it needs to be "considered" when studying feasibility, plausibility, practicality and maybe some other 'alities' and 'ilities', too. As soon as you start down-scaling the number of verts in the strand, it loses its ability to "flow in the wind" and "swish with nice bendings" during fast head-turns. Sucks, eh? *nod*. Depending upon the length of the strands, I think 64 path-points is actually not nearly enough. All your girl NPCs are going to expect 1024 path-points per strand... or else they won't be able to "do their hair" in the latest fashions. Women be some hair-bendin' fools, they be.
gryff Posted April 25, 2017

6 hours ago, JCPalmer said:
"It spent a lot of time on what not to do, perhaps your 'fussing'."

@JCPalmer: well, Jeff, if you go to the third tutorial in that series, where he is "brushing" the hair, he admits several times that it can be "frustrating", as the hair shoots through the body. Short haircuts may not be too bad; for long hair you need a diploma in hair styling.

Still not sure where you are going. 20 strands? What happens with 1000 strands, even with Limited Dissolve?

cheers, gryff
JCPalmer Posted April 25, 2017

The 20 was really only 2 strands with 10 children each. I need to get a working process before trying a scaled test. Speaking of which, the scene now has 1538 strands; I do not know how it got that number. Limited Dissolve did very little, so it ended up with 98,768 vertices. That stray coming out the front is not in Blender. This is what stuff looks like when you are bootstrapping. I'll shortly know more. I am only going to have one page, so it can change at any time.

I actually do not think long hair (beyond shoulder length) is any good for games, due to head turns with the hair cutting through the body. MakeHuman only has 3 stock male heads of hair (black, brown, & Afro). What about bald, or an old guy with just a little combed over, or facial hair? To get more believable characters you need more than 3.

Wingy, I looked at fur, but not for very long. I am doing people, not dogs. When I ran your page on an A8 iPad Air 2, my fps dropped to the high 20s. I have never seen a single mesh do that before. Do not know or care what a girl NPC is.
gryff Posted April 25, 2017

30 minutes ago, JCPalmer said:
"I actually do not think long hair (beyond shoulder length) is any good for games, due to head turns with the hair cutting through the body. MakeHuman only has 3 stock male heads of hair (black, brown, & Afro). What about bald, or an old guy with just a little combed over, or facial hair? To get more believable characters you need more than 3."

Ohh, I agree that long hair presents issues with head turning, Jeff. As for MakeHuman and hair, there are some contributed community assets, including a beard and moustache. And as for bald, see the image below of my old friend Sholto (created a few years back for Second Life). The hard one would be the "combover" case. I will follow this thread with interest.

cheers, gryff
JCPalmer Posted April 25, 2017

The saga continues. In this episode, I found that the 65-vertex-per-strand conversion rate only holds when the emitter mesh is a cube. I have to find a way that always works. It feels like when I started shape keys; now, after 5 generations of dev, those are really solid. I do not think it is going to take that amount of effort here.

I have already found, by interrogating the mesh's edges array, that the edges are in the order of the lines. Each edge has 2 indices into the vertices array (which are also in order, though not always 65 per strand). Since both are in order, there is a jump where the 2nd vert index of the previous edge is not equal to the starting vert index of the next edge. I do not think I even need the vertices to be in order. This is about a 3 hour rework, but I like it. It means that the clunky method of finding all the starts before the dissolve does not need to be done. It can open up multiple workflow possibilities, if this gets that far. My test for success will be that the stray strand in the face is gone.

Assuming the export gets straightened out to always work, the next hurdle I think is going to be the fragment shader. When you get enough strands, things are just a solid color. I know you need a face / 3 points to get a normal, but could not a direction / 2 points be a way to somehow differentiate the color slightly? @Deltakosh?
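A sketch of that edge walk, leaning on the observation above that the edges, and the vertex pair inside each edge, come out in strand order; the function name is mine, and mesh is the Blender mesh datablock of the converted strands:

```python
def points_per_strand(mesh):
    """Count the vertices in each strand of a converted hair mesh by walking
    its edges and detecting where the vertex chain breaks."""
    counts = []
    edges_in_strand = 0
    prev_end = None
    for edge in mesh.edges:
        start, end = edge.vertices[0], edge.vertices[1]
        if prev_end is not None and start != prev_end:
            # chain broken: the previous strand is finished, a new one starts
            counts.append(edges_in_strand + 1)  # a strand has one more vert than it has edges
            edges_in_strand = 0
        edges_in_strand += 1
        prev_end = end
    if edges_in_strand:
        counts.append(edges_in_strand + 1)
    return counts
```

Run after the dissolve, the same walk gives the per-line vertex counts needed to build vertexData.indices, without knowing the strand starts in advance.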
JCPalmer Posted April 25, 2017

OK, re-coded. Since I am only using one scene, here is the full scale scene using the original method, with the stray highlighted in white. Again, this is the URL for the scene.

The color shader used by the LinesMesh class really looks like the next problem to solve. FYI, though Blender has a really fancy method inside the Cycles renderer for hair, here is what the same scene looks like in Blender using just a diffuse color. Still problems, but better than in BJS.

A normal is a direction. The normal of a line might be the inverse of its direction, maybe? It seems like there should also be a "lower bound", to avoid the black on top. Maybe making a LinesMesh subclass instead of a static method was more than just a convenience.
JCPalmer Posted April 26, 2017

Thinking about it, I can computer-generate vertex colors (actually I already have). For now, I am not varying the color between the individual vertices of a strand, but adding between -0.1 and +0.1 to each channel per strand. This way the lines will not all be the same color. It does not increase the export size, since it is done on load. If this test works, I can play with it to dial in the right amounts for hair.

LinesMesh does not load that attribute though (only positions). I am wondering if I might try to modify LinesMesh directly for a PR? Might this fail for some reason I do not know about yet? I remember @jerome had even mentioned he wanted to do this.
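The per-strand jitter is easy to sketch. The version below builds a flat r,g,b,a array from a base color and the per-strand point counts; it is written in Python to match the exporter-side snippets in this thread (the routine described above actually runs in JS at load time), and the names and the 0.1 spread are illustrative:

```python
import random

def jitter_strand_colors(base_rgb, points_per_strand, spread=0.1):
    """Flat r,g,b,a array where every vertex of a strand shares one slightly
    perturbed copy of the base color, so the strands are not all identical."""
    colors = []
    for count in points_per_strand:
        # one random offset per strand, applied to every channel and clamped to [0, 1]
        strand_rgb = [min(1.0, max(0.0, c + random.uniform(-spread, spread)))
                      for c in base_rgb]
        for _ in range(count):
            colors.extend(strand_rgb + [1.0])  # alpha stays 1
    return colors
```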
JCPalmer Posted April 26, 2017

Also, LinesMesh is not supplying bones attributes, even though ShaderMaterial appears to support them.
jerome Posted April 26, 2017

Yep, it's on my (long) todo list, but I'm taking a BJS break for a while currently.
JCPalmer Posted April 26, 2017

I take the silence as no objection. I guess in LinesMesh I would need to:

- Add an arg to the LinesMesh constructor, say useVertexColor? : boolean
- When true, switch the shaderPath arg to the ShaderMaterial constructor from "color" to something not yet determined (may need to write a fragment shader that uses a varying vColor)
- Also remove the color uniform when using vertex color
- Maybe add a _colorBuffer property, and bind it in _bind()

In ShaderMaterial, it looks like it checks the mesh for a skeleton and conditionally adds it, so that is probably already set. I would also need to:

- Add a test for vertex colors in the mesh, similar to what is done in MaterialHelper

Am I close? I am at the end of my 5 day limit, but have to stop now to take my little killer for his annual haircut (almost on-topic). Maybe I will add a little onto this project tomorrow to make up. His pic is from this morning's walk.
Wingnut Posted April 26, 2017

"Little killer" knows how to grow good hair. Thanks for the briefings and demos/pics, Jeff. Great reading for us. It's fun watching your mind work.
ozRocker Posted April 26, 2017

I wish there was a dedicated chip for hair, and also for pixel-perfect collision. Hair and collision are always the most problematic and processor-intensive areas.
JCPalmer Posted April 27, 2017

14 hours ago, ozRocker said:
"I wish there was a dedicated chip for hair, and also for pixel-perfect collision. Hair and collision are always the most problematic and processor-intensive areas."

Oh, wow. I had heard that nVidia had a new GPU with code name Grecian in dev. Now it all makes sense!
ozRocker Posted April 27, 2017

17 minutes ago, JCPalmer said:
"Oh, wow. I had heard that nVidia had a new GPU with code name Grecian in dev. Now it all makes sense!"

Sorry for my comment. I guess I was just thinking out loud.
JCPalmer Posted April 27, 2017

Nothing to be sorry about. I was making a joke. My style is more mock-serious, so I did not put a little face next to it. It does not really translate on paper, I guess.
GameMonetize Posted April 27, 2017

Based on your todo list, I think it is good. I can offer my help to make the required change in ShaderMaterial to support vertex color if you want.
JCPalmer Posted April 27, 2017

I compared the binding of Mesh with LinesMesh's override. Since it will now potentially need positions, vertex color, & the 4 matrix weights / indices, I switched to using geometry's _bind(). It now also gets the vertex object arrays, so I actually deleted the positions buffer in LinesMesh. This is now _bind():

public _bind(subMesh: SubMesh, effect: Effect, fillMode: number): LinesMesh {
    // VBOs
    this._geometry._bind(this._colorShader.getEffect());

    // Color
    this._colorShader.setColor4("color", this.color.toColor4(this.alpha));
    return this;
}

The constructor for LinesMesh is now:

constructor(name: string, scene: Scene, parent: Node = null, source?: LinesMesh, doNotCloneChildren?: boolean, public useVertexColor?: boolean) {
    super(name, scene, parent, source, doNotCloneChildren);

    if (source) {
        this.color = source.color.clone();
        this.alpha = source.alpha;
        this.useVertexColor = source.useVertexColor;
    }

    this._intersectionThreshold = 0.1;

    var options = {
        attributes: [VertexBuffer.PositionKind],
        uniforms: ["world", "viewProjection"],
        needAlphaBlending: false,
        defines: []
    };

    if (useVertexColor) {
        options.defines = ["VERTEXCOLOR"];
    } else {
        options.uniforms.push("color");
        options.needAlphaBlending = true;
    }

    this._colorShader = new ShaderMaterial("colorShader", scene, "color", options);
}

I did not actually make any changes to ShaderMaterial for vertex color; I put the defines in the constructor above. It might be better to put it in ShaderMaterial though, so any shader material could use it. The color shaders were so small that making completely separate ones seemed overkill. I added the vertex color attribute and the bones declarations, and split up viewProjection & world so the bonesVertex code would work:

// Attributes
attribute vec3 position;
#ifdef VERTEXCOLOR
attribute vec4 color;
#endif

#include<bonesDeclaration>

// Uniforms
uniform mat4 viewProjection;
uniform mat4 world;

// Output
#ifdef VERTEXCOLOR
varying vec4 vColor;
#endif

void main(void) {
    mat4 finalWorld = world;
#include<bonesVertex>
    gl_Position = viewProjection * finalWorld * vec4(position, 1.0);

#ifdef VERTEXCOLOR
    // Vertex color
    vColor = color;
#endif
}

The fragment shader now uses either a varying color or a uniform color:

#ifdef VERTEXCOLOR
varying vec4 vColor;
#else
uniform vec4 color;
#endif

void main(void) {
#ifdef VERTEXCOLOR
    gl_FragColor = vColor;
#else
    gl_FragColor = color;
#endif
}

When I run my scene, regardless of whether I say yes to the new useVertexColor constructor arg, it fails silently. Eventually, ArcRotate's _getViewMatrix() fails, but I am kind of discounting that. The scene runs fine with 2.5. I must have missed something.
GameMonetize Posted April 27, 2017

If using vertex color, you need to add a new attribute:

attributes: [VertexBuffer.PositionKind, VertexBuffer.ColorKind],
JCPalmer Posted April 28, 2017

Thanks, but it is still the same error in ArcRotateCamera._getViewMatrix(). I even put a syntax error into the vertex shader; nothing. I do not think I am getting that far.

Starting to pay attention to ArcRotate: it is trying to call getViewMatrix in the constructor. It has this fairly new _targetHost thing that gets checked for in the call to _getTargetPosition(). It is not there, so the target as passed in the constructor is returned. My target, the red cube, has no "addToRef" method (it is a mesh, not a Vector3). I am ditching ArcRotateCamera.