ozRocker — Posted January 17, 2017

"The Chrome Media team has created Draco, an open source compression library to improve the storage and transmission of 3D graphics. Draco can be used to compress meshes and point-cloud data. It also supports compressing points, connectivity information, texture coordinates, color information, normals and any other generic attributes associated with geometry."

Full article here: https://opensource.googleblog.com/2017/01/introducing-draco-compression-for-3d.html
MrVR — Posted February 24, 2017

Hey @MackeyK24, maybe we can benefit from this library. What do you think? They claim to be better than the gzip compression that you are using in the toolkit.
Dad72 — Posted February 24, 2017

Very interesting and impressive. This is being optimized for Three.js, but not for BabylonJS yet. Maybe one day.
adam — Posted February 25, 2017

It is in the process of being added to Three.js: https://github.com/mrdoob/three.js/pull/10879
Dad72 — Posted February 25, 2017

It would be great to have this for Babylon. The models load so fast; it's impressive.
JCPalmer — Posted February 25, 2017

I just wonder what starting point is being used for those graphics. (It is much easier to improve when you start from something bad.) For instance, are they starting from identifying unique vertices, like the Blender / 3ds Max exporters already do? That alone makes your data size drop off a cliff. Are they starting with bone matrix indexes already packed?

One thing I have found is that just calculating the normals on load is really fast (and removes that data from the file entirely, a 100% size reduction for normals). The Blender exporters have a 'Defer Normals' checkbox. I have made it the default in the Tower of Babel variant, and think it should become the default in the JSON variant too.

Not hating, but those types of improvements cannot come from compression alone. It probably involves a lot of data reorganization / representation changes. I looked at this earlier and do not really remember it in detail, but I just recently reorganized my representation of shape keys. Size dropped by about 50% with very little extra CPU at load time. I think I was inspired by looking at that code.
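The "defer normals" idea above can be sketched in plain JavaScript: instead of shipping normals in the file, reconstruct smooth vertex normals from positions and indices at load time. This is an illustrative standalone function, not the actual Babylon.js or exporter code:

```javascript
// Sketch: recompute smooth vertex normals at load time from positions
// and triangle indices, so normals need not be stored in the file.
function computeNormals(positions, indices) {
  const normals = new Float32Array(positions.length);
  for (let i = 0; i < indices.length; i += 3) {
    const a = 3 * indices[i], b = 3 * indices[i + 1], c = 3 * indices[i + 2];
    // Two edge vectors of the triangle
    const e1x = positions[b] - positions[a],
          e1y = positions[b + 1] - positions[a + 1],
          e1z = positions[b + 2] - positions[a + 2];
    const e2x = positions[c] - positions[a],
          e2y = positions[c + 1] - positions[a + 1],
          e2z = positions[c + 2] - positions[a + 2];
    // Face normal = cross product of the edges (area-weighted)
    const nx = e1y * e2z - e1z * e2y;
    const ny = e1z * e2x - e1x * e2z;
    const nz = e1x * e2y - e1y * e2x;
    // Accumulate onto each of the triangle's vertices for smooth shading
    for (const v of [a, b, c]) {
      normals[v] += nx;
      normals[v + 1] += ny;
      normals[v + 2] += nz;
    }
  }
  // Normalize each accumulated vertex normal
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len;
    normals[i + 1] /= len;
    normals[i + 2] /= len;
  }
  return normals;
}
```

Babylon.js ships its own normal recomputation, so in practice an exporter only has to omit the normals array and flag the mesh for recomputation at load.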
gryff — Posted February 25, 2017

22 minutes ago, JCPalmer said: "I just wonder what starting point is being used for those graphics"

@JCPalmer: I think you make a good point, Jeff. From personal experience with my "Blue Lady" creation: the .babylon file size went from around 6.9MB (in 2014) to 2.66MB (2016) just due to the improvements in the Blender Babylon Exporter that you carried out over that time period.

cheers, gryff