TheCoolScorpio

  1. Ok, sorry for the confusion.. let me rephrase the use case. We want local images / user-generated images from the client device to use as the environment. At first I thought of using the local file system API, but after your reply I searched and found that it never got accepted or gained traction as a web standard. So I fell back to the WebRTC route with the getUserMedia API: let the user take a photo through WebRTC and reuse that same image stream to inject as a material (a rough sketch of this capture step follows these posts). From your reply it looks like this is tricky to achieve in Babylon, so I will explore a few more high-level libraries. I will keep you posted if I find a workaround; in case you come across one, please let us know.. Thanks
  2. The use case goes as follows.. We want an end-user-generated image to be visible as the "environment" (in normal WebGL terminology), just like a skybox behaves. We know the end user may not generate six cube images, or they may; we are fine if they take only one image and we show it as the environment texture. But because we are mobile-first with flaky networks, and because user-generated images may raise privacy issues, we may not be able to push the images back to the server and round-trip them. We are even fine using the getUserMedia API from the WebRTC specs and capturing the image byte stream. What is not clear to me is how we pass a raw image byte stream to an environment-type texture (not to a 3D object) — see the second sketch after these posts.
  3. Is there any way, using the local file system API, to supply all six cube images from the local file system? Any pointer on modifying the loader for this case would be nice.. or does it handle the URI scheme automatically, and is that already supported? (The third sketch after these posts shows a file-input alternative.)
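
A minimal sketch of the getUserMedia capture step described in post 1, assuming a browser that supports navigator.mediaDevices and canvas.toBlob; the function name captureFrameAsTextureUrl is made up for illustration. The resulting blob URL can then be handed to a Babylon texture constructor like any other image URL.

```ts
// Sketch: grab one still frame from the device camera and expose it as a
// blob URL that texture loaders can consume like a normal image URL.
async function captureFrameAsTextureUrl(): Promise<string> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  // Play the stream in an off-DOM <video> element so a frame can be read.
  const video = document.createElement("video");
  video.srcObject = stream;
  video.muted = true;
  await video.play();

  // Copy the current frame onto a canvas sized to the video resolution.
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")!.drawImage(video, 0, 0);

  // Release the camera once the still has been taken.
  stream.getTracks().forEach((track) => track.stop());

  // Turn the canvas into a JPEG blob and return an object URL for it.
  const blob = await new Promise<Blob>((resolve) =>
    canvas.toBlob((b) => resolve(b as Blob), "image/jpeg")
  );
  return URL.createObjectURL(blob);
}
```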
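And a sketch for post 2's single-image case: one way to make a lone captured image read as the surrounding "environment" rather than as a texture on a scene object is to map it onto a large inside-out sphere that follows the camera. This is only a sketch against a standard Babylon.js build; the mesh and material names are arbitrary, and newer Babylon versions also ship a PhotoDome helper that covers a similar use case.

```ts
import * as BABYLON from "babylonjs";

// Sketch: show a single user-captured image as the surrounding "environment"
// by texturing the inside of a very large sphere that follows the camera.
// `imageUrl` would be the blob URL produced by the capture sketch above.
function applyImageAsEnvironment(scene: BABYLON.Scene, imageUrl: string): void {
  const dome = BABYLON.MeshBuilder.CreateSphere("envDome", { diameter: 1000 }, scene);

  const mat = new BABYLON.StandardMaterial("envDomeMat", scene);
  mat.backFaceCulling = false;   // render the inside faces of the sphere
  mat.disableLighting = true;    // show the image as-is, ignoring scene lights
  mat.emissiveTexture = new BABYLON.Texture(imageUrl, scene);

  dome.material = mat;
  dome.infiniteDistance = true;  // keep it centered on the camera, skybox-style
}
```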
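Finally, for post 3: instead of the non-standard local file system API, a plain `<input type="file" multiple>` plus object URLs can supply the six cube faces with no server round trip. A rough sketch, assuming the Babylon build in use exposes the static CubeTexture.CreateFromImages helper and that the faces are passed in +x, +y, +z, -x, -y, -z order; the function name is hypothetical.

```ts
import * as BABYLON from "babylonjs";

// Sketch: build a skybox from six images the user picked with a file input,
// e.g. <input type="file" multiple accept="image/*">. Object URLs stand in
// for server URLs, so the images never leave the device.
function buildSkyboxFromLocalFiles(scene: BABYLON.Scene, files: FileList): void {
  // One object URL per selected face image.
  const faceUrls = Array.from(files).map((file) => URL.createObjectURL(file));

  const cubeTex = BABYLON.CubeTexture.CreateFromImages(faceUrls, scene);
  cubeTex.coordinatesMode = BABYLON.Texture.SKYBOX_MODE;

  const skybox = BABYLON.MeshBuilder.CreateBox("skyBox", { size: 1000 }, scene);
  const mat = new BABYLON.StandardMaterial("skyBoxMat", scene);
  mat.backFaceCulling = false;   // render the inside of the box
  mat.disableLighting = true;    // show the faces unlit
  mat.reflectionTexture = cubeTex;

  skybox.material = mat;
  skybox.infiniteDistance = true;
}
```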