Tbd33 Posted July 27, 2017

I have a game I am working on and want to bring background audio into it. One of the key features of my game is that it is based on live data, and one of those live data components is live audio. I can receive live audio on my server (a server written in C, by the way), encode it however I see fit (to MP3, for example), and then package it and send it somewhere.

Where I am totally lost is exactly how I am supposed to package and send this audio data to the browser. I know there are at least three relevant protocols - HTTP, RTMP, and RTSP - and I think I want to stick with HTTP. Suppose I create an <audio> element in the browser. What does this element want in terms of "here is the live stream for you to connect to"? And how is my server supposed to deliver this audio data? Do I need to open up a web socket? Does the audio data need to be saved to disk (like a spool or scratch file)? I am pretty lost here after many days of research...
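To make the question concrete, here is the kind of thing I picture on the browser side. The /live.mp3 endpoint is made up - from what I have read, an Icecast/SHOUTcast-style server just answers the GET with Content-Type: audio/mpeg and keeps writing encoded MP3 frames into the response body indefinitely, with no web socket and nothing written to disk. Is that roughly right?

```ts
// Sketch of the browser side, assuming a hypothetical endpoint /live.mp3
// served Icecast/SHOUTcast style: the server answers the GET with
// "Content-Type: audio/mpeg" and keeps streaming MP3 frames in the
// response body (chunked transfer encoding) for as long as the client listens.
const player = new Audio("http://myserver.example/live.mp3"); // made-up URL
player.play().catch((err) => {
  // Browsers may block autoplay until the user interacts with the page.
  console.error("playback blocked or failed:", err);
});
```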
alex_h Posted July 28, 2017

You definitely want to be using Web Audio rather than a bare <audio> element. I think Web Audio can receive a live stream: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource - a Google search for 'webaudio live stream' will turn up more background info. I'm afraid I don't know about the server side of things, though.
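One thing worth noting: createMediaStreamSource takes a MediaStream (the kind you get from getUserMedia or WebRTC), so for a plain HTTP stream the usual approach is to wrap an <audio> element with createMediaElementSource instead. A rough, untested sketch, with the stream URL as a placeholder:

```ts
// Route a live HTTP audio stream through the Web Audio graph so it can be
// mixed and processed like the rest of the game's audio.
const ctx = new AudioContext();
const el = new Audio("http://myserver.example/live.mp3"); // placeholder URL
el.crossOrigin = "anonymous"; // the server must send CORS headers, or the graph outputs silence
const source = ctx.createMediaElementSource(el);
const gain = ctx.createGain(); // example processing node: volume control
gain.gain.value = 0.8;
source.connect(gain).connect(ctx.destination);
el.play();
```

That way the <audio> element handles the HTTP streaming and decoding, and Web Audio handles the mixing.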