I'm building a streaming video server. I need to transfer the data packets of a video file to the client in sequence, but I don't know what timer interval to use between packets. If I transfer too fast, the client doesn't have enough time to decode and display the video. I don't know whether the interval depends on the bitrate or on other information in the video file. The file I'm using is WMV, the streaming protocol is Windows Media HTTP Streaming, and I'm programming in C#.
Information about the video file:
- Audio: Windows Media Audio 48000Hz stereo 64Kbps [Raw Audio 0]
- Video: Windows Media Video 9 320x240 24.00fps 230Kbps [Raw Video 1]
What formula should I use to calculate the time interval to transfer data?
You can calculate how much data to send based on the total data rate of the video file. From the information you posted that is roughly 64 Kbps audio + 230 Kbps video, so about 300 Kbps, or about 38400 bytes per second. In pseudo code this would be something like:
startTime = Now;
bytesStreamed = 0;
videoFileDataRate = 38400;   // bytes per second (~300 Kbps)

while (streaming)
{
    bytesStreamed += streamSomePackets();

    // wall-clock seconds since streaming started
    streamDuration = Now - startTime;

    // seconds of playback the bytes sent so far represent
    secondsStreamed = bytesStreamed / videoFileDataRate;

    // if we are ahead of real time (with a 1% margin), slow down
    if (streamDuration < secondsStreamed * 0.99)
        Throttle();
}
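In C# the same loop could look roughly like the sketch below. It is only a minimal sketch: sendNextPackets is a hypothetical delegate standing in for whatever actually writes the next chunk of the WMV stream to the client, and the 38400 figure is the combined audio + video data rate from your file information.

using System;
using System.Diagnostics;
using System.Threading;

class StreamThrottler
{
    // Combined audio + video data rate of the file, in bytes per second
    // (~64 Kbps audio + ~230 Kbps video ≈ 300 Kbps ≈ 38400 B/s).
    const double VideoFileDataRate = 38400.0;

    // sendNextPackets is a placeholder: it should write the next chunk of
    // the file to the client and return the number of bytes written,
    // or 0 when there is nothing left to send.
    static void StreamFile(Func<int> sendNextPackets)
    {
        var stopwatch = Stopwatch.StartNew();
        long bytesStreamed = 0;

        while (true)
        {
            int sent = sendNextPackets();
            if (sent <= 0)
                break;                       // end of file or client gone
            bytesStreamed += sent;

            // Seconds of playback the bytes sent so far represent.
            double secondsStreamed = bytesStreamed / VideoFileDataRate;

            // Wall-clock seconds since streaming started.
            double secondsElapsed = stopwatch.Elapsed.TotalSeconds;

            // If we are ahead of real time (1% safety margin), sleep until
            // the client has had time to consume what was already sent.
            double secondsAhead = secondsStreamed * 0.99 - secondsElapsed;
            if (secondsAhead > 0)
                Thread.Sleep(TimeSpan.FromSeconds(secondsAhead));
        }
    }
}

The sleep-based throttle keeps the server only slightly ahead of real time, so the client's buffer never grows faster than it can decode at 24 fps.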