H.264 Real-time Streaming, Timestamp in NAL Units?


I'm trying to build a system that live-streams video and audio captured by Android phones. Video and audio are captured on the Android side using MediaRecorder and then pushed directly to a server written in Python. Clients access this live feed from their browser, so I implemented the streaming part of the system in Flash. Right now both video and audio content appear on the client side, but the problem is that they are out of sync. I'm sure this is caused by wrong timestamp values in Flash (currently I increment the timestamp by 60 ms per video frame, but clearly this value should be variable).

The audio is encoded into AMR on the Android phone, so I know each AMR frame is exactly 20 ms. However, this is not the case with the video, which is encoded into H.264. To synchronize them, I would have to know exactly how many milliseconds each H.264 frame lasts, so that I can timestamp them later when delivering the content through Flash. My question is: is this kind of information available in the NAL units of H.264? I tried to find the answer in the H.264 standard, but the information there is just overwhelming.
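For reference, here is a minimal sketch of the audio-side arithmetic the question relies on (my own illustration, not from the post; the function name is hypothetical): because AMR-NB fixes every frame at 20 ms (160 samples at 8 kHz), audio timestamps fall out of the frame index alone, which is exactly what variable-duration video lacks.

```python
# Minimal sketch, assuming AMR-NB: every frame covers exactly 20 ms
# of audio (160 samples at 8 kHz), so presentation time can be derived
# purely from the frame index. H.264 frames have no such fixed duration.

AMR_FRAME_MS = 20  # fixed by the AMR-NB codec

def amr_timestamp_ms(frame_index):
    """Presentation time in milliseconds of the given AMR frame."""
    return frame_index * AMR_FRAME_MS

# Example: the 50th audio frame plays at exactly 1000 ms, while the
# matching video timestamp depends on the (variable) frame rate.
assert amr_timestamp_ms(50) == 1000
```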

Can someone please point me in the right direction? Thanks.


Timestamps are not in NAL units, but are typically part of RTP. RTP/RTCP also takes care of media synchronisation.
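To make that concrete, here is a hedged sketch (not from the answer; the function names are my own) of how RTP timestamps are typically assigned: each medium runs on its own clock (90 kHz for H.264 per RFC 6184, 8 kHz for AMR per RFC 4867), the timestamp encodes the sampling instant, and RTCP sender reports pair each stream's RTP clock with wall time so the receiver can line the streams up.

```python
import time

VIDEO_CLOCK_HZ = 90_000       # H.264 over RTP uses a 90 kHz clock (RFC 6184)
AMR_SAMPLES_PER_FRAME = 160   # 20 ms of AMR-NB audio at the 8 kHz RTP clock

def video_rtp_timestamp(capture_time_s, base_time_s):
    # The RTP timestamp encodes the sampling instant of the frame, so
    # variable frame durations need no special handling: each frame is
    # stamped with its capture time converted to 90 kHz ticks.
    return int(round((capture_time_s - base_time_s) * VIDEO_CLOCK_HZ)) & 0xFFFFFFFF

def audio_rtp_timestamp(frame_index):
    # Fixed-duration frames advance the clock by a constant 160 ticks.
    return (frame_index * AMR_SAMPLES_PER_FRAME) & 0xFFFFFFFF

# Usage: stamp a video frame captured "now", relative to stream start.
base = time.monotonic()
ts = video_rtp_timestamp(time.monotonic(), base)
```

RTCP then handles the cross-stream part: a sender report maps each stream's RTP timestamp to a shared NTP wall-clock time, which is what actually keeps audio and video in sync at the receiver.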

The RTP payload format for H.264 (RFC 6184) might also be of interest to you.
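As a rough illustration of what that payload format looks like in practice, here is a sketch (my own, under stated assumptions, not part of the answer) of a single-NAL-unit RTP packet: a 12-byte RTP header followed by the NAL unit without its start code. Payload type 96 is a common dynamic choice, not mandated by any spec.

```python
import struct

def rtp_packet(nal_unit, seq, timestamp, ssrc, marker, payload_type=96):
    """Wrap one H.264 NAL unit (without start code) in an RTP packet
    (single NAL unit mode, RFC 6184 section 5.6)."""
    header = struct.pack(
        "!BBHII",
        0x80,                          # V=2, P=0, X=0, CC=0
        (marker << 7) | payload_type,  # marker bit + payload type
        seq & 0xFFFF,                  # sequence number
        timestamp & 0xFFFFFFFF,        # 90 kHz media timestamp
        ssrc,                          # stream identifier
    )
    return header + nal_unit

# All NAL units belonging to the same video frame share one timestamp;
# the marker bit is set only on the last packet of the frame.
```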

If you are not using RTP, are you just sending raw data units over the network?
