[MPlayer-dev-eng] How to synchronize audio/video RTP streams?

Ross Finlayson finlayson at live.com
Mon Sep 23 12:49:08 CEST 2002


Recently, on this mailing list, there's been discussion of audio/video 
synchronization.  With this in mind, I want to ask a related question:  How 
should mplayer properly synchronize audio & video RTP streams (i.e., streams 
received using the LIVE.COM libraries)?

Note that for most RTP sessions, audio and video are delivered in separate 
packet streams, and do not, in general, arrive perfectly in sync.  The good 
news, however, is that there is timestamp information carried in each RTP 
packet, and this - along with the RTCP 'control' protocol - makes it 
possible for receivers to assign accurate real-time timestamps to each 
incoming RTP packet, which in turn makes it possible to synchronize audio 
and video.
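
For concreteness, here is a rough sketch (not code from any particular 
library) of how a receiver can turn RTP timestamps into wall-clock 
presentation times.  An RTCP Sender Report pairs an NTP wall-clock time 
with the RTP timestamp in effect at that instant; later packets are then 
placed on the wall clock by scaling the RTP timestamp difference by the 
media clock rate (90000 Hz for MPEG video, typically the sampling rate 
for audio):

  #include <stdint.h>

  // Sync information learned from the most recent RTCP Sender Report.
  struct RtcpSyncInfo {
    double   ntpTimeSecs;   // wall-clock time reported in the last SR
    uint32_t rtpTimestamp;  // RTP timestamp reported in the same SR
    unsigned clockRate;     // media timestamp units per second (e.g. 90000)
  };

  // Returns the wall-clock presentation time (in seconds) for a packet.
  double rtpToWallclock(const RtcpSyncInfo& sync, uint32_t packetRtpTimestamp) {
    // Modular 32-bit subtraction handles RTP timestamp wrap-around.
    int32_t delta = (int32_t)(packetRtpTimestamp - sync.rtpTimestamp);
    return sync.ntpTimeSecs + (double)delta / sync.clockRate;
  }

Because audio and video both get mapped onto the same wall clock this way, 
their presentation times become directly comparable even though the two 
packet streams arrive independently.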

Recent versions of the LIVE.COM RTP library properly implement this 
timestamping - so that whenever "demux_rtp_fill_buffer()" is called  (by 
"demux_fill_buffer()") to deliver a filled-in 'demux_packet' to the caller, 
the code knows an accurate presentation time for this data.  (For MPEG 
streams, this accuracy is to 1/90000 of a second - i.e., to ~11 us.)
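
To illustrate one possible bridge into mplayer (the names below are 
illustrative only, not existing MPlayer or LIVE.COM identifiers): since the 
RTP receiving code ends up knowing a 'struct timeval' presentation time for 
each completed frame, and mplayer's sync code works in floating-point 
seconds, the demuxer could remember the first presentation time it sees and 
report everything relative to it:

  #include <sys/time.h>

  // First presentation time seen in this session (illustrative sketch only).
  static struct timeval firstPT = {0, 0};
  static bool havePT = false;

  // Convert an absolute presentation time into seconds since stream start.
  double relativePTS(const struct timeval& presentationTime) {
    if (!havePT) {                     // remember the session's first timestamp
      firstPT = presentationTime;
      havePT  = true;
    }
    return (presentationTime.tv_sec  - firstPT.tv_sec)
         + (presentationTime.tv_usec - firstPT.tv_usec) / 1e6;
  }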

My question, then, is: How best can the rest of mplayer use this 
information to properly synchronize audio and video?  I.e., can mplayer's 
existing audio/video synchronization mechanism make use of the timestamp 
information that's available for each 'demux_packet', and if so, 
how?  (There's no obvious field in a "demux_packet_t" for 
timestamp/synchronization information.)
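
As a purely hypothetical sketch of what I have in mind - assuming a 
per-packet timestamp field (called 'pts' below, in seconds) were added; 
neither the field nor the helper exists in the current code - the RTP 
demuxer could stash the presentation time alongside the data, so that the 
A/V sync logic downstream could compare audio and video timestamps directly 
instead of relying on arrival order:

  #include <stdlib.h>
  #include <string.h>

  // Hypothetical packet layout: 'pts' is an assumed field, not a claim
  // about the existing demux_packet_t structure.
  typedef struct rtp_demux_packet_sketch {
    unsigned char* buffer;
    int            len;
    double         pts;    // RTP-derived presentation time, in seconds
  } rtp_demux_packet_sketch_t;

  // What demux_rtp_fill_buffer() could do once it knows both the frame
  // data and its presentation time: copy the data as it does today, and
  // keep the timestamp with the packet it hands up to the caller.
  rtp_demux_packet_sketch_t* new_timed_packet(const unsigned char* data,
                                              int len, double presentationTime) {
    rtp_demux_packet_sketch_t* dp =
        (rtp_demux_packet_sketch_t*)malloc(sizeof(*dp));
    dp->buffer = (unsigned char*)malloc(len);
    memcpy(dp->buffer, data, len);
    dp->len = len;
    dp->pts = presentationTime;
    return dp;
  }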

It would be ideal if synchronization of RTP streams could be achieved using 
mplayer's existing synchronization mechanism(s), with as little change as 
possible to the existing mplayer code.

	Ross.



