[FFmpeg-devel] RTCP packets issue

sateesh babu sateesh.k112001
Wed Jul 21 16:43:50 CEST 2010


Hi,

    I am using the ffmpeg libraries to receive an MPEG-4 stream from an RTSP
server (a Siqura C60 E encoder). The server expects an RR packet from the
client as a reply to each SR packet it sends. VLC plays the stream without
any disconnection, but the server tears down the connection when the ffmpeg
libraries are used. After reading the archived discussions, I have made the
following changes and recompiled the libraries.

1. Modified the function rtp_check_and_send_back_rr(RTPDemuxContext
*s, int count) so that the client's SSRC differs from the server's SSRC,
in case the server rejects reports that reuse its own SSRC:

put_be32(pb, s->ssrc+1); // our own SSRC

and likewise in the CNAME (SDES) block below it:

put_be16(pb, (6 + len + 3) / 4); /* length in words - 1 */
put_be32(pb, s->ssrc+1);
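
For reference, the offset-by-one change amounts to something like the
following self-contained sketch. write_be32() and fill_rr_ssrc() are
illustrative stand-ins, not FFmpeg API, and the offsets assume the standard
RFC 3550 RR layout (header word, then the reporter's SSRC, then the SSRC of
the source being reported on):

```c
#include <stdint.h>

/* Hypothetical helper mirroring put_be32(): writes a 32-bit value
 * into a buffer in network (big-endian) byte order. */
static void write_be32(uint8_t *buf, uint32_t v)
{
    buf[0] = (uint8_t)(v >> 24);
    buf[1] = (uint8_t)(v >> 16);
    buf[2] = (uint8_t)(v >> 8);
    buf[3] = (uint8_t)v;
}

/* Sketch of the two SSRC fields of an RR packet, assuming server_ssrc
 * holds the SSRC learned from the server's SR. Offsetting it by one
 * gives the client a distinct SSRC, as described above. */
static void fill_rr_ssrc(uint8_t *pkt, uint32_t server_ssrc)
{
    write_be32(pkt + 4, server_ssrc + 1); /* our own (distinct) SSRC      */
    write_be32(pkt + 8, server_ssrc);     /* SSRC of the reported source  */
}
```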

2. The current code always writes 0 for the 'delay since last SR' (DLSR)
field. I changed uint64_t ntp_time = s->last_rtcp_ntp_time to uint64_t
ntp_time = av_gettime()/1000000 so that a reasonable delay is reported.

3. Wireshark shows all the incoming SR RTCP packets, but the library
sometimes seems to send an RR packet carrying a stale NTP timestamp, and I
suspect the server rejects it.
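
One thing worth checking here is whether the local time is being converted
to NTP format at all: av_gettime() returns Unix-epoch microseconds, while
RTCP NTP timestamps count seconds since 1900 in the high 32 bits and a
fraction of a second in the low 32 bits. A hedged sketch of that conversion
(unix_us_to_ntp() is an illustrative name, not FFmpeg code):

```c
#include <stdint.h>

#define NTP_OFFSET_SEC 2208988800ULL /* seconds between the 1900 and 1970 epochs */

/* Sketch: convert a Unix-epoch microsecond timestamp (as returned by
 * av_gettime()) into 64-bit NTP format: seconds since 1900 in the high
 * 32 bits, fractional second in the low 32 bits. */
static uint64_t unix_us_to_ntp(uint64_t unix_us)
{
    uint64_t sec  = unix_us / 1000000;
    uint64_t frac = ((unix_us % 1000000) << 32) / 1000000;
    return ((sec + NTP_OFFSET_SEC) << 32) | frac;
}
```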

Is it possible that some RTCP packets are being discarded for some reason,
and if so, where in the code could that happen? It would be great if anyone
could point me in the right direction.


Thanks.
Sateesh


