[FFmpeg-devel] [PATCH] document rtsp.h
Ronald S. Bultje
rsbultje
Mon Feb 16 17:32:45 CET 2009
Hi,
On Fri, Feb 6, 2009 at 7:34 AM, Michael Niedermayer <michaelni at gmx.at> wrote:
> On Thu, Feb 05, 2009 at 05:41:38PM -0500, Ronald S. Bultje wrote:
>> On Thu, Feb 5, 2009 at 12:27 PM, Michael Niedermayer <michaelni at gmx.at> wrote:
>> > On Thu, Feb 05, 2009 at 09:24:21AM -0500, Ronald S. Bultje wrote:
>> >> +/**
>> >> + * This describes the server response to each RTSP command.
>> >> + */
>> >
>> >> typedef struct RTSPHeader {
>> >
>> > poor name ...
>>
>> I can rename it if you suggest something better. :-).
>
> ServerResponse assuming your comment is correct
> an alternative would be ReceivedRTSPHHeader
RTSPServerResponse seems good to me. The status_code inside it can then
stay as it is, since its containing structure already makes it more than
clear that it's part of a "server response".
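Rough sketch of what I mean (just to illustrate the rename; the actual
field list of course stays whatever rtsp.h has now):

/**
 * This describes the server response to each RTSP command.
 */
typedef struct RTSPServerResponse {
    int status_code; /**< response code from the server (e.g. 200 = OK) */
    /* ... all other members unchanged from the current RTSPHeader ... */
} RTSPServerResponse;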
>> >> + /** The following are used for dynamic/private protocols (payloads) */
>> >> + //@{
>> >> + /** handler structure */
>> >> + RTPDynamicProtocolHandler *dynamic_handler;
>> >> + /** private data associated with the dynamic protocol */
>> >> + PayloadContext *dynamic_protocol_context;
>> >> + //@}
>> >
>> > whats a dynamic/private protocol ?
>>
>> A private protocol is one with a payload ID (in the SDP description or
>> RTP header) above 96. Anything below 96, the mediatype for the payload
>> ID is fixed. Anything above, it's "private" and is mapped to a
>> mediatype in the rtpmap: SDP line. In the end, the mediatype is the
>> string given in each RTPDynamicProtocolHandler, e.g. "x-pn-realaudio"
>> for Realaudio-in-RDT, etc.
>>
>> A dynamic protocol is what we in ffmpeg implement outside rtpdec.c, in
>> practice, so it's one of those that implements a
>> RTPDynamicProtocolHandler (a "plugin" to rtpdec.c). These are commonly
>> the same as the private ones. Anyway, to not confuse anyone, I removed
>> the "private", since it literally really refers just to dynamic
>> protocols.
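(To make the rtpmap: mapping concrete, with made-up numbers: an SDP line
like "a=rtpmap:101 x-pn-realaudio/1000" is what tells us that the
otherwise meaningless dynamic payload type 101 carries that mediatype.)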
>
> why are not all dynamic?
Let's interpret "dynamic" as requiring a "plug-in" to rtpdec.c.
Here, being dynamic implicitly implies that the payload requires some
special handling. RTPDynamicProtocolHandler currently has two
"special-handling" functions (plus two to alloc/dealloc the private
data): parse_sdp_a_line() and parse_packet(), for parsing SDP line data
and payload data, respectively.
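For reference, the handler currently looks roughly like this (quoting
from memory of rtpdec.h, so the exact member names and signatures may be
slightly off):

typedef struct RTPDynamicProtocolHandler_s {
    const char enc_name[50];   /**< mediatype string from the rtpmap: line */
    enum CodecType codec_type;
    enum CodecID codec_id;

    /** optional special-case parsing of SDP "a=" lines */
    int (*parse_sdp_a_line)(AVFormatContext *s, int st_index,
                            PayloadContext *priv_data, const char *line);
    /** alloc/dealloc of the per-stream private data */
    PayloadContext *(*open)(void);
    void (*close)(PayloadContext *protocol_data);
    /** optional special-case parsing of the payload data itself */
    DynamicPayloadPacketHandlerProc parse_packet;

    struct RTPDynamicProtocolHandler_s *next;
} RTPDynamicProtocolHandler;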
Why are not all of them dynamic? Because apparently many formats require
no special-case handling of either SDP or payload data. Special handling
of the payload data usually means that the data is preceded by some
format-specific header. E.g. for ASF, this header allows spreading long
ASF frames over multiple RTP fragments, or putting several shorter ASF
frames together in a single RTP fragment, and it has a keyframe flag
which RTP lacks. X-QT is similar. RDT/RM and ASF contain complete "muxed"
frames in their respective formats (unlike QT), and their SDP contains
complete headers for that bastard-"muxed" format.
Simpler formats, such as MP3, A-law/mu-law and PCM data, simply don't
require this. Therefore, they don't have a dynamic protocol handler
associated with them; their parsing is handled by a general-case parsing
function (rtp_parse_packet()) in rtpdec.c.
Do you want me to add a short version of this in the comment?
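The dispatch in rtpdec.c then boils down to something like this (a
sketch of the idea, not the literal code):

    if (s->parse_packet) {
        /* a dynamic protocol handler is attached to this payload type:
         * let it strip the format-specific header and assemble the frame */
        rv = s->parse_packet(s->dynamic_protocol_context, s->st, pkt,
                             &timestamp, buf, len, flags);
    } else {
        /* static payload type (MP3, A/mu-law, PCM, ...): falls through to
         * the general-case code in rtp_parse_packet() itself */
    }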
>> +/**
>> + * Client state, i.e. whether we are currently receiving data (PLAYING) or
>> + * setup-but-not-receiving (PAUSED). State can be changed in applications
>> + * by calling av_read_play/pause().
>> + */
>> enum RTSPClientState {
>> - RTSP_STATE_IDLE,
>> - RTSP_STATE_PLAYING,
>> - RTSP_STATE_PAUSED,
>
>> + RTSP_STATE_IDLE, /**< not initialized */
>
> RTSP_STATE_NOT_INIT ?
I'm indifferent, IDLE is fine for me really... Luca?
>> +/**
>> + * Private data for the RTSP demuxer.
>> + */
>> typedef struct RTSPState {
>> URLContext *rtsp_hd; /* RTSP TCP connexion handle */
>> + /** number of items in the 'rtsp_streams' variable */
>> int nb_rtsp_streams;
>> - struct RTSPStream **rtsp_streams;
>> + struct RTSPStream **rtsp_streams; /**< streams in this session */
>>
>> + /** whether we are currently receiving data from the server. This
>> + * value is set to PLAYING after a call to rtsp_read_play() and to
>> + * PAUSED after initialization or after a call to rtsp_read_pause(). */
>> enum RTSPClientState state;
[..]
>> > what is state?
>> > whether we are currently receiving data from the server
>> >
>> > didnt work out
>>
>> I'm not sure what you want me to add. I added how the variable is
>> changed but my guess is you want something else. sorry, I'm not
>> completely sure what information you're looking for here...
>
> the variable is called state.
> Add a text that fits after the question
> What is the variable state?
You didn't comment on the code, so I guess it's still not OK, and I'm
still at a loss. To me, the comment answers what state is (it is an
indicator of whether we are currently receiving data from the server).
It really isn't more than a simple cache of the last PLAY/PAUSE command
sent to the server, to make sure we don't unexpectedly send the same
command twice, or send commands in the wrong state.
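In other words, roughly (a sketch, not the literal rtsp.c code):

static int rtsp_read_pause(AVFormatContext *s)
{
    RTSPState *rt = s->priv_data;

    if (rt->state != RTSP_STATE_PLAYING)
        return 0;        /* not playing: don't send a redundant PAUSE */
    /* ... send the PAUSE command to the server ... */
    rt->state = RTSP_STATE_PAUSED;
    return 0;
}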
>
>> + /** the seek value requested when calling av_seek_frame(). This way,
>> + * the seek value is saved if we are currently paused and will be
>> + * transmitted at the next PLAY RTSP command. See rtsp_read_play(). */
>> int64_t seek_timestamp;
>
> do you realize that this is broken ?
>
> pause, seek to 1s from start, play 5h, pause, play
>
> the last play will seek back to 1s
> no? then the comment is wrong ...
It's only used once, of course (in practice the code is a bit funny,
but I think it's correct). The value is cached in case we're paused
while seeking.
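For clarity, the intended flow is roughly this (again a sketch):

static int rtsp_read_seek(AVFormatContext *s, int stream_index,
                          int64_t timestamp, int flags)
{
    RTSPState *rt = s->priv_data;

    rt->seek_timestamp = av_rescale_q(timestamp,
                                      s->streams[stream_index]->time_base,
                                      AV_TIME_BASE_Q);
    switch (rt->state) {
    case RTSP_STATE_IDLE:
        break;                        /* nothing to send yet */
    case RTSP_STATE_PLAYING:
        return rtsp_read_play(s);     /* re-issue PLAY with the new range */
    case RTSP_STATE_PAUSED:
        rt->state = RTSP_STATE_IDLE;  /* keep the value for the next PLAY */
        break;
    }
    return 0;
}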
Ronald