[FFmpeg-devel] [PATCH 1/6] Frame-based multithreading framework using pthreads
Michael Niedermayer
michaelni
Wed Nov 17 04:37:58 CET 2010
On Mon, Nov 15, 2010 at 01:22:57PM -0500, Alexander Strange wrote:
>
> On Nov 15, 2010, at 12:37 PM, Ronald S. Bultje wrote:
>
> > Hi,
> >
> > On Mon, Nov 15, 2010 at 5:20 PM, Reimar Döffinger
> > <Reimar.Doeffinger at gmx.de> wrote:
> >> On Mon, Nov 15, 2010 at 08:37:01AM -0500, Alexander Strange wrote:
> >>> +* There is one frame of delay added for every thread beyond the first one.
> >>> + Clients using dts must account for the delay; pts sent through reordered_opaque
> >>> + will work as usual.
> >>
> >> Is there a way to query this? I mean the application
> >> knows how many threads it set, but that might not always
> >> be the same number as FFmpeg uses or so?
> >
> > This is poorly designed anyway (imho). While at it, why not start
> > using the AVFrame.pts/dts fields as is done for encoding (and a compat
> > wrapper for those poor souls using reordered_opaque).
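To make the delay accounting from the patch concrete: with N threads, frame threading adds N-1 frames of delay, so a dts-tracking client has to shift its clock by that many frame durations. A rough sketch of one way to compensate (the helper name and shape are illustrative, not part of lavc):

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative helper, not a lavc API: frame threading adds one frame
 * of delay per extra thread, so a client tracking dts can map an output
 * timestamp back by (thread_count - 1) frame durations.  All values are
 * in the same time base. */
static int64_t dts_with_thread_delay(int64_t dts, int thread_count,
                                     int64_t frame_duration)
{
    int delay_frames = thread_count > 1 ? thread_count - 1 : 0;
    return dts - (int64_t)delay_frames * frame_duration;
}
```

With 4 threads and 40 ms frames the output lags by 120 ms; with a single thread nothing changes. This also shows why Reimar's question matters: the client can only do this if it knows the thread count lavc actually uses.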
>
> Nitpick #1:
> I don't like the way encoding works. avcodec_encode_video writes into a raw buffer, so you have to look at fields inside AVCodecContext.coded_frame, which is IMO ugly.
> Adding an avcodec_encode_video2() returning an AVPacket would be nice, but I haven't found a reason to care about this part of encoding recently so I didn't look.
agree, avcodec_encode_video() needs some slight API changes
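One hypothetical shape for such an avcodec_encode_video2()-style call, with the timestamps travelling in the returned packet instead of coded_frame (the structs below are simplified stand-ins, not the real lavc types, and the encoder body is a stub):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Simplified stand-ins for the lavc types; the real AVPacket/AVFrame
 * carry many more fields. */
typedef struct Packet { uint8_t data[64]; int size; int64_t pts, dts; } Packet;
typedef struct Frame  { int64_t pts; } Frame;

/* One possible shape for a packet-returning encode call, as suggested
 * above.  Purely illustrative: the bitstream is faked and the
 * timestamps are passed through unreordered. */
static int encode_video2(Packet *pkt, const Frame *frame)
{
    static const uint8_t fake_bitstream[4] = { 0, 0, 0, 1 };
    memcpy(pkt->data, fake_bitstream, sizeof(fake_bitstream));
    pkt->size = sizeof(fake_bitstream);
    pkt->pts  = frame->pts;   /* reordered timestamp from the encoder */
    pkt->dts  = frame->pts;   /* no B-frames in this toy: dts == pts  */
    return 0;                 /* 0 on success, as elsewhere in lavc   */
}
```

The point of the shape is that the caller gets buffer, size, and both timestamps in one object, rather than reading them back out of AVCodecContext.coded_frame.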
>
> Nitpick #2:
> This only affects dts. AVFrame doesn't have a dts field, because dts isn't supposed to mean anything for decoded pictures.
>
> I definitely agree we should be handling AVPacket.pts/dts stuff for players, but only one timestamp should come out of lavc, and it should just be called "timestamp".
You must be careful here so that players have access to both timestamps if they
want that. In an ideal world it would of course be useless to have more than one,
but in reality they can mismatch, and while I agree we should provide apps with a
single best timestamp, apps which want to decide themselves which is better
should be able to do so.
>
> Steps towards this:
> - track time spent delayed in lavc (dts of the packet input when the first frame was returned - dts of the first packet), somehow don't count the "official" delay assumed by the stream, store it in AVCodecContext
> - use that in guess_correct_pts() and move that code inside avcodec_decode_video somewhere
> - make up fake timestamps based on the stream FPS if we end up with AV_NOPTS_VALUE or non-monotonic timestamps. Of course, for VFR material we can't really do anything sane here.
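For reference, a guess_correct_pts()-style heuristic boils down to counting faults per timestamp field and trusting whichever field has misbehaved less. A simplified sketch, modelled loosely on the logic ffplay uses rather than copied from it (struct and names are illustrative):

```c
#include <assert.h>
#include <stdint.h>

#define NOPTS INT64_MIN   /* stand-in for AV_NOPTS_VALUE */

/* Count how often each timestamp field is non-monotonic, then prefer
 * pts unless it has proven less reliable than dts. */
typedef struct PtsCorrector {
    int64_t last_pts, last_dts;
    int faulty_pts, faulty_dts;
} PtsCorrector;

static int64_t guess_pts(PtsCorrector *c, int64_t pts, int64_t dts)
{
    if (dts != NOPTS) {
        c->faulty_dts += dts <= c->last_dts;
        c->last_dts = dts;
    }
    if (pts != NOPTS) {
        c->faulty_pts += pts <= c->last_pts;
        c->last_pts = pts;
    }
    if ((c->faulty_pts <= c->faulty_dts || dts == NOPTS) && pts != NOPTS)
        return pts;
    return dts;     /* may itself be NOPTS if both inputs were */
}
```

Initialise last_pts/last_dts to NOPTS so the first real timestamp is never counted as a fault. Moving something of this shape inside avcodec_decode_video is what the step above proposes.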
The current code advances the dts by reordered frame duration in the parser
where our parsers support it (mpeg1/2 and all non-B-frame codecs should work).
Current ffmpeg.c also tries its best to advance timestamps by the duration
from the parser and, in the absence of this, uses the recommended default frame
duration (ticks_per_frame).
New code that produces timestamps in the video decoder must not be worse than
what we have now.
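The ticks_per_frame fallback amounts to: default frame duration = ticks_per_frame ticks of the codec time base. A toy helper to illustrate the arithmetic (the example values below are assumptions, not taken from this thread):

```c
#include <assert.h>

/* Illustrative only: when the parser reports no duration, the fallback
 * default is ticks_per_frame ticks of the codec time base.  The time
 * base is given as tb_num/tb_den seconds per tick. */
static double default_frame_duration_sec(int tb_num, int tb_den,
                                         int ticks_per_frame)
{
    return (double)ticks_per_frame * tb_num / tb_den;
}
```

For example, an H.264 stream at 25 fps typically has a codec time base of 1/50 with ticks_per_frame == 2, giving a default duration of 0.04 s per frame.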
[...]
--
Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
The real ebay dictionary, page 3
"Rare item" - "Common item with rare defect or maybe just a lie"
"Professional" - "'Toy' made in china, not functional except as doorstop"
"Experts will know" - "The seller hopes you are not an expert"