[FFmpeg-devel] [PATCH] WIP: subtitles in AVFrame
george at nsup.org
Fri Nov 11 17:25:05 EET 2016
On primidi, 21 Brumaire, year CCXXV, wm4 wrote:
> I'm saying that a good solution is application-specific. It depends on
> the exact data flow, which depends on what kind of filter graph is used.
The library should still try to make things as simple as possible.
> libavfilter still doesn't know whether there are subtitle frames at
> a future timestamp if you send heartbeat frames.
Care to explain that statement?
Apparently you still have not understood how it is supposed to work.
Here is the problem, once and for all.
Consider a subtitle_hardburn filter with a video input and a subtitle
input. The last subtitle segment ended at t=500, the next one starts at
t=1500. A video frame arrives at t=1000. In order to process that
frame, the filter needs to know that there is no subtitle to hardburn at
that time, and for that, it needs the next subtitle frame; thus, it
buffers the video frame and requests a subtitle one.
If the subtitle comes from a separate file, the frame at t=1500 is read,
injected to the filter, and it works.
If the subtitle comes from the same file, trying to read that frame
causes the video at t=1000.1 to be read, then at t=1000.2... and so on
until t=1499.9, buffering about 5000 video frames before the filter can
process anything.
With heartbeat frames, the video frame at t=1000.1 generates one. When
it arrives at the filter, the filter can do its work.
Hence, the heartbeat frames reduce the latency from huge and unbounded
to a single frame.
> What I'm suggesting is that the API user has to be aware of the
> requirements his filter chain might have on subtitle input. Thus, if
> the user requests output for which a subtitle frame would have been
> required, but it didn't send one, libavfilter assumes that there is
> no subtitle for that timestamp.
Therefore, you are trying to turn the request_frame() call into the
equivalent of the heartbeat frames. Sorry, it cannot work that way.
Data in libavfilter has to flow forwards, not backwards.
> If you want your heartbeat frames to work,
They work the way I described.
> they would have to go into
> the reverse direction: send the timestamp up to the source filters, let
> the user check the timestamp, and then send a heartbeat frame if for
> this _specific_ timestamp no subtitle frame will be available. But that
> is also complex.
Maybe that could work, but I am very dubious because of the big picture.
But even if it does, it would mean a huge change to the whole API. I
like my idea better, and at least I know it works.
> I consider this reply very disrespectful.
It is no more disrespectful than trying to dictate how I should answer
in the first place.
> Maybe read the CoC?