[FFmpeg-devel] Confusion over temporal filters.
divetec at rling.com
Sun Sep 10 08:57:13 EEST 2017
I'm thinking of adding a temporal filter (one that relies on context from
previous frames), and I've realised I'm a bit confused about how they are
supposed to behave.
Say I open a file with ffplay and let it play up to frame 100. Then I open
the same file with another instance of ffplay and seek directly to frame
100. It seems to me that frame 100 should look the same in both cases.
Put another way, a specific set of filters, applied to a specific file,
should completely define what every frame of that file looks like. Seeking
around in the file should not change that. Is that right (in principle)?
So, looking at some of the existing temporal filters (eg. deflicker), I
don't think that is what happens. They filter based on the frames
previously passed to the filter, and if the user seeks over a bunch of
frames, the filter will see frames as consecutive that are not actually
consecutive in the file, so it will give a different result. Also, looking
at the API, I can't see a way to get the behaviour I expect. I can't see a
way for a filter to ask its inputs for a frame from a different (specific)
time. Is that right?
If my understanding is wrong, please let me know!
If my understanding is correct, then I guess my questions are:
(1) is this behaviour a known issue (or a deliberate design choice)?
(2) is it OK for new temporal filters to keep the same behaviour as
existing ones -- that is, they will give different results when seeking
happens, compared to sequential processing? If so, I'll just do the same
thing for my filter.
Thanks in advance for your thoughts.