[Libav-user] Timing problems while decoding
stefan berke
ffmpeg at modular-audio-graphics.com
Thu Apr 21 06:59:20 CEST 2016
Hello there,
I'm writing a high-quality video player using ffmpeg. The basic design
is: a worker thread pulls video frames from libavcodec/libavformat and
buffers them; a window thread uploads the frames (YUV420) to the GPU and
displays them. This works at least up to 4k x 4k at 60 Hz on good hardware.
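
For reference, a condensed sketch of that pipeline (using the
avcodec_decode_video2() decode API; the format/codec context setup and
error handling are omitted, and FrameQueue is just an illustrative name):

    #include <queue>
    #include <mutex>
    #include <condition_variable>
    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    struct FrameQueue
    {
        std::queue<AVFrame*> frames;
        std::mutex mutex;
        std::condition_variable cond;
        size_t maxSize = 16; // buffer a handful of frames ahead

        void push(AVFrame* f)
        {
            std::unique_lock<std::mutex> lock(mutex);
            cond.wait(lock, [this]{ return frames.size() < maxSize; });
            frames.push(f);
            cond.notify_all();
        }
        AVFrame* pop()
        {
            std::unique_lock<std::mutex> lock(mutex);
            cond.wait(lock, [this]{ return !frames.empty(); });
            AVFrame* f = frames.front();
            frames.pop();
            cond.notify_all();
            return f;
        }
    };

    // worker thread: pull packets, decode, buffer the frames;
    // the window thread pops them and frees them after upload
    void decodeLoop(AVFormatContext* fmt, AVCodecContext* codec,
                    int videoStream, FrameQueue& queue)
    {
        AVPacket packet;
        while (av_read_frame(fmt, &packet) >= 0)
        {
            if (packet.stream_index == videoStream)
            {
                AVFrame* frame = av_frame_alloc();
                int gotFrame = 0;
                avcodec_decode_video2(codec, frame, &gotFrame, &packet);
                if (gotFrame)
                    queue.push(frame); // display thread frees it
                else
                    av_frame_free(&frame);
            }
            av_packet_unref(&packet);
        }
    }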
The most important thing is to maintain a constant window refresh rate
to avoid tearing and flickering. As long as the decoder is fast enough,
there should normally be no problem. However, the timing of the window
refresh is badly affected whenever the decoder (h264/h265) is running,
even if it runs as a separate application. I just can't get a constant
update rate (e.g. 60 Hz), regardless of the graphics content, while
ffmpeg is working on the same machine.
The problem appears on Windows and (not as badly, but still) on Linux.
It happens both in forced vsync mode (through Nvidia-Settings) and with
hand-made timing using a high-precision spin-wait (based on
QueryPerformanceCounter [win] or clock_gettime(CLOCK_MONOTONIC, ...) [linux]).
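
The Linux version of that spin-wait looks roughly like this (the Windows
variant uses QueryPerformanceCounter/QueryPerformanceFrequency the same
way; renderFrame() and swapBuffers() are hypothetical placeholders):

    #include <time.h>

    // current time in seconds from the monotonic clock
    static double nowSeconds()
    {
        timespec t;
        clock_gettime(CLOCK_MONOTONIC, &t);
        return t.tv_sec + t.tv_nsec * 1e-9;
    }

    // busy-wait until an absolute point in time, e.g. the next 60 Hz slot
    static void spinWaitUntil(double target)
    {
        while (nowSeconds() < target)
            ; // burn one core for maximum precision
    }

    // usage in the render loop, assuming a fixed 60 Hz period:
    //   double next = nowSeconds();
    //   for (;;) {
    //       next += 1.0 / 60.0;
    //       renderFrame();
    //       spinWaitUntil(next);
    //       swapBuffers();
    //   }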
Systems tried: Ubuntu 14.04, Windows 7 & 10, Core i7 and Xeon CPUs, and
various Nvidia Quadro cards.
The decoder does not use the GPU.
I also tried reducing AVCodecContext::thread_count, which does not help.
Reducing it to 1 makes the fluctuation in timing less pronounced, but it
is still visible. So neither the GPU nor the CPU is fully loaded, yet
the timing is affected.
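
For completeness, this is how the thread count was limited, assuming the
usual setup variables (codecContext, codec) before avcodec_open2():

    codecContext->thread_count = 1; // single decoding thread
    // optionally also restrict the threading model:
    // codecContext->thread_type = FF_THREAD_SLICE; // instead of FF_THREAD_FRAME
    if (avcodec_open2(codecContext, codec, nullptr) < 0)
        { /* handle error */ }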
Running dummy tasks (like calculating std::sin()) on all available
processors does not influence the timing at all.
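That test was essentially the following sketch (pure ALU work, no
syscalls, no shared state; names are illustrative):

    #include <atomic>
    #include <cmath>
    #include <thread>
    #include <vector>

    std::atomic<bool> stopBurning(false);

    // saturate one logical core with floating-point work
    void burnCpu()
    {
        volatile double x = 0.1;
        while (!stopBurning)
            x = x + std::sin(x);
    }

    // launch one burner per logical core while the player runs:
    //   std::vector<std::thread> burners;
    //   for (unsigned i = 0; i < std::thread::hardware_concurrency(); ++i)
    //       burners.emplace_back(burnCpu);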
Does anyone have an idea what ffmpeg is doing internally that affects
the screen refresh?
Thanks in advance,
Stefan