[FFmpeg-user] Nvidia GPU [was (no subject)]

Carl Eugen Hoyos cehoyos at ag.or.at
Thu Dec 15 22:12:16 CET 2011


Tom Evans <tevans.uk <at> googlemail.com> writes:

> On Thu, Dec 15, 2011 at 6:42 PM, Lou <lou <at> lrcd.com> wrote:
> > * The D in VDPAU stands for Decode, so it doesn't help the actual
> >  encoding, and it doesn't use the GPU, but an onboard acceleration
> >  chip, IIRC.
> >
> > This is a subject with which I am somewhat unfamiliar, so others may
> > want to add more accurate information or correct me.
> 
> I believe the decoding (or at least part of it) is offloaded onto the
> shaders*, and hence performance will differ from card to card,
> depending on the speed and number of shaders.

ffmpeg does NOT support hardware-accelerated video (or audio) decoding.

FFmpeg supports video player applications that use XvMC, VDPAU, VA-API or DXVA2.
(None of those use the shaders; the decoding speed does not depend at all on
the number of shaders. My GT 520, a low-end card, decodes H.264 "faster" than
any high-end card with ten times as many shaders. De-interlacing speed depends
heavily on the number of shaders, but de-interlacing needs no libavcodec
support.)

Or in other words: libavcodec contains code that other applications can use to
decode in hardware; ffmpeg (the application) does not use that code.
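
For illustration (my addition, not something from this thread), here is a
minimal sketch of the opt-in steps a player performs, using the libavcodec
API names of this era; the separate "h264_vdpau" decoder and the
vdpau_render_state handling are what I know from PureVideo/VDPAU, and the
error paths are simplified:

#include <libavcodec/avcodec.h>
#include <libavcodec/vdpau.h>

/* Sketch: how a player (not the ffmpeg command-line tool) opts into
 * VDPAU-accelerated H.264 decoding via libavcodec. */
int open_vdpau_h264(AVCodecContext **out)
{
    AVCodec *codec;
    AVCodecContext *ctx;

    avcodec_register_all();

    /* The hardware path is a separate decoder, not the plain "h264". */
    codec = avcodec_find_decoder_by_name("h264_vdpau");
    if (!codec)
        return -1;

    ctx = avcodec_alloc_context3(codec);
    if (!ctx || avcodec_open2(ctx, codec, NULL) < 0)
        return -1;

    /* The player then feeds packets as usual; instead of pixel data,
     * each decoded frame carries a struct vdpau_render_state that the
     * player hands to VdpDecoderRender and displays through VDPAU.
     * All of that happens inside the player, which is exactly why the
     * ffmpeg application itself never decodes in hardware. */
    *out = ctx;
    return 0;
}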

Carl Eugen

Disclaimer: I only know about PureVideo for VDPAU; it is possible that other
technologies work differently (but I would be surprised).

Disclaimer 2: The situation is different on Android, where FFmpeg supports
libstagefright to decode H.264, but I don't know much about it.
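
If you want to experiment with it, the wrapper is (as far as I know) exposed
as a separate decoder, so something along these lines should select it,
assuming a build configured with --enable-libstagefright-h264:

ffmpeg -vcodec libstagefright_h264 -i input.mp4 -f null -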


