[MEncoder-users] Nvidia Cuda or ATI Avivo Support

Reimar Döffinger Reimar.Doeffinger at gmx.de
Wed Jan 12 19:11:20 CET 2011


On Wed, Jan 12, 2011 at 01:15:11PM +0100, Matus UHLAR - fantomas wrote:
> On 05.01.11 01:05, Vladimir Mosgalin wrote:
> > I'd like to post this link to back up these words - well, not 75%, but it
> > really looks like with the current official Intel, AMD and Nvidia GPU-accelerated
> > encoding solutions on Windows you have to pay in quality for what you
> > gain in speed, and the price is pretty noticeable, with CUDA being a total
> > disaster especially. It's from the fresh Sandy Bridge processor review by
> > AnandTech.
> > 
> > http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested/9
> > 
> > (also it's pretty noticeable that nothing but Intel's two-day-old
> > solution is really faster than a modern multicore CPU..)

There's nothing in the article that rules out the possibility that this is
simply because the CPU encoder they compared against is utter crap.
In fact, that no specific encoder like x264 or MainConcept
is mentioned suggests it is, making the whole comparison pointless
and invalid.

> this seems to be mostly a problem of the algorithms used, not the architecture
> itself, with the exception of floating-point precision.

Video encoding doesn't need more than 16-bit precision; floating point
is pointless and slow.
On GPUs, however, the algorithms that are used are used because of
the architecture: nobody knows of an H.264 encoding "algorithm" that
gives good quality without running at less than half the speed on a GPU
compared to the best ones running on a CPU.

> Maybe a combination of those two could be found that would be fastest
> without quality loss

Maybe. Though my conclusion is that a lot of people seem unable to understand
that parallelization is hard, and sometimes simply not possible, so you have
to consider the possibility that a fast, good-quality GPU H.264 encoder
may simply not be achievable without introducing hardware that doesn't
really belong in a "just-3D-graphics" GPU. That is the approach the video
decoding functionality takes, but encoding has an additional issue: you
can't know in advance what features encoding hardware will need to stay
"state-of-the-art" for H.264, whereas for decoding the requirements don't
change as long as the standard stays the same.


More information about the MEncoder-users mailing list