[FFmpeg-devel] [PATCH v3 4/4] libavutil/qsv: enabling d3d11va support

Soft Works softworkz at hotmail.com
Sun Apr 26 23:00:30 EEST 2020


> -----Original Message-----
> From: ffmpeg-devel <ffmpeg-devel-bounces at ffmpeg.org> On Behalf Of
> Mark Thompson
> Sent: Sunday, April 26, 2020 8:54 PM
> To: ffmpeg-devel at ffmpeg.org
> Subject: Re: [FFmpeg-devel] [PATCH v3 4/4] libavutil/qsv: enabling d3d11va
> support
> 
> On 24/04/2020 15:52, artem.galin at gmail.com wrote:
> > From: Artem Galin <artem.galin at intel.com>
> >
> > Makes selection of d3d11va device type by default and over DirectX 9,
> > which is still supported but requires explicit selection.
> 
> ... which might break users with older drivers/systems.  Some comment on
> exactly which setups are broken would be helpful here.

I have done some investigation on this question: https://github.com/softworkz/ffmpeg_dx11/issues/1

A short summary: 

- D3D11 will fail for Gen 3 Intel CPUs

- D3D11 will fail for Gen 4 and Gen 5 Intel CPUs as long as DX11 array textures are used
  (to get these working, the D3D11 hw context would need to be extended to support non-array textures)

- For all newer CPUs: with drivers older than roughly 14-16 months, D3D11 may fail
  (unless support for non-array DX11 textures is implemented)

Note: by "working" or "failing" I mean the full set of hw features, including VPP filters.
Simple decoding or encoding might still work in cases where I wrote "fail".


An additional objection I have concerns the non-deterministic selection between D3D9 and DX11.
The -qsv_device parameter allows specifying the device number/index of the graphics adapter to use.

On Windows, the numbering of graphics adapters is very different between D3D9 and DX11 (=> DXGI).
Roughly speaking, D3D9 counts connected displays while DXGI counts physical devices.

As long as there is no way to specify whether to enforce D3D9 or DX11, it is impossible to know
which adapter ID should be specified when the implementation makes an internal decision between
D3D9 and DX11. In that context, defaulting to DX11 will break applications that specify
D3D9 adapter IDs.

There needs to be deterministic and reliable behavior which can be controlled from the
command line. The proposed method for selecting D3D9 is not sufficient from my point of view,
because the QSV codecs are standalone codecs intended to work without dealing with explicit
hw context creation. Conversely, there should also be a way to explicitly select "DX11-or-fail".

IMO there should be a global command line option to explicitly choose between D3D9 or DX11.
(global, because there's no way to mix the two).


softworkz

