[FFmpeg-devel] [PATCH 1/2] lavc/vaapi_decode: add missing flag when picking best pixel format
Wang, Fei W
fei.w.wang at intel.com
Fri Aug 5 08:16:19 EEST 2022
On Thu, 2022-08-04 at 20:59 -0700, Philip Langdale wrote:
> vaapi_decode_find_best_format currently does not set the
> VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it
> returns.
>
> Without this flag, the attribute will be ignored by vaCreateSurfaces,
> meaning that the driver's default logic for picking a pixel format
> will kick in.
>
> So far, this hasn't produced visible problems, but when trying to
> decode 4:4:4 content, at least on Intel, the driver will pick the
> 444P planar format, even though the decoder can only return the AYUV
> packed format.
>
> The hwcontext_vaapi code that sets surface attributes when picking
> formats does not have this bug.
>
> Applications may use their own logic for finding the best format, and
> so may not hit this bug; e.g., mpv is unaffected.
>
> Signed-off-by: Philip Langdale <philipl at overt.org>
> ---
> libavcodec/vaapi_decode.c | 1 +
> 1 file changed, 1 insertion(+)
>
> diff --git a/libavcodec/vaapi_decode.c b/libavcodec/vaapi_decode.c
> index db48efc3ed..38813eb8e4 100644
> --- a/libavcodec/vaapi_decode.c
> +++ b/libavcodec/vaapi_decode.c
> @@ -358,6 +358,7 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
> 
>      ctx->pixel_format_attribute = (VASurfaceAttrib) {
>          .type = VASurfaceAttribPixelFormat,
> +        .flags = VA_SURFACE_ATTRIB_SETTABLE,
Better to also fill .value.type with VAGenericValueTypeInteger at the same time:
https://github.com/intel/media-driver/blob/4c95e8ef1e98cac661412d02f108e4e1c94d3556/media_driver/linux/common/ddi/media_libva.cpp#L2780
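
That is, the attribute would be filled in completely, e.g. (just a sketch against this patch, not tested; field names follow libva's va.h):

    ctx->pixel_format_attribute = (VASurfaceAttrib) {
        .type          = VASurfaceAttribPixelFormat,
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = best_fourcc,
    };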
Thanks
Fei
>          .value.value.i = best_fourcc,
>      };
>
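
For background on why the flag matters: as the commit message notes, vaCreateSurfaces() ignores attributes that are not marked settable. A minimal standalone sketch of requesting packed AYUV surfaces directly from libva (the helper name and parameters below are made up for illustration and are not FFmpeg code):

    #include <va/va.h>

    static VASurfaceID create_ayuv_surface(VADisplay display, int width, int height)
    {
        /* Request packed AYUV explicitly; without VA_SURFACE_ATTRIB_SETTABLE
         * the attribute is ignored and the driver falls back to its default
         * pixel format choice (e.g. planar 444P on Intel). */
        VASurfaceAttrib attrib = {
            .type          = VASurfaceAttribPixelFormat,
            .flags         = VA_SURFACE_ATTRIB_SETTABLE,
            .value.type    = VAGenericValueTypeInteger,
            .value.value.i = VA_FOURCC_AYUV,
        };
        VASurfaceID surface = VA_INVALID_SURFACE;

        vaCreateSurfaces(display, VA_RT_FORMAT_YUV444, width, height,
                         &surface, 1, &attrib, 1);
        return surface;
    }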