[FFmpeg-cvslog] lavc/vaapi_decode: add missing flag when picking best pixel format

Philip Langdale git at videolan.org
Tue Aug 9 19:23:45 EEST 2022


ffmpeg | branch: master | Philip Langdale <philipl at overt.org> | Thu Aug  4 20:24:48 2022 -0700| [737298b4f7b60bc2b755fe8fa9135f50a496d94d] | committer: Philip Langdale

lavc/vaapi_decode: add missing flag when picking best pixel format

vaapi_decode_find_best_format currently does not set the
VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it
returns.

Without this flag, the attribute will be ignored by vaCreateSurfaces,
meaning that the driver's default logic for picking a pixel format will
kick in.
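
As an illustration of the mechanism (not part of the commit), here is a
minimal standalone sketch of passing a pixel format attribute to
vaCreateSurfaces; the helper name is hypothetical, and AYUV/4:4:4 are
simply the formats from the example below:

    #include <va/va.h>

    /* Request a specific fourcc when creating a surface. If .flags
     * lacks VA_SURFACE_ATTRIB_SETTABLE, vaCreateSurfaces ignores the
     * attribute and the driver's default format selection applies. */
    static VAStatus create_ayuv_surface(VADisplay dpy, unsigned int width,
                                        unsigned int height, VASurfaceID *out)
    {
        VASurfaceAttrib attr = {
            .type          = VASurfaceAttribPixelFormat,
            .flags         = VA_SURFACE_ATTRIB_SETTABLE, /* the flag this commit adds */
            .value.type    = VAGenericValueTypeInteger,
            .value.value.i = VA_FOURCC_AYUV,
        };

        return vaCreateSurfaces(dpy, VA_RT_FORMAT_YUV444, width, height,
                                out, 1, &attr, 1);
    }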

So far, this hasn't produced visible problems, but when trying to
decode 4:4:4 content, at least on Intel, the driver will pick the
444P planar format, even though the decoder can only return the AYUV
packed format.

The hwcontext_vaapi code that sets surface attributes when picking
formats does not have this bug.

Applications may use their own logic for finding the best format, and
so may not hit this bug; e.g., mpv is unaffected. (A sketch of that
approach follows below.)
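
For reference, a rough sketch of how an application might pick a format
itself by enumerating what an existing VAConfigID supports via
vaQuerySurfaceAttributes; the helper name and the take-the-first policy
are illustrative only (this is not how mpv actually decides), and error
checking on the libva calls is elided:

    #include <stdint.h>
    #include <stdlib.h>
    #include <va/va.h>

    /* Enumerate the pixel formats supported by a config and return the
     * first one offered. Drivers report one VASurfaceAttribPixelFormat
     * entry per supported fourcc. */
    static int pick_fourcc(VADisplay dpy, VAConfigID config, uint32_t *fourcc)
    {
        unsigned int count = 0;
        VASurfaceAttrib *attrs;
        int ret = -1;

        /* A first call with a NULL list reports the attribute count. */
        vaQuerySurfaceAttributes(dpy, config, NULL, &count);

        attrs = malloc(count * sizeof(*attrs));
        if (!attrs)
            return -1;
        vaQuerySurfaceAttributes(dpy, config, attrs, &count);

        for (unsigned int i = 0; i < count; i++) {
            if (attrs[i].type == VASurfaceAttribPixelFormat) {
                *fourcc = attrs[i].value.value.i;
                ret = 0;
                break;
            }
        }

        free(attrs);
        return ret;
    }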

> http://git.videolan.org/gitweb.cgi/ffmpeg.git/?a=commit;h=737298b4f7b60bc2b755fe8fa9135f50a496d94d
---

 libavcodec/vaapi_decode.c | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/libavcodec/vaapi_decode.c b/libavcodec/vaapi_decode.c
index db48efc3ed..bc2d3ed803 100644
--- a/libavcodec/vaapi_decode.c
+++ b/libavcodec/vaapi_decode.c
@@ -358,6 +358,8 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
 
         ctx->pixel_format_attribute = (VASurfaceAttrib) {
             .type          = VASurfaceAttribPixelFormat,
+            .flags         = VA_SURFACE_ATTRIB_SETTABLE,
+            .value.type    = VAGenericValueTypeInteger,
             .value.value.i = best_fourcc,
         };
 


