lavc/vaapi_decode: add missing flag when picking best pixel format

vaapi_decode_find_best_format currently does not set the
VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it
returns.

Without this flag, the attribute will be ignored by vaCreateSurfaces,
meaning that the driver's default logic for picking a pixel format will
kick in.
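
For reference, a minimal libva sketch of the shape an attribute has to take
before vaCreateSurfaces will honour it (not part of this patch; the helper
name, surface size and AYUV fourcc are purely illustrative):

#include <va/va.h>

/* Request a specific render-target fourcc when creating surfaces.
 * Without VA_SURFACE_ATTRIB_SETTABLE in .flags, vaCreateSurfaces
 * ignores the attribute and the driver picks its own format. */
static VAStatus create_surfaces_with_fourcc(VADisplay dpy,
                                            VASurfaceID *surface)
{
    VASurfaceAttrib attrib = {
        .type          = VASurfaceAttribPixelFormat,
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = VA_FOURCC_AYUV,
    };

    return vaCreateSurfaces(dpy, VA_RT_FORMAT_YUV444,
                            1920, 1080,
                            surface, 1,
                            &attrib, 1);
}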

So far, this hasn't produced visible problems, but when trying to
decode 4:4:4 content, at least on Intel, the driver will pick the
444P planar format, even though the decoder can only return the AYUV
packed format.

The hwcontext_vaapi code that sets surface attributes when picking
formats does not have this bug.

Applications may use their own logic for finding the best format, and
so may not hit this bug; e.g., mpv is unaffected.

Author: Philip Langdale
Date:   2022-08-04 20:24:48 -07:00
Commit: 737298b4f7
Parent: 9e029dc265
1 changed file with 2 additions and 0 deletions

@@ -358,6 +358,8 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
     ctx->pixel_format_attribute = (VASurfaceAttrib) {
         .type          = VASurfaceAttribPixelFormat,
+        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
+        .value.type    = VAGenericValueTypeInteger,
         .value.value.i = best_fourcc,
     };