libavcodec/vaapi_decode: fix the problem that init_pool_size < nb_surface

For VAAPI, if init_pool_size is not zero the pool size is fixed, which
means the maximum number of surfaces is init_pool_size. However, when
mapping a VAAPI frame to a QSV frame, init_pool_size < nb_surface. The
cause is that vaapi_decode_make_config() configures init_pool_size and
is called twice: the first time to initialize the frames context and
the second time to initialize the codec. On the second call
init_pool_size is reset to its original value, so it ends up smaller
than the real pool size, because the pool size used to initialize the
frames context has thread_count plus 3 added to it (to guarantee 4
base work surfaces). Add code to make sure init_pool_size is set only
once. Now the following command line works:

ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
-hwaccel_output_format vaapi -i input.264 \
-vf "hwmap=derive_device=qsv,format=qsv" \
-c:v h264_qsv output.264

Signed-off-by: Wenbin Chen <wenbin.chen@intel.com>
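
For illustration, here is a minimal standalone sketch of the failure mode
described above. FramesCtx, make_config() and the surface counts are
hypothetical stand-ins, not the real FFmpeg structures or values:

#include <stdio.h>

/* Illustrative stand-in for a hardware frames context; only the field
 * that matters for this bug is modelled. */
typedef struct FramesCtx {
    int initial_pool_size;
} FramesCtx;

/* Sketch of a make_config()-style helper: it only sizes the pool when
 * a frames context is actually passed in. */
static void make_config(FramesCtx *frames, int codec_surfaces)
{
    if (frames)
        frames->initial_pool_size = codec_surfaces;
}

int main(void)
{
    FramesCtx frames = { 0 };
    int thread_count = 8;

    /* First call, made while the frames context is being set up. */
    make_config(&frames, 16);

    /* Generic decode setup then enlarges the pool by thread_count
     * plus 3, as described in the commit message above. */
    frames.initial_pool_size += thread_count + 3;
    printf("real pool size:      %d\n", frames.initial_pool_size); /* 27 */

    /* Old behaviour: the frames context was passed again during codec
     * init, shrinking init_pool_size back below the real pool size. */
    make_config(&frames, 16);
    printf("clobbered pool size: %d\n", frames.initial_pool_size); /* 16 */

    /* Fixed behaviour: passing NULL on the second call leaves the
     * enlarged value in place. */
    make_config(NULL, 16);

    return 0;
}

Compiled and run, the first print shows the enlarged pool size and the
second shows how the repeated call shrinks it back to the per-codec value.
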
Author: Wenbin Chen, 2022-01-11 14:55:37 +08:00; committed by Haihao Xiang
parent 35080149ef
commit a5b1e632c9
1 changed file with 1 addition and 1 deletion


@@ -650,7 +650,7 @@ int ff_vaapi_decode_init(AVCodecContext *avctx)
     ctx->hwctx = ctx->device->hwctx;
     err = vaapi_decode_make_config(avctx, ctx->frames->device_ref,
-                                   &ctx->va_config, avctx->hw_frames_ctx);
+                                   &ctx->va_config, NULL);
     if (err)
         goto fail;
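
With this change, the second call, made from ff_vaapi_decode_init() after
the frames context already exists, passes NULL instead of
avctx->hw_frames_ctx, so vaapi_decode_make_config() no longer touches
init_pool_size there; the pool is sized only once, while the frames
context itself is being created.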