avcodec/ac3enc: Use actual size of buffer in init_put_bits()

Since the very beginning (i.e. since de6d9b6404),
the AC-3 encoder has used AC3_MAX_CODED_FRAME_SIZE (namely 3840) as the
size of the output buffer, without any check at all.
This causes problems when encoding EAC-3, for which this maximum is too
small, i.e. smaller than the actual size of the buffer: one can run into
the asserts used by the PutBits API. Ticket #8513 is about such a case,
and this commit fixes it by using the real size of the buffer.
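
For illustration, a minimal standalone sketch of the failure mode,
assuming a toy bit writer that, like the PutBits API, asserts it never
writes past the size it was initialized with; all names below are
hypothetical, not FFmpeg's:

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    typedef struct BitWriter {
        uint8_t *buf;
        size_t   size;   /* declared capacity in bytes */
        size_t   pos;    /* bytes written so far */
    } BitWriter;

    static void bw_init(BitWriter *w, uint8_t *buf, size_t size)
    {
        w->buf  = buf;
        w->size = size;
        w->pos  = 0;
    }

    static void bw_put_byte(BitWriter *w, uint8_t b)
    {
        /* the kind of bound check that fires in the PutBits asserts */
        assert(w->pos < w->size);
        w->buf[w->pos++] = b;
    }

    int main(void)
    {
        uint8_t frame[4096];                /* hypothetical EAC-3 frame */
        size_t  frame_size = sizeof(frame); /* larger than the 3840 cap */
        BitWriter w;

        /* Buggy pattern: bw_init(&w, frame, 3840); writing frame_size
         * bytes would then abort on the assert once pos reaches 3840,
         * even though the buffer itself is large enough. */
        bw_init(&w, frame, frame_size);     /* the fix: pass the real size */
        for (size_t i = 0; i < frame_size; i++)
            bw_put_byte(&w, 0x00);
        printf("wrote %zu bytes within the declared bound\n", w.pos);
        return 0;
    }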

Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>
Andreas Rheinhardt 2021-03-29 18:19:43 +02:00
parent 3d97a0061c
commit 968c158abd
2 files changed, 1 insertion(+), 2 deletions(-)

libavcodec/ac3.h

@@ -27,7 +27,6 @@
 #ifndef AVCODEC_AC3_H
 #define AVCODEC_AC3_H
-#define AC3_MAX_CODED_FRAME_SIZE 3840 /* in bytes */
 #define EAC3_MAX_CHANNELS 16 /**< maximum number of channels in EAC3 */
 #define AC3_MAX_CHANNELS 7 /**< maximum number of channels, including coupling channel */
 #define CPL_CH 0 /**< coupling channel index */

libavcodec/ac3enc.c

@@ -1729,7 +1729,7 @@ static void ac3_output_frame(AC3EncodeContext *s, unsigned char *frame)
 {
     int blk;
-    init_put_bits(&s->pb, frame, AC3_MAX_CODED_FRAME_SIZE);
+    init_put_bits(&s->pb, frame, s->frame_size);
     s->output_frame_header(s);
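
For context, a sketch of the PutBits calling pattern involved:
init_put_bits(), put_bits() and flush_put_bits() are the real FFmpeg
names, but the helper below and the values written are illustrative
only. The size passed to init_put_bits() is the bound that the PutBits
asserts enforce, which is why it must be the real size of the buffer
rather than a format-specific maximum:

    #include "libavcodec/put_bits.h"

    /* hypothetical helper, not part of this commit */
    static void write_sync(uint8_t *buf, int buf_size)
    {
        PutBitContext pb;
        init_put_bits(&pb, buf, buf_size); /* declared bound = real size */
        put_bits(&pb, 16, 0x0B77);         /* AC-3/EAC-3 sync word */
        /* ... remaining bitstream syntax ... */
        flush_put_bits(&pb);               /* flush buffered bits to memory */
    }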