avformat/rtpdec_jpeg: fix low contrast image on low quality setting

Original mail and my own followup on ffmpeg-user earlier today:

I have a device sending out a MJPEG/RTP stream on a low quality setting.
Decoding and displaying the video with libavformat results in a washed
out, low contrast, greyish image. Playing the same stream with VLC results
in proper color representation.

Screenshots for comparison:

  http://zevv.nl/div/libav/shot-ffplay.jpg
  http://zevv.nl/div/libav/shot-vlc.jpg

A pcap capture of a few seconds of video and SDP file for playing the
stream are available at

  http://zevv.nl/div/libav/mjpeg.pcap
  http://zevv.nl/div/libav/mjpeg.sdp

I believe the problem might be in the calculation of the quantization
tables in the function create_default_qtables(), the attached patch
solves the issue for me.

The problem is that the argument 'q' is of the type uint8_t. According to the
JPEG standard, if 1 <= q <= 50, the scale factor 'S' should be 5000 / q.
Because create_default_qtables() reuses the variable 'q' to store the
result of this calculation, for small values of q (q <= 19, where 5000 / q
exceeds 255), q will subsequently overflow and give wrong results in the
calculated quantization tables. The patch below uses a new variable 'S'
(same name as in RFC 2435) with the proper range to store the result of
the division.

Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Ico Doornekamp 2016-03-24 14:31:38 +01:00 committed by Michael Niedermayer
parent 72e1360007
commit e3e6a2cff4
1 changed file with 4 additions and 3 deletions

@@ -193,16 +193,17 @@ static void create_default_qtables(uint8_t *qtables, uint8_t q)
 {
     int factor = q;
     int i;
+    uint16_t S;

     factor = av_clip(q, 1, 99);
     if (q < 50)
-        q = 5000 / factor;
+        S = 5000 / factor;
     else
-        q = 200 - factor * 2;
+        S = 200 - factor * 2;

     for (i = 0; i < 128; i++) {
-        int val = (default_quantizers[i] * q + 50) / 100;
+        int val = (default_quantizers[i] * S + 50) / 100;

         /* Limit the quantizers to 1 <= q <= 255. */
         val = av_clip(val, 1, 255);