[FFmpeg-devel] [PATCH] Print invalid picture dimensions as signed values.
Ronald S. Bultje
Mon May 10 17:54:15 CEST 2010
On Mon, May 10, 2010 at 11:49 AM, Benoit Fouet <benoit.fouet at free.fr> wrote:
> Index: libavcodec/utils.c
> --- libavcodec/utils.c  (revision 23079)
> +++ libavcodec/utils.c  (working copy)
> @@ -223,7 +223,7 @@ int avcodec_check_dimensions(void *av_lo
>      if((int)w>0 && (int)h>0 && (w+128)*(uint64_t)(h+128) < INT_MAX/8)
>          return 0;
> -    av_log(av_log_ctx, AV_LOG_ERROR, "picture size invalid (%ux%u)\n", w, h);
> +    av_log(av_log_ctx, AV_LOG_ERROR, "picture size invalid (%dx%d)\n", (int)w, (int)h);
>      return AVERROR(EINVAL);
I'm probably starting an enormous bikeshed here, but these values tend
to come from bitstreams. If an AVI has a 32-bit width value of
0xFFFF0000, it's hard to say what was meant. Did it mean -0x10000,
i.e. roughly -65k? Or did it really mean ~4 billion?
More likely, the file is buggy, we went wrong somewhere earlier in
parsing, and we're just reading this bogus value as a result.
Bit/bytestream values like these aren't inherently signed or unsigned;
they are just bit/bytestream values. Let's not treat them as anything
other than what they are... As such, %u should be fine.