[FFmpeg-devel] [PATCH 3/3] avcodec/bitstream: Consistently treat symbol as VLC_TYPE

Anton Khirnov anton at khirnov.net
Tue Oct 27 13:35:02 EET 2020


Quoting Andreas Rheinhardt (2020-10-27 11:30:32)
> Andreas Rheinhardt:
> > If a static VLC table gets initialized a second time (or concurrently by
> > two threads) and if said VLC table uses symbols that have the sign bit
> > of VLC_TYPE (a typedef for int16_t) set, initializing the VLC fails. The
> > reason is that the symbol in the temporary array has type uint16_t, so
> > comparing it to the symbol read from the VLC table fails, because only
> > the lower 16 bits coincide. Said failure triggers an assert.
> > 
> > Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt at gmail.com>
> > ---
> > Found when playing around a bit with making the ClearVideo VLCs static.
> > 
> >  libavcodec/bitstream.c | 2 +-
> >  1 file changed, 1 insertion(+), 1 deletion(-)
> > 
> > diff --git a/libavcodec/bitstream.c b/libavcodec/bitstream.c
> > index 39749c6092..a908c10980 100644
> > --- a/libavcodec/bitstream.c
> > +++ b/libavcodec/bitstream.c
> > @@ -129,7 +129,7 @@ static int alloc_table(VLC *vlc, int size, int use_static)
> >  
> >  typedef struct VLCcode {
> >      uint8_t bits;
> > -    uint16_t symbol;
> > +    VLC_TYPE symbol;
> >      /** codeword, with the first bit-to-be-read in the msb
> >       * (even if intended for a little-endian bitstream reader) */
> >      uint32_t code;
> > 
> LGTM'ed by Lynne via email and pushed.

This is the second time I've seen this; why does that not go through the ML?

-- 
Anton Khirnov