[MPlayer-dev-eng] [BUG] [PATCH] Ogg/Theora frametime broken on 0-length packets
Reimar Döffinger
Reimar.Doeffinger at gmx.de
Sat Apr 30 15:08:39 CEST 2011
On Sat, Apr 30, 2011 at 02:34:20PM +0200, David Kuehling wrote:
> I noticed that a/v sync drifts in Theora files that contain 0-length
> video demux packets (i.e. frames where the image doesn't change at all)
> when played back with -demuxer ogg. The libavformat demuxer seems to
> work correctly; however, it leaks so much memory that it won't
> work on the target platform (NanoNote: 32 MB RAM, no swap).
Please provide a better bug report so we can fix this; -demuxer ogg
is unlikely to keep working forever.
> Looking closer, I found that ds_get_next_pts() returns the *current* pts
> when the last packet read has zero length. This is due to the check
> of 'ds->buffer_pos' to see whether data has already been read from the
> *current* packet. Of course, ds->buffer_pos stays at 0 when a
> zero-length packet was read, so that check fails.
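To make the failure mode concrete, here is a self-contained paraphrase
of that logic (the field names mirror MPlayer's demux_stream_t and
demux_packet_t, but this is a sketch, not the verbatim demuxer.c code):

/* Paraphrased sketch of the problematic ds_get_next_pts() logic. */
#include <stdio.h>

#define MP_NOPTS_VALUE (-1e300)

typedef struct demux_packet {
    int len;
    double pts;
    struct demux_packet *next;
} demux_packet_t;

typedef struct {
    int buffer_pos;          /* read offset inside the current packet */
    demux_packet_t *current; /* packet currently being consumed */
    demux_packet_t *first;   /* head of the queue of pending packets */
} demux_stream_t;

static double ds_get_next_pts(demux_stream_t *ds)
{
    /* buffer_pos == 0 is taken to mean "nothing has been read from the
     * current packet yet", so its own pts is reported as the *next*
     * pts.  After a zero-length packet was read, buffer_pos is still
     * 0, so the *current* pts is returned instead of the next one. */
    if (ds->buffer_pos == 0 && ds->current)
        return ds->current->pts;
    return ds->first ? ds->first->pts : MP_NOPTS_VALUE;
}

int main(void)
{
    demux_packet_t next = { 1000, 2.0, NULL };  /* next queued packet */
    demux_packet_t cur  = { 0,    1.0, &next }; /* 0-length, "read"   */
    demux_stream_t ds   = { 0, &cur, &next };

    /* Prints 1.0 (the current pts) instead of the expected 2.0. */
    printf("next pts: %f\n", ds_get_next_pts(&ds));
    return 0;
}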
There are multiple possible solutions, the most correct _probably_ being
to stop the Ogg demuxer from outputting 0-size packets (a rough sketch
follows below).
Though this also might be an issue of the decoder behaving incorrectly.
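For the demuxer-side option, something along these lines might be
enough (untested sketch: new_demux_packet() and ds_add_packet() are the
existing helpers from demuxer.h, but queue_theora_packet() and the
exact spot in demux_ogg.c where this would hook in are my assumptions):

#include <string.h>
#include <ogg/ogg.h>  /* ogg_packet */
#include "demuxer.h"  /* demux_stream_t, new_demux_packet(), ... */

/* Drop zero-length Theora packets instead of queuing empty demux
 * packets.  A 0-byte packet just means "repeat the previous frame",
 * so nothing needs to reach the decoder. */
static int queue_theora_packet(demux_stream_t *ds, ogg_packet *op,
                               double pts)
{
    demux_packet_t *dp;

    if (op->bytes == 0)
        return 0;  /* skipping keeps ds_get_next_pts() consistent */

    dp = new_demux_packet(op->bytes);
    memcpy(dp->buffer, op->packet, op->bytes);
    dp->pts = pts;
    ds_add_packet(ds, dp);
    return 1;
}

The alternative would be to fix ds_get_next_pts() itself, e.g. by
tracking whether the current packet was fully consumed instead of
relying on buffer_pos alone.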
Are you sure it is not related to the fact that you end up using
libtheora with -demuxer ogg but fftheora with -demuxer lavf?
Forcing the same decoder in both cases (e.g. via -vc) would rule
that out.
But a sample file to test and verify multiple approaches would be most useful.