[NUT-devel] Broadcasting nuts [PATCH]

Rich Felker dalias at aerifal.cx
Wed Feb 6 16:51:04 CET 2008


On Wed, Feb 06, 2008 at 04:24:06PM +0100, Michael Niedermayer wrote:
> On Tue, Feb 05, 2008 at 11:55:37PM -0500, Rich Felker wrote:
> > On Tue, Feb 05, 2008 at 09:31:14PM +0100, Michael Niedermayer wrote:
> > > Hi
> > > 
> > > Attached patch should make broadcast of single program nuts possible.
> > 
> > It's already possible. The claim that any of these things are needed
> > is a fallacy.
> [...]
> > > +transmit_ts (t)
> > > +    The timestamp at which the first bit of the syncpoint is transmitted.
> > > +    MUST be less than or equal to all following dts.
> > > +    See broadcast buffering model.
> > 
> > Waste of space and does not do anything useful.
> > 
> > > +
> > > +Broadcast buffering model:
> > > +--------------------------
> > > +Nut files can be broadcast. Such specific nut files must be encoded and
> > 
> > There are not different types of NUT files. NUT is device-independent
> > and the muxing MUST NOT DIFFER depending on the intended use. This is
> > fundamental.
> 
> And they don't, so please return to the technical discussion! Nothing will
> be added to NUT if you can prove that it's unneeded. But you should listen
> to what others say as well. And I do think NUT as it is has a serious
> problem with being broadcast.
> 
> And the only broadcast-specific thing is the transmit_ts, and the spec
> explicitly says that a demuxer in non-broadcast mode must ignore them. So
> your claim of device dependence for an existing use case is not correct.

You misunderstood the word device-dependent. I do not mean that the
files are unplayable by agnostic devices which do not care about the
added nonsense. Rather, I mean that the file is polluted with
device-specific information, information which in the case of
transmit_ts also happens to be totally irrelevant!

> > Unnecessary. Any timestamp works equally well for synchronizing the
> > clock as long as the receiver has an approximately accurate local
> > clock source and measures corrections over long periods.
> 
> No, again not even slightly correct, even with infinite buffers.
> 
> Let me try to give some examples to show the problems.
> First we have a single file with a single (video) stream. This stream is
> variable bitrate.
> The very first constraint here is that the bits from the start until
> time t of the movie must never exceed the channel bandwidth * (t + preload).
> Preload here is currently decided by the demuxer. This is true no matter
> how large the decoder buffers are, as the decoder cannot decode what it
> hasn't received yet.
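
For concreteness, that constraint is just a mux-time check; here is a
sketch in C with made-up names (bandwidth in bytes/sec, times in
seconds; none of this is from the spec). Note that it involves only
payload sizes and the channel rate:

    #include <stddef.h>
    #include <stdint.h>

    /* A mux is transmittable over a channel iff, at every point, the
     * bytes muxed up to time t fit in what the channel can have
     * delivered by t + preload. */
    int fits_channel(const uint64_t *bytes_up_to, const double *t,
                     size_t n, double bandwidth, double preload)
    {
        for (size_t i = 0; i < n; i++)
            if (bytes_up_to[i] > bandwidth * (t[i] + preload))
                return 0;   /* receiver would underrun at t[i] */
        return 1;
    }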

This all has nothing to do with redundant timestamps.

> That's the first unavoidable device dependence

Not unavoidable and not permissible.

> Let's for example assume a decoder has a buffer of size X (that can be
> 50 GB if you want). And let us further assume the muxer/transmitter by
> using magic knows that size and stays within the limit. That is, the
> buffer might be 0-0.1% full for an hour or 99.9-100% full for an hour.
> No problem, but the decoder CANNOT with that detect clock drift, because
> the dts/pts it receives, as well as all other timestamps currently in
> NUT, will differ from its clock by the buffer fullness. transmit_ts can
> easily be used, as it is the true transmit time.

Mark the buffer with actual receive times, according to the local
clock, and correlate them with the timestamps in the NUT stream,
smoothing out any timing adjustments over a long interval that is
still an order of magnitude shorter than the time needed for drift to
become a problem (see the sketch below). This approach does not
require polluting the NUT stream with device- (and transmission-)
dependent data which is irrelevant to the actual media contained.
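
A minimal sketch of what I mean, in C. The one-sample-per-syncpoint
scheme and every name below are mine, not anything in NUT:

    #include <stddef.h>

    /* At each syncpoint, record the local receive time and the stream
     * dts (both in seconds). The least-squares slope of dts against
     * local time estimates the sender/receiver clock rate ratio:
     * 1.0 means no drift, 1.0001 means the sender's clock runs
     * ~100 ppm fast relative to ours. */
    double drift_slope(const double *local, const double *dts, size_t n)
    {
        double sl = 0, sd = 0, sll = 0, sld = 0;
        for (size_t i = 0; i < n; i++) {
            sl  += local[i];
            sd  += dts[i];
            sll += local[i] * local[i];
            sld += local[i] * dts[i];
        }
        /* slope = cov(local, dts) / var(local); with too few or
         * degenerate samples, report no drift rather than divide by 0 */
        double denom = n * sll - sl * sl;
        return denom > 0 ? (n * sld - sl * sd) / denom : 1.0;
    }

The buffer fullness MN worries about shifts only the intercept of that
fit, never the slope, and variation in fullness is just noise which a
long enough sample window averages out. So the "true transmit time"
adds nothing the receiver cannot already measure for itself.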

I think this issue gets to the core of device-[in]dependence: a
device-oriented format contains information irrelevant to the media
itself but needed to control primitive devices. A device-independent
format contains only the 'primary' information and leaves a device
with specific needs to interpret it for itself. Obviously the latter
is more efficient and less strongly tied to the technical requirements
of a particular era's gadgets...

> Now I believe you will agree that without clock synchronization it's just
> a matter of time until the buffer ends up empty and the video/audio
> freezes. Again, the buffer might be infinitely large; it doesn't help.
> You would need atomic clocks or some other means of keeping time accurate.

With ordinary cheap clocks you're fine for at least 5 minutes or so,
which is plenty of time to sample and correct drift.
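
Back of the envelope, with a garden-variety +/-50 ppm crystal (my
figure, but typical of cheap consumer hardware):

    worst-case free-running drift over 5 min:  300 s * 50e-6 = 15 ms

That's under one frame at typical rates, and a tiny fraction of any
sane preload; the drift estimate converges long before the error
could empty a buffer.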

Rich


