[NUT-devel] Broadcasting nuts [PATCH]
Rich Felker
dalias at aerifal.cx
Thu Feb 7 05:27:43 CET 2008
On Thu, Feb 07, 2008 at 03:17:02AM +0100, Michael Niedermayer wrote:
> > > It really is a 0 vs. 11 second difference for
> > > the receiver after you switch it on or tune to the channel at the start.
> >
> > You'll have to clarify perhaps with an example because you're not
> > making sense to me.
>
> I am not talking about live streams, just a normal TV movie being broadcast,
> for example. Here the 99-byte frames can be transmitted faster than
> realtime, so when the high-bitrate 200-byte frames are reached, they are
> already available to the decoder a few seconds before they are needed.
>
> Of course, if you don't start watching at the start but at some other point,
> then some larger preload might be needed.
For all practical purposes, the viewer ALWAYS starts watching at some
point other than the "beginning", which hardly even exists except at
some distant point in history. _Broadcast_ means a unidirectional
medium where all 'clients' receive the same thing, whatever is
currently 'on the air'.
Thus, I see the whole issue as unworthy of consideration. For genuine
broadcast purposes, the buffering constraints must be such that
clients can pick up and immediately start playing at any point in the
broadcast. Transmitting low-bitrate parts faster than realtime, except
over very short intervals, serves no practical purpose in such an
application, because it creates long intervals of time during which
it's impossible to start watching the broadcast.
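To put rough numbers on it (the frame rate and channel rate here are
invented purely for illustration): at 1 frame per second over a 150
byte/sec channel, 1000 frames of 99 bytes take 660 seconds to send, so
a receiver tuned in since the very first frame banks about 340 seconds
of lead. The following 1000 frames of 200 bytes then need about 1333
seconds on the wire for 1000 seconds of content. A viewer who tunes in
right where the high-bitrate run begins has banked nothing and faces
roughly a 333-second preload before playback can proceed without
underruns; for several minutes it is effectively impossible to start
watching.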
> "your design B." here was the case that the transmitter was not allowed to
> transmit the first 1000 99bte frames at 100byte/sec but was limited to
> 99bytes/sec. This causes a larger preload requirement at the start. And
2 issues here:
1. I never said you're not allowed to transmit the first 1000 99-byte
frames at 100 bytes/sec, only that the average difference between dts
and transmit time over VERY LARGE windows needs to be constant. Here,
very large means much larger than the buffer size but much smaller
than the time scale on which the clock error grows too large. Unless
your buffering (and worst-case preload time) is extremely huge or your
clock is extremely bad, there will always be a HUGE (several orders of
magnitude) margin of choice for the window size between these two
bounds.
2. Even if you want to compare the version that cannot transmit
anything faster than realtime (which is more restrictive than my
proposal) to the version that can, the worst-case preload requirement
is the same either way. That worst-case preload is what makes this
stream entirely unusable for broadcast purposes.
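As a small, self-contained illustration of that (only the 99/200-byte
frame sizes come from this thread; the channel rate and frame duration
are invented): the worst-case preload is the largest deficit between
bytes received and content played over all possible tune-in points,
and since nothing transmitted before the tune-in point enters that
calculation, sending the earlier low-bitrate frames faster than
realtime cannot reduce it.

/* Illustrative only; the 99/200-byte frame sizes are from this thread,
   everything else (channel rate, 1 frame per second) is invented. */
#include <stdio.h>

#define NFRAMES 2000

int main(void)
{
    double channel_rate   = 150.0;  /* bytes per second, invented */
    double frame_duration = 1.0;    /* seconds per frame, invented */
    static double size[NFRAMES];
    double worst = 0.0;
    int i, start;

    for (i = 0; i < NFRAMES; i++)
        size[i] = i < 1000 ? 99.0 : 200.0;

    /* For every possible tune-in point, track how far reception falls
       behind playback; the maximum over all tune-in points is the
       preload a receiver must budget for.  Nothing transmitted before
       `start' enters the calculation, so sending the earlier
       low-bitrate frames faster than realtime cannot change it. */
    for (start = 0; start < NFRAMES; start++) {
        double bytes = 0.0, played = 0.0, deficit = 0.0;
        for (i = start; i < NFRAMES; i++) {
            bytes  += size[i];
            played += frame_duration;
            if (bytes / channel_rate - played > deficit)
                deficit = bytes / channel_rate - played;
        }
        if (deficit > worst)
            worst = deficit;
    }
    printf("worst-case preload: about %.0f seconds\n", worst);
    return 0;
}

With these numbers it reports a worst case of about 333 seconds, hit by
tuning in right where the 200-byte frames begin.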
> without the transmission limit my proof of the impossibility of syncing clocks
> would apply.
> And honestly I don't think the clocks can be synchronized reliably even with
> some simple limits.
My proof of the ability to sync clocks always works as long as the
window for mean-difference sampling is significantly larger than the
receive buffer. Do I need to write it out formally?
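As a rough sketch of the idea (all names and numbers below are invented
for illustration and are not part of NUT): suppose the muxer keeps the
mean of (transmit time - dts), taken over any window much larger than
the buffer, effectively constant. The receiver's measured mean of
(local arrival time - dts) over such a window is then that constant
plus the current offset between the two clocks, so repeated
measurements track the offset as it drifts, and the receiver can
discipline its playback clock long before the drift eats into its
buffer margin. The window only has to be much longer than the buffer,
so that per-frame buffering jitter averages out, and much shorter than
the time it takes the clock error to grow comparable to that margin;
that is the huge margin of choice mentioned above.

/* Illustrative sketch only, not NUT API: track the receiver clock
   against the stream's dts timeline via the mean of (arrival - dts)
   over windows much larger than the receive buffer. */
#include <stdio.h>
#include <stdlib.h>

struct sample {
    double arrival;  /* local wall-clock arrival time, seconds */
    double dts;      /* the frame's dts, seconds */
};

/* Mean of (arrival - dts) over one window.  Spanning much more time
   than the buffer lets the per-frame buffering jitter average out, so
   the result tracks the clock offset. */
static double mean_offset(const struct sample *s, int n)
{
    double sum = 0.0;
    int i;
    for (i = 0; i < n; i++)
        sum += s[i].arrival - s[i].dts;
    return n ? sum / n : 0.0;
}

int main(void)
{
    /* Synthetic stream, all numbers invented: true offset 42 s, a
       deliberately bad receiver clock running 500 ppm fast, buffering
       jitter uniform in [0, 5] seconds, one frame per second. */
    enum { N = 4000, W = 1000, GAP = 3000 };
    static struct sample s[N];
    double o1, o2;
    int i;

    for (i = 0; i < N; i++) {
        double jitter = 5.0 * rand() / (double)RAND_MAX;
        s[i].dts     = i;
        s[i].arrival = 42.0 + i * 1.0005 + jitter;
    }

    /* Each window mean sits near the true offset (plus the mean
       buffering delay); comparing two windows GAP seconds apart
       recovers roughly the 500 ppm rate error despite the jitter. */
    o1 = mean_offset(s, W);
    o2 = mean_offset(s + GAP, W);
    printf("offset ~ %.1f s, rate error ~ %.0f ppm\n",
           o1, (o2 - o1) / GAP * 1e6);
    return 0;
}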
> > > The main/stream headers are just needed once; if I switch channels around,
> > > I already have them after a few minutes for all channels, and won't want to
> > > wait for them ...
> >
> > After a few minutes?
>
> You misunderstood me. I meant that if I cycle through all channels trying to
> find something that isn't total trash (which might take a minute), I afterwards
> have all the headers and wouldn't want to wait at all for them. That is, I would
> benefit from more frequent preload/transmit_ts than just after the headers.
I think this argument is weak. Any broadcast scenario where there's a
significant delay the first time you tune to a new channel has
essentially zero chance of being market-viable. The headers will have
to be repeated at least once every 2-3 seconds and probably 2-3 times
per second...
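For scale (the sizes here are invented): if the main plus stream
headers total on the order of 300 bytes, repeating them twice per
second costs about 600 bytes/sec, i.e. under 5 kbit/sec, which is
roughly 1% of even a modest 500 kbit/sec broadcast stream. That
overhead is negligible next to the cost of making channel-surfing
unusable.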
Rich