[FFmpeg-devel] [Jack-Devel] [PATCH] libavdevice: JACK demuxer
Wed Mar 4 21:31:44 CET 2009
On Wed, Mar 04, 2009 at 05:06:08PM +0100, Michael Niedermayer wrote:
> well, the filter will take the first system time it gets as its
> best estimate
there's no alternative at that time
> and then "add" future times slowly into this to
> compensate. That is it weights the first sample
> very differently than the following ones, this is
> clearly not optimal.
What makes you think that?
> or in other words the noisyness or call it accuracy of its internal state
> will be very poor after the first sample while after a hundred it will
> be better.
Which is what matters.
> The filter though will add samples in IIR fashion while ignoring
It's called exponential averaging, which means recent
samples have more weight than older ones. Without that
a system can't be adaptive.
> its a little like trying to find the average of 100 values and to do this
> a= value
> for(i=0; i<100; i++)
That's a first order filter. The DLL as described is
second order, but otherwise it does indeed behave in
this way, as it should. We are not computing the average
of a fixed set; new input is being added continuously.
> > >>> also the authors of the paper test the filter measuring
> > >>> its jitter, I don't see to what extent this is measuring the
> > >>> quality of the filter,
> > >
> > > What do you mean by 'quality' ?
> the sum of squared errors between the true sampling times and the output
> of the filter.
The error on the input consists of two parts: a systematic
one due to the sample rate not being equal to the nominal
value (w.r.t. the timer used), and a random one (the
jitter). Since the systematic input error is first degree
and the filter is second order, the average error of the
filter must be zero. That's elementary control loop theory.
How much the jitter is reduced depends on the bandwidth.
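The zero-average-error claim can be checked with the final value theorem; a minimal sketch, modelling the loop as a continuous type-2 system with proportional gain $b$ and integral gain $c$ (symbols assumed for illustration, not taken from this thread):

```latex
% Open-loop transfer of a second-order (type-2) loop,
% and the error transfer function:
\[
G(s) = \frac{b s + c}{s^2}, \qquad
E(s) = \frac{R(s)}{1 + G(s)} .
\]
% A constant sample-rate offset is a ramp input, R(s) = k/s^2, so
\[
E(s) = \frac{k/s^2}{1 + (b s + c)/s^2} = \frac{k}{s^2 + b s + c},
\qquad
e_{\infty} = \lim_{s \to 0} s\,E(s) = 0 .
\]
```

The two integrations in $G(s)$ are what absorb the first-degree (ramp) input error; the random jitter is then attenuated according to the loop bandwidth.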
Laboratorio di Acustica ed Elettroacustica
O you who come running so, what do you bring?
War and death!