[FFmpeg-devel] [GSoC] Motion Interpolation
Michael Niedermayer
michael at niedermayer.cc
Fri Jun 17 23:45:51 CEST 2016
On Fri, Jun 17, 2016 at 08:19:00AM +0000, Davinder Singh wrote:
> On Wed, Jun 15, 2016 at 5:04 PM Michael Niedermayer <michael at niedermayer.cc>
> wrote:
>
> > Hi
> >
> > On Tue, May 31, 2016 at 10:43:38PM +0000, Davinder Singh wrote:
> > > There’s a lot of research done on Motion Estimation. Depending upon the
> > > intended application of the resultant motion vectors, the method used for
> > > motion estimation can be very different.
> > >
> > > Classification of Motion Estimation Methods:
> > >
> > > Direct Methods: In direct methods we calculate optical flow
> > > <https://en.wikipedia.org/wiki/Optical_flow> in the scene.
> > >
> > > - Phase Correlation
> > >
> > > - Block Matching
> > >
> > > - Spatio-Temporal Gradient
> > >
> > > - Optical flow: Uses the optical flow equation to find motion in the scene.
> > >
> > > - Pel-recursive: Also computes optical flow, but in a way that allows
> > > recursive computation over the vector field.
> > >
> > > Indirect Methods
> > >
> > > - Feature-based Method: Finds features in the frame and uses them for
> > > estimation.
> > >
> > > Here are some papers on Frame Rate Up-Conversion (FRUC):
> > >
> > > Phase Correlation:
> > >
> > > This method relies on the frequency-domain representation of the data,
> > > calculated using the fast Fourier transform
> > > <https://en.wikipedia.org/wiki/Fast_Fourier_transform>. Phase Correlation
> > > provides a correlation surface from the comparison of images. This
> > enables
> > > the identification of motion on a pixel-by-pixel basis for correct
> > > processing of each motion type. Since phase correlation operates in the
> > > frequency rather than the spatial domain, it is able to zero in on
> > details
> > > while ignoring such factors as noise and grain within the picture. In
> > other
> > > words, the system is highly tolerant of the noise variations and rapid
> > > changes in luminance levels that are found in many types of content –
> > > resulting in high-quality performance on fades, objects moving in and out
> > > of shade, and light flashes.
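
As an aside, the core step of phase correlation is quite small. A rough
sketch in C, where fft1/fft2 are assumed to already hold the 2-D FFTs of the
two blocks and n is the number of complex bins (all names here are
illustrative only, not from any existing filter):

#include <math.h>

typedef struct { float re, im; } cplx;

/* Normalized cross-power spectrum of two already-transformed blocks:
 * R = F1 * conj(F2) / |F1 * conj(F2)|.  Taking the inverse FFT of out[]
 * and locating its peak gives the dominant translation between the blocks. */
static void cross_power_spectrum(const cplx *fft1, const cplx *fft2,
                                 cplx *out, int n)
{
    for (int i = 0; i < n; i++) {
        float re  = fft1[i].re * fft2[i].re + fft1[i].im * fft2[i].im;
        float im  = fft1[i].im * fft2[i].re - fft1[i].re * fft2[i].im;
        float mag = hypotf(re, im);
        out[i].re = mag > 1e-9f ? re / mag : 0.0f;  /* keep only the phase */
        out[i].im = mag > 1e-9f ? im / mag : 0.0f;
    }
}

The magnitude normalization is what makes the method insensitive to changes
in luminance, which is the noise/fade tolerance described above.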
> > >
> > > Papers:
> > >
> > > [1] "Disney Research » Phase-Based Frame Interpolation for Video." IEEE
> > > CVPR 2015 <https://www.disneyresearch.com/publication/phasebased/>
> > >
> > > [2] Yoo, DongGon et al. "Phase Correlated Bilateral Motion Estimation for
> > > Frame Rate Up-Conversion." The 23rd International Technical Conference on
> > > Circuits/Systems, Computers and Communications (ITC-CSCC), Jul. 2008.
> > >
> > > <http://www.ieice.org/proceedings/ITC-CSCC2008/pdf/p385_G3-4.pdf>
> > >
> > > The video on the page of paper [1] demonstrates a comparison between
> > > various methods.
> > >
> > > Optical Flow:
> > >
> > > http://www.cs.toronto.edu/~fleet/research/Papers/flowChapter05.pdf
> > >
> > > [3] Brox et al. "High accuracy optical flow estimation based on a theory
> > > for warping." Computer Vision - ECCV 2004: 25-36.
> > >
> > > <
> > >
> > http://www.wisdom.weizmann.ac.il/~/vision/courses/2006_2/papers/optic_flow_multigrid/brox_eccv04_of.pdf
> > > >
> > >
> > > The open-source project Slowmovideo <http://slowmovideo.granjow.net/> is
> > > based on the optical flow equation.
> > >
> > > The algorithm we can implement is based on the block matching method.
> > >
> > > Motion Compensated Frame Interpolation
> > >
> > > Paper:
> > >
> > > [4] Zhai et al. "A low complexity motion compensated frame interpolation
> > > method." IEEE ISCAS 2005: 4927-4930.
> > >
> > > <http://research.microsoft.com/pubs/69174/lowcomplexitymc.pdf>
> > >
> > > Block-based motion estimation and pixel-wise motion estimation are the
> > two
> > > main categories of motion estimation methods. In general, pixel-wise
> > motion
> > > estimation can attain accurate motion fields, but needs a substantial
> > > amount of computation. In contrast, block matching algorithms (BMA) can
> > be
> > > efficiently implemented and provide good performance.
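
To make that cost trade-off concrete, a minimal exhaustive SAD block match
might look like the sketch below; mb_size, search_range and the plane
pointers are placeholder names, and edge clamping is omitted for brevity:

#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

/* Exhaustive search: try every displacement in a (2*search_range+1)^2 window
 * and keep the one with the lowest sum of absolute differences (SAD). */
static void block_match_sad(const uint8_t *cur, const uint8_t *ref, int stride,
                            int bx, int by, int mb_size, int search_range,
                            int *best_mx, int *best_my)
{
    int best = INT_MAX;

    *best_mx = *best_my = 0;
    for (int my = -search_range; my <= search_range; my++) {
        for (int mx = -search_range; mx <= search_range; mx++) {
            int sad = 0;
            for (int y = 0; y < mb_size; y++)
                for (int x = 0; x < mb_size; x++)
                    sad += abs(cur[(by + y) * stride + bx + x] -
                               ref[(by + my + y) * stride + bx + mx + x]);
            if (sad < best) {
                best     = sad;
                *best_mx = mx;
                *best_my = my;
            }
        }
    }
}

The (2R+1)^2 * N^2 cost per block is exactly what the fast search patterns
mentioned further down (Diamond, Hex, UMH, Star) try to cut.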
> > >
> > > Most MCFI algorithms utilize the block-matching algorithm (BMA) for
> > motion
> > > estimation (ME). BMA is simple and easy to implement. It also generates a
> > > compactly represented motion field. However, unlike in video compression,
> > > in MCFI it is more important to find the true motion trajectories. The
> > > objective of MC in MCFI is not to minimize the energy of the MC residual
> > > signal, but to reconstruct interpolated frames with better visual quality.
> > >
> > > The algorithm uses motion vectors which are embedded in the bitstream. If
> > > the vectors exported by the codec (using the +export_mvs flags2 option) are
> > > used when available, the computation of motion vectors will be significantly
> > > reduced for realtime playback. Otherwise the mEstimate filter will generate
> > > the MVs, and to make the process faster, the same fast search algorithms
> > > used by x264 and x265 - Diamond, Hex, UMH, Star - will be implemented in the
> > > filter. The other filter, mInterpolate, will use the MVs from the frame side
> > > data to interpolate frames using various methods - OBMC (overlapped block
> > > motion compensation), simple frame blending, frame duplication, etc.
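
For reference, decoder-exported vectors can already be inspected today with
the codecview filter, roughly along these lines:

    ffmpeg -flags2 +export_mvs -i input.mp4 -vf codecview=mv=pf+bf+bb output.mp4

mEstimate/mInterpolate would pass the same AV_FRAME_DATA_MOTION_VECTORS side
data between filters; the exact option names for the new filters are still to
be decided.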
> > >
> > > However, MVs generated based on SAD or BAD might bring serious artifacts
> > > if they are used directly. So, the algorithm first examines the motion
> > > vectors and classifies them into two groups, one group with vectors which
> > > are considered to represent “true” motion, the other with “bad” vectors; it
> > > then carries out overlapped-block bi-directional motion estimation on the
> > > blocks having “bad” MVs. Finally, it uses motion vector post-processing and
> > > overlapped block motion compensation to generate the interpolated frames
> > > and further reduce blocking artifacts. Details on each step are in the
> > > paper [4].
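
Just to illustrate the classification idea (this is not the exact criterion
from [4]), one simple test flags a vector as "bad" when it deviates too much
from the component-wise median of its neighbours; the grid layout and
threshold below are placeholders:

#include <stdlib.h>

/* Median of three values, used per MV component. */
static int median3(int a, int b, int c)
{
    if (a > b) { int t = a; a = b; b = t; }
    if (b > c)   b = c;
    return a > b ? a : b;
}

/* mvx/mvy hold one vector per block on an mb_w-wide grid; interior blocks
 * only, for brevity.  A large deviation from the horizontal neighbours'
 * median marks the vector as an outlier candidate. */
static int mv_is_outlier(const int *mvx, const int *mvy, int mb_w,
                         int x, int y, int thresh)
{
    int i     = y * mb_w + x;
    int med_x = median3(mvx[i - 1], mvx[i], mvx[i + 1]);
    int med_y = median3(mvy[i - 1], mvy[i], mvy[i + 1]);

    return abs(mvx[i] - med_x) + abs(mvy[i] - med_y) > thresh;
}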
> > >
> > > Paper 2:
> > >
> > > [5] Choi et al. "Motion-compensated frame interpolation using bilateral
> > > motion estimation and adaptive overlapped block motion compensation." IEEE
> > > Transactions on Circuits and Systems for Video Technology, 2007: 407-416.
> > >
> > > <http://www.mku.edu.tr/getblogfile.php?keyid=793>
> > >
> > > Other Papers:
> > >
> > > Bai et al. "Visual-weighted motion compensation frame interpolation with
> > > motion vector refinement." 2012 IEEE International Symposium on Circuits
> > > and Systems (ISCAS): 500-503.
> > >
> > > <http://www.icst.pku.edu.cn/course/icb/Pub%20Files/2012/bw_12-iscas.pdf>
> > >
> > > Park et al. "Motion compensated frame rate up-conversion using modified
> > > adaptive extended bilateral motion estimation" Journal of Automation and
> > > Control Engineering Vol 2.4 (2014).
> > >
> > > <http://www.joace.org/uploadfile/2014/0114/20140114120043876.pdf>
> > >
> > > Tsai et al. "Frame rate up-conversion using adaptive bilateral motion
> > > estimation." Proceedings of the WSEAS International Conference on
> > > Mathematics and Computers in Science and Engineering, 2009.
> > > <
> > http://www.wseas.us/e-library/conferences/2009/hangzhou/ACACOS/ACACOS31.pdf
> > > >
> > >
> >
> > > Please share your thoughts on this.
> >
> > You looked at a lot of papers. That's good; keep in mind though that some
> > of these things might be rather academic and have no use in reality
> > (I don't know which though)
> >
> > About using motion vectors from the bitstream, I think that's completely
> > the wrong time to think about that. Such optimizations can be
> > considered once there is fully working code.
> >
> >
> > > Meanwhile I'm implementing fast ME methods (dia, hex, star) in mEstimate
> > > and OBMC in mInterpolate.
> >
> > There already is OBMC implemented in the mcfps code;
> > you only need to port the motion estimation into it.
> >
> > Can you explain the relation between what you work on and mcfps?
> >
> >
> > [...]
> > --
> > Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
> >
> > Those who are too smart to engage in politics are punished by being
> > governed by those who are dumber. -- Plato
> > _______________________________________________
> > ffmpeg-devel mailing list
> > ffmpeg-devel at ffmpeg.org
> > http://ffmpeg.org/mailman/listinfo/ffmpeg-devel
>
>
> Yes, I did that, after understanding it completely. It now works with the
> motion vectors generated by the mEstimate filter. Now I’m trying to improve it
> based on this paper: Overlapped Block Motion Compensation: An
> Estimation-Theoretic Approach
> <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.112.8359&rep=rep1&type=pdf>
this is 22 years old
> and
> this one: Window Motion Compensation
> <https://www.researchgate.net/publication/252182199>. Takes a lot of time
this is 25 years old
not saying old papers are bad, just that this represents the knowledge
of 20 years ago
also it's important to keep in mind that blind block matching with any
metric will not be enough. To find true motion, the whole motion
vector fields of multiple frames will need to be considered.
For example, a ball thrown across the field of view, entering and
exiting the picture, needs to move smoothly, and at the ends (in time)
there are frames without the ball and then a frame with the ball.
These 2 frames are not enough to interpolate the frames between them, as
we have just one location where the ball is. With the next frames, though,
we can find the motion trajectory of the ball and interpolate it end
to end.
I think papers which work on problems like this, and also on interpolation
of all the areas that end up overlapping and covering each other,
like the background behind the ball in that example, would be better
starting points for implementing motion estimation, because ultimately
that is the kind of ME code we would like to have.
Block matching with various windows, OBMC, ... are all good, but
if in our example the vectors for the ball or background are off, that
will look rather bad with any motion compensation.
So trying to move a bit toward this would make sense, but first
having some motion estimation, even really basic and dumb, with
MC working in a testable filter (pair) should probably be done.
I am just mentioning this as a bit of a preview of what I hope could
eventually be implemented; maybe this would be after GSoC, but it's
the kind of code needed to have really usable frame interpolation.
> reading them. I think we need to add a new Raised Cosine window (weights)
> along with the Linear Window (currently implemented). What do you say?
I don't know; the windows used in snow are already the best of several
tried (for snow).
No great gains will be found by changing the OBMC window from snow.
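
If you want to experiment with it anyway, a separable raised-cosine (Hann)
window is easy to build; with 50% overlap the weights of neighbouring blocks
sum to 1, so no renormalization is needed. This is only a sketch (n is the
block size), not the table snow uses:

#include <math.h>

/* Fill a (2n)x(2n) raised-cosine OBMC window for blocks laid out on an
 * n-pixel grid.  With this phase, windows of adjacent blocks overlap by 50%
 * and their weights sum to exactly 1 at every pixel. */
static void fill_obmc_window(double *w, int n)
{
    int size = 2 * n;

    for (int y = 0; y < size; y++) {
        double wy = 0.5 - 0.5 * cos(2.0 * M_PI * (y + 0.5) / size);
        for (int x = 0; x < size; x++) {
            double wx = 0.5 - 0.5 * cos(2.0 * M_PI * (x + 0.5) / size);
            w[y * size + x] = wx * wy;
        }
    }
}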
>
> I'm also making mInterpolate work with variable macroblock size MC. The
> current interpolation works without half-pel accuracy, though.
mcfps has fully working 1/4 pel OBMC code, that should be fine to be
used as is I think, unless I miss something.
Half pel is 20 years old, it is not useful.
Multiple block sizes on the MC side should not really matter ATM;
smaller blocks are a bit slower, but first we should get the code
working, then working with good quality and then working fast.
Multiple block sizes may be useful for the estimation side if it
improves estimation somehow.
Can I see your current "work in progress"?
[...]
> I’m moving the estimation code to a new file, motion_est.c, and the
> methods are shared by both the mEstimate and mInterpolate filters. mEstimate
> stores the MVs in the frame’s side data for any other filter. Moreover, if
> any other filter needs post-processing on the MVs, it can directly use the
> shared methods. mInterpolate, however, uses them internally, without saving
> to side data, which avoids unnecessary processing.
This design sounds good
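For the export side, the existing AVMotionVector side data type should be
usable more or less directly. A sketch of what that could look like
(mv_count, blk_w/blk_h and the x/y/mx/my arrays below are hypothetical
mEstimate state, not existing fields):

#include "libavutil/frame.h"
#include "libavutil/motion_vector.h"

/* Sketch: attach estimated vectors to a frame as AV_FRAME_DATA_MOTION_VECTORS
 * so that any downstream filter can read them. */
static int export_mvs(AVFrame *frame, const int *x, const int *y,
                      const int *mx, const int *my,
                      int mv_count, int blk_w, int blk_h)
{
    AVMotionVector *mvs;
    AVFrameSideData *sd = av_frame_new_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS,
                                                 mv_count * sizeof(*mvs));
    int i;

    if (!sd)
        return AVERROR(ENOMEM);
    mvs = (AVMotionVector *)sd->data;

    for (i = 0; i < mv_count; i++) {
        mvs[i].source = -1;             /* vector references the previous frame */
        mvs[i].w      = blk_w;
        mvs[i].h      = blk_h;
        mvs[i].dst_x  = x[i];           /* block position in the current frame  */
        mvs[i].dst_y  = y[i];
        mvs[i].src_x  = x[i] + mx[i];   /* where the block came from            */
        mvs[i].src_y  = y[i] + my[i];
        mvs[i].flags  = 0;
    }
    return 0;
}

A consumer would then fetch the array back with
av_frame_get_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS).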
>
>
> Also, paper [1] doesn’t use a window with OBMC at all. It just finds a
> plain average without weights. Perhaps to compare papers I either need to
> add multiple options for each setting or need to name each algorithm after
> its researcher in the filter options.
[...]
--
Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
Why not whip the teacher when the pupil misbehaves? -- Diogenes of Sinope