[FFmpeg-devel] [GSoC] Motion Interpolation

Davinder Singh ds.mudhar at gmail.com
Fri Jun 17 10:19:00 CEST 2016


On Wed, Jun 15, 2016 at 5:04 PM Michael Niedermayer <michael at niedermayer.cc>
wrote:

> Hi
>
> On Tue, May 31, 2016 at 10:43:38PM +0000, Davinder Singh wrote:
> > There’s a lot of research done on Motion Estimation. Depending upon the
> > intended application of the resultant motion vectors, the method used for
> > motion estimation can be very different.
> >
> > Classification of Motion Estimation Methods:
> >
> > Direct Methods: In direct methods we calculate optical flow
> > <https://en.wikipedia.org/wiki/Optical_flow> in the scene.
> >
> > - Phase Correlation
> >
> > - Block Matching
> >
> > - Spatio-Temporal Gradient
> >
> >  - Optical flow: Uses the optical flow equation to find motion in the scene.
> >
> >  - Pel-recursive: Also computes optical flow, but in a way that allows
> > recursive computation on vector fields.
> >
> > Indirect Methods
> >
> > - Feature-based Method: Finds features in the frame and uses them for
> > estimation.
> >
> > Here are some papers on Frame Rate Up-Conversion (FRUC):
> >
> > Phase Correlation:
> >
> > This method relies on a frequency-domain representation of the data,
> > calculated using the fast Fourier transform
> > <https://en.wikipedia.org/wiki/Fast_Fourier_transform>. Phase Correlation
> > provides a correlation surface from the comparison of images. This
> > enables the identification of motion on a pixel-by-pixel basis for
> > correct processing of each motion type. Since phase correlation operates
> > in the frequency rather than the spatial domain, it is able to zero in
> > on details while ignoring such factors as noise and grain within the
> > picture. In other words, the system is highly tolerant of the noise
> > variations and rapid changes in luminance levels that are found in many
> > types of content, resulting in high-quality performance on fades,
> > objects moving in and out of shade, and light flashes.
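> >
> > For two frames f1 and f2, the displacement shows up as the peak of the
> > inverse transform of the normalized cross-power spectrum (the standard
> > formulation in rough notation, not taken from any particular paper):
> >
> >     R(u,v) = F1(u,v) * conj(F2(u,v)) / |F1(u,v) * conj(F2(u,v))|
> >     r(x,y) = IFFT(R),  (dx,dy) = argmax over (x,y) of r(x,y)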
> >
> > Papers:
> >
> > [1] Disney Research. "Phase-Based Frame Interpolation for Video." IEEE
> > CVPR 2015 <https://www.disneyresearch.com/publication/phasebased/>
> >
> > [2] Yoo, DongGon, et al. "Phase Correlated Bilateral Motion Estimation for
> > Frame Rate Up-Conversion." The 23rd International Technical Conference on
> > Circuits/Systems, Computers and Communications (ITC-CSCC), Jul. 2008.
> >
> > <http://www.ieice.org/proceedings/ITC-CSCC2008/pdf/p385_G3-4.pdf>
> >
> > The video on the page of paper [1] demonstrates a comparison between
> > the various methods.
> >
> > Optical Flow:
> >
> > http://www.cs.toronto.edu/~fleet/research/Papers/flowChapter05.pdf
> >
> > [3] Brox et al. "High accuracy optical flow estimation based on a theory
> > for warping." Computer Vision - ECCV 2004: 25-36.
> >
> > <http://www.wisdom.weizmann.ac.il/~/vision/courses/2006_2/papers/optic_flow_multigrid/brox_eccv04_of.pdf>
> >
> > The Slowmovideo <http://slowmovideo.granjow.net/> open-source project is
> > based on the optical flow equation.
> >
> > The algorithm we can implement is based on the block-matching method.
> >
> > Motion Compensated Frame Interpolation
> >
> > Paper:
> >
> > [4] Zhai et al. "A low complexity motion compensated frame interpolation
> > method." IEEE ISCAS 2005: 4927-4930.
> >
> > <http://research.microsoft.com/pubs/69174/lowcomplexitymc.pdf>
> >
> > Block-based motion estimation and pixel-wise motion estimation are the
> > two main categories of motion estimation methods. In general, pixel-wise
> > motion estimation can attain accurate motion fields, but needs a
> > substantial amount of computation. In contrast, block matching
> > algorithms (BMA) can be efficiently implemented and provide good
> > performance.
> >
> > Most MCFI algorithms utilize the block-matching algorithm (BMA) for
> > motion estimation (ME). BMA is simple and easy to implement, and it also
> > generates a compactly represented motion field. However, unlike in video
> > compression, in MCFI it is more important to find the true motion
> > trajectories: the objective of MC in MCFI is not to minimize the energy
> > of the MC residual signal, but to reconstruct interpolated frames with
> > better visual quality.
> >
> > The algorithm uses motion vectors which are embedded in the bit-stream.
> > If the vectors exported by the codec (using '-flags2 +export_mvs') are
> > used when available, the motion vector computation will be significantly
> > reduced, which helps realtime playback. Otherwise the mEstimate filter
> > will generate the MVs, and to make the process faster, the same search
> > algorithms used by x264 and x265 (Diamond, Hex, UMH, Star) will be
> > implemented in the filter. The other filter, mInterpolate, will use the
> > MVs in the frame side data to interpolate frames using various methods:
> > OBMC (overlapped block motion compensation), simple frame blending,
> > frame duplication, etc.
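> >
> > For illustration, here is a minimal sketch of the small-diamond
> > refinement step I have in mind (plain C with placeholder names; the real
> > filter will need proper bounds checking and a search-range limit):
> >
> > #include <stdint.h>
> > #include <stdlib.h>
> >
> > /* SAD between the current block and a displaced block in the reference. */
> > static uint64_t sad_block(const uint8_t *cur, const uint8_t *ref,
> >                           int stride, int x, int y,
> >                           int dx, int dy, int bs)
> > {
> >     uint64_t sad = 0;
> >     for (int j = 0; j < bs; j++)
> >         for (int i = 0; i < bs; i++)
> >             sad += abs(cur[(y + j) * stride + x + i] -
> >                        ref[(y + dy + j) * stride + x + dx + i]);
> >     return sad;
> > }
> >
> > /* Step to the best of the 4 small-diamond neighbours until the centre
> >  * is the minimum; assumes the caller keeps x/y + mv inside the padded
> >  * reference frame. */
> > static void diamond_refine(const uint8_t *cur, const uint8_t *ref,
> >                            int stride, int x, int y, int bs,
> >                            int *mv_x, int *mv_y)
> > {
> >     static const int dia[4][2] = {
> >         { 0, -1 }, { -1, 0 }, { 1, 0 }, { 0, 1 }
> >     };
> >     uint64_t best = sad_block(cur, ref, stride, x, y, *mv_x, *mv_y, bs);
> >
> >     for (;;) {
> >         int best_dir = -1;
> >         for (int i = 0; i < 4; i++) {
> >             uint64_t cost = sad_block(cur, ref, stride, x, y,
> >                                       *mv_x + dia[i][0],
> >                                       *mv_y + dia[i][1], bs);
> >             if (cost < best) {
> >                 best     = cost;
> >                 best_dir = i;
> >             }
> >         }
> >         if (best_dir < 0)
> >             break;
> >         *mv_x += dia[best_dir][0];
> >         *mv_y += dia[best_dir][1];
> >     }
> > }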
> >
> > However, MVs generated based on SAD or BAD might bring serious artifacts
> > if they are used directly. So the algorithm first examines the motion
> > vectors and classifies them into two groups: one group with vectors
> > which are considered to represent “true” motion, the other holding “bad”
> > vectors. It then carries out overlapped-block bi-directional motion
> > estimation on the blocks having “bad” MVs. Finally, it utilizes motion
> > vector post-processing and overlapped block motion compensation to
> > generate the interpolated frames and further reduce blocking artifacts.
> > Details on each step are in paper [4].
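> >
> > To sketch the bi-directional (bilateral) step: for a block B centred at
> > position p in the frame to be interpolated, a single vector v is chosen
> > so that both halves of the trajectory pass through p, roughly
> >
> >     v* = argmin_v  sum over s in B of | f_prev(p + s - v) - f_next(p + s + v) |
> >
> > (my paraphrase of the idea, not a formula copied from the paper).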
> >
> > Paper 2:
> >
> > [5] Choi et al. "Motion-compensated frame interpolation using bilateral
> > motion estimation and adaptive overlapped block motion compensation."
> > IEEE Transactions on Circuits and Systems for Video Technology, 2007:
> > 407-416.
> >
> > <http://www.mku.edu.tr/getblogfile.php?keyid=793>
> >
> > Other Papers:
> >
> > Bai et al. "Visual-weighted motion compensation frame interpolation with
> > motion vector refinement." 2012 IEEE International Symposium on Circuits
> > and Systems (ISCAS): 500-503.
> >
> > <http://www.icst.pku.edu.cn/course/icb/Pub%20Files/2012/bw_12-iscas.pdf>
> >
> > Park et al. "Motion compensated frame rate up-conversion using modified
> > adaptive extended bilateral motion estimation" Journal of Automation and
> > Control Engineering Vol 2.4 (2014).
> >
> > <http://www.joace.org/uploadfile/2014/0114/20140114120043876.pdf>
> >
> > Tsai et al. "Frame rate up-conversion using adaptive bilateral motion
> > estimation" WSEAS International Conference. Proceedings. Mathematics and
> > Computers in Science and Engineering: 2009.
> > <http://www.wseas.us/e-library/conferences/2009/hangzhou/ACACOS/ACACOS31.pdf>
> >
>
> > Please share your thoughts on this.
>
> You looked at a lot of papers. That's good; keep in mind though that some
> of these things might be rather academic and have no use in reality
> (I don't know which, though).
>
> About using motion vectors from the bitstream, I think that's completely
> the wrong time to think about that. Such optimizations can be
> considered once there is fully working code.
>
>
> > Meanwhile I'm implementing fast ME methods (dia, hex, star) in mEstimate
> > and OBMC in mInterpolate.
>
> There is already OBMC implemented in the mcfps code;
> you only need to port motion estimation into it.
>
> Can you explain the relation between what you work on and mcfps?
>
>
> [...]


Yes, I did that, after understanding it completely. It now works with the
motion vectors generated by the mEstimate filter. Now I’m trying to improve
it based on this paper: Overlapped Block Motion Compensation: An
Estimation-Theoretic Approach
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.112.8359&rep=rep1&type=pdf>
and this one: Window Motion Compensation
<https://www.researchgate.net/publication/252182199>. It takes a lot of
time to read them. I think we need to add a new Raised Cosine window
(weights) along with the Linear window that is currently implemented. What
do you say?
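
Something like the following is what I have in mind for the raised-cosine
weights: a separable Hann window over the 2Nx2N OBMC support (a rough
sketch, not the final filter code; the function name is made up):

#include <math.h>

/* Fill a (2*bs)x(2*bs) table of raised-cosine OBMC weights.  With 50%
 * overlap in each direction the four contributing windows sum to 1 in the
 * frame interior, so no extra normalisation should be needed there. */
static void obmc_raised_cosine(float *w, int bs)
{
    int n = 2 * bs;
    for (int y = 0; y < n; y++) {
        float wy = 0.5f * (1.0f - cosf(2.0f * M_PI * (y + 0.5f) / n));
        for (int x = 0; x < n; x++) {
            float wx = 0.5f * (1.0f - cosf(2.0f * M_PI * (x + 0.5f) / n));
            w[y * n + x] = wy * wx;
        }
    }
}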

I’m also making mInterpolate work with variable macroblock size MC. The
current interpolation works without half-pel accuracy, though.
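
If/when half-pel accuracy is added, the fetch could be as simple as a
bilinear average (sketch only, edge handling omitted; mv_x/mv_y are assumed
to be in half-pel units):

#include <stdint.h>

static inline int get_pixel_hpel(const uint8_t *ref, int stride,
                                 int x, int y, int mv_x, int mv_y)
{
    int ix = x + (mv_x >> 1), iy = y + (mv_y >> 1);
    int fx = mv_x & 1,        fy = mv_y & 1;
    int a  = ref[ iy       * stride + ix     ];
    int b  = ref[ iy       * stride + ix + fx];
    int c  = ref[(iy + fy) * stride + ix     ];
    int d  = ref[(iy + fy) * stride + ix + fx];
    return (a + b + c + d + 2) >> 2; /* reduces to the plain pixel at full-pel */
}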

I also tried it with the motion vectors exported by the codec, but the
quality was poor, so I’m leaving that idea for now.
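
For reference, the exported vectors are read from the decoded frame's side
data (decoding needs "-flags2 +export_mvs"); a minimal dump helper, with a
made-up name, would look like:

#include <stdio.h>
#include <libavutil/frame.h>
#include <libavutil/motion_vector.h>

static void dump_exported_mvs(const AVFrame *frame)
{
    AVFrameSideData *sd =
        av_frame_get_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS);
    if (!sd)
        return;

    const AVMotionVector *mvs = (const AVMotionVector *)sd->data;
    int nb_mvs = sd->size / sizeof(*mvs);

    for (int i = 0; i < nb_mvs; i++)
        printf("%d,%d -> %d,%d (%dx%d block)\n",
               mvs[i].src_x, mvs[i].src_y,
               mvs[i].dst_x, mvs[i].dst_y, mvs[i].w, mvs[i].h);
}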


I’m moving the estimation code to a new file, motion_est.c, so that the
methods are shared by both the mEstimate and mInterpolate filters.
mEstimate stores the MVs in the frame’s side data for any other filter to
use, and any other filter that needs post-processing on MVs can call the
shared methods directly. mInterpolate, however, uses them internally
without saving to side data, which avoids unnecessary processing.
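
On the mEstimate side, publishing the vectors should just be a matter of
attaching an AVMotionVector array as frame side data. A rough sketch (the
helper name is made up, error handling kept minimal):

#include <string.h>
#include <libavutil/error.h>
#include <libavutil/frame.h>
#include <libavutil/motion_vector.h>

static int publish_mvs(AVFrame *frame, const AVMotionVector *mvs, int nb_mvs)
{
    AVFrameSideData *sd =
        av_frame_new_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS,
                               nb_mvs * sizeof(*mvs));
    if (!sd)
        return AVERROR(ENOMEM);

    memcpy(sd->data, mvs, nb_mvs * sizeof(*mvs));
    return 0;
}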


Also, Paper [1] doesn’t use a window with OBMC at all; it just takes a
plain average without weights. Perhaps, to compare the papers, I either
need to add an option for each setting, or name each algorithm after its
researcher in the filter options.

Thanks :)

