[FFmpeg-devel] [Patch][GSoC] Motion Estimation filter

Michael Niedermayer michael at niedermayer.cc
Wed Apr 13 23:02:16 CEST 2016


On Wed, Apr 13, 2016 at 10:30:35PM +0200, Michael Niedermayer wrote:
> On Wed, Apr 13, 2016 at 10:01:35PM +0200, Michael Niedermayer wrote:
> > On Wed, Apr 13, 2016 at 07:24:42PM +0000, Davinder Singh wrote:
> > > good vectors? How can I improve them? Since it searches every possible
> > > position, it should give the best match. Can you give more details on
> > > why the surrounding vectors need to be considered?
> > 
> > the goal is to find the true motion, that is, how the physical object
> > moves; the best pixel-by-pixel match may not represent the true motion
> > at all.
> > For example, a block that contains a finely detailed, high-contrast
> > object and that matches with a low difference at only one spot in the
> > new image would likely represent a good/true motion vector.
> > A block that contains a sharp straight line and that matches well
> > would likely be accurate in the direction normal to the line, but may
> > be somewhat inaccurate in the direction along the line.
> > A block containing a nearly flat, constant color will quite possibly
> > match best somewhere very far from the true motion.
> > And blocks that move off screen or behind other objects in the next
> > frame will also match best somewhere pixel by pixel, but wherever they
> > match, it will not be the true motion's direction.
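
For reference, the per-block "difference" discussed here is just a plain SAD; a rough sketch of such a cost follows (assuming 8-bit planar luma, 16x16 blocks and in-bounds candidates; all names are illustrative, this is not the libavcodec code). Minimizing only this is exactly the "best match pixel by pixel" above, which is why flat or occluded blocks can end up with arbitrary vectors.

#include <stdint.h>
#include <stdlib.h>

/* Sum of absolute differences between the 16x16 block at (x, y) in the
 * current frame and the candidate block at (x + mvx, y + mvy) in the
 * reference frame; linesize is the stride of both planes.
 * Assumes the candidate lies fully inside the reference plane. */
static int block_sad_16x16(const uint8_t *cur, const uint8_t *ref,
                           int linesize, int x, int y, int mvx, int mvy)
{
    const uint8_t *c = cur + y * linesize + x;
    const uint8_t *r = ref + (y + mvy) * linesize + (x + mvx);
    int sad = 0;

    for (int i = 0; i < 16; i++) {
        for (int j = 0; j < 16; j++)
            sad += abs(c[j] - r[j]);
        c += linesize;
        r += linesize;
    }
    return sad;
}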
> > 
> > What's needed is to have the "strong/good" matching vectors pull any
> > surrounding weak ones towards themselves in some form.
> > 
> > Researching algorithms/papers that attempt to find true motion well and
> > robustly may be needed. There is likely some existing research and
> > experimentation in that area. Of course we will have to experiment too,
> > but there should be some existing work to build on.
> > 
> > It's important to note, though, that this problem differs from the
> > motion estimation used in video encoding.
> > Still, the methods used in video encoding should perform much better
> > than a "flat", independent exhaustive search (they are also MUCH faster).
> > For encoder-targeted ME there are UMH and the older EPZS.
> > I don't know which algorithm is good for finding true motion,
> > but the encoder-targeted ones at least have a term that minimizes
> > vector differences (as it costs bits in an encoder to store the
> > difference from the predicted vector).
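
Schematically, such an encoder-style cost is something like the sketch below; the lambda weighting and the L1 approximation of the bit cost are illustrative, not what libavcodec actually uses.

/* Encoder-style matching cost: distortion plus a penalty that grows
 * with the distance of the candidate vector from the predicted
 * (e.g. median) vector. In a real encoder the penalty approximates
 * the bits needed to code the vector difference; here it is just the
 * L1 distance scaled by lambda. */
static int encoder_style_cost(int sad, int mvx, int mvy,
                              int pred_x, int pred_y, int lambda)
{
    int mv_penalty = abs(mvx - pred_x) + abs(mvy - pred_y);
    return sad + lambda * mv_penalty;
}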
> 
> One thing that could be tried, in the absence of other ideas, is to
> add a term that penalizes vectors which differ from the surrounding
> vectors, keep the exhaustive search as it is, and repeat the search
> iteratively until it converges and no vector changes anymore.
> Using a fast search like EPZS or UMH instead of an exhaustive one
> should be a lot more bearable speed-wise, though.
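
In code, that brute-force variant could look roughly like the sketch below (continuing the sketches above, reusing block_sad_16x16() in the same file). The range, lambda and 4-neighbour penalty are arbitrary choices, the planes are assumed to be padded so out-of-range candidates stay readable, and a real implementation would also cap the number of iterations.

#include <limits.h>
#include <stdint.h>
#include <stdlib.h>

typedef struct MotionVector { int x, y; } MotionVector;

/* Exhaustive per-block search with a smoothness term that pulls each
 * vector towards its 4 neighbours, iterated until the field no longer
 * changes. mv holds one vector per 16x16 block (mb_w x mb_h blocks)
 * and should be zero-initialized (or seeded) by the caller. */
static void estimate_field(const uint8_t *cur, const uint8_t *ref,
                           int linesize, int mb_w, int mb_h,
                           int range, int lambda, MotionVector *mv)
{
    static const int off[4][2] = { {-1,0}, {1,0}, {0,-1}, {0,1} };
    int changed;

    do {
        changed = 0;
        for (int by = 0; by < mb_h; by++) {
            for (int bx = 0; bx < mb_w; bx++) {
                MotionVector best = mv[by * mb_w + bx];
                int best_cost = INT_MAX;

                for (int dy = -range; dy <= range; dy++) {
                    for (int dx = -range; dx <= range; dx++) {
                        int cost = block_sad_16x16(cur, ref, linesize,
                                                   bx * 16, by * 16, dx, dy);

                        /* penalize disagreement with the neighbouring
                         * vectors from the current state of the field */
                        for (int n = 0; n < 4; n++) {
                            int nx = bx + off[n][0], ny = by + off[n][1];
                            if (nx < 0 || ny < 0 || nx >= mb_w || ny >= mb_h)
                                continue;
                            cost += lambda * (abs(dx - mv[ny * mb_w + nx].x) +
                                              abs(dy - mv[ny * mb_w + nx].y));
                        }
                        if (cost < best_cost) {
                            best_cost = cost;
                            best.x    = dx;
                            best.y    = dy;
                        }
                    }
                }
                if (best.x != mv[by * mb_w + bx].x ||
                    best.y != mv[by * mb_w + bx].y) {
                    mv[by * mb_w + bx] = best;
                    changed = 1;
                }
            }
        }
    } while (changed);
}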
> 
> Also, basic frame interpolation with the motion vectors should
> probably be implemented sooner rather than later, as that would allow
> judging much better (by eye) how well various motion estimation
> algorithms perform.
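
The interpolation itself can start out very simple; for the halfway point, a plain bidirectional blend per block would do, as in the sketch below (reusing the MotionVector type from the previous sketch; integer-precision vectors only, no occlusion handling, planes assumed padded so the shifted reads stay valid). Even something this crude tends to make bad vectors show up immediately as visible block artifacts.

/* Build the frame halfway between prev and next. mv holds one vector
 * per 16x16 block of the output frame and is taken to describe the
 * motion from prev to next passing through that block: sample prev
 * half a vector backwards, next half a vector forwards, and average. */
static void interpolate_midframe(uint8_t *dst, const uint8_t *prev,
                                 const uint8_t *next, int linesize,
                                 int mb_w, int mb_h, const MotionVector *mv)
{
    for (int by = 0; by < mb_h; by++) {
        for (int bx = 0; bx < mb_w; bx++) {
            const MotionVector *v = &mv[by * mb_w + bx];
            for (int y = 0; y < 16; y++) {
                for (int x = 0; x < 16; x++) {
                    int px = bx * 16 + x;
                    int py = by * 16 + y;
                    int a  = prev[(py - v->y / 2) * linesize + px - v->x / 2];
                    int b  = next[(py + v->y / 2) * linesize + px + v->x / 2];

                    dst[py * linesize + px] = (a + b + 1) >> 1;
                }
            }
        }
    }
}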
> 
> The good news for you is that I've already implemented motion
> compensation, so you only need to take my mcfps code, which I posted
> a while ago, and replace its ugly use of the libavcodec motion
> estimation with your code.
> 
> That should be relatively easy and would provide a starting point
> for fine-tuning the motion estimation:
> first to match the iterative EPZS from libavcodec (which is designed
> purely with encoding in mind),
> and then hopefully to exceed it and make the filter actually work well.
> And of course anything in it can be changed to make it work well ...
> 
> I'll try to update my mcfps code and put it in some branch on my repo
> so you can take a look.

https://github.com/michaelni/FFmpeg/tree/mcfps

You can test this using, for example:
time ./ffmpeg -s 720x576 -i matrix_orig.yuv -vf mcfps=0:5,mcfps=3:25 -y /dev/shm/matrix_5-25.yuv
tests/tiny_psnr matrix_orig.yuv /dev/shm/matrix_5-25.yuv

The above test drops 4 out of 5 frames, then recreates them using
mcfps and compares the result to the original.

The big problem with the mcfps code is that it uses the motion
estimation from libavcodec; this is messy, and the ME code in
libavcodec is not designed for finding true or robust motion.


[...]

-- 
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

It is what and why we do it that matters, not just one of them.