ABSTRACT

Image motion analysis is typically accomplished by estimating an optical flow vector for each pixel through matching of gray levels in consecutive frames of an image sequence. In this work, each pixel is described by the content of its neighborhood using Zernike moments rather than raw intensity values, making the flow computation robust to non-uniform illumination and fluctuations in pixel intensity. To quantify the accuracy of the approach, endpoint error and angular error have been computed for the Middlebury benchmark dataset, and the results surpass those obtained with the popular Farneback method.
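
The core idea summarized above can be illustrated with a minimal sketch (not the authors' code): each pixel's neighborhood is mapped to a vector of Zernike moment magnitudes, and this descriptor, rather than the raw gray level, is what the flow estimator would match across frames. The patch size, moment orders, and normalization below are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np
from math import factorial

def zernike_radial(n, m, rho):
    """Radial polynomial R_nm(rho) of the Zernike basis."""
    m = abs(m)
    R = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        coeff = ((-1) ** s * factorial(n - s) /
                 (factorial(s) * factorial((n + m) // 2 - s)
                  * factorial((n - m) // 2 - s)))
        R += coeff * rho ** (n - 2 * s)
    return R

def zernike_descriptor(patch, orders=((0, 0), (1, 1), (2, 0), (2, 2))):
    """Return |Z_nm| for a square gray-level patch, evaluated on the unit disk."""
    h, w = patch.shape
    y, x = np.mgrid[-1:1:h * 1j, -1:1:w * 1j]
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0                     # Zernike polynomials live on the unit disk
    f = patch.astype(float) * mask
    desc = []
    for n, m in orders:
        # Conjugate basis function V*_nm = R_nm(rho) * exp(-i m theta)
        V = zernike_radial(n, m, rho) * np.exp(-1j * m * theta)
        Z = (n + 1) / np.pi * np.sum(f * V) / mask.sum()
        desc.append(abs(Z))               # magnitudes are rotation invariant
    return np.array(desc)

# Usage: describe a 15x15 neighborhood by its Zernike moment magnitudes;
# the flow algorithm would then match these descriptors between frames.
patch = np.random.rand(15, 15)
print(zernike_descriptor(patch))
```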