Motion Silencing of Flicker Distortions on Naturalistic Videos
Prof. Alan Bovik and his student Lark Kwon Choi of the WNCG Laboratory for Image and Video Engineering (LIVE), together with Prof. Lawrence Cormack of the Center for Perceptual Systems (CPS) in the Department of Psychology, study the influence of motion on the visibility of flicker distortions in naturalistic videos.
The well-known “motion silencing” illusion shows that salient changes in the luminance, color, shape, or size of a group of objects may appear to stop when the objects move rapidly. To understand why the human visual system might be insensitive to changes in object luminance (‘flicker’) in the presence of object motion, we have developed a spatiotemporal flicker detector model of motion silencing.
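As a rough illustration of the kind of spatiotemporal measurement such a model relies on, the Python sketch below estimates per-pixel flicker energy with a temporal band-pass filter and a crude per-pixel motion magnitude from frame differences. The frame rate, flicker band, and filter length are assumed values chosen only for illustration; this is not the authors' published detector.

```python
import numpy as np
from scipy import signal

def flicker_and_motion_maps(video, fs=30.0, band=(4.0, 8.0), numtaps=31):
    """Illustrative per-pixel flicker energy and motion magnitude.

    video : ndarray of shape (T, H, W), luminance frames.
    fs    : assumed frame rate in Hz.
    band  : assumed temporal flicker band of interest in Hz.
    """
    # Temporal band-pass FIR filter isolates luminance oscillations
    # ("flicker") in the chosen band at each pixel.
    taps = signal.firwin(numtaps, band, pass_zero=False, fs=fs)
    bandpassed = signal.lfilter(taps, [1.0], video, axis=0)
    flicker_energy = np.mean(bandpassed ** 2, axis=0)

    # Crude motion magnitude: mean absolute frame difference per pixel.
    motion_magnitude = np.mean(np.abs(np.diff(video, axis=0)), axis=0)

    return flicker_energy, motion_magnitude
```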
Further, a series of human subjective studies was conducted to understand how motion silences the visibility of flicker distortions as a function of object motion, flicker frequency, and video quality. We found that flicker visibility is strongly reduced when the speed of coherent motion is large, and that the effect is more pronounced when video quality is poor. Based on this finding, we have proposed a model of flicker visibility on naturalistic videos.
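Purely to illustrate the qualitative finding (stronger silencing at higher coherent motion speeds), one might posit a visibility index that attenuates measured flicker energy as motion speed grows. The functional form and constant below are hypothetical and chosen only for illustration; they are not the proposed model.

```python
import numpy as np

def flicker_visibility_index(flicker_energy, motion_speed, k=0.5):
    """Toy visibility index: flicker attenuated by coherent motion speed.

    flicker_energy : per-pixel (or per-region) flicker energy map.
    motion_speed   : matching map of coherent motion speed.
    k              : hypothetical attenuation constant.
    """
    # Exponential attenuation with speed captures the qualitative trend
    # that faster coherent motion reduces perceived flicker.
    return flicker_energy * np.exp(-k * np.asarray(motion_speed))
```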
We believe that sufficiently fast and coherent motion silences the perception of flicker distortions on naturalistic videos, in agreement with the recently observed “motion silencing” effect on synthetic stimuli. We envision that the proposed model could be applied to develop perceptual video quality assessment algorithms that can predict “silenced” temporal distortions and account for them when computing video quality judgments.
Paper 1: Motion Silencing of Flicker Distortions on Naturalistic Videos
Paper 2: Spatiotemporal Flicker Detector Model of Motion Silencing
Paper 3: Visibility Prediction of Flicker Distortions on Naturalistic Videos
Paper 4: On the Visibility of Flicker Distortions in Naturalistic Videos
This work was supported by Intel and Cisco Corporations under the VAWN program, and by the National Science Foundation under Grants IIS-0917175 and IIS-1116656.