Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception.

Abstract: Choosing an appropriate set of stimuli is essential for characterizing the response of a sensory system along a particular functional dimension, such as the eye movements that follow the motion of a visual scene. Here, we describe a framework for generating random texture movies with controlled information content: Motion Clouds. These stimuli are defined by a generative model based on a controlled experimental parametrization. We show that Motion Clouds correspond to a dense mixture of localized moving gratings with random positions. Their global envelope resembles natural-like stimulation with an approximate full-field translation, as produced by a retinal slip. We describe the mathematical construction of these stimuli and propose an open-source, Python-based implementation. We show examples of the use of this framework and propose extensions to other modalities such as color vision, touch, and audition.
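The generative model described in the abstract can be sketched in a few lines of NumPy: shape the Fourier amplitude of a space-time movie with a Gaussian band around a preferred spatial frequency and around the plane f_t = -(u·f_x + v·f_y) that encodes a full-field translation at speed (u, v), assign random phases, and invert the transform. The sketch below is a minimal illustration under those assumptions, not the authors' MotionClouds package; the function name motion_cloud and the parameter names (sf_0, B_sf, B_V) are chosen here for readability.

    import numpy as np

    def motion_cloud(N_X=128, N_Y=128, N_T=64, u=1.0, v=0.0,
                     sf_0=0.125, B_sf=0.05, B_V=0.2, seed=0):
        """Hypothetical sketch: random-phase texture whose Fourier
        envelope concentrates energy on a translation plane."""
        # Frequency grids (cycles/pixel, cycles/frame), broadcastable
        # to the full (N_X, N_Y, N_T) movie shape.
        fx = np.fft.fftfreq(N_X)[:, None, None]
        fy = np.fft.fftfreq(N_Y)[None, :, None]
        ft = np.fft.fftfreq(N_T)[None, None, :]
        f_r = np.sqrt(fx**2 + fy**2)   # radial spatial frequency
        f_r[0, 0, 0] = np.inf          # suppress the DC term cleanly
        # Gaussian band around the preferred spatial frequency sf_0 ...
        env_sf = np.exp(-0.5 * (f_r - sf_0)**2 / B_sf**2)
        # ... times a Gaussian around the speed plane
        # f_t = -(u*f_x + v*f_y), with a width that scales with
        # spatial frequency through the bandwidth B_V.
        env_v = np.exp(-0.5 * (ft + u * fx + v * fy)**2 / (B_V * f_r)**2)
        envelope = env_sf * env_v
        # Random phases: each seed gives a new texture instance with
        # the same amplitude spectrum.
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0.0, 2.0 * np.pi, size=(N_X, N_Y, N_T))
        movie = np.fft.ifftn(envelope * np.exp(1j * phase)).real
        return movie / np.abs(movie).max()

    movie = motion_cloud()  # (128, 128, 64) texture drifting along x

Each call with a different seed yields a new movie with identical spectral content, which is consistent with the abstract's notion of controlled information content: responses can be averaged over stimulus instances while the underlying parametrization stays fixed.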
Document type: Journal article


https://hal.archives-ouvertes.fr/hal-00726828
Contributor: Laurent Perrinet
Submitted on: August 31, 2012

Files: MotionClouds.pdf (produced by the authors)

Identifiers: hal-00726828, DOI 10.1152/jn.00737.2011

Citation

Paula Sanz Leon, Ivo Vanzetta, Guillaume S. Masson, Laurent U. Perrinet. Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception. Journal of Neurophysiology, American Physiological Society, 2012, 107 (11), pp. 3217-3226. ⟨10.1152/jn.00737.2011⟩. ⟨hal-00726828⟩
