Motion integration modulated by form information
Abstract
We propose a model of motion integration modulated by form information, inspired by neurobiological data. Our dynamical system models several key features of the motion processing stream in primate visual cortex. Thanks to a multi-layer architecture incorporating both feedforward-feedback and inhibitory lateral connections, our model is able to solve local motion ambiguities. One important feature of our model is that it proposes an anisotropic integration of motion based on form information. Our model can be implemented efficiently on GPU, and we demonstrate its properties on classical psychophysical examples. First, a simple read-out allows us to reproduce the dynamics of ocular following for a moving bar stimulus. Second, we show how our model is able to discriminate between the extrinsic and intrinsic junctions present in the chopstick illusion. Finally, we show promising results on real videos.
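The abstract gives no implementation details, so the following is only a minimal sketch of what form-modulated (anisotropic) motion integration can look like: a local motion estimate is diffused iteratively, but diffusion between neighbouring positions is gated by luminance similarity, so that motion does not propagate across form (edge) boundaries. All names and parameters here (`form_gated_diffusion`, `gate_sigma`, `rate`) are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def form_gated_diffusion(motion, luminance, steps=50, rate=0.2, gate_sigma=0.1):
    """Diffuse a per-pixel motion estimate (one velocity component),
    attenuating diffusion across strong luminance discontinuities so
    motion does not leak across object boundaries.

    This is an illustrative sketch, not the paper's model: boundaries
    are periodic (np.roll) and the gate is a simple Gaussian on
    luminance differences.
    """
    m = motion.astype(float).copy()
    lum = luminance.astype(float)
    for _ in range(steps):
        update = np.zeros_like(m)
        # 4-neighbour diffusion, gated by luminance similarity
        for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
            neigh_m = np.roll(m, shift, axis=axis)
            neigh_l = np.roll(lum, shift, axis=axis)
            # gate ~1 within a uniform region, ~0 across a form edge
            gate = np.exp(-((lum - neigh_l) ** 2) / (2 * gate_sigma ** 2))
            update += gate * (neigh_m - m)
        m += rate * update  # rate * max 4 gates < 1 keeps the step stable
    return m

# Example: a sparse motion cue spreads within a bar but not across its edge
lum = np.zeros((32, 32)); lum[12:20, :] = 1.0    # horizontal bar on background
vx = np.zeros((32, 32)); vx[14:18, 14:18] = 1.0  # local horizontal-motion cue
vx_integrated = form_gated_diffusion(vx, lum)
```

Under this reading, the gate plays the role the abstract assigns to form information: motion measured at intrinsic junctions (on the object) is integrated along it, while cues at extrinsic junctions (occlusion edges) are blocked from spreading, which is one way to account for percepts such as the chopstick illusion.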