Journal article in Applied Soft Computing, 2021

Learning Graph Representation with Randomized Neural Network for Dynamic Texture Classification

Abstract

Dynamic textures (DTs) are pseudo-periodic data on a space × time support that can represent many natural phenomena captured in video footage. Their modeling and recognition are useful in many computer vision applications. This paper presents an approach for DT analysis combining a graph-based description from the Complex Network framework with a learned representation from the Randomized Neural Network (RNN) model. First, a directed space × time graph model with a single parameter (the radius) is used to represent both the motion and the appearance of the DT. Then, instead of using classical graph measures as features, the DT descriptor is learned with an RNN that is trained to predict the gray level of pixels from local topological measures of the graph. The weight vector of the output layer of the RNN forms the descriptor. Several structures are evaluated for the RNNs, resulting in networks with a single hidden layer of 4, 24, or 29 neurons and input layers of 4 or 10 neurons, i.e. 6 different RNNs. Experimental results on DT recognition conducted on the Dyntex++ and UCLA datasets show a …
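The abstract outlines the pipeline: local topological measures of the space × time graph feed a randomized neural network whose trained output-layer weights become the descriptor. The sketch below illustrates that last step in the style of a random-weight network with a closed-form output solve; the tanh activation, the uniform random projection, the unregularized least-squares solve, and the choice of graph measures are assumptions for illustration and may differ from the paper's exact formulation.

```python
import numpy as np

def rnn_descriptor(X, y, hidden_neurons=24, seed=0):
    """Learn a DT descriptor with a randomized neural network (sketch).

    X : (n_pixels, n_features) array of local topological measures of the
        space x time graph (e.g. in/out degrees at the chosen radius) -
        placeholder features, not necessarily those used in the paper.
    y : (n_pixels,) array of pixel gray levels to be predicted.
    Returns the output-layer weight vector, used as the texture descriptor.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Input-to-hidden weights are drawn at random and never trained.
    W = rng.uniform(-1.0, 1.0, size=(hidden_neurons, n_features + 1))
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend bias column
    H = np.tanh(Xb @ W.T)                            # hidden activations
    Hb = np.hstack([np.ones((H.shape[0], 1)), H])    # bias for output layer
    # Output weights obtained in closed form by least squares; the paper
    # may instead use a regularized pseudo-inverse.
    beta, *_ = np.linalg.lstsq(Hb, y, rcond=None)
    return beta                                      # the DT descriptor
```

In such a scheme, descriptors obtained from the several RNN configurations mentioned in the abstract (hidden layers of 4, 24, or 29 neurons; 4 or 10 input measures) could be concatenated to form the final feature vector fed to a classifier.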

Dates and versions

hal-03431533, version 1 (16-11-2021)

Identifiers

  • HAL Id: hal-03431533, version 1

Cite

Lucas C. Ribas, Jarbas Joaci de Mesquita Sá Junior, Antoine Manzanera, Odemir M. Bruno. Learning Graph Representation with Randomized Neural Network for Dynamic Texture Classification. Applied Soft Computing, 2021. ⟨hal-03431533⟩