Online Distributed Learning over Graphs with Multitask Graph-Filter Models
Abstract
In this work, we are interested in adaptive and distributed estimation of graph filters from streaming data. We formulate this problem as a consensus estimation problem over graphs, which can be addressed with diffusion LMS strategies. Most popular graph-shift operators, such as those based on the graph Laplacian or the adjacency matrix, are not energy preserving. This may result in an ill-conditioned estimation problem and reduce the convergence speed of the distributed algorithms. To address this issue and improve the transient performance, we introduce a preconditioned graph diffusion LMS algorithm. We also propose a computationally efficient version of this algorithm that approximates the Hessian matrix with local information. Performance analyses in the mean and mean-square sense are provided. Finally, we consider a more general problem in which the filter coefficients to be estimated may vary over the graph. To avoid a large estimation bias, we introduce an unsupervised clustering method that splits the global estimation problem into local ones. Numerical results show the effectiveness of the proposed algorithms and validate the theoretical results.
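To make the setting concrete, the snippet below is a minimal sketch of an adapt-then-combine diffusion LMS recursion for graph-filter identification, assuming the standard model y_k(i) = sum_{m=0}^{M-1} h_m [S^m x(i)]_k + v_k(i) with a scaled adjacency matrix as graph-shift operator S and uniform combination weights over each neighborhood. It does not implement the preconditioning, local Hessian approximation, or clustering steps described above, and all variable names and parameter values are hypothetical.

```python
import numpy as np

# Illustrative sizes and step size (assumptions, not the paper's settings)
rng = np.random.default_rng(0)
N, M, mu, n_iter = 20, 5, 0.05, 2000        # nodes, filter order, step size, iterations

# Random undirected graph; scaled adjacency as a (non energy-preserving) shift operator
A = (rng.random((N, N)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
S = A / np.abs(np.linalg.eigvalsh(A)).max()

h_true = rng.standard_normal(M)             # ground-truth graph-filter coefficients
S_pows = [np.linalg.matrix_power(S, m) for m in range(M)]

# Each node combines over its neighborhood (self included) with uniform weights
nbrs = [np.flatnonzero(A[k] + np.eye(N)[k]) for k in range(N)]
W = np.zeros((N, M))                        # local estimates of the filter coefficients

for _ in range(n_iter):
    x = rng.standard_normal(N)                          # streaming graph signal
    Z = np.stack([Sm @ x for Sm in S_pows], axis=1)     # Z[k] = regressor of node k
    y = Z @ h_true + 0.05 * rng.standard_normal(N)      # noisy local measurements

    # Adapt: local LMS update at every node
    Psi = W + mu * (y - np.sum(Z * W, axis=1))[:, None] * Z
    # Combine: average the intermediate estimates over each neighborhood
    W = np.stack([Psi[nbrs[k]].mean(axis=0) for k in range(N)])

print("mean-square deviation:", np.mean((W - h_true) ** 2))
```

The preconditioned variant discussed in the abstract would, under these assumptions, replace the scalar step size in the adapt step with a matrix that compensates the conditioning induced by the powers of S.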