TCMP: end-to-end topologically consistent magnitude pruning for miniaturized graph convolutional networks
Abstract
Magnitude pruning is one of the mainstream methods in lightweight architecture design; its goal is to extract subnetworks with the largest weight connections. This method is known to be successful, but under very high pruning regimes it suffers from topological inconsistency, which renders the extracted subnetworks disconnected and hinders their generalization ability. In this paper, we devise TCMP, a novel end-to-end Topologically Consistent Magnitude Pruning method that extracts subnetworks while guaranteeing their topological consistency. The latter ensures that only accessible and co-accessible (i.e., impactful) connections are kept in the resulting lightweight networks. Our solution is based on a novel reparametrization and two supervisory bi-directional networks which implement accessibility/co-accessibility and guarantee that only connected subnetworks are selected during training. This solution significantly enhances generalization under very high pruning regimes, as corroborated through extensive experiments, involving graph convolutional networks, on the challenging task of skeleton-based action recognition.
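To make the terms concrete, the following is a minimal, hypothetical sketch (not the paper's implementation) of magnitude pruning over a stack of dense layers, together with a check that every kept connection is accessible (reachable from the input) and co-accessible (able to reach the output). All function names, shapes, and the 5% keep ratio are illustrative assumptions.

```python
# Hypothetical illustration of magnitude pruning and a topological-consistency
# check; this is NOT the TCMP method from the paper, only the concepts it builds on.
import numpy as np

def prune_by_magnitude(weights, keep_ratio):
    """Return binary masks keeping the largest-magnitude entries of each layer."""
    masks = []
    for w in weights:
        k = max(1, int(keep_ratio * w.size))
        threshold = np.sort(np.abs(w), axis=None)[-k]   # k-th largest magnitude
        masks.append((np.abs(w) >= threshold).astype(bool))
    return masks

def topologically_consistent(masks):
    """True iff every kept connection is both accessible and co-accessible."""
    # Forward sweep: units reachable from the input layer.
    reachable = np.ones(masks[0].shape[0], dtype=bool)
    accessible = []
    for m in masks:
        acc = m & reachable[:, None]       # connection usable only if its source is reachable
        reachable = acc.any(axis=0)        # a unit is reachable if any kept connection enters it
        accessible.append(acc)
    # Backward sweep: units that can still reach the output layer.
    coreachable = np.ones(masks[-1].shape[1], dtype=bool)
    for m, acc in zip(reversed(masks), reversed(accessible)):
        coacc = m & coreachable[None, :]   # connection useful only if its target reaches the output
        coreachable = coacc.any(axis=1)
        if not np.array_equal(m, acc & coacc):
            return False                   # some kept weight is disconnected
    return True

# Example: two random layers pruned to 5% of their weights; at such high
# pruning rates the surviving mask is often disconnected.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 32)), rng.standard_normal((32, 8))]
masks = prune_by_magnitude(weights, keep_ratio=0.05)
print("topologically consistent:", topologically_consistent(masks))
```

Under this reading, TCMP's contribution is to enforce such a consistency constraint during training itself, via the reparametrization and supervisory networks mentioned above, rather than checking it after pruning.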