DeepPrism: Channel Convolution for Lightweight Generative Models
Abstract
In this paper, we introduce DeepPrism, a novel network architecture for generative models that addresses the redundancy of convolutional filters. DeepPrism achieves a parameter-efficiency improvement of up to 1000 times on generative models: compared with conventional CNNs, the dependence of the parameter count on the network's width is reduced from quadratic to constant, and training and inference time and memory are reduced as well. The main novelty lies in a geometric property, namely translation equivariance along the channel axis, which gives rise to a convolution structure over the channels and generalizes naturally to the attention mechanism. Compared with the Latent Diffusion Model, DeepPrism produces similar qualitative and quantitative results with a greatly reduced number of parameters. We also experiment with other generative models, including autoencoders and single-image diffusion models, to demonstrate the generality of the method.
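To make the quadratic-to-constant claim concrete, below is a minimal sketch of one possible reading of convolution over the channel axis; it is not the authors' reference implementation. The assumption is that the channel axis is treated like a spatial axis, with a single small kernel slid along it, so the parameter count no longer depends on the number of channels. The class name `ChannelConv` and the kernel sizes are illustrative choices.

```python
# A minimal sketch, assuming "convolution along the channels" means sliding
# one shared kernel over the channel axis (treated like a spatial axis).
import torch
import torch.nn as nn

class ChannelConv(nn.Module):
    """Convolve over (channel, height, width) with one shared 3-D kernel.

    A standard Conv2d with C input and C output channels costs C * C * k * k
    parameters (quadratic in the width C). Here the kernel has a fixed shape
    (kc, kh, kw) regardless of C, i.e. a constant parameter count.
    """

    def __init__(self, kc=3, kh=3, kw=3):
        super().__init__()
        # One input "plane" and one output "plane": channel mixing is done by
        # the kc dimension of the kernel instead of a dense C x C map, which
        # is translation-equivariant along the channel axis.
        self.conv = nn.Conv3d(1, 1, kernel_size=(kc, kh, kw),
                              padding=(kc // 2, kh // 2, kw // 2))

    def forward(self, x):            # x: (N, C, H, W)
        x = x.unsqueeze(1)           # -> (N, 1, C, H, W)
        x = self.conv(x)             # equivariant along C, H, W
        return x.squeeze(1)          # -> (N, C, H, W)

if __name__ == "__main__":
    layer = ChannelConv()
    y = layer(torch.randn(2, 64, 32, 32))
    print(y.shape)                                      # torch.Size([2, 64, 32, 32])
    print(sum(p.numel() for p in layer.parameters()))   # 28, independent of C
```

Under this reading, doubling the width C leaves the layer's parameter count unchanged, which is consistent with the constant dependency stated above.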