Conference paper, Year: 2022

Improved Transformer-Based Implicit Latent GAN with Multi-headed Self-attention for Unconditional Text Generation

Ziyun Jiao
  • Role: Author
  • PersonId: 1405994
Xin Kang
  • Role: Author
  • PersonId: 1275130

Abstract

Generative Adversarial Networks (GANs) are widely used in computer vision, for tasks such as image generation. In recent years, GANs have also been applied to unconditional text generation. In this work, we improve TILGAN for unconditional text generation by refactoring the generator: we replace its Linear and Batch Normalization (BN) layers with Multi-headed Self-attention to give the generator stronger text generation capabilities. Our model consists of three components: a Transformer autoencoder, a Multi-headed Self-attention based generator, and a linear discriminator. The encoder of the Transformer autoencoder produces the distribution of real samples, and the decoder maps real or generated sentence vectors back into text. The loss functions for the autoencoder and the GAN are cross-entropy and KL divergence, respectively. On the MS COCO dataset, the proposed model achieves a better BLEU score than TILGAN. Our ablation experiments also demonstrate the effectiveness of the proposed generator network for unconditional text generation.
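The abstract describes swapping the generator's Linear and BN layers for multi-headed self-attention that operates on sentence latent vectors. The sketch below is purely illustrative and is not the authors' implementation: the module name, dimensions, sequence layout, and the use of PyTorch's nn.TransformerEncoderLayer are all assumptions made to show what a self-attention based generator of this kind could look like.

```python
# Hypothetical sketch (not the paper's code): a generator mapping noise to
# sentence latent vectors with multi-headed self-attention instead of
# Linear + BatchNorm layers. Dimensions below are illustrative only.
import torch
import torch.nn as nn


class SelfAttentionGenerator(nn.Module):
    def __init__(self, noise_dim=128, latent_dim=256, num_heads=4, num_layers=2):
        super().__init__()
        # Project noise tokens into the latent space used by the autoencoder.
        self.input_proj = nn.Linear(noise_dim, latent_dim)
        # Stack of multi-headed self-attention blocks (Transformer encoder layers).
        block = nn.TransformerEncoderLayer(
            d_model=latent_dim, nhead=num_heads, batch_first=True
        )
        self.attention = nn.TransformerEncoder(block, num_layers=num_layers)

    def forward(self, z):
        # z: (batch, seq_len, noise_dim) noise tokens.
        h = self.input_proj(z)
        # Self-attention lets each generated latent token attend to the others.
        return self.attention(h)


# Usage: sample noise and produce fake sentence vectors for the discriminator.
z = torch.randn(8, 16, 128)                 # batch of 8, 16 latent tokens each
fake_latents = SelfAttentionGenerator()(z)  # shape: (8, 16, 256)
```

Compared with a Linear + BN stack, such attention blocks let the generated latent tokens condition on one another, which is the kind of added capacity the abstract credits for the improved BLEU score over TILGAN.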
File under embargo until Wednesday, 1 January 2025.

Dates and versions

hal-04666458, version 1 (01-08-2024)

Cite

Fuji Ren, Ziyun Jiao, Xin Kang. Improved Transformer-Based Implicit Latent GAN with Multi-headed Self-attention for Unconditional Text Generation. 5th International Conference on Intelligence Science (ICIS), Oct 2022, Xi'an, China. pp.166-173, ⟨10.1007/978-3-031-14903-0_18⟩. ⟨hal-04666458⟩