
Style Transformer: Unpaired Text Style Transfer Without Disentangled Latent Representation

Ning Dai, Jianze Liang, Xipeng Qiu, Xuanjing Huang. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019 – 68 citations

[Paper]
Tags: ACL, Model Architecture

Disentangling content and style in the latent space is the prevailing approach to unpaired text style transfer. However, most current neural models suffer from two major issues: 1) it is difficult to completely strip the style information from the semantics of a sentence; 2) the recurrent neural network (RNN) based encoder and decoder, mediated by the latent representation, cannot handle long-range dependencies well, resulting in poor preservation of non-stylistic semantic content. In this paper, we propose the Style Transformer, which makes no assumption about the latent representation of the source sentence and leverages the attention mechanism of the Transformer to achieve better style transfer and better content preservation.
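The core architectural idea is to condition a standard Transformer encoder–decoder directly on the target style instead of routing the sentence through a disentangled latent code. The PyTorch sketch below illustrates one way to do this; the class name `StyleTransformerSketch`, the hyperparameters, and the choice to prepend the style embedding as an extra source token are illustrative assumptions, not the authors' released implementation (which, since no parallel data is available, is additionally trained against a style discriminator).

```python
import torch
import torch.nn as nn

class StyleTransformerSketch(nn.Module):
    """Minimal sketch: instead of disentangling content and style in a
    latent space, condition a standard Transformer directly on a learned
    style embedding prepended to the source token embeddings.
    Positional encodings and attention masks are omitted for brevity."""

    def __init__(self, vocab_size, num_styles=2, d_model=256,
                 nhead=4, num_layers=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # One learned embedding per target style (e.g. positive/negative).
        self.style_emb = nn.Embedding(num_styles, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out_proj = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids, style_ids):
        # src_ids: (batch, src_len); tgt_ids: (batch, tgt_len);
        # style_ids: (batch,) index of the desired target style.
        src = self.tok_emb(src_ids)
        style = self.style_emb(style_ids).unsqueeze(1)  # (batch, 1, d_model)
        # Prepend the style embedding as an extra source position, so
        # attention can consult it at every layer instead of squeezing
        # all information through a fixed latent bottleneck.
        src = torch.cat([style, src], dim=1)
        tgt = self.tok_emb(tgt_ids)
        hidden = self.transformer(src, tgt)
        return self.out_proj(hidden)  # (batch, tgt_len, vocab_size)
```

Because the style embedding participates in attention at every layer, swapping `style_ids` at inference time retargets the output style while the source tokens remain fully visible to the decoder, which is what allows content to be preserved without a disentangled latent representation.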

Similar Work