An Integrated Deep Generative Model for Text Classification and Generation

Joint Authors

Wu, Qingbiao
Wang, Zheng

Source

Mathematical Problems in Engineering

Issue

Vol. 2018, Issue 2018 (31 Dec. 2018), pp. 1-8, 8 p.

Publisher

Hindawi Publishing Corporation

Publication Date

2018-08-19

Country of Publication

Egypt

No. of Pages

8

Main Subjects

Civil Engineering

Abstract EN

Text classification and generation are two important tasks in the field of natural language processing.

In this paper, we address both tasks with a Variational Autoencoder (VAE), a powerful deep generative model.
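For context, a VAE is trained by maximizing the evidence lower bound (ELBO) on the data log-likelihood; in standard notation, with encoder q_phi and decoder p_theta, the objective is

    \mathcal{L}(x;\theta,\phi) = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - \mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p(z)\big)

The integrated model described below additionally trains a classifier on the latent code, which would add a supervised classification term to this objective; the abstract does not spell out the exact formulation.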

A self-attention mechanism is introduced into the encoder.

The modified encoder extracts a global feature of the input text to produce the hidden code, and a neural network classifier is trained on this hidden code to perform classification.
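A minimal sketch (PyTorch) of how such an encoder could look, assuming a GRU text encoder with an added self-attention layer and an MLP classifier on the latent code; module names and sizes are illustrative assumptions, not the authors' exact architecture:

    import torch
    import torch.nn as nn

    class AttentiveVAEEncoder(nn.Module):
        def __init__(self, vocab_size, embed_dim=256, hidden_dim=512,
                     latent_dim=32, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            # Self-attention over the RNN states to form a global text feature.
            self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
            self.to_mu = nn.Linear(hidden_dim, latent_dim)
            self.to_logvar = nn.Linear(hidden_dim, latent_dim)
            # Classifier trained on the hidden (latent) code.
            self.classifier = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, num_classes)
            )

        def forward(self, tokens):
            h, _ = self.rnn(self.embed(tokens))       # (B, T, H)
            a, _ = self.attn(h, h, h)                 # self-attention over positions
            g = a.mean(dim=1)                         # pooled global feature
            mu, logvar = self.to_mu(g), self.to_logvar(g)
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
            logits = self.classifier(z)               # class prediction from the code
            return z, mu, logvar, logits

During training, the classification loss on these logits would be combined with the VAE reconstruction and KL terms.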

In addition, the label of the text is fed explicitly into the decoder to strengthen the category information, which helps with text generation.
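A corresponding sketch of a label-conditioned decoder, assuming the class label is embedded and concatenated with the latent code at every decoding step; this layout is again illustrative rather than the paper's exact decoder:

    import torch
    import torch.nn as nn

    class ConditionalDecoder(nn.Module):
        def __init__(self, vocab_size, embed_dim=256, hidden_dim=512,
                     latent_dim=32, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.label_embed = nn.Embedding(num_classes, latent_dim)
            self.rnn = nn.GRU(embed_dim + 2 * latent_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens, z, label):
            # Broadcast the latent code and label embedding across time steps.
            cond = torch.cat([z, self.label_embed(label)], dim=-1)    # (B, 2*latent)
            cond = cond.unsqueeze(1).expand(-1, tokens.size(1), -1)   # (B, T, 2*latent)
            x = torch.cat([self.embed(tokens), cond], dim=-1)
            h, _ = self.rnn(x)
            return self.out(h)                                        # next-token logits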

Experiments show that our model achieves competitive classification results and that the generated text is realistic.

Thus, the proposed integrated deep generative model can serve as an alternative for both tasks.

American Psychological Association (APA)

Wang, Z., & Wu, Q. (2018). An Integrated Deep Generative Model for Text Classification and Generation. Mathematical Problems in Engineering, Vol. 2018, no. 2018, pp. 1-8.
https://search.emarefa.net/detail/BIM-1208926

Modern Language Association (MLA)

Wang, Zheng, and Qingbiao Wu. "An Integrated Deep Generative Model for Text Classification and Generation." Mathematical Problems in Engineering, no. 2018 (2018), pp. 1-8.
https://search.emarefa.net/detail/BIM-1208926

American Medical Association (AMA)

Wang Z, Wu Q. An Integrated Deep Generative Model for Text Classification and Generation. Mathematical Problems in Engineering. 2018;2018:1-8.
https://search.emarefa.net/detail/BIM-1208926

Data Type

Journal Articles

Language

English

Notes

Includes bibliographical references

Record ID

BIM-1208926