
Context-FOFE Based Deep Learning Models for Text Classification and Modeling

Date

2018-03-01

Authors

Lin, Yuping

Abstract

Text classification is a fundamental task in natural language processing. Many recently proposed deep learning models leverage context information in documents and have achieved great success. However, most of these models rely on complicated recurrent structures to handle variable-length text and to record context information, which makes them hard to train. We therefore propose a simple and efficient encoding scheme, called context-FOFE, that encodes the context of variable-length documents into fixed-size representations. The encoding is unique and reversible for any text sequence. Based on the encoded document representations, we further use two feed-forward neural network models and a generative HOPE model for text classification and modeling. We tested the models on the 20 Newsgroups text classification dataset and the IMDB sentiment analysis dataset. Experimental results show that our models achieve performance competitive with the best existing models while using a much simpler context encoding mechanism and network structure.
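
For readers unfamiliar with FOFE, the sketch below illustrates the plain fixed-size ordinally-forgetting encoding (Zhang et al., 2015) that context-FOFE builds on: a sequence of one-hot word vectors is folded into a single fixed-size vector via the recursion z_t = alpha * z_{t-1} + e_t. The function name fofe_encode, the forgetting factor alpha, and the toy example are illustrative assumptions; this is not the thesis's exact context-FOFE scheme.

    # Minimal sketch of plain FOFE; context-FOFE (the thesis's variant) is not shown here.
    import numpy as np

    def fofe_encode(word_ids, vocab_size, alpha=0.5):
        """Encode a variable-length word-id sequence into one fixed-size vector.

        Recursion: z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector
        of word t. For 0 < alpha <= 0.5 the code is unique, so the original
        sequence can be recovered from the vector (the encoding is reversible).
        """
        z = np.zeros(vocab_size)
        for w in word_ids:
            z = alpha * z      # decay earlier words by the forgetting factor
            z[w] += 1.0        # add the current word's one-hot contribution
        return z

    # Example: encode the toy sequence [2, 0, 3] over a 5-word vocabulary.
    print(fofe_encode([2, 0, 3], vocab_size=5, alpha=0.5))

The key property exploited for feed-forward models is that the output dimension depends only on the vocabulary size, not on the document length.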

Keywords

Computer science
