Text categorization using syllables and recurrent neural networks

Date

2017-07

Advisor

Kozat, Süleyman Serdar

Language

English

Abstract

We investigate multi-class categorization of short texts. To this end, in the third chapter, we introduce highly efficient dimensionality reduction techniques suitable for online processing of high-dimensional feature vectors generated from freely worded text. Although text processing and classification are highly important for many applications, such as emotion recognition and advertisement selection, online classification and regression algorithms over text are limited by the high-dimensional vectors needed to represent natural text inputs. We overcome these limitations by showing that randomized projections and piecewise linear models can be efficiently leveraged to significantly reduce the computational cost of feature vector extraction from tweets. We demonstrate our results on tweets collected from a real-life case study, where the tweets are freely worded and unstructured. We implement several well-known machine learning algorithms as well as novel regression methods and demonstrate that we can significantly reduce the computational complexity with an insignificant change in classification and regression performance.

Furthermore, in the fourth chapter, we introduce a simple and novel technique for short text classification based on LSTM neural networks. Our algorithm obtains two distributed representations of a short text to be used in the classification task. We derive one representation by processing the vector embeddings of the words consecutively in the LSTM structure and averaging the outputs produced at each time step of the network. We obtain the other representation by averaging the distributed representations of the words in the short text. For classification, a weighted combination of both representations is calculated. Moreover, for the first time in the literature, we propose using syllables to better exploit the sequential nature of the data. We derive distributed representations of the syllables and feed them to an LSTM network to obtain the distributed representation of the short text. A softmax layer is used to calculate the categorical distribution at the end. Classification performance is evaluated in terms of the AUC measure. Experiments show that utilizing two distributed representations improves classification performance by 2%. Furthermore, we demonstrate that using distributed representations of syllables in short text categorization also provides performance improvements.
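
The dimensionality reduction idea from the third chapter can be illustrated with a small sketch. The following is a minimal NumPy example, not the thesis's implementation; the hashed vocabulary size, the target dimension, and the Gaussian projection are illustrative assumptions about how a high-dimensional bag-of-words representation of a tweet might be projected to a low-dimensional feature vector before online classification or regression.

    # Minimal random-projection sketch (assumed settings, not the thesis's code).
    import numpy as np

    HIGH_DIM = 2 ** 15      # assumed size of the hashed bag-of-words space
    LOW_DIM = 128           # assumed reduced feature dimension

    rng = np.random.default_rng(0)
    # Random Gaussian projection matrix, scaled so pairwise distances are
    # approximately preserved after projection.
    projection = rng.normal(0.0, 1.0 / np.sqrt(LOW_DIM), size=(HIGH_DIM, LOW_DIM))

    def tweet_to_features(text: str) -> np.ndarray:
        """Hash tokens into a high-dimensional count vector, then project it."""
        x = np.zeros(HIGH_DIM)
        for token in text.lower().split():
            # Toy hashing choice; Python's built-in hash is process-dependent.
            x[hash(token) % HIGH_DIM] += 1.0
        return x @ projection   # dense LOW_DIM feature vector

    features = tweet_to_features("short freely worded tweet about a product")
    print(features.shape)       # (128,)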
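
The two-representation LSTM classifier from the fourth chapter can be sketched as follows. This is a minimal PyTorch illustration, not the thesis's implementation; the layer sizes, the fixed combination weight, and the projection layer used to mix the two representations are assumptions made here for a self-contained example. Tokens (words or, in the syllable-based variant, syllables) are embedded, the per-time-step LSTM outputs are averaged, the token embeddings themselves are averaged as the second representation, and a weighted combination of the two feeds a softmax layer.

    # Minimal two-representation LSTM sketch (assumed settings, not the thesis's code).
    import torch
    import torch.nn as nn

    class TwoRepresentationClassifier(nn.Module):
        def __init__(self, vocab_size, embed_dim=100, hidden_dim=100,
                     num_classes=5, alpha=0.5):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Assumed linear map so the averaged embeddings live in the same
            # space as the averaged LSTM outputs and can be mixed with alpha.
            self.proj = nn.Linear(embed_dim, hidden_dim)
            self.alpha = alpha
            self.out = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) indices of words or syllables
            emb = self.embed(token_ids)            # (batch, seq, embed_dim)
            lstm_out, _ = self.lstm(emb)           # (batch, seq, hidden_dim)
            rep_lstm = lstm_out.mean(dim=1)        # average of per-step outputs
            rep_emb = self.proj(emb.mean(dim=1))   # average of token embeddings
            combined = self.alpha * rep_lstm + (1 - self.alpha) * rep_emb
            return torch.log_softmax(self.out(combined), dim=-1)

    # Usage sketch: a batch of two short texts, already mapped to token ids.
    model = TwoRepresentationClassifier(vocab_size=1000)
    batch = torch.randint(1, 1000, (2, 12))        # (batch=2, seq_len=12)
    log_probs = model(batch)                       # (2, num_classes)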

Degree Discipline

Electrical and Electronic Engineering

Degree Level

Master's

Degree Name

MS (Master of Science)
