Probabilistic Relational Supervised Topic Modelling using Word Embeddings

Alshehabi Al-Ani, Jabir (ORCID: https://orcid.org/0000-0002-0553-2538) and Fasli, Maria (2019) Probabilistic Relational Supervised Topic Modelling using Word Embeddings. In: Proceedings of the 2018 IEEE International Conference on Big Data (Big Data). IEEE.

Full text not available from this repository.

Abstract

The increasing pace of change in languages affects many applications and algorithms for text processing. Researchers in Natural Language Processing (NLP) have been striving for more generalised solutions that can cope with continuous change. This is even more challenging when applied to short text emanating from social media. Furthermore, social media increasingly exert a major influence on both the development and the use of language. Our work is motivated by the need to develop NLP techniques that can cope with short, informal text as used in social media, alongside the massive proliferation of textual data uploaded daily to these platforms. In this paper, we describe a novel approach to Short Text Topic Modelling that uses word embeddings and takes into account the informality of words in social media text, with the aim of reducing noise in messy text. We present a new algorithm derived from Term Frequency - Inverse Document Frequency (TF-IDF), named Term Frequency - Inverse Context Term Frequency (TF-ICTF). TF-ICTF relies on a probabilistic relation between words and context with respect to time. Our experimental work shows promising results compared with other state-of-the-art methods.
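For reference, the sketch below shows the conventional TF-IDF weighting that TF-ICTF is derived from. It is a minimal illustration only: the exact TF-ICTF formulation (its probabilistic word-context relation over time) is not given in this record, so nothing here should be read as the authors' method.

    # Minimal sketch of the standard TF-IDF baseline from which TF-ICTF is derived.
    # TF-ICTF itself is not reproduced here, since its formula is not stated in this record.
    import math
    from collections import Counter

    def tf_idf(docs):
        """Return per-document {term: tf-idf weight} for a list of tokenised documents."""
        n_docs = len(docs)
        # Document frequency: number of documents containing each term.
        df = Counter(term for doc in docs for term in set(doc))
        scores = []
        for doc in docs:
            tf = Counter(doc)
            total = len(doc)
            scores.append({
                term: (count / total) * math.log(n_docs / df[term])
                for term, count in tf.items()
            })
        return scores

    if __name__ == "__main__":
        corpus = [
            "topic modelling for short text".split(),
            "word embeddings for short informal text".split(),
        ]
        for weights in tf_idf(corpus):
            print(weights)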

Item Type: Book Section
Status: Published
DOI: 10.1109/BigData.2018.8622326
School/Department: School of Science, Technology and Health
URI: https://ray.yorksj.ac.uk/id/eprint/7563