Dependency Based Embeddings for Sentence Classification Tasks

Alexandros Komninos, Suresh Manandhar

Research output: Contribution to conference › Paper › peer-review

Abstract

We compare different word embeddings from a standard window-based skipgram model, a skipgram model trained using dependency context features, and a novel skipgram variant that utilizes additional information from dependency graphs. We explore the effectiveness of the different types of word embeddings for word similarity and sentence classification tasks. We consider three common sentence classification tasks: question type classification on the TREC dataset, binary sentiment classification on Stanford's Sentiment Treebank, and semantic relation classification on the SemEval 2010 dataset. For each task we use three different classification methods: a Support Vector Machine, a Convolutional Neural Network, and a Long Short-Term Memory network. Our experiments show that dependency-based embeddings outperform standard window-based embeddings in most settings, while using dependency context embeddings as additional features improves performance in all tasks regardless of the classification method.
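For readers unfamiliar with dependency contexts, the sketch below illustrates the general idea of extracting (word, context) pairs from a dependency parse, in the style of Levy and Goldberg (2014), which this line of work builds on. It is illustrative only: the paper's own context definition differs (e.g. it collapses prepositions and adds extra features from the dependency graph), and the choice of spaCy and the "en_core_web_sm" model is an assumption, not part of the paper.

```python
# Minimal sketch: dependency-based (word, context) pair extraction.
# Assumes spaCy and the "en_core_web_sm" model are installed; this is
# not the authors' exact pipeline, only the general technique.
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_contexts(sentence):
    """Yield (word, context) pairs from a dependency parse.

    Each dependency arc contributes two pairs: the dependent sees its
    head through the relation, and the head sees the dependent through
    the inverse relation (marked with "-1").
    """
    doc = nlp(sentence)
    for token in doc:
        if token.dep_ == "ROOT":
            continue  # the root has no head arc
        head = token.head
        # dependent -> head context, e.g. ("scientist", "nsubj_discovers")
        yield token.text, f"{token.dep_}_{head.text}"
        # head -> inverse dependent context, e.g. ("discovers", "nsubj-1_scientist")
        yield head.text, f"{token.dep_}-1_{token.text}"

for pair in dependency_contexts("Australian scientist discovers star with telescope"):
    print(pair)
```

These pairs replace the linear window pairs of standard skipgram training, so words are predicted from their syntactic neighbours rather than their positional neighbours.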
Our embeddings and code are available at https://www.cs.york.ac.uk/
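The abstract's strongest finding is that dependency context embeddings help as *additional* features across all three classifiers. A minimal sketch of one plausible setup follows: sentence vectors built from word embeddings are concatenated with vectors built from dependency-context embeddings and fed to an SVM (one of the three classifiers evaluated). The random embedding tables and the averaging-plus-concatenation feature construction are assumptions for illustration; real embeddings would be loaded from the files the authors release.

```python
# Minimal sketch: word embeddings + dependency-context embeddings as
# concatenated features for an SVM. Embedding tables are random
# stand-ins; the feature construction is an assumption, not the
# paper's exact recipe.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
DIM = 50

# Stand-in embedding tables (word -> vector).
vocab = ["what", "is", "the", "capital", "great", "movie"]
word_emb = {w: rng.normal(size=DIM) for w in vocab}
ctx_emb = {w: rng.normal(size=DIM) for w in vocab}

def sentence_vector(tokens, table):
    """Average the embeddings of the tokens present in the table."""
    vecs = [table[t] for t in tokens if t in table]
    return np.mean(vecs, axis=0) if vecs else np.zeros(DIM)

def featurize(tokens):
    # Concatenate the two sentence representations.
    return np.concatenate([sentence_vector(tokens, word_emb),
                           sentence_vector(tokens, ctx_emb)])

# Toy two-sentence training set with binary labels.
sentences = [["what", "is", "the", "capital"], ["great", "movie"]]
labels = [0, 1]
X = np.stack([featurize(s) for s in sentences])
clf = LinearSVC().fit(X, labels)
print(clf.predict(X))
```

The same concatenated features could equally be passed to a CNN or LSTM input layer, which is how the "regardless of the classification method" claim would be tested.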
Original language: English
Pages: 1490-1500
Number of pages: 11
Publication status: Published - 12 Jun 2016
Event: 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT) - San Diego, California, United States
Duration: 12 Jun 2016 – 17 Jun 2016

Conference

Conference: 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT)
Country/Territory: United States
City: San Diego
Period: 12/06/16 – 17/06/16