European Summer School in Logic, Language and Information (ESSLLI), Date: 2008/08/04 - 2008/08/15, Location: Hamburg, Germany

Publication date: 2008-08-01
Pages: 143 - 152
ISBN: 9783642147289
Publisher: Springer, Berlin

Proceedings of the 13th ESSLLI Student Session

Author:

Peirsman, Yves

Keywords:

Computational linguistics, lexical semantics

Abstract:

Word Space Models provide a convenient way of modelling word meaning in terms of a word’s contexts in a corpus. This paper investigates the influence of the type of context features on the kind of semantic information that the models capture. In particular, we make a distinction between semantic similarity and semantic relatedness. It is shown that the strictness of the context definition correlates with the models’ ability to identify semantically similar words: syntactic approaches perform better than bag-of-words models, and small context windows are better than larger ones. For semantic relatedness, however, syntactic features and small context windows are at a clear disadvantage. Second-order bag-of-words models perform below average across the board.
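
The context-window approach described in the abstract can be sketched as a minimal first-order bag-of-words word space model; the toy corpus, window size, and helper names below are illustrative assumptions, not the paper's actual experimental setup:

```python
from collections import defaultdict
import math

def word_vectors(tokens, window=2):
    """Build first-order bag-of-words context vectors: for each target
    word, count the words occurring within `window` positions of it."""
    vectors = defaultdict(lambda: defaultdict(int))
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                vectors[target][tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts)."""
    dot = sum(c * v.get(w, 0) for w, c in u.items())
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Toy corpus: "cat" and "dog" share distributional contexts.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vecs = word_vectors(corpus, window=2)
print(round(cosine(vecs["cat"], vecs["dog"]), 3))  # → 0.866
```

Narrowing or widening `window` changes which contexts are shared, which is the parameter the paper varies when contrasting small and large context windows.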