Digital Library

Title:      USING SEMANTIC RELATIONSHIPS TO ENHANCE NEURAL WORD EMBEDDINGS
Author(s):      Yanqin Yin, Xiaodong Sun, Huanhuan Lv, Pikun Wang, Hongwei Ma and Dongqiang Yang
ISBN:      978-989-8704-23-8
Editors:      Pedro Isaías
Year:      2020
Edition:      Single
Keywords:      Neural Network Word Embeddings, Semantic Relationship, Semantic Similarity
Type:      Short
First Page:      150
Last Page:      154
Language:      English
Paper Abstract:      Neural language models have significantly improved current natural language understanding tasks. However, the distributional semantics derived from neural language models is less competitive in computing semantic relatedness or similarity than taxonomy-based methods. Although current research seeks to exploit handcrafted semantic knowledge from ontologies to improve distributional semantics, it often fails to distinguish the different functions that semantic relationships play in updating or retrofitting neural word embeddings. This paper proposes retrofitting neural word embeddings with the semantic relationships encoded in semantic networks such as WordNet and Roget's thesaurus. We employ hypernym/hyponym relationships to modify the asymmetric distance measure used in retrofitting neural embeddings, which fully transfers the hierarchical semantic information contained in semantic networks. In the evaluation on gold-standard data sets, our method achieved a Spearman correlation of 0.80, about 8% higher than the state-of-the-art methods in the literature.
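The abstract only sketches the method. For readers unfamiliar with retrofitting, below is a minimal sketch of the generic retrofitting scheme (in the style of Faruqui et al., 2015) that approaches like this build on: each word vector is iteratively pulled toward the vectors of its semantic neighbors while staying close to its original embedding. This is not the authors' asymmetric hypernym/hyponym variant, whose distance measure the abstract does not specify; the function name `retrofit`, the weights `alpha`/`beta`, and the toy neighbor graph are illustrative assumptions.

```python
import numpy as np

def retrofit(embeddings, neighbors, alpha=1.0, beta=1.0, iterations=10):
    """Generic retrofitting sketch (Faruqui et al., 2015 style), not the
    paper's asymmetric variant.

    embeddings: dict mapping word -> np.ndarray (original vectors)
    neighbors:  dict mapping word -> list of semantically related words,
                e.g. extracted from WordNet or Roget's thesaurus
    """
    # Start from copies so the original vectors stay available as anchors.
    new_vecs = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, related in neighbors.items():
            if word not in new_vecs:
                continue
            linked = [r for r in related if r in new_vecs]
            if not linked:
                continue
            # Weighted average of the original vector (weight alpha) and the
            # current vectors of the word's semantic neighbors (weight beta).
            total = alpha * embeddings[word]
            for r in linked:
                total += beta * new_vecs[r]
            new_vecs[word] = total / (alpha + beta * len(linked))
    return new_vecs

# Toy usage with a hypothetical two-dimensional vocabulary:
vecs = {
    "dog":    np.array([0.9, 0.1]),
    "canine": np.array([0.7, 0.3]),
    "animal": np.array([0.4, 0.6]),
}
graph = {"dog": ["canine", "animal"], "canine": ["dog"]}
print(retrofit(vecs, graph)["dog"])
```

In this formulation every relation type is weighted identically; the paper's contribution, per the abstract, is to treat hierarchical relations asymmetrically rather than with a single symmetric weight.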
   
