Search (1 result, page 1 of 1)

  • author_ss:"Yen, T.-Y."
  • author_ss:"Chen, H.-H."
  1. Lee, Y.-Y.; Ke, H.; Yen, T.-Y.; Huang, H.-H.; Chen, H.-H.: Combining and learning word embedding with WordNet for semantic relatedness and similarity measurement (2020)
    
    Abstract
    In this research, we propose 3 different approaches to measuring the semantic relatedness between 2 words: (i) boosting the performance of the GloVe word embedding model by removing or transforming abnormal dimensions; (ii) linearly combining the information extracted from WordNet and from word embeddings; and (iii) utilizing word embeddings and 12 kinds of linguistic information extracted from WordNet as features for Support Vector Regression. We conducted our experiments on 8 benchmark data sets and computed Spearman correlations between the outputs of our methods and the ground truth. We report our results together with those of 3 state-of-the-art approaches. The experimental results show that our method outperforms the state-of-the-art approaches on all the selected English benchmark data sets.
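
    The second approach above (linearly combining a WordNet-derived similarity with an embedding-derived one) and the Spearman-based evaluation can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the GloVe file name, the choice of Wu-Palmer similarity, the mixing weight alpha, and the toy word pairs are all placeholders.

      # Sketch: linear combination of WordNet and word-embedding similarity,
      # evaluated with Spearman correlation. Assumptions: a local GloVe text
      # file, NLTK's WordNet (run nltk.download("wordnet") once), Wu-Palmer
      # similarity, and an illustrative mixing weight alpha = 0.5.
      import numpy as np
      from nltk.corpus import wordnet as wn
      from scipy.stats import spearmanr

      def load_glove(path):
          # Each line of a GloVe text file is: word v1 v2 ... vN
          vectors = {}
          with open(path, encoding="utf-8") as f:
              for line in f:
                  parts = line.rstrip().split(" ")
                  vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
          return vectors

      def cosine(u, v):
          return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

      def wordnet_sim(w1, w2):
          # Best Wu-Palmer similarity over all synset pairs; 0.0 if undefined.
          scores = [s1.wup_similarity(s2) or 0.0
                    for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
          return max(scores, default=0.0)

      def combined_sim(w1, w2, vectors, alpha=0.5):
          # alpha * embedding cosine + (1 - alpha) * WordNet similarity
          emb = 0.0
          if w1 in vectors and w2 in vectors:
              emb = cosine(vectors[w1], vectors[w2])
          return alpha * emb + (1.0 - alpha) * wordnet_sim(w1, w2)

      if __name__ == "__main__":
          vectors = load_glove("glove.6B.300d.txt")  # assumed local GloVe file
          # Benchmark format assumed: (word1, word2, human rating) triples.
          pairs = [("car", "automobile", 9.0),
                   ("cup", "coffee", 6.6),
                   ("noon", "string", 0.5)]
          predicted = [combined_sim(a, b, vectors) for a, b, _ in pairs]
          gold = [g for _, _, g in pairs]
          rho, _ = spearmanr(predicted, gold)
          print(f"Spearman correlation: {rho:.3f}")

    In the third approach described in the abstract, such WordNet-derived scores would instead be fed, together with the embeddings, as features into a Support Vector Regression model (for example scikit-learn's sklearn.svm.SVR); that variant is not sketched here.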