
Capturing Word Semantics From Co-occurrences Using Dynamic Mutual Information

MSc Thesis Proposal by:

Yaxin Li

Date: Friday, October 5th, 2018
Time: 11:00 am – 1:00 pm
Location: Room 3105, Lambton Tower

Abstract: Semantic relations between words are crucial for information retrieval and natural language processing tasks. Distributional representations are based on word co-occurrence and have proven successful. Recent neural network approaches such as Word2vec and GloVe are all derived from co-occurrence information. In particular, they are based on Shifted Positive Pointwise Mutual Information (SPPMI). In SPPMI, PMI values are shifted uniformly by a constant, which is typically five. Although SPPMI is effective in practice, it lacks a theoretical explanation and leaves room for improvement. Intuitively, shifting removes co-occurrence pairs that could have co-occurred by chance, i.e., pairs whose expected co-occurrence count is close to their observed count. We propose a new shifting scheme, called Dynamic Mutual Information (DMI), in which the shift is based on the variance of co-occurrences and Chebyshev's Inequality. Intuitively, DMI shifts more aggressively for rare word pairs. We demonstrate that DMI outperforms the state-of-the-art SPPMI in a variety of word similarity evaluation tasks.
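For readers unfamiliar with the baseline, below is a minimal NumPy sketch of the SPPMI construction the abstract refers to, assuming the standard formulation SPPMI(w, c) = max(PMI(w, c) - log k, 0) with k = 5 as the usual shift constant. The function name, the dense-matrix setup, and the toy counts are illustrative assumptions only; the proposed DMI shift is defined in the thesis itself and is not reproduced here.

    import numpy as np

    def sppmi_matrix(cooc, k=5):
        """Turn a word-word co-occurrence count matrix into an SPPMI matrix.

        Assumes the standard formulation SPPMI(w, c) = max(PMI(w, c) - log k, 0).
        """
        cooc = np.asarray(cooc, dtype=float)
        total = cooc.sum()                              # N: total co-occurrence mass
        word_counts = cooc.sum(axis=1, keepdims=True)   # #(w), column vector
        ctx_counts = cooc.sum(axis=0, keepdims=True)    # #(c), row vector

        with np.errstate(divide="ignore", invalid="ignore"):
            pmi = np.log(cooc * total / (word_counts * ctx_counts))
        pmi[~np.isfinite(pmi)] = 0.0                    # unseen pairs carry no signal

        return np.maximum(pmi - np.log(k), 0.0)         # uniform shift, then clip at 0

    # Toy example: a 3-word vocabulary with symmetric co-occurrence counts.
    counts = np.array([[0, 12, 3],
                       [12, 0, 1],
                       [3, 1, 0]])
    print(sppmi_matrix(counts, k=1))   # k = 1: plain PPMI, no shift
    print(sppmi_matrix(counts, k=5))   # k = 5: on a matrix this tiny, the shift
                                       # zeroes everything, showing how aggressively
                                       # a uniform constant prunes weak evidence

The clipping at zero is what keeps the resulting matrix sparse and non-negative; the shift constant controls how much weak co-occurrence evidence is discarded, which is exactly the knob the proposed DMI replaces with a variance-based, pair-specific bound.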

Thesis Committee:
Internal Reader: Dr. Robin Gras
External Reader: Dr. Abdulkadir Hussein
Advisor: Dr. Jianguo Lu

 



csgradinfo@uwindsor.ca
(519) 253-3000

