MSc Thesis Defense by Yaxin Li: Capturing Word Semantics from Co-occurrences Using Dynamic Mutual Information


The School of Computer Science is pleased to present:



Capturing Word Semantics from Co-occurrences Using Dynamic Mutual Information

MSc Thesis Defense by:

Yaxin Li


Date: January 21, 2019

Time: 3:00 pm – 5:00 pm

Location: Room 3105, Lambton Tower


Abstract: Semantic relations between words are crucial for information retrieval and natural language processing tasks. Distributional representations are based on word co-occurrence and have proven successful. Recent neural network approaches such as Word2vec and GloVe are all derived from co-occurrence information; in particular, they are based on Shifted Positive Pointwise Mutual Information (SPPMI). In SPPMI, PMI values are shifted uniformly by a constant, typically five. Although SPPMI is effective in practice, it lacks a theoretical explanation and leaves room for improvement. Intuitively, shifting removes co-occurrence pairs that could have co-occurred by chance, i.e., pairs whose expected co-occurrence count is close to their observed count. We propose a new shifting scheme, called Dynamic Mutual Information (DMI), in which the shift is based on the variance of co-occurrences and Chebyshev's inequality. Intuitively, DMI shifts more aggressively for rare word pairs. We demonstrate that DMI outperforms the state-of-the-art SPPMI on a variety of word similarity evaluation tasks.
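As background for the abstract, the standard SPPMI formulation from the word-embedding literature (which the abstract builds on, not the thesis's DMI method itself) can be sketched as follows. The function name `sppmi` and the dense-matrix layout are illustrative assumptions, not from the thesis.

```python
import numpy as np

def sppmi(cooc, k=5):
    """Shifted Positive PMI from a word-context co-occurrence matrix.

    cooc: (V, V) array of co-occurrence counts.
    k: shift constant (the abstract cites a typical value of five).
    """
    total = cooc.sum()
    word = cooc.sum(axis=1, keepdims=True)   # marginal word counts
    ctx = cooc.sum(axis=0, keepdims=True)    # marginal context counts
    with np.errstate(divide="ignore", invalid="ignore"):
        # PMI(w, c) = log( N * #(w,c) / (#(w) * #(c)) )
        pmi = np.log(cooc * total / (word * ctx))
    pmi[~np.isfinite(pmi)] = 0.0             # zero cells with no co-occurrence
    return np.maximum(pmi - np.log(k), 0.0)  # shift by log k, clip at zero
```

The uniform `np.log(k)` shift is exactly what DMI replaces: instead of one constant for every pair, the shift would depend on the variance of each pair's co-occurrence count, so rare pairs are shifted more aggressively.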



Thesis Committee:

Internal Reader: Dr. Robin Gras

External Reader: Dr. Abdulkadir Hussein

Advisor: Dr. Jianguo Lu

Chair: TBD





All are welcome


For further information on upcoming events, please visit .



Thanks and have a great day!



Christine Weisener
(519) 253-3000 ext. 3716