Event-Centric Natural Language Understanding
Challenges
For humans, successful reading comprehension depends on constructing an event structure that represents what is happening in a text, often referred to in cognitive psychology as the situation model. Building this situation model also involves integrating prior knowledge with the information presented in the text to support reasoning and inference.
Language understanding requires combining relevant evidence, such as contextual knowledge, common sense, and world knowledge, to infer the underlying meaning of a text. It also requires continually updating memory as reading progresses. In machine reading comprehension, a computer could likewise build and update a graph of eventualities as it reads, and question answering could, in principle, operate over such a dynamically updated event graph.
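As a rough illustration of this idea, the sketch below maintains a toy event graph that is updated sentence by sentence and then queried. Everything here is hypothetical: the `EventGraph` class, the pre-extracted (subject, predicate, object) triples, and the lookup-style "question answering" are placeholders for what would, in a real system, be a trained event extractor and a reasoning model.

```python
# Minimal, hypothetical sketch: incrementally build an event graph while
# "reading", then answer a simple question over the accumulated events.
from collections import defaultdict

class EventGraph:
    def __init__(self):
        self.events = []                      # all (subject, predicate, object) triples
        self.by_subject = defaultdict(list)   # index: subject -> [(predicate, object)]

    def update(self, triples):
        """Incorporate newly extracted events as reading progresses."""
        for subj, pred, obj in triples:
            self.events.append((subj, pred, obj))
            self.by_subject[subj].append((pred, obj))

    def query(self, subject):
        """Toy QA: return every event involving the given subject."""
        return self.by_subject.get(subject, [])

# Simulated reading: each "sentence" yields pre-extracted event triples.
stream = [
    [("Alice", "boarded", "train")],
    [("Alice", "arrived_in", "Paris"), ("Bob", "called", "Alice")],
]

graph = EventGraph()
for triples in stream:
    graph.update(triples)   # the graph grows as reading progresses

print(graph.query("Alice"))
# [('boarded', 'train'), ('arrived_in', 'Paris')]
```

The point of the sketch is only the control flow: the graph is revised after every sentence, so a question can be answered at any point during reading rather than only after the whole document has been processed.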
Project Aims
The project aims to develop a knowledge-aware and event-centric framework for NLU, in which:
- event graphs are built as reading progresses;
- event representations are learned with the incorporation of background knowledge;
- implicit knowledge is derived by performing reasoning over event graphs; and
- the comprehension model is developed with built-in interpretability and robustness against adversarial attacks.
Impact
Since spoken and written communication plays a central role in our daily work and life, the proposed framework will have a profound impact on a variety of application areas, including drug discovery, intelligent virtual assistants, automated customer service, smart homes, and question answering in the finance and legal domains, benefiting industries such as healthcare, finance, law, insurance and education.
Participants
Lin Gui, Gabriele Pergola, Xingwei Tan
Publications
- X. Tan, G. Pergola and Y. He. Extracting Event Temporal Relations via Hyperbolic Geometry, Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov. 2021.
- L. Zhu, G. Pergola, L. Gui, D. Zhou and Y. He. Topic-Driven and Knowledge-Aware Transformer for Dialogue Emotion Detection, The 59th Annual Meeting of the Association for Computational Linguistics (ACL), Aug. 2021.
- L. Zhang, D. Zhou, Y. He and Z. Yang. MERL: Multimodal Event Representation Learning in Heterogeneous Embedding Spaces, The 35th AAAI Conference on Artificial Intelligence (AAAI), Feb. 2021.
- R. Wang, D. Zhou and Y. He. Open Event Extraction from Online Texts using a Generative Adversarial Network, Conference on Empirical Methods in Natural Language Processing (EMNLP), Hong Kong, China, Nov. 2019.
- D. Zhou, L. Guo and Y. He. Neural Storyline Extraction Model for Storyline Generation from News Articles, The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL), New Orleans, Louisiana, USA, Jun. 2018.
- D. Zhou, X. Zhang and Y. He. Event Extraction from Twitter using Non-Parametric Bayesian Mixture Model with Word Embeddings, The European Chapter of the Association for Computational Linguistics (EACL), Valencia, Spain, Apr. 2017.
- D. Zhou, L. Chen, X. Zhang and Y. He. Unsupervised Event Exploration from Social Text Streams, Intelligent Data Analysis, 21(4):849-866, 2017.
- D. Zhou, T. Gao and Y. He. Jointly Event Extraction and Visualization on Twitter via Probabilistic Modelling, The 54th Annual Meeting of the Association for Computational Linguistics (ACL), Berlin, Germany, Aug. 2016.
- D. Zhou, H. Xu, X. Dai and Y. He. Unsupervised Storyline Extraction on News Articles, The 25th International Joint Conference on Artificial Intelligence (IJCAI), New York, US, Jul. 2016.