Xu Han

[CV]   [Scholar]

Ph.D. Student
Natural Language Processing Lab,
Department of Computer Science and Technology,
Tsinghua University, Beijing, China.

Email: thu [dot] hanxu13 [at] gmail [dot] com
cst_hanxu13 [at] 163 [dot] com
Office: Room 4-506, FIT Building, Tsinghua University
Beijing, 100084, China
Github: THUCSTHanxu13


About Me

Hi! I am a Ph.D. student in the Department of Computer Science and Technology at Tsinghua University. I am advised by Professor Zhiyuan Liu and affiliated with THUNLP. My research interests lie at the intersection of natural language processing, deep learning, information extraction, and knowledge graphs. Before becoming a Ph.D. student, I received my bachelor's degree from Tsinghua University.


Publications

Denoising Distant Supervision for Relation Extraction via Instance-Level Adversarial Training.

Existing neural relation extraction (NRE) models rely on distant supervision and suffer from the wrong labeling problem. In this paper, we propose a novel adversarial training mechanism over instances for relation extraction to alleviate the noise issue. Compared with previous denoising methods, our method better discriminates informative instances from noisy ones. It is also efficient and flexible enough to be applied to various NRE architectures. Experiments on a large-scale relation extraction benchmark show that our denoising method effectively filters out noisy instances and achieves significant improvements over the state-of-the-art models.

Papers:
Denoising Distant Supervision for Relation Extraction via Instance-Level Adversarial Training.

Adversarial Multi-lingual Neural Relation Extraction (COLING 2018)

Multi-lingual relation extraction aims to find unknown relational facts in text across various languages. Existing models cannot fully capture the consistency and diversity of relation patterns in different languages. To address these issues, we propose an adversarial multi-lingual neural relation extraction (AMNRE) model, which builds both consistent and individual representations for each sentence to account for the consistency and diversity among languages. Further, we adopt an adversarial training strategy to ensure that the consistent sentence representations effectively capture language-consistent relation patterns. Experimental results on real-world datasets demonstrate that our AMNRE model significantly outperforms the state-of-the-art models.

Papers:
Adversarial Multi-lingual Neural Relation Extraction. (COLING 2018)

Code:
[AMNRE]

Neural Knowledge Acquisition via Mutual Attention between Knowledge Graph and Text (AAAI 2018)

We propose a general joint representation learning framework for knowledge acquisition (KA) on two tasks: knowledge graph completion (KGC) and relation extraction (RE) from text. In this framework, we learn representations of knowledge graphs (KGs) and text within a unified parameter-sharing semantic space. To achieve better fusion, we propose an effective mutual attention between KGs and text. This reciprocal attention mechanism highlights important features and leads to better KGC and RE. Unlike conventional joint models, ours requires no complicated linguistic analysis or strict alignments between KGs and text. Experiments on relation extraction and entity link prediction show that models trained under our joint framework significantly outperform other baselines. Most existing methods for KGC and RE can be easily integrated into our framework thanks to its flexible architecture.

Papers:
Neural Knowledge Acquisition via Mutual Attention between Knowledge Graph and Text (AAAI 2018)

Code:
[JointNRE]


Education and Experience

[08.2017-present.] Ph.D. student, Dept. of CS&T, Tsinghua University, Beijing, China.
[08.2013-07.2017.] B.S., Dept. of CS&T, Tsinghua University, Beijing, China.
[08.2010-07.2013.] Dafeng High School, Yancheng, Jiangsu, China.
PC Member: SIGIR 2018, IJCAI 2018.


Honors and Awards

[2017.]  Excellent Graduate, Tsinghua University.
[2017.]  Excellent Bachelor Thesis, Tsinghua University.
[2017.]  Excellent Graduate, Dept. of CS&T, Tsinghua University.
[2016.]  First-class Overall Excellence Scholarship, Tsinghua University.
[2015.]  First-class Science and Technology Innovation Excellence Scholarship, Tsinghua University.
[2014.]  First-class Academic Excellence Scholarship, Tsinghua University.
[2012.]  Silver medal, National Olympiad in Informatics (NOI), China.
[2011.]  Gold medal, National Olympiad in Informatics (NOI), China.


Teaching

[2018.]  Head TA for Object-Oriented Programming, Tsinghua University.
[2017.]  TA for Object-Oriented Programming, Tsinghua University.
[2016.]  TA for Software Engineering, Tsinghua University.


Miscellaneous