JOURNAL ARTICLE

GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction

Wasi Uddin Ahmad, Nanyun Peng, Kai-Wei Chang

Year: 2021  Journal: Proceedings of the AAAI Conference on Artificial Intelligence  Vol: 35 (14)  Pages: 12462-12470  Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Recent progress in cross-lingual relation and event extraction uses graph convolutional networks (GCNs) with universal dependency parses to learn language-agnostic sentence representations, such that models trained on one language can be applied to other languages. However, GCNs struggle to model word pairs that have long-range dependencies or that are not directly connected in the dependency tree. To address these challenges, we propose to utilize the self-attention mechanism, into which we explicitly fuse structural information to learn the dependencies between words at different syntactic distances. We introduce GATE, a Graph Attention Transformer Encoder, and test its cross-lingual transferability on relation and event extraction tasks. We perform experiments on the ACE05 dataset, which includes three typologically different languages: English, Chinese, and Arabic. The evaluation results show that GATE outperforms three recently proposed methods by a large margin. Our detailed analysis reveals that, due to its reliance on syntactic dependencies, GATE produces robust representations that facilitate transfer across languages.
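The abstract's core idea, fusing dependency-tree structure into self-attention so that attention respects syntactic distance, can be sketched as scaled dot-product attention whose scores are masked by pairwise tree distance. This is an illustrative simplification, not the paper's actual implementation: the function name, the single shared projection, and the hard distance cutoff `delta` are assumptions made here for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntax_masked_attention(X, dist, delta=1, penalty=-1e9):
    """Self-attention restricted to token pairs whose syntactic
    (dependency-tree) distance is at most `delta`.

    X    : (n, d) token representations
    dist : (n, n) pairwise distances in the dependency tree
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)              # scaled dot-product scores
    scores = np.where(dist > delta, penalty, scores)  # mask distant pairs
    attn = softmax(scores, axis=-1)            # each row sums to 1
    return attn @ X, attn
```

In the full model one would use separate query/key/value projections and multiple heads, each possibly with its own distance threshold; the sketch above only shows how a syntactic-distance mask enters the attention scores.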

Keywords:
Computer science, Encoder, Artificial intelligence, Transformer, Natural language processing, Relationship extraction, Transferability, Sentence, Graph, Dependency grammar, Margin (machine learning), Dependency (UML), Theoretical computer science, Information extraction, Machine learning

Metrics

Cited By: 71
FWCI (Field Weighted Citation Impact): 7.60
Refs: 88
Citation Normalized Percentile: 0.98 (in top 1% and top 10%)

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Multi-scale cross-attention transformer encoder for event classification

A. Hammad, Stefano Moretti, Mihoko M. Nojiri

Journal: Journal of High Energy Physics  Year: 2024  Vol: 2024 (3)
JOURNAL ARTICLE

Multi-scale cross-attention transformer encoder for event classification

Hammad, Ahmed

Journal: Zenodo (CERN European Organization for Nuclear Research)  Year: 2024
JOURNAL ARTICLE

A Cross-Attention Fusion Based Graph Convolution Auto-Encoder for Open Relation Extraction

Xie Bin-hong, Yu Li, Hongyan Zhao, Lihu Pan, Enhui Wang

Journal: IEEE/ACM Transactions on Audio, Speech and Language Processing  Year: 2022  Vol: 31  Pages: 476-485