Are Noisy Sentences Useless for Distant Supervised Relation Extraction?
Large Scaled Relation Extraction With Reinforcement Learning
Neural Relation Extraction for Knowledge Base Enrichment
Graph Neural Networks with Generated Parameters for Relation Extraction
Attention Guided Graph Convolutional Networks for Relation Extraction
Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
A Walk-based Model on Entity Graphs for Relation Extraction
A Fine-grained and Noise-aware Method for Neural Relation Extraction
An Improved Neural Baseline for Temporal Relation Extraction
Cross-Sentence N-ary Relation Extraction using Lower-Arity Universal Schemas
Easy First Relation Extraction with Information Redundancy
Improving Distantly-Supervised Relation Extraction with Joint Label Embedding
Improving Relation Extraction with Knowledge-attention
Open Relation Extraction: Relational Knowledge Transfer from Supervised Data to Unsupervised Data
Connecting the Dots: Document-level Neural Relation Extraction with Edge-oriented Graphs
Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention
Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning
RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information
Label-Free Distant Supervision for Relation Extraction via Knowledge Graph Embedding
UHop: An Unrestricted-Hop Relation Extraction Framework for Knowledge-Based Question Answering
Long-tail Relation Extraction via Knowledge Graph Embeddings and Graph Convolution Networks
Discovering Correlations between Sparse Features in Distant Supervision for Relation Extraction
Indirect Supervision for Relation Extraction using Question-Answer Pairs