We adopt an unsupervised approach based on sentence encodings, without explicit feature selection, to extract relations from plain text.
DICE, Paderborn University | AKSW, University of Leipzig
Manzoor Ali, Muhammad Saleem, and Axel-Cyrille Ngonga Ngomo
Architecture of US-BERT
Models  |     StanfordNER      |     AllenNLP NER
        |  P   |  R   |  F1    |  P   |  R   |  F1
RelLDA1 | 0.30 | 0.47 | 0.36   |  -   |  -   |  -
Simon   | 0.32 | 0.50 | 0.39   | 0.33 | 0.50 | 0.40
EType+  | 0.30 | 0.62 | 0.40   | 0.31 | 0.64 | 0.42
US-BERT | 0.35 | 0.45 | 0.39   | 0.38 | 0.61 | 0.47

Precision (P), Recall (R), and F1 score of different systems using two NER annotation techniques (StanfordNER and AllenNLP NER) on NYT-FB.
We used pre-trained sentence encodings to extract high-quality relations without any explicit feature selection, and achieved the best F1 and precision scores compared to state-of-the-art (SOTA) unsupervised methods. In future work, we will investigate relation extraction further by applying feature selection and comparing the results with our current approach to assess its impact. We will also compare our approach against other SOTA systems, in particular relation extraction systems based on language models.
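The pipeline the poster describes (encode each entity-pair sentence with a pretrained encoder, then group the encodings without hand-crafted features) can be sketched roughly as below. The BERT encoder is stubbed here with synthetic vectors, and plain k-means clustering is an illustrative assumption, not a detail taken from the poster.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Cluster sentence encodings into k candidate relation groups."""
    # Farthest-point initialization: deterministic and spreads the seeds.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each encoding to its nearest center.
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        # Recompute each center; keep the old one if its cluster is empty.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

# Stand-in for BERT sentence encodings of entity-pair sentences;
# three different means simulate three underlying relation types.
rng = np.random.default_rng(1)
emb = np.vstack([rng.normal(off, 0.1, (5, 8)) for off in (0.0, 1.0, 2.0)])
labels = kmeans(emb, k=3)
```

In the actual system, `emb` would come from a pretrained BERT-style sentence encoder, and each resulting cluster would be read as one candidate relation type.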
This work has been supported by the EU H2020 Marie Skłodowska-Curie project KnowGraphs (860801), the BMBF-funded Eurostars projects E! 113314 FROCKG (01QE19418) and E! 114154 PORQUE (01QE2056C).