Should We Afford Affordances? Injecting ConceptNet Knowledge into BERT-Based Models to Improve Commonsense Reasoning Ability
[ 1 ] Institute of Computing Science, Faculty of Computing and Telecommunications, Poznań University of Technology | [ S ] student | [ P ] staff member
2022
chapter in a scientific monograph / conference paper
English
- Commonsense reasoning
- Natural Language Processing
- Deep Learning
- Knowledge Graph
EN Recent years have shown that deep learning models pre-trained on large text corpora with a language-modelling objective can help solve various tasks requiring natural language understanding. However, many commonsense concepts are underrepresented in online resources because they are too obvious for most humans. To address this problem, we propose the use of affordances – commonsense knowledge that can be injected into models to increase their ability to understand our world. We show that injecting ConceptNet knowledge into BERT-based models improves evaluation scores on the PIQA dataset.
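The record does not describe the paper's exact injection method, but a common way to feed ConceptNet knowledge to a BERT-style model is to verbalise relevant triples into natural language and concatenate them with the input before encoding. The sketch below illustrates this idea under assumed details: the triples, templates, and matching heuristic are invented for illustration (only the relation names `UsedFor` and `CapableOf` are real ConceptNet relations), and are not taken from the paper.

```python
# Illustrative sketch of triple-based knowledge injection: verbalise
# ConceptNet-style triples that mention a word from the question and
# prepend them to the input, separated by BERT's [SEP] token.

# Hand-written example triples; the relations are real ConceptNet
# relation names, but these specific triples are invented.
TRIPLES = [
    ("knife", "UsedFor", "cutting"),
    ("sponge", "UsedFor", "absorbing liquid"),
    ("water", "CapableOf", "putting out fire"),
]

# Simple natural-language templates for verbalising each relation.
TEMPLATES = {
    "UsedFor": "{h} is used for {t}",
    "CapableOf": "{h} is capable of {t}",
}

def verbalise(head, rel, tail):
    """Turn a (head, relation, tail) triple into a plain-English fact."""
    return TEMPLATES[rel].format(h=head, t=tail)

def inject_knowledge(question, triples):
    """Prepend verbalised triples whose head or tail appears in the
    question, producing the knowledge-augmented model input string."""
    words = set(question.lower().split())
    facts = [verbalise(h, r, t) for h, r, t in triples
             if h.lower() in words or t.lower() in words]
    return " [SEP] ".join(facts + [question])

augmented = inject_knowledge("What can I use a knife for?", TRIPLES)
print(augmented)
# The matched fact about "knife" is prepended; the augmented string
# would then be tokenised and passed to the BERT encoder.
```

In practice the retrieval step would query ConceptNet itself (e.g. by matching lemmas of the PIQA goal and solution texts) rather than a hand-written list, and the augmented sequence would be truncated to the encoder's maximum input length.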
20.09.2022
97 - 104
publisher's website
final published version
20
70