Handling Long Sequences in BERT for Question Answering Systems
DOI: https://doi.org/10.59188/eduvest.v5i8.51006

Keywords: Question Answering, Hierarchical Processing, Dynamic Memory Networks, BERT, Long-Context Comprehension

Abstract
Question Answering (QA) systems are an important component of Natural Language Processing applications, enabling efficient information access and improving user experience. Although BERT has advanced the state of the art on QA tasks, its 512-token input limit reduces its effectiveness on long contexts. This study addresses the problem by introducing a new approach that integrates hierarchical processing and dynamic memory networks with BERT. The method segments long contexts into chunks that are processed independently, ensuring that no important information is lost. A dynamic memory module integrates and stores information across chunks, allowing comprehensive context understanding. On the SQuAD v2.0 dataset, the model achieved an Exact Match score of 78.10% and an F1 score of 87.27%, improving on the 81.9% F1 score of standard BERT. This research demonstrates the potential of hierarchical structure and memory networks to overcome BERT's length limitation and adapt it to long-context QA tasks.
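The chunking step described above can be sketched as a sliding window over the tokenized context. This is a minimal illustration only: the paper does not specify its exact chunk size or overlap, so the `max_len=512` and `stride=128` defaults below are assumptions (512 matching BERT's input limit), and the memory-integration step is omitted.

```python
def chunk_tokens(tokens, max_len=512, stride=128):
    """Split a long token sequence into overlapping windows.

    Overlap (stride) lets an answer span that straddles a chunk
    boundary appear whole in at least one window. The defaults here
    are illustrative, not the paper's reported settings.
    """
    if max_len <= stride:
        raise ValueError("max_len must be larger than stride")
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # final window reaches the end of the context
        start += max_len - stride  # advance, keeping `stride` tokens of overlap
    return chunks

# Example: a 1000-token context yields three overlapping 512-token windows.
windows = chunk_tokens(list(range(1000)))
```

Each window would then be encoded by BERT independently, with per-chunk answer candidates reconciled downstream (in the paper's design, by the dynamic memory module).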
Copyright (c) 2025 Andreyanto Pratama, Hasanul Fahmi

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.