Open Access Paper
12 November 2024

Keywords extraction algorithm based on attention mechanism of BERT model
Yanfen Luo
Proceedings Volume 13395, International Conference on Optics, Electronics, and Communication Engineering (OECE 2024); 133954H (2024) https://doi.org/10.1117/12.3049066
Event: International Conference on Optics, Electronics, and Communication Engineering, 2024, Wuhan, China
Abstract
Text keyword extraction is the task of finding the words or phrases in an article that best represent its topic or content. It not only helps readers quickly grasp a text's content, but also supports NLP tasks such as information retrieval, automatic summarization, and text classification. To improve the performance of keyword extraction, this paper proposes AttentionRank, a keyword extraction model built on the self-attention mechanism of the pre-trained BERT language model. The pre-trained BERT model recognizes keywords through self-attention and cross-attention, strengthening the algorithm's ability to understand context. Experimental results show that AttentionRank has clear advantages over the LDA and LSA algorithms in keyword extraction.
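As a rough illustration of the idea behind attention-based keyword extraction (not the paper's actual AttentionRank implementation, whose details are not given in the abstract), tokens can be ranked by the total attention they receive from the other tokens in a sentence: each row of an attention-score matrix is softmax-normalized, and the columns are then summed. A minimal self-contained sketch with a toy score matrix:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def rank_by_attention(tokens, raw_scores):
    """Rank tokens by total attention received.

    raw_scores[i][j] is an unnormalized score that token i assigns to
    token j. Each row is softmax-normalized (as in self-attention),
    then columns are summed, so a token that many other tokens attend
    to scores highly and is treated as a keyword candidate.
    """
    attn = [softmax(row) for row in raw_scores]
    n = len(tokens)
    received = [sum(attn[i][j] for i in range(n)) for j in range(n)]
    return sorted(zip(tokens, received), key=lambda t: -t[1])

# Toy example (hand-made scores, not from a real BERT model):
# most rows assign their largest score to "model".
tokens = ["the", "bert", "model", "works"]
raw = [
    [0.1, 1.0, 2.0, 0.1],
    [0.1, 0.2, 2.5, 0.1],
    [0.1, 1.5, 0.3, 0.2],
    [0.1, 0.8, 2.2, 0.3],
]
ranking = rank_by_attention(tokens, raw)
# "model" ends up ranked first.
```

In a real BERT-based system the attention matrices would come from the model's self-attention layers (averaged over heads and layers) rather than being hand-specified as above.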
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Yanfen Luo "Keywords extraction algorithm based on attention mechanism of BERT model", Proc. SPIE 13395, International Conference on Optics, Electronics, and Communication Engineering (OECE 2024), 133954H (12 November 2024); https://doi.org/10.1117/12.3049066
KEYWORDS
Transformers
Performance modeling
Process modeling
Matrices
Network architectures
Neural networks
Semantics