Journal Article

Research on text sentiment analysis of dual-channel neural network model based on BERT
Cited by: 5
Abstract: Text word vectors generated by Word2Vec, GloVe, and similar models cannot effectively represent polysemous words, and classic neural network models cannot fully extract the semantic features of text. To address these problems, this paper proposes a text sentiment analysis method based on BERT with a dual-channel neural network. The method uses BERT to generate word vectors; as BERT is fine-tuned on the downstream classification task, it produces dynamic representations of the text word vectors. The word vectors are then fed into a dual-channel model built from a CNN and a BiGRU, which extracts the local and global semantic features of the text in parallel; an attention mechanism assigns weight scores to the output features to highlight the sentiment polarity of the text. Finally, the dual-channel output features are fused for sentiment classification. Experiments on a hotel review dataset show that, compared with baseline models for text sentiment analysis, the proposed model improves accuracy and F1 score by 3.7% and 5.1%, respectively.
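The pipeline described in the abstract (BERT embeddings → parallel CNN and BiGRU channels → attention-weighted fusion → classification) can be sketched in miniature. This is an illustrative NumPy sketch only, not the paper's implementation: random vectors stand in for BERT embeddings, the CNN channel is a hand-rolled 1-D convolution with max-over-time pooling, the BiGRU channel is approximated by mean pooling plus a linear map, and all dimensions and the scalar-attention fusion rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for BERT token embeddings: (seq_len, hidden).
seq_len, hidden = 8, 16
x = rng.standard_normal((seq_len, hidden))

def conv_channel(x, out_dim=12, k=3):
    """Local features: a 1-D convolution (kernel size k) with ReLU,
    then max-over-time pooling, as in a TextCNN-style channel."""
    W = rng.standard_normal((x.shape[1] * k, out_dim)) * 0.1
    windows = np.stack([x[i:i + k].ravel() for i in range(len(x) - k + 1)])
    feats = np.maximum(windows @ W, 0.0)   # ReLU over each window
    return feats.max(axis=0)               # max-over-time pooling

def gru_channel(x, out_dim=12):
    """Global features: stand-in for a BiGRU -- here just mean pooling
    over the sequence followed by a tanh-activated linear map."""
    W = rng.standard_normal((x.shape[1], out_dim)) * 0.1
    return np.tanh(x.mean(axis=0) @ W)

def attention_fuse(a, b):
    """Assign softmax weight scores to the two channel outputs,
    then fuse them as a weighted sum (an assumed fusion rule)."""
    v = rng.standard_normal(a.shape[0]) * 0.1   # learned score vector
    scores = np.array([a @ v, b @ v])
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w[0] * a + w[1] * b

fused = attention_fuse(conv_channel(x), gru_channel(x))
logits = fused @ (rng.standard_normal((12, 2)) * 0.1)  # binary sentiment head
probs = np.exp(logits) / np.exp(logits).sum()          # softmax probabilities
```

In the paper's actual model, the attention weights and all projection matrices are learned jointly with the fine-tuned BERT encoder rather than drawn at random.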
Authors: YAN Chiteng, HE Lili (School of Informatics Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018, China)
Source: Intelligent Computer and Applications, 2022, No. 5, pp. 16-22 (7 pages)
Funding: National Key R&D Program of China (2018YFB1700702)
Keywords: text sentiment analysis; BERT model; convolutional neural network (CNN); bidirectional gated recurrent unit (BiGRU); attention mechanism
