
Distant Supervision Relation Extraction Based on Multi-Head Self-Attention and SENet

Cited by: 4
Abstract: Relation extraction identifies entities in text and extracts the semantic relations between them. As a basic component of information extraction, it is widely applied in natural language processing tasks such as knowledge graphs, relational reasoning, and knowledge-based question answering. Convolutional neural networks are commonly used feature extractors for relation extraction, but they have limitations in capturing long-range dependencies within sentences. To address this, a relation extraction model combining SENet with multi-head self-attention is proposed. The model uses multi-head self-attention to compute the relation between each word in a sentence and every other word, capturing the sentence's internal structure; SENet applies attention weighting to the convolution channels to obtain deep semantic information about sentence phrases; and a sentence-level attention mechanism mitigates the noise introduced by distant supervision. The proposed method was evaluated on the public NYT dataset; precision and recall were computed and PR curves plotted, showing a substantial performance improvement over the baseline methods.
Authors: Cai Weilong; Mao Jianhua (School of Communication and Information Engineering, Shanghai University, Shanghai 200444, China)
Source: Electronic Measurement Technology, 2020, No. 21, pp. 132–136 (5 pages)
Keywords: relation extraction; multi-head self-attention; distant supervision
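The two attention mechanisms the abstract combines can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: all shapes, layer sizes, the reduction ratio of 2, and the random initialization are illustrative assumptions.

```python
# Minimal NumPy sketch of multi-head self-attention over a sentence and an
# SE (Squeeze-and-Excitation) block over convolution channels.
# Hyperparameters and weights here are toy assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, num_heads, w_q, w_k, w_v):
    """x: (seq_len, d_model). Each head attends over all positions,
    capturing pairwise word-to-word relations in the sentence."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project to queries/keys/values and split into heads: (heads, seq, d_head)
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    attn = softmax(scores, axis=-1)
    out = attn @ v                                       # (heads, seq, d_head)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

def se_block(feature_map, w1, w2):
    """Squeeze-and-Excitation over convolution channels.
    feature_map: (channels, length). Squeeze via global average pooling,
    excite through a two-layer bottleneck, then rescale each channel."""
    squeeze = feature_map.mean(axis=1)                   # (channels,)
    excite = np.maximum(squeeze @ w1, 0.0)               # ReLU bottleneck
    weights = 1.0 / (1.0 + np.exp(-(excite @ w2)))       # sigmoid in (0, 1)
    return feature_map * weights[:, None]

# Toy run: a 6-token sentence with d_model=8 and 2 heads, then
# 4 convolution channels of length 6 reweighted by the SE block.
x = rng.standard_normal((6, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) * 0.1 for _ in range(3))
ctx = multi_head_self_attention(x, 2, w_q, w_k, w_v)

fmap = rng.standard_normal((4, 6))
w1 = rng.standard_normal((4, 2)) * 0.1   # reduction ratio 2 (assumed)
w2 = rng.standard_normal((2, 4)) * 0.1
reweighted = se_block(fmap, w1, w2)

print(ctx.shape, reweighted.shape)  # → (6, 8) (4, 6)
```

In the full model described by the abstract, the self-attention output would feed the convolutional feature extractor, whose channel maps the SE block reweights, before a sentence-level attention layer aggregates the bag of sentences under distant supervision.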
Related literature: References: 1; Secondary references: 2; Co-citations: 2; Co-cited literature: 63; Citing articles: 4; Secondary citing articles: 17
