SHEL: a semantically enhanced hardware-friendly entity linking method

Abstract: With the help of pre-trained language models, the accuracy of entity linking has improved markedly in recent years. However, most high-performing models require fine-tuning a large pre-trained language model on a large amount of training data, which imposes a hardware barrier for this task. Some researchers have achieved competitive results with less training data through ingenious methods, such as exploiting the information provided by a named entity recognition model. This paper presents a novel semantic-enhancement-based entity linking approach, named semantically enhanced hardware-friendly entity linking (SHEL), which is designed to be hardware friendly and efficient while maintaining good performance. Specifically, SHEL's semantic enhancement consists of three aspects: (1) semantic compression of entity descriptions using a text summarization model; (2) maximizing the captured mention context using an asymmetric heuristic; (3) computing a fixed-size mention representation through pooling operations. Together, these semantic enhancement methods improve the model's ability to capture semantic information while respecting the hardware constraints, and they speed up convergence by more than 50% compared with the strong baseline model proposed in this paper. In terms of performance, SHEL is comparable to previous methods, with superior results on six well-established datasets, even though it is trained with a smaller pre-trained language model as the encoder.
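The abstract does not give implementation details, but the steps it names can be illustrated with a short, self-contained sketch. The snippet below is an assumption-based illustration only: the model names, the token budget, and the 1:3 left/right context split are hypothetical stand-ins, not taken from the paper. It compresses an entity description with an off-the-shelf summarization model, trims an asymmetric context window around the mention, and mean-pools the encoder outputs over the mention span to obtain a fixed-size mention representation.

# Hypothetical sketch of the three semantic-enhancement steps named in the
# abstract; model names, the context budget, and the left/right split are
# assumptions for illustration only.
import torch
from transformers import AutoTokenizer, AutoModel, pipeline

ENCODER_NAME = "distilbert-base-uncased"   # stand-in for "a smaller pre-trained language model"
tokenizer = AutoTokenizer.from_pretrained(ENCODER_NAME)
encoder = AutoModel.from_pretrained(ENCODER_NAME)

# (1) Compress a long entity description with a text summarization model.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

def compress_description(description: str, max_tokens: int = 60) -> str:
    return summarizer(description, max_length=max_tokens, min_length=10)[0]["summary_text"]

# (2) Keep as much mention context as the token budget allows, asymmetrically
# (more right-hand than left-hand context here; the paper's exact heuristic
# is not given in the abstract).
def asymmetric_window(token_ids, start, end, max_len=64, left_ratio=0.25):
    budget = max(max_len - (end - start), 0)
    left_keep = min(start, int(budget * left_ratio))
    right_keep = budget - left_keep
    left = token_ids[start - left_keep:start]
    window = left + token_ids[start:end] + token_ids[end:end + right_keep]
    return window, len(left), len(left) + (end - start)

# (3) Mean-pool the encoder hidden states over the mention span, yielding a
# fixed-size vector regardless of how many sub-word tokens the mention covers.
def pooled_mention(hidden_states, start, end):
    return hidden_states[:, start:end, :].mean(dim=1)   # shape: (1, hidden_dim)

text = "Michael Jordan joined the Chicago Bulls in 1984 ."
ids = tokenizer(text, add_special_tokens=False)["input_ids"]
window, s, e = asymmetric_window(ids, 0, 2)             # "Michael Jordan" = first two sub-word tokens
with torch.no_grad():
    hidden = encoder(torch.tensor([window])).last_hidden_state
print(pooled_mention(hidden, s, e).shape)               # torch.Size([1, 768])

The pooling in step (3) is what keeps the mention representation at a fixed size, so downstream scoring against candidate entities does not depend on mention length.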
Authors: QI Donglin; CHEN Shudong; DU Rong; TONG Da; YU Yong (Institute of Microelectronics of Chinese Academy of Sciences, Beijing 100029, P.R. China; University of Chinese Academy of Sciences, Beijing 100190, P.R. China)
Source: High Technology Letters (EI, CAS), 2024, No. 1, pp. 13-22 (10 pages)
Fund: the Beijing Municipal Science and Technology Program (Z231100001323004).