
U-Net liver and liver tumor segmentation network based on pyramid convolution (cited by 3)
Abstract: Since its introduction, U-Net has been widely used in medical image segmentation. Although the original U-Net already performs well on medical image segmentation tasks, there is still room for improvement in liver and liver tumor segmentation. First, the liver and liver tumors in each CT slice vary in size and shape, so multi-scale information must be extracted; however, every convolution in the U-Net network uses kernels of the same size. The traditional convolutions are therefore replaced with pyramid convolution modules to extract multi-scale information. Second, in some slices the liver and tumor regions contain far fewer samples than the background, so the cross-entropy loss function is combined with the Dice loss function to address this class imbalance. Third, U-Net relies on downsampling, so it is crucial to retain as much useful context information as possible during downsampling; to this end, the CBAM attention module, which provides both spatial attention and channel attention, is introduced. Extensive experiments on the LiTS2017 dataset demonstrate the effectiveness of the proposed model.
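The combined loss described in the abstract can be sketched as a weighted sum of pixel-wise cross-entropy and soft Dice loss. The sketch below is a minimal numpy illustration for a binary liver/background mask; the weighting `alpha` and the function names are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2|P∩T| / (|P| + |T|), with eps for stability."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def bce_loss(pred, target, eps=1e-7):
    """Pixel-wise binary cross-entropy; predictions are probabilities."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def combined_loss(pred, target, alpha=0.5):
    """Weighted sum of cross-entropy and Dice (alpha is an assumed weight)."""
    return alpha * bce_loss(pred, target) + (1 - alpha) * dice_loss(pred, target)

# Toy 2x2 mask: a perfect prediction scores lower than an all-background one,
# which is the behavior the Dice term enforces under class imbalance.
target = np.array([[0.0, 0.0], [1.0, 1.0]])
perfect = target.copy()
empty = np.zeros_like(target)
print(combined_loss(perfect, target) < combined_loss(empty, target))  # True
```

The Dice term rewards overlap directly, so even when foreground pixels are rare it keeps the gradient from being dominated by the background class, which is why the two losses are combined rather than using cross-entropy alone.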
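The CBAM module mentioned in the abstract applies channel attention followed by spatial attention to a feature map. The sketch below shows the data flow in plain numpy on a `(C, H, W)` tensor; the weights are random stand-ins, and the spatial branch replaces CBAM's 7x7 convolution over the pooled maps with a per-pixel gate so the example stays dependency-free — it illustrates the structure, not the paper's exact module.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """CBAM-style channel attention for a (C, H, W) feature map.
    w1: (C//r, C) and w2: (C, C//r) form the shared bottleneck MLP."""
    avg = x.mean(axis=(1, 2))          # (C,) global average pooling
    mx = x.max(axis=(1, 2))            # (C,) global max pooling
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * att[:, None, None]      # reweight each channel in (0, 1)

def spatial_attention(x):
    """CBAM-style spatial attention: pool over channels, gate each pixel.
    (The real module convolves [avg; max] with a 7x7 kernel; a direct
    per-pixel gate is used here to keep the sketch self-contained.)"""
    avg = x.mean(axis=0)               # (H, W)
    mx = x.max(axis=0)                 # (H, W)
    att = sigmoid(avg + mx)            # stand-in for the conv on pooled maps
    return x * att[None, :, :]

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = spatial_attention(channel_attention(x, w1, w2))
print(y.shape)  # (8, 4, 4)
```

Because both attention maps lie in (0, 1), the module can only suppress features, never amplify them; placed before downsampling, this lets the network keep the channels and locations that carry the most useful context, as the abstract describes.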
Authors: GUO Peng; SHAO Jianfei (School of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China)
Source: Modern Electronics Technique (《现代电子技术》), 2023, No. 5, pp. 85-88
Keywords: medical image segmentation; liver segmentation; liver tumor segmentation; U-Net; feature extraction; CBAM attention; experimental analysis

