
An Improved BRISQUE Algorithm Based on Non-zero Mean Generalized Gaussian Model and Global Structural Correlation Coefficients (Cited by: 2)
Abstract: To address the limited descriptive power of the quality-aware features used in the blind/referenceless image spatial quality evaluator (BRISQUE) algorithm, and to further improve its accuracy and robustness, an improved BRISQUE (IBRISQUE) algorithm is proposed. First, a non-zero mean symmetric generalized Gaussian distribution (GGD) model is used to extract quality-aware features from the mean subtracted and contrast normalized (MSCN) coefficient map. Second, a non-zero mean asymmetric generalized Gaussian distribution (AGGD) model is used to extract features that capture local structural distortions from neighboring MSCN coefficients along the horizontal, vertical, main-diagonal, and secondary-diagonal directions. Finally, the Pearson linear correlation coefficients (PLCC) of neighboring MSCN coefficients along the same four directions are added as quality-aware features reflecting global structural distortions. Experiments on the LIVE and TID2013 benchmark databases show that, compared with state-of-the-art image quality assessment algorithms, IBRISQUE achieves higher prediction accuracy while keeping execution efficiency roughly on par with the original BRISQUE algorithm, striking a favorable balance between accuracy and complexity.
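The feature pipeline described in the abstract starts from the MSCN coefficient map and, in its final stage, correlates neighboring coefficients along four orientations. A minimal sketch of those two stages is given below in Python. The GGD/AGGD parameter fitting that produces the paper's remaining features is not reproduced here, and the Gaussian window scale (`sigma=7/6`, matching the 7x7 window commonly used in BRISQUE implementations) is an assumption, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn_coefficients(image, sigma=7/6, c=1.0):
    """Mean subtracted contrast normalized (MSCN) coefficients.

    Local mean and local standard deviation are estimated with a
    Gaussian window; c is a small constant that avoids division by zero.
    """
    image = image.astype(np.float64)
    mu = gaussian_filter(image, sigma)
    var = gaussian_filter(image ** 2, sigma) - mu ** 2
    sigma_map = np.sqrt(np.abs(var))
    return (image - mu) / (sigma_map + c)

def global_plcc_features(mscn):
    """Pearson correlation of neighboring MSCN coefficients along the
    horizontal, vertical, main-diagonal, and secondary-diagonal
    directions -- the global structural features added by IBRISQUE."""
    pairs = [
        (mscn[:, :-1], mscn[:, 1:]),     # horizontal neighbors
        (mscn[:-1, :], mscn[1:, :]),     # vertical neighbors
        (mscn[:-1, :-1], mscn[1:, 1:]),  # main-diagonal neighbors
        (mscn[:-1, 1:], mscn[1:, :-1]),  # secondary-diagonal neighbors
    ]
    return [float(np.corrcoef(a.ravel(), b.ravel())[0, 1]) for a, b in pairs]
```

For a distorted image, the four PLCC values deviate from their natural-image statistics, which is what lets them serve as global structural distortion features alongside the GGD/AGGD parameters.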
Authors: Tang Yiling; Jiang Shunliang; Xu Shaoping (Information Engineering School, Nanchang University, Nanchang 330031)
Source: Journal of Computer-Aided Design & Computer Graphics (EI, CSCD, PKU Core), 2018, No. 2: 298-308 (11 pages)
Funding: National Natural Science Foundation of China (61662044, 61163023, 61379018, 81501560); Natural Science Foundation of Jiangxi Province (20171BAB202017)
Keywords: no-reference image quality assessment; natural scene statistics; non-zero mean generalized Gaussian model; global structural feature
