Deep learning: The revival and transformation of multilayer neural networks
Cited by: 38
Abstract: Artificial intelligence (AI) has entered a new period of vigorous development. This wave of AI is driven by three engines — deep learning (DL), big data, and massively parallel computing — with DL at the core. This article reviews, from a historical perspective, the recent "deep neural network renaissance," then outlines four commonly used deep models: the deep belief network (DBN), the deep autoencoder network (DAN), the deep convolutional neural network (DCNN), and the long short-term memory recurrent neural network (LSTM-RNN). It then briefly surveys the results deep learning has achieved on several important tasks in speech recognition and computer vision. To facilitate the application of DL, several commonly used open-source deep learning platforms are introduced. Finally, the paper offers open-ended commentary on the insights and changes brought by deep learning, and discusses open problems and development trends in the field.
Source: Science & Technology Review (CAS, CSCD, Peking University Core Journal), 2016, No. 14: 60-70 (11 pages)
Keywords: multilayer neural networks; deep belief network (DBN); deep autoencoder network (DAN); deep convolutional neural network (DCNN); long short-term memory recurrent neural network (LSTM-RNN); speech recognition; computer vision
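Of the four deep models the paper surveys, the LSTM-RNN is the one whose mechanism is most compactly illustrated in code. Below is a minimal single-unit LSTM cell step in plain Python — a sketch for intuition only; the gate names follow the standard LSTM formulation, and the weights are illustrative values, not anything taken from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a single-unit LSTM cell (scalars for clarity).

    w maps each gate name to an (input-weight, recurrent-weight, bias)
    triple for the forget (f), input (i), output (o) gates and the
    candidate cell content (g).
    """
    def gate(k, act):
        wx, wh, b = w[k]
        return act(wx * x + wh * h_prev + b)

    f = gate("f", sigmoid)     # forget gate: how much of c_prev to keep
    i = gate("i", sigmoid)     # input gate: how much new content to write
    o = gate("o", sigmoid)     # output gate: how much of the cell to expose
    g = gate("g", math.tanh)   # candidate cell content
    c = f * c_prev + i * g     # additive cell update -> long-range memory
    h = o * math.tanh(c)       # hidden state passed to the next time step
    return h, c

# Run a short input sequence through the cell with illustrative weights.
w = {k: (0.5, 0.3, 0.0) for k in ("f", "i", "o", "g")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, w)
```

The additive form of the cell update `c = f * c_prev + i * g` is what lets gradients flow over long time spans — the property that makes LSTM-RNNs effective for the speech-recognition tasks the paper discusses.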
