
Engineering, 2020, Volume 6, Issue 3. doi: 10.1016/j.eng.2019.11.012

From Brain Science to Artificial Intelligence

a Department of Automation, Tsinghua University, Beijing 100084, China
b Tsinghua-Berkeley Shenzhen Institute, Tsinghua University, Shenzhen 518055, China

Received: 2019-07-11 Revised: 2019-10-22 Accepted: 2019-11-04


Abstract

A review of the history of artificial intelligence (AI) clearly shows that brain science has driven breakthroughs in AI, such as deep learning. Although the development of AI and its applications has surpassed expectations, an enormous gap remains between AI and human intelligence. It is urgent to establish a bridge between brain science and AI research, including a link from brain science to AI and a path from understanding the brain to simulating it. The first steps toward this goal are to explore the secrets of the brain with new brain-imaging technology, to establish a dynamic connection diagram of the brain, and to integrate neuroscience experiments with theory, models, and statistics. Building on these steps, a new generation of AI theories and methods can be studied, and a disruptive model and working mode that advance from machine perception and learning to machine thinking and decision-making can be established. This article discusses the opportunities and challenges of adapting brain science to AI.
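The abstract's claim that brain science has driven advances in AI can be made concrete with the classic Hebbian learning rule ("cells that fire together wire together"), one of the earliest neuroscience principles carried over into artificial neural networks. The following minimal Python sketch is purely illustrative and not taken from the article; the input dimension, learning rate, and random data are assumptions chosen only for demonstration.

    import numpy as np

    # Minimal illustrative sketch of Hebbian learning ("cells that fire together wire together").
    # The input dimension, learning rate, and random inputs are illustrative assumptions only.
    rng = np.random.default_rng(0)
    eta = 0.01                              # learning rate
    w = rng.normal(scale=0.1, size=4)       # small random initial synaptic weights

    for _ in range(100):
        x = rng.normal(size=4)              # presynaptic activity (stand-in for sensory input)
        y = w @ x                           # postsynaptic activity of a single linear unit
        w += eta * y * x                    # Hebbian update: strengthen co-active connections

    print("learned weights:", w)

In this sketch the weight between an input and the unit grows in proportion to how strongly the two are co-active, which is the core idea later refined into modern learning rules for artificial neural networks.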

