
Engineering, 2019, Volume 5, Issue 5. DOI: 10.1016/j.eng.2019.08.005

Complexity at Mesoscales: A Common Challenge in Developing Artificial Intelligence

a State Key Laboratory of Multiphase Complex Systems, Institute of Process Engineering, Chinese Academy of Sciences, Beijing 100190, China
b School of Chemical Engineering, University of Chinese Academy of Sciences, Beijing 100049, China
c School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China

Received: 2019-07-15; Revised: 2019-07-29; Accepted: 2019-08-12; Available online: 2019-08-21


Abstract

Exploring the physical mechanisms of complex systems, and making effective use of them, is the key to dealing with the complexity of the world. The emergence of big data and the growth of computing power, together with improved optimization algorithms, are driving the development of artificial intelligence (AI) based on deep learning. However, deep learning fails to reveal the underlying logic and physical connotations of the problems being solved. Mesoscience provides a conceptual framework for understanding the mechanisms behind the spatiotemporal multiscale structures of complex systems, and its capability for analyzing complex problems has been validated in different fields. This paper proposes a research paradigm for AI that introduces the analytical principles of mesoscience into the design of deep learning models. The aim is to address the fundamental problem that deep learning models are detached from the physical prototype of the problem being solved, and thereby to promote the sustainable development of AI.
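To make the proposed paradigm more concrete, the sketch below illustrates one possible way of encoding the mesoscience principle of compromise in competition as an extra term in a deep-learning training objective. It is a hypothetical illustration only, not the authors' formulation: the names MesoCompromiseLoss, mechanism_a, mechanism_b, and the weights alpha and trade_off are assumptions introduced for this example.

```python
# Hypothetical sketch (not the authors' method): an ordinary data-fitting
# loss augmented with a "compromise in competition" term. All names
# (MesoCompromiseLoss, mechanism_a, mechanism_b, alpha, trade_off) are
# assumptions made for illustration.
import torch
import torch.nn as nn


class MesoCompromiseLoss(nn.Module):
    """Data-fitting loss plus a penalty balancing two competing mechanisms.

    mechanism_a and mechanism_b are batch-wise scalar quantities produced by
    the model that each dominant mechanism would, on its own, tend to
    minimize; the penalty keeps neither mechanism dominant alone.
    """

    def __init__(self, alpha: float = 0.1, trade_off: float = 0.5):
        super().__init__()
        self.alpha = alpha          # weight of the mesoscale term
        self.trade_off = trade_off  # compromise between the two mechanisms
        self.mse = nn.MSELoss()     # conventional data-driven objective

    def forward(self, pred, target, mechanism_a, mechanism_b):
        data_term = self.mse(pred, target)
        # Weighted compromise between the two competing objectives.
        meso_term = (self.trade_off * mechanism_a
                     + (1.0 - self.trade_off) * mechanism_b).mean()
        return data_term + self.alpha * meso_term


# Minimal usage with random tensors standing in for model outputs.
if __name__ == "__main__":
    loss_fn = MesoCompromiseLoss(alpha=0.1, trade_off=0.5)
    pred, target = torch.randn(8, 1), torch.randn(8, 1)
    mech_a, mech_b = torch.rand(8), torch.rand(8)
    print(float(loss_fn(pred, target, mech_a, mech_b)))
```

In the mesoscience literature the compromise between dominant mechanisms is expressed as a multi-objective variational condition rather than a fixed weighting, so the constant trade_off used here is only the simplest possible stand-in for that stability condition.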


