• Neural Networks and Quantum Physics are the Future: Paul Werbos

    April 10, 2023, Noor E Karishma Shaik

    Summary

    Paul Werbos, an American social scientist, highlighted how IEEE conferences were the primary intellectual center of the new technology in the fields of neural networks and quantum computing. Werbos introduced the concept of backpropagation, which is now widely used in neural networks. He believes that quantum mechanics will be the 'next revolution' in our understanding of nature and our ability to control it.

    Paul Werbos, an American social scientist and machine learning pioneer, has recently received the prestigious Frank Rosenblatt Award, IEEE's leading technical field award for technologies covered by the World Congress on Computational Intelligence (WCCI). Prior to this, he was awarded the IEEE Neural Networks Pioneer Award for the discovery of backpropagation and of other basic neural network learning frameworks such as Adaptive Dynamic Programming. Werbos was one of the original three two-year Presidents of the International Neural Network Society, and he served for several years as a program director at the National Science Foundation (NSF). As we congratulate him on his tremendous technological contributions and on keeping the IEEE community enlightened throughout his remarkable career, we have also asked him to share valuable advice for the next generation.

    Werbos highlights how the massive changes now sweeping the internet actually grew out of technical contributions presented at several conferences hosted by IEEE and WCCI. He says, “The neural network field has experienced three massive revolutions, starting from 1987, when IEEE held the first International Conference on Neural Networks (ICNN), leading NSF to create the Neuroengineering research program, which I ran and expanded from 1988 to 2014. This first period of growth already saw a huge proliferation of important new applications in engineering, such as vehicle control, manufacturing, and partnerships with biology. IEEE conferences were the primary intellectual center of the new technology, relying most on generalized backpropagation (including backpropagation over time and backpropagation for deep learning) and on a ladder of neural network control designs, rising up to ‘reinforcement learning’.”
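
    As a concrete, purely illustrative sketch of the generalized backpropagation Werbos pioneered, the short Python example below trains a tiny one-hidden-layer network with a hand-coded backward pass. The XOR data, network size, and learning rate are assumptions chosen for readability, not details drawn from his work.

        # Minimal backpropagation sketch (illustrative assumptions throughout).
        import numpy as np

        rng = np.random.default_rng(0)

        # Toy data: learn XOR (assumed example).
        X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
        y = np.array([[0.], [1.], [1.], [0.]])

        # Small network: 2 inputs -> 4 hidden units -> 1 output.
        W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
        W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        lr = 1.0  # learning rate (assumed)

        for step in range(5000):
            # Forward pass.
            h = sigmoid(X @ W1 + b1)       # hidden activations
            y_hat = sigmoid(h @ W2 + b2)   # network output
            loss = np.mean((y_hat - y) ** 2)

            # Backward pass: the chain rule applied layer by layer.
            d_yhat = 2.0 * (y_hat - y) / len(X)   # dL/dy_hat
            d_z2 = d_yhat * y_hat * (1 - y_hat)   # through output sigmoid
            dW2 = h.T @ d_z2
            db2 = d_z2.sum(axis=0)

            d_h = d_z2 @ W2.T                     # propagate error to hidden layer
            d_z1 = d_h * h * (1 - h)              # through hidden sigmoid
            dW1 = X.T @ d_z1
            db1 = d_z1.sum(axis=0)

            # Gradient-descent update.
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2

        print(f"final loss: {loss:.4f}")

    The backward pass is simply the chain rule applied layer by layer, which is what lets the same idea extend to networks unrolled over time and to very deep architectures.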

    He adds, “The second great revolution resulted from a paradigm-shifting research program, COPN, which grew out of deep dialogue and voting across research program directors at NSF (Emerging Frontiers in Research and Innovation, 2008) [1]. In that program, I funded Andrew Ng and Yann LeCun to test neural networks on crucial benchmark challenges in AI and computer science. After they demonstrated to Google that neural networks could outperform classical methods in AI, Google announced a new product which set off a massive wave of ‘new AI’ in industry and in computer science. In computer science, this added momentum to the latest movement (or third revolution) toward Artificial General Intelligence (AGI), which our reinforcement learning designs already aimed at. However, there are levels and levels of generality, even in ‘general intelligence’. We now speak of Reinforcement Learning and Approximate Dynamic Programming (RLADP).”
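
    For readers unfamiliar with the RLADP family Werbos mentions, the sketch below shows its dynamic-programming core: value iteration on a small, made-up Markov decision process. The states, rewards, and discount factor are assumptions for illustration only, not a rendering of Werbos's specific designs, which rely on learned (e.g., neural network) approximations rather than exact tables.

        # Value iteration on a tiny, made-up Markov decision process
        # (illustrative core of dynamic programming, not Werbos's designs).
        import numpy as np

        n_states, n_actions, gamma = 4, 2, 0.9

        # Transition probabilities P[s, a, s'] and rewards R[s, a] (random, illustrative).
        rng = np.random.default_rng(1)
        P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))
        R = rng.uniform(0.0, 1.0, size=(n_states, n_actions))

        V = np.zeros(n_states)
        for _ in range(200):
            # Bellman backup: Q(s,a) = R(s,a) + gamma * sum_s' P(s'|s,a) * V(s')
            Q = R + gamma * (P @ V)
            V_new = Q.max(axis=1)
            if np.max(np.abs(V_new - V)) < 1e-8:
                break
            V = V_new

        policy = Q.argmax(axis=1)  # greedy policy w.r.t. the converged values
        print("V:", np.round(V, 3))
        print("greedy policy:", policy)

    Approximate Dynamic Programming replaces the exact value table V with a learned approximator, such as a neural network critic, so that the same Bellman-style update can scale to problems far too large to enumerate.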

    Speaking of the latest revolution, Werbos further explains, “Just as adaptive analog networks (neural networks) massively and provably open up capabilities beyond old sequential Turing machines, the quantum extension of RLADP offers power far beyond what the usual Quantum Turing Machines (invented by David Deutsch) can offer. It offers true Quantum AGI, which can multiply capabilities by orders of magnitude in application domains requiring higher intelligence, such as observation of the sky, management of complex power grids, and ‘quantum bromium’ (hard cybersecurity).”

    At several international venues, Werbos has presented detailed roadmaps for rising from the most powerful methods popular in computer science today up to intelligence as general as that of the basic mammal brain [3]. Together with Davis, he has also demonstrated how this view of intelligence fits real-time data from rat brains better than older paradigms for brain modeling. Werbos has also worked on quantum mechanics and other areas of physics [4], and he takes an interest in larger questions relating to consciousness, the foundations of physics, and human potential [2]. According to him, the key challenges to basic scientific understanding are (a) the mind, (b) the universe, and (c) life. He thinks the key broader challenges to humanity are (d) sustainable growth on earth, (e) cost-effective sustainable space settlement, and (f) human potential: growth and learning in brain, soul, and integration (body).

    References:

    [1] Emerging Frontiers in Research and Innovation 2008 (EFRI-2008), National Science Foundation. https://www.nsf.gov/pubs/2007/nsf07579/nsf07579.htm

    [2] “What do Neural Nets and Quantum Theory Tell us about Mind and Reality?” https://arxiv.org/pdf/q-bio/0311006.pdf

    [3] “From ADP to the brain: foundations, roadmap, challenges and research priorities.” In 2014 International Joint Conference on Neural Networks (IJCNN), pp. 107-111. IEEE. https://arxiv.org/pdf/1404.0554.pdf

    [4] “Quantum technology to expand soft computing.” Systems and Soft Computing 4: 200031. https://www.sciencedirect.com/science/article/pii/S2772941922000011

    Article Contribution: Noor E Karishma Shaik (Editor-in-Chief, IMPACT Blog) is currently working as an Academic Researcher at the University of Melbourne.