Friday, January 24, 2025

Scaling up Neuromorphic Computing for More Efficient and Effective AI Everywhere and Anytime

Image: A visual comparison between traditional computing architectures and neuromorphic computing, highlighting the differences in structure and processing style between the two systems.

Brain-Inspired Computing Shows Promise in Reducing AI's Energy Footprint

Neuromorphic computing, which designs computer chips to work like networks of brain cells, is gaining momentum as a potential solution to artificial intelligence's growing energy demands. Unlike traditional computers that separate memory and processing, these systems mimic how the human brain processes information - storing and computing data in the same location, much like how neurons and synapses work together.

Recent advances at UC San Diego and other institutions demonstrate several key advantages:

Energy Efficiency: The NeuRRAM chip processes AI tasks using a fraction of traditional computing power by eliminating the constant shuttling of data between memory and processing units.

Speed: Direct in-memory computing allows faster processing of complex AI tasks, particularly for applications requiring real-time responses like autonomous systems and medical monitoring.

Adaptability: Like the brain, neuromorphic systems can rewire their connections based on new information, making them particularly effective for learning tasks and adapting to new situations.
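
To make the energy argument above concrete, here is a minimal back-of-envelope sketch in Python. The per-operation energy numbers are illustrative assumptions (rough orders of magnitude often quoted for off-chip memory access versus on-chip arithmetic), not measurements of the NeuRRAM chip or any other specific device.

# Back-of-envelope comparison: energy spent moving data off-chip vs. computing on it.
# The energy-per-operation values below are illustrative assumptions, not measurements.

DRAM_ACCESS_PJ = 640.0   # assumed energy to fetch one 32-bit word from DRAM (picojoules)
MAC_PJ = 3.0             # assumed energy for one 32-bit multiply-accumulate on-chip
IN_MEMORY_MAC_PJ = 0.3   # assumed energy for one analog in-memory multiply-accumulate

def von_neumann_energy(n_weights: int) -> float:
    """Every weight is fetched from off-chip memory and then used in one MAC."""
    return n_weights * (DRAM_ACCESS_PJ + MAC_PJ)

def in_memory_energy(n_weights: int) -> float:
    """Weights stay inside the memory array; no per-weight off-chip transfer."""
    return n_weights * IN_MEMORY_MAC_PJ

if __name__ == "__main__":
    n = 10_000_000  # weights touched per inference (illustrative)
    vn = von_neumann_energy(n)
    im = in_memory_energy(n)
    print(f"von Neumann : {vn / 1e6:.1f} microjoules")
    print(f"in-memory   : {im / 1e6:.1f} microjoules")
    print(f"ratio       : {vn / im:.0f}x")

Even with generous assumptions for the conventional side, the data movement term dominates, which is why collapsing memory and compute into one location pays off.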

Commercial applications are emerging across industries:
  • Healthcare: Real-time patient monitoring through energy-efficient wearable devices
  • Smart Cities: Traffic management and infrastructure monitoring
  • Agriculture: Automated crop monitoring and precision farming
  • Manufacturing: Quality control and predictive maintenance

A $4 million National Science Foundation grant to THOR: The Neuromorphic Commons is accelerating development by providing researchers open access to this technology. With AI power consumption projected to double by 2026, the timing is crucial.

Major tech companies including Intel are investing in neuromorphic platforms, suggesting the technology is moving beyond research labs toward practical implementation. Intel's Loihi chip, for example, has demonstrated the ability to solve complex problems while using only a fraction of the power required by traditional processors.

The field still faces challenges in programming these novel systems, but researchers recently outlined a roadmap for scaling up the technology in Nature, indicating a path toward broader adoption.
 

What it is:

Neuromorphic computing is a field of research that builds machines both to explore brain function and to use our improved understanding of the brain to develop better computer hardware and algorithms. It involves the study of spiking neural networks, which can be used to model neural circuits and to develop low-power, bio-inspired AI systems. Researchers are exploring how the principles of approximate computing can be applied to the design of neuromorphic systems at the algorithm, circuit, and device levels, which can yield significant improvements in energy efficiency. Neuromorphic computing is seen as a leading option for re-establishing growth in the IT sector because it can potentially overcome the physical limits of traditional computing approaches. It is also being studied as a way to move beyond the limitations of the von Neumann architecture and traditional Boolean logic by leveraging emerging technologies such as spintronics, and as a route to more efficient machine learning through spiking neural networks.

Scaling up Neuromorphic Computing for More Efficient and Effective AI Everywhere and Anytime

today.ucsd.edu

Story by: Ioana Patringenaru

Neuromorphic computing—a field that applies principles of neuroscience to computing systems to mimic the brain’s function and structure—needs to scale up if it is to effectively compete with current computing methods. In a review published Jan. 22 in the journal Nature, 23 researchers, including two from the University of California San Diego, present a detailed roadmap of what needs to happen to reach that goal. The article offers a new and practical perspective toward approaching the cognitive capacity of the human brain with comparable form factor and power consumption. 

“We do not anticipate that there will be a one-size-fits-all solution for neuromorphic systems at scale but rather a range of neuromorphic hardware solutions with different characteristics based on application needs,” the authors write. 

Applications for neuromorphic computing include scientific computing, artificial intelligence, augmented and virtual reality, wearables, smart farming, smart cities and more. Neuromorphic chips have the potential to outpace traditional computers in energy and space efficiency, as well as performance. This could present substantial advantages across various domains, including AI, health care and robotics. As the electricity consumption of AI is projected to double by 2026, neuromorphic computing emerges as a promising solution.

“Neuromorphic computing is particularly relevant today, when we are witnessing the untenable scaling of power- and resource-hungry AI systems,” said Gert Cauwenberghs, a Distinguished Professor in the UC San Diego Shu Chien-Gene Lay Department of Bioengineering and one of the paper’s coauthors.

Neuromorphic computing is at a pivotal moment, said Dhireesha Kudithipudi, the Robert F. McDermott Endowed Chair at the University of Texas San Antonio and the paper’s corresponding author. “We are now at a point where there is a tremendous opportunity to build new architectures and open frameworks that can be deployed in commercial applications,” she said. “I strongly believe that fostering tight collaboration between industry and academia is the key to shaping the future of this field. This collaboration is reflected in our team of co-authors.” 

Last year, Cauwenberghs and Kudithipudi secured a $4 million grant from the National Science Foundation to launch THOR: The Neuromorphic Commons, a first-of-its-kind research network providing access to open neuromorphic computing hardware and tools in support of interdisciplinary and collaborative research.

In 2022, a neuromorphic chip designed by a team led by Cauwenberghs showed that these chips could be highly dynamic and versatile, without compromising accuracy and efficiency. The NeuRRAM chip runs computations directly in memory and can run a wide variety of AI applications—all at a fraction of the energy consumed by computing platforms for general-purpose AI computing. “Our Nature review article offers a perspective on further extensions of neuromorphic AI systems in silicon and emerging chip technologies to approach both the massive scale and the extreme efficiency of self-learning capacity in the mammalian brain,” said Cauwenberghs.

To achieve scale in neuromorphic computing, the authors propose several key features that must be optimized, including sparsity, a defining feature of the human brain. The brain develops by forming numerous neural connections (densification) before selectively pruning most of them. This strategy optimizes spatial efficiency while retaining information at high fidelity. If successfully emulated, this feature could enable neuromorphic systems that are significantly more energy-efficient and compact. 
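
As a rough illustration of the densify-then-prune strategy described above, the following sketch grows a dense random weight matrix and then keeps only the strongest few percent of connections. It is a simplified toy model under assumed thresholds, not the authors' method, but it shows how pruning yields the sparse, compact connectivity the review argues for.

import numpy as np

# Toy illustration of "densify then prune": start with dense random connectivity,
# then keep only the strongest fraction of synapses. The keep fraction is an
# arbitrary illustrative choice.

rng = np.random.default_rng(0)
n_neurons = 1000
keep_fraction = 0.05  # keep the strongest 5% of connections (illustrative)

# 1) Densification: fully connected random weights.
weights = rng.normal(size=(n_neurons, n_neurons))

# 2) Pruning: zero out everything below the magnitude threshold.
threshold = np.quantile(np.abs(weights), 1.0 - keep_fraction)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

density = np.count_nonzero(pruned) / pruned.size
print(f"remaining connection density: {density:.3f}")
print(f"nonzero synapses: {np.count_nonzero(pruned)} of {pruned.size}")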

“The expandable scalability and superior efficiency derive from massive parallelism and hierarchical structure in neural representation, combining dense local synaptic connectivity within neurosynaptic cores modeled after the brain's gray matter with sparse global connectivity in neural communication across cores modeling the brain's white matter, facilitated through high-bandwidth reconfigurable interconnects on-chip and hierarchically structured interconnects across chips,” said Cauwenberghs.

“This publication shows tremendous potential toward the use of neuromorphic computing at scale for real-life applications. At the San Diego Supercomputer Center, we bring new computing architectures to the national user community, and this collaborative work paves the path for bringing a neuromorphic resource for the national user community,” said Amitava Majumdar, director of the division of Data-Enabled Scientific Computing at SDSC here on the UC San Diego campus, and one of the paper’s coauthors. 

In addition, the authors also call for stronger collaborations within academia, and between academia and industry, as well as for the development of a wider array of user-friendly programming languages to lower the barrier of entry into the field. They believe this would foster increased collaboration, particularly across disciplines and industries.

Neuromorphic Computing at Scale

Dhireesha Kudithipudi and Tej Pandit, University of Texas, San Antonio
Catherine Schuman, University of Tennessee, Knoxville
Craig M. Vineyard, James B. Aimone and Suma George Cardwell, Sandia National Laboratories
Cory Merkel, Rochester Institute of Technology
Rajkumar Kubendran, University of Pittsburgh
Garrick Orchard and Ryad Benosman, Intel Labs
Christian Mayr, Technische Universität Dresden
Joe Hays, U.S. Naval Research Laboratory
Cliff Young, Google DeepMind
Chiara Bartolozzi, Italian Institute of Technology
Amitava Majumdar and Gert Cauwenberghs, University of California San Diego
Melika Payvand, Institute of Neuroinformatics, University of Zürich and ETH Zürich 
Sonia Buckley, National Institute of Standards and Technology
Shruti Kulkarni, Oak Ridge National Laboratory
Hector A. Gonzalez, SpiNNcloud Systems GmbH, Dresden, Germany
Chetan Singh Thakur, Indian Institute of Science, Bengaluru  
Anand Subramoney, Royal Holloway, University of London, Egham
Steve Furber, The University of Manchester


Evolution of Neuromorphic Computing with Machine Learning and Artificial Intelligence

I. Sharma and Vanshika, "Evolution of Neuromorphic Computing with Machine Learning and Artificial Intelligence," 2022 IEEE 3rd Global Conference for Advancement in Technology (GCAT), Bangalore, India, 2022, pp. 1-6, doi: 10.1109/GCAT55367.2022.9971889.

Abstract: In the present century, where artificial intelligence and machine learning are regenerating the world, this paper takes inspiration from brain intelligence and explores new advanced computing known as Neuromorphic Computing. The energy efficiency and accuracy of the brain are remarkable and its retaining ability and grasping tendency are overwhelming. Motivated by this, our paper discusses this benchmarking technology i.e. neuromorphic computing and how it is optimizing machine learning techniques by developing Spiking Neural Networks (SNNs). Neuromorphic computing is constructing computers that have a commonality like the human brain and have all biological functionalities. This research paper discusses the recent neuromorphic systems and also presents the comparative analysis of the neuromorphic systems in which various research areas are discussed like image retrieval, forecasting, prediction and classification. Machine learning algorithms are applied everywhere nowadays and neuromorphic computing enables machine learning techniques to work in an efficient manner. This paper shows the power of neuromorphic computing as an essential epitome to existing Von-Neumann architecture and also aims to make the industry's future bright with more research work in this research area. 

Keywords: Computers; Machine learning algorithms; Neuromorphic engineering; Evolution (biology); Image retrieval; Machine learning; Grasping; Neuromorphic Computing; Machine Learning; Von-Neumann Architecture; Spiking Neural Networks (SNNs)

URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9971889&isnumber=9971808


Date of Conference: 07-09 October 2022
Date Added to IEEE Xplore: 12 December 2022
Conference Location: Bangalore, India
SECTION I.

Introduction

In short, neuromorphic computing can be described as brain-inspired computing: a computing method inspired by the biological structure of the human brain and nervous system, and an approach to designing semiconductor chips around those brain-inspired principles. It is a compelling alternative to the von Neumann architecture of computer systems, aiming to create new systems with a condensed architecture that behaves much like the human brain [1]. Traditional neural networks running on conventional hardware are power-hungry and inefficient, but neuromorphic systems can offer fast computation, low power consumption, and greater efficiency. Neuromorphic computing also has historical roots: it was pioneered by Carver Mead in the 1980s, and after a period of modest progress it has become a central research focus for sustaining the future of artificial intelligence (AI) and machine learning (ML).

In contemporary architectures, either performance or power is prioritized over the other, so only one can be optimized at a time; neuromorphic chips, by contrast, deliver both high performance and low power, which is their main merit. The brain can adapt to all circumstances, adjusting to external conditions through its biological structure and densely connected neurons; similarly, neuromorphic chips are designed to adapt to changes in the system and adjust their output accordingly. Data is thus maintained properly, and memory and processing can occur simultaneously. Therefore, neuromorphic computers are beginning to replace traditional computers, providing benefits such as stochasticity, plasticity, fine-grained parallelism, fast-paced learning, rapid response, and higher adaptability [2].

Fig. 1. Architecture of neuromorphic computing

Figure 1 shows the basic architecture of a neuromorphic system, resembling the human brain and nervous system. Neuromorphic computing has great significance for artificial intelligence and machine learning: it addresses the challenges of the long-established architecture, such as power consumption, storage capacity, and low efficiency, and allows computers to exhibit brain-like performance. Machine learning can be described as the set of techniques that give computers the ability to learn on their own. With the rapidly rising volume of data, machine learning has become essential across industry and is needed to solve problems in areas such as price and load forecasting, computational biology, finance, robotics, healthcare, and Industry 4.0. Likewise, neuromorphic chips would increase the use of machine learning by enabling artificially intelligent systems to work in a well-organized manner. The relationship between machine learning and neuromorphic computing is discussed later in the paper.

In Section II, neuromorphic computing and machine learning are discussed in detail, along with how neuromorphic computing enables machine learning algorithms to work efficiently. The selected research on advancing machine learning with neuromorphic computing is detailed in Section III. A comparative analysis of the discussed research and the results are given in Section IV. The paper concludes in Section V with a discussion of future directions.

SECTION II.

Why Neuromorphic Computing for Machine Learning?

In this section, machine learning techniques and neuromorphic computing technology are discussed to elaborate on how they benefit each other. Neuromorphic systems enable the efficient implementation of ML techniques, and researchers have productively employed neuromorphic technology in machine learning models to overcome the challenges of contemporary architectures. This section also summarizes the brain-inspired chips developed to promote research in neuromorphic computing.

A. Machine Learning Methodologies

As the term specifies, machine learning techniques are processes that use mathematical models of data to make computers learn and act without explicit instruction, and they lie at the core of artificial intelligence and data science. The adoption and enhancement of machine learning algorithms are found throughout many walks of life and enable better decision-making in every aspect. Machine learning has its own identity and terminology and is applicable in many domains such as molecular biology, computer vision, and text processing. The emerging technology discussed here, neuromorphic computing, enables machine learning algorithms to work more efficiently and to replace the stereotypical computer architecture, as discussed in the later parts of Section II.

The machine learning paradigm is used in probabilistic information retrieval; neural networks (such as the Hopfield network), symbolic learning implementations (ID3 and ID5R), and genetic algorithms have all been applied [3]. Text classification methodologies are also applicable to micro-text (chat posts), which is very short, uses informal grammar and language, and allows point-to-point conversation over any protocol, conveying important information within military forces in a short span of time. Machine learning algorithms have also been used to predict coronary artery calcification using a sibship-based design [4]. Machine learning is thus making remarkable progress in the present century and will reach its peak with new advancements. In this paper, we examine the new concept of neuromorphic computing, which is poised to become the future of AI and ML.

B. Neuromorphic Computing-Next Generation Computing

Neuromorphic computing denotes computing methods that resemble the human brain and nervous system, replacing the old framework of the computer system with a new configuration that is tomorrow's demand in industry. Gartner has predicted that conventional computer systems will reach their limits by 2025, making way for the new paradigm of neuromorphic computing. As the demand for new computing systems heads upward, further advances are necessary, and neuromorphic computing may be the best bet. So far, only a few research efforts have demonstrated that neuromorphic chips are more efficient; among the neuromorphic chips produced to date are Intel's Loihi [5], the BrainScaleS-2 prototype [6], and IBM's TrueNorth. Loihi is a neuromorphic chip devised by Intel to replace the conventional chip architecture, and it delivers better results and performance because it supports spiking neural networks (SNNs). Loihi solves a wide range of problems and provides quantitative computation, lower energy consumption, feature recognition, and precise relational processing; it integrates 131,072 neurons in a small mesh, allowing spike messages to be carried as independent events without burdening the framework. Loihi's merits over contemporary neuromorphic chips validate the progress of neuromorphic architectures in overtaking the von Neumann architecture [5].
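
Because Loihi's defining feature is native support for spiking neural networks, a minimal leaky integrate-and-fire (LIF) neuron simulation in plain Python may help illustrate the kind of dynamics such hardware executes. The time constants, threshold, and input statistics below are arbitrary illustrative values, not Loihi parameters.

import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron driven by random input spikes.
# All constants are illustrative; real chips such as Loihi use their own fixed-point model.

rng = np.random.default_rng(1)
dt = 1e-3          # simulation step: 1 ms
tau = 20e-3        # membrane time constant
v_thresh = 1.0     # firing threshold
v_reset = 0.0      # reset potential
w_in = 0.3         # synaptic weight of the single input

n_steps = 1000
input_spikes = rng.random(n_steps) < 0.2  # 20% chance of an input spike per step

v = 0.0
output_spikes = []
for t in range(n_steps):
    # Leak toward rest, then integrate any incoming spike.
    v += dt / tau * (-v) + w_in * input_spikes[t]
    if v >= v_thresh:
        output_spikes.append(t)
        v = v_reset

print(f"input spikes: {input_spikes.sum()}, output spikes: {len(output_spikes)}")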

The BrainScaleS-2 prototype (BSS2) is a neuromorphic chip manufactured as a 65 nm CMOS ASIC and equipped with a Plasticity Processing Unit (PPU); it mimics the brain's form and dynamics, tolerates the fixed-pattern noise of its analog circuits, and runs roughly three orders of magnitude faster than biological real time. Reinforcement learning was deployed in spiking neural networks using the reward-modulated spike-timing-dependent plasticity (R-STDP) learning rule, demonstrated on a Pong-like video game, and Pavlovian conditioning was reproduced on the neuromorphic chip at this accelerated speed [6]. Using the R-STDP learning rule, one form of discrete weight update is employed:

Δw_ab = α · (H − h) · r_ab    (1)

where α is the learning rate, H is the reward, h is the baseline, and r_ab is the spike-timing-dependent plasticity (STDP) eligibility trace resulting from the pre- and postsynaptic spikes of the synapse connecting neurons a and b.
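
The following is a minimal sketch of the discrete R-STDP update in equation (1), with a simple exponentially decaying trace standing in for r_ab. The trace dynamics, constants, and reward signal are illustrative assumptions, not the BSS2 implementation.

import numpy as np

# Sketch of the R-STDP weight update of Eq. (1): dw_ab = alpha * (H - h) * r_ab.
# The eligibility trace r_ab is modeled here as a decaying trace that is bumped
# whenever pre- and postsynaptic spikes coincide (an assumption, not the exact
# BSS2 trace dynamics).

rng = np.random.default_rng(2)
alpha = 0.01       # learning rate
h = 0.5            # reward baseline
trace_decay = 0.9  # per-step decay of the eligibility trace

n_synapses = 4
w = rng.normal(scale=0.1, size=n_synapses)   # synaptic weights
r = np.zeros(n_synapses)                     # eligibility traces r_ab

for step in range(100):
    pre = rng.random(n_synapses) < 0.3       # presynaptic spikes
    post = rng.random(n_synapses) < 0.3      # postsynaptic spikes
    r = trace_decay * r + (pre & post)       # coincident spikes build the trace
    H = rng.random()                         # reward delivered by the environment
    w += alpha * (H - h) * r                 # Eq. (1)

print("final weights:", np.round(w, 3))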

Optoelectronic synapses are one of the most promising applications for artificial intelligence and have been proposed using all-inorganic perovskite nanoplates. To advance neuromorphic computing, optoelectronic synapses have been developed that can process visual signals, retain data, consume little power, and recall historical optoelectronic information whenever needed. A backtracking function, which allows previously stored information to be looked up when required, is the key benefit of optoelectronic perovskite synapses [7].

Fig. 2. Optoelectronic perovskite synapses for neuromorphic computing

Figure 2 gives a general representation of optoelectronic synapses, which are built from perovskite nanoplates together with materials such as poly(3,4-ethylenedioxythiophene) polystyrene sulfonate (PEDOT:PSS) and copper thiocyanate (CuSCN); the backtracking function, which helps recover previously stored data, is also illustrated.

C. A Stochastic Approach To Machine Learning With Neuromorphic Computing

Machine learning algorithms have become the leading techniques for delivering strong performance across a wide range of applications and scenarios. With the advent of neuromorphic systems, machine learning algorithms can be implemented in an energy-efficient manner. Researchers have proposed a framework that combines spiking neural networks (SNNs) with charge-pump discharge strategies to handle larger workloads and to design neuromorphic chips with better reliability and workload-specific performance [8]. IBM's TrueNorth-based NS16e system, built from 16 TrueNorth neuromorphic processors, is an advanced architecture intended to make ML techniques more reliable and scalable. NS16e performs brain-like computation with neurons and synapses and produces brain-like outputs compared with the von Neumann architecture. In the field of radar, NS16e has proved an effective architecture for determining direction and distance and for sending pulses of radio waves [9].

Inspired by the brain and using SNNs, the NeuCube framework has been established, from which new data machines known as evolving spatio-temporal data machines (eSTDMs) have been developed. eSTDMs are a promising approach for prediction in many fields, such as stroke prediction, personalised event prediction, and ecological prediction. These evolving connectionist systems can be used for new knowledge discovery and for better capturing of all input and output information [10]. In such spatial data machines it is very important to learn patterns and data streams correctly, for which the data machine requires spatio- or spectro-temporal data (SSTD). SSTD can be expressed as

E: A(Δt) → B    (2)

In equation 2, A(t) = (a1(t), a2(t), ..., an(t)) for t = 1, 2, ..., m, where A is the set of independent input variables, B is the set of dependent output variables, Δt is the time window, and E is the function linking the whole input data to the outputs. Thus, using neuromorphic computing systems such as NeuCube, more machine learning data machines can be established that make predictions, retrieve data and information, and produce output accordingly.

SECTION III.

Evolution of Machine Learning with Neuromorphic Computing

In this section, recent research on advancing machine learning techniques and applications with neuromorphic computing is elaborated.

NengoDL is discussed in paper [11]; it is a software tool designed for implementing SNN-based algorithms. The main aim of NengoDL is to let users construct hybrid models and build dynamic neural networks that combine deep learning with neuromorphic systems. NengoDL offers higher speed and allows multiple models to run at the same time: the model is instantiated once and different models are kept in parallel, rather than run one after another, which takes a long time. NengoDL has some prominent features compared with other tools, such as building spiking versions of deep learning networks and integrating TensorFlow components. Models such as a spiking MNIST classifier have been built with NengoDL, which supports the full range of neural networks and Nengo methods.

Paper [12] discusses content-based image retrieval, which finds the similarity between a query image and a dataset of product images using the Loihi chip described in paper [5]. In content-based image retrieval, visual search is performed with neuromorphic neural networks, and SNNs are trained on Fashion-MNIST (an image classification dataset). The nearest visual neighbours of the query image are then retrieved on the Loihi neuromorphic chip, demonstrating the power efficiency of neuromorphic systems: the approach is reported to be 12.5 times more energy-efficient than a GPU and 2.5 times more energy-efficient than a CPU. This shows that machine learning applications can become more reliable by using neuromorphic systems.

One more machine learning application, short-term wind power forecasting, can also be enhanced with neuromorphic computing. The paper builds ML algorithms on neuromorphic models for efficient wind power forecasting and uses the NengoDL software [11] to create deep spiking neural networks through ANN (artificial neural network)-to-SNN conversion. The SNNs are made more useful by overcoming their training challenges with the error backpropagation algorithm and the ANN-to-SNN conversion algorithm, making them capable of forecasting wind power [13].
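
The ANN-to-SNN conversion mentioned above typically maps the activations of a trained rate-based network onto firing rates of spiking neurons. The sketch below shows that core idea on a single layer with random weights; it is a generic rate-coding illustration under assumed parameters, not the NengoDL pipeline used in [13].

import numpy as np

# Generic illustration of rate-based ANN-to-SNN conversion for one ReLU layer:
# the spiking layer's firing rate over a time window approximates the ReLU output.
# Weights and inputs are random; this is not the actual forecasting model of [13].

rng = np.random.default_rng(3)
n_in, n_out = 8, 4
W = rng.normal(scale=0.25, size=(n_out, n_in))
x = rng.random(n_in)                      # analog input in [0, 1]

# ANN reference: ReLU activation.
ann_out = np.maximum(W @ x, 0.0)

# SNN: integrate-and-fire neurons driven by the same weighted input for T steps.
# Rates saturate at one spike per step, so the match holds for activations below 1.
T = 1000
v = np.zeros(n_out)
spike_counts = np.zeros(n_out)
for _ in range(T):
    v += W @ x                            # constant input current each step
    fired = v >= 1.0                      # threshold of 1.0 (illustrative)
    spike_counts += fired
    v[fired] -= 1.0                       # subtract threshold on spike

snn_rate = spike_counts / T
print("ANN output :", np.round(ann_out, 3))
print("SNN rates  :", np.round(snn_rate, 3))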

Paper [14] discusses how system software can lower the energy used by a neuromorphic system when solving machine learning problems, with applications in fields such as embedded systems and the Internet of Things (IoT). It shows clearly that system software helps partition the SNN model into clusters, map those clusters onto neurosynaptic cores, and arrange neurons and synapses within each core. This lowers energy consumption by about 20%, making the system more suitable for machine learning problems. The authors also discuss a heuristic-based mapping approach that places neurons and synapses so as to minimize energy consumption.
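
The cluster-and-map flow described in [14] can be pictured as a partitioning problem: split the SNN into clusters no larger than a core's neuron capacity, preferring to keep heavily connected neurons together so that expensive inter-core traffic is reduced. The greedy sketch below is a toy stand-in for that idea under assumed sizes, not the paper's actual heuristic.

import numpy as np

# Toy greedy mapping of SNN neurons to neurosynaptic cores: fill each core up to
# its capacity, preferring neurons strongly connected to those already placed,
# so that inter-core synapses (the expensive ones) are kept low.
# Illustrative stand-in, not the heuristic of paper [14].

rng = np.random.default_rng(4)
n_neurons, core_capacity = 24, 8
conn = rng.random((n_neurons, n_neurons)) < 0.15   # random synapse matrix
conn = conn | conn.T                               # treat connectivity as symmetric

unassigned = set(range(n_neurons))
cores = []
while unassigned:
    seed = unassigned.pop()
    core = [seed]
    while len(core) < core_capacity and unassigned:
        # Pick the unassigned neuron with the most synapses into this core.
        best = max(unassigned, key=lambda n: conn[n, core].sum())
        core.append(best)
        unassigned.remove(best)
    cores.append(core)

core_of = {n: i for i, c in enumerate(cores) for n in c}
inter = sum(1 for a in range(n_neurons) for b in range(a + 1, n_neurons)
            if conn[a, b] and core_of[a] != core_of[b])
print(f"cores used: {len(cores)}, inter-core synapses: {inter}")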

The authors of paper [15] describe PyCARL, a PyNN interface that shortens the neuromorphic development cycle by allowing machine learning models to be combined and code to be shared. With PyCARL, CARLsim SNN simulations are run using biologically detailed neuron and synapse models together with hardware-oriented simulations. It lets users evaluate machine learning models and neuromorphic hardware early during development, which will help push the field of neuromorphic computing to its peak.

In paper [16], the author describes SpiNNaker, a programmable neuromorphic platform on which machine learning algorithms such as the k-NN algorithm can be run. The article reports that k-nearest-neighbour search can be performed with high accuracy and low power consumption, particularly for classification tasks. The platform allows the simulation of large-scale spiking neural networks. A classification task on the Iris dataset was also performed on the SpiNNaker platform, and the output obtained was correct with high accuracy; it was also observed that as the value of k increases, the k-NN algorithm gives more correct output. Software simulation was carried out with NEST alongside the SpiNNaker platform.
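
For reference, a plain k-nearest-neighbour classifier of the kind ported to SpiNNaker in [16] takes only a few lines. The data below are synthetic stand-ins rather than the Iris measurements, and the spiking implementation on SpiNNaker is of course very different under the hood.

import numpy as np

# Plain k-NN classifier on synthetic two-class data, to show the algorithm that
# paper [16] maps onto SpiNNaker's spiking substrate. Data are synthetic, not Iris.

rng = np.random.default_rng(5)
train_x = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
train_y = np.array([0] * 50 + [1] * 50)

def knn_predict(query: np.ndarray, k: int = 5) -> int:
    dists = np.linalg.norm(train_x - query, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]                   # indices of the k closest points
    votes = np.bincount(train_y[nearest])             # majority vote
    return int(np.argmax(votes))

print(knn_predict(np.array([0.2, -0.1, 0.3, 0.0])))   # expected class 0
print(knn_predict(np.array([2.8, 3.1, 2.9, 3.2])))    # expected class 1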

Machine learning has ample applications, and these can be solved faster with neuromorphic computing. One example, explained in paper [17], is in cheminformatics for band-gap prediction and the classification of chemicals. The neuromorphic simulation tools CrossSim and NeuroSim are used to classify materials and predict the band gap of small-molecule organic semiconductors. State-of-the-art contemporary architectures use dynamic random-access memory (DRAM) to obtain the results, whereas crossbar neuromorphic circuitry performs the work in parallel and does not use DRAM. This energy-efficient neuromorphic system is well suited to such tasks in organic chemistry, requiring little power and consuming less energy. The authors also call for more neuromorphic circuits to be fabricated specifically to meet industry requirements for chemical applications.
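
The appeal of crossbar neuromorphic circuitry is that a matrix-vector multiply happens in one parallel analog step inside the array, instead of streaming weights out of DRAM. Below is a minimal numerical sketch of that idea, with conductances acting as weights and column currents as outputs; the values are illustrative and unrelated to the internals of CrossSim or NeuroSim.

import numpy as np

# Idealized resistive crossbar: input voltages on the rows, weights stored as
# conductances at the crosspoints, and column currents give the matrix-vector
# product in a single parallel analog step (Ohm's law + Kirchhoff's current law).
# Purely illustrative; real crossbars also handle negative weights, wire
# resistance, and ADC quantization.

rng = np.random.default_rng(6)
n_rows, n_cols = 16, 8

G = rng.uniform(0.0, 1.0, size=(n_rows, n_cols))   # conductances (the weights)
v = rng.uniform(0.0, 0.5, size=n_rows)             # input voltages

# Each column current is the sum of v_i * G_ij over all rows -> one MVM "for free".
column_currents = v @ G

print(np.round(column_currents, 3))
print(np.allclose(column_currents, G.T @ v))       # same result as a digital MVM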

Paper [18] discusses deep medical image analysis, which is used to detect diseases and report outcomes; here, machine learning techniques are combined with a neuromorphic computing system such as the Loihi chip to overcome challenges such as scarce labelled images and limited access to advanced technology such as magnetic resonance imaging. Learning methods such as spatial representation learning and transfer learning are described for producing correct output from large-scale databases. The approach also addresses the time demands of magnetic resonance imaging (MRI), which is a lengthy examination. This benchmark technology will therefore enable countless machine learning applications by overcoming such challenges.

The authors of paper [19] discuss e-nose systems, which have been improved using machine learning algorithms. Neuromorphic SNNs are used to classify malts and to build olfactory systems that consume less energy, offer real-time processing, and are reported to reach about 97% accuracy. The neuromorphic approach uses the Akida SNN together with an Address Event Representation Olfaction (AERO) encoder for the classification of malts in olfactory systems. With this approach, sensor arrays and a pattern recognition engine (PARC) are used to recognise chemicals. Olfactory systems built in this way serve wide purposes, for example in chemical factories and the wine industry. The authors also discuss data-to-event encoding using AERO, signal conditioning, and pre-processing.

One more architecture, a multi-core neuromorphic processor chip, is discussed in paper [20]; it is used particularly for classifying visual signals from a DVS (dynamic vision sensor) and for managing the large event traffic in the system. This architecture, called Dynamic Neuromorphic Asynchronous Processors (DYNAPs), uses different neuron and synapse routing methods, including a memory-optimized routing method and programmable Address Event Representation (AER) routing, to minimize memory needs and maximize programmability, supporting a wide range of real-life applications. It integrates 1k very-large-scale integration (VLSI) neurons distributed among cores and routers to meet these demands. The paper aims in particular to build neuromorphic systems with on-chip heterogeneous memory structures that can reduce memory requirements and support different network topologies.
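
Address Event Representation, used for routing in DYNAPs, transmits a (timestamp, neuron address) pair whenever a neuron spikes, so silent neurons cost no bandwidth. The sketch below encodes a small spike raster into such an event stream; packet formats differ per chip, so this is only a generic illustration of the encoding.

import numpy as np

# Generic Address Event Representation (AER) encoding: turn a dense spike raster
# (time x neuron) into a sparse stream of (timestamp, neuron_address) events.
# Real chips such as DYNAPs use their own packet formats; this shows only the idea.

rng = np.random.default_rng(7)
n_steps, n_neurons = 20, 6
raster = rng.random((n_steps, n_neurons)) < 0.1     # sparse random spikes

events = [(int(t), int(addr)) for t, addr in zip(*np.nonzero(raster))]
print(f"{raster.sum()} spikes -> {len(events)} AER events")
print(events[:5])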

In paper [21], a design-technology co-optimization (DTCO) framework is developed for running machine learning algorithms; it is an essential framework for assessing inference-accuracy robustness. It represents how the ML algorithm, emerging-memory Verilog-A models, and the circuit interact to obtain the best output, using SPICE and Python packages. As an example, a ReRAM neuromorphic circuit is studied in which the robustness of a neural network (NN) for digit recognition is evaluated; using the DTCO framework, the authors found that deeper neural networks are less robust in the ReRAM neuromorphic circuit, which guides the design of more robust neuromorphic circuits. Table 1 presents a comparative analysis of the studied neuromorphic computing systems.
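
The robustness question examined in [21] can be pictured in a few lines: take a trained network, perturb its stored weights with the kind of variation a ReRAM array would introduce, and watch how often the output changes. The sketch below does this for a random linear classifier with multiplicative Gaussian noise; it is a conceptual stand-in, not the SPICE/Verilog-A DTCO flow of the paper.

import numpy as np

# Conceptual robustness check: perturb stored weights with multiplicative Gaussian
# noise (a crude model of ReRAM conductance variation) and measure how often the
# predicted class of a fixed set of inputs flips. Not the SPICE/Verilog-A DTCO flow.

rng = np.random.default_rng(8)
n_classes, n_features, n_samples = 10, 64, 200

W = rng.normal(size=(n_classes, n_features))       # "trained" weights (random here)
X = rng.normal(size=(n_samples, n_features))
clean_pred = (X @ W.T).argmax(axis=1)

for sigma in (0.01, 0.05, 0.1, 0.2):               # assumed variation levels
    noisy_W = W * (1.0 + rng.normal(scale=sigma, size=W.shape))
    noisy_pred = (X @ noisy_W.T).argmax(axis=1)
    flip_rate = np.mean(noisy_pred != clean_pred)
    print(f"sigma={sigma:.2f}: {flip_rate:.1%} of predictions change")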

Table 1. Comparative analysis of neuromorphic computing systems
SECTION IV.

Results and Discussion

Machine learning and artificial intelligence are impacting most industries by providing greatly improved solutions. Industries ranging from manufacturing to banking aim to make decisions based on the probabilistic results produced by machine learning. Advances in neuromorphic computing can provide optimal solutions with low processing power and low latency. In this paper, we discussed recent research on the evolution of machine learning with neuromorphic computing. The studied work shows that many machine learning algorithms can be improved using neuromorphic computing systems. The NeuCube neuromorphic framework [10] provides a greatly improved solution for neural networks and fuzzy systems, which can be widely used in the healthcare industry for applications such as medical image processing, early detection of diseases, and patient pattern recognition. Similarly, crossbar neuromorphic circuitry [17] can predict the band gap of small-molecule semiconductors, opening a new domain for neuromorphic computing in cheminformatics. The Loihi neuromorphic chip has been evaluated in recent work for improving efficiency in different application areas, for instance robotics [5], content- and feature-based image retrieval [12], and wind power forecasting [13]. Many researchers have achieved impressive results on handwriting recognition tasks using neural network techniques; the ReRAM neuromorphic circuit-based solution for handwritten images clearly indicates the power of neuromorphic computation.

DYNAPs [20] and the spiking MNIST models built with NengoDL [11] are two evolving neuromorphic computing systems capable of building optimal deep neural networks for IoT-based research problems. The Akida SNN neuromorphic architecture [19] demonstrated experimental results with high accuracy for the classification of malts in e-nose systems. Moreover, energy-efficient software engineering can also be achieved by incorporating a neuromorphic computing system, as discussed in paper [14]. PyCARL [15], a co-simulation tool, encourages researchers to cover the full neuromorphic development cycle, including hardware and software resource utilization. The studied papers demonstrated improved performance for the widely used k-NN machine learning approach when integrated with neuromorphic computing [16], [19]. The application area of radar beam shaping has been explored with the BrainScaleS-2 neuromorphic prototype, which provides a platform for reinforcement learning with STDP. From this discussion, we can clearly see that the future of neuromorphic computing is very promising and that it can solve numerous computational research problems.

SECTION V.

Conclusion and Future Scope

In this paper we have discussed the key advantages of neuromorphic computing, which is emerging as a leading technology in industry, replacing conventional architectures, and shaping the future of artificial intelligence by building neuromorphic systems that allow machine learning algorithms to work extremely well. The article covers the domains in which neuromorphic computing has proved its capability, such as content-based image retrieval, short-term power forecasting, classification, stroke prediction, and IoT. It also describes different neuromorphic systems such as Loihi and the BrainScaleS-2 prototype, and how machine learning algorithms can be made more applicable through neuromorphic computing. Some research has been carried out and some neuromorphic chips have been developed, but the field is still in its infancy; the need now is to bring this technology to maturity through further research. More work on neuromorphic computing is a pressing need in industry, as it promises to open new possibilities in computing, healthcare, robotics, and many other artificial intelligence applications. Neuromorphic computing systems are intended to solve massive data-parallel processing problems.

Concepts and Paradigms for Neuromorphic Programming

Computer Science > Neural and Evolutionary Computing

Steven Abreu
The value of neuromorphic computers depends crucially on our ability to program them for relevant tasks. Currently, neuromorphic computers are mostly limited to machine learning methods adapted from deep learning. However, neuromorphic computers have potential far beyond deep learning if we can only make use of their computational properties to harness their full power. Neuromorphic programming will necessarily be different from conventional programming, requiring a paradigm shift in how we think about programming in general. The contributions of this paper are 1) a conceptual analysis of what "programming" means in the context of neuromorphic computers and 2) an exploration of existing programming paradigms that are promising yet overlooked in neuromorphic computing. The goal is to expand the horizon of neuromorphic programming methods, thereby allowing researchers to move beyond the shackles of current methods and explore novel directions.
Subjects: Neural and Evolutionary Computing (cs.NE); Emerging Technologies (cs.ET)
Cite as: arXiv:2310.18260 [cs.NE]
  (or arXiv:2310.18260v1 [cs.NE] for this version)
  https://doi.org/10.48550/arXiv.2310.18260

Submission history

From: Steven Abreu [view email]
[v1] Fri, 27 Oct 2023 16:48:11 UTC (112 KB)

 

 
