We, in the 21st century, are experiencing immense feats of technology, much of it credited to artificial intelligence and machine learning. While developing countries are still racing to adopt AI and ML, researchers and companies have already pushed the technology forward not just two, but three generations. Companies like Intel and IBM are now working on the third generation of AI, called 'neuromorphic computing' (neuro-morphic: computing that aims to morph itself to mimic the human brain). Welcome to the future of computing!
What is Neuromorphic Computing?
Neuromorphic computing studies how the morphology of individual neurons, circuits, and the overall architecture of the human brain's neural networks can be applied to build an artificial neural system, with the objective of making machines work more independently in unconstrained environments.
Literally stated, neuromorphic computing is computing concerned with emulating the neural structure and operation of the human brain. It is closely related to probabilistic computing, which creates algorithmic approaches for dealing with the uncertainty, ambiguity, and contradiction of the natural world. In short, it is about giving machines the ability to think, learn, and maneuver as humans do!
Levels of AI Technology
Artificial intelligence, which originated in the mid-20th century, was the first step toward machine intelligence. This first generation of AI was rules-based: it emulated classical logic to draw reasoned conclusions within a specific, narrowly defined problem domain. It was well suited to tasks such as autonomously monitoring processes and improving their efficiency.
As of 2020, most of the world works with the second, current, generation of AI, which can be summarized as 'sensing and perception'. Machines at this level collect information about their surroundings through sensory inputs and process it with conventional computation, presenting the results in a form convenient to the user. This level covers, for example, IoT devices and industrial automation.
Our grey matter, however, can sense and interpret even ambiguous, unconstrained situations; it is flexible by nature. This is where the third, next, generation of AI comes in, corresponding to human cognition: interpretation and autonomous adaptation. Such brain-like systems can address novel situations, handle abstraction, and automate ordinary human activities.
Neuromorphic Computing: Explained
In neuromorphic research, the job is to design algorithms that match the human brain's flexibility and its ability to learn from unstructured stimuli, without compromising on energy efficiency. Neurons in our brain naturally process information through one-to-many connections: a single neuron, stimulated by a situation, fires a spike that propagates through the network it is joined to. That, in essence, is how we learn.
Researchers have replicated this in the computational building blocks of neuromorphic algorithms: "spiking neural networks" (SNNs). Each 'neuron' in an SNN can fire independently of the others, and in doing so it sends pulsed signals to other neurons in the network that directly change their electrical states. By encoding information in the signals themselves and in their timing, SNNs simulate natural learning processes, dynamically remapping the synapses between artificial neurons in response to stimuli.
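To make the idea concrete, here is a minimal sketch of one common SNN building block, a leaky integrate-and-fire (LIF) neuron. All parameter values below are illustrative assumptions for demonstration only; they are not taken from any particular chip or research paper.

```python
# Minimal sketch of a spiking neuron: the leaky integrate-and-fire (LIF)
# model. The membrane potential leaks a little each time step, accumulates
# incoming current, and emits a spike (then resets) when it crosses a
# threshold -- so information is carried by *when* spikes occur, not by a
# steady output value. Threshold, leak, and reset values are assumptions.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Simulate one LIF neuron over a sequence of input currents.

    Returns the list of time steps at which the neuron spiked.
    """
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:              # threshold crossed: fire
            spike_times.append(t)
            potential = reset                   # reset after the spike
    return spike_times

# A constant drive produces periodic spikes; a stronger drive fires
# sooner and more often -- the input intensity is encoded in spike timing.
weak = simulate_lif([0.3] * 10)    # -> [3, 7]
strong = simulate_lif([0.6] * 10)  # -> [1, 3, 5, 7, 9]
```

Chaining many such neurons, with spikes from one feeding the input currents of others and connection weights adapting to spike timing, yields the network-level behaviour described above.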
Benefits and Future
Several companies have taken the lead in turning this third-generation technology into innovations of their own. Intel and Belgium's IMEC, for instance, are each building neuromorphic chips with their own significance.
Intel’s Loihi Chip
Intel Labs designed Loihi, a self-learning neuromorphic research test chip introduced in November 2017. Its 128-core design is optimized for SNN algorithms and includes a total of 130,000 neurons, each of which can communicate with thousands of others. The chip supports dramatically accelerated learning in unstructured environments for systems that require autonomous operation and continuous learning, while consuming extremely little power.
Intel keeps the chip in the research phase and invites researchers to explore its architecture through the Intel Neuromorphic Research Community (INRC).
IMEC’s “Intelligent Drones”
Because SNNs operate similarly to biological neural networks, IMEC's SNN-based chip is expected to consume 100 times less power than traditional implementations while offering a tenfold reduction in latency, enabling almost instantaneous decision-making.
IMEC's first aim is a low-power, highly intelligent anti-collision radar system for drones, one that can react far more effectively to approaching objects on a minuscule power budget.
The future still holds plenty of exciting work as we strive to make non-living things imitate humans, all in the cause of making life easier and more comfortable for us.