The human brain has long fascinated scientists: we have discovered enough to appreciate this amazing organ, yet not enough to understand it completely. Some of its inner workings remain a mystery today, and researchers devote entire careers to the brain, which still ranks as one of the greatest challenges in medicine and science.

Many liken the human brain to a computer, but in reality it's like comparing the proverbial chalk and cheese. Although the brain and computers can perform some of the same tasks, such as solving mathematical problems, and arrive at the same results, their internal workings are completely different.

A computer is built around two distinct functions: processing and memory. This architecture was first developed some 70 years ago and remains basically the same today, even as microprocessors become smaller yet more powerful. However, microprocessors will soon run into a physical limitation: they cannot keep shrinking forever.

Microchip designers, working in close collaboration with cognitive scientists, have been studying how the human brain works in order to emulate it in new hardware. Leading the race to find a cost-effective design for neuromorphic computing power – or what is referred to in layman's terms as the brain chip – is a team of researchers involved in the Synapse programme.

Since 2008, this programme has channelled millions in funding to HRL Laboratories, Hewlett-Packard and IBM Research. Its main aim is to understand how neurons and synapses, two key components of the human brain's operation, can be mimicked in computer hardware. Although promising, the designs the programme has produced are still not cost-effective enough, and it may take some more time before we see such computing power in everyday applications.

In the meantime, researchers have been designing software that thinks and learns much as a human brain does. Neural networks can mimic how humans learn and make decisions based on experience. Nonetheless, such programs are still limited by the fact that they run on traditional hardware built from simple processing and memory functions.


Neural nets are already helping with tasks such as time series prediction, problems that traditional software packages could not solve just a few years ago. A neural net works by analysing the past behaviour of a particular system and learning from the examples it is presented with. Once it has finished learning, it can capture non-linear relationships between variables even when the training data itself contains a degree of noise. The trained network is then used to make predictions about a particular type of event. Just like a human being, it learns from its mistakes and life experiences: a neural network adjusts its own model according to the deviations between its predictions and reality, so its predictions grow more accurate over time. Software neural nets are today achieving a high degree of success in areas such as self-driving cars, weather forecasting, financial risk analysis and even medical diagnosis.
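The error-driven learning loop described above can be sketched in a few lines of Python. This is a deliberately minimal toy (a single linear "neuron", not a full neural network, and not tied to any real product): it learns to predict the next value of a synthetic time series from the two previous values, nudging each weight against the prediction error in the way the article describes.

```python
import random

random.seed(0)

# Synthetic time series: each value depends on the two before it,
# plus random noise (true coefficients: 0.6 and 0.3).
series = [1.0, 0.8]
for _ in range(400):
    series.append(0.6 * series[-1] + 0.3 * series[-2]
                  + random.uniform(-0.5, 0.5))

# A single linear "neuron": prediction = w1*previous + w2*one-before.
w1, w2 = 0.0, 0.0
lr = 0.05  # learning rate: how strongly each error adjusts the model

for epoch in range(20):
    for t in range(2, len(series)):
        pred = w1 * series[t - 1] + w2 * series[t - 2]
        error = series[t] - pred            # deviation from reality
        # Nudge each weight in the direction that shrinks the error.
        w1 += lr * error * series[t - 1]
        w2 += lr * error * series[t - 2]

print(w1, w2)  # should approach the true coefficients 0.6 and 0.3
```

Despite the noise in the training data, repeated error-driven adjustments pull the weights towards the underlying pattern, which is the essence of how real neural networks learn from examples.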

If the brain chip becomes economically feasible, this will completely change computer architecture and processing power will increase exponentially. A brain chip working with powerful neural networks would mean that the present barriers limiting artificial intelligence would be broken and computers that think like a human being will become an everyday reality.

Last August, an IBM team working closely with the US Defence Department reported a major breakthrough: a prototype brain chip. The processor, called TrueNorth, mimics one million neurons and 256 million synapses, and encodes and processes data in a way similar to the human brain.

TrueNorth may boast brain-like efficiency, yet the brain is still miles ahead: the adult human brain contains roughly 85 billion neurons, dwarfing the TrueNorth processor. Nonetheless, keep in mind how rapidly computers have evolved in less than three decades. If the brain chip develops at the same rate, a processor with as many neurons as the human brain may not be such a far-fetched prospect.

The TrueNorth prototype is the first step towards economically viable production of brain-chip processors for everyday use. IBM is currently testing the chip's capabilities and writing software powerful enough to exploit all this computing power before launching it commercially.

Such a processor would accelerate research in artificial intelligence and pave the way for AI software to be embedded in everyday machines, meaning we might start interacting with our devices in more humanlike ways. For instance, a medical implant such as a pacemaker might use an embedded brain chip to diagnose problems before someone suffers a heart attack. Your smartphone might learn to anticipate what you will do next, automatically switching to vibrate during a meeting or telling a caller that you are driving and cannot take the call. Cars might learn to drive themselves, or at least take over if anything happens to the driver or the car senses an imminent crash.

Such technological advances will undoubtedly spark controversial debates and ethical questions. For instance, if a self-driving car injured a pedestrian, would the human driver still be held liable? Indeed, just recently Tesla Motors CEO Elon Musk and astrophysicist Stephen Hawking released statements warning of the risks humanity might face if we continue developing human-like intelligence in computer systems.

As with anything else in life, any tool we develop can be used with noble and less noble intentions. Artificial intelligence can be a dream or a nightmare – it all depends on which side we take.

Ian Vella is a search engine optimisation specialist.

www.ianvella.com.
