I work at ValueFirst Digital Media Private Ltd. I am a Product Marketer on the Surbo team. Surbo is a chatbot generator platform owned by ValueFirst.
The Future of Artificial Intelligence is in hardware
So far, the conversation about Artificial Intelligence has revolved almost entirely around algorithms. DeepMind, a world leader in machine learning, recently described how its AlphaGo Zero became a master of Go and beat all previous versions of itself, entirely on its own and from scratch, using an advanced form of reinforcement learning. But while companies vie with each other for the top talent in data science and algorithm design, the real advancement is taking place in hardware.
The flattening of Moore's Law
Let's first look at history. In 1958 the first integrated circuit contained 2 transistors and was quite sizable, covering around one square centimeter. By 1971, Moore's Law had become evident as people saw the exponential increase in the performance of integrated chips: 2,300 transistors were now packed into the same surface area as before. By 2014, IBM's P8 processor had more than 4.2 billion transistors and 16 cores, all fitted into a 650 square mm device. Alas, there is a natural limit to how many transistors you can pack into a given piece of silicon, and we are approaching that limit.

Moreover, machine learning applications, particularly in pattern recognition (e.g. understanding speech, images, etc.), require massive parallel processing. When Google announced that its algorithms could recognize images of cats, what it failed to mention was that its software needed 16,000 processors to do so. That's not much of an issue when you can run your algorithms on a server farm in the cloud, but what if you had to run them on a tiny mobile device? This is increasingly becoming a major industry need. Running advanced machine learning capabilities at the end point confers huge advantages to users and solves many data privacy issues as well. Imagine if Siri, for example, did not need to make cloud calls but could process all data and run all algorithms on the hardware of your smartphone. But if you think your smartphone heats up too much during a few minutes of conversation, or after playing Minecraft, wait till Siri becomes truly personalised by running on your phone too.
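The doubling trend behind those transistor counts can be checked with a few lines of arithmetic. A minimal sketch, assuming the conventional two-year doubling period (the function name and the doubling assumption are illustrative, not from the text):

```python
# Rough illustration of Moore's Law: transistor count doubling
# roughly every two years. The 1971 starting figure is the one
# cited in the text; the projection is a back-of-the-envelope sketch.

def projected_transistors(start_count, start_year, end_year, doubling_period=2):
    """Project a transistor count assuming one doubling per period."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# From 2,300 transistors in 1971 to 2014:
estimate_2014 = projected_transistors(2_300, 1971, 2014)
print(f"Projected 2014 count: {estimate_2014:,.0f}")
# The IBM P8's actual ~4.2 billion transistors lands in the same
# order of magnitude as this naive doubling projection.
```

That the crude projection and the real chip agree to within a small factor over four decades is exactly why the exponential trend became a "law".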
Solving for the bottleneck
The reason devices heat up, and the main problem with our current computer hardware designs, is the so-called "von Neumann bottleneck": classic computer architectures separate processing from data storage, which means that data must be transferred back and forth between the two every time a calculation takes place. Parallelism solves part of this problem by breaking down calculations and distributing the processing, but you still need to move data at the end to reconcile everything into the desired output. So what if there were a way to get rid of the bottleneck altogether? What if processing and data resided in the same place, so that nothing had to be moved around, producing heat and consuming so much energy? After all, that is how our brains work: we do not have separate areas for processing and data storage as computers do; everything happens at our neurons.
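The split-and-reconcile pattern described above can be sketched in a few lines. A toy illustration (all names here are made up for the example):

```python
# Toy illustration of distributing a calculation across workers and
# then moving the partial results back together -- the reconciliation
# step that still forces data movement in a von Neumann design.
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    # Each worker handles its own slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(partial_sum_of_squares, chunks))
    # Partial results are shipped back to one place and combined.
    return sum(partials)

print(parallel_sum_of_squares(list(range(1_000))))  # prints 332833500
```

However many workers you add, the final `sum(partials)` still happens in one place, which is the part of the cost that parallelism alone cannot remove.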
Taking inspiration from the way our brain functions is not new in Artificial Intelligence - we already do that in deep learning with neural networks. However, we do so by emulating the functioning of neurons with machine learning algorithms and parallel processing. But what if, instead of emulation, we had computers that functioned just like our brains do? Since the 1970s people have envisioned hardware that directly maps the brain's architecture onto silicon. This approach, called "neuromorphic computing", is finally being commercialised, with companies such as Intel and Qualcomm recently announcing neuromorphic chips for commercial use.
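What "emulating a neuron in software" amounts to can be shown concretely - a single artificial neuron is just a weighted sum passed through a nonlinearity. A minimal sketch, with illustrative numbers and names:

```python
# A minimal artificial neuron: weighted sum of inputs plus a bias,
# squashed through a sigmoid nonlinearity. Deep learning stacks many
# such units into layers; this is one unit, emulated in software.
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

output = neuron([0.5, -1.2, 3.0], [0.4, 0.1, -0.6], bias=0.2)
print(round(output, 4))
```

Every evaluation of a unit like this on conventional hardware means fetching weights and inputs from memory, computing, and writing results back - exactly the shuttling that neuromorphic chips aim to eliminate by putting memory and computation in the same physical place.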
Neuromorphic chips can be used for AI applications at the end point, which is exciting news indeed. But they also have the potential to advance machine intelligence to a whole new level: by using electronics, instead of software, to develop machine cognition, we may be able to achieve the dream of general artificial intelligence and create truly intelligent systems.
Quantum: the big bang of computing
However, the true big bang in computing will come not from neuromorphic chips but from harnessing the potential of quantum physics. As the demand for fast computing increases, so does our ambition to solve really hard problems. Classical computing is unable to crack some genuinely complex problems in areas such as cancer research. Quantum computing could step in here and figure out the answers for us.
Quantum computers are advancing at a significant rate; we are presently at the 50-qubit level. To put that in perspective: a 32-qubit quantum computer can process around 4 billion coefficients, some 256GB of information. Nothing really impressive, since one can run similar programs on a laptop in a few seconds. Once we reach 64 qubits, however, the story becomes entirely different. Such a computer could in principle process all the information that resides on the internet at once - around 74 exabytes (billions of GB) of data, which would otherwise take years on present-day supercomputers. And we are quite close to this. The real game changer will be the development of a 256-qubit quantum computer. Such a machine could represent more states than there are atoms in the observable universe. This is going to be huge and will have great repercussions for our civilization.
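The scaling behind these numbers is easy to check: an n-qubit state vector has 2^n complex amplitudes. A quick sketch - note that the 16-bytes-per-amplitude figure is an assumption (double-precision complex numbers), not a figure from the text, and exact memory totals depend on the representation chosen:

```python
# Back-of-the-envelope arithmetic for simulating an n-qubit state
# classically: the state vector holds 2**n complex amplitudes.
# ASSUMPTION: 16 bytes per amplitude (double-precision complex).

BYTES_PER_AMPLITUDE = 16

def state_vector_size(n_qubits):
    """Return (number of amplitudes, bytes needed) for n qubits."""
    amplitudes = 2 ** n_qubits
    return amplitudes, amplitudes * BYTES_PER_AMPLITUDE

for n in (32, 64):
    amps, nbytes = state_vector_size(n)
    print(f"{n} qubits: {amps:.3e} amplitudes, {nbytes / 1e9:.3e} GB")
```

Each added qubit doubles both figures, which is why the jump from 32 to 64 qubits moves the problem from "fits on a big server" to "beyond all classical storage on Earth".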