Q&A | Human brain structure inspires artificial intelligence

By Nand Kishor | Jun 30, 2017

Scientists are looking at how the brain processes information, and how that can be used in AI.

The human brain is the most powerful supercomputer on Earth, and now researchers from the University of Southern California are taking inspiration from the structure of the human brain to make better artificial intelligence systems.

What is artificial intelligence?
Artificial intelligence (or AI) is an approach to computing that aims to mimic the power of the human brain. We have some 86 billion neurons, the electrically conducting cells in our brain, linked by roughly 100 trillion connections, and that network gives us the incredible computing power for which we are known. Computers can do things like multiply 134,341 by 989,999 really well, but they can't recognize human faces, learn, or revise their understanding of the world. At least not yet, and that's the goal of AI: to devise a computer system that can learn, process images and otherwise be human-like.
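To make that contrast concrete, here is a minimal sketch, in Python, of what "a computer that can learn" looks like at its simplest: a toy perceptron that learns the logical AND rule from examples instead of being programmed with it. (This example is illustrative and is not from the original article.)

```python
# Toy example: a single artificial "neuron" (perceptron) that learns the
# AND rule from examples, rather than having the rule programmed in.

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # weights for the two inputs
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire (1) if the weighted sum crosses zero
            output = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = target - output
            # Nudge weights toward the right answer next time
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Training data: the logical AND function
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
for (x1, x2), target in data:
    pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
    print(f"{x1} AND {x2} -> learned: {pred}, expected: {target}")
```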

Why do we want a computer that is human-like?
Very good question! Part of the answer is: why not? AI is the holy grail for computer scientists who want to make a computer as powerful as the human brain. Basically, they want to create a computer that doesn't need to be programmed with every variable, because it can learn them just like our brain does.

A six-foot-tall, 300-pound Valkyrie robot is seen at University of Massachusetts-Lowell's robotics center in Lowell, Mass. "Val," one of four sister robots built by NASA, could be the vanguard for the colonization of Mars by helping to set up a habitat for future human explorers. (Elise Amendola/Associated Press)

Another reason scientists are interested in AI is that it could be used for things like surveillance and face recognition, and computer systems that can learn new terrain or solve a new problem somewhat autonomously could, in certain situations, be very beneficial.

Why is it so hard to mimic the human brain?
To fully mimic the power of our own cognitive capacity, we first have to understand how the brain works, which is a feat in and of itself. We then have to re-engineer and re-envision the computer, from hardware to software and everything in between, and the reason comes down to how our brains are powered.

"If we compare, for example, our brain to the super computers we have today, they run on megawatts, [which is] a huge amount of power that's equivalent to a few hundred households, while our brain only relies on water and sandwiches to function," said artificial intelligence and computing expert Han Wang from the University of Southern California said. "It consumes power that's equivalent to a light bulb." 

Millions of years of evolution have made the brain remarkably efficient: it has learned to work with limited resources, becoming so power-efficient that it can beat a supercomputer at complex processing without breaking the energy bank.
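The arithmetic behind that comparison is easy to check. A quick back-of-the-envelope calculation, using round figures that are assumptions for illustration (roughly 1 MW for a supercomputer, 2 kW per household, 20 W for the brain), bears out both the "few hundred households" and "light bulb" claims:

```python
# Back-of-the-envelope check of the power comparison above. The figures
# are round assumptions for illustration, not numbers from the article.

supercomputer_watts = 1e6  # ~1 MW ("megawatts"); large machines draw more
household_watts = 2e3      # ~2 kW continuous draw per household (rough)
brain_watts = 20           # ~20 W, about a light bulb

print(f"Households matched by one supercomputer: "
      f"{supercomputer_watts / household_watts:,.0f}")  # 500
print(f"Brains running on the same power budget: "
      f"{supercomputer_watts / brain_watts:,.0f}")      # 50,000
```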

How does the brain work at such low energy levels?
This is where the main difference between the brain and the computer lies.

"Our current computers, there's a very powerful core...but then you have a long queue of tasks [which] come in sequentially and are processed sequentially," Wang said. "While our brain, the computation of units, which are the neurons, are connected in highly parallel manner. It's this high level parallelism that has advantages in learning and recognition."

So it's the parallelism in the brain that allows us to use only what we need, only when we need it, and not waste energy running the kind of background processes that we all know slow down our computers.
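To see why parallelism matters, here is a hedged sketch in Python: the same batch of tasks run one at a time on a single "core", then spread across many parallel workers. It is an illustration only, not a model of actual neurons, and the sleeping threads simply stand in for real work (CPU-bound Python code would need processes rather than threads):

```python
# Illustrative sketch (not from the article): the same batch of tasks
# processed one at a time on a single "core", then spread across many
# parallel workers. Sleeping threads stand in for real work.
import time
from concurrent.futures import ThreadPoolExecutor

def task(n):
    time.sleep(0.1)  # pretend each unit of work takes 0.1 s
    return n * n

tasks = list(range(20))

start = time.perf_counter()
results = [task(n) for n in tasks]          # sequential queue
print(f"sequential: {time.perf_counter() - start:.2f} s")  # ~2.0 s

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(task, tasks))   # all at once, in parallel
print(f"parallel:   {time.perf_counter() - start:.2f} s")  # ~0.1 s
```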

What's the new finding that helps us get closer to making computers like the brain?
It's this concept of running at low energy in parallel circuits. The key is to let computer circuits send more complex messages.

In a typical computer, each node sends a one or a zero, and programs are built from long series of ones and zeros.

In the brain, each small circuit can send a one, which means go; a zero, which means no signal; a two, which means stop; or both a one and a two at the same time.

In other words, our brains can send double the information in any given exchange compared to a computer, and that, coupled with smaller networks working in parallel, reduces the power strain.
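That "double the information" figure follows from basic information theory: a signal with four distinguishable states carries log2(4) = 2 bits per transmission, versus log2(2) = 1 bit for binary. A quick check:

```python
# Bits carried per transmission grow with the number of distinguishable
# signal states: log2(states) bits per symbol.
import math

binary_states = 2  # conventional circuit: 0 or 1
brain_states = 4   # article's example: no signal, go, stop, or go-and-stop

print(math.log2(binary_states))  # 1.0 bit per transmission
print(math.log2(brain_states))   # 2.0 bits -- double the information
```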

What Wang and colleagues did was create a system of connections made from tin selenide and black phosphorus that can send stop, go, no signal, or both signals at once, depending on the voltage applied.
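The article reports no actual device parameters, but the behavior described, where the output signal depends on the applied voltage, can be sketched as a simple lookup. The voltage thresholds below are hypothetical placeholders, purely for illustration:

```python
# Hypothetical sketch of the multi-state signaling described above.
# The voltage thresholds are invented placeholders; the article reports
# no actual device parameters.

def device_output(voltage):
    if voltage < 0.5:
        return "no signal"   # the '0' state
    elif voltage < 1.0:
        return "go"          # the '1' state
    elif voltage < 1.5:
        return "stop"        # the '2' state
    else:
        return "go + stop"   # both signals at once

for v in (0.2, 0.7, 1.2, 1.8):
    print(f"{v:.1f} V -> {device_output(v)}")
```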

Now the plan is to re-engineer the computer from the ground up: a machine with the capacity for these low-voltage decisions, wired not through the few powerful cores we see today, but with each circuit of messages working in parallel, like the brain.

Until recently, this was a purely theoretical concept, because there was no way to send as much information in a single transmission as there is now.

So, artificial intelligence is only a few incredibly brilliant research careers away from reality.

Source: CBC