Your smartphone is already pretty smart. Chances are it has a GPU (for graphics) and a CPU (for everything else) packed into its tiny microchip brain. Those two parts alone make it more powerful than entire desktop PCs from only a decade or so ago, but a new kind of chip developed by MIT could take your pocket supercomputer even further into the future.

The new chip is called "Eyeriss," and in a lot of ways it's quite similar to a GPU (Graphics Processing Unit). Like a GPU, it has a whole bunch of processing cores and is very good at multitasking. That is to say, it can process a whole bunch of different pieces of data at the same time, unlike a CPU (Central Processing Unit), which tends to tackle its calculations one at a time.
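
To picture the difference in work style, here's a rough Python sketch (the NumPy array is purely for illustration, not anything that actually runs on a chip like this): the loop handles values one at a time the way a CPU would, while the vectorized call hands the whole batch off to be processed together, the way a GPU does.

```python
# Illustrative only: a toy comparison of one-at-a-time vs. all-at-once processing.
import numpy as np

brightness = np.random.rand(1_000_000)  # pretend data, e.g. pixel brightness values

# CPU-style: tackle one value after another
scaled_one_by_one = [value * 2.0 for value in brightness]

# GPU-style: hand over the whole batch and process everything at once
scaled_all_at_once = brightness * 2.0
```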

But if Eyeriss is just like a GPU, what do you need it for? The answer is deep learning. Deep-learning systems like neural networks, an artificial intelligence strategy that amounts to building robot brains that can learn, are a great way to accomplish complex tasks that are very easy for humans but very hard for computers: tasks like image recognition, speech decoding, predictive text, and other software features that learn over time.

The GPU in your phone can already chew on projects like these, and it's relatively good at them. Neural networks work by poring over the data multiple times in different ways, so that ability to multitask comes in super handy. The catch is that, most of the time, your phone's GPU needs to be busy doing graphics stuff, which is where Eyeriss could help.
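
As a rough sketch of what "poring over the data multiple times" looks like, here's a toy three-layer network in Python (the sizes and numbers are made up; a real image-recognition network is vastly larger): the same input gets transformed again and again, one layer after another, and each of those steps is mostly independent multiplications that can run in parallel.

```python
import numpy as np

rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 8)) for _ in range(3)]  # three made-up layers of weights

x = rng.standard_normal(8)          # one input, say a tiny patch of an image
for weights in layers:              # the data passes through every layer in turn
    x = np.maximum(weights @ x, 0)  # matrix multiply, then a simple nonlinearity

print(x)  # the network's output for this input (meaningless here, meaningful when trained)
```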

Not only would Eyeriss help out by being an extra hand on deck, it would also be better at these tasks, thanks to chunks of memory that would sit directly inside the chip and an architecture that would let the chip's different nodes talk to each other directly instead of everything having to squeeze through one pipeline.
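
The rough idea behind keeping memory on the chip is data reuse: pay the cost of a slow trip out to main memory once, then use that value over and over from a fast local copy. This toy Python sketch, which has nothing to do with Eyeriss's actual circuitry, just mimics that trade-off with an artificial delay.

```python
import time

main_memory = {"weight": 0.5}  # pretend off-chip storage

def slow_fetch(key):
    time.sleep(0.001)          # pretend every trip off the chip costs time
    return main_memory[key]

inputs = range(1_000)

# Without reuse: every single calculation goes back out to "main memory"
total_slow = sum(slow_fetch("weight") * x for x in inputs)

# With a local, "on-chip" copy: fetch once, reuse a thousand times
weight = slow_fetch("weight")
total_fast = sum(weight * x for x in inputs)
```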

In short, a chip like Eyeriss could give your phone the power to look through its camera and recognize things around it in real time without chewing through your battery. Moreover, chips like these could make their way into industrial robots, cars, whatever you can imagine, and give them the computational horsepower they would need to use the most advanced AI technology we have right now.

Other chipmakers like Nvidia and Qualcomm are working on similar tech, as well they should be; if they can be made relatively cheaply, these sorts of chips could give even the smallest devices untold computational power. It'll just be a matter of finding great ways to use it.

Source: MIT News via The Verge