
Sunday, February 12, 2017

Intel Gets Serious About Neuromorphic, Cognitive Computing Future | The Next Platform

Photo: Nicole Hemsoth   
"Like all hardware device makers eager to meet the newest market opportunity, Intel is placing multiple bets on the future of machine learning hardware," notes Nicole Hemsoth, co-founder and co-editor of The Next Platform.

Photo: The Next Platform
The chipmaker has already cast its Xeon Phi and future integrated Nervana Systems chips into the deep learning pool while touting regular Xeons to do the heavy lifting on the inference side.

However, a recent conversation we had with Intel turned up a surprising new addition to the machine learning conversation: an emphasis on neuromorphic devices and what Intel is openly calling “cognitive computing” (a term used primarily, and heavily, for IBM’s Watson-driven AI technologies). This is the first time we’ve heard the company make definitive claims about where neuromorphic chips might fit into its strategy to capture machine learning, and it marks a bold grab for “cognitive computing,” which has long been an umbrella term for Big Blue’s AI business.

Intel has been developing neuromorphic devices for some time; one of its first prototypes became well known in 2012. At the same time, IBM was still building out its own “True North” neuromorphic architecture, which we do not generally hear much about outside of its role as a reference point for the neuro-inspired devices we’ve watched roll out over the last couple of years. Some might suggest that Intel’s renewed interest in neuromorphic computing aligns with the DoE’s assertion that at least one of the forthcoming exascale machines must utilize a novel architecture (although just what qualifies as “novel” is still up for debate), and some believe neuromorphic is a strong contender. The problem is that, even if neuromorphic is one of the stronger bets, big challenges lie ahead. In the near term, no neuromorphic devices are being produced at a scale sufficient to warrant an already-risky DoE investment; longer term, programming such devices, even to handle offload workloads for existing large-scale scientific simulations, is a tall order.
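Part of why programming these devices is such a tall order is that they operate on sparse, event-driven spikes rather than the dense matrix math that conventional accelerators and their software stacks are built around. As a rough illustration only (not a description of Intel's or IBM's hardware), the leaky integrate-and-fire neuron, a common abstraction in neuromorphic designs, can be sketched in a few lines; the parameter values here are arbitrary:

```python
def simulate_lif(input_current, threshold=1.0, decay=0.9, reset=0.0):
    """Return the time steps at which a leaky integrate-and-fire
    neuron spikes, given a sequence of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for t, i in enumerate(input_current):
        v = decay * v + i      # leak toward zero, then integrate input
        if v >= threshold:     # threshold crossing emits a spike
            spikes.append(t)
            v = reset          # potential resets after spiking
    return spikes

# A steady input drives periodic spiking; information lives in
# the timing of events, not in dense numeric outputs.
print(simulate_lif([0.3] * 20))
```

Even this toy model hints at the software gap: state evolves over time and output is a spike train, so mapping an existing simulation's offload kernels onto such a substrate requires rethinking the algorithm, not just recompiling it.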

Exascale aside, there are many emerging use cases that could benefit from a device that excels at pattern matching, as a neuromorphic chip does. These have far less to do with supercomputing and much more to do with self-driving cars and real-time sensor-fed networks. Either way, Intel is getting serious about neuromorphic chips again, and it is backing that with a lot of talk about what’s next for “cognitive computing.”

Source: The Next Platform