Indian student creates chip that can help phones respond faster to human commands

As the world increasingly embraces artificial intelligence, there is a growing demand for devices that can process facial recognition and respond faster to speech-to-text and other human commands.

This would involve either expensive infrastructure to support natural language processing (NLP) or machine learning on the device itself, or the bandwidth to send data to the cloud for decision-making.

However, smaller devices do not have the power or infrastructure needed to run data through complex neural networks - interconnected layers of simple computing nodes on which machine learning algorithms are trained - locally, and so have to send the data to the cloud, which increases response times.

Now, an Indian-origin student at the Massachusetts Institute of Technology (MIT) in the US has developed a new processor chip that can speed up data processing on neural networks by 3-7 times while reducing power consumption by 94-95%.

This means that smaller devices such as smartphones or smart home appliances could run natural language processing or facial recognition locally and, in turn, respond faster to human commands.

"The general processor model is that there is a memory in some part of the chip, and there is a processor in another part of the chip, and you can move the data back and forth between them when you do these computations," Avishek Biswas, who led the chip's development, was quoted as saying by MIT News. Biswas is an MIT graduate student in electrical engineering and computer science.

He said that because machine learning algorithms require so many computations, transferring data back and forth consumes the majority of energy used in the overall process.

"But the computation these algorithms do can be simplified to one specific operation, called the dot product," said Biswas. "Our approach was: can we implement this dot-product functionality inside the memory so that you don’t need to transfer this data back and forth?”   

Tech giants such as SoftBank-owned ARM Holdings, Google and Amazon are also trying to bring down latency - the delay in transferring data - in AI and machine learning-driven chips.

ARM Holdings is working on a new project called Trillium to develop chips that can support artificial intelligence (AI) and machine learning (ML) workloads on mobile devices.

Intel Corporation has also launched a new chip aimed at powering applications that process data on the device itself instead of in the cloud.

Google has started offering its new AI-tailored chips on its cloud platform to other companies for advanced testing as part of its effort to accelerate machine learning models and get them running faster.

Google also recently merged its smart home devices unit Nest with its hardware team in a bid to outgun Amazon's Alexa.

Separately, Amazon is reportedly developing an artificial intelligence-powered chip that will bolster devices using its smart assistant, Alexa, amid efforts to consolidate its lead in the consumer-facing AI segment and keep Google at bay.

