More Than 2 Billion Shipments of Devices with Machine Learning will Bring On-Device Learning and Inference Directly to Consumers by 2027, Finds ABI Research

  • The processes of inference and learning that form the backbone of AI typically take place in big servers, far removed from consumers. ABI Research finds that new models are changing that.

  • Recent Federated Learning, Distributed Learning, and Few-Shot Learning frameworks can be deployed directly on consumers’ devices that have lower compute and smaller power budgets, bringing AI to end-users.

  • It will take up to 10 years for such on-device learning and inference to become operative, and the shift will require adopting new technologies, such as neuromorphic chips.

Artificial Intelligence (AI) is all around us, but the processes of inference and learning that form the backbone of AI typically take place in big servers, far removed from consumers. New models are changing all that, according to ABI Research, a global technology intelligence firm, as the more recent frameworks of Federated Learning, Distributed Learning, and Few-Shot Learning can be deployed directly on consumers’ devices that have lower compute and smaller power budgets, bringing AI to end-users.

“This is the direction in which the market has increasingly been moving, though it will take some time before the full benefits of these approaches become a reality, especially in the case of Few-Shot Learning, where an individual smartphone would be able to learn from the data that it is itself collecting. This might well prove to be an attractive proposition for many, as it does not involve uploading data onto a cloud server, making for more secure and private data. In addition, devices can be highly personalized and localized, as they can possess high situational awareness and a better understanding of their local environments,” explains David Lobina, Research Analyst at ABI Research.
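To make the idea concrete, the privacy property described above is the core of Federated Averaging (FedAvg), the canonical federated learning scheme: each device trains on its own data and ships only model weights, never raw data, to a coordinating server. The sketch below is purely illustrative, using a hypothetical one-parameter linear model and toy client datasets rather than any vendor's actual framework.

```python
def local_update(weights, data, lr=0.1, epochs=5):
    """One client's local training: gradient descent on squared error
    for a toy one-parameter linear model y = w * x, using only the
    data held on that device."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_average(global_w, client_datasets):
    """One communication round: every client trains locally, then the
    server averages the returned weights, weighted by local dataset
    size. Raw samples never leave the clients."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Three simulated devices, each holding private samples of the same
# underlying relation y = 2x.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(0.5, 1.0), (4.0, 8.0), (1.5, 3.0)],
]

w = 0.0
for _ in range(20):
    w = federated_average(w, clients)
# w converges toward the true slope 2.0, learned collaboratively
# without any raw data leaving a "device"
```

The same round-based pattern underlies production systems; real deployments add compression, secure aggregation, and client sampling on top of this averaging step.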

ABI Research believes that it will take up to 10 years for such on-device learning and inference to become operative, and the shift will require adopting new technologies, such as neuromorphic chips. The shift will take place in more powerful consumer devices, such as autonomous vehicles and robots, before making its way into the likes of smartphones, wearables, and smart home devices. Big players such as Intel, NVIDIA, and Qualcomm have been working on these models in recent years and, alongside neuromorphic chipset players such as BrainChip and GrAI Matter Labs, have provided chips that offer improved performance on a variety of training and inference tasks. The take-up is still small, but these technologies can potentially disrupt the market.

“Indeed, these learning models have the potential to revolutionize a variety of sectors, most probably the fields of autonomous driving and the deployment of robots in public spaces, both of which are currently difficult to pull off, particularly in co-existence with other users,” Lobina concludes. “Federated Learning, Distributed Learning, and Few-Shot Learning reduce the reliance on cloud infrastructure, allowing AI implementers to create low-latency, localized, and privacy-preserving AI that can deliver a much better experience for end-users.”

To read more, please visit: https://www.abiresearch.com