“AI tasks, because they are so computationally intensive, often need custom-designed chips for the devices themselves and even custom-designed servers for data centers where AI algorithms are often trained, developed, and deployed from the cloud,” Statt reports. “Amazon, which seeks to stay competitive in the smart home hardware market and in the realm of consumer-facing AI products, has nearly 450 people with chip expertise on staff, reports The Information, thanks to key hires and acquisitions the e-commerce giant has made in the last few years.”
“The plan is for Amazon to develop its own AI chips so Alexa-powered products in its ever-expanding Echo line can do more on-device processing, instead of having to communicate with the cloud, a process that increases response times,” Statt reports. “With regards to AI specifically, Apple designed a new ‘neural engine’ as part of its A11 Bionic chip, which handles on-device processing for machine learning algorithms that power features like Face ID and ARKit apps.”
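The latency argument above can be sketched with a toy model: a cloud-backed assistant pays a network round trip on every request, while an on-device chip does not, so even a slower local model can respond sooner. The figures below are purely illustrative assumptions, not measurements of any Amazon or Apple product.

```python
# Toy comparison of response time for cloud vs. on-device AI processing.
# All latency numbers are hypothetical, for illustration only.

NETWORK_ROUND_TRIP_S = 0.120   # assumed mobile round trip to a cloud endpoint
CLOUD_INFERENCE_S = 0.010      # assumed fast server-side inference
ON_DEVICE_INFERENCE_S = 0.030  # assumed slower local chip, but no network hop


def cloud_response_time() -> float:
    """Request is sent to the cloud, processed there, and the result returned."""
    return NETWORK_ROUND_TRIP_S + CLOUD_INFERENCE_S


def on_device_response_time() -> float:
    """The same request handled entirely by a local AI chip."""
    return ON_DEVICE_INFERENCE_S


if __name__ == "__main__":
    print(f"cloud:     {cloud_response_time() * 1000:.0f} ms")
    print(f"on-device: {on_device_response_time() * 1000:.0f} ms")
```

Under these assumptions the network round trip dominates, which is why moving inference onto the device can cut response times even when the local silicon is less powerful than a data-center server.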
Read more in the full article here.
MacDailyNews Take: The more that can be done on device, as opposed to in the cloud, the better the experience will be.
Apple explains how ‘Hey Siri’ works using a deep neural network and machine learning – October 19, 2017