“When Apple introduced its 64-bit A7 processor in September 2013, they caught the industry by surprise. According to an ex-Intel gent who’s now at a long-established Sand Hill Road venture firm, the competitive analysis group at the imperial x86 maker had no idea Apple was cooking a 64-bit chip,” Jean-Louis Gassée writes for Monday Note. “As I recounted in a September 2013 Monday Note titled 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months, competitors and Intel stenographers initially dismissed the new chip. They were in for a shock: Not only did the company jump to the head of the race for powerful mobile chips, but Apple also used its combined control of hardware and software to build what Warren Buffett refers to as a ‘wide moat.’”

“I think we might see another moat being built, this time in the fields of Augmented Reality (AR), Machine Vision (MV), and, more generally, Machine Learning (ML),” Gassée writes. “At last week’s WWDC, Apple introduced ARKit… As many observers have pointed out, Apple just created the largest installed base of AR-capable devices.”
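For readers curious what “AR-capable” means in practice, here is a minimal Swift sketch of an ARKit session driving an ARSCNView; the view controller and outlet names are assumptions made for this illustration, not code from Apple or from Gassée’s article:

```swift
import UIKit
import SceneKit
import ARKit

// Illustrative sketch only: the class and outlet names are assumptions
// for this example, not Apple sample code.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with motion-sensor data so
        // virtual content stays anchored to real-world positions.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Called when ARKit finds a new anchor, such as a horizontal plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected a plane with extent \(plane.extent)")
    }
}
```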

“Apple also introduced CoreML, an application framework that integrates ‘trained models’ into third-party apps,” Gassée writes. “Couple Apple’s AR and ML announcements with the putative ANE [Apple Neural Engine] chip and we have an integrated whole that sounds very much like the Apple culture and silicon muscle we’ve already witnessed, a package that would further strengthen the company’s moat, its structural competitive advantage.”
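To illustrate what “integrates ‘trained models’ into third-party apps” looks like in code, here is a minimal Swift sketch that runs a Core ML image classifier through the Vision framework. The FlowerClassifier model name is hypothetical and assumes the app bundles a compiled .mlmodel that Xcode exposes under that class name; nothing here comes from the article itself.

```swift
import CoreML
import Vision
import CoreGraphics

// Illustrative sketch only: assumes a compiled Core ML image classifier
// that Xcode exposes as a `FlowerClassifier` class (hypothetical name).
func classify(_ image: CGImage) throws {
    // Wrapping the Core ML model in a Vision request lets Vision handle
    // scaling and color conversion of the input image.
    let model = try VNCoreMLModel(for: FlowerClassifier().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation],
              let best = observations.first else { return }
        print("Top label: \(best.identifier), confidence: \(best.confidence)")
    }

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```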

Read more in the full article here.

MacDailyNews Take: Apple 3.0’s Philip Elmer-DeWitt recounts what Apple’s Craig Federighi last week told The Talk Show’s John Gruber regarding the rumored Apple Neural Engine:

…GPUs are a case where, as we’ve been able to shrink transistor density, you can essentially throw more and more transistors at the problem of graphics processing and it pretty much scales up. It’s just a very parallelizable task. And it turns out that if you want teraflops of performance to run a machine learning model, you can parallelize that on a GPU and you can get tremendous wins. — Craig Federighi
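Federighi’s point about parallelizable work is easy to see in code. The Swift sketch below dispatches a trivially data-parallel Metal compute kernel in which every GPU thread handles one element independently, the same property that lets GPUs chew through the matrix math inside machine learning models. The kernel, buffer sizes, and scale factor are assumptions invented for this illustration, not anything Apple has described.

```swift
import Metal

// Illustrative sketch only: a trivially data-parallel compute kernel.
// The kernel, buffer sizes, and scale factor are made-up assumptions.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;

// Each GPU thread scales exactly one element, independently of the others.
kernel void scale(device float *data [[buffer(0)]],
                  constant float &factor [[buffer(1)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= factor;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("Metal is not available on this device")
}

let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)

// 1,024 elements, each handled by its own GPU thread.
var input = [Float](repeating: 2.0, count: 1024)
var factor: Float = 3.0
let buffer = device.makeBuffer(bytes: &input, length: input.count * MemoryLayout<Float>.stride)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.setBytes(&factor, length: MemoryLayout<Float>.stride, index: 1)
// 16 threadgroups of 64 threads cover all 1,024 elements.
encoder.dispatchThreadgroups(MTLSize(width: 16, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Read the results back from GPU memory.
let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(results[0]) // 6.0
```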

SEE ALSO:
Apple steps up homegrown GPU plans with London hiring spree – April 9, 2017
Why Apple’s ditching of Imagination is critical for the future of the iPhone – and maybe even the Mac – April 5, 2017
Apple could look to buy Imagination Technologies after ditching the chip firm, share price plunge – April 4, 2017
Imagination Technologies’ shares collapse after Apple dumps UK chip designer – April 3, 2017
Apple nabs top talent from iPhone 7 GPU chipmaker Imagination Technologies – October 13, 2016
After failed takeover talks with Apple, Imagination Technologies sells stake to state-owned Chinese company – May 9, 2016
Apple in ‘advanced talks’ to acquire Imagination Technologies for PowerVR GPU – March 22, 2016