Forget 5G. Apple is really betting on this technology with iPhone 12

Apple hasn’t yet introduced an augmented reality device or even promised one, but the company recently took steps to build the most advanced AR ecosystem in the world.

iPhone 12 Pro Max features the largest display ever on an iPhone, a pro camera system with the new sensor-shift OIS on the Wide camera and longer focal length Telephoto camera, a LiDAR Scanner, A14 Bionic, and so much more.

An all-new LiDAR Scanner comes to the Pro line, offering the ability to measure light distance and use pixel depth information of a scene. This technology delivers faster, more realistic AR experiences and improves autofocus by 6x in low-light scenes for more accuracy and reduced capture time in photos and videos. This advanced hardware, combined with the power of the Neural Engine of A14 Bionic, also unlocks Night mode portraits, rendering a beautiful low-light bokeh effect.
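For developers, the LiDAR Scanner's depth data surfaces through ARKit. As a rough sketch (using real ARKit APIs, though actual apps would also need view setup and device checks), enabling per-frame scene depth and mesh reconstruction looks something like this:

```swift
import ARKit

// Sketch: enabling LiDAR-backed depth features in an ARKit session.
func configureSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()

    // The LiDAR Scanner exposes a per-frame depth map via scene depth.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // It also enables a reconstructed 3D mesh of the surroundings,
    // which is what makes occlusion and surface placement so fast.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    session.run(config)
}

// Each ARFrame then carries the depth data:
// session.currentFrame?.sceneDepth?.depthMap  // CVPixelBuffer of distances
```

The depth map gives real measured distances per pixel, rather than the surface positions older ARKit devices had to infer from camera imagery alone.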

Travis Hoium for The Motley Fool:

LiDAR technology built into the iPhone 12 will be a critical advance for AR and is a precursor to a long-rumored augmented reality/virtual reality device sometime in the next few years.

AR has long been a tool for Apple, but the inclusion of LiDAR in the iPhone 12 Pro along with the iPad Pro is an indication that Apple is taking augmented reality very seriously. Instead of approximating where surfaces are with images, lidar gives an accurate depth picture of a space, which can then be used for a variety of purposes that could lead to new devices and keep Apple growing for years to come…

Lidar will not only enhance existing AR applications, it will likely open up more innovations by developers. And that’s where the real value of AR will likely be unlocked. The iPhone will be the first place where lidar and AR are going mainstream, but it probably won’t be the last.

Apple has long been rumored to be working on augmented reality glasses.

MacDailyNews Take: Apple’s years of work on AR and VR will pay off in smartglasses as soon as 2021!

As iPhone was to Nokia and Apple Watch is to Swatch, Apple Glasses will be to Ray-Ban et al. — MacDailyNews, March 8, 2019

Once Apple’s augmented reality smartglasses are released, people are going to want to wear them during every waking hour. — MacDailyNews, February 26, 2019

Someday, hopefully sooner than later, we’ll look back at holding up slabs of metal and glass to access AR as unbelievably quaint. — MacDailyNews, July 28, 2017

The impact of augmented reality cannot be overstated. It will be a paradigm shift larger than the iPhone and the half-assed clones it begat. — MacDailyNews, August 4, 2017

Augmented Reality is going to change everything. — MacDailyNews, July 21, 2017

5 Comments

  1. My first thought has always been that I’d never wear smart glasses, even though I do wear glasses. If they came up with a way for smart glasses to correct my vision without having to go to an eye doctor and get new glasses, I might have to change my mind. Sounds like it might be possible.

  2. So far, with that LiDAR Scanner you can measure things, so you don’t need to carry a physical tape measure. Wow! I really have to see how Apple can leverage this component to their advantage. You never hear anyone saying Samsung doesn’t have a LiDAR Scanner on their smartphones, but you sure hear a lot of criticism about Apple not having a folding smartphone.

    I doubt many people are asking for AR glasses but Apple seems to think that’s what consumers will want. Beats me. I need to see what these glasses can do before I pass judgment about them. If I could only use them as a written language translation device, I’d buy a pair. I honestly don’t know what else I could use them for.

  3. …if Apple could create AR that projects an image over what a person is actually seeing, they could perform actions such as measuring (as mentioned above), following a blueprint/diagram, using multiple sequential images to perform a repair, projecting a 3D map over a person’s current location with arrows showing which way to go, surgeons matching anatomy/procedure steps over what they’re actually looking at, and so on.

    There could be a lot more if the proposed glasses can project those kinds of images in a usable form that doesn’t block vision. This is a bit of pie in the sky, but consider that the iPad and the Apple Watch were thought to be just that before their introduction, and just a few years later, I can’t be without mine. Hopefully, this is another instance where Apple is (to paraphrase a somewhat overused Wayne Gretzky quote) “skating to where the puck is going to be” to create realistic, valuable, usable, useful, sensible, intelligent tools for us (i.e., everything that Google Glass wasn’t).
