Halide team peeks into the future with iPad Pro’s LiDAR Scanner

On March 18th, Apple unveiled its most advanced iPad Pro. With the A12Z Bionic chip, iPad Pro is faster and more powerful than most Windows PC laptops. The new iPad Pro adds an Ultra Wide camera, studio-quality mics and a breakthrough LiDAR Scanner that delivers cutting-edge depth-sensing capabilities, opening up more pro workflows and supporting pro photo and video apps. The LiDAR Scanner, along with pro cameras, motion sensors, pro performance, pro audio, the Liquid Retina display, and powerful apps, extends the lead of iPad Pro as the world’s best device for augmented reality.

The Halide team looks at iPad Pro’s new camera array, what’s changed, the whole new “3D” LiDAR sensor — and what this new technology enables.

The Halide team built a proof-of-concept called Esper that uses iPad Pro’s LiDAR Scanner

Sebastiaan de With for Halide:

If you need something to compare it to, it’s the iPhone 8 camera. Don’t expect parity with Apple’s latest iPhone 11 Pro shooters, but it’s still a great set of cameras…

In short, the iPhone 11 and 11 Pro pack a significantly larger (and better) sensor in their wide-angle cameras, compared to iPad. The ultra-wide sensor on iPhone is comparable in quality to the ultra-wide on iPad, but the iPad’s is lower resolution.

Here’s the biggest and totally new thing, then: the LiDAR sensor… The LiDAR sensor, also known as a 3D ‘Time of Flight’ sensor (ToF for short), is exceptionally good at detecting range.

Regular camera sensors are good at capturing focused images, in color. The LiDAR sensor does nothing like this. It emits small points of light and, as they bounce off your surroundings, times how long the light takes to come back.

This sounds crazy, because it’s timing something moving at the speed of light. The windows of time involved amount to hundreds of picoseconds. Pico? Yes, pico: that’s three orders of magnitude smaller than a nanosecond! A picosecond is 0.000000000001 seconds…
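To see why the timing has to be that fine, the round-trip arithmetic can be sketched in a few lines. This is a minimal illustration of the time-of-flight principle described above, not Apple’s implementation; the function name is hypothetical.

```python
# Time-of-flight distance sketch: a LiDAR emits a light pulse and
# measures how long it takes to bounce back. Distance is half the
# round trip multiplied by the speed of light.
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface given the pulse's round-trip time."""
    return C * round_trip_seconds / 2

# An object 1 meter away returns light in roughly 6.7 nanoseconds,
# so resolving centimeter-scale differences means resolving
# differences of tens of picoseconds.
t = 2 * 1.0 / C                    # round-trip time for 1 m
print(f"{t * 1e9:.2f} ns")         # ≈ 6.67 ns
print(tof_distance(t))             # 1.0 (meters)
```

The takeaway is that the sensor’s range resolution is set entirely by how precisely it can time the returning light.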

With Halide, we’d love to use the depth data in interesting ways, even if it’s low resolution. There was only one problem: there are no APIs that give us, as developers, access to the underlying depth data. Apple only exposes the processed 3D surface.

What if we re-thought photographic capture, though? We built a proof-of-concept we’re calling Esper.

MacDailyNews Take: We can’t wait to see what Halide and myriad other developers can do with the 3D ToF LiDAR Scanner in iPad Pro and, soon, we believe, iPhone!


  1. The LiDAR will make it VERY easy for just about anybody to make realistic fake images, using AR to capture objects and superimpose them on another image. Once the APIs are available and rolled out to more and more iPhones, distinguishing between real and fake media will get harder and harder as the masses catch on. My 2 cents…

        1. Coming from the other side, I completely agree. In fact, why are we letting politics stir us up? If ever we needed evidence, we have it. Now is the time to pull together and cut this ridiculous tribal crap. We don’t agree on everything. So what?

