I’m scrambling around a wooden floor, trying to fend off a giant virtual ball with my iPad Pro. It’s a bowling alley, but it feels like a basketball court. A crowd cheers. Eli faces off across from me, pushing his iPad Pro forward, launching the ball at my face. I lose my footing. The ball shoots past me. I run after it. Too late. The pins are knocked down. I lose. Again.
But in reality we’re on a big court that’s totally empty, except for us and our test iPads. We see the big ball, the bowling pins, and each other, running around among these things as if they’re in our shared space. Welcome to Apple’s imminent AR future: fast-paced, collaborative, and, for now, still headset-free.
Apple unveiled a number of AR tools at its WWDC developer conference that are coming this fall, including Reality Composer, a full AR-creation toolkit. Swift Strike, the bowling demo, is meant to show off ARKit 3, which needs a recent A12-equipped iPhone or iPad for its most impressive effects. And it shows how far things have come in a year.
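The ARKit 3 features behind a shared demo like this are collaborative sessions (multiple devices mapping one space) and people occlusion (letting virtual objects pass behind real players). Apple hasn’t published Swift Strike’s source, so this is only a minimal configuration sketch of those two capabilities, not the demo’s actual code:

```swift
import ARKit

// A sketch of an ARKit 3 session configuration, assuming a multiplayer
// demo like Swift Strike would enable collaboration and people occlusion.
let configuration = ARWorldTrackingConfiguration()

// Collaborative sessions (new in ARKit 3) let multiple devices
// continuously share world-mapping data in one AR space.
configuration.isCollaborationEnabled = true

// People occlusion relies on the A12's Neural Engine, which is why
// ARKit 3's most impressive effects need a recent iPhone or iPad.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// A RealityKit ARView (or ARSCNView) would then run this configuration:
// arView.session.run(configuration)
```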
I came away thinking that all I was really missing was the convenience of a pair of AR glasses, so I wouldn’t have to keep looking down at an iPad.
MacDailyNews Take: Patience, Padawans.
So when real people are in the game, it looks silly because they’re staring into their iPhones or iPads (which is, admittedly, like real life today), but imagine this in the future with Apple Glasses instead. Then it will all work so much better. — MacDailyNews, during Apple’s WWDC Keynote, June 3, 2019