Apple’s self-driving car spotted on the 101 in Palo Alto

“For the last few weeks, Apple has been testing its autonomous driving software in three Lexus RX450h SUVs, which have been photographed around the San Francisco Bay Area,” Juli Clover reports for MacRumors.

“This morning, MacRumors reader Andrew was able to capture some video of one of the SUVs, giving us our closest look yet at the vehicles Apple is using to test its system,” Clover reports. “The SUV is equipped with multiple sensors and cameras, making it easy to see when it’s out on the road.”

“It uses a top-end Velodyne 64-channel lidar, two or more radar, and several cameras to navigate,” Clover reports. “The video was captured on the 101 North, with Apple’s vehicle exiting towards University Avenue in Palo Alto.”

Read more in the full article here.

MacDailyNews Take: There’s nothing quite like video shot in portrait, especially when shot by someone driving on the freeway. A more effective commercial for what autonomous vehicles can spare us from would be difficult to conceive.

11 Comments

  1. Navigating in a complex, 3-D environment is extremely challenging. It gives one a new perspective on the incredible flexibility and power of the human brain and sensory organs. It is impossible at this point to fully replicate either of those human capabilities, and that makes it extremely difficult to deal with the outlier situations – poor weather/visibility, irrational drivers, poorly or improperly marked roads, road construction and other temporary changes.

    But sensor technologies and algorithms continue to improve and it is now possible to embed considerable computing resources into vehicles. As sensors improve and drop in cost, I have no doubt that autonomous vehicles will continue to evolve over the coming decades and begin to play a substantial role in U.S. transportation. However, I am a bit skeptical regarding the degree of reliable vehicle autonomy that can be successfully achieved in the mass market over the next three to five years. These things always take longer than the optimistic projections of enthusiastic CEOs and engineers. Always…

  2. Vertical video is about as useless as a pencil under water. But because the primary user interface on our phones is designed for vertical orientation (for ease of holding the phone), vast numbers of people now shoot vertical video. It was bad enough when video was standard-def, with an image format closer to a square (4:3), but now with HD all we get is a narrow sliver of video squeezed tight between two large black bars. It doesn’t help that, with the phone’s lens being wide-angle, the main subject of the video sits in the middle, with the top and bottom showing vast blue sky (or ceiling) and ground (or floor). Upload such a video to YouTube, then embed it on Facebook: people normally flip through their Facebook feed holding the phone vertically, so the horizontal YouTube player is squeezed and shrunk to fit the width of the upright phone, and inside it, between black bars, sits the narrow vertical strip. On an average iPhone (6/6s/7), that vertical video ends up about 20 mm tall by 14 mm wide (0.8″ × 0.5″), and inside that tiny elongated rectangle the actual action takes place in an area about 1/2″ wide by 1/3″ tall. On an iPhone 5/5s/SE, it is even smaller.

    About a year ago, I read an article about a purported debate over whether shooting vertical video is truly so terribly wrong. Facebook actually tried to go along with it for a while, allowing vertical videos from phones to be embedded vertically so they could fill the screen. Ultimately it moved away from that, and after a period of constraining video to a square, Facebook now prefers horizontal video for both Facebook Live and other embedded video.

    Apple should really display a prominent message every time the user launches video recording mode in the Camera app, encouraging them to turn the phone to a horizontal position before filming HD (or 4K) video. There is really zero benefit to shooting vertical.

      1. It would be nice if the Camera app gave you a warning to switch to landscape when in video mode; you could simply ignore it if you wanted to. I edit a lot of corporate selfie videos, and it’s so ugly and a royal pain.
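The shrinking the commenter describes in the thread above is simple geometry: a horizontal 16:9 player fills the phone’s width, and a vertical 9:16 clip inside it fills only the player’s height, pillarboxed on both sides. A minimal sketch of that arithmetic (the 58.5 mm screen width is an assumed figure for a 4.7-inch iPhone, not from the comment, and the aspect ratios are the standard 16:9 / 9:16):

```python
def pillarboxed_size(player_width_mm, player_aspect=16 / 9, video_aspect=9 / 16):
    """Visible size of a vertical (9:16) clip inside a horizontal (16:9) player.

    The player fills the phone's width; the vertical clip fills the
    player's height and is pillarboxed (black bars) on the sides.
    """
    player_height = player_width_mm / player_aspect  # height of the 16:9 player
    video_height = player_height                     # clip fills the player's height
    video_width = video_height * video_aspect        # 9:16 clip is narrower still
    return video_width, video_height


# Assumed ~58.5 mm display width for a 4.7" iPhone (hypothetical figure)
w, h = pillarboxed_size(58.5)
print(f"visible clip: {w:.1f} mm wide x {h:.1f} mm tall")
```

Even before accounting for the feed chrome around the embed, the visible clip comes out to a strip only a couple of centimeters across, in the same ballpark as the few-millimeter figures the commenter cites.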

  3. Meh. It’s still in the clunky junk-stuck-on-a-car stage. The purpose is clearly testing. As with Google’s junk-stuck-on-a-car, no one is going to want to own something looking like that.

    All of the autonomous car R&D is of course very much part of the bleeding edge robotics movement. But it’s great to see Apple has clearly invested itself into the fray. We know they’ll be the company to make it the most user-friendly.

  4. What I don’t understand is why Apple’s and Google’s cars look so shitty/obvious when Tesla’s self-driving cars are sleek/integrated/seamless. Why even bother mounting some gross spinning rooftop lidar system when cameras and software have been shown to work very well?

    1. Simple — because they aren’t for sale. Engineers are just bolting prototype hardware onto imported sheet metal.

      If Apple ever gets the hardware to work, Ive will swoop in, remove half of the usability features, flatten it, remove as much local processing and memory as possible, then make sure users couldn’t adjust the seat or open the windows. Apple maps would direct you to only those locations Siri can lookup on Wikipedia, and it would not operate outside wifi or cellular coverage.
