Here’s what a Zoom call looks like on the Apple Vision Pro

Zoom launches new app for Apple Vision Pro

Today Zoom announced a new app designed for Apple Vision Pro, available on February 2, that gives users the ability to connect with colleagues and customers in a more immersive, real-world environment. Zoom’s app for Apple Vision Pro seamlessly blends video conferencing with the user’s physical space, blurring the line between in-person and remote meetings on Apple Vision Pro’s infinite canvas and helping distributed teams feel more connected and included.

“Zoom on Apple Vision Pro helps teammates stay connected no matter when and where they work, or how they communicate and collaborate,” said Smita Hashim, chief product officer at Zoom. “The integration of Zoom on Apple Vision Pro and visionOS extends our platform promise of allowing participants the ultimate meeting experience from anywhere and delivers on our commitment of seamless collaboration and communication.”

With people working in-person, remotely, or hybrid, Zoom is helping users facilitate connection, productivity, and collaboration with their colleagues and customers. Zoom’s app on Apple Vision Pro provides users with a native experience for meetings, giving them exceptional audio and video quality, as well as the ability to easily start, join, collaborate, and schedule meetings.

Apple Vision Pro users can look forward to a number of new features with Zoom on visionOS:

• Personas: Users are represented by an authentic spatial representation of themselves in Apple Vision Pro, allowing other meeting participants to see their facial and hand movements.

• Spatial Zoom experience: Zoom on Apple Vision Pro creates an immersive experience that can be scaled to the perfect size, which is ideal for users wanting to feel like they are in the same room as their colleagues and customers, without the need for additional physical equipment or setup.
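
For readers wondering what “scaled to the perfect size” means in practice, the snippet below is a minimal, hypothetical sketch built only on Apple’s public SwiftUI scene APIs for visionOS; it is not Zoom’s code, and the MeetingApp/MeetingView names and the default dimensions are placeholders for illustration.

    import SwiftUI

    // Minimal, hypothetical visionOS app skeleton; not Zoom's code.
    @main
    struct MeetingApp: App {
        var body: some Scene {
            // An ordinary windowed scene; the wearer drags the window's corner
            // to grow it to whatever size feels right on the "infinite canvas."
            WindowGroup {
                MeetingView()
            }
            .defaultSize(width: 1280, height: 720)   // placeholder starting size, in points
            .windowResizability(.contentMinSize)     // allow free resizing above the content's minimum
        }
    }

    struct MeetingView: View {
        var body: some View {
            Text("Meeting in progress")   // stand-in for the real video grid
                .font(.largeTitle)
                .padding()
        }
    }

The resize gesture itself is handled by the system; the app only declares a default size and a resizability policy.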

Additional capabilities for Zoom on Apple Vision Pro, expected to be available later this spring, include:

• 3D object sharing: For some industries, like media and design, sharing 3D files is an important part of the workflow. While it’s possible to share these files on traditional screens, the experience comes to life when objects are seen in the context of an environment. For instance, an animator or game designer could collaborate and share the latest character model via Zoom’s 3D object sharing capabilities (a minimal display-side sketch follows this list).

• Zoom Team Chat: Team Chat is tightly integrated with Zoom Meetings and makes it easy to share information and connect with teammates. Apple Vision Pro users will be able to collaborate with colleagues and streamline communications using Team Chat in the Zoom app.

• Real-world pinning: Users will be able to pin up to five Zoom Meeting participants anywhere in their physical space, with the option to remove the background of pinned participants, helping Apple Vision Pro users feel more connected to the people in the meeting.
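
As a rough illustration of the 3D object sharing idea above, here is a minimal, hypothetical visionOS sketch that displays a USDZ asset in a volumetric window using Apple’s public SwiftUI and RealityKit APIs. It is not Zoom’s implementation; the scene id, window dimensions, and the “CharacterModel” asset name are invented for the example.

    import SwiftUI
    import RealityKit

    // Hypothetical volumetric scene for previewing a shared 3D asset; not Zoom's code.
    struct SharedModelScene: Scene {
        var body: some Scene {
            WindowGroup(id: "shared-model") {
                SharedModelView()
            }
            .windowStyle(.volumetric)   // a bounded 3D volume placed in the user's room
            .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
        }
    }

    struct SharedModelView: View {
        var body: some View {
            // "CharacterModel" is a placeholder USDZ resource bundled with the app.
            Model3D(named: "CharacterModel") { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } placeholder: {
                ProgressView()   // shown while the asset loads
            }
        }
    }

How a shared model would actually arrive during a Zoom meeting is up to Zoom’s own transport; the sketch covers only the local display side.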

In December 2023, Zoom announced a new app for Apple TV, furthering its mission of creating one platform that delivers limitless human connection across the Apple ecosystem, which also includes experiences for iPhone, iPad, Mac, and CarPlay.

MacDailyNews Note: Users will be able to download Zoom from the App Store when Apple Vision Pro launches on February 2, 2024.


17 Comments

    1. It scans your appearance into a Persona and then animates your every movement. It’s like an AI special effect, and others won’t necessarily be able to tell you’re using a Vision Pro.

      1. That’s amazing that it can animate your every movement with only cameras looking at your eyes and any body parts that might come into view of the external cameras. The mouth area and any other facial movements seem to be rather difficult to translate as the current device is designed.

    1. I don’t understand. Why would your avatar have goggles on? Isn’t the point of making a visible avatar so you can see the other user without them? Or as some other character?

        1. I’m sorry, that still doesn’t make sense. Are you saying that you would use another camera to capture yourself wearing the visor in Zoom, and real-time AI would remove the visor from that image? What’s the purpose of wearing the Vision Pro to participate in the meeting at that point?

  1. This is absolutely stupid. The whole point of using cameras with Zoom is to see each other. If attendees are not seeing the real person, what is the point? The built-in Avatars in Zoom are annoying enough as it is.

    I miss Steve Jobs’ Apple and Steve Jobs’ products. Apple has become nothing more than a woke, profit-hungry big corporate machine. I wish they would return to focusing on great products and leave the social crusading out. Just look at the iPhone, a great device, yes; however, the UI hasn’t changed much since it came out. I do have to say Android phones have iOS beat by a mile in the UI department. Give me the ability to better organize my home screen (why can’t I have empty space between icons, why can’t I use animated wallpaper, etc.)?

  2. Using Apple Vision Pro to conduct a video conference call seems like going around your elbow to touch your nose—just because you can, doesn’t mean it’s an awesome way to do things.

  3. Somewhere amidst the presumptive and differentiating “it just works” motto, spell check remains lost in the ’80s, but I believe even DOS did better. I’ve been an annoying drip, drip, drip with this issue, but I am often left with mouth gaping… especially when 99.8% of the time, Goog provides an answer OUTSIDE the macOS/iOS app.

    In the same vein, I might as well mention the cursor….It was approx 8 yrs ago that AAPL changed the iOS cursor. The reason still befuddles me as I regularly fiddle with the precise placement of the “new” tool. One would expect user facility would adapt to a new tech after this amt of time, OR the tech would evolve? I work with my hands often in 3-D art, btw…I’m not necessarily a klutz…but klutzie is the feeling in this realm.

    I often think back to a Business Week (as I remember) article that provided ammunition for my Apple sales efforts when the colored iMac was thought of as a toy–approx ’01-02. It discussed UI interfaces and the benefits of AAPL v MS… Mouse precision was one item, and how much time using an Apple would actually save the business worker. MS mice were like punts/gondolas, compared to the facility of the Apple mouse. Yes, I’m referring to desktop mice, but the precision factor in iOS–though it looks “sharp,” DOES NOT save more time as an “upgrade” to the prev cursor. It’s still a befuddling klunker. I bet spell ck and the cursor spend free-time together.

    Bryan, I think spatial computing is going to advance computing, but I get the concern of separation from the “real person.” As well, I do miss Jobs’ laser focus on the biz and NOT being diverted (apparently, at least) by “crusading.” This is the biggest downfall… not just b/c of the time leak, but also the hubris that’s inextricably linked. Tim has a nanny mindset.

    Something not mentioned that’s destroying my Apple experience, and that’s related to hubris, is control. Apple is progressively shutting out the user’s UI decisions. Increasingly and tangibly, Apple is determining the “ride” THEY want me to have. My choices are becoming more challenging to make, and sometimes they are overridden by the master. This aspect is deflating, and no other aspect has catalyzed my wondering, “maybe there’s another alternative.” Searching…

  4. I’m more interested in the prosaic but important things. Can you see a Wi-Fi signal icon and battery percentage in the corner of your field of vision if you so choose? Or is Apple going to hide these fundamental pieces of information for a “cleaner UI” like they’ve done in the past?

    I agree that so far we’re seeing apps that look different (and very cool), but we haven’t yet been shown a compelling improvement in functionality. How many people want to be that first weirdo with a digital avatar on a Zoom call instead of their real face? I’d rather just own it and have everyone see me wearing the goggles with the EyeSight feature looking at them 😅

    1. Actually I was wondering how the Vision Pro scanned your mouth area to reflect that movement to the avatar. Just using the mic would suffice I guess for opening and closing the avatar mouth but smiles, frowns, etc. would be difficult just with audio cues.

      1. nvm. Just saw the iJustine vid covering her experience with the Vision Pro. I see there are 4 cameras on the underside of the Vision Pro to track your hands and I suspect they are also used to track your lower face movements. The eye track frame rate seems jerky right now on the avatar so it may be a bit disconcerting for some.
