I Have So Many Questions About Apple’s Vision Pro
There will probably be a long wait for answers
Apple introduced its Vision Pro headset at 2:00 PM my time on Monday. I spent most of the rest of the day and a lot of Tuesday scouring the internet for more information, but most of what I found only raised more questions in my mind.
The battery life
It’s short, apparently only a few hours. However, if I understand this correctly, you can use it plugged in to power, so that might not matter much if you are sitting at a desk or on your couch, as many of Apple’s demo videos showed. It certainly would matter if you were playing games.
How much does it weigh and is it comfortable?
Apple didn’t have a lot to say about this.
I asked Google Bard and it guessed 300–400 grams, which is about 10 to 14 ounces. Everyone I’ve read who actually got to try one says it was “heavy”, maybe too heavy to wear for long. But I looked up motorcycle helmets and found that many weigh over 1,200 grams.
Of course weight distribution could matter a lot; if it’s not evenly distributed, a lighter device could still be uncomfortable.
Being able to plug it in to get around the battery life issue won’t matter if the headset is uncomfortable.
Spatial Computing
That’s how Apple is referring to this: not Augmented Reality, not Virtual Reality, but Spatial. It makes sense to me, and it differentiates their product, but will that confuse people?
Will it make people woozy?
Apple didn’t talk about this, but motion sickness is something that can happen with VR headsets.
I don’t know that there is much Apple can do about that, though perhaps there are remedies for affected people. But motion sickness comes mostly with action-packed games, and I don’t see those as a critical market for this device.
On a slightly related note, the operating system seems to have a full complement of accessibility features built in, and the hardware can be fitted with corrective lenses. I read somewhere that Apple itself will be offering those; that doesn’t make sense to me unless they have partnered with some optometry firm.
Accessibility features include pausing animated images in Safari and Messages and dimming flashing lights in video. That will be true across all devices.
This can be a Virtual Screen for Mac, but is it for iPhone and iPad too?
According to Apple, you just look at your Mac and its screen pops into Vision Pro, where it then responds to eye movement and hand gestures just as the built-in apps do.
All apps? Hmm… that seems unlikely, as I think hooks would need to be added to the app code.
Will you be able to do the same with iPad and iPhone? That would be neat.
I bet someone will open a cafe where you can rent Vision Pro to use with your computer or to watch movies.
Does “Vision Pro” imply a “Vision Air”?
That almost deserves a “Duh, yes!” answer, but I was interested to read on Tuesday that Apple has bought a company that makes headsets for the U.S. military and for the “Mario Kart ride at its theme parks in Japan and LA’s Universal Studios”, according to this article:
Apple has bought an AR headset startup called Mira
Apple has acquired Mira, a Los Angeles-based AR startup that makes headsets for other companies and the US military…
From what I read, Xcode (Apple’s development system) will have an emulator for testing Vision Pro apps, but also will let developers test with the device itself.
That made me wonder what Google and Meta are doing for app development on their products, but they don’t seem to be bragging about it; I couldn’t find anything.
Apple mentioned immersion several times. Apparently they have some lovely 3D scenes you can use as background for FaceTime or simply chill out with.
I think I heard “explore” mentioned (I could be wrong) and immediately wondered about that, as it sounds like it could really be neat. Explore the Grand Canyon, the Ozarks? How about Paris or deep sea diving? Count me in.
Some thought-provoking quotes from Apple’s webpages
“You can pull a 3D object out of an app and look at it from every angle, as if it’s right in front of you.”
I happened to read that shortly after reading someone questioning how long before porn reaches Vision Pro. My guess? About fifteen minutes after the first deliveries.
“Your digital Persona allows others to see you while you’re wearing Vision Pro. It’s a dynamic, natural representation of your face and hand movements while you’re using FaceTime.”
You can see the room you are in and the objects in it, but it’s not see-through glass: cameras pass the view through to displays in front of your eyes.
“Vision Pro works with Bluetooth accessories like Magic Keyboard and Magic Trackpad, which are great for things like complex spreadsheets and long emails.”
Yeah, though a virtual keyboard was also mentioned.
“Within FaceTime, you can also use apps to collaborate with colleagues on the same documents simultaneously.”
The apps have to be SharePlay enabled. I expect most Apple apps will be, but not every third party app. Will Google and Meta shun that?
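For what it’s worth, “SharePlay enabled” means the developer has adopted Apple’s GroupActivities framework. A minimal sketch of what that adoption looks like; the activity name and the document field here are hypothetical, just for illustration:

```swift
import GroupActivities

// Hypothetical activity a document app might define so a FaceTime
// group can collaborate on the same document. Conforming to
// GroupActivity is the core of being "SharePlay enabled".
struct DocumentCollaboration: GroupActivity {
    // Illustrative payload: which document the group is editing.
    let documentID: String

    // Metadata shown in the system's SharePlay UI.
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Edit Together"
        meta.type = .generic
        return meta
    }
}
```

Apps that never declare an activity like this simply won’t show up as shareable in FaceTime, which is why third-party support is an open question.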
What are your questions?