Yesterday I went to the moon, the top of Mount Hood volcano, and on safari to see an elephant so close it could have sprayed me with its muddy trunk.
Such bucket list experiences are out of reach for most, but I breezed through them in half an hour.
That was all thanks to Apple’s debut headset, the Vision Pro.
True, I looked like a giant fly on the sofa inelegantly pinching the air, but I felt truly immersed in these scenes.
It would take more than a minute with the gorgeous mindfulness app to ground me again after a whistle-stop trip through the world’s most mind-blowing experiences.
I’ve tried out VR headsets before, and found myself flailing around hitting furniture and losing my balance. They were interesting – kind of – but not something I wanted or needed.
Apple’s new ‘spatial computer’ offering, the Vision Pro headset launching in the UK on Friday, made me question whether that was still the case.
As the product is still so new, apps for all of these experiences may not exist yet. But trying it out made me think they would all be possible.
There were moments in the product demo where I felt genuinely emotional, such as looking at a 3D photograph of a child blowing out their birthday candles. It made me feel bad for my son that I didn’t have similar photos of him; he would have to settle for iPhone clips.
To address the elephant in the room (having just seen an elephant in the room), the Vision Pro is expensive.
Starting at £3,499, it is even more expensive for Brits than for buyers in the US, where it has been in shops since February at the same figure in dollars – $3,499, or around £2,731.67 if converted to sterling.
While it’s still just about possible to save that much, it’s not exactly accessible to the masses, who may prefer to put the money towards a house deposit.
I won’t be buying one for this reason, although if it were cheaper I would definitely be tempted.
It was crazy to me that I could control something so seamlessly using my eyes. After pressing one button on the top right to bring up a menu, I didn’t need to move my head, point to things or nod. My eyes alone were enough to select the apps I wanted, and pinching my thumb and finger together stood in place of a tap or mouse click to select.
There is also a dial to select how immersed you want to be, tuning out others around you or being able to see them as well as whatever else you’re looking at.
The product is not perfect, and there are elements I can immediately identify as targets for improvement in future versions.
But wow – what a strange and futuristic thing to be able to take home with you and set up in minutes.
It was quick to do the scans of my eyes and hands, and as I wear glasses, my prescription was measured with a lensometer so I could use the correct optical inserts.
Maybe this is why the 3D effects felt so much more vivid to me than I have usually experienced them. Wearing 3D glasses to watch an IMAX film never worked that well for me, but these images were undeniably 3D.
The most impressive part of the demo was the immersive video, where you don’t see the edges of the screen like you would in a cinema. When you turn your head, you see different parts of the picture, which makes the overall effect feel much more real.
While I knew rationally these were pixels on a screen, my brain didn’t completely accept that, so I felt vertigo watching a woman walk a precipitous tightrope.
The most shocking part was seeing groups of people so clearly – they seemed so close and so real that they felt as if they were in my personal space.
I couldn’t help but think of other uses for the technology; undoubtedly someone has already rendered a lap dance in immersive tech. Videos from warzones would be similarly immediate, making it possible to witness the deaths in Gaza in super high definition.
What would the effect be of watching content like this on demand every day? Are our minds ready for just how graphic the new graphic content is going to be?
I guess it is an endorsement of the product that these were the questions I started thinking about.
I didn’t have enough time to test many of the features, such as gaming, art, typing, taking my own photos and video, or using FaceTime.
But there are certain people and professions for whom the tech already seems like a no-brainer.
If you want to create things in 3D, in architecture for example, it is a perfect way to view them from all sides in a realistic perspective. Medical students could learn anatomy without the need to dissect an actual corpse, with detail rendered realistically enough for the learning to translate.
I’m not sure I’d justify buying one just to entertain myself and take home videos, although I hope enough people do that Apple keeps investing in the technology.
There are some aspects that seem ripe for improvement, such as making the feel of the headset lighter.
It is not uncomfortable to wear, and I wouldn’t call it heavy, but it’s definitely substantial to the point that I wouldn’t wear it all day and would rather be sitting down to use it.
Although it looks like reality, what you see through the headset is entirely recreated by cameras and pixels (the eyes you may see on the front are also fake, displayed to make conversations feel more natural).
Videos and photos you watch through the headset are crisp, but the ‘real world’ was not as high definition, appearing ever-so-slightly fuzzy.
There were also occasional glitches, such as when I looked down at my hands in an immersive view and saw a ragged white halo around them.
These were minor snags though.
Overall, the experience was as polished as the glass headset itself.
If you want to see for yourself, you can book a demo from this Friday at any Apple store.