Understanding how the Apple Vision Pro is likely to impact the virtual and augmented reality industry.
Whenever Apple does something, they do it the right way, or rather, the ‘Apple way.’ The Apple way of doing things refers to the company’s approach to design, product development, and branding. The process takes time, often years, which is why Apple products are introduced much later than those of its competitors in the tech industry.
Take virtual reality/augmented reality (VR/AR), for example. The first massive success in selling VR headsets as a consumer product was achieved by Oculus VR with the Oculus Rift, which launched as a consumer device in 2016. Since then, the VR/AR space has grown more competitive, with each company trying to beat the others on price, console, gear, or interface. Now, seven years after the first commercial success of a VR headset, Apple has finally revealed its take on VR – the Apple Vision Pro. Except introducing just another headset to the market is not the Apple way. So, Apple unveiled the Vision Pro without uttering the words ‘virtual’ or ‘reality’ a single time during the entire presentation. Instead, the Apple Vision Pro is being marketed as the world’s first proper spatial computing device.
SPATIAL COMPUTING WITH THE APPLE VISION PRO
The Vision Pro is a first-generation device from Apple and will cost USD 3,500. Apple announced that 1 million units would be produced for the device’s 2024 launch, but some tech review platforms have it on good authority that production complications have already forced Apple to scale that figure down to 400,000. This should come as no surprise, as the tech behind the Vision Pro is astoundingly complicated and requires hefty processing power.
For starters, the device is a stand-alone computer and does not need to be paired with an iPhone or a MacBook. It does not even come with controllers. All interactions with the Vision Pro are done through hand gestures, eye movements, and voice commands. To achieve this feat, the Vision Pro needs cameras, microphones, and sensors all around it – 23 to be exact. Input from these 23 sources must be fed simultaneously to a pair of processors encased in a unit not much larger than a ski mask.
Apple had to create an entirely new dedicated chip, the R1, to handle the sensor input of its mixed reality spatial computing device. It works in tandem with an M2 chip that handles the rest of the processing. Additionally, the Vision Pro, being a standalone device, requires its own operating system, which Apple again had to design from scratch. It is called visionOS, and it is the key to making the Vision Pro’s transition between reality and virtuality feel seamless.
While Apple’s initial projection of 1 million units was a far-fetched ambition, the revised 400,000 units are still expected, based on reviews of the prototype so far, to generate plenty of hype and buzz.
THE TECH WORLD’S FIRST IMPRESSIONS
Every tech reviewer who has gotten hands-on experience with the Vision Pro agrees that Apple has done VR right. In fact, the consensus is that all other VR headsets have so far been getting it wrong. For instance, while other headsets require controllers, or require users to hold their hands out in front of them and perform complicated gestures to interact with the interface, the Vision Pro tracks the user’s eyes to pinpoint exactly where they are looking and lets them select an item by pinching their fingers while their hands rest on their lap or on a tabletop. visionOS also does a phenomenal job of augmenting the real environment around the wearer. For example, light sources in a living room can be detected and simulated virtually inside the headset, so that when a movie is playing, the Vision Pro can darken the simulated surroundings like a theatre while casting the hue of the screen onto the walls of the simulated space.
Techies particularly applauded the Vision Pro’s efforts to remove the physical barrier between its wearer and the people around them. This is done with EyeSight, an external display on the front of the headset that shows the wearer’s eyes and expression in real time, making face-to-face interactions feel more natural and less awkward.
The biggest excitement has been about the level of immersion the Vision Pro achieves. Even though the screens sit right in front of the wearer’s eyes, the pixel density is 64 times that of Apple’s revolutionary Retina display, making individual pixels effectively imperceptible. The life-like clarity, near-zero latency, and limitless virtual workspace make app organisation and task management feel natural and intuitive. Reviewers have verified Apple’s claim that the VR/AR experience looks, sounds, and feels like real-world interaction, and that it helps with productivity, focus, and organisation. But there are two sides to this.
OPPORTUNITIES AND CAVEATS
Entertainment, browsing, and productivity – the Vision Pro has added depth and dimension to all three by effectively turning the entire world into a computer screen. Reviewers who wore the device repeatedly reached for the words ‘seamless’ and ‘immersive’ when praising how easy and intuitive Apple has made spatial computing. Since the device also has built-in speakers, the audio is spatial as well, meaning movies can be enjoyed as if sitting in a theatre. Web browsing becomes faster too, as tabs and windows can be organised and placed around the virtual space, much as one would stack files and folders around a desk. And even in crowded areas full of distracting noises and lights, it is easy to zone everything out by virtually teleporting to another environment without physically moving an inch. Effectively, the Vision Pro allows a new level of privacy and isolation for enjoying content and managing tasks.
Arun Maini, a renowned tech reviewer on YouTube, has raised a valid concern about this level of immersion. He clearly loves the product and agrees that if spatial computing is ever to become a thing, the Apple way is the right way to do it. However, the isolated experience of spatial computing is likely to incentivise people to spend more time online and consume content in volumes that could be detrimental to mental health and social life. Short-format videos, for example, which are typically made for smartphone screens, already chip away hours of people’s lives because they are so easy to access and consume – and that happens while viewers are still anchored in the physical world. In a fully virtual setting, they are completely detached from external stimulation, leaving fewer incentives or opportunities to take a break and interact with the people around them. This could result in an unhealthy social dynamic, where people are motivated to stay online for most of the day because the real world offers far less exciting propositions for entertainment and sharing. The same goes for office hours, where collaboration, innovation, and the exchange of ideas happen through social interaction. If spatial computing becomes the norm, that layer of physical communication is removed as well.
Nonetheless, the Vision Pro has enormous potential to push the boundaries of tech. Historically, Apple products have set new benchmarks and industry standards that have brought permanent changes to product design and usability. Given how many times big tech companies have tried and fallen short of delivering the perfect VR/AR headset, it is exciting to see the possibilities now that Apple has stepped into the game.