Source: Cointelegraph
The tech is still experimental, but its implications could change how users view reality.
Meta recently showed off a new “Hyperscape” technology that takes the idea of stitching photographs together into a 3D environment (similar to YouTube’s 360-degree videos) and turns it into a real-time rendering system, one that could potentially revolutionize telepresence and redefine working from home.
Hyperscape
Meta is still as bullish as ever on the metaverse. As Cointelegraph recently reported, CEO Mark Zuckerberg showed off the company’s new “Orion” smart glasses at its “Connect” event on Sept. 25.
The Orion glasses purportedly give the wearer an effective heads-up display, letting them navigate the physical world with digital information seamlessly overlaid on what they see.
However, while the company’s new Orion spatial computing glasses may have gotten the most attention, its Hyperscape demo might be the most exciting update for those interested in both virtual reality and Web3.
Hyperscape, which is still experimental, would ultimately allow a person or machine to scan an area using a phone camera and then convert that imagery into a real-time-rendered, fully navigable digital environment.
One pundit who tried a demo of Hyperscape with Meta’s Quest 3 virtual reality headset described the experience as being like the “Holodeck” from the fictional Star Trek universe.
The demo is currently available to the general public, but it only allows users to visit a few different spaces that were pre-rendered using the technology.
Real-time telepresence
Future iterations of Hyperscape, however, could allow any observable environment to be rendered in the metaverse in real time. That could let people attending a meeting in virtual reality see and interact, from an immersive perspective, with those attending in person.
A decentralized version could allow geographically separated people to use similar technology to verify real-world events in real time from a navigable, metaverse-based perspective. This might prove far more immersive, and more socially binding, than relying on pre-recorded or forced-perspective video footage to verify facts.
The advent of non-fungible tokens and the rising popularity of digital assets have made the metaverse possible, but its mainstream proliferation will arguably require a bridge between Web3 and reality that offers more than financial incentives.