The next-generation Spectacles overlay digital imagery onto transparent lenses, so virtual objects appear to blend naturally into the real world around you. The interface runs on Snap OS, the company's newly developed operating system, and it works smoothly. Unlike VR headsets or "spatial computing" devices, these AR glasses don't block your natural vision and re-create it with cameras; there is no screen in front of your eyes. Instead, images appear to float and persist in three dimensions around you, suspended in the air or resting on tables and floors.
Bobby Murphy, Snap's chief technology officer, described the goal as "a seamless integration of computing into our daily lives, amplifying our understanding of the people and environments surrounding us, rather than disconnecting us from the world."
In my demo, I stacked Lego bricks on a desk, putted an AR golf ball toward a hole across the room (carding a respectable triple bogey), and painted flowers and vines onto the ceiling and walls using only my arms. I also asked Snap's AI chatbot questions about the objects in front of me and got useful answers. And a faintly purplish, dog-like digital creature from Niantic decided to adopt me, following me from indoors out onto a balcony.
Look up from the desk and you see an ordinary room. The golf ball sits on its patch of grass as if it were really there. The Peridot perches on the balcony railing, surveying its surroundings. Most importantly, you can still see and stay physically present with the other people in the same space.
Snap has packed a lot of technology into the frames. Two processors sit inside, so all the computing happens in the glasses themselves. Cooling chambers along the sides did an admirable job of dissipating heat during my demo. Four cameras capture your surroundings and track your hand movements for gesture control. Images are displayed by micro-projectors, similar to those in pico projectors, which render three-dimensional imagery right in front of your eyes without requiring much setup. The effect is vivid and immersive, akin to a 100-inch display viewed from 10 feet away, in a package that weighs 226 grams. The display also automatically adjusts its brightness when you go outdoors, so the glasses work indoors and out.
You control everything through a combination of voice commands and hand gestures, most of which come naturally. You can pinch to select objects and then drag them around. The AI chatbot answers questions asked in plain language ("What's that ship I see on the horizon?"). And Spectacles don't need a phone: the glasses are designed to operate on their own for most functions.
They don't come cheap, either. Snap isn't selling the glasses to consumers; it's offering them to developers, who must commit to a minimum subscription term. The company says it's taking a fairly inclusive view of who counts as a developer. Snap has also struck a new partnership with OpenAI, giving developers access to multimodal capabilities so they can build experiences that draw on real-world context from what users see, hear, and say.
It all worked impressively well. Three-dimensional objects stayed anchored in the spots where I placed them, and I could reposition them without disturbing the rest of the scene. The AI assistant correctly identified every object I asked it about. There were minor hiccups, such as Lego bricks occasionally refusing to snap together cleanly, but they were few.
Which is not to say the glasses are inconspicuous. Nobody will mistake them for ordinary eyewear. A colleague described them as beefed-up 3D glasses, which seems about right. I've worn sillier computers on my face, but these didn't exactly make me look cool either. Here's a photo of me trying them on; draw your own conclusions.