For now, the field of view of the lab version is poor: only 11.7 degrees, much narrower than that of the Magic Leap 2 or the Microsoft HoloLens.
However, the prototype from Stanford University's Computational Imaging Laboratory suggests it may be something special. It is a thin stack of holographic components that fits roughly into the frame of a standard pair of glasses and is designed to project realistic, full-color, moving 3D images that appear at varying depths.
Like other AR glasses, they use waveguides, components that guide light through the glasses and into the wearer's eyes. However, the researchers developed a unique "nanophotonic metasurface waveguide" that can "eliminate the need for bulky collimation optics," along with a "learned physical waveguide model" that uses AI algorithms to significantly improve image quality. According to the study, the model "automatically calibrates using camera feedback."
Although Stanford's technology is currently only a prototype, with working versions that appear to be bench-mounted and attached to a 3D-printed frame, the researchers aim to disrupt the current spatial computing market, which includes bulky pass-through mixed reality headsets like Apple's Vision Pro and Meta's Quest 3.
Gun-Yeal Lee, a postdoctoral researcher who co-authored the paper published in Nature, says that no other AR system comes close in both capability and compactness.