CREAL3D’s light-field glasses tech helps you “see real 3D”


When the human eye focuses on an object or area, everything else goes out of focus. However, currently available AR and VR devices do not recreate this effect, so users can experience discomfort, headaches, fatigue, nausea, and more. CREAL3D, a Switzerland-based startup, wants to change this through light-field technology.

According to the company, “light-field is a missing key to comfortable and fashion-smart eyewear.” Solutions that use this technology project images with real optical depth, allowing the eyes to naturally change focus between virtual objects at different distances, complete with the defocusing effect. Apart from eliminating some of these sickness effects, this also creates a more immersive, realistic, and fluid experience, and, in AR, virtual objects are seamlessly fused into the real world.

“Virtual and augmented reality headsets use flat screens that set the focus distance,” CEO Tomas Sluka explains. “But our eyes must be able to adapt the optical depth to the distance of the objects (clear in front, blurred in the background, for example).”

As interesting as it sounds, this is not the first time a company has used this technology for AR and VR. In 2017, we wrote about how Avegant’s Light-Field AR technology would compete with Magic Leap and HoloLens. Unfortunately, we haven’t heard any news since then, so the appearance of CREAL3D is welcome news.

Meanwhile, most companies exploring how to imitate the way eyes naturally work are using eye-tracking technology. At Oculus Connect 3 in 2016, Chief Scientist Michael Abrash discussed the challenges of eye tracking and foveated rendering in a VR headset. Two of the main issues were figuring out how to track differently shaped eyes and how to adapt to changing pupil sizes – something light-field technology doesn’t have to deal with. However, recent reports show how much eye-tracking technology has evolved and the advantages it could bring to these headsets.

If light-field technology works as CREAL3D claims, it seems like the way to go for both VR and AR; until then, eye tracking still promises a great future. Last year, CREAL3D raised around $1 million, and it is now in the process of raising a $5 million round to further grow the company and its development.


About Author

João Antunes

IT and videogames have been João's topics of interest since a very early age. Videogames, the Internet, game consoles and computers became his normal toys as a result of being the son of a journalist who wrote about the infancy of the Web, the games industry and hardware in general. Small game reviews published in the first Portuguese computer games magazine back in the early 2000s ignited a passion – writing – that he now pursues, along with his other interests: programming, web design and hardware. Technology in general makes him tick.

1 Comment

  1.

    Something to keep in mind:
    With an eye-tracking solution for focal accommodation, it isn’t just the eye tracking that you need to worry about. Eye tracking is just a means of determining the direction in which the user is looking. If they focus on a real-world object at the same focal depth as the virtual content placed near it, you can use the convergence angle to determine the appropriate focal depth. Otherwise you actually need to ray-trace from the pupil to the target virtual object, or to a point on a depth map of the real world in front of them.

    After you know the appropriate focal depth, foveated rendering and blurring aren’t enough to produce realistic focal accommodation in AR. They’re a great stopgap solution for VR, but not for AR, except as a means of reducing load on the GPU. No amount of squinting will bring an artificially blurred image into focus. (It could be brought into “focus”, i.e. unblurred, by the software, but the optical focal distance would still be wrong.) Likewise, a crisp, clear image of an object at a distance of 2 m, with the appropriate stereo convergence for that distance, but projected through optics focused at 1 m, will be detached from its real-world surroundings that are also at a distance of 2 m. Instead, the virtual object will share a focal distance with real-world objects at 1 m, but because of the 2 m-appropriate convergence angle, those real objects at 1 m will be seen double while the user’s gaze is fixed on the virtual content.

    So the actual optical focus needs to be changed. You can do this by a number of means, both mechanical and solid-state. All of these means have a non-zero response latency, as does eye tracking. So really, you want to know where the user is going to be looking before their eyes even land there, which can probably be done by measuring the impulse at the beginning of a saccade in that direction. So now we’re talking about a very high frame rate for our eye tracking.

    Point is: the fact that eye tracking itself has improved is great, and contributes to solving the problem of a non-light-field focal accommodation solution, but there is still quite a lot of non-trivial engineering involved in making it work. At least in my opinion, solid-state continual-focus light-field optics really are the only viable long-term option for mobile optically see-through AR headsets.
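    The convergence-angle calculation the commenter describes can be sketched as follows. This is a minimal geometric illustration, not anything from CREAL3D or an eye-tracking SDK; the `vergence_depth` helper and the 63 mm interpupillary distance are assumptions for the example:

    ```python
    import math

    def vergence_depth(ipd_m, vergence_angle_rad):
        """Estimate fixation distance from the eyes' convergence angle.

        Assumes symmetric fixation straight ahead: the two gaze rays and
        the interpupillary baseline form an isosceles triangle, so
        distance = (IPD / 2) / tan(vergence / 2).
        """
        return (ipd_m / 2.0) / math.tan(vergence_angle_rad / 2.0)

    # Example: with a typical 63 mm IPD, the convergence angle subtended
    # by a target 2 m straight ahead recovers that 2 m fixation distance.
    ipd = 0.063
    angle = 2.0 * math.atan((ipd / 2.0) / 2.0)  # angle for a 2 m target
    print(round(vergence_depth(ipd, angle), 3))  # → 2.0
    ```

    As the comment notes, this only works when the gaze target is at the depth the convergence implies; for virtual content detached from real geometry, a ray-trace against the scene or a depth map is still needed.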


© Diversified Communications. All rights reserved.