Experience: The human eye resolution headset

Finnish startup Varjo has developed a human eye resolution VR headset, delivering a resolution of 60 pixels per degree, akin to 20/20 vision. Reece Webb explores.

VR-1 is a VR headset built around Varjo’s bespoke 20/20 eye tracker, which follows eye movements by projecting an infrared illumination pattern onto the eye and tracking the reflections algorithmically. Dual cameras and a computer vision algorithm deliver an accuracy of less than one degree of visual angle.

This allows developers to determine what the user is focusing on in the VR space and to concentrate processing power on that area to create a high-definition render. A workstation-type setup with a GPU such as an Nvidia GeForce RTX 2080 or Quadro RTX 6000 provides an additional boost to performance and visual quality.
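In outline, gaze-directed rendering of the kind described above splits the view into a small, dense region around the gaze point and a sparser surround. The sketch below is a hypothetical illustration, not Varjo’s pipeline; the field-of-view sizes and the context pixel density are assumptions, with only the 60 pixels per degree figure taken from the article.

```python
# Hypothetical sketch of gaze-directed (foveated) rendering logic.
# Not Varjo's implementation; FOV sizes and context density are assumed.

def foveated_render_plan(gaze_x_deg, gaze_y_deg, fov_deg=87.0,
                         focus_fov_deg=27.0, focus_ppd=60, context_ppd=14):
    """Decide which part of the view receives the high-density render.

    gaze_*_deg    -- gaze direction reported by the eye tracker, in degrees
    focus_fov_deg -- assumed angular size of the high-resolution region
    *_ppd         -- pixels per degree rendered in each region
    """
    half = focus_fov_deg / 2.0
    # High-resolution region is centred on wherever the user is looking.
    focus_region = {
        "left": gaze_x_deg - half, "right": gaze_x_deg + half,
        "bottom": gaze_y_deg - half, "top": gaze_y_deg + half,
        "pixels_per_degree": focus_ppd,
    }
    # The rest of the field of view is rendered at a much lower density.
    context_region = {"fov_deg": fov_deg, "pixels_per_degree": context_ppd}
    return focus_region, context_region

focus, context = foveated_render_plan(gaze_x_deg=5.0, gaze_y_deg=-2.0)
```

Each frame, the plan is recomputed from the latest tracker sample, so the dense region follows the eye rather than staying fixed in the centre of the lens.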

A setup like this can cost a few thousand euros, but it uses the latest in rendering technology and allows the software and workstation to be upgraded to keep pace with improvements.

Most conventional high-end VR devices rely on two displays, one for each eye. VR-1, however, uses a total of four displays: two per eye.

A context display acts as the background, with a semi-transparent mirror at 45 degrees allowing a smaller display to be projected in from the side, overlaying the two images.

The centre area uses a high-resolution display that matches human eye resolution, approximately 60 pixels per one-degree field of view.
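The 60-pixels-per-degree figure follows from a standard rule of thumb: 20/20 vision resolves detail of roughly one arcminute, and there are 60 arcminutes in a degree. A quick back-of-envelope check, where the 27-degree width of the focus area is an assumed value for illustration only:

```python
# Back-of-envelope check: 20/20 acuity resolves about one arcminute,
# so matching it needs roughly 60 pixels per degree.
arcminutes_per_degree = 60
acuity_arcmin = 1.0                 # approximate 20/20 visual acuity
required_ppd = arcminutes_per_degree / acuity_arcmin   # 60.0

# For a focus display assumed to span ~27 degrees, that implies:
focus_fov_deg = 27                  # assumed width, for illustration
pixels_across = required_ppd * focus_fov_deg           # 1620.0
```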

The headset is controlled with controllers from the SteamVR ecosystem, meaning any controller made for that ecosystem works with the headset.

Varjo also intends to develop its own controller, designed for the professional market. Niko Eiden, founder and CEO of Varjo, has set his sights on professional users, working with high-end clients in the automotive industry such as Audi and Volvo, who use the headsets to design and test cars.

Eiden has a background at Nokia and Microsoft, where he worked on the diffractive optics that went on to be used in the Microsoft HoloLens.

Eiden said: “Because our target is the professional market, how do you work? How do you spend time on a daily basis in VR and in the VR session? For that we need to have the right tool.

“Creative professionals want to use VR-1 for the design part, they want to show management the future cars, the experience of VR, they want to do factory floor planning in VR and then they want to use it for customers, showrooms and so that the customers can see the car before buying it.”

“From an interaction perspective, it’s very exciting to be here because for the first time, you can actually use eyesight to interact with things.” - Niko Eiden, Varjo CEO
Varjo have built a relationship with Audi and Volkswagen but see a wide variety of use cases in training and pre-experience applications.

Eiden explained: “Police, emergency rooms, control rooms, everything where you have situations that might happen infrequently, it’s very hard to actually exercise, so that one time that a user comes face to face with an unusual situation, they know how to react.”


At NXTBLD 2019 in London, Inavate experienced the VR-1 headset first hand to see if the quality of the resolution lived up to its reputation.

The headset itself is weighty but comfortable, with over 2,000 high-resolution images in a scene to create a realistic environment that a user can lose themselves in.

The most striking element of the experience comes, surprisingly, from one of its smallest details, one that underlines the headset’s quasi-real quality: the in-sim text on objects such as newspapers and CDs.

In simulations past, text on rendered objects has often been blurry or omitted because of its size and lack of importance in a scene; with VR-1, however, you can closely examine a CD case or an article and read the smallest text in detail.

Another ‘make or break’ element of the VR-1 is, of course, its eye tracking. Varjo’s demo scene for the tracker places the user in a virtual air traffic control tower, where they can operate and identify key elements on the ground and in the sky. Readable screens in the tower’s control room can be operated by eye, as can individual moving vehicles and aircraft which, when highlighted by focusing the eyes on them, bring up information relevant to the user.
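The gaze selection in the demo can be sketched in simplified form: find the object nearest the reported gaze direction and highlight it if it falls within the tracker’s accuracy. This is a hypothetical illustration (a real system would cast a ray through the 3D scene); the object names and positions are invented, and only the sub-one-degree tolerance comes from the article.

```python
# Hypothetical sketch of gaze-based selection. Object names and angular
# positions are invented; the tolerance mirrors the quoted sub-one-degree
# tracker accuracy.
import math

def gaze_pick(gaze_x_deg, gaze_y_deg, objects, tolerance_deg=1.0):
    """Return the object nearest the gaze direction, if within tolerance.

    objects -- list of (name, x_deg, y_deg) angular positions in the view.
    """
    best, best_dist = None, tolerance_deg
    for name, x, y in objects:
        # Angular distance between gaze direction and object centre.
        dist = math.hypot(x - gaze_x_deg, y - gaze_y_deg)
        if dist <= best_dist:
            best, best_dist = name, dist
    return best

scene = [("aircraft_A320", 10.0, 4.0), ("tug_vehicle", -6.0, -1.0)]
picked = gaze_pick(10.3, 4.2, scene)   # close to the aircraft's position
```

Run every frame against the tracker’s latest sample, a loop like this is what makes unintended highlights possible when the eyes flick away, as noted below.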

When adjusting to eye tracking, it is notable that the setup process is very particular about the user’s eye level: a first-time user must position the headset carefully to find the correct level before the tracker can calibrate effectively to their eye movements.

Once correctly calibrated, the tracker is responsive and highly accurate, following eye movements in real time; flicking the eyes quickly to test it revealed no perceptible latency, with movements registered precisely and instantaneously throughout the simulation.

The tracker detects even the smallest movements, which can make a first-time user very conscious of where their eyes rest.

Unconsciously flicking your eyes can unintentionally highlight a different object, but this is easy to adjust to after a minute or two of use.


Eiden added: “We have an eye tracker that works with eyeglasses and contact lenses; nobody else has anything similar on the market.

“I’ve been looking at eye tracking for 15 years and for the first time, it’s at a level that you don’t have to focus on anything. From an interaction perspective, it’s very exciting to be here because for the first time, you can actually use eyesight to interact with things. 

“Previously, you had to concentrate and really try to focus on something, that might have worked, or it might not. We have a demo and you can select things just by looking at it and it’s really fast so it’s not just for training purposes, but even for use cases where you have your hands tied.

“We are redefining reality,” says Eiden.
