Researchers combine holography and light field tech for better 3D experience

VR and AR have promising applications across a range of markets and usage is growing, but the technologies still have some limitations. To tackle motion sickness and other visual disturbances without increasing the size, weight and cost of AR/VR devices, a team of researchers have turned to holography and light field technologies.

Holography and light field technology have been used with this goal in mind before, but they have always required additional optics that make products bulky, heavy and expensive, and those products have struggled to be commercially successful.

Hoping for better chances of success, a group of researchers in Japan and Belgium decided to take a different approach. 

“Objects we see around us scatter light in different directions at different intensities in a way defined by the object’s characteristic features—including size, thickness, distance, colour, texture,” said researcher Boaz Jessie Jackin of the National Institute of Information and Communication Technology in Japan. “The modulated [scattered] light is then received by the human eye and its characteristic features are reconstructed within the human brain.”

Devices capable of generating the same modulated light—without the physical object present—are known as true 3D displays, which include holography and light field displays. “Faithfully reproducing all of the object’s features, the so-called ‘modulation,’ is very expensive,” said Jackin. “The required modulation is first numerically computed and then converted into light signals by a liquid crystal device (LCD). These light signals are then picked up by other optical components like lenses, mirrors, beam combiners and so on.”
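
As a rough illustration of what that numerical computation can involve, the sketch below (a minimal example, not the researchers' code, with all parameters chosen purely for illustration) builds the phase pattern of one simple optical function, a thin converging lens, on the kind of pixel grid an LCD or spatial light modulator would display.

```python
# Minimal illustrative sketch (not the published method): numerically compute
# the phase modulation of a thin converging lens on a pixel grid, then
# quantise it to grey levels of the kind a liquid crystal device would show.
# Wavelength, pixel pitch, grid size and focal length are all assumed values.
import numpy as np

wavelength = 532e-9     # assumed green laser wavelength (m)
pixel_pitch = 8e-6      # assumed modulator pixel pitch (m)
n_pixels = 1024         # assumed grid size
focal_length = 0.5      # assumed focal length of the lens function (m)

# Coordinate grid across the modulator plane
coords = (np.arange(n_pixels) - n_pixels / 2) * pixel_pitch
x, y = np.meshgrid(coords, coords)

# Quadratic phase of a thin lens: phi(x, y) = -pi * (x^2 + y^2) / (lambda * f)
lens_phase = -np.pi * (x**2 + y**2) / (wavelength * focal_length)

# Wrap to [0, 2*pi) and quantise to 8-bit grey levels for the display
grey = np.round((lens_phase % (2 * np.pi)) / (2 * np.pi) * 255).astype(np.uint8)
print(grey.shape, grey.dtype)   # (1024, 1024) uint8
```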

The additional optical components, which are usually made of glass, play an important role because they determine the final performance and size of the display device.

This is where holographic optical elements can make a big difference. “A holographic optical element is a thin sheet of photosensitive material—think photographic film—that can replicate the functions of one or more additional optical components,” said Jackin. “They aren’t bulky or heavy, and can be adapted into smaller form factors. Fabricating them emerged as a new challenge for us here, but we’ve developed a solution.”

Recording, or fabricating, a hologram that can replicate the function of a glass-made optical component requires that particular optical component to be physically present during the recording process. This recording is an analogue process that relies on lasers and recording film; no digital signals or information are used.

“Recording multiple optical components requires that all of them be present in the recording process, which makes it complex and, in most cases, impossible to do,” said Jackin.

The group decided to record the hologram digitally, calling the solution a “digitally designed holographic optical element” (DDHOE). Their holographic recording process requires none of the optical components to be physically present during the recording, yet all of the components’ functions can still be recorded.

“The idea is to digitally compute the hologram of all the optical functions [to be recorded and] reconstruct them together optically using a LCD and laser,” said Jackin. “This reconstructed optical signal resembles the light that is otherwise modulated by all of those optical components together. The reconstructed light is then used to record the final holographic optical element. Since the reconstructed light had all optical functions, the recorded hologram on the photosensitive film will be able to modulate a light with all of those functions. So all of the additional optics needed can be replaced by a single holographic film.”
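
In simplified scalar-wave terms, one way to read that description (a sketch under assumed parameters, not the authors' published code) is that the phase functions of the individual components are summed into a single pattern, and the complex field built from that summed phase stands in for the light that then exposes the holographic film.

```python
# Minimal illustrative sketch: stack several optical functions into one
# computed hologram by summing their phase terms.  The complex field formed
# from the total phase represents light that would otherwise have been
# modulated by the separate glass components one after another.
# All parameters are assumed for illustration only.
import numpy as np

wavelength = 532e-9          # assumed laser wavelength (m)
pixel_pitch = 8e-6           # assumed modulator pixel pitch (m)
n = 1024
c = (np.arange(n) - n / 2) * pixel_pitch
x, y = np.meshgrid(c, c)

def lens_phase(f):
    """Quadratic phase of a thin lens with focal length f (metres)."""
    return -np.pi * (x**2 + y**2) / (wavelength * f)

def prism_phase(tilt):
    """Linear phase that steers the beam by a small angle (radians)."""
    return 2 * np.pi * np.sin(tilt) * x / wavelength

# Example stack: a lens function plus a small beam tilt
total_phase = lens_phase(f=0.5) + prism_phase(tilt=np.deg2rad(0.5))

# Complex field the LCD and laser would reconstruct to expose the film
field = np.exp(1j * total_phase)
print(field.shape)           # (1024, 1024)
```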

In terms of applications, the researchers have already put DDHOE to the test on a head-up light field 3D display. The system is see-through, so it’s suitable for augmented reality applications.

“Our system uses a commercially available 2D projector to display a set of multi-view images onto a micro-lens array sheet—which is usually glass or plastic,” said Jackin. “The sheet receives the light from the projector and modulates it to reconstruct the 3D images in space, so a viewer looking through the micro-lens array perceives the image in 3D.”
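
One common way to prepare such multi-view content for a lens-array sheet is an integral-imaging style interleave, sketched below under assumed dimensions (this layout is an illustration, not a detail taken from the researchers' system): each lenslet cell ends up holding one pixel from every view.

```python
# Minimal illustrative sketch: interleave a grid of multi-view images into the
# single 2D frame sent to the projector, so that each lenslet cell of the
# micro-lens array carries one pixel from every viewpoint.
# View count and resolution are placeholder values.
import numpy as np

views = np.random.rand(5, 5, 100, 100)   # placeholder: 5x5 views of 100x100 px
vy, vx, h, w = views.shape

# Each lenslet cell becomes a vy-by-vx block of directional samples
frame = np.zeros((h * vy, w * vx))
for i in range(vy):
    for j in range(vx):
        frame[i::vy, j::vx] = views[i, j]

print(frame.shape)                        # (500, 500)
```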

One big difficulty their approach overcomes is that light from a 2D projector diverges and must be collimated into a parallel beam before it hits the micro-lens array in order to accurately reconstruct the 3D images in space.

“As displays get larger, the collimating lens should also increase in size. This leads to a bulky and heavy lens, a long optical path length, and costly fabrication of the collimating lens,” said Jackin. “It’s the main bottleneck preventing such a system from achieving any commercial success.”

Jackin and colleagues’ approach avoids the need for collimation optics entirely by incorporating their function into the lens array itself: the micro-lens array is a fabricated DDHOE that includes the collimating function.
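
A simplified way to picture how one thin element can play both roles (with all geometry assumed for illustration rather than drawn from the paper) is to add a global collimating phase, equivalent to a lens whose focal length equals the projector distance, to the per-lenslet phase of the micro-lens array.

```python
# Minimal illustrative sketch: fold a collimating function into the micro-lens
# array phase so that a single thin element does both jobs.
# Pitch, focal lengths, projector distance and sampling are assumed values.
import numpy as np

wavelength = 532e-9      # assumed wavelength (m)
pitch = 1e-3             # assumed lenslet pitch (m)
lenslet_f = 30e-3        # assumed lenslet focal length (m)
projector_dist = 0.5     # assumed distance to the 2D projector (m)
n = 1024
sample = 4e-6            # assumed spatial sampling of the sheet (m)
c = (np.arange(n) - n / 2) * sample
x, y = np.meshgrid(c, c)

# Local coordinates inside each lenslet cell
xl = (x + pitch / 2) % pitch - pitch / 2
yl = (y + pitch / 2) % pitch - pitch / 2

# Per-lenslet quadratic phase of the micro-lens array
mla_phase = -np.pi * (xl**2 + yl**2) / (wavelength * lenslet_f)

# Global collimating phase: a lens with focal length equal to the projector
# distance turns the diverging projector light into a parallel beam
collimator_phase = -np.pi * (x**2 + y**2) / (wavelength * projector_dist)

# One thin holographic element records both functions together
combined_phase = mla_phase + collimator_phase
print(combined_phase.shape)   # (1024, 1024)
```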

The researchers went on to create a head-up, see-through 3D display, which could soon offer an alternative to current models that rely on bulky collimation optics.

Researchers will present the work at The Optical Society’s (OSA) Frontiers in Optics meeting, September 16 to 20 in Washington, D.C.

Source: The Optical Society 

Full research paper

Figure: a) A fabricated DDHOE lens array; b) the 3D display system, consisting of a 2D projector and the DDHOE; c) a computer-modelled 3D scene with a depth of 6 cm; d) a 3D reconstruction of the modelled scene, captured by a camera looking into the DDHOE