Patent: Compact Near-Eye Display Optics for Augmented Reality

Publication Number: 20190025602

Publication Date: 2019-01-24

Applicants: Google


An optical system includes a first filter stack configured to convert light received from a display to a first circular polarization, a second filter stack configured to convert light received from external sources to a second circular polarization, and a third filter stack configured to reflect light having the first circular polarization and transmit light having the second circular polarization. The optical system also includes a refractive beam splitting lens configured to transmit light received from the second filter stack to the third filter stack. The second filter stack is oriented to reflect light received from the first filter stack onto the refractive beam splitting lens. The optical system is implemented in augmented reality devices, such as head mounted devices (HMDs), to combine images generated by the display with light received from external sources.
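The polarization routing described in the abstract can be sketched with Jones calculus: display light is converted to one circular handedness, which the polarization-selective element reflects, while see-through light of the opposite handedness is transmitted. The component models below (an ideal quarter-wave plate at 45° and an ideal circular-polarization-selective reflector) are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rot(theta):
    """Rotation matrix for orienting a polarization element at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def qwp(theta):
    """Jones matrix of a quarter-wave plate with fast axis at angle theta."""
    return rot(theta) @ np.diag([1, 1j]) @ rot(-theta)

# Display light: linearly polarized (horizontal) Jones vector.
display_light = np.array([1, 0], dtype=complex)

# First filter stack: a QWP at 45 deg converts it to circular polarization.
circ = qwp(np.pi / 4) @ display_light

# Orthogonal circular basis states (one handedness each).
R = np.array([1, -1j]) / np.sqrt(2)
L = np.array([1,  1j]) / np.sqrt(2)

# Third filter stack modeled as projectors: it reflects the display's
# handedness and transmits the opposite handedness.
reflect = np.outer(R, R.conj())
transmit = np.outer(L, L.conj())

# Display light is (ideally) fully reflected toward the eye...
print(np.linalg.norm(reflect @ circ) ** 2)   # close to 1.0
# ...while light of the opposite handedness would pass straight through.
print(np.linalg.norm(transmit @ circ) ** 2)  # close to 0.0
```

In this idealized model the two channels separate perfectly; real filter stacks leak some light between channels, which is one reason practical systems stack multiple polarization elements.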


Augmented reality (AR) systems typically utilize a head mounted display (HMD) device that focuses light rays received from the environment and light rays generated by a display onto the eyes of a user. A user wearing the HMD device therefore views a scene of the real world that is “augmented” with virtual images. For example, an HMD device can augment the user’s view of an unfamiliar street by overlaying a virtual image including walking directions. The optical system implemented in an HMD that supports AR functionality typically includes a beam splitting element that transmits external light to the user’s eyes and reflects light from the display into the path of the external light, as well as an optical element to focus light onto the user’s eyes. Several designs of HMD devices that provide AR functionality are currently available. The optical systems implemented in these HMD devices include birdbath optics (a concave mirror and a display separated by a beam splitter that combines the virtual image with the see-through image), a display coupled into a (geometric or diffractive) waveguide by a collimation lens, a display coupled to multiple freeform reflectors, and a display coupled to a freeform prism.

These optical systems share a common deficiency: the element that provides optical power (e.g., focusing of the light rays) is positioned relatively far from the user's eyes, which reduces the field-of-view of the virtual image. Consequently, the field-of-view of a conventional AR system is relatively narrow. The field-of-view can be increased by increasing the size of the AR system, but larger optics are undesirable in a wearable HMD device. Furthermore, some of the optical systems distort the see-through image. For example, a geometrical waveguide that uses total internal reflection to guide the virtual image to the user's eyes can generate segmented shadows in the see-through image. For another example, a diffractive waveguide that uses diffraction to guide the virtual image to the user's eyes can generate ghost images from unwanted diffraction orders. For yet another example, a freeform prism can create non-uniform see-through distortion that causes eyestrain.
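The geometric trade-off between element distance and field-of-view can be illustrated with a simple angular-subtense calculation: an aperture of fixed size subtends a wider angle at the eye the closer it sits. The aperture size and distances below are illustrative assumptions, not figures from the patent.

```python
import math

def fov_deg(aperture_mm, eye_distance_mm):
    """Full field-of-view (degrees) subtended by an aperture of the given
    width centered on the eye's line of sight at the given distance."""
    return math.degrees(2 * math.atan(aperture_mm / (2 * eye_distance_mm)))

# The same hypothetical 25 mm focusing element subtends a much wider
# angle when placed closer to the eye:
print(fov_deg(25, 50))  # element far from the eye -> narrower FOV
print(fov_deg(25, 15))  # element close to the eye -> wider FOV
```

This is why placing the optical-power element nearer the eye, as the described system aims to do, widens the achievable field-of-view without enlarging the device.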
