Patent: Augmented reality display with waveguide configured to capture images of eye and/or environment
Publication Number: 20250155722
Publication Date: 2025-05-15
Assignee: Magic Leap
Abstract
Head mounted display systems configured to project light to an eye of a user to display augmented reality image content in a vision field of the user are disclosed. In embodiments, the system includes a frame configured to be supported on a head of the user, an image projector configured to project images into the user's eye, a camera coupled to the frame, a waveguide optically coupled to the camera, a coupling optical element configured such that light is coupled into the waveguide, an out-coupling element configured to direct light emitted from the waveguide to the camera, and a first light source configured to direct light to the user's eye through the waveguide. Electronics control the camera to capture images periodically and further control the first light source to pulse in time with the camera such that light emitted by the light source has a reduced intensity when the camera is not capturing images.
Claims
What is claimed is:
1. A head mounted display system configured to project light to an eye of a user to display augmented reality image content in a vision field of the user, the head mounted display system comprising: a frame configured to be supported on a head of the user; an image projector coupled to the frame and configured to project images into the user's eye to display image content in the vision field of the user; a camera coupled to the frame; a waveguide coupled to the frame and optically coupled to the camera; a coupling optical element coupled to the frame and configured such that light is coupled into the waveguide; an out-coupling element coupled to the frame and configured to direct light emitted from the waveguide to the camera; a first light source coupled to the frame and configured to direct light to the user's eye; and electronics coupled to the first light source and to the camera, the electronics configured to control the camera to capture images periodically.
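The pulsed-illumination control described in the abstract (electronics raise the light source intensity only while the camera is integrating, and keep it reduced between exposures) can be sketched as a simple control loop. This is an illustrative sketch only: the function and parameter names are assumptions, not part of the claimed system.

```python
import time

# Illustrative timing constants (assumptions, not from the patent).
CAPTURE_PERIOD_S = 1 / 30      # one capture cycle per frame at ~30 Hz
EXPOSURE_S = 0.004             # camera integration window per frame
ACTIVE_INTENSITY = 1.0         # illumination level during exposure
IDLE_INTENSITY = 0.1           # reduced (not necessarily zero) between frames

def capture_cycle(set_intensity, trigger_camera, n_frames):
    """Run n_frames capture cycles, pulsing the eye-illumination source
    high only while the camera is capturing, then dropping it back to a
    reduced idle level between frames."""
    frames = []
    for _ in range(n_frames):
        set_intensity(ACTIVE_INTENSITY)            # pulse up for the exposure
        frames.append(trigger_camera(EXPOSURE_S))  # capture one frame
        set_intensity(IDLE_INTENSITY)              # reduce between frames
        time.sleep(max(0.0, CAPTURE_PERIOD_S - EXPOSURE_S))
    return frames
```

In a real device this timing would be done in hardware or firmware rather than a blocking software loop; the sketch only illustrates the claimed synchronization between camera capture and light-source pulsing.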
Description
CROSS-REFERENCE TO RELATED APPLICATION(S)
This application is a continuation application of U.S. application Ser. No. 18/493,429, filed Oct. 24, 2023. U.S. application Ser. No. 18/493,429 is a continuation application of U.S. application Ser. No. 16/138,228, filed Sep. 21, 2018. U.S. application Ser. No. 16/138,228 is a nonprovisional application of U.S. Provisional Application No. 62/561,645, filed on Sep. 21, 2017. This application claims priority to, and hereby incorporates by reference, U.S. application Ser. No. 18/493,429, U.S. application Ser. No. 16/138,228, and U.S. Provisional Application No. 62/561,645.
FIELD OF THE DISCLOSURE
The present disclosure relates to optical devices, including augmented reality imaging and visualization systems.
BACKGROUND OF THE DISCLOSURE
Modern computing and display technologies have facilitated the development of systems for so called “virtual reality” or “augmented reality” experiences, in which digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or “VR”, scenario typically involves the presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. A mixed reality, or “MR”, scenario is a type of AR scenario and typically involves virtual objects that are integrated into, and responsive to, the natural world. For example, an MR scenario may include AR image content that appears to be blocked by or is otherwise perceived to interact with objects in the real world.
Referring to FIG. 1, an augmented reality scene 1 is depicted. The user of an AR technology sees a real-world park-like setting 2 featuring people, trees, buildings in the background, and a concrete platform 3. The user also perceives that he/she “sees” “virtual content” such as a robot statue 4 standing upon the real-world platform 3 and a flying cartoon-like avatar character 5 which seems to be a personification of a bumble bee. These elements 4, 5 are “virtual” in that they do not exist in the real world. Because the human visual perception system is complex, it is challenging to produce AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements.
Systems and methods disclosed herein address various challenges related to AR and VR technology.
Polarizing beam splitters may be used in display systems to direct polarized light to light modulators and then to direct this light to a viewer. There is a continuing demand to reduce the sizes of display systems generally and, as a result, there is also a demand to reduce the sizes of the constituent parts of the display systems, including constituent parts utilizing polarizing beam splitters.
SUMMARY
Various implementations described herein include display systems configured to provide illumination and/or image projection to the eye. Additionally or alternatively, the display systems can image the eye and/or the environment.
In some embodiments, a head mounted display system is configured to project light to an eye of a user to display augmented reality image content in a vision field of said user. The head-mounted display system can include a frame that is configured to be supported on a head of the user. The display system can also include an image projector that is configured to project images into the user's eye to display image content in the vision field of the user. The display system can include a camera, at least one waveguide, at least one coupling optical element that is configured such that light is coupled into said waveguide and guided therein, and at least one out-coupling element. The at least one out-coupling element can be configured to couple light that is guided within said waveguide out of said waveguide and direct said light to said camera. The camera can be disposed in an optical path with respect to said at least one out-coupling optical element to receive at least a portion of the light that is coupled into said waveguide via the coupling element and guided therein and that is coupled out from said waveguide by said out-coupling optical element such that images may be captured by said camera.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a user's view of augmented reality (AR) through an AR device.
FIG. 2 illustrates an example of a wearable display system.
FIG. 3 illustrates a conventional display system for simulating three-dimensional imagery for a user.
FIG. 4 illustrates aspects of an approach for simulating three-dimensional imagery using multiple depth planes.
FIGS. 5A-5C illustrate relationships between radius of curvature and focal radius.
FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user.
FIG. 7 illustrates an example of exit beams outputted by a waveguide.
FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors.
FIG. 9A illustrates a cross-sectional side view of an example of a set of stacked waveguides that each includes an incoupling optical element. As discussed herein, the stack of waveguides may comprise an eyepiece.
FIG. 9B illustrates a perspective view of an example of the plurality of stacked waveguides of FIG. 9A.
FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B.
FIG. 10 schematically illustrates a cross-sectional side view of an example imaging system comprising an eyepiece, an image projector, a light source for illuminating the eye, and a camera for capturing an image of the eye.
FIG. 11A schematically illustrates the light source for illuminating the eye and the image projector for injecting images into the eye both emitting light toward an incoupling optical element on a waveguide of the eyepiece.
FIG. 11B schematically illustrates projected light from the light source and from the image projector coupled into the waveguide.
FIG. 11C schematically illustrates how incoupled light may propagate through a waveguide by total internal reflection.
FIG. 11D schematically illustrates light from the light source and from the image projector coupled out of the eyepiece.
FIG. 11E schematically illustrates the waveguide and coupling optical element configured to propagate incoupled light at least along a full dimension (e.g., along the x-direction) of the coupling optical element. Light entering the eye is shown from an extended source (e.g., the imaging light will capture a region of the retina).
FIG. 12A is a cross-sectional view that schematically shows light reflected from the retina exiting the eye and incident on the eyepiece.
FIG. 12B schematically illustrates the example light coupled into the waveguide of the eyepiece.
FIG. 12C schematically illustrates collimated incoupled light from the eye propagating through a waveguide toward an imaging device.
FIG. 12D schematically shows incoupled light from the eye propagating to the one or more outcoupling optical elements.
FIG. 12E schematically illustrates light from the eye coupled out of the waveguide by the outcoupling optical element and directed to the camera so that an image of the eye (e.g., the retina) can be captured by the camera.
FIG. 13A schematically illustrates how the imaging system can image various portions of the eye, for example, of the retina, which can enable the orientation of the eye to be determined and the eye position tracked.
FIG. 13B illustrates a pattern of sequentially displayed fixation targets used to cause the eye to be directed in a variety of different directions during which the retina is imaged. The resultant images correspond to non-identical portions of the retina. For example, when the eye is directed in various directions to view differently located fixation targets on the display, images captured by the camera include different portions of the retina. These images can be assembled to form a larger map or composite image of the retina.
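Assembling retina images captured at different gaze directions into a composite map, as described for FIG. 13B, requires registering each frame against the mosaic. A minimal sketch of one common registration approach (FFT-based phase correlation) is shown below; the patent does not specify a registration algorithm, so the function names and method here are illustrative assumptions.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation of img relative to ref
    using phase correlation: the normalized cross-power spectrum has an
    inverse FFT that peaks at the displacement."""
    cross_power = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(cross_power / (np.abs(cross_power) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame back to negative offsets.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```

Each newly captured retina frame could then be pasted into the composite map at its accumulated offset, gradually building the larger retinal image the figure describes.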
FIG. 14A schematically illustrates a cross-sectional view of an imaging system comprising an eyepiece and a camera for collecting light from the environment in front of the eyepiece. Light from the environment is shown reflected off or emitted from one or more physical objects in the environment. Collection of light from objects in the environment in front of the eyepiece can enable images of the environment to be captured.
FIG. 14B schematically illustrates light from the environment being coupled by the coupling optical element into a waveguide of the eyepiece.
FIG. 14C schematically illustrates an imaging system for collecting light from the environment using a powered optical element, such as a refractive optical element (e.g., a lens such as a wide field of view lens), forward of the eyepiece.
FIG. 15A schematically illustrates an example imaging system comprising a polarization selective incoupling optical element for receiving light from an illumination source and coupling the light into a waveguide in an eyepiece. The eyepiece further includes a polarization selective light coupling element for coupling light out of the waveguide. A polarizer may be used to polarize the light from the illumination source and a half wave retarder may be used to rotate the orientation of the linearly polarized light to be turned into the waveguide by the polarization selective incoupling optical element.
FIG. 15B schematically illustrates light from the eye (e.g., from the retina illuminated with infrared light from the illumination source) being coupled back into the waveguide and directed to a camera for image capture.
FIG. 16 schematically illustrates an imaging system configured for imaging an anterior portion (e.g., cornea) of an eye. The imaging system comprises an eyepiece such as described above. The imaging system further includes a positive lens for collimating light collected from the anterior portion of the eye for coupling via an optical coupling element into a waveguide and propagation to a camera for image capture. The system further comprises a negative lens to offset the positive power introduced by the positive lens and to prevent inversion of images of the environment in front of the eyepiece that would otherwise be caused by the positive lens.
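The positive/negative lens pairing described for FIG. 16 follows from simple thin-lens arithmetic: choosing the negative lens power to cancel the positive lens leaves world light with approximately zero net power. The numbers below are illustrative assumptions, not values from the patent.

```python
def combined_power(p1, p2, d=0.0):
    """Net power in diopters of two thin lenses of powers p1 and p2
    (diopters) separated by d meters, per the thin-lens combination
    formula P = P1 + P2 - d*P1*P2."""
    return p1 + p2 - d * p1 * p2

positive = 20.0    # e.g., a 50 mm focal length collimator for the anterior eye
negative = -20.0   # offsetting lens so the world view is left unpowered

print(combined_power(positive, negative))        # → 0.0 for coincident lenses
print(combined_power(positive, negative, 0.01))  # residual power at 10 mm spacing
```

The small residual power at nonzero spacing shows why the two lenses would be placed close together (or the negative lens slightly adjusted) to keep the see-through view of the environment undistorted.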
FIG. 17 schematically illustrates another example imaging system configured for imaging an anterior portion (e.g., cornea) of an eye. The imaging system comprises a curved wavelength selective reflector that collimates light from the anterior portion of the eye for coupling via an optical coupling element into a waveguide and propagation to a camera for image capture. The wavelength selective reflector may operate in reflection for infrared light reflected from the eye and in transmission for visible light from the environment in front of the user.
FIG. 18 schematically illustrates an example imaging system that also includes a curved wavelength selective reflector that collimates light from the anterior portion of the eye for coupling via an optical coupling element into a waveguide and propagation to a camera for image capture. Polarization selectivity may be employed to assist in controlling the path of the light reflected from the eye. Illumination of the eye is provided via the waveguide instead of a plurality of light sources between the waveguide and the eye as shown in FIG. 17.
FIG. 19 schematically illustrates an imaging system that includes a shutter to assist in a procedure for subtracting out noise.
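The shutter-based noise subtraction described for FIG. 19 amounts to capturing one frame with the eye illumination present (signal plus ambient light) and one without it (ambient only), then differencing the two. The sketch below is an illustrative assumption of how such a subtraction could be performed on 8-bit frames; the patent does not prescribe this particular implementation.

```python
import numpy as np

def subtract_background(frame_lit, frame_dark):
    """Return the illumination-only component of frame_lit by subtracting
    the background frame_dark, working in a wider integer type to avoid
    uint8 wraparound and clipping negatives to zero."""
    diff = frame_lit.astype(np.int32) - frame_dark.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint8)
```

Pixels where ambient light exceeds the lit frame (sensor noise, motion between frames) clip to zero rather than wrapping around, which is why the subtraction is done in a signed intermediate type.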
FIGS. 20A-20E schematically illustrate an alternative procedure for subtracting out noise using wavelength modulation in conjunction with a curved wavelength selective reflector.
FIG. 21 shows an example eyepiece that can be used to simultaneously project light into a user's eye to provide image content thereto while receiving image data of the user's eye or of the environment in front of the user.
FIG. 22 illustrates a cross-sectional side view of an example of a cholesteric liquid crystal diffraction grating (CLCG) having a plurality of uniform chiral structures.
FIG. 23 illustrates an example of an imaging system comprising a forward-facing camera configured to image a wearer's eye using a cholesteric liquid crystal (CLC) off-axis mirror.
The drawings are provided to illustrate example embodiments and are not intended to limit the scope of the disclosure. Like reference numerals refer to like parts throughout.