Patent: Eyepieces For Augmented Reality Display System

Publication Number: 20200159023

Publication Date: 2020-05-21

Applicants: Magic Leap

Abstract

An eyepiece waveguide for an augmented reality display system is disclosed. The eyepiece waveguide can include an input coupling grating (ICG) region. The ICG region can couple an input beam into the substrate of the eyepiece waveguide as a guided beam. A first combined pupil expander-extractor (CPE) grating region can be formed on or in a surface of the substrate. The first CPE grating region can receive the guided beam, create a first plurality of diffracted beams at a plurality of distributed locations, and out-couple a first plurality of output beams. The eyepiece waveguide can also include a second CPE grating region formed on or in the opposite surface of the substrate. The second CPE grating region can receive the guided beam, create a second plurality of diffracted beams at a plurality of distributed locations, and out-couple a second plurality of output beams.

INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

[0001] This application claims priority to U.S. Provisional Patent Application 62/769,933, filed Nov. 20, 2018, and entitled “EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM.” The foregoing application(s), and any other application(s) for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application, are hereby incorporated by reference under 37 CFR 1.57.

BACKGROUND

Field

[0002] This disclosure relates to eyepieces for virtual reality, augmented reality, and mixed reality systems.

Description of the Related Art

[0003] Modern computing and display technologies have facilitated the development of virtual reality, augmented reality, and mixed reality systems. Virtual reality, or “VR,” systems create a simulated environment for a user to experience. This can be done by presenting computer-generated image data to the user through a head-mounted display. This image data creates a sensory experience which immerses the user in the simulated environment. A virtual reality scenario typically involves presentation of only computer-generated image data rather than also including actual real-world image data.

[0004] Augmented reality systems generally supplement a real-world environment with simulated elements. For example, augmented reality, or “AR,” systems may provide a user with a view of the surrounding real-world environment via a head-mounted display. However, computer-generated image data can also be presented on the display to enhance the real-world environment. This computer-generated image data can include elements which are contextually related to the real-world environment. Such elements can include simulated text, images, objects, etc. Mixed reality, or “MR,” systems are a type of AR system which also introduce simulated objects into a real-world environment, but these objects typically feature a greater degree of interactivity. The simulated elements are often interactive in real time.

[0005] FIG. 1 depicts an example AR scene 1 where a user sees a real-world park setting 6 featuring people, trees, buildings in the background, and a concrete platform 20. In addition to these items, computer-generated image data is also presented to the user. The computer-generated image data can include, for example, a robot statue 10 standing upon the real-world platform 20, and a cartoon-like avatar character 2 flying by, which seems to be a personification of a bumblebee, even though these elements 2, 10 are not actually present in the real-world environment.

SUMMARY

[0006] In some embodiments, an eyepiece waveguide for an augmented reality display system comprises: an optically transmissive substrate having a first surface and a second surface; an input coupling grating (ICG) region formed on or in one of the surfaces of the substrate, the ICG region being configured to receive an input beam of light and to couple the input beam into the substrate as a guided beam; a first combined pupil expander-extractor (CPE) grating region formed on or in the first surface of the substrate, the first CPE grating region being positioned to receive the guided beam from the ICG region and to create a first plurality of diffracted beams at a plurality of distributed locations, and to out-couple a first plurality of output beams; and a second CPE grating region formed on or in the second surface of the substrate, the second CPE grating region being positioned to receive the guided beam from the ICG region and to create a second plurality of diffracted beams at a plurality of distributed locations, and to out-couple a second plurality of output beams.
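
For illustration of the input coupling step described above, the following Python sketch applies the grating equation to a normally incident input beam and checks that the first-order diffracted beam exceeds the critical angle inside the substrate, i.e., that the ICG couples it into a guided (totally internally reflected) mode. The wavelength, refractive indices, and grating pitch are assumptions chosen for this example and are not values specified by this disclosure.

    import numpy as np

    # Assumed values for illustration only; not specified by this disclosure.
    wavelength_nm = 520.0        # green input light
    n_air, n_sub = 1.0, 1.8      # ambient and substrate refractive indices
    pitch_nm = 380.0             # candidate ICG grating pitch

    # Critical angle for total internal reflection inside the substrate.
    theta_c = np.degrees(np.arcsin(n_air / n_sub))

    # Grating equation for a normally incident beam, first diffraction order (m = 1):
    #   n_sub * sin(theta_1) = n_air * sin(0) + m * wavelength / pitch
    sin_theta_1 = wavelength_nm / (n_sub * pitch_nm)
    theta_1 = np.degrees(np.arcsin(sin_theta_1))

    print(f"critical angle: {theta_c:.1f} deg, first-order angle: {theta_1:.1f} deg")
    print("coupled into a guided mode:", theta_1 > theta_c)

With these example numbers the first-order beam leaves the ICG at roughly 50 degrees inside the substrate, well beyond the roughly 34 degree critical angle, so it propagates along the waveguide by total internal reflection toward the CPE regions.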

[0007] In some embodiments, an eyepiece waveguide for an augmented reality display system comprises: an optically transmissive substrate; an input coupling grating (ICG) region; a first combined pupil expander-extractor (CPE) grating region; and a second CPE grating region, wherein the ICG region is configured to receive a set of a plurality of input beams of light, the set of input beams being associated with a set of k-vectors which form a field of view (FOV) shape located at the center of a k-space annulus associated with the eyepiece waveguide; wherein the ICG region is configured to diffract the input beams so as to couple them into the substrate as guided beams and so as to translate the FOV shape to a first position at least partially within the k-space annulus; wherein the first CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the first position to a second position at least partially within the k-space annulus; wherein the second CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the first position to a third position at least partially within the k-space annulus, wherein the first CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the third position to the center of the k-space annulus, and wherein the second CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the second position to the center of the k-space annulus.
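
The k-space bookkeeping described in the preceding paragraph can be made concrete with a short numeric sketch. Below, the FOV shape is represented as a grid of normalized in-plane k-vectors; the ICG and the two CPE grating regions each translate that grid by a grating vector, and membership in the k-space annulus is tested at every step. The grating vectors, field-of-view extent, and refractive indices are illustrative assumptions only; the sketch merely demonstrates that the three translations sum to zero, returning the FOV to the center of k-space where the light can be out-coupled.

    import numpy as np

    n_air, n_sub = 1.0, 1.8   # assumed refractive indices (illustrative)

    def in_annulus(k_xy):
        # Guided modes have a normalized in-plane k-vector magnitude between the
        # TIR limit (n_air) and the largest magnitude the substrate supports (n_sub).
        r = np.linalg.norm(k_xy, axis=-1)
        return (r > n_air) & (r < n_sub)

    # FOV shape: a grid of normalized in-plane k-vectors centered at the origin of k-space.
    kx, ky = np.meshgrid(np.linspace(-0.25, 0.25, 5), np.linspace(-0.15, 0.15, 5))
    fov = np.stack([kx.ravel(), ky.ravel()], axis=-1)

    # Illustrative grating vectors (normalized units). The ICG translates the FOV into the
    # annulus; each CPE region translates it to a further position; one interaction with
    # each CPE region returns the FOV to the center, where the beams are out-coupled.
    G = 1.4
    G_icg  = np.array([0.0, -G])
    G_cpe1 = G * np.array([ np.sin(np.radians(60)), np.cos(np.radians(60))])
    G_cpe2 = G * np.array([-np.sin(np.radians(60)), np.cos(np.radians(60))])

    first  = fov + G_icg          # first position: guided beams after the ICG
    second = first + G_cpe1       # second position: after one interaction with CPE 1
    third  = first + G_cpe2       # third position: after one interaction with CPE 2
    out_a  = second + G_cpe2      # CPE 2 then returns the FOV to the center (out-coupled)
    out_b  = third + G_cpe1       # CPE 1 then returns the FOV to the center (out-coupled)

    print("guided after ICG:", in_annulus(first).all())
    print("guided at second and third positions:", in_annulus(second).all(), in_annulus(third).all())
    print("returned to the center of k-space:", np.allclose(out_a, fov), np.allclose(out_b, fov))

All three checks print True with these example numbers, mirroring the sequence of translations recited above: into the annulus, across it in two different directions, and back to the center.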

[0008] In some embodiments, an eyepiece waveguide for an augmented reality display system comprises: an optically transmissive substrate having a first surface and a second surface; an input coupling grating (ICG) region formed on or in one of the surfaces of the substrate, the ICG region being configured to receive a beam of light and to couple the beam into the substrate in a guided propagation mode; and a first combined pupil expander-extractor (CPE) grating region formed on or in the first surface of the substrate, the first CPE grating region being positioned to receive the beam of light from the ICG region, and the first CPE grating region comprising a plurality of diffractive features configured to alter the propagation direction of the beam with a first interaction, and to out-couple the beam from the eyepiece waveguide with a second interaction.
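
The two-interaction behavior recited in the preceding paragraph, a first interaction that changes a beam's propagation direction and a second interaction that out-couples it, can be visualized with a toy beam-replication sketch. The code below follows one guided beam as it hops across a CPE region by total internal reflection; at each grating interaction a zeroth-order copy continues undeflected while a diffracted copy is either redirected (first interaction) or out-coupled (second interaction). The hop length, turn angle, and interaction count are arbitrary assumptions; the point is only that a single input beam yields many output beams at distributed locations.

    import numpy as np

    hop = 1.0                  # in-plane distance between successive grating interactions (arbitrary units)
    turn = np.radians(120)     # assumed direction change imparted by the first grating interaction
    n_hops = 8                 # number of interaction points to simulate

    def rotate(d, angle):
        c, s = np.cos(angle), np.sin(angle)
        return np.array([c * d[0] - s * d[1], s * d[0] + c * d[1]])

    # Each beam is (position, in-plane direction, number of grating interactions so far).
    beams = [(np.array([0.0, 0.0]), np.array([0.0, -1.0]), 0)]   # one beam arriving from the ICG
    outputs = []                                                 # locations where light is out-coupled

    for _ in range(n_hops):
        next_beams = []
        for pos, d, n in beams:
            pos = pos + hop * d                              # advance to the next interaction point
            next_beams.append((pos, d, n))                   # zeroth order: continues undeflected
            if n == 0:
                next_beams.append((pos, rotate(d, turn), 1)) # first interaction: direction change
            else:
                outputs.append(pos)                          # second interaction: out-coupled toward the eye
        beams = next_beams

    print(f"{len(outputs)} output beam locations from a single input beam")

Even this crude model yields 28 output locations after only eight simulated interaction points, which is the sense in which a single CPE region both expands the exit pupil and extracts light.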

[0009] In some embodiments, an eyepiece waveguide for an augmented reality display system comprises: an optically transmissive substrate; an input coupling grating (ICG) region; and a first combined pupil expander-extractor (CPE) grating region formed on a first side of the substrate, wherein the ICG region is configured to receive a set of a plurality of input beams of light, the set of input beams being associated with a set of k-vectors which form a field of view (FOV) shape located at the center of a k-space annulus associated with the eyepiece waveguide; wherein the ICG region is configured to diffract the input beams so as to couple them into the substrate as guided beams and so as to translate the FOV shape to a first position at least partially within the k-space annulus; wherein, with a first interaction, the first CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the first position to second and third positions at least partially within the k-space annulus; and wherein, with a second interaction, the first CPE grating region is configured to diffract the guided beams so as to translate the FOV shape from the second and third positions to the center of the k-space annulus.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] FIG. 1 illustrates a user’s view of augmented reality (AR) through an AR device.

[0011] FIG. 2 illustrates an example of a wearable display system.

[0012] FIG. 3 illustrates a conventional display system for simulating three-dimensional image data for a user.

[0013] FIG. 4 illustrates aspects of an approach for simulating three-dimensional image data using multiple depth planes.

[0014] FIGS. 5A-5C illustrate relationships between radius of curvature and focal radius.

[0015] FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user in an AR eyepiece.

[0016] FIGS. 7A-7B illustrate examples of exit beams outputted by a waveguide.

[0017] FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors.

[0018] FIG. 9A illustrates a cross-sectional side view of an example of a set of stacked waveguides that each includes an in-coupling optical element.

[0019] FIG. 9B illustrates a perspective view of an example of the plurality of stacked waveguides of FIG. 9A.

[0020] FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B.

[0021] FIG. 10 is a perspective view of an example AR eyepiece waveguide stack.

[0022] FIG. 11 is a cross-sectional view of a portion of an example eyepiece waveguide stack with an edge seal structure for supporting eyepiece waveguides in a stacked configuration.

[0023] FIGS. 12A and 12B illustrate top views of an eyepiece waveguide in operation as it projects an image toward a user’s eye.

[0024] FIG. 13A illustrates a k-vector which can be used to represent the propagation direction of a light ray or a light beam.

[0025] FIG. 13B illustrates a light ray within a planar waveguide.

[0026] FIG. 13C illustrates the permissible k-vectors for light of a given angular frequency, ω, propagating in an unbounded homogeneous medium with refractive index, n.

[0027] FIG. 13D illustrates the permissible k-vectors for light of a given angular frequency, ω, propagating in a homogeneous planar waveguide medium with refractive index, n.

[0028] FIG. 13E illustrates an annulus in k-space which corresponds to k-vectors of light waves which can be guided within a waveguide having a refractive index, n₂.

[0029] FIG. 13F shows a k-space diagram and an eyepiece waveguide which illustrate the relationship between a k-vector and the density of interactions between a guided beam corresponding to that k-vector and a diffraction grating formed on or in the waveguide.

[0030] FIG. 13G illustrates a top view of a diffraction grating and some of its associated k-space diffraction grating vectors (G-2, G-1, G1, G2).

[0031] FIG. 13H illustrates a transverse view of the diffraction grating and its effect, in k-space, on a k-vector corresponding to a normally-incident ray or beam of light.

[0032] FIG. 13I illustrates a transverse view of the diffraction grating shown in FIG. 13G and its effect, in k-space, on a k-vector corresponding to an obliquely-incident ray or beam of light.

[0033] FIG. 13J is a k-space diagram which illustrates the field of view of an image that is projected into an AR eyepiece waveguide.

[0034] FIG. 13K is a k-space diagram which shows the translational shift, in k-space, of the FOV rectangle which is caused by an input coupling grating (ICG) located at the entrance pupil of an eyepiece waveguide.

[0035] FIG. 14A illustrates an example eyepiece waveguide with an ICG region, an orthogonal pupil expander (OPE) region, and an exit pupil expander (EPE) region.

[0036] FIG. 14B illustrates the k-space operation of the eyepiece waveguide shown in FIG. 14A.

[0037] FIG. 14C illustrates the optical operation of the OPE region shown in FIGS. 14A and 14B.

[0038] FIG. 14D illustrates a technique for determining the sizes and shapes of the OPE region and the EPE region.

[0039] FIG. 15A illustrates an example embodiment of a waveguide eyepiece in which the OPE region is tilted and located such that its lower border is parallel to the upper border of the EPE region.

[0040] FIG. 15B includes k-space diagrams which illustrate the operation of the eyepiece waveguide shown in FIG. 15A.

[0041] FIG. 15C is another k-space diagram which illustrates the operation of the eyepiece waveguide shown in FIG. 15A.

[0042] FIG. 15D is a diagram of the first generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0043] FIG. 15E is a diagram of the second generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0044] FIG. 15F is a diagram of the third generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0045] FIG. 15G is a diagram which illustrates how a single input beam from the ICG region is replicated by the OPE region and redirected toward the EPE region as a plurality of beams.

[0046] FIG. 16A illustrates an example eyepiece waveguide that has a multi-directional pupil expander (MPE) region rather than an OPE region.

[0047] FIG. 16B illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region shown in FIG. 16A.

[0048] FIG. 16C is a k-space diagram which illustrates the k-space operation of the MPE region of the eyepiece waveguide shown in FIG. 16A.

[0049] FIG. 16D is a k-space diagram which further illustrates the k-space operation of the MPE region of the eyepiece waveguide shown in FIG. 16A.

[0050] FIG. 16E is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide shown in FIG. 16A.

[0051] FIG. 16F is a diagram of the first generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0052] FIG. 16G is a diagram of the second generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0053] FIG. 16H is a diagram of the third generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0054] FIG. 16I is a diagram of the fourth generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0055] FIG. 16J is a diagram which illustrates various paths which beams may follow through the MPE region and ultimately to the EPE region according to the eyepiece waveguide embodiment shown in FIG. 16A.

[0056] FIG. 16K is a diagram which illustrates how a single input beam from the ICG region is replicated by the MPE region and redirected toward the EPE region as a plurality of beams.

[0057] FIG. 16L is a side-by-side comparison which illustrates the performance of an eyepiece waveguide with an OPE region versus that of an eyepiece waveguide with an MPE region.

[0058] FIG. 16M further illustrates the performance of an eyepiece waveguide with an MPE region versus others with OPE regions.

[0059] FIG. 17A illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region of an eyepiece waveguide.

[0060] FIG. 17B is a k-space diagram which illustrates the k-space operation of the MPE region of an eyepiece waveguide.

[0061] FIG. 17C is a k-space diagram which illustrates the k-space operation of an eyepiece waveguide with an MPE region.

[0062] FIG. 17D is a diagram of the first generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0063] FIG. 17E is a diagram of the second generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0064] FIG. 17F is a diagram of the third generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0065] FIG. 17G is a diagram of the fourth generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0066] FIG. 18A illustrates an example eyepiece waveguide with an ICG region, two orthogonal pupil expander regions, and an exit pupil expander region.

[0067] FIGS. 18B and 18C illustrate top views of the EPE region of the eyepiece waveguide shown in FIG. 18A.

[0068] FIG. 19 illustrates an embodiment of an eyepiece waveguide with an expanded field of view.

[0069] FIG. 20A illustrates an embodiment of an expanded FOV eyepiece waveguide with an MPE region which is overlapped by an EPE region.

[0070] FIG. 20B illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region of the eyepiece waveguide in FIG. 20A.

[0071] FIG. 20C is a k-space diagram which illustrates the k-space operation of the ICG region of the eyepiece waveguide in FIG. 20A.

[0072] FIG. 20D is a k-space diagram which illustrates part of the k-space operation of the MPE region of the eyepiece waveguide in FIG. 20A.

[0073] FIG. 20E is a k-space diagram which illustrates another part of the k-space operation of the MPE region of the eyepiece waveguide in FIG. 20A.

[0074] FIG. 20F is similar to FIG. 20E, except that it shows the k-space operation of the MPE region on the FOV rectangle from FIG. 20D which was translated to the 9 o’clock position (instead of the 3 o’clock position, as illustrated in FIG. 20E).

[0075] FIG. 20G is a k-space diagram which illustrates the k-space operation of the EPE region in the eyepiece waveguide in FIG. 20A.

[0076] FIG. 20H is a k-space diagram which summarizes the k-space operation of the eyepiece waveguide in FIG. 20A.

[0077] FIG. 20I is a diagram which illustrates how beams of light spread through the eyepiece waveguide shown in FIG. 20A.

[0078] FIG. 20J illustrates how the diffractive efficiency of the MPE region in the eyepiece waveguide in FIG. 20A can be spatially varied so as to enhance the uniformity of luminance in the waveguide.

[0079] FIG. 20K illustrates how the diffractive efficiency of the EPE region in the eyepiece waveguide in FIG. 20A can be spatially varied so as to enhance the uniformity of luminance in the waveguide.

[0080] FIG. 20L illustrates an embodiment of the eyepiece waveguide in FIG. 20A which includes one or more diffractive mirrors around the peripheral edge of the waveguide.

[0081] FIG. 20M illustrates an example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 20A.

[0082] FIG. 20N illustrates another example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 20A.

[0083] FIG. 21A illustrates another embodiment of an eyepiece waveguide with an MPE region which is overlapped by an EPE region.

[0084] FIG. 21B is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide in FIG. 21A on the first set of input beams corresponding to the first sub-portion of the FOV of an input image.

[0085] FIG. 21C is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide in FIG. 21A on the second set of input beams corresponding to the second sub-portion of the FOV of the input image.

[0086] FIG. 21D is a k-space diagram which summarizes the k-space operation of the eyepiece waveguide in FIG. 21A.

[0087] FIG. 21E illustrates an example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 21A.

[0088] FIG. 21F illustrates example FOVs corresponding to the eyeglasses in FIG. 21E.

[0089] FIG. 21G illustrates the k-space operation of another embodiment of the eyepiece waveguide shown in FIG. 21A.

[0090] FIG. 22A illustrates an embodiment of an eyepiece waveguide that can project an FOV which is expanded in two directions.

[0091] FIG. 22B illustrates the opposite side of the eyepiece waveguide shown in FIG. 22A.

[0092] FIG. 22C illustrates the k-space operation of the ICG regions and the OPE regions in the eyepiece waveguide embodiment in FIG. 22A.

[0093] FIG. 22D illustrates the k-space operation of the MPE region in the eyepiece waveguide embodiment in FIG. 22A.

[0094] FIG. 22E illustrates the k-space operation of the EPE region in the eyepiece waveguide embodiment in FIG. 22A.

[0095] FIG. 23 illustrates an example embodiment of an eyepiece waveguide designed to function with an angled projector.

[0096] FIG. 24A is an edge view of an example eyepiece waveguide that has multiple combined pupil expander-extractor (CPE) regions.

[0097] FIG. 24B illustrates the operation of the first and second CPE regions in both physical space and in k-space according to a first type of main pathway of light through the eyepiece waveguide.

[0098] FIG. 24C illustrates the operation of the first and second CPE regions in both physical space and in k-space according to a second type of main pathway of light through the eyepiece waveguide.

[0099] FIG. 24D illustrates the operation of the first and second CPE regions in both physical space and in k-space according to both the first and second types of main pathways of light through the eyepiece waveguide.

[0100] FIG. 24E is a diagram of the first generation of interactions between an input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0101] FIG. 24F is a diagram of the second generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0102] FIG. 24G is a diagram of the third generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0103] FIG. 24H is a diagram of the fourth generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0104] FIG. 24I is a diagram of the fifth generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0105] FIG. 24J illustrates, in k-space, higher-order pathways of light through the eyepiece waveguide shown in FIG. 24A.

[0106] FIG. 24K is a diagram which illustrates how beams of light spread through the eyepiece waveguide shown in FIG. 24A.

[0107] FIG. 25A is an edge view of an example eyepiece waveguide that has a single 2D combined pupil expander-extractor (CPE) grating region.

[0108] FIG. 25B illustrates the operation of the 2D CPE region in both physical space and in k-space.

[0109] FIG. 26A is an edge view of an example eyepiece waveguide that has a 2D combined pupil expander-extractor (CPE) grating region on each of its sides.

[0110] FIG. 26B illustrates the so-called “screen door effect,” which is an image artifact related to the density of output beams from an eyepiece waveguide.

[0111] FIG. 26C illustrates input coupling grating re-bounce, which is an effect that can cause light to be disadvantageously lost from an eyepiece waveguide.

[0112] FIG. 26D illustrates how the double-sided 2D CPE gratings in FIG. 26A increase the density of output beams from the eyepiece waveguide.

[0113] FIG. 26E illustrates the density of output beams for the eyepiece waveguides shown in FIG. 24A (double-sided 1D CPE gratings), FIG. 25A (single-sided 2D CPE grating), and FIG. 26A (double-sided 2D CPE gratings).

[0114] FIG. 26F shows example simulated images produced by eyepiece waveguides with 2D CPE gratings; images for both the case of the single-sided embodiment of FIG. 25A and the double-sided embodiment of FIG. 26A are shown.

DETAILED DESCRIPTION

Overview

[0115] This disclosure describes a variety of eyepiece waveguides which can be used in AR display systems to project images to a user’s eye. The eyepiece waveguides are described both in physical terms and using k-space representations.

Example HMD Device

[0116] FIG. 2 illustrates an example wearable display system 60. The display system 60 includes a display or eyepiece 70, and various mechanical and electronic modules and systems to support the functioning of that display 70. The display 70 may be coupled to a frame 80, which is wearable by a display system user 90 and which is configured to position the display 70 in front of the eyes of the user 90. The display 70 may be considered eyewear in some embodiments. In some embodiments, a speaker 100 is coupled to the frame 80 and is positioned adjacent the ear canal of the user 90. The display system may also include one or more microphones 110 to detect sound. The microphone 110 can allow the user to provide inputs or commands to the system 60 (e.g., the selection of voice menu commands, natural language questions, etc.), and/or can allow audio communication with other persons (e.g., with other users of similar display systems). The microphone 110 can also collect audio data from the user’s surroundings (e.g., sounds from the user and/or environment). In some embodiments, the display system may also include a peripheral sensor 120a, which may be separate from the frame 80 and attached to the body of the user 90 (e.g., on the head, torso, an extremity, etc.). The peripheral sensor 120a may acquire data characterizing the physiological state of the user 90 in some embodiments.
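
For readers tracking the reference numerals in this paragraph, the following minimal sketch collects the components of the wearable display system 60 into a single hypothetical data structure; the class and field names are inventions of this example and do not appear in the disclosure.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class WearableDisplaySystem:
        # Component map for the wearable display system 60 of FIG. 2 (illustrative only).
        display_eyepiece: str = "display/eyepiece 70, positioned in front of the eyes of the user 90"
        frame: str = "frame 80, wearable by the user and carrying the display 70"
        speaker: str = "speaker 100, coupled to the frame adjacent the user's ear canal"
        microphones: int = 1                       # one or more microphones 110 for voice and ambient audio
        peripheral_sensor: Optional[str] = "peripheral sensor 120a, attachable to the user's body"

    system = WearableDisplaySystem()
    print(system)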
