
Magic Leap Patent | Eyepieces For Augmented Reality Display System

Patent: Eyepieces For Augmented Reality Display System

Publication Number: 20200400955

Publication Date: 2020-12-24

Applicants:

Abstract

An augmented reality display system. The system can include a first eyepiece waveguide with a first input coupling grating (ICG) region. The first ICG region can receive a set of input beams of light corresponding to an input image having a corresponding field of view (FOV), and can in-couple a first subset of the input beams. The first subset of input beams can correspond to a first sub-portion of the FOV. The system can also include a second eyepiece waveguide with a second ICG region. The second ICG region can receive and in-couple at least a second subset of the input beams. The second subset of the input beams can correspond to a second sub-portion of the FOV. The first and second sub-portions of the FOV can be at least partially different but together include the complete FOV of the input image.

INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

[0001] This application claims priority to U.S. Provisional Patent Application 62/863,871, filed Jun. 20, 2019, and entitled “EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM,” and to U.S. Provisional Patent Application 63/024,343, filed May 13, 2020, and entitled “EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM.” Each of these applications, and any and all other applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application, are hereby incorporated by reference under 37 CFR 1.57.

BACKGROUND

Field

[0002] This disclosure relates to eyepieces for virtual reality, augmented reality, and mixed reality systems.

Description of the Related Art

[0003] Modern computing and display technologies have facilitated the development of virtual reality, augmented reality, and mixed reality systems. Virtual reality, or “VR,” systems create a simulated environment for a user to experience. This can be done by presenting computer-generated image data to the user through a head-mounted display. This image data creates a sensory experience which immerses the user in the simulated environment. A virtual reality scenario typically involves presentation of only computer-generated image data rather than also including actual real-world image data. Augmented reality systems generally supplement a real-world environment with simulated elements. For example, augmented reality, or “AR,” systems may provide a user with a view of the surrounding real-world environment via a head-mounted display. However, computer-generated image data can also be presented on the display to enhance the real-world environment. This computer-generated image data can include elements which are contextually related to the real-world environment. Such elements can include simulated text, images, objects, etc. Mixed reality, or “MR,” systems are a type of AR system which also introduce simulated objects into a real-world environment, but these objects typically feature a greater degree of interactivity. The simulated elements can often be interactive in real time.

[0004] FIG. 1 depicts an example AR scene 1 where a user sees a real-world park setting 6 featuring people, trees, buildings in the background, and a concrete platform 20. In addition to these items, computer-generated image data is also presented to the user. The computer-generated image data can include, for example, a robot statue 10 standing upon the real-world platform 20, and a cartoon-like avatar character 2 flying by which seems to be a personification of a bumblebee, even though these elements 2, 10 are not actually present in the real-world environment.

SUMMARY

[0005] In some embodiments, an augmented reality display system comprises: a first eyepiece waveguide comprising a first optically transmissive substrate; a first input coupling grating (ICG) region formed on or in the first eyepiece waveguide, the first ICG region being configured to receive a set of input beams of light corresponding to an input image having a corresponding field of view, and to couple a first subset of the input beams into the substrate as a first set of guided beams, the first subset of the input beams corresponding to a first sub-portion of the field of view of the input image; a second eyepiece waveguide comprising a second optically transmissive substrate; and a second input coupling grating (ICG) region formed on or in the second eyepiece waveguide, the second ICG region being configured to receive at least a second subset of the input beams of light corresponding to the input image, and to couple the second subset of input beams into the substrate as a second set of guided beams, the second subset of the input beams corresponding to a second sub-portion of the field of view of the input image, wherein the first and second sub-portions of the field of view are at least partially different but together include the complete field of view of the input image.

[0006] In some embodiments, an augmented reality display system comprises: a first eyepiece waveguide comprising a first optically transmissive substrate; a first input coupling grating (ICG) region formed on or in the first eyepiece waveguide, the first ICG region being configured to receive a set of input beams of light, the set of input beams being associated with a set of k-vectors in k-space corresponding to an input image, and to translate the set of k-vectors to a location in k-space such that a first subset of the k-vectors lies inside a first k-space annulus associated with the first eyepiece waveguide, the first k-space annulus corresponding to a region in k-space associated with guided propagation in the first eyepiece waveguide; a second eyepiece waveguide comprising a second optically transmissive substrate; and a second input coupling grating (ICG) region formed on or in the second eyepiece waveguide, the second ICG region being configured to receive at least a portion of the set of input beams of light, and to translate the set of k-vectors to a location in k-space such that a second subset of the k-vectors lies inside a second k-space annulus associated with the second eyepiece waveguide, the second k-space annulus corresponding to a region in k-space associated with guided propagation in the second eyepiece waveguide; wherein the first and second subsets of the k-vectors are at least partially different but together include the complete set of k-vectors corresponding to the input image.
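
As an illustration of the k-space condition described above, the short Python sketch below checks which sampled field-of-view directions remain guided after each ICG translates their transverse k-vectors into the annulus between the air light cone and the substrate light cone. It is a minimal illustration only: the wavelength, refractive index, grating pitch, field-of-view extent, and all names are assumptions made for this sketch and are not taken from this disclosure.

    import numpy as np

    def is_guided(k_t, n):
        # A transverse k-vector (normalized by k0 = 2*pi/wavelength) propagates by
        # total internal reflection when it lies in the annulus between the air
        # light cone (radius 1) and the substrate light cone (radius n).
        r = np.linalg.norm(k_t)
        return 1.0 < r < n

    n = 1.8                                  # substrate refractive index (assumed)
    wavelength = 0.52e-6                     # green light, meters (assumed)
    pitch = 0.40e-6                          # ICG grating period, meters (assumed)
    g = wavelength / pitch                   # normalized grating vector magnitude
    G1 = np.array([+g, 0.0])                 # first ICG translates k-vectors one way
    G2 = np.array([-g, 0.0])                 # second ICG translates them the other way

    # Sample the input FOV as propagation angles in air; the normalized transverse
    # k-vector of a beam at angles (theta_x, theta_y) is (sin(theta_x), sin(theta_y)).
    tx = np.deg2rad(np.linspace(-25.0, 25.0, 11))
    ty = np.deg2rad(np.linspace(-12.5, 12.5, 5))
    fov = [(np.sin(a), np.sin(b)) for a in tx for b in ty]

    guided_1 = {d for d in fov if is_guided(np.array(d) + G1, n)}
    guided_2 = {d for d in fov if is_guided(np.array(d) + G2, n)}

    print(f"waveguide 1 guides {len(guided_1)} of {len(fov)} sampled directions")
    print(f"waveguide 2 guides {len(guided_2)} of {len(fov)} sampled directions")
    print("complete FOV covered by the pair:", (guided_1 | guided_2) == set(fov))

With these assumed values, neither waveguide alone guides every sampled direction, but their union does, which mirrors the complementary sub-portions of the field of view described in the preceding paragraphs.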

BRIEF DESCRIPTION OF THE DRAWINGS

[0007] FIG. 1 illustrates a user’s view of augmented reality (AR) through an AR device.

[0008] FIG. 2 illustrates an example of a wearable display system.

[0009] FIG. 3 illustrates a conventional display system for simulating three-dimensional image data for a user.

[0010] FIG. 4 illustrates aspects of an approach for simulating three-dimensional image data using multiple depth planes.

[0011] FIGS. 5A-5C illustrate relationships between radius of curvature and focal radius.

[0012] FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user in an AR eyepiece.

[0013] FIGS. 7A-7B illustrate examples of exit beams outputted by a waveguide.

[0014] FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors.

[0015] FIG. 9A illustrates a cross-sectional side view of an example of a set of stacked waveguides that each includes an in-coupling optical element.

[0016] FIG. 9B illustrates a perspective view of an example of the plurality of stacked waveguides of FIG. 9A.

[0017] FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B.

[0018] FIG. 10 is a perspective view of an example AR eyepiece waveguide stack.

[0019] FIG. 11 is a cross-sectional view of a portion of an example eyepiece waveguide stack with an edge seal structure for supporting eyepiece waveguides in a stacked configuration.

[0020] FIGS. 12A and 12B illustrate top views of an eyepiece waveguide in operation as it projects an image toward a user’s eye.

[0021] FIG. 13A illustrates a k-vector which can be used to represent the propagation direction of a light ray or a light beam.

[0022] FIG. 13B illustrates a light ray within a planar waveguide.

[0023] FIG. 13C illustrates the permissible k-vectors for light of a given angular frequency, ω, propagating in an unbounded homogenous medium with refractive index, n.

[0024] FIG. 13D illustrates the permissible k-vectors for light of a given angular frequency, ω, propagating in a homogenous planar waveguide medium with refractive index, n.

[0025] FIG. 13E illustrates an annulus in k-space which corresponds to k-vectors of light waves which can be guided within a waveguide having a refractive index, n.sub.2.

[0026] FIG. 13F shows a k-space diagram and an eyepiece waveguide which illustrate the relationship between a k-vector and the density of interactions between a guided beam corresponding to that k-vector and a diffraction grating formed on or in the waveguide.

[0027] FIG. 13G illustrates a top view of a diffraction grating and some of its associated k-space diffraction grating vectors (G-2, G-1, G1, G2).

[0028] FIG. 13H illustrates a transverse view of the diffraction grating and its effect, in k-space, on a k-vector corresponding to a normally-incident ray or beam of light.

[0029] FIG. 13I illustrates a transverse view of the diffraction grating shown in FIG. 13G and its effect, in k-space, on a k-vector corresponding to an obliquely-incident ray or beam of light.

[0030] FIG. 13J is a k-space diagram which illustrates the field of view of an image that is projected into an AR eyepiece waveguide.

[0031] FIG. 13K is a k-space diagram which shows the translational shift, in k-space, of the FOV rectangle which is caused by an input coupling grating (ICG) located at the entrance pupil of an eyepiece waveguide.

[0032] FIG. 14A illustrates an example eyepiece waveguide with an ICG region, an orthogonal pupil expander (OPE) region, and an exit pupil expander (EPE) region.

[0033] FIG. 14B illustrates the k-space operation of the eyepiece waveguide shown in FIG. 14A.

[0034] FIG. 14C illustrates the optical operation of the OPE region shown in FIGS. 14A and 14B.

[0035] FIG. 14D illustrates a technique for determining the sizes and shapes of the OPE region and the EPE region.

[0036] FIG. 15A illustrates an example embodiment of a waveguide eyepiece in which the OPE region is tilted and located such that its lower border is parallel to the upper border of the EPE region.

[0037] FIG. 15B includes k-space diagrams which illustrate the operation of the eyepiece waveguide shown in FIG. 15A.

[0038] FIG. 15C is another k-space diagram which illustrates the operation of the eyepiece waveguide shown in FIG. 15A.

[0039] FIG. 15D is a diagram of the first generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0040] FIG. 15E is a diagram of the second generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0041] FIG. 15F is a diagram of the third generation of interactions between an input beam and the OPE region of the eyepiece waveguide embodiment shown in FIG. 15A.

[0042] FIG. 15G is a diagram which illustrates how a single input beam from the ICG region is replicated by the OPE region and redirected toward the EPE region as a plurality of beams.

[0043] FIG. 16A illustrates an example eyepiece waveguide that has a multi-directional pupil expander (MPE) region rather than an OPE region.

[0044] FIG. 16B illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region shown in FIG. 16A.

[0045] FIG. 16C is a k-space diagram which illustrates the k-space operation of the MPE region of the eyepiece waveguide shown in FIG. 16A.

[0046] FIG. 16D is a k-space diagram which further illustrates the k-space operation of the MPE region of the eyepiece waveguide shown in FIG. 16A.

[0047] FIG. 16E is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide shown in FIG. 16A.

[0048] FIG. 16F is a diagram of the first generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0049] FIG. 16G is a diagram of the second generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0050] FIG. 16H is a diagram of the third generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0051] FIG. 16I is a diagram of the fourth generation of interactions between an input beam and the MPE region of the eyepiece waveguide embodiment shown in FIG. 16A.

[0052] FIG. 16J is a diagram which illustrates various paths which beams may follow through the MPE region and ultimately to the EPE region according to the eyepiece waveguide embodiment shown in FIG. 16A.

[0053] FIG. 16K is a diagram which illustrates how a single input beam from the ICG region is replicated by the MPE region and redirected toward the EPE region as a plurality of beams.

[0054] FIG. 16L is a side-by-side comparison which illustrates the performance of an eyepiece waveguide with an OPE region versus that of an eyepiece waveguide with an MPE region.

[0055] FIG. 16M further illustrates the performance of an eyepiece waveguide with an MPE region versus others with OPE regions.

[0056] FIG. 17A illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region of an eyepiece waveguide.

[0057] FIG. 17B is a k-space diagram which illustrates the k-space operation of the MPE region of an eyepiece waveguide.

[0058] FIG. 17C is a k-space diagram which illustrates the k-space operation of an eyepiece waveguide with an MPE region.

[0059] FIG. 17D is a diagram of the first generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0060] FIG. 17E is a diagram of the second generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0061] FIG. 17F is a diagram of the third generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0062] FIG. 17G is a diagram of the fourth generation of interactions between an input beam and the MPE region of an eyepiece waveguide.

[0063] FIG. 18A illustrates an example eyepiece waveguide with an ICG region, two orthogonal pupil expander regions, and an exit pupil expander region.

[0064] FIGS. 18B and 18C illustrate top views of the EPE region of the eyepiece waveguide shown in FIG. 18A.

[0065] FIG. 19 illustrates an embodiment of an eyepiece waveguide with an expanded field of view.

[0066] FIG. 20A illustrates an embodiment of an expanded FOV eyepiece waveguide with an MPE region which is overlapped by an EPE region.

[0067] FIG. 20B illustrates a portion of an example 2D grating, along with its associated grating vectors, which can be used in the MPE region of the eyepiece waveguide in FIG. 20A.

[0068] FIG. 20C is a k-space diagram which illustrates the k-space operation of the ICG region of the eyepiece waveguide in FIG. 20A.

[0069] FIG. 20D is a k-space diagram which illustrates part of the k-space operation of the MPE region of the eyepiece waveguide in FIG. 20A.

[0070] FIG. 20E is a k-space diagram which illustrates another part of the k-space operation of the MPE region of the eyepiece waveguide in FIG. 20A.

[0071] FIG. 20F is similar to FIG. 20E, except that it shows the k-space operation of the MPE region on the FOV rectangle from FIG. 20D which was translated to the 9 o’clock position (instead of the 3 o’clock position, as illustrated in FIG. 20E).

[0072] FIG. 20G is a k-space diagram which illustrates the k-space operation of the EPE region in the eyepiece waveguide in FIG. 20A.

[0073] FIG. 20H is a k-space diagram which summarizes the k-space operation of the eyepiece waveguide in FIG. 20A.

[0074] FIG. 20I is a diagram which illustrates how beams of light spread through the eyepiece waveguide shown in FIG. 20A.

[0075] FIG. 20J illustrates how the diffractive efficiency of the MPE region in the eyepiece waveguide in FIG. 20A can be spatially varied so as to enhance the uniformity of luminance in the waveguide.

[0076] FIG. 20K illustrates how the diffractive efficiency of the EPE region in the eyepiece waveguide in FIG. 20A can be spatially varied so as to enhance the uniformity of luminance in the waveguide.

[0077] FIG. 20L illustrates an embodiment of the eyepiece waveguide in FIG. 20A which includes one or more diffractive mirrors around the peripheral edge of the waveguide.

[0078] FIG. 20M illustrates an example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 20A.

[0079] FIG. 20N illustrates another example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 20A.

[0080] FIG. 21A illustrates another embodiment of an eyepiece waveguide with an MPE region which is overlapped by an EPE region.

[0081] FIG. 21B is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide in FIG. 21A on the first set of input beams corresponding to the first sub-portion of the FOV of an input image.

[0082] FIG. 21C is a k-space diagram which illustrates the k-space operation of the eyepiece waveguide in FIG. 21A on the second set of input beams corresponding to the second sub-portion of the FOV of the input image.

[0083] FIG. 21D is a k-space diagram which summarizes the k-space operation of the eyepiece waveguide in FIG. 21A.

[0084] FIG. 21E illustrates an example embodiment of eyeglasses which incorporate one or more instances of the eyepiece waveguide in FIG. 21A.

[0085] FIG. 21F illustrates example FOVs corresponding to the eyeglasses in FIG. 21E.

[0086] FIG. 21G illustrates the k-space operation of another embodiment of the eyepiece waveguide shown in FIG. 21A.

[0087] FIG. 22A illustrates an embodiment of an eyepiece waveguide that can project an FOV which is expanded in two directions.

[0088] FIG. 22B illustrates the opposite side of the eyepiece waveguide shown in FIG. 22A.

[0089] FIG. 22C illustrates the k-space operation of the ICG regions and the OPE regions in the eyepiece waveguide embodiment in FIG. 22A.

[0090] FIG. 22D illustrates the k-space operation of the MPE region in the eyepiece waveguide embodiment in FIG. 22A.

[0091] FIG. 22E illustrates the k-space operation of the EPE region in the eyepiece waveguide embodiment in FIG. 22A.

[0092] FIG. 23 illustrates an example embodiment of an eyepiece waveguide designed to function with an angled projector.

[0093] FIG. 24A is an edge view of an example eyepiece waveguide that has multiple combined pupil expander-extractor (CPE) regions.

[0094] FIG. 24B illustrates the operation of the first and second CPE regions in both physical space and in k-space according to a first type of main pathway of light through the eyepiece waveguide.

[0095] FIG. 24C illustrates the operation of the first and second CPE regions in both physical space and in k-space according to a second type of main pathway of light through the eyepiece waveguide.

[0096] FIG. 24D illustrates the operation of the first and second CPE regions in both physical space and in k-space according to both the first and second types of main pathways of light through the eyepiece waveguide.

[0097] FIG. 24E is a diagram of the first generation of interactions between an input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0098] FIG. 24F is a diagram of the second generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0099] FIG. 24G is a diagram of the third generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0100] FIG. 24H is a diagram of the fourth generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0101] FIG. 24I is a diagram of the fifth generation of interactions between the input beam and the CPE regions of the eyepiece waveguide embodiment shown in FIG. 24A.

[0102] FIG. 24J illustrates, in k-space, higher-order pathways of light through the eyepiece waveguide shown in FIG. 24A.

[0103] FIG. 24K is a diagram which illustrates how beams of light spread through the eyepiece waveguide shown in FIG. 24A.

[0104] FIG. 25A is an edge view of an example eyepiece waveguide that has a single 2D combined pupil expander-extractor (CPE) grating region.

[0105] FIG. 25B illustrates the operation of the 2D CPE region in both physical space and in k-space.

[0106] FIG. 26A is an edge view of an example eyepiece waveguide that has a 2D combined pupil expander-extractor (CPE) grating region on each of its sides.

[0107] FIG. 26B illustrates the so-called “screen door effect” which is an image artifact that is related to the density of output beams from an eyepiece waveguide.

[0108] FIG. 26C illustrates input coupling grating re-bounce, which is an effect that can cause light to be disadvantageously lost from an eyepiece waveguide.

[0109] FIG. 26D illustrates how the double-sided 2D CPE gratings in FIG. 26A increase the density of output beams from the eyepiece waveguide.

[0110] FIG. 26E illustrates the density of output beams for the eyepiece waveguides shown in FIG. 24A (double-sided 1D CPE gratings), FIG. 25A (single-sided 2D CPE grating), and FIG. 26A (double-sided 2D CPE gratings).

[0111] FIG. 26F shows example simulated images produced by eyepiece waveguides with 2D CPE gratings; images for both the case of the single-sided embodiment of FIG. 25A and the double-sided embodiment of FIG. 26A are shown.

[0112] FIG. 27A illustrates an example embodiment of an eyepiece waveguide stack with enhanced FOV.

[0113] FIG. 27B illustrates another example embodiment of an eyepiece waveguide stack with enhanced FOV.

[0114] FIGS. 27C-27E include k-space diagrams that illustrate the k-space operation of the example embodiments of the eyepiece waveguide stacks shown in FIGS. 27A and 27B for three different refractive indexes.

[0115] FIGS. 27F-27H illustrate the sub-portions of the FOV of each color component of the input image which can be in-coupled into each of the eyepiece waveguides in the stack shown in FIGS. 27A and 27B, according to certain embodiments where each color component is partially carried in two of the eyepiece waveguides.

[0116] FIGS. 27I-27K are similar to FIGS. 27F-27H in that they illustrate the sub-portions of the FOV of each color component of the input image which can be in-coupled into each of the eyepiece waveguides in the stack, except that FIGS. 27I-27K illustrate certain embodiments where each color component is partially carried in three of the eyepiece waveguides rather than two.

[0117] FIGS. 27L-27N illustrate the sub-portions of the FOV of each color component of the input image which can be in-coupled into single-color-component-per-layer eyepiece waveguides.

[0118] FIG. 28A illustrates another example embodiment of an eyepiece waveguide stack with enhanced FOV and an in-line pupil ICG configuration.

[0119] FIG. 28B illustrates another example embodiment of an eyepiece waveguide stack with enhanced FOV and a split pupil ICG configuration.

[0120] FIG. 29 is a graph which plots the FOV values in Table 1 as a function of refractive index.

[0121] FIG. 30 illustrates an example of improved output image uniformity using the eyepiece waveguide stacks shown in FIGS. 27A-27B and FIGS. 28A-28B.

DETAILED DESCRIPTION

Overview

[0122] This disclosure describes a variety of eyepiece waveguides which can be used in AR display systems to project images to a user’s eye. The eyepiece waveguides are described both in physical terms and using k-space representations.

Example HMD Device

[0123] FIG. 2 illustrates an example wearable display system 60. The display system 60 includes a display or eyepiece 70, and various mechanical and electronic modules and systems to support the functioning of that display 70. The display 70 may be coupled to a frame 80, which is wearable by a display system user 90 and which is configured to position the display 70 in front of the eyes of the user 90. The display 70 may be considered eyewear in some embodiments. In some embodiments, a speaker 100 is coupled to the frame 80 and is positioned adjacent the ear canal of the user 90. The display system may also include one or more microphones 110 to detect sound. The microphone 110 can allow the user to provide inputs or commands to the system 60 (e.g., the selection of voice menu commands, natural language questions, etc.), and/or can allow audio communication with other persons (e.g., with other users of similar display systems). The microphone 110 can also collect audio data from the user’s surroundings (e.g., sounds from the user and/or environment). In some embodiments, the display system may also include a peripheral sensor 120a, which may be separate from the frame 80 and attached to the body of the user 90 (e.g., on the head, torso, an extremity, etc.). The peripheral sensor 120a may acquire data characterizing the physiological state of the user 90 in some embodiments.

[0124] The display 70 is operatively coupled by a communications link 130, such as by a wired lead or wireless connectivity, to a local processing and data module 140 which may be mounted in a variety of configurations, such as fixedly attached to the frame 80, fixedly attached to a helmet or hat worn by the user, embedded in headphones, or removably attached to the user 90 (e.g., in a backpack-style configuration or in a belt-coupling style configuration). Similarly, the sensor 120a may be operatively coupled by communications link 120b (e.g., a wired lead or wireless connectivity) to the local processing and data module 140. The local processing and data module 140 may include a hardware processor, as well as digital memory, such as non-volatile memory (e.g., flash memory or a hard disk drive), both of which may be utilized to assist in the processing, caching, and storage of data. The data may include data 1) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), such as image capture devices (e.g., cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, gyros, and/or other sensors disclosed herein; and/or 2) acquired and/or processed using a remote processing module 150 and/or a remote data repository 160 (including data relating to virtual content), possibly for passage to the display 70 after such processing or retrieval. The local processing and data module 140 may be operatively coupled by communication links 170, 180, such as via wired or wireless communication links, to the remote processing module 150 and the remote data repository 160 such that these remote modules 150, 160 are operatively coupled to each other and available as resources to the local processing and data module 140. In some embodiments, the local processing and data module 140 may include one or more of the image capture devices, microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, and/or gyros. In some other embodiments, one or more of these sensors may be attached to the frame 80, or may be standalone devices that communicate with the local processing and data module 140 by wired or wireless communication pathways.

[0125] The remote processing module 150 may include one or more processors to analyze and process data, such as image and audio information. In some embodiments, the remote data repository 160 may be a digital data storage facility, which may be available through the internet or other networking configuration in a “cloud” resource configuration. In some embodiments, the remote data repository 160 may include one or more remote servers, which provide information (e.g., information for generating augmented reality content) to the local processing and data module 140 and/or the remote processing module 150. In other embodiments, all data is stored and all computations are performed in the local processing and data module, allowing fully autonomous use from a remote module.

[0126] The perception of an image as being “three-dimensional” or “3-D” may be achieved by providing slightly different presentations of the image to each eye of the user. FIG. 3 illustrates a conventional display system for simulating three-dimensional image data for a user. Two distinct images 190, 200, one for each eye 210, 220, are output to the user. The images 190, 200 are spaced from the eyes 210, 220 by a distance 230 along an optical or z-axis that is parallel to the line of sight of the user. The images 190, 200 are flat and the eyes 210, 220 may focus on the images by assuming a single accommodated state. Such 3-D display systems rely on the human visual system to combine the images 190, 200 to provide a perception of depth and/or scale for the combined image.

[0127] However, the human visual system is complicated and providing a realistic perception of depth is challenging. For example, many users of conventional “3-D” display systems find such systems to be uncomfortable or may not perceive a sense of depth at all. Objects may be perceived as being “three-dimensional” due to a combination of vergence and accommodation. Vergence movements (e.g., rotation of the eyes so that the pupils move toward or away from each other to converge the respective lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with focusing (or “accommodation”) of the lenses of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the “accommodation-vergence reflex,” as well as pupil dilation or constriction. Likewise, under normal conditions, a change in vergence will trigger a matching change in accommodation of lens shape and pupil size. As noted herein, many stereoscopic or “3-D” display systems display a scene using slightly different presentations (and, so, slightly different images) to each eye such that a three-dimensional perspective is perceived by the human visual system. Such systems can be uncomfortable for some users, however, since they simply provide image information at a single accommodated state and work against the “accommodation-vergence reflex.” Display systems that provide a better match between accommodation and vergence may form more realistic and comfortable simulations of three-dimensional image data.
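
As a rough numeric illustration of the mismatch described above, the sketch below compares the vergence and accommodation demands of natural viewing with the single accommodated state offered by a conventional stereoscopic display. The interpupillary distance and the fixed display focal distance are assumed, typical values chosen for this illustration; they are not specified by this disclosure.

    import math

    ipd_m = 0.063                      # interpupillary distance, meters (assumed typical value)
    display_focus_m = 2.0              # fixed accommodation distance of the display (assumed)

    for object_distance_m in (0.5, 1.0, 4.0):
        # Under natural viewing, both cues track the object distance.
        vergence_deg = 2 * math.degrees(math.atan((ipd_m / 2) / object_distance_m))
        natural_focus_d = 1.0 / object_distance_m
        # A fixed-focus stereoscopic display varies vergence but not accommodation.
        display_focus_d = 1.0 / display_focus_m
        conflict_d = natural_focus_d - display_focus_d
        print(f"{object_distance_m:.1f} m: vergence {vergence_deg:.1f} deg, "
              f"natural focus {natural_focus_d:.2f} D, display focus {display_focus_d:.2f} D, "
              f"conflict {conflict_d:+.2f} D")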

[0128] FIG. 4 illustrates aspects of an approach for simulating three-dimensional image data using multiple depth planes. With reference to FIG. 4, the eyes 210, 220 assume different accommodated states to focus on objects at various distances on the z-axis. Consequently, a particular accommodated state may be said to be associated with a particular one of the illustrated depth planes 240, which has an associated focal distance, such that objects or parts of objects in a particular depth plane are in focus when the eye is in the accommodated state for that depth plane. In some embodiments, three-dimensional image data may be simulated by providing different presentations of an image for each of the eyes 210, 220, and also by providing different presentations of the image corresponding to multiple depth planes. While the respective fields of view of the eyes 210, 220 are shown as being separate for clarity of illustration, they may overlap, for example, as distance along the z-axis increases. In addition, while the depth planes are shown as being flat for ease of illustration, it will be appreciated that the contours of a depth plane may be curved in physical space, such that all features in a depth plane are in focus with the eye in a particular accommodated state.

[0129] The distance between an object and an eye 210 or 220 may also change the amount of divergence of light from that object, as viewed by that eye. FIGS. 5A-5C illustrate relationships between distance and the divergence of light rays. The distance between the object and the eye 210 is represented by, in order of decreasing distance, R1, R2, and R3. As shown in FIGS. 5A-5C, the light rays become more divergent as distance to the object decreases. As distance increases, the light rays become more collimated. Stated another way, it may be said that the light field produced by a point (the object or a part of the object) has a spherical wavefront curvature, which is a function of how far away the point is from the eye of the user. The curvature increases with decreasing distance between the object and the eye 210. Consequently, at different depth planes, the degree of divergence of light rays is also different, with the degree of divergence increasing with decreasing distance between depth planes and the user’s eye 210. While only a single eye 210 is illustrated for clarity of illustration in FIGS. 5A-5C and other figures herein, it will be appreciated that the discussions regarding the eye 210 may be applied to both eyes 210 and 220 of a user.
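
This relationship can be stated numerically: the wavefront curvature reaching the eye from a point source is approximately the reciprocal of the viewing distance, expressed in diopters, so curvature grows as the object approaches. The distances below are hypothetical stand-ins for R1, R2, and R3 and are not values from this disclosure.

    # Wavefront curvature (diopters) = 1 / distance (meters); nearer points
    # produce more strongly diverging, more curved wavefronts at the eye.
    for label, distance_m in [("R1", 3.0), ("R2", 1.0), ("R3", 0.33)]:
        curvature_diopters = 1.0 / distance_m
        print(f"{label}: distance {distance_m:.2f} m -> curvature {curvature_diopters:.2f} diopters")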

[0130] A highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of a limited number of depth planes. The different presentations may be separately focused by the user’s eye, thereby helping to provide the user with depth cues based on the amount of accommodation of the eye required to bring into focus different image features for the scene located on different depth planes and/or based on observing different image features on different depth planes being out of focus.

Example of a Waveguide Stack Assembly for an AR or MR Eyepiece

[0131] FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user in an AR eyepiece. A display system 250 includes a stack of waveguides, or stacked waveguide assembly, 260 that may be utilized to provide three-dimensional perception to the eye/brain using a plurality of waveguides 270, 280, 290, 300, 310. In some embodiments, the display system 250 is the system 60 of FIG. 2, with FIG. 6 schematically showing some parts of that system 60 in greater detail. For example, the waveguide assembly 260 may be part of the display 70 of FIG. 2. It will be appreciated that the display system 250 may be considered a light field display in some embodiments.

[0132] The waveguide assembly 260 may also include a plurality of features 320, 330, 340, 350 between the waveguides. In some embodiments, the features 320, 330, 340, 350 may be one or more lenses. The waveguides 270, 280, 290, 300, 310 and/or the plurality of lenses 320, 330, 340, 350 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image injection devices 360, 370, 380, 390, 400 may function as a source of light for the waveguides and may be utilized to inject image information into the waveguides 270, 280, 290, 300, 310, each of which may be configured, as described herein, to distribute incoming light across each respective waveguide, for output toward the eye 210. Light exits an output surface 410, 420, 430, 440, 450 of each respective image injection device 360, 370, 380, 390, 400 and is injected into a corresponding input surface 460, 470, 480, 490, 500 of the respective waveguides 270, 280, 290, 300, 310. In some embodiments, each of the input surfaces 460, 470, 480, 490, 500 may be an edge of a corresponding waveguide, or may be part of a major surface of the corresponding waveguide (that is, one of the waveguide surfaces directly facing the world 510 or the user’s eye 210). In some embodiments, a beam of light (e.g., a collimated beam) may be injected into each waveguide and may be replicated, such as by sampling into beamlets by diffraction, in the waveguide and then directed toward the eye 210 with an amount of optical power corresponding to the depth plane associated with that particular waveguide. In some embodiments, a single one of the image injection devices 360, 370, 380, 390, 400 may be associated with, and inject light into, a plurality (e.g., three) of the waveguides 270, 280, 290, 300, 310.

[0133] In some embodiments, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other embodiments, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display which may transmit image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors.

[0134] In some embodiments, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projector system 520, which includes a light module 530, which may include a light source or light emitter, such as a light emitting diode (LED). The light from the light module 530 may be directed to, and modulated by, a light modulator 540 (e.g., a spatial light modulator), via a beamsplitter (BS) 550. The light modulator 540 may spatially and/or temporally change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310. Examples of spatial light modulators include liquid crystal displays (LCD), including liquid crystal on silicon (LCOS) displays, and digital light processing (DLP) displays.

[0135] In some embodiments, the light projector system 520, or one or more components thereof, may be attached to the frame 80 (FIG. 2). For example, the light projector system 520 may be part of a temporal portion (e.g., ear stem 82) of the frame 80 or disposed at an edge of the display 70. In some embodiments, the light module 530 may be separate from the BS 550 and/or light modulator 540.

[0136] In some embodiments, the display system 250 may be a scanning fiber display comprising one or more scanning fibers to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more waveguides 270, 280, 290, 300, 310 and ultimately into the eye 210 of the user. In some embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which are configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. One or more optical fibers may transmit light from the light module 530 to the one or more waveguides 270, 280, 290, 300, and 310. In addition, one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, for example, redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.

[0137] A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 530, and the light modulator 540. In some embodiments, the controller 560 is part of the local data processing module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310. In some embodiments, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150 (FIG. 2) in some embodiments.

[0138] The waveguides 270, 280, 290, 300, 310 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 270, 280, 290, 300, 310 may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces. In the illustrated configuration, the waveguides 270, 280, 290, 300, 310 may each include out-coupling optical elements 570, 580, 590, 600, 610 that are configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 210. Extracted light may also be referred to as out-coupled light, and the out-coupling optical elements may also be referred to as light extracting optical elements. An extracted beam of light may be output by the waveguide at locations at which the light propagating in the waveguide strikes a light extracting optical element. The out-coupling optical elements 570, 580, 590, 600, 610 may be, for example, diffractive optical features, including diffractive gratings, as discussed further herein. While the out-coupling optical elements 570, 580, 590, 600, 610 are illustrated as being disposed at the bottom major surfaces of the waveguides 270, 280, 290, 300, 310, in some embodiments they may be disposed at the top and/or bottom major surfaces, and/or may be disposed directly in the volume of the waveguides 270, 280, 290, 300, 310, as discussed further herein. In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 270, 280, 290, 300, 310. In some other embodiments, the waveguides 270, 280, 290, 300, 310 may be a monolithic piece of material and the out-coupling optical elements 570, 580, 590, 600, 610 may be formed on a surface and/or in the interior of that piece of material.
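
For reference, propagation by TIR requires the internal propagation angle to exceed the critical angle at the substrate/air interface. The short sketch below evaluates that condition for an assumed substrate index; the index and the sample angles are illustrative choices, not values from this disclosure.

    import math

    def critical_angle_deg(n_substrate, n_outside=1.0):
        # Critical angle of the substrate/outside interface, measured from the surface normal.
        return math.degrees(math.asin(n_outside / n_substrate))

    n = 1.8                                # substrate refractive index (assumed)
    theta_c = critical_angle_deg(n)
    print(f"critical angle: {theta_c:.1f} degrees from the surface normal")

    for theta in (20.0, 40.0, 60.0):       # internal propagation angles, degrees from normal
        status = "guided by TIR" if theta > theta_c else "escapes the waveguide"
        print(f"ray at {theta:.0f} deg -> {status}")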

[0139] Each waveguide 270, 280, 290, 300, 310 may output light to form an image corresponding to a particular depth plane. For example, the waveguide 270 nearest the eye may deliver collimated beams of light to the eye 210. The collimated beams of light may be representative of the optical infinity focal plane. The next waveguide up 280 may output collimated beams of light which pass through the first lens 350 (e.g., a negative lens) before reaching the eye 210. The first lens 350 may add a slight convex wavefront curvature to the collimated beams so that the eye/brain interprets light coming from that waveguide 280 as originating from a first focal plane closer inward toward the eye 210 from optical infinity. Similarly, the third waveguide 290 passes its output light through both the first lens 350 and the second lens 340 before reaching the eye 210. The combined optical power of the first lens 350 and the second lens 340 may add another incremental amount of wavefront curvature so that the eye/brain interprets light coming from the third waveguide 290 as originating from a second focal plane that is even closer inward from optical infinity than was light from the second waveguide 280.

[0140] The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510 on the other side of the stacked waveguide assembly 260, a compensating lens layer 620 may be disposed at the top of the stack to compensate for the aggregate optical power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
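
The bookkeeping implied by this arrangement can be sketched with hypothetical lens powers (the actual powers are not specified here): each waveguide's apparent depth plane follows from the cumulative negative power of the lenses beneath it, and the compensating lens 620 supplies the equal and opposite power so that world light is unaffected.

    # Hypothetical per-lens powers in diopters for lenses 350, 340, 330, 320 (assumed values).
    waveguide_ids = [280, 290, 300, 310]
    lens_powers_diopters = [-0.5, -0.5, -1.0, -1.0]

    print("waveguide 270: collimated output -> optical infinity")
    cumulative = 0.0
    for wg, power in zip(waveguide_ids, lens_powers_diopters):
        cumulative += power
        depth_m = 1.0 / abs(cumulative)    # apparent depth plane distance for this waveguide
        print(f"waveguide {wg}: cumulative power {cumulative:+.1f} D -> depth plane at {depth_m:.2f} m")

    compensating_lens_d = -cumulative      # lens 620 cancels the stack for light from the world
    print(f"compensating lens 620: {compensating_lens_d:+.1f} D")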

[0141] In some embodiments, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides 270, 280, 290, 300, 310 may output images set to the same depth plane, or multiple subsets of the waveguides 270, 280, 290, 300, 310 may output images set to the same plurality of depth planes, with one set for each depth plane. This can provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.

……
……
……
