Magic Leap Patent | Low motion-to-photon latency architecture for augmented and virtual reality display systems

Patent: Low motion-to-photon latency architecture for augmented and virtual reality display systems

Publication Number: 20220091427

Publication Date: 20220324

Applicants: Magic Leap

Abstract

Systems and methods are disclosed for low motion-to-photon latency for augmented and virtual reality systems. Some systems generate rendered frames that are presented to a user by outputting light from a head-mounted display unit. The rendered frames are perceived by the user as virtual content. The head-mounted display unit includes an orientation sensor, a display configured to output light to the user, and processors. The processors receive a rendered frame of virtual content, obtain orientation information from the orientation sensor, and warp or modify the rendered frame of virtual content based on changes to the orientation of the user's head. The warped rendered frame is subsequently outputted from the display using modulated light. The processors and the orientation sensor may be part of a spatial light modulator for modulating the light used to present the warped rendered frame. In addition, the spatial light modulator may be an LED array having low persistence and a high duty cycle.

Claims

1. A head-mounted display system comprising: a processing system configured to generate rendered frames for output as virtual content; a head-mounted display unit in communication with the processing system via a data link, the head-mounted display unit configured to output the rendered frames as virtual content, wherein the head-mounted display unit comprises: an orientation sensor, the orientation sensor configured to detect orientation information associated with an orientation of the head-mounted display unit; a display, the display configured to output light to present the virtual content; and one or more processors, the one or more processors configured to: receive, via the data link, a rendered frame; obtain orientation information associated with the orientation of the head-mounted display unit; and warp the rendered frame, wherein the warped rendered frame is output via the display.

2. The head-mounted display system of claim 1, wherein the processing system is configured to generate rendered frames at a first frame rate, and wherein the head-mounted display unit is configured to output warped rendered frames at a second frame rate higher than the first frame rate.

3. The head-mounted display system of claim 1, wherein the data link comprises a cable connecting the processing system and head-mounted display unit, wherein a bandwidth of the data link limits the first frame rate.

4. The head-mounted display system of claim 1, wherein the head-mounted display unit is configured to warp each rendered frame a threshold number of times based on respective orientation information.

5. The head-mounted display system of claim 1, wherein a processor of the one or more processors is a hardware application-specific integrated circuit (ASIC) configured to warp rendered frames based on orientation information.

6. The head-mounted display system of claim 5, wherein the display comprises a spatial light modulator, and wherein the spatial light modulator comprises the hardware ASIC.

7. The head-mounted display system of claim 6, wherein the spatial light modulator is configured to adjust pixels of the rendered frame based on the hardware ASIC.

8. The head-mounted display system of claim 5, wherein the hardware ASIC is configured to provide information corresponding to the warped rendered frames to a spatial light modulator associated with the display.

9. The head-mounted display system of claim 1, wherein the display comprises an array of micro-LEDs, wherein each pixel of the warped rendered frame is associated with one or more of the micro-LEDs.

10. The head-mounted display system of claim 9, wherein the display is configured to update the panel globally for each warped rendered frame output by the display.

11. The head-mounted display system of claim 9, wherein the display is configured to update the panel by providing a scanned update.

12. The head-mounted display system of claim 11, wherein the scanned update comprises a sequential updating of individual pixels.

13. The head-mounted display system of claim 11, wherein the scanned update comprises sequential updating of groups of pixels at a same time.

14. The head-mounted display system of claim 1, wherein the one or more processors are configured to warp the rendered frame based on a determined gaze of a user of the head-mounted display unit.

15. The head-mounted display system of claim 1, wherein the orientation sensor is an inertial measurement unit.

16. A system comprising: one or more processors; and one or more computer storage media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: generating, by a first element of the system at a first frame rate, rendered frames of virtual content to be displayed by the system; providing, by the first element via a hardware connection, the rendered frames of virtual content to a second element of the system, the rendered frames being provided at the first frame rate; warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the system; and outputting, via a display in communication with the second element, the warped frames at the second frame rate, wherein the display is configured to output the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

17. The system of claim 16, wherein the hardware connection comprises a cable connecting the first element and second element.

18. The system of claim 17, wherein the second element and display are included in a head-mounted display unit configured to be worn by a user, and wherein the first element is connected to the head-mounted display unit via the cable.

19. The system of claim 16, wherein the display comprises micro-LEDs.

20. A method implemented by a head-mounted display system, the head-mounted display system comprising a first element and a second element, the first element being in communication with the second element via a hardware connection, the method comprising: generating, by a first element at a first frame rate, rendered frames of virtual content to be displayed via the head-mounted display system; providing, by the first element, the rendered frames to a second element, the rendered frames being provided at the first frame rate; warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the head-mounted display system; and outputting, via a display in communication with the second element, the warped frames at the second frame rate, wherein the display outputs the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

21. The method of claim 20, wherein the display comprises micro-LEDs.

Description

PRIORITY CLAIM

[0001] This application claims priority from: U.S. Provisional Application No. 62/786,199 filed on Dec. 28, 2018 and titled "LOW MOTION-TO-PHOTON LATENCY ARCHITECTURE FOR AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS"; U.S. Provisional Application No. 62/858,215 filed on Jun. 6, 2019 and titled "LOW MOTION-TO-PHOTON LATENCY ARCHITECTURE FOR AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS"; U.S. Provisional Application No. 62/800,363 filed on Feb. 1, 2019 and titled "VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS WITH EMISSIVE MICRO-DISPLAYS"; and U.S. Provisional Application No. 62/911,018 filed on Oct. 4, 2019 and titled "AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH SHARED DISPLAY FOR LEFT AND RIGHT EYES". The above-noted applications are hereby incorporated by reference herein in their entireties.

INCORPORATION BY REFERENCE

[0002] This application incorporates by reference the entireties of each of the following: U.S. Patent App. Pub. No. 2018/0061121, published Mar. 1, 2018; U.S. patent application Ser. No. 16/221,065, filed Dec. 14, 2018; and U.S. Patent App. Pub. No. 2018/0275410, published Sep. 27, 2018.

BACKGROUND

Field

[0003] The present disclosure relates to display systems and, more particularly, to augmented and virtual reality display systems.

Description of the Related Art

[0004] Modern computing and display technologies have facilitated the development of systems for so called "virtual reality" or "augmented reality" experiences, in which digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or "VR", scenario typically involves the presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or "AR", scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. A mixed reality, or "MR", scenario is a type of AR scenario and typically involves virtual objects that are integrated into, and responsive to, the natural world. For example, an MR scenario may include AR image content that appears to be blocked by or is otherwise perceived to interact with objects in the real world.

[0005] Referring to FIG. 1, an augmented reality scene 10 is depicted. The user of an AR technology sees a real-world park-like setting 20 featuring people, trees, buildings in the background, and a concrete platform 30. The user also perceives that he/she "sees" "virtual content" such as a robot statue 40 standing upon the real-world platform 30, and a flying cartoon-like avatar character 50 which seems to be a personification of a bumble bee. These elements 50, 40 are "virtual" in that they do not exist in the real world. Because the human visual perception system is complex, it is challenging to produce AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements.

SUMMARY

[0006] In some embodiments, a head-mounted display system is provided. The head-mounted display includes a processing system configured to generate rendered frames for output as virtual content, and a head-mounted display unit in communication with the processing system via a data link. The head-mounted display unit is configured to output the rendered frames as virtual content. In addition, the head-mounted display unit comprises an orientation sensor, a display, and one or more processors. The orientation sensor is configured to detect orientation information associated with an orientation of the head-mounted display unit. The display is configured to output light to present the virtual content. The one or more processors are configured to: receive, via the data link, a rendered frame; obtain orientation information associated with the orientation of the head-mounted display unit; and warp the rendered frame, wherein the warped rendered frame is output via the display.

[0007] In some other embodiments, a system is provided. The system comprises one or more processors; and one or more computer storage media storing instructions. When executed by the one or more processors, the instructions cause the one or more processors to perform operations comprising: generating, by a first element of the system at a first frame rate, rendered frames of virtual content to be displayed by the system; providing, by the first element via a hardware connection, the rendered frames of virtual content to a second element of the system, the rendered frames being provided at the first frame rate; warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the system; and outputting, via a display in communication with the second element, the warped frames at the second frame rate. The display is configured to output the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

[0008] In yet other embodiments, a method is provided. The method is implemented by a head-mounted display system that comprises a first element and a second element. The first element is in communication with the second element via a hardware connection. The method comprising: generating, by a first element at a first frame rate, rendered frames of virtual content to be displayed via the head-mounted display system; providing, by the first element, the rendered frames to a second element, the rendered frames being provided at the first frame rate; warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the head-mounted display system; and outputting, via a display in communication with the second element, the warped frames at the second frame rate. The display outputs the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

[0009] Additional examples are provided below.

[0010] Example 1. A head-mounted display system comprising: [0011] a processing system configured to generate rendered frames for output as virtual content; [0012] a head-mounted display unit in communication with the processing system via a data link, the head-mounted display unit configured to output the rendered frames as virtual content, wherein the head-mounted display unit comprises: [0013] an orientation sensor, the orientation sensor configured to detect orientation information associated with an orientation of the head-mounted display unit; [0014] a display, the display configured to output light to present the virtual content; and one or more processors, the one or more processors configured to: [0015] receive, via the data link, a rendered frame; [0016] obtain orientation information associated with the orientation of the head-mounted display unit; and [0017] warp the rendered frame, wherein the warped rendered frame is output via the display.

[0018] Example 2. The head-mounted display system of example 1, wherein the processing system is configured to generate rendered frames at a first frame rate, and wherein the head-mounted display unit is configured to output warped rendered frames at a second frame rate higher than the first frame rate.

[0019] Example 3. The head-mounted display system of example 1, wherein the data link comprises a cable connecting the processing system and head-mounted display unit, wherein a bandwidth of the data link limits the first frame rate.

[0020] Example 4. The head-mounted display system of example 1, wherein the head-mounted display unit is configured to warp each rendered frame a threshold number of times based on respective orientation information.

[0021] Example 5. The head-mounted display system of example 1, wherein a processor of the one or more processors is a hardware application-specific integrated circuit (ASIC) configured to warp rendered frames based on orientation information.

[0022] Example 6. The head-mounted display system of example 5, wherein the display comprises a spatial light modulator, and wherein the spatial light modulator comprises the hardware ASIC.

[0023] Example 7. The head-mounted display system of example 6, wherein the spatial light modulator is configured to adjust pixels of the rendered frame based on the hardware ASIC.

[0024] Example 8. The head-mounted display system of example 5, wherein the hardware ASIC is configured to provide information corresponding to the warped rendered frames to a spatial light modulator associated with the display.

[0025] Example 9. The head-mounted display system of example 1, wherein the display comprises an array of micro-LEDs, wherein each pixel of the warped rendered frame is associated with one or more of the micro-LEDs.

[0026] Example 10. The head-mounted display system of example 9, wherein the display is configured to update the panel globally for each warped rendered frame output by the display.

[0027] Example 11. The head-mounted display system of example 9, wherein the display is configured to update the panel by providing a scanned update.

[0028] Example 12. The head-mounted display system of example 11, wherein the scanned update comprises a sequential updating of individual pixels.

[0029] Example 13. The head-mounted display system of example 11, wherein the scanned update comprises sequential updating of groups of pixels at a same time.

[0030] Example 14. The head-mounted display system of example 1, wherein the one or more processors are configured to warp the rendered frame based on a determined gaze of a user of the head-mounted display unit.

[0031] Example 15. The head-mounted display system of example 1, wherein the orientation sensor is an inertial measurement unit.

[0032] Example 16. A system comprising: [0033] one or more processors; and [0034] one or more computer storage media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: [0035] generating, by a first element of the system at a first frame rate, rendered frames of virtual content to be displayed by the system; [0036] providing, by the first element via a hardware connection, the rendered frames of virtual content to a second element of the system, the rendered frames being provided at the first frame rate; [0037] warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the system; and [0038] outputting, via a display in communication with the second element, the warped frames at the second frame rate, [0039] wherein the display is configured to output the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

[0040] Example 17. The system of example 16, wherein the hardware connection comprises a cable connecting the first element and second element.

[0041] Example 18. The system of example 17, wherein the second element and display are included in a head-mounted display unit configured to be worn by a user, and wherein the first element is connected to the head-mounted display unit via the cable.

[0042] Example 19. The system of example 16, wherein the display comprises micro-LEDs.

[0043] Example 20. A method implemented by a head-mounted display system, the head-mounted display system comprising a first element and a second element, the first element being in communication with the second element via a hardware connection, the method comprising: [0044] generating, by a first element at a first frame rate, rendered frames of virtual content to be displayed via the head-mounted display system; [0045] providing, by the first element, the rendered frames to a second element, the rendered frames being provided at the first frame rate; [0046] warping, by the second element at a second frame rate higher than the first frame rate, each rendered frame a threshold number of times based on orientation information associated with the head-mounted display system; and [0047] outputting, via a display in communication with the second element, the warped frames at the second frame rate, [0048] wherein the display outputs the threshold number of warped frames associated with a first rendered frame followed by the threshold number of warped frames associated with a second subsequent rendered frame.

[0049] Example 21. The method of example 20, wherein the display comprises micro-LEDs.

BRIEF DESCRIPTION OF THE DRAWINGS

[0050] The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims. Aspects and many of the attendant advantages of this disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:

[0051] FIG. 1 illustrates a user's view of augmented reality (AR) through an AR device.

[0052] FIG. 2 illustrates a conventional display system for simulating three-dimensional imagery for a user.

[0053] FIGS. 3A-3C illustrate relationships between radius of curvature and focal radius.

[0054] FIG. 4A illustrates a representation of the accommodation-vergence response of the human visual system.

[0055] FIG. 4B illustrates examples of different accommodative states and vergence states of a pair of eyes of the user.

[0056] FIG. 4C illustrates an example of a representation of a top-down view of a user viewing content via a display system.

[0057] FIG. 4D illustrates another example of a representation of a top-down view of a user viewing content via a display system.

[0058] FIG. 5 illustrates aspects of an approach for simulating three-dimensional imagery by modifying wavefront divergence.

[0059] FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user.

[0060] FIG. 7 illustrates an example of exit beams outputted by a waveguide.

[0061] FIG. 8 illustrates an example of a stacked eyepiece in which each depth plane includes images formed using multiple different component colors.

[0062] FIG. 9A illustrates a cross-sectional side view of an example of a set of stacked waveguides that each includes an in-coupling optical element.

[0063] FIG. 9B illustrates a perspective view of an example of the plurality of stacked waveguides of FIG. 9A.

[0064] FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B.

[0065] FIG. 9D illustrates a top-down plan view of another example of a plurality of stacked waveguides.

[0066] FIG. 9E illustrates an example of wearable display system.

[0067] FIG. 10 illustrates an example of a wearable display system with a light projection system having a spatial light modulator and a separate light source.

[0068] FIG. 11A illustrates an example of a wearable display system with a light projection system having multiple emissive micro-displays.

[0069] FIG. 11B illustrates an example of an emissive micro-display with an array of light emitters.

[0070] FIG. 12 illustrates another example of a wearable display system with a light projection system having multiple emissive micro-displays and associated light redirecting structures.

[0071] FIG. 13A illustrates an example of a side-view of a wearable display system with a light projection system having multiple emissive micro-displays and an eyepiece having waveguides with overlapping and laterally-shifted light in-coupling optical elements.

[0072] FIG. 13B illustrates another example of a wearable display system with a light projection system having multiple emissive micro-displays configured to direct light to a single light in-coupling area of an eyepiece.

[0073] FIG. 14 illustrates an example of a wearable display system with a single emissive micro-display.

[0074] FIG. 15 illustrates a side view of an example of an eyepiece having a stack of waveguides with overlapping in-coupling optical elements.

[0075] FIG. 16 illustrates a side view of an example of a stack of waveguides with color filters for mitigating ghosting or crosstalk between waveguides.

[0076] FIG. 17 illustrates an example of a top-down view of the eyepieces of FIGS. 15 and 16.

[0077] FIG. 18 illustrates another example of a top-down view of the eyepieces of FIGS. 15 and 16.

[0078] FIG. 19A illustrates a side view of an example of an eyepiece having a stack of waveguides with overlapping and laterally-shifted in-coupling optical elements.

[0079] FIG. 19B illustrates a side view of an example of the eyepiece of FIG. 19A with color filters for mitigating ghosting or crosstalk between waveguides.

[0080] FIG. 20A illustrates an example of a top-down view of the eyepieces of FIGS. 19A and 19B.

[0081] FIG. 20B illustrates another example of a top-down view of the eyepieces of FIGS. 19A and 19B.

[0082] FIG. 21 illustrates a side view of an example of re-bounce in a waveguide.

[0083] FIGS. 22A-22C illustrate examples of top-down views of an eyepiece having in-coupling optical elements configured to reduce re-bounce.

[0084] FIGS. 23A-23C illustrate additional examples of top-down views of an eyepiece having in-coupling optical elements configured to reduce re-bounce.

[0085] FIG. 24A illustrates an example of angular emission profiles of light emitted by individual light emitters of an emissive micro-display, and light captured by projection optics.

[0086] FIG. 24B illustrates an example of the narrowing of angular emission profiles using an array of light collimators.

[0087] FIG. 25A illustrates an example of a side view of an array of tapered reflective wells for directing light to projection optics.

[0088] FIG. 25B illustrates an example of a side view of an asymmetric tapered reflective well.

[0089] FIGS. 26A-26C illustrate examples of differences in light paths for light emitters at different positions relative to center lines of overlying lens.

[0090] FIG. 27 illustrates an example of a side view of individual light emitters of an emissive micro-display with an overlying nano-lens array.

[0091] FIG. 28 is a perspective view of an example of the emissive micro-display of FIG. 27.

[0092] FIG. 29 illustrates an example of a wearable display system with the full-color emissive micro-display of FIG. 28.

[0093] FIG. 30A illustrates an example of a wearable display system with an emissive micro-display and an associated array of light collimators.

[0094] FIG. 30B illustrates an example of a light projection system with multiple emissive micro-displays, each with an associated array of light collimators.

[0095] FIG. 30C illustrates an example of a wearable display system with multiple emissive micro-displays, each with an associated array of light collimators.

[0096] FIGS. 31A and 31B illustrate examples of waveguide assemblies having variable focus elements for varying the wavefront divergence of light to a viewer.

[0097] FIG. 32 illustrates a block diagram of an example wearable display system having a spatial light modulator with a warp engine.

[0098] FIGS. 33A-33B illustrate block diagrams of other example wearable display systems having spatial light modulators with warp engines.

[0099] FIGS. 34A-34B illustrate example schemes to update pixels of a spatial light modulator.

[0100] FIG. 35 illustrates a flowchart of an example process for outputting a warped frame of rendered content according to the techniques described herein.

DETAILED DESCRIPTION

[0101] This specification describes, among other things, systems and techniques for providing augmented or virtual reality content to a user. In some embodiments, the augmented or virtual reality display system may render image frames that are then presented to the user. The presentation of images to the user may involve the display system outputting spatially modulated light that forms an image on the retina of the user's eye and is perceived as augmented or virtual reality content (which may also be referred to as "virtual content"). As an example, the display system may render and present the virtual content to the user at regular intervals, e.g., at one or more frame rates (e.g., 60 Hz, 120 Hz, and so on).

[0102] The orientation of the user's head and eyes may inform rendering of image frames for the virtual content. As an example, the virtual content may be configured to be perceived as fixed in position relative to the user and/or real-world objects. Thus, if the user rotates his/her head downward, the display system may adjust the rendered image frame accordingly, so that the virtual content is perceived to be at the appropriate location and shows details corresponding to the appropriate perspective. As a result, generating virtual content may involve determining the orientation of the user's head and rendering image frames based upon this determination. The orientation of the user's head may also be referred to as head pose or simply pose and, as an approximation, this pose may be determined by determining the orientation of the display mounted to the user's head.

[0103] It will be understood that a user's head may move and that a pose at a given instant may be the result of this movement. In addition, there may be a delay between the determination of the pose and the outputting of spatially modulated light to the user's eyes based on this pose. The delay may be caused, for example, by the time needed for electronics and optical systems to generate virtual content. This delay may be referred to as motion-to-photon latency.

[0104] In instances where the user's head continues to move, given the existence of motion-to-photon latency, the orientation of the user's head may change in the time span between the pose determination and the presentation of a rendered frame to the user's eyes. As a result, the rendered frame may not accurately correspond to the user's pose at the moment that he/she is presented with the rendered frame. Re-rendering the frame, however, may not address this change in pose, since the user's head may continue to move and the pose may continue to change.

[0105] One technique for addressing this change in pose is to modify the rendered frame before presenting it to the user's eyes. Such modification occurs more quickly than re-rendering the frame, thereby reducing the possibility of perceptible mismatches between the frame presented to a user and the user's pose at the moment that they receive the presented frames. For example, updated pose information may be obtained and the rendered frame may be modified to correspond to the updated pose information. Such a modification may be referred to as frame warping and each rendered frame may be warped before being presented to a user.

[0106] Even with such warping, however, there may still be perceptible mismatches between the frame presented to a user and the user's pose. For example, such mismatches may occur where the user's head moves sufficiently quickly that even the warped frame is mismatched with the user's current pose. As a result, it would be desirable to further reduce motion-to-photon latency.

[0107] In some embodiments, to reduce motion-to-photon latency, the display system may be configured to provide a first rendered frame of virtual content based on current pose, or orientation, information. The first rendered frame may be generated according to a frame rate at which virtual content is rendered (e.g., by a graphics processing unit). The first rendered frame may be presented to a user and may be warped depending upon determined pose information. Prior to rendering of a second, subsequent frame, the display system may generate and present to the user one or more additional frames. One or more of the additional frames may comprise adjustments to the first rendered frame, the adjustments being based on updated pose information. The display system may then render and present the second rendered frame (possibly after warping that second rendered frame) according to the frame rate for rendering content. Thus, the display system may render frames at the frame rate, but output additional frames of virtual content at greater than the frame rate. These additional frames may be warped depending upon pose information. Thus, in some embodiments, one or more warped frames may be presented between rendered frames.

[0108] Advantageously, the techniques and systems described herein may reduce motion-to-photon latency. The techniques and systems described herein may also reduce visual artifacts, motion blur, and so on, associated with warping. The techniques and systems may also advantageously provide power savings, processing savings, and so on.

Warping Rendered Frames

[0109] As described with reference to FIG. 1, a user may see, for example, virtual content comprising a robot 40. In this example, the display system may render the robot 40 according to a frame rate as described above. The display system may then output, or present, the rendered frames to the user. As described above, the display system may render each frame based on orientation information associated with the user. Examples of orientation information may include movement of the user's head (e.g., rotation about one or more axes, translation along one or more axes, and so on), movement of the user's eyes (e.g., rotation about one or more axes), and the like. For example, a first frame may be rendered while the user's head/eyes is/are directed straight toward the robot 40. In this example, the user may then adjust his/her head/eyes along one or more axes. When rendering the subsequent, second frame, the display system may thus utilize this orientation information to inform rendering of the second frame. For example, the robot 40 may be rendered such that it appears to remain standing vertically, but in a different part of the user's field of view.

[0110] In the above example, the second frame may be rendered a certain time period after the first frame. For an example frame rate of 60 Hz, the second frame may thus be rendered 16 milliseconds after the first frame. For an example frame rate of 120 Hz, the second frame may be rendered 8.3 milliseconds after the first frame. However, during this time period (e.g., 16 ms or 8.3 ms) the user may have rotated his/her head downwards. After this rotation, the user may still be presented with the first frame. Since this first frame was rendered based on the user looking straight ahead, the first frame may thus include an inaccurately positioned robot 40 during the rotation. When the second frame is rendered, the display system may render the robot 40 based on the detected rotation. Thus, when the second frame is presented the robot 40 may appear to be re-positioned. This updating of the position of the robot 40 may provide a visual discontinuity that is visually apparent to the user. For example, the combination of head movement (a change in pose) and motion-to-photon latency may result in a change in presented images that exceeds a threshold which is noticeable to the user, e.g., as being unnatural.

[0111] In some embodiments, the display system may be configured to warp the above-described first frame until the second frame is rendered. Warping a frame may comprise adjusting aspects of the frame based on orientation information, such as a determined head pose of the user, a determined eye gaze, and the like. Example aspects of the frame may include pixels of the frame. In this example, warping a rendered frame may move one or more pixels included in the rendered frame to respective new positions. Thus, warped frames may be generated based on the image information included in an existing rendered frame.
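The following is a minimal sketch of the pixel-level warp described above: pixels of an already-rendered frame are moved to new positions based on the change in head orientation since the frame was rendered. The small-rotation approximation (treating yaw and pitch deltas as a uniform two-dimensional pixel shift) and the pixels-per-degree mapping are assumptions made only for illustration, not the specific warp method of this disclosure.

```python
import numpy as np

def warp_frame(frame: np.ndarray, yaw_delta_deg: float, pitch_delta_deg: float,
               pixels_per_degree: float = 20.0) -> np.ndarray:
    """Shift a rendered frame to compensate for a small head rotation.

    frame: H x W x C array holding the rendered image.
    yaw_delta_deg / pitch_delta_deg: change in head orientation (e.g., from an
    IMU) since the frame was rendered.
    pixels_per_degree: display-specific mapping from rotation to pixels
    (an assumed value, for illustration only).
    """
    dx = int(round(-yaw_delta_deg * pixels_per_degree))   # yaw shifts content horizontally
    dy = int(round(pitch_delta_deg * pixels_per_degree))  # pitch shifts content vertically
    h, w = frame.shape[:2]
    warped = np.zeros_like(frame)
    # Copy only the overlapping region; newly exposed border pixels stay black.
    src_x, dst_x = slice(max(0, -dx), min(w, w - dx)), slice(max(0, dx), min(w, w + dx))
    src_y, dst_y = slice(max(0, -dy), min(h, h - dy)), slice(max(0, dy), min(h, h + dy))
    warped[dst_y, dst_x] = frame[src_y, src_x]
    return warped
```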

[0112] The display system may warp a rendered frame a multitude of times based on determined head poses of the user. With respect to the above-described example of the robot 40, the display system may thus warp the first frame at a particular frame rate until the second frame is rendered. As will be described, the display system may render virtual content at a render frame rate (e.g., 60 Hz, 120 Hz) and output the virtual content to the user at a warp frame rate (e.g., 240 Hz, 480 Hz, 2000 Hz, 2040 Hz, and so on). In this way, the user may view one or more warped frames between two time-adjacent rendered frames and the effective motion-to-photon latency is decreased.

[0113] For example, the first frame may be presented via the display system at a particular time. As described above, the first frame may be rendered based on a determined head pose of the user. As an example, the head pose may be based on an orientation sensor, such as an inertial measurement unit (IMU), associated with the display system. The display system may then generate a warped frame according to the warp frame rate. For each warped frame, the display system may adjust the first frame based on a respective determined head pose of the user. These warped frames may be presented to the user, until the second frame is rendered according to the rendered frame rate. Since the display system is presenting warped frames to the user, which are based on determined head poses of the user, the virtual content (e.g., robot 40) may appear more lifelike. For example, the virtual content may appear to move more naturally and with less apparent discrete jumps.
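As a rough illustration of the loop described above, the sketch below renders a frame at the render frame rate and, until the next frame is rendered, repeatedly re-warps that frame using the latest pose from the orientation sensor before presenting it. The function names, the 720 Hz warp rate, and the use of a sleep as a stand-in for display timing are placeholders rather than the actual system architecture.

```python
import time

RENDER_HZ = 60    # rate at which new frames are rendered (e.g., by a GPU)
WARP_HZ = 720     # rate at which warped frames are presented (illustrative value)

def display_loop(render_frame, read_pose, warp, present):
    """Present warped versions of the latest rendered frame between renders.

    render_frame(pose) -> frame, read_pose() -> pose, warp(frame, pose) -> frame,
    and present(frame) stand in for the renderer, the orientation sensor (IMU),
    the warp engine, and the display output, respectively.
    """
    warps_per_render = WARP_HZ // RENDER_HZ
    while True:
        frame = render_frame(read_pose())        # rendered at the render frame rate
        for _ in range(warps_per_render):        # output at the warp frame rate
            latest_pose = read_pose()            # fresh head pose for each warp
            present(warp(frame, latest_pose))
            time.sleep(1.0 / WARP_HZ)            # stand-in for display refresh timing
```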

[0114] Examples of warping may include late-frame time warp, asynchronous time warp, continuous time warp, and so on. Examples of continuous time warp may include read cursor redirection, pixel redirection, buffer re-smear, write cursor redirection, and so on. Further description related to warping individual rendered frames is discussed in U.S. Patent App. 2018/0061121, published Mar. 1, 2018, which is incorporated herein by reference in its entirety.

[0115] While the warping as discussed herein may provide benefits with respect to lowering the effective motion-to-photon latency, it will be appreciated that certain display technologies may constrain an effectiveness associated with the warping. In some display systems, spatially modulated light for forming images may be provided by a liquid crystal-based spatial light modulator. It is to be appreciated that the spatial light modulator may modulate perceived intensity of light to encode the light with image information. An example of such a spatial light modulator is a liquid crystal on silicon (LCoS) panel. LCoS panels may have a maximum refresh rate at which the LCoS panel is capable of effectively operating. As an example, an LCoS panel may be capable of achieving a maximum refresh rate of 120 Hz (e.g., there may be three colors, each being presented at 360 Hz). Thus, the LCoS panel may output warped images to a user at no higher than the maximum refresh rate. This maximum refresh rate may be unable to achieve a motion-to-photon latency which is imperceptible to a user.

[0116] Advantageously, in some embodiments, exceptionally low motion-to-photon latency may be achieved using display technologies providing exceptionally fast maximum refresh rates. Examples of such display technologies include arrays of light emitting diodes (LEDs), such as micro-LED arrays or displays. Micro-LED displays may comprise multitudes of micro-LEDs that each emit light. Thus, micro-LED arrays may be referred to as emissive spatial light modulators. In some embodiments, the modulators may modulate light from different light sources. In some embodiments, the modulators may be a light source. In some embodiments, each micro-LED is separately addressable. The micro-LEDs may be capable of being switched on and off very rapidly and, for example, may achieve a maximum refresh rate of 2000 Hz or more. Another example of such a display technology may include technologies based on micro-electro-mechanical systems (MEMS). For example, digital light processing (DLP) technologies may be utilized. While the description below refers to micro-LEDs for ease of discussion, it will be understood that the disclosure may utilize additional display technologies (e.g., DLP) providing refresh rates higher than, e.g., LCoS-based systems. Such additional display technologies fall within the scope of the current disclosure.

[0117] Based on utilization of the above-described display technologies, the display system may thus increase a rate at which frames of virtual content are provided to a user. For example, the display system may render frames at a render frame rate of 60 Hz, 120 Hz, and so on. Due to the enhanced display technologies described above, the display system may be capable of outputting frames of virtual content at 2000 Hz or more. With respect to micro-LEDs separated into three primary colors, the display system may therefore output frames of virtual content at 666 Hz or more. Thus, and as will be described, the display system may therefore warp a rendered frame a threshold number of times based on determined head pose of the user. With respect to the example of a render frame rate of 60 Hz, the display system may warp, and output, a rendered frame 11 times or more prior to generation of a subsequent rendered frame. In this way, the techniques described herein may provide exceptionally low motion-to-photon latency, such that users are presented with more realistic virtual content.
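The arithmetic behind the "11 times or more" figure follows directly from the example rates given above: a 2000 Hz panel driven color-sequentially across three primary colors, paired with a 60 Hz render frame rate.

```python
panel_hz = 2000                       # maximum refresh rate of the micro-LED panel
full_color_hz = panel_hz // 3         # ~666 full-color frames per second (three color fields)
render_hz = 60                        # rate at which new frames are rendered

warps_per_rendered_frame = full_color_hz // render_hz
print(warps_per_rendered_frame)       # 11 warped outputs per rendered frame
```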

[0118] The techniques described herein may therefore provide disparate example advantages. As described above, motion-to-photon latency may be improved. Additionally, the techniques may enable display system resource improvements (e.g., reduced power usage, reduced processing requirements, and so on). Furthermore, improvements in usability and performance of the display system may be provided. For example, motion blur may be reduced, while perceived brightness of presented virtual content may be enhanced.

Display System Resource Savings

[0119] It will be appreciated that a required bandwidth between processing elements and a spatial light modulator utilized by the display system may be substantial. As an example, and as illustrated at least in FIG. 9E, a graphics processing unit may be included in a local processing & data module 140 which is separate from a display unit 70 worn by a user. The local processing & data module 140 may render virtual content for presentation via the display unit 70. For example, the module 140 may render frames of virtual content and then optionally warp these rendered frames. As described herein, this module 140 may optionally be worn on the user (e.g., in a backpack, in an enclosure attachable to the user's pants, and so on). Therefore, in some embodiments the display unit 70 may receive rendered frames at the render frame rate described above. In schemes in which an LCoS panel may be utilized, the display unit 70 may therefore, and as an example, receive rendered frames at 120 Hz. In this example, the bandwidth between the module 140 and the display unit 70 may therefore represent at least the image information included in each rendered frame multiplied by 120 Hz.
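To make the bandwidth point concrete, a back-of-the-envelope estimate is sketched below; the resolution and bit depth are assumed values chosen for illustration, not parameters of the system described here.

```python
# Rough estimate of the data-link bandwidth for sending rendered frames over the cable.
width, height = 1920, 1080    # assumed per-eye resolution (illustrative)
bits_per_pixel = 24           # 8 bits per color channel (illustrative)
render_hz = 120               # rendered frames sent over the data link

bits_per_frame = width * height * bits_per_pixel
link_gbps = bits_per_frame * render_hz / 1e9
print(f"~{link_gbps:.1f} Gbit/s uncompressed at {render_hz} Hz")   # ~6.0 Gbit/s

# Sending frames at a 2000 Hz warp rate over the same link would require roughly
# 16x the bandwidth, which is why warping on the head-mounted unit keeps the
# cable traffic at the render frame rate.
```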

[0120] Since the display technologies described herein, such as micro-LEDs, may be capable of substantially higher refresh rates, the bandwidth between the local processing & data module 140 and display unit 70 may be accordingly higher. Due to a potential distance between the module 140 and display unit 70 and the bandwidth needed, there may be substantial power requirements to support the transmission of the image information at the higher refresh rates.

[0121] Advantageously, and as will be described below, one or more of the processing elements conventionally included in the local processing & data module 140 may reside in the display unit 70. As a first example, the module 140 may maintain a graphics processing unit to render frames. These rendered frames may be provided from the module 140 to the display unit 70 at a render frame rate (e.g., 60 Hz, 120 Hz). The display unit 70, however, may include one or more processing elements configured to perform the warping described above. For example, the display unit 70 may comprise a hardware warp application-specific integrated circuit (ASIC).

[0122] In this first example, the hardware warp ASIC may receive a rendered frame from the module 140, and then repeatedly warp the rendered frame according to orientation information received from an orientation sensor, such as an inertial measurement unit (IMU), from eye tracking cameras, and the like. The hardware warp ASIC may then output the warped frame at a warp frame rate (e.g., 666 Hz, 2000 Hz, and so on) to control logic of the spatial light modulator. The spatial light modulator may then cause light forming the warped frame to be presented to the user. Thus, in some embodiments, the hardware warp ASIC may be positioned physically closer to the control logic of the spatial light modulator. Due to this proximity, the techniques described herein may advantageously reduce power requirements associated with the improved warp functionality described above.

[0123] As a second example, the hardware warp ASIC described above may be included in the control logic of the spatial light modulator. In this way, the spatial light modulator may receive a rendered frame (e.g., from the local processing & data module 140), and warp the rendered frame based on information received from an orientation sensor. The spatial light modulator may then directly cause output of light forming each warped frame. In this second example, the bandwidth requirements between the module 140 and display unit 70 may be reduced. For example, the display unit 70 may receive the rendered frames at the render frame rate. Additionally, the control logic of the spatial light modulator may thus (1) directly warp the received rendered frames, and (2) cause light forming the warped frames to be output to a user.

Reduced Motion Blur

[0124] Advantageously, motion blur associated with presentation of virtual content may be reduced using the techniques and systems described herein. With respect to virtual content, it will be appreciated that motion blur may relate to a field persistence associated with presentation of the virtual content. Field persistence, as utilized herein, may indicate a time for which light forming a single virtual content frame is presented to a user. It will be appreciated that motion blur may be reduced through a reduction in the field persistence. Therefore, reducing field persistence may cause a user to be presented with a same frame of virtual content for a smaller time duration.

[0125] However, in the example of an LCoS panel, the reduction in field persistence may appreciably reduce a perceived brightness associated with presented virtual content. For example, an LCoS panel may be capable of presenting virtual content at a frame rate of 120 Hz. Thus, in this example, there may be 8.33 ms between adjacent presented frames of virtual content. The LCoS panel may utilize an LED light source (e.g., as described regarding the systems of FIGS. 6 and 9E), with the LEDs optionally comprising three primary colors. For an example frame being presented, a spatial light modulator may cause each primary color of LED to successively turn on for a threshold amount of time (e.g., 1 ms, 1.2 ms). The spatial light modulator may then cause the LEDs to turn off for the remainder of the 8.33 ms. In this example, the LEDs may be turned on for 40%, 45%, and so on, of the duration of the 8.33 ms frame (referred to herein as a "duty cycle"). While this may reduce the appearance of motion blur, it may noticeably decrease an achievable brightness.
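Using the numbers in the example above, the field persistence and duty cycle work out roughly as follows. This sketch treats the total illuminated time per frame as the field persistence, one possible reading of the definition given earlier, and takes the 1.2 ms per-color on-time as the illustrative value.

```python
frame_hz = 120
frame_period_ms = 1000 / frame_hz             # ~8.33 ms per presented frame
on_time_per_color_ms = 1.2                    # per-color LED illumination time
num_colors = 3

illuminated_ms = on_time_per_color_ms * num_colors   # ~3.6 ms of light per frame
duty_cycle = illuminated_ms / frame_period_ms         # ~0.43, i.e. roughly 40-45%
print(f"field persistence ~{illuminated_ms:.1f} ms, duty cycle ~{duty_cycle:.0%}")
```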

[0126] In contrast to the above example, a display system described herein may achieve a field persistence below a threshold (e.g., 0.4 ms, 0.5 ms, 0.6 ms) while maintaining perceived brightness. In this way, motion blur may be further reduced as compared to prior techniques. Additionally, the display system may optionally achieve a duty cycle greater than a second threshold (e.g., 90%, 95%, 99%).

Example Display System

[0127] FIG. 2 illustrates a conventional display system for simulating three-dimensional imagery for a user. It will be appreciated that a user's eyes are spaced apart and that, when looking at a real object in space, each eye will have a slightly different view of the object and may form an image of the object at different locations on the retina of each eye. This may be referred to as binocular disparity and may be utilized by the human visual system to provide a perception of depth. Conventional display systems simulate binocular disparity by presenting two distinct images 190, 200 with slightly different views of the same virtual object--one for each eye 210, 220--corresponding to the views of the virtual object that would be seen by each eye were the virtual object a real object at a desired depth. These images provide binocular cues that the user's visual system may interpret to derive a perception of depth.

[0128] With continued reference to FIG. 2, the images 190, 200 are spaced from the eyes 210, 220 by a distance 230 on a z-axis. The z-axis is parallel to the optical axis of the viewer with their eyes fixated on an object at optical infinity directly ahead of the viewer. The images 190, 200 are flat and at a fixed distance from the eyes 210, 220. Based on the slightly different views of a virtual object in the images presented to the eyes 210, 220, respectively, the eyes may naturally rotate such that an image of the object falls on corresponding points on the retinas of each of the eyes, to maintain single binocular vision. This rotation may cause the lines of sight of each of the eyes 210, 220 to converge onto a point in space at which the virtual object is perceived to be present. As a result, providing three-dimensional imagery conventionally involves providing binocular cues that may manipulate the vergence of the user's eyes 210, 220, and that the human visual system interprets to provide a perception of depth.

[0129] Generating a realistic and comfortable perception of depth is challenging, however. It will be appreciated that light from objects at different distances from the eyes have wavefronts with different amounts of divergence. FIGS. 3A-3C illustrate relationships between distance and the divergence of light rays. The distance between the object and the eye 210 is represented by, in order of decreasing distance, R1, R2, and R3. As shown in FIGS. 3A-3C, the light rays become more divergent as distance to the object decreases. Conversely, as distance increases, the light rays become more collimated. Stated another way, it may be said that the light field produced by a point (the object or a part of the object) has a spherical wavefront curvature, which is a function of how far away the point is from the eye of the user. The curvature increases with decreasing distance between the object and the eye 210. While only a single eye 210 is illustrated for clarity of illustration in FIGS. 3A-3C and other figures herein, the discussions regarding eye 210 may be applied to both eyes 210 and 220 of a viewer.
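The relationship described here can be stated compactly: the curvature of the spherical wavefront from a point source is the reciprocal of its distance from the eye (in diopters when the distance is in meters). A one-line sketch, with the sample distances chosen only for illustration:

```python
def wavefront_curvature_diopters(distance_m: float) -> float:
    """Curvature of the spherical wavefront from a point source, in diopters."""
    return 1.0 / distance_m

# Decreasing distances R1 > R2 > R3 (as in FIGS. 3A-3C) give increasing curvature:
print([wavefront_curvature_diopters(r) for r in (3.0, 2.0, 1.0)])  # ~[0.33, 0.5, 1.0]
```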

[0130] With continued reference to FIGS. 3A-3C, light from an object that the viewer's eyes are fixated on may have different degrees of wavefront divergence. Due to the different amounts of wavefront divergence, the light may be focused differently by the lens of the eye, which in turn may require the lens to assume different shapes to form a focused image on the retina of the eye. Where a focused image is not formed on the retina, the resulting retinal blur acts as a cue to accommodation that causes a change in the shape of the lens of the eye until a focused image is formed on the retina. For example, the cue to accommodation may trigger the ciliary muscles surrounding the lens of the eye to relax or contract, thereby modulating the force applied to the suspensory ligaments holding the lens, thus causing the shape of the lens of the eye to change until retinal blur of an object of fixation is eliminated or minimized, thereby forming a focused image of the object of fixation on the retina (e.g., fovea) of the eye. The process by which the lens of the eye changes shape may be referred to as accommodation, and the shape of the lens of the eye required to form a focused image of the object of fixation on the retina (e.g., fovea) of the eye may be referred to as an accommodative state.

[0131] With reference now to FIG. 4A, a representation of the accommodation-vergence response of the human visual system is illustrated. The movement of the eyes to fixate on an object causes the eyes to receive light from the object, with the light forming an image on each of the retinas of the eyes. The presence of retinal blur in the image formed on the retina may provide a cue to accommodation, and the relative locations of the image on the retinas may provide a cue to vergence. The cue to accommodation causes accommodation to occur, resulting in the lenses of the eyes each assuming a particular accommodative state that forms a focused image of the object on the retina (e.g., fovea) of the eye. On the other hand, the cue to vergence causes vergence movements (rotation of the eyes) to occur such that the images formed on each retina of each eye are at corresponding retinal points that maintain single binocular vision. In these positions, the eyes may be said to have assumed a particular vergence state. With continued reference to FIG. 4A, accommodation may be understood to be the process by which the eye achieves a particular accommodative state, and vergence may be understood to be the process by which the eye achieves a particular vergence state. As indicated in FIG. 4A, the accommodative and vergence states of the eyes may change if the user fixates on another object. For example, the accommodated state may change if the user fixates on a new object at a different depth on the z-axis.

[0132] Without being limited by theory, it is believed that viewers of an object may perceive the object as being "three-dimensional" due to a combination of vergence and accommodation. As noted above, vergence movements (e.g., rotation of the eyes so that the pupils move toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with accommodation of the lenses of the eyes. Under normal conditions, changing the shapes of the lenses of the eyes to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the "accommodation-vergence reflex." Likewise, a change in vergence will trigger a matching change in lens shape under normal conditions.

[0133] With reference now to FIG. 4B, examples of different accommodative and vergence states of the eyes are illustrated. The pair of eyes 222a is fixated on an object at optical infinity, while the pair of eyes 222b is fixated on an object 221 at less than optical infinity. Notably, the vergence states of each pair of eyes are different, with the pair of eyes 222a directed straight ahead, while the pair of eyes 222b converge on the object 221. The accommodative states of the eyes forming each pair of eyes 222a and 222b are also different, as represented by the different shapes of the lenses 210a, 220a.

[0134] Undesirably, many users of conventional "3-D" display systems find such conventional systems to be uncomfortable or may not perceive a sense of depth at all due to a mismatch between accommodative and vergence states in these displays. As noted above, many stereoscopic or "3-D" display systems display a scene by providing slightly different images to each eye. Such systems are uncomfortable for many viewers, since they, among other things, simply provide different presentations of a scene and cause changes in the vergence states of the eyes, but without a corresponding change in the accommodative states of those eyes. Rather, the images are shown by a display at a fixed distance from the eyes, such that the eyes view all the image information at a single accommodative state. Such an arrangement works against the "accommodation-vergence reflex" by causing changes in the vergence state without a matching change in the accommodative state. This mismatch is believed to cause viewer discomfort. Display systems that provide a better match between accommodation and vergence may form more realistic and comfortable simulations of three-dimensional imagery.

[0135] Without being limited by theory, it is believed that the human eye typically may interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of these limited numbers of depth planes. In some embodiments, the different presentations may provide both cues to vergence and matching cues to accommodation, thereby providing physiologically correct accommodation-vergence matching.

[0136] With continued reference to FIG. 4B, two depth planes 240, corresponding to different distances in space from the eyes 210, 220, are illustrated. For a given depth plane 240, vergence cues may be provided by the displaying of images of appropriately different perspectives for each eye 210, 220. In addition, for a given depth plane 240, light forming the images provided to each eye 210, 220 may have a wavefront divergence corresponding to a light field produced by a point at the distance of that depth plane 240.

[0137] In the illustrated embodiment, the distance, along the z-axis, of the depth plane 240 containing the point 221 is 1 m. As used herein, distances or depths along the z-axis may be measured with a zero-point located at the exit pupils of the user's eyes. Thus, a depth plane 240 located at a depth of 1 m corresponds to a distance of 1 m away from the exit pupils of the user's eyes, on the optical axis of those eyes with the eyes directed towards optical infinity. As an approximation, the depth or distance along the z-axis may be measured from the display in front of the user's eyes (e.g., from the surface of a waveguide), plus a value for the distance between the device and the exit pupils of the user's eyes. That value may be called the eye relief and corresponds to the distance between the exit pupil of the user's eye and the display worn by the user in front of the eye. In practice, the value for the eye relief may be a normalized value used generally for all viewers. For example, the eye relief may be assumed to be 20 mm and a depth plane that is at a depth of 1 m may be at a distance of 980 mm in front of the display.
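
The approximation above is simple arithmetic. The following minimal Python sketch illustrates it using the example values from this paragraph (the function name is a hypothetical helper, not part of the patent):

```python
def depth_plane_distance_from_display(depth_from_eye_m, eye_relief_m=0.020):
    """Approximate a depth plane's distance measured from the display surface.

    The depth plane's nominal depth is referenced to the exit pupil of the
    user's eye; subtracting a normalized eye-relief value gives the distance
    measured from the display (e.g., a waveguide surface) instead.
    """
    return depth_from_eye_m - eye_relief_m

# Example from the text: a 1 m depth plane with a 20 mm eye relief sits
# approximately 0.98 m (980 mm) in front of the display.
print(depth_plane_distance_from_display(1.0))  # 0.98
```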

[0138] With reference now to FIGS. 4C and 4D, examples of matched accommodation-vergence distances and mismatched accommodation-vergence distances are illustrated, respectively. As illustrated in FIG. 4C, the display system may provide images of a virtual object to each eye 210, 220. The images may cause the eyes 210, 220 to assume a vergence state in which the eyes converge on a point 15 on a depth plane 240. In addition, the images may be formed by a light having a wavefront curvature corresponding to real objects at that depth plane 240. As a result, the eyes 210, 220 assume an accommodative state in which the images are in focus on the retinas of those eyes. Thus, the user may perceive the virtual object as being at the point 15 on the depth plane 240.

[0139] It will be appreciated that each of the accommodative and vergence states of the eyes 210, 220 are associated with a particular distance on the z-axis. For example, an object at a particular distance from the eyes 210, 220 causes those eyes to assume particular accommodative states based upon the distances of the object. The distance associated with a particular accommodative state may be referred to as the accommodation distance, A.sub.d. Similarly, there are particular vergence distances, V.sub.d, associated with the eyes in particular vergence states, or positions relative to one another. Where the accommodation distance and the vergence distance match, the relationship between accommodation and vergence may be said to be physiologically correct. This is considered to be the most comfortable scenario for a viewer.

[0140] In stereoscopic displays, however, the accommodation distance and the vergence distance may not always match. For example, as illustrated in FIG. 4D, images displayed to the eyes 210, 220 may be displayed with wavefront divergence corresponding to depth plane 240, and the eyes 210, 220 may assume a particular accommodative state in which the points 15a, 15b on that depth plane are in focus. However, the images displayed to the eyes 210, 220 may provide cues for vergence that cause the eyes 210, 220 to converge on a point 15 that is not located on the depth plane 240. As a result, the accommodation distance corresponds to the distance from the exit pupils of the eyes 210, 220 to the depth plane 240, while the vergence distance corresponds to the larger distance from the exit pupils of the eyes 210, 220 to the point 15, in some embodiments. The accommodation distance is different from the vergence distance. Consequently, there is an accommodation-vergence mismatch. Such a mismatch is considered undesirable and may cause discomfort in the user. It will be appreciated that the mismatch corresponds to distance (e.g., V.sub.d -A.sub.d) and may be characterized using diopters.

[0141] In some embodiments, it will be appreciated that a reference point other than exit pupils of the eyes 210, 220 may be utilized for determining distance for determining accommodation-vergence mismatch, so long as the same reference point is utilized for the accommodation distance and the vergence distance. For example, the distances could be measured from the cornea to the depth plane, from the retina to the depth plane, from the eyepiece (e.g., a waveguide of the display system) to the depth plane, and so on.

[0142] Without being limited by theory, it is believed that users may still perceive accommodation-vergence mismatches of up to about 0.25 diopter, up to about 0.33 diopter, and up to about 0.5 diopter as being physiologically correct, without the mismatch itself causing significant discomfort. In some embodiments, display systems disclosed herein (e.g., the display system 250, FIG. 6) present images to the viewer having accommodation-vergence mismatch of about 0.5 diopter or less. In some other embodiments, the accommodation-vergence mismatch of the images provided by the display system is about 0.33 diopter or less. In yet other embodiments, the accommodation-vergence mismatch of the images provided by the display system is about 0.25 diopter or less, including about 0.1 diopter or less.
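
One common way to quantify the mismatch described in the preceding paragraphs is as the difference of the reciprocal distances, expressed in diopters. The sketch below is illustrative only; the function names and the default 0.5 diopter limit (one of the example bounds given above) are assumptions for the example:

```python
def mismatch_diopters(accommodation_distance_m, vergence_distance_m):
    """Accommodation-vergence mismatch expressed in diopters (1/m).

    Both distances are measured from a common reference point (e.g., the exit
    pupils of the eyes) to the depth plane and to the fixation point.
    """
    return abs(1.0 / accommodation_distance_m - 1.0 / vergence_distance_m)

def is_comfortable(accommodation_distance_m, vergence_distance_m, limit_dpt=0.5):
    """True if the mismatch stays within a tolerance such as about 0.25-0.5 dpt."""
    return mismatch_diopters(accommodation_distance_m, vergence_distance_m) <= limit_dpt

# Wavefront divergence set for a 2 m depth plane, content converged at 3 m:
# |1/2 - 1/3| is about 0.17 dpt, within the example bounds cited above.
print(mismatch_diopters(2.0, 3.0), is_comfortable(2.0, 3.0))
```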

[0143] FIG. 5 illustrates aspects of an approach for simulating three-dimensional imagery by modifying wavefront divergence. The display system includes a waveguide 270 that is configured to receive light 770 that is encoded with image information, and to output that light to the user's eye 210. The waveguide 270 may output the light 650 with a defined amount of wavefront divergence corresponding to the wavefront divergence of a light field produced by a point on a desired depth plane 240. In some embodiments, the same amount of wavefront divergence is provided for all objects presented on that depth plane. In addition, it will be appreciated that the other eye of the user may be provided with image information from a similar waveguide.

[0144] In some embodiments, a single waveguide may be configured to output light with a set amount of wavefront divergence corresponding to a single or limited number of depth planes and/or the waveguide may be configured to output light of a limited range of wavelengths. Consequently, in some embodiments, a plurality or stack of waveguides may be utilized to provide different amounts of wavefront divergence for different depth planes and/or to output light of different ranges of wavelengths. As used herein, it will be appreciated that a depth plane may be planar or may follow the contours of a curved surface.

[0145] FIG. 6 illustrates an example of a waveguide stack for outputting image information to a user. A display system 250 includes a stack of waveguides, or stacked waveguide assembly, 260 that may be utilized to provide three-dimensional perception to the eye/brain using a plurality of waveguides 270, 280, 290, 300, 310. It will be appreciated that the display system 250 may be considered a light field display in some embodiments. In addition, the waveguide assembly 260 may also be referred to as an eyepiece.

[0146] In some embodiments, the display system 250 may be configured to provide substantially continuous cues to vergence and multiple discrete cues to accommodation. The cues to vergence may be provided by displaying different images to each of the eyes of the user, and the cues to accommodation may be provided by outputting the light that forms the images with selectable discrete amounts of wavefront divergence. Stated another way, the display system 250 may be configured to output light with variable levels of wavefront divergence. In some embodiments, each discrete level of wavefront divergence corresponds to a particular depth plane and may be provided by a particular one of the waveguides 270, 280, 290, 300, 310.
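
Because only discrete levels of wavefront divergence are available, virtual content at an arbitrary depth must be assigned to one of the available depth planes. The sketch below is a hypothetical illustration of such an assignment; the diopter values and the nearest-plane rule are assumptions, not details taken from the patent:

```python
# Hypothetical depth planes, in diopters, one per waveguide in the stack.
DEPTH_PLANES_DPT = [0.0, 0.5, 1.0, 1.5, 2.0]

def nearest_depth_plane(content_depth_m):
    """Pick the discrete depth plane (in diopters) closest to the content depth.

    0 dpt corresponds to optical infinity; larger values are closer planes.
    """
    content_dpt = 0.0 if content_depth_m == float("inf") else 1.0 / content_depth_m
    return min(DEPTH_PLANES_DPT, key=lambda plane: abs(plane - content_dpt))

# Content rendered at 0.7 m (about 1.43 dpt) maps to the 1.5 dpt plane here.
print(nearest_depth_plane(0.7))
```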

[0147] With continued reference to FIG. 6, the waveguide assembly 260 may also include a plurality of features 320, 330, 340, 350 between the waveguides. In some embodiments, the features 320, 330, 340, 350 may be one or more lenses.

[0148] The waveguides 270, 280, 290, 300, 310 and/or the plurality of lenses 320, 330, 340, 350 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image injection devices 360, 370, 380, 390, 400 may function as a source of light for the waveguides and may be utilized to inject image information into the waveguides 270, 280, 290, 300, 310, each of which may be configured, as described herein, to distribute incoming light across each respective waveguide, for output toward the eye 210. Light exits an output surface 410, 420, 430, 440, 450 of the image injection devices 360, 370, 380, 390, 400 and is injected into a corresponding input surface 460, 470, 480, 490, 500 of the waveguides 270, 280, 290, 300, 310. In some embodiments, each of the input surfaces 460, 470, 480, 490, 500 may be an edge of a corresponding waveguide, or may be part of a major surface of the corresponding waveguide (that is, one of the waveguide surfaces directly facing the world 510 or the viewer's eye 210). In some embodiments, a single beam of light (e.g. a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 210 at particular angles (and amounts of divergence) corresponding to the depth plane associated with a particular waveguide. In some embodiments, a single one of the image injection devices 360, 370, 380, 390, 400 may be associated with and inject light into a plurality (e.g., three) of the waveguides 270, 280, 290, 300, 310.

[0149] In some embodiments, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other embodiments, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display which may, e.g., pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors (e.g., different component colors, as discussed herein).

[0150] In some embodiments, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projection system 520, which comprises a light module 530, which may include a light emitter, such as a light emitting diode (LED). The light from the light module 530 may be directed to and modified by a light modulator 540, e.g., a spatial light modulator, via a beam splitter 550. The light modulator 540 may be configured to change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310 to encode the light with image information. Examples of spatial light modulators include liquid crystal displays (LCDs), including liquid crystal on silicon (LCOS) displays. In some other embodiments, the spatial light modulator may be a MEMS device, such as a digital light processing (DLP) device. It will be appreciated that the image injection devices 360, 370, 380, 390, 400 are illustrated schematically and, in some embodiments, these image injection devices may represent different light paths and locations in a common projection system configured to output light into associated ones of the waveguides 270, 280, 290, 300, 310. In some embodiments, the waveguides of the waveguide assembly 260 may function as an ideal lens while relaying light injected into the waveguides out to the user's eyes. In this conception, the object may be the spatial light modulator 540 and the image may be the image on the depth plane.

[0151] In some embodiments, the display system 250 may be a scanning fiber display comprising one or more scanning fibers configured to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more waveguides 270, 280, 290, 300, 310 and ultimately to the eye 210 of the viewer. In some embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which are configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more optical fibers may be configured to transmit light from the light module 530 to the one or more waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.

[0152] A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 530, and the light modulator 540. In some embodiments, the controller 560 is part of the local data processing module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to, e.g., any of the various schemes disclosed herein. In some embodiments, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150 (FIG. 9E) in some embodiments.

[0153] With continued reference to FIG. 6, the waveguides 270, 280, 290, 300, 310 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 270, 280, 290, 300, 310 may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces. In the illustrated configuration, the waveguides 270, 280, 290, 300, 310 may each include out-coupling optical elements 570, 580, 590, 600, 610 that are configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 210. Extracted light may also be referred to as out-coupled light, and the out-coupling optical elements may also be referred to as light extracting optical elements. An extracted beam of light may be outputted by the waveguide at locations at which the light propagating in the waveguide strikes a light extracting optical element. The out-coupling optical elements 570, 580, 590, 600, 610 may, for example, be gratings, including diffractive optical features, as discussed further herein. While illustrated disposed at the bottom major surfaces of the waveguides 270, 280, 290, 300, 310, for ease of description and drawing clarity, in some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 may be disposed at the top and/or bottom major surfaces, and/or may be disposed directly in the volume of the waveguides 270, 280, 290, 300, 310, as discussed further herein. In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 270, 280, 290, 300, 310. In some other embodiments, the waveguides 270, 280, 290, 300, 310 may be a monolithic piece of material and the out-coupling optical elements 570, 580, 590, 600, 610 may be formed on a surface and/or in the interior of that piece of material.

[0154] With continued reference to FIG. 6, as discussed herein, each waveguide 270, 280, 290, 300, 310 is configured to output light to form an image corresponding to a particular depth plane. For example, the waveguide 270 nearest the eye may be configured to deliver collimated light (which was injected into such waveguide 270), to the eye 210. The collimated light may be representative of the optical infinity focal plane. The next waveguide up 280 may be configured to send out collimated light which passes through the first lens 350 (e.g., a negative lens) before it may reach the eye 210; such first lens 350 may be configured to create a slight convex wavefront curvature so that the eye/brain interprets light coming from that next waveguide up 280 as coming from a first focal plane closer inward toward the eye 210 from optical infinity. Similarly, the third up waveguide 290 passes its output light through both the first 350 and second 340 lenses before reaching the eye 210; the combined optical power of the first 350 and second 340 lenses may be configured to create another incremental amount of wavefront curvature so that the eye/brain interprets light coming from the third waveguide 290 as coming from a second focal plane that is even closer inward toward the person from optical infinity than was light from the next waveguide up 280.

[0155] The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510 on the other side of the stacked waveguide assembly 260, a compensating lens layer 620 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
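
As a rough illustration of how the aggregate focal power builds up, thin-lens powers add to first order, and the compensating lens 620 supplies the equal and opposite power for world light. The sketch below uses assumed lens powers purely for illustration:

```python
def perceived_depth_dpt(lens_powers_dpt):
    """Apparent depth (in diopters) of collimated light sent through a lens stack.

    To first order, thin-lens powers add; collimated light emerging from a net
    power of -P diopters appears to diverge from a plane 1/P meters away.
    """
    return -sum(lens_powers_dpt)

# Illustrative stack of weak negative lenses between a waveguide and the eye
# (the -0.5 dpt values are assumptions, not figures from the patent).
lens_stack = [-0.5, -0.5, -0.5, -0.5]
for n in range(1, len(lens_stack) + 1):
    print(n, "lenses ->", perceived_depth_dpt(lens_stack[:n]), "dpt")

# World light must be unaffected, so the compensating lens cancels the total power.
print("compensating lens power:", -sum(lens_stack), "dpt")
```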

[0156] In some embodiments, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same depth plane, or multiple subsets of the waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This may provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.

[0157] With continued reference to FIG. 6, the out-coupling optical elements 570, 580, 590, 600, 610 may be configured to both redirect light out of their respective waveguides and to output this light with the appropriate amount of divergence or collimation for a particular depth plane associated with the waveguide. As a result, waveguides having different associated depth planes may have different configurations of out-coupling optical elements 570, 580, 590, 600, 610, which output light with a different amount of divergence depending on the associated depth plane. In some embodiments, the light extracting optical elements 570, 580, 590, 600, 610 may be volumetric or surface features, which may be configured to output light at specific angles. For example, the light extracting optical elements 570, 580, 590, 600, 610 may be volume holograms, surface holograms, and/or diffraction gratings. In some embodiments, the features 320, 330, 340, 350 may not be lenses; rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).

[0158] In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or "diffractive optical element" (also referred to herein as a "DOE"). Preferably, the DOE's have a sufficiently low diffraction efficiency so that only a portion of the light of the beam is deflected away toward the eye 210 with each intersection of the DOE, while the rest continues to move through a waveguide via TIR. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations and the result is a fairly uniform pattern of exit emission toward the eye 210 for this particular collimated beam bouncing around within a waveguide.
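
The effect of a low diffraction efficiency can be seen numerically: at each intersection with the DOE a fixed fraction of the remaining guided light exits, so successive exit beams decay only gradually. The 5% efficiency and the number of intersections below are illustrative assumptions:

```python
def exit_beam_intensities(diffraction_efficiency=0.05, num_intersections=10):
    """Relative intensity of each exit beam as light bounces along the waveguide.

    At each intersection with the DOE a fraction of the remaining guided light is
    deflected toward the eye; the rest continues to propagate by TIR.
    """
    remaining = 1.0
    exits = []
    for _ in range(num_intersections):
        exits.append(remaining * diffraction_efficiency)
        remaining *= (1.0 - diffraction_efficiency)
    return exits

# With a 5% efficiency, adjacent exit beams differ by only about 5%, giving the
# fairly uniform exit pattern described above.
print([round(e, 4) for e in exit_beam_intensities()])
```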

[0159] In some embodiments, one or more DOEs may be switchable between "on" states in which they actively diffract, and "off" states in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet may be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).

[0160] In some embodiments, a camera assembly 630 (e.g., a digital camera, including visible light and infrared light cameras) may be provided to capture images of the eye 210 and/or tissue around the eye 210 to, e.g., detect user inputs and/or to monitor the physiological state of the user. As used herein, a camera may be any image capture device. In some embodiments, the camera assembly 630 may include an image capture device and a light source to project light (e.g., infrared light) to the eye, which may then be reflected by the eye and detected by the image capture device. In some embodiments, the camera assembly 630 may be attached to the frame or support structure 80 (FIG. 9E) and may be in electrical communication with the processing modules 140 and/or 150, which may process image information from the camera assembly 630. In some embodiments, one camera assembly 630 may be utilized for each eye, to separately monitor each eye.

[0161] The camera assembly 630 may, in some embodiments, observe movements of the user, such as the user's eye movements. As an example, the camera assembly 630 may capture images of the eye 210 to determine the size, position, and/or orientation of the pupil of the eye 210 (or some other structure of the eye 210). The camera assembly 630 may, if desired, obtain images (processed by processing circuitry of the type described herein) used to determine the direction the user is looking (e.g., eye pose or gaze direction). In some embodiments, camera assembly 630 may include multiple cameras, at least one of which may be utilized for each eye, to separately determine the eye pose or gaze direction of each eye independently. The camera assembly 630 may, in some embodiments and in combination with processing circuitry such as the controller 560 or the local data processing module 140, determine eye pose or gaze direction based on glints (e.g., reflections) of reflected light (e.g., infrared light) from a light source included in camera assembly 630.

[0162] With reference now to FIG. 7, an example of exit beams outputted by a waveguide is shown. One waveguide is illustrated, but it will be appreciated that other waveguides in the waveguide assembly 260 (FIG. 6) may function similarly, where the waveguide assembly 260 includes multiple waveguides. Light 640 is injected into the waveguide 270 at the input surface 460 of the waveguide 270 and propagates within the waveguide 270 by TIR. At points where the light 640 impinges on the DOE 570, a portion of the light exits the waveguide as exit beams 650. The exit beams 650 are illustrated as substantially parallel but, as discussed herein, they may also be redirected to propagate to the eye 210 at an angle (e.g., forming divergent exit beams), depending on the depth plane associated with the waveguide 270. It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with out-coupling optical elements that out-couple light to form images that appear to be set on a depth plane at a large distance (e.g., optical infinity) from the eye 210. Other waveguides or other sets of out-coupling optical elements may output an exit beam pattern that is more divergent, which would require the eye 210 to accommodate to a closer distance to bring it into focus on the retina and would be interpreted by the brain as light from a distance closer to the eye 210 than optical infinity.

[0163] In some embodiments, a full color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors. FIG. 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors. The illustrated embodiment shows depth planes 240a-240f, although more or fewer depths are also contemplated. Each depth plane may have three or more component color images associated with it, including: a first image of a first color, G; a second image of a second color, R; and a third image of a third color, B. Different depth planes are indicated in the figure by different numbers for diopters (dpt) following the letters G, R, and B. Just as examples, the numbers following each of these letters indicate diopters (1/m), or inverse distance of the depth plane from a viewer, and each box in the figures represents an individual component color image. In some embodiments, to account for differences in the eye's focusing of light of different wavelengths, the exact placement of the depth planes for different component colors may vary. For example, different component color images for a given depth plane may be placed on depth planes corresponding to different distances from the user. Such an arrangement may increase visual acuity and user comfort and/or may decrease chromatic aberrations.
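
Since the diopter labels are simply inverse distances, converting between the two is a one-line calculation. In the sketch below, the small per-color offsets are purely hypothetical values chosen to illustrate placing component colors on slightly different depth planes:

```python
def diopters_to_distance_m(dpt):
    """Convert a depth-plane label in diopters (1/m) to a viewing distance in meters."""
    return float("inf") if dpt == 0 else 1.0 / dpt

# Hypothetical per-color adjustments (in diopters) to compensate for the eye's
# different focusing of different wavelengths; the values are illustrative only.
COLOR_OFFSET_DPT = {"G": 0.0, "R": -0.02, "B": +0.02}

def component_color_plane_dpt(nominal_dpt, color):
    return nominal_dpt + COLOR_OFFSET_DPT[color]

# A nominal 1 dpt (1 m) depth plane with slightly different R/G/B placements.
for color in ("G", "R", "B"):
    dpt = component_color_plane_dpt(1.0, color)
    print(color, dpt, round(diopters_to_distance_m(dpt), 3))
```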

[0164] In some embodiments, light of each component color may be outputted by a single dedicated waveguide and, consequently, each depth plane may have multiple waveguides associated with it. In such embodiments, each box in the figures including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in this drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other embodiments, multiple component colors may be outputted by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.

[0165] With continued reference to FIG. 8, in some embodiments, G is the color green, R is the color red, and B is the color blue. In some other embodiments, other colors associated with other wavelengths of light, including magenta and cyan, may be used in addition to or may replace one or more of red, green, or blue.

[0166] It will be appreciated that references to a given color of light throughout this disclosure will be understood to encompass light of one or more wavelengths within a range of wavelengths of light that are perceived by a viewer as being of that given color. For example, red light may include light of one or more wavelengths in the range of about 620-780 nm, green light may include light of one or more wavelengths in the range of about 492-577 nm, and blue light may include light of one or more wavelengths in the range of about 435-493 nm.
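
The approximate ranges above translate directly into a lookup; the sketch below simply encodes the ranges stated in this paragraph (the function name is a hypothetical helper):

```python
# Approximate perceptual color ranges from the paragraph above, in nanometers.
COLOR_RANGES_NM = {
    "red": (620, 780),
    "green": (492, 577),
    "blue": (435, 493),
}

def perceived_color(wavelength_nm):
    """Return the component color whose approximate range contains the wavelength."""
    for color, (low, high) in COLOR_RANGES_NM.items():
        if low <= wavelength_nm <= high:
            return color
    return None  # outside the listed visible ranges (e.g., infrared or ultraviolet)

print(perceived_color(532))  # "green"
print(perceived_color(650))  # "red"
print(perceived_color(940))  # None (infrared, cf. the following paragraph)
```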

[0167] In some embodiments, the light source 530 (FIG. 6) may be configured to emit light of one or more wavelengths outside the visual perception range of the viewer, for example, infrared and/or ultraviolet wavelengths. In addition, the in-coupling, out-coupling, and other light redirecting structures of the waveguides of the display 250 may be configured to direct and emit this light out of the display towards the user's eye 210, e.g., for imaging and/or user stimulation applications.

[0168] With reference now to FIG. 9A, in some embodiments, light impinging on a waveguide may need to be redirected to in-couple that light into the waveguide. An in-coupling optical element may be used to redirect and in-couple the light into its corresponding waveguide. FIG. 9A illustrates a cross-sectional side view of an example of a plurality or set 660 of stacked waveguides that each includes an in-coupling optical element. The waveguides may each be configured to output light of one or more different wavelengths, or one or more different ranges of wavelengths. It will be appreciated that the stack 660 may correspond to the stack 260 (FIG. 6) and the illustrated waveguides of the stack 660 may correspond to part of the plurality of waveguides 270, 280, 290, 300, 310, except that light from one or more of the image injection devices 360, 370, 380, 390, 400 is injected into the waveguides from a position that requires light to be redirected for in-coupling.

[0169] The illustrated set 660 of stacked waveguides includes waveguides 670, 680, and 690. Each waveguide includes an associated in-coupling optical element (which may also be referred to as a light input area on the waveguide), with, e.g., in-coupling optical element 700 disposed on a major surface (e.g., an upper major surface) of waveguide 670, in-coupling optical element 710 disposed on a major surface (e.g., an upper major surface) of waveguide 680, and in-coupling optical element 720 disposed on a major surface (e.g., an upper major surface) of waveguide 690. In some embodiments, one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690 (particularly where the one or more in-coupling optical elements are reflective, deflecting optical elements). As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some embodiments, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690. In some embodiments, as discussed herein, the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light, while transmitting other wavelengths of light. While illustrated on one side or corner of their respective waveguide 670, 680, 690, it will be appreciated that the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some embodiments.

[0170] As illustrated, the in-coupling optical elements 700, 710, 720 may be laterally offset from one another, as seen in the illustrated head-on view in a direction of light propagating to these in-coupling optical elements. In some embodiments, each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element. For example, each in-coupling optical element 700, 710, 720 may be configured to receive light from a different image injection device 360, 370, 380, 390, and 400 as shown in FIG. 6, and may be separated (e.g., laterally spaced apart) from other in-coupling optical elements 700, 710, 720 such that it substantially does not receive light from the other ones of the in-coupling optical elements 700, 710, 720.

[0171] Each waveguide also includes associated light distributing elements, with, e.g., light distributing elements 730 disposed on a major surface (e.g., a top major surface) of waveguide 670, light distributing elements 740 disposed on a major surface (e.g., a top major surface) of waveguide 680, and light distributing elements 750 disposed on a major surface (e.g., a top major surface) of waveguide 690. In some other embodiments, the light distributing elements 730, 740, 750, may be disposed on a bottom major surface of associated waveguides 670, 680, 690, respectively. In some other embodiments, the light distributing elements 730, 740, 750, may be disposed on both top and bottom major surfaces of associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750, may be disposed on different ones of the top and bottom major surfaces in different associated waveguides 670, 680, 690, respectively.

[0172] The waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material. For example, as illustrated, layer 760a may separate waveguides 670 and 680; and layer 760b may separate waveguides 680 and 690. In some embodiments, the layers 760a and 760b are formed of low refractive index materials (that is, materials having a lower refractive index than the material forming the immediately adjacent one of waveguides 670, 680, 690). Preferably, the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690. Advantageously, the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide). In some embodiments, the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
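
The role of the low-index cladding can be made concrete with the standard total-internal-reflection relation, sin(theta_c) = n_cladding / n_waveguide. The index values used below are illustrative assumptions, not figures from the patent:

```python
import math

def tir_critical_angle_deg(n_core, n_clad):
    """Critical angle (degrees) above which light stays guided in the waveguide core.

    A cladding (or air gap) with a lower refractive index than the waveguide
    material gives a smaller critical angle, making TIR easier to sustain.
    """
    if n_clad >= n_core:
        raise ValueError("TIR requires the cladding index to be below the core index")
    return math.degrees(math.asin(n_clad / n_core))

# Illustrative indices: a 1.7-index waveguide with a cladding 0.10 lower,
# versus the same waveguide against an air gap.
print(round(tir_critical_angle_deg(1.7, 1.6), 1))  # about 70.3 degrees
print(round(tir_critical_angle_deg(1.7, 1.0), 1))  # about 36.0 degrees
```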

[0173] Preferably, for ease of manufacturing and other considerations, the materials forming the waveguides 670, 680, 690 are similar or the same, and the materials forming the layers 760a, 760b are similar or the same. In some embodiments, the material forming the waveguides 670, 680, 690 may be different between one or more waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the various refractive index relationships noted above.

[0174] With continued reference to FIG. 9A, light rays 770, 780, 790 are incident on the set 660 of waveguides. It will be appreciated that the light rays 770, 780, 790 may be injected into the waveguides 670, 680, 690 by one or more image injection devices 360, 370, 380, 390, 400 (FIG. 6).

[0175] In some embodiments, the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors. The in-coupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR. In some embodiments, the in-coupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and associated in-coupling optical element.

[0176] For example, in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different second and third wavelengths or ranges of wavelengths, respectively. The transmitted ray 780 impinges on and is deflected by the in-coupling optical element 710, which is configured to deflect light of a second wavelength or range of wavelengths. The ray 790 is deflected by the in-coupling optical element 720, which is configured to selectively deflect light of a third wavelength or range of wavelengths.

[0177] With continued reference to FIG. 9A, the deflected light rays 770, 780, 790 are deflected so that they propagate through a corresponding waveguide 670, 680, 690; that is, the in-coupling optical element 700, 710, 720 of each waveguide deflects light into its corresponding waveguide 670, 680, 690 to in-couple light into that waveguide. The light rays 770, 780, 790 are deflected at angles that cause the light to propagate through the respective waveguide 670, 680, 690 by TIR. The light rays 770, 780, 790 propagate through the respective waveguide 670, 680, 690 by TIR until impinging on the waveguide's corresponding light distributing elements 730, 740, 750.

[0178] With reference now to FIG. 9B, a perspective view of an example of the plurality of stacked waveguides of FIG. 9A is illustrated. As noted above, the in-coupled light rays 770, 780, 790, are deflected by the in-coupling optical elements 700, 710, 720, respectively, and then propagate by TIR within the waveguides 670, 680, 690, respectively. The light rays 770, 780, 790 then impinge on the light distributing elements 730, 740, 750, respectively. The light distributing elements 730, 740, 750 deflect the light rays 770, 780, 790 so that they propagate towards the out-coupling optical elements 800, 810, 820, respectively.

[0179] In some embodiments, the light distributing elements 730, 740, 750 are orthogonal pupil expanders (OPE's). In some embodiments, the OPE's deflect or distribute light to the out-coupling optical elements 800, 810, 820 and, in some embodiments, may also increase the beam or spot size of this light as it propagates to the out-coupling optical elements. In some embodiments, the light distributing elements 730, 740, 750 may be omitted and the in-coupling optical elements 700, 710, 720 may be configured to deflect light directly to the out-coupling optical elements 800, 810, 820. For example, with reference to FIG. 9A, the light distributing elements 730, 740, 750 may be replaced with out-coupling optical elements 800, 810, 820, respectively. In some embodiments, the out-coupling optical elements 800, 810, 820 are exit pupils (EP's) or exit pupil expanders (EPE's) that direct light into a viewer's eye 210 (FIG. 7). It will be appreciated that the OPE's may be configured to increase the dimensions of the eye box in at least one axis and the EPE's may be configured to increase the eye box in an axis crossing, e.g., orthogonal to, the axis of the OPEs. For example, each OPE may be configured to redirect a portion of the light striking the OPE to an EPE of the same waveguide, while allowing the remaining portion of the light to continue to propagate down the waveguide. Upon impinging on the OPE again, another portion of the remaining light is redirected to the EPE, and the remaining portion of that portion continues to propagate further down the waveguide, and so on. Similarly, upon striking the EPE, a portion of the impinging light is directed out of the waveguide towards the user, and a remaining portion of that light continues to propagate through the waveguide until it strikes the EPE again, at which time another portion of the impinging light is directed out of the waveguide, and so on. Consequently, a single beam of in-coupled light may be "replicated" each time a portion of that light is redirected by an OPE or EPE, thereby forming a field of cloned beams of light, as shown in FIG. 6. In some embodiments, the OPE and/or EPE may be configured to modify a size of the beams of light.
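
The beam replication described above can be sketched as repeated fractional splitting along two axes: each OPE interaction sends a portion of the guided light toward the EPE, and each EPE interaction sends a portion out toward the eye. The splitting fractions and interaction counts below are assumptions chosen only to illustrate the resulting grid of cloned beams:

```python
def replicated_out_coupled_beams(ope_fraction=0.1, epe_fraction=0.1,
                                 ope_bounces=8, epe_bounces=8):
    """Relative intensities of the out-coupled beam grid formed by an OPE and EPE.

    Each OPE interaction redirects a fraction of the remaining light toward the
    EPE; each EPE interaction directs a fraction of that light out to the viewer.
    The result is a 2-D grid of cloned beams expanding the eye box on two axes.
    """
    grid = []
    remaining_in_ope = 1.0
    for _ in range(ope_bounces):
        to_epe = remaining_in_ope * ope_fraction
        remaining_in_ope -= to_epe
        row = []
        remaining_in_epe = to_epe
        for _ in range(epe_bounces):
            out = remaining_in_epe * epe_fraction
            remaining_in_epe -= out
            row.append(out)
        grid.append(row)
    return grid

beams = replicated_out_coupled_beams()
print(len(beams) * len(beams[0]), "cloned beams;",
      "brightest:", round(max(max(row) for row in beams), 4))
```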

[0180] Accordingly, with reference to FIGS. 9A and 9B, in some embodiments, the set 660 of waveguides includes waveguides 670, 680, 690; in-coupling optical elements 700, 710, 720; light distributing elements (e.g., OPE's) 730, 740, 750; and out-coupling optical elements (e.g., EP's) 800, 810, 820 for each component color. The waveguides 670, 680, 690 may be stacked with an air gap/cladding layer between each one. The in-coupling optical elements 700, 710, 720 redirect or deflect incident light (with different in-coupling optical elements receiving light of different wavelengths) into its waveguide. The light then propagates at an angle which will result in TIR within the respective waveguide 670, 680, 690. In the example shown, light ray 770 (e.g., blue light) is deflected by the first in-coupling optical element 700, and then continues to bounce down the waveguide, interacting with the light distributing element (e.g., OPE's) 730 and then the out-coupling optical element (e.g., EPs) 800, in a manner described earlier. The light rays 780 and 790 (e.g., green and red light, respectively) will pass through the waveguide 670, with light ray 780 impinging on and being deflected by in-coupling optical element 710. The light ray 780 then bounces down the waveguide 680 via TIR, proceeding on to its light distributing element (e.g., OPEs) 740 and then the out-coupling optical element (e.g., EP's) 810. Finally, light ray 790 (e.g., red light) passes through the waveguide 690 to impinge on the light in-coupling optical elements 720 of the waveguide 690. The light in-coupling optical elements 720 deflect the light ray 790 such that the light ray propagates to light distributing element (e.g., OPEs) 750 by TIR, and then to the out-coupling optical element (e.g., EPs) 820 by TIR. The out-coupling optical element 820 then finally out-couples the light ray 790 to the viewer, who also receives the out-coupled light from the other waveguides 670, 680.

[0181] FIG. 9C illustrates a top-down plan view of an example of the plurality of stacked waveguides of FIGS. 9A and 9B. It will be appreciated that this top-down view may also be referred to as a head-on view, as seen in the direction of propagation of light towards the in-coupling optical elements 700, 710, 720; that is, the top-down view is a view of the waveguides with image light incident normal to the page. As illustrated, the waveguides 670, 680, 690, along with each waveguide's associated light distributing element 730, 740, 750 and associated out-coupling optical element 800, 810, 820, may be vertically aligned. However, as discussed herein, the in-coupling optical elements 700, 710, 720 are not vertically aligned; rather, the in-coupling optical elements are preferably non-overlapping (e.g., laterally spaced apart as seen in the top-down view). As discussed further herein, this nonoverlapping spatial arrangement facilitates the injection of light from different sources into different waveguides on a one-to-one basis, thereby allowing a specific light source to be uniquely coupled to a specific waveguide. In some embodiments, arrangements including nonoverlapping spatially-separated in-coupling optical elements may be referred to as a shifted pupil system, and the in-coupling optical elements within these arrangements may correspond to sub-pupils.

[0182] It will be appreciated that the spatially overlapping areas may have lateral overlap of 70% or more, 80% or more, or 90% or more of their areas, as seen in the top-down view. On the other hand, the laterally shifted areas may have less than 30% overlap, less than 20% overlap, or less than 10% overlap of their areas, as seen in the top-down view. In some embodiments, laterally shifted areas have no overlap.

[0183] FIG. 9D illustrates a top-down plan view of another example of a plurality of stacked waveguides. As illustrated, the waveguides 670, 680, 690 may be vertically aligned. However, in comparison to the configuration of FIG. 9C, separate light distributing elements 730, 740, 750 and associated out-coupling optical elements 800, 810, 820 are omitted. Instead, light distributing elements and out-coupling optical elements are effectively superimposed and occupy the same area as seen in the top-down view. In some embodiments, light distributing elements (e.g., OPE's) may be disposed on one major surface of the waveguides 670, 680, 690 and out-coupling optical elements (e.g., EPE's) may be disposed on the other major surface of those waveguides. Thus, each waveguide 670, 680, 690 may have superimposed light distributing and out coupling optical elements, collectively referred to as combined OPE/EPE's 1281, 1282, 1283, respectively. Further details regarding such combined OPE/EPE's may be found in U.S. application Ser. No. 16/221,359, filed on Dec. 14, 2018, the entire disclosure of which is incorporated by reference herein. The in-coupling optical elements 700, 710, 720 in-couple and direct light to the combined OPE/EPE's 1281, 1282, 1283, respectively. In some embodiments, as illustrated, the in-coupling optical elements 700, 710, 720 may be laterally shifted (e.g., they are laterally spaced apart as seen in the illustrated top-down view) to have a shifted pupil spatial arrangement. As with the configuration of FIG. 9C, this laterally-shifted spatial arrangement facilitates the injection of light of different wavelengths (e.g., from different light sources) into different waveguides on a one-to-one basis.

[0184] FIG. 9E illustrates an example of wearable display system 60 into which the various waveguides and related systems disclosed herein may be integrated. In some embodiments, the display system 60 is the system 250 of FIG. 6, with FIG. 6 schematically showing some parts of that system 60 in greater detail. For example, the waveguide assembly 260 of FIG. 6 may be part of the display 70.

[0185] With continued reference to FIG. 9E, the display system 60 includes a display 70, and various mechanical and electronic modules and systems to support the functioning of that display 70. The display 70 may be coupled to a frame 80, which is wearable by a display system user or viewer 90 and which is configured to position the display 70 in front of the eyes of the user 90. The display 70 may be considered eyewear in some embodiments. The display 70 may include one or more waveguides, such as the waveguide 270, configured to relay in-coupled image light and to output that image light to an eye of the user 90. In some embodiments, a speaker 100 is coupled to the frame 80 and configured to be positioned adjacent the ear canal of the user 90 (in some embodiments, another speaker, not shown, may optionally be positioned adjacent the other ear canal of the user to provide stereo/shapeable sound control). The display system 60 may also include one or more microphones 110 or other devices to detect sound. In some embodiments, the microphone is configured to allow the user to provide inputs or commands to the system 60 (e.g., the selection of voice menu commands, natural language questions, etc.), and/or may allow audio communication with other persons (e.g., with other users of similar display systems). The microphone may further be configured as a peripheral sensor to collect audio data (e.g., sounds from the user and/or environment). In some embodiments, the display system 60 may further include one or more outwardly-directed environmental sensors 112 configured to detect objects, stimuli, people, animals, locations, or other aspects of the world around the user. For example, environmental sensors 112 may include one or more cameras, which may be located, for example, facing outward so as to capture images similar to at least a portion of an ordinary field of view of the user 90. In some embodiments, the display system may also include a peripheral sensor 120a, which may be separate from the frame 80 and attached to the body of the user 90 (e.g., on the head, torso, an extremity, etc. of the user 90). The peripheral sensor 120a may be configured to acquire data characterizing a physiological state of the user 90 in some embodiments. For example, the sensor 120a may be an electrode.

[0186] With continued reference to FIG. 9E, the display 70 is operatively coupled by communications link 130, such as by a wired lead or wireless connectivity, to a local data processing module 140 which may be mounted in a variety of configurations, such as fixedly attached to the frame 80, fixedly attached to a helmet or hat worn by the user, embedded in headphones, or otherwise removably attached to the user 90 (e.g., in a backpack-style configuration, in a belt-coupling style configuration). Similarly, the sensor 120a may be operatively coupled by communications link 120b, e.g., a wired lead or wireless connectivity, to the local processor and data module 140. The local processing and data module 140 may comprise a hardware processor, as well as digital memory, such as non-volatile memory (e.g., flash memory or hard disk drives), both of which may be utilized to assist in the processing, caching, and storage of data. Optionally, the local processor and data module 140 may include one or more central processing units (CPUs), graphics processing units (GPUs), dedicated processing hardware, and so on. The data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), such as image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, gyros, and/or other sensors disclosed herein; and/or b) acquired and/or processed using remote processing module 150 and/or remote data repository 160 (including data relating to virtual content), possibly for passage to the display 70 after such processing or retrieval. The local processing and data module 140 may be operatively coupled by communication links 170, 180, such as via wired or wireless communication links, to the remote processing module 150 and remote data repository 160 such that these remote modules 150, 160 are operatively coupled to each other and available as resources to the local processing and data module 140. In some embodiments, the local processing and data module 140 may include one or more of the image capture devices, microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, and/or gyros. In some other embodiments, one or more of these sensors may be attached to the frame 80, or may be standalone structures that communicate with the local processing and data module 140 by wired or wireless communication pathways.

[0187] With continued reference to FIG. 9E, in some embodiments, the remote processing module 150 may comprise one or more processors configured to analyze and process data and/or image information, for instance including one or more central processing units (CPUs), graphics processing units (GPUs), dedicated processing hardware, and so on. In some embodiments, the remote data repository 160 may comprise a digital data storage facility, which may be available through the internet or other networking configuration in a "cloud" resource configuration. In some embodiments, the remote data repository 160 may include one or more remote servers, which provide information, e.g., information for generating virtual content, to the local processing and data module 140 and/or the remote processing module 150. In some embodiments, all data is stored and all computations are performed in the local processing and data module, allowing fully autonomous use from a remote module. Optionally, an outside system (e.g., a system of one or more processors, one or more computers) that includes CPUs, GPUs, and so on, may perform at least a portion of processing (e.g., generating image information, processing data) and provide information to, and receive information from, modules 140, 150, 160, for instance via wireless or wired connections.

[0188] FIG. 10 illustrates an example of a wearable display system with a light projection system 910 having a spatial light modulator 930 and a separate light source 940. The light source 940 may comprise one or more light emitters and illuminates the spatial light modulator (SLM) 930. A lens structure 960 may be used to focus the light from the light source 940 onto the SLM 930. A beam splitter (e.g., a polarizing beam splitter (PBS)) 950 reflects light from the light source 940 to the spatial light modulator 930, which reflects and modulates the light. The reflected modulated light, also referred to as image light, then propagates through the beam splitter 950 to the eyepiece 920. Another lens structure, projection optics 970, may be utilized to converge or focus the image light onto the eyepiece 920. The eyepiece 920 may include one or more waveguides that relay the modulated light to the eye 210.

[0189] As noted herein, the separate light source 940 and associated lens structure 960 may undesirably add weight and size to the wearable display system. This may decrease the comfort of the display system, particularly for a user wearing the display system for an extended duration.

[0190] In addition, the light source 940 in conjunction with the SLM 930 may consume energy inefficiently. For example, the light source 940 may illuminate the entirety of the SLM 930. The SLM 930 then selectively reflects light towards the eyepiece 920; thus, not all the light produced by the light source 940 is utilized to form an image. Some of this light, e.g., light corresponding to dark regions of an image, is not reflected to the eyepiece 920. As a result, the light source 940 utilizes energy to generate light to illuminate the entirety of the SLM 930, but only a fraction of this light may be needed to form some images.
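
A rough way to see this inefficiency is to compare the light actually reflected toward the eyepiece with the light needed to flood the whole panel. The sketch below is a simplified model under stated assumptions (uniform illumination, reflected light proportional to pixel value); it is not an analysis from the patent:

```python
def illumination_utilization(frame):
    """Fraction of a full-panel illumination actually reflected toward the eyepiece.

    `frame` is a 2-D list of pixel values in [0, 1]. A reflective SLM must be
    flooded with light regardless of content, so dark pixels waste illumination.
    """
    total = sum(sum(row) for row in frame)
    pixels = sum(len(row) for row in frame)
    return total / pixels

# A mostly dark frame with a small bright region uses only a small fraction of
# the light generated to illuminate the entire SLM.
frame = [[0.0] * 100 for _ in range(100)]
for r in range(40, 50):
    for c in range(40, 50):
        frame[r][c] = 1.0
print(illumination_utilization(frame))  # 0.01 -> about 99% of the illumination unused
```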

[0191] Moreover, as noted herein, in some cases, the SLM 930 may modulate light using a micro-mirror to selectively reflect incident light, or using liquid crystal molecules that modify the amount of light reflected from an underlying mirror. As a result, such devices require physical movement of optical elements (e.g., micro-mirrors or liquid crystal molecules, such as in LCoS or DLP panels, respectively) in order to modulate light from the light source 940. The physical movement required to modulate light to encode the light with image information, e.g., corresponding to a pixel, may occur at relatively slow speeds in comparison to, e.g., the ability to turn an LED or OLED "on" or "off". This relatively slow movement may limit the frame rate of the display system and may be visible as, e.g., motion blur, color-breakup, and/or presented images that are mismatched with the pose of the user's head or changes in said pose.

[0192] Thus, as discussed herein, light encoded with image information may be outputted by a projector system utilizing reflective spatial light modulators such as LCoS or DLP panels, which modulate light from one or more different light sources. In some other embodiments, the modulator may be a light source. In some embodiments, the spatial light modulator may be an emissive spatial light modulator, such as an emissive micro-display (for example, a micro-LED array). Wearable displays utilizing emissive micro-displays, as disclosed herein, may be particularly advantageous in facilitating wearable display systems that have a relatively low weight and bulkiness, high energy efficiency, and high frame rate, with low motion blur and low motion-to-photon latency. In addition, in comparison to scanning fiber displays, the emissive micro-displays may avoid artifacts caused by the use of coherent light sources.

[0193] With reference now to FIG. 11A, an example is illustrated of a wearable display system with a light projection system 1010 having multiple emissive micro-displays 1030a, 1030b, 1030c. Light from the micro-displays 1030a, 1030b, 1030c is combined by an optical combiner 1050 and directed towards an eyepiece 1020, which relays the light to the eye 210 of a user. Projection optics 1070 may be provided between the optical combiner 1050 and the eyepiece 1020. In some embodiments, the eyepiece 1020 may be a waveguide assembly comprising one or more waveguides. In some embodiments, the light projection system 1010 and the eyepiece 1020 may be supported by (e.g., attached to) the frame 80 (FIG. 9E).

[0194] In some embodiments, the micro-displays 1030a, 1030b, 1030c may be monochrome micro-displays, with each monochrome micro-display outputting light of a different component color to provide a monochrome image. As discussed herein, the monochrome images combine to form a full-color image.

[0195] In some other embodiments, the micro-displays 1030a, 1030b, 1030c may each be full-color displays configured to output light of all component colors. For example, the micro-displays 1030a, 1030b, 1030c may each include red, green, and blue light emitters. The micro-displays 1030a, 1030b, 1030c may be identical and may display the same image. However, utilizing multiple micro-displays may provide advantages for increasing the brightness and the dynamic range of the image, by combining the light from the multiple micro-displays to form a single image. In some embodiments, two or more (e.g., three) micro-displays may be utilized, with the optical combiner 1050 configured to combine light from all of these micro-displays.

[0196] The micro-displays may comprise an array of light emitters. Examples of light emitters include organic light-emitting diodes (OLEDs) and micro-light-emitting diodes (micro-LEDs). It will be appreciated that OLEDs utilize organic material to emit light and micro-LEDs utilize inorganic material to emit light. Advantageously, some micro-LEDs provide higher luminance and higher efficiency (in terms of lux/W) than OLEDs. In some embodiments, the micro-displays are preferably micro-LED displays.

[0197] With continued reference to FIG. 11A, the micro-displays 1030a, 1030b, 1030c may each be configured to emit image light 1032a, 1032b, 1032c. Where the micro-displays are monochrome micro-displays, the image light 1032a, 1032b, 1032c may each be of a different component color. The optical combiner 1050 receives the image light 1032a, 1032b, 1032c and effectively combines this light such that the light propagates generally in the same direction, e.g., toward the projection optics 1070. In some embodiments, the optical combiner 1050 may be a dichroic X-cube prism having reflective internal surfaces that redirect the image light 1032a, 1032b, 1032c to the projection optics 1070. It will be appreciated that the projection optics 1070 may be a lens structure comprising one or more lenses which converge or focus image light onto the eyepiece 1020. The eyepiece 1020 then relays the image light 1032a, 1032b, 1032c to the eye 210.

[0198] In some embodiments, the eyepiece 1020 may comprise a plurality of stacked waveguides 1020a, 1020b, 1020c, each of which has a respective in-coupling optical element 1022a, 1022b, 1022c. In some embodiments, the number of waveguides is proportional to the number of component colors provided by the micro-displays 1030a, 1030b, 1030c. For example, where there are three component colors, the eyepiece 1020 may include a single set of three waveguides or multiple sets of three waveguides each. In some embodiments, each set may output light with wavefront divergence corresponding to a particular depth plane, as discussed herein. It will be appreciated that the waveguides 1020a, 1020b, 1020c and the in-coupling optical elements 1022a, 1022b, 1022c may correspond to the waveguides 670, 680, 690 and the in-coupling optical elements 700, 710, 720, respectively, of FIGS. 9A-9C. As viewed from the projection optics 1070, the in-coupling optical elements 1022a, 1022b, 1022c may be laterally shifted, such that they at least partly do not overlap as seen in such a view.

[0199] As illustrated, the various in-coupling optical elements disclosed herein (e.g., the in-coupling optical element 1022a, 1022b, 1022c) may be disposed on a major surface of an associated waveguide (e.g., waveguides 1020a, 1020b, 1020c, respectively). In addition, as also illustrated, the major surface on which a given in-coupling optical element is disposed may be the rear surface of the waveguide. In such a configuration, the in-coupling optical element may be a reflective light redirecting element, which in-couples light by reflecting the light at angles which support TIR through the associated waveguide. In some other configurations, the in-coupling optical element may be disposed on the forward surface of the waveguide (closer to the projection optics 1070 than the rearward surface). In such configurations, the in-coupling optical element may be a transmissive light redirecting element, which in-couples light by changing the direction of propagation of light as the light is transmitted through the in-coupling optical element. It will be appreciated that any of the in-coupling optical elements disclosed herein may be reflective or transmissive in-coupling optical elements.

[0200] With continued reference to FIG. 11A, image light 1032a, 1032b, 1032c from different ones of the micro-displays 1030a, 1030b, 1030c may take different paths to the eyepiece 1020, such that they impinge on different ones of the in-coupling optical elements 1022a, 1022b, 1022c. Where the image light 1032a, 1032b, 1032c includes light of different component colors, the associated in-coupling optical elements 1022a, 1022b, 1022c, respectively, may be configured to selectively in-couple light of different wavelengths, as discussed above regarding, e.g., the in-coupling optical elements 700, 710, 720 of FIGS. 9A-9C.

[0201] With continued reference to FIG. 11A, the optical combiner 1050 may be configured to redirect the image light 1032a, 1032b, 1032c emitted by the micro-displays 1030a, 1030b, 1030c such that the image light propagates along different optical paths, in order to impinge on the appropriate associated one of the in-coupling optical elements 1022a, 1022b, 1022c. Thus, the optical combiner 1050 combines the image light 1032a, 1032b, 1032c in the sense that the image light is outputted from a common face of the optical combiner 1050, although light may exit the optical combiner in slightly different directions. For example, the reflective internal surfaces 1052, 1054 of the X-cube prism may each be angled to direct the image light 1032a, 1032b, 1032c along different paths to the eyepiece 1020. As a result, the image light 1032a, 1032b, 1032c may be incident on different associated ones of the in-coupling optical elements 1022a, 1022b, 1022c. In some embodiments, the micro-displays 1030a, 1030b, 1030c may be appropriately angled relative to the reflective internal surfaces 1052, 1054 of the X-cube prism to provide the desired light paths to the in-coupling optical elements 1022a, 1022b, 1022c. For example, faces of one or more of the micro-displays 1030a, 1030b, 1030c may be angled relative to matching faces of the optical combiner 1050, such that image light emitted by the micro-displays is incident on the reflective internal surfaces 1052, 1054 at an appropriate angle to propagate towards the associated in-coupling optical element 1022a, 1022b, or 1022c. It will be appreciated that, in addition to a cube, the optical combiner 1050 may take the form of various other polyhedra. For example, the optical combiner 1050 may be in the shape of a rectangular prism having at least two faces that are not squares.

[0202] With continued reference to FIG. 11A, in some embodiments, the monochrome micro-display 1030b directly opposite the output face 1051 may advantageously output green light. It will be appreciated that the reflective surfaces 1052, 1054 may have optical losses when reflecting light from the micro-displays. In addition, the human eye is most sensitive to the color green. Consequently, the monochrome micro-display 1030b opposite the output face 1051 preferably outputs green light, so that the green light may proceed directly through the optical combiner 1050 without needing to be reflected to be outputted from the optical combiner 1050. It will be appreciated, however, that the green monochrome micro-display may face other surfaces of the optical combiner 1050 in some other embodiments.

[0203] As discussed herein, the perception of a full color image by a user may be achieved with time division multiplexing in some embodiments. For example, different ones of the emissive micro-displays 1030a, 1030b, 1030c may be activated at different times to generate different component color images. In such embodiments, the different component color images that form a single full color image may be sequentially displayed sufficiently quickly that the human visual system does not perceive the component color images as being displayed at different times; that is, the different component color images that form a single full color image may all be displayed within a duration that is sufficiently short that the user perceives the component color images as being simultaneously presented, rather than being temporally separated. For example, it will be appreciated that the human visual system may have a flicker fusion threshold. The flicker fusion threshold may be understood to be a duration within which the human visual system is unable to differentiate images as being presented at different times. Images presented within that duration are fused or combined and, as a result, may be perceived by a user to be presented simultaneously. Flickering images with temporal gaps between the images that are outside of that duration are not combined, and the flickering of the images is perceptible. In some embodiments, the duration is 1/60 seconds or less, which corresponds to a frame rate of 60 Hz or more. Preferably, image frames for any individual eye are provided to the user at a frame rate equal to or higher than the rate corresponding to the flicker fusion threshold of the user. For example, the frame rate for each of the left eye and the right eye may be 60 Hz or more, or 120 Hz or more; and, as a result, the frame rate provided by the light projection system 1010 may be 120 Hz or more, or 240 Hz or more in some embodiments.
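
As a back-of-the-envelope illustration of the frame rates discussed above (the function and numbers are assumptions for this sketch, not requirements of this disclosure), the rate a shared projection system must sustain scales with the number of eyes it serves and, if the component colors are shown sequentially, with the number of component colors.

```python
def projection_system_rate_hz(per_eye_full_color_hz, num_eyes=2, num_colors=3,
                              time_sequential_color=False):
    """Rough frame rate a single, shared light projection system must sustain.

    Assumes the projection system is time-multiplexed across `num_eyes` eyes;
    with `time_sequential_color`, each full-color frame is further split into
    `num_colors` component-color subframes. Illustrative arithmetic only.
    """
    rate = per_eye_full_color_hz * num_eyes
    if time_sequential_color:
        rate *= num_colors
    return rate

# 60 Hz full-color frames per eye, two eyes, all colors shown simultaneously:
print(projection_system_rate_hz(60))                              # 120 Hz
# The same, but with sequential red/green/blue subframes:
print(projection_system_rate_hz(60, time_sequential_color=True))  # 360 Hz
```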

[0204] It will be appreciated that time division multiplexing may advantageously reduce the computational load on processors (e.g., graphics processors) utilized to form displayed images. In some other embodiments, such as where sufficient computational resources are available, all component color images that form a full color image may be displayed simultaneously by the micro-displays 1030a, 1030b, 1030c.

[0205] As discussed herein, the micro-displays 1030a, 1030b, 1030c may each include arrays of light emitters. FIG. 11B illustrates an example of an array 1042 of light emitters 1044. Where the associated micro-display is a monochrome micro-display, the light emitters 1044 may all be configured to emit light of the same color.

[0206] Where the associated micro-display is a full-color micro-display, different ones of the light emitters 1044 may be configured to emit light of different colors. In such embodiments, the light emitters 1044 may be considered subpixels and may be arranged in groups, with each group having at least one light emitter configured to emit light of each component color. For example, where the component colors are red, green, and blue, each group may have at least one red subpixel, at least one green subpixel, and at least one blue subpixel.

[0207] It will be appreciated that, while the light emitters 1044 are shown arranged in a grid pattern for ease of illustration, the light emitters 1044 may have other regularly repeating spatial arrangements. For example, the number of light emitters of different component colors may vary, the sizes of the light emitters may vary, the shapes of the light emitters and/or the shapes formed by groups of light emitters may vary, etc.

[0208] With continued reference to FIG. 11B, it will be appreciated that the light emitters 1044 emit light. In addition, manufacturing constraints, such as lithography or other patterning and processing limitations, and/or electrical considerations, may limit how closely neighboring light emitters 1044 may be spaced. As a result, there may be an area 1045 surrounding each light emitter 1044 within which it is not practical to form other light emitters 1044. This area 1045 forms the inter-emitter regions between the light emitters 1044. In some embodiments, taking into account the area 1045, the light emitters have a pitch of, e.g., less than 10 µm, less than 8 µm, less than 6 µm, or less than 5 µm, and more than 1 µm, including 1-5 µm, and an emitter size of 2 µm or less, 1.7 µm or less, or 1.3 µm or less. In some embodiments, the emitter size is within a range having an upper limit of the above-noted sizes and a lower limit of 1 µm. In some embodiments, the ratio of emitter size to pitch is 1:1 to 1:5, 1:2 to 1:4, or 1:2 to 1:3.
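
For a concrete feel for how the emitter sizes and pitches above translate into the fraction of the array that actually emits light, the short sketch below runs the arithmetic; the helper and example values are illustrative assumptions drawn from the ranges noted above, not specifications of this disclosure.

```python
def emitter_fill_factor(emitter_size_um, pitch_um):
    """Fraction of the array area that is emissive, assuming square emitters
    on a square grid; the remainder corresponds to the inter-emitter area 1045.
    Illustrative geometry only."""
    return (emitter_size_um / pitch_um) ** 2

# Example values from the ranges above:
print(round(emitter_fill_factor(1.7, 5.0), 3))  # ~0.116 -> roughly 12% emissive area
print(round(emitter_fill_factor(2.0, 4.0), 3))  # 0.25  -> a 1:2 size-to-pitch ratio
```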

[0209] It will be appreciated that, given some light emitter device architectures and materials, current crowding may decrease the emitter's efficiency and pixel droop may cause unintentional activation of pixels (e.g., due to energy directed to one light emitter bleeding into a neighboring light emitter). As a result, a relatively large area 1045 may beneficially reduce current crowding and pixel droop. In some embodiments, the ratio of emitter size to pitch is preferably 1:2 to 1:4, or 1:2 to 1:3.

[0210] It will also be appreciated, however, that large separations between light emitters (e.g., a small light emitter size to pitch ratio) may undesirably cause visible gaps, or dark regions, between the light emitters. Even when laterally translated as discussed herein, some gaps may still be visible, depending on the size of the original gap, the distance of the translation, and the number of subframes (and resulting translation increments) utilized. In some embodiments, lens structures such as light collimators may be utilized to effectively fill or partially fill in these dark regions. For example, a light collimating lens may extend on and around a light emitter 1044, such that light from the emitter 1044 completely fills the lens. For example, the light collimating lens may have a larger width than the light emitters 1044 and, in some embodiments, the width of the collimating lens may be approximately equal to the pitch. As a result, the size of the emitter 1044 is effectively increased to extend across the area of the lens, thereby filling in some or all of the area 1045. In some other embodiments, the width of the collimating lens may be approximately equal to the distance that the projection system is translated, as discussed herein, for each subframe. Lens structures such as light collimators are further discussed herein (e.g., in FIG. 30A and the related discussion).

[0211] As discussed herein, the light emitters 1044 may be OLEDs or micro-LEDs. It will be appreciated that OLEDs may utilize layers of organic material, e.g., disposed between electrodes, to emit light. Micro-LEDs may utilize inorganic materials, e.g., Group III-V materials such as GaAs, GaN, and/or GaIn for light emission. Examples of GaN materials include InGaN, which may be used to form blue or green light emitters in some embodiments. Examples of GaIn materials include AlGaInP, which may be used to form red light emitters in some embodiments. In some embodiments, the light emitters 1044 may emit light of an initial color, which may be converted to other desired colors using phosphor materials or quantum dots. For example, the light emitter may emit blue light which excites a phosphor material or quantum dot that converts the blue wavelength light to green or red wavelengths.

[0212] With reference now to FIG. 12, another example is illustrated of a wearable display system with a light projection system having multiple emissive micro-displays 1030a, 1030b, 1030c. The illustrated display system is similar to the display system of FIG. 11A except that the optical combiner 1050 has a standard X-cube prism configuration and includes light redirecting structures 1080a and 1080c for modifying the angle of incidence of light on the reflective surfaces 1052, 1054 of the X-cube prism. It will be appreciated that a standard X-cube prism configuration will receive light which is normal to a face of the X-cube and redirect this light 45° such that it is output at a normal angle from a transverse face of the X-cube. However, this would cause the image light 1032a, 1032b, 1032c to be incident on the same in-coupling optical element of the eyepiece 1020. In order to provide different paths for the image light 1032a, 1032b, 1032c, so that the image light is incident on associated ones of the in-coupling optical elements 1022a, 1022b, 1022c of the waveguide assembly, the light redirecting structures 1080a, 1080c may be utilized.

[0213] In some embodiments, the light redirecting structures 1080a, 1080c may be lens structures. It will be appreciated that the lens structures may be configured to receive incident light and to redirect the incident light at an angle such that the light reflects off a corresponding one of the reflective surfaces 1052, 1054 and propagates along a light path towards a corresponding one of the in-coupling optical elements 1022a, 1022c. As examples, the light redirecting structures 1080a, 1080c may comprise micro-lenses, nano-lenses, reflective wells, metasurfaces, and liquid crystal gratings. In some embodiments, the micro-lenses, nano-lenses, reflective wells, metasurfaces, and liquid crystal gratings may be organized in arrays. For example, each light emitter of the micro-displays 1030a, 1030c may be matched with one micro-lens. In some embodiments, in order to redirect light in a particular direction, the micro-lens or reflective wells may be asymmetrical and/or the light emitters may be disposed off-center relative to the micro-lens. In addition, in some embodiments, the light redirecting structures 1080a, 1080c may be collimators which narrow the angular emission profiles of associated light emitters, to increase the amount of light ultimately in-coupled into the eyepiece 1020. Further details regarding such light redirecting structures 1080a, 1080c are discussed below regarding FIGS. 24A-27C.
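
One way to picture how an emitter placed off-center relative to its micro-lens can steer light, as described above, is the thin-lens chief-ray relation, in which the output is redirected by roughly arctan(offset / focal length). The helper and values below are an illustrative sketch under that paraxial assumption; they are not parameters of this disclosure.

```python
import math

def steering_angle_deg(emitter_offset_um, lens_focal_length_um):
    """Approximate angle by which a micro-lens redirects light from an emitter
    displaced from the lens axis (paraxial thin-lens approximation)."""
    return math.degrees(math.atan2(emitter_offset_um, lens_focal_length_um))

# Hypothetical example: an emitter offset by 1 um under a micro-lens with a
# 10 um focal length steers the collimated output by roughly 5.7 degrees.
print(round(steering_angle_deg(1.0, 10.0), 1))
```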

[0214] With reference now to FIG. 13A, in some embodiments, two or more of the in-coupling optical elements 1022a, 1022b, 1022c may overlap (e.g., as seen in a head-on view in the direction of light propagation into the in-coupling optical elements 1022a, 1022b, 1022c). FIG. 13A illustrates an example of a side-view of a wearable display system with a light projection system 1010 having multiple emissive micro-displays 1030a, 1030b, 1030c and an eyepiece 1020 with overlapping light in-coupling optical elements 1022a, 1022c and a non-overlapping light in-coupling optical element 1022b. As illustrated, the in-coupling optical elements 1022a, 1022c overlap, while the in-coupling optical element 1022b is laterally shifted. Stated another way, the in-coupling optical elements 1022a, 1022c are aligned directly in the paths of the image light 1032a, 1032c, while the image light 1032b follows another path to the eyepiece 1020, such that it is incident on an area of the eyepiece 1020 that is laterally shifted relative to the area in which the image light 1032a, 1032c is incident.

[0215] As illustrated, differences between the paths for the image light 1032b and the image light 1032a, 1032c may be established using the light redirecting structures 1080a, 1080c. In some embodiments, the image light 1032b from the emissive micro-display 1030b proceeds directly through the optical combiner 1050. The image light 1032a from the emissive micro-display 1030a is redirected by the light redirecting structure 1080a such that it reflects off of the reflective surface 1054 and propagates out of the optical combiner 1050 in the same direction as the image light 1032c. It will be appreciated that the image light 1032c from the emissive micro-display 1030c is redirected by the light redirecting structure 1080c such that it reflects off of the reflective surface 1052 at an angle such that the image light 1032c propagates out of the optical combiner 1050 in the same direction as the image light 1032a. Thus, the redirection of light by the light redirecting structures 1080a, 1080c and the angles of the reflective surfaces 1052, 1054 are configured to provide a common path for the image light 1032a, 1032c out of the optical combiner 1050, with this common path being different from the path of the image light 1032b. In some other embodiments, one or both of the light redirecting structures 1080a, 1080c may be omitted and the reflective surfaces 1052, 1054 in the optical combiner 1050 may be configured to reflect the image light 1032a, 1032c in the appropriate respective directions such that they exit the optical combiner 1050 propagating in the same direction, which is different from the direction of the image light 1032b. As such, after propagating through the projection optics 1070, the image light 1032a, 1032c exits from one exit pupil while the image light 1032b exits from another exit pupil. In this configuration, the light projection system 1010 may be referred to as a two-pupil projection system.

[0216] In some embodiments, the light projection system 1010 may have a single output pupil and may be referred to as a single-pupil projection system. In such embodiments, the light projection system 1010 may be configured to direct the image light 1032a, 1032b, 1032c onto a single common area of the eyepiece 1020. Such a configuration is shown in FIG. 13B, which illustrates a wearable display system with a light projection system 1010 having multiple emissive micro-displays 1030a, 1030b, 1030c configured to direct light to a single light in-coupling area of the eyepiece 1020. In some embodiments, as discussed further herein, the eyepiece 1020 may include a stack of waveguides having overlapping light in-coupling optical elements. In some other embodiments, a single light in-coupling optical element may be configured to in-couple light of all component colors into a single waveguide. The display system of FIG. 13B is similar to the display system of FIG. 13A, except for the omission of the light redirecting structures 1080a, 1080c and the use of the in-coupling optical element 1122a and the associated waveguide 1020a. As illustrated, the in-coupling optical element 1122a in-couples each of the image light 1032a, 1032b, 1032c into the waveguide 1020a, which then relays the image light to the eye 210. In some embodiments, the in-coupling optical element 1122a may comprise a diffractive grating. In some embodiments, the in-coupling optical element 1122a is a metasurface and/or liquid crystal grating.

[0217] As discussed herein, in some embodiments, the emissive micro-displays 1030a, 1030b, 1030c may be monochrome micro-displays configured to emit light of different colors. In some embodiments, one or more of the emissive micro-displays 1030a, 1030b, 1030c may have groups of light emitters configured to emit light of two or more, but not all, component colors. For example, a single emissive micro-display may have groups of light emitters--with at least one light emitter per group configured to emit blue light and at least one light emitter per group configured to emit green light--and a separate emissive micro-display on a different face of the X-cube 1050 may have light emitters configured to emit red light. In some other embodiments, the emissive micro-displays 1030a, 1030b, 1030c may each be full-color displays, each having light emitters of all component colors. As noted herein, utilizing multiple similar micro-displays may provide advantages for dynamic range and increased display brightness.

[0218] In some embodiments, a single full-color emissive micro-display may be utilized. FIG. 14 illustrates an example of a wearable display system with a single emissive micro-display 1030b. The wearable display system of FIG. 14 is similar to the wearable display system of FIG. 13B, except that the single emissive micro-display 1030b is a full-color micro-display configured to emit light of all component colors. As illustrated, the micro-display 1030b emits image light 1032a, 1032b, 1032c of each component color. In such embodiments, the optical combiner 1050 (FIG. 13B) may be omitted, which may advantageously reduce the weight and size of the wearable display system relative to a system with an optical combiner.

[0219] As discussed above, the in-coupling optical elements of the eyepiece 1020 may assume various configurations. Some examples of configurations for the eyepiece 1020 are discussed below in relation to FIGS. 15-23C.

[0220] FIG. 15 illustrates a side view of an example of an eyepiece 1020 having a stack of waveguides 1020a, 1020b, 1020c with overlapping in-coupling optical elements 1022a, 1022b, 1022c, respectively. It will be appreciated that the illustrated waveguide stack may be utilized in place of the single illustrated waveguide 1020a of FIGS. 13B and 14. As discussed herein, each of the in-coupling optical elements 1022a, 1022b, 1022c is configured to in-couple light having a specific color (e.g., light of a particular wavelength, or a range of wavelengths). In the illustrated orientation of the eyepiece 1020 in which the image light propagates vertically down the page towards the eyepiece 1020, the in-coupling optical elements 1022a, 1022b, 1022c are vertically aligned with each other (e.g., along an axis parallel to the direction of propagation of the image light 1032a, 1032b, 1032c) such that they spatially overlap with each other as seen in a top down view (a head-on view in a direction of the image light 1032a, 1032b, 1032c propagating to the in-coupling optical elements).

[0221] With continued reference to FIG. 15, as discussed herein, the projection system 1010 (FIGS. 13, 14) is configured to output a first monochrome color image, a second monochrome color image, and a third monochrome color image (e.g., red, green and blue color images) through the single-pupil of the projection system, the monochrome images being formed by the image light 1032a, 1032b, 1032c, respectively. The in-coupling optical element 1022c is configured to in-couple the image light 1032c for the first color image into the waveguide 1020c such that it propagates through the waveguide 1020c by multiple total internal reflections at the upper and bottom major surfaces of the waveguide 1020c, the in-coupling optical element 1022b is configured to in-couple the image light 1032b for the second color image into the waveguide 1020b such that it propagates through the waveguide 1020b by multiple total internal reflections at the upper and bottom major surfaces of the waveguide 1020b, and the in-coupling optical element 1022a is configured to in-couple the image light 1032a for the third color image into the waveguide 1020a such that it propagates through the waveguide 1020a by multiple total internal reflections at the upper and bottom major surfaces of the waveguide 1020a.

[0222] As discussed herein, the in-coupling optical element 1022c is preferably configured to in-couple substantially all the incident light 1032c corresponding to the first color image into the associated waveguide 1020c while allowing substantially all the incident light 1032b, 1032a corresponding to the second color image and the third color image, respectively, to be transmitted without being in-coupled. Similarly, the in-coupling optical element 1022b is preferably configured to in-couple substantially all the incident image light 1032b corresponding to the second color image into the associated waveguide 1020b while allowing substantially all the incident light corresponding to the third color image to be transmitted without being in-coupled.

[0223] It will be appreciated that, in practice, the various in-coupling optical elements may not have perfect selectivity. For example, some of the image light 1032b, 1032a may undesirably be in-coupled into the waveguide 1020c by the in-coupling optical element 1022c; and some of the incident image light 1032a may undesirably be in-coupled into the waveguide 1020b by the in-coupling optical element 1022b. Furthermore, some of the image light 1032c may be transmitted through the in-coupling optical element 1022c and in-coupled into the waveguides 1020b and/or 1020a by the in-coupling optical elements 1022b and/or 1022a, respectively. Similarly, some of the image light 1032b may be transmitted through the in-coupling optical element 1022b and in-coupled into the waveguide 1020a by the in-coupling optical element 1022a.

[0224] In-coupling image light for a color image into an unintended waveguide may cause undesirable optical effects, such as, for example, cross-talk and/or ghosting. For example, in-coupling of the image light 1032c for the first color image into the unintended waveguides 1020b and/or 1020a may result in undesirable cross-talk between the first color image, the second color image, and/or the third color image; and/or may result in undesirable ghosting. As another example, in-coupling of the image light 1032b, 1032a for the second or third color image, respectively, into the unintended waveguide 1020c may result in undesirable cross-talk between the first color image, the second color image, and/or the third color image; and/or may cause undesirable ghosting. In some embodiments, these undesirable optical effects may be mitigated by providing color filters (e.g., absorptive color filters) that may reduce the amount of incident light that is in-coupled into an unintended waveguide.

[0225] FIG. 16 illustrates a side view of an example of a stack of waveguides with color filters for mitigating ghosting or crosstalk between waveguides. The eyepiece 1020 of FIG. 16 is similar to that of FIG. 15, except for the presence of one or more of the color filters 1024c, 1024b and 1028, 1026. The color filters 1024c, 1024b are configured to reduce the amount of light unintentionally in-coupled into the waveguides 1020b and 1020a, respectively. The color filters 1028, 1026 are configured to reduce the amount of unintentionally in-coupled image light which propagates through the waveguides 1020b, 1020c, respectively.

[0226] With continued reference to FIG. 16, a pair of color filters 1026 disposed on the upper and lower major surfaces of the waveguide 1020c may be configured to absorb image light 1032a, 1032b that may have been unintentionally in-coupled into the waveguide 1020c. In some embodiments, the color filter 1024c disposed between the waveguides 1020c and 1020b is configured to absorb image light 1032c that is transmitted through the in-coupling optical element 1022c without being in-coupled. A pair of color filters 1028 disposed on the upper and lower major surfaces of the waveguide 1020b is configured to absorb image light 1032a that is in-coupled into the waveguide 1020b. A color filter 1024b disposed between the waveguides 1020b and 1020a is configured to absorb image light 1032b that is transmitted through the in-coupling optical element 1022b without being in-coupled.

[0227] In some embodiments, the color filters 1026 on each major surface of the waveguide 1020c are similar and are configured to absorb light of the wavelengths of both image light 1032a, 1032b. In some other embodiments, the color filter 1026 on one major surface of the waveguide 1020c may be configured to absorb light of the color of the image light 1032a, and the color filter on the other major surface may be configured to absorb light of the color of the image light 1032b. In either arrangement, the color filters 1026 may be configured to selectively absorb the image light 1032a, 1032b propagating through the waveguide 1020c by total internal reflection. For example, at TIR bounces of the image light 1032a, 1032b off the major surfaces of the waveguide 1020c, the image light 1032a, 1032b contacts a color filter 1026 on those major surfaces and a portion of that image light is absorbed. Preferably, due to the selective absorption of the image light 1032a, 1032b by the color filters 1026, the propagation of the in-coupled image light 1032c via TIR through the waveguide 1020c is not appreciably affected.

[0228] Similarly, the plurality of color filters 1028 may be configured as absorption filters that absorb in-coupled image light 1032a that propagates through the waveguide 1020b by total internal reflection. At TIR bounces of the image light 1032a off the major surfaces of the waveguide 1020b, the image light 1032a contacts a color filter 1028 on those major surfaces and a portion of that image light is absorbed. Preferably, the absorption of the image light 1032a is selective and does not affect the propagation of the in-coupled image light 1032b that is also propagating via TIR through the waveguide 1020b.

[0229] With continued reference to FIG. 16, the color filters 1024c and 1024b may also be configured as absorption filters. The color filter 1024c may be substantially transparent to light of the colors of the image light 1032a, 1032b such that the image light 1032a, 1032b is transmitted through the color filter 1024c with little to no attenuation, while light of the color of the image light 1032c is selectively absorbed. Similarly, the color filter 1024b may be substantially transparent to light of the color of the image light 1032a such that incident image light 1032a is transmitted through the color filter 1024b with little to no attenuation, while light of the color of the image light 1032b is selectively absorbed. The color filter 1024c may be disposed on a major surface (e.g., the upper major surface) of the waveguide 1020b as shown in FIG. 16. Alternatively, the color filter 1024c may be disposed on a separate substrate positioned between the waveguides 1020c and 1020b. Likewise, the color filter 1024b may be disposed on a major surface (e.g., an upper major surface) of the waveguide 1020a. Alternatively, the color filter 1024b may be disposed on a separate substrate positioned between the waveguides 1020b and 1020a. It will be appreciated that the color filters 1024c and 1024b may be vertically aligned with the single-pupil of the projector that outputs the image light 1032a, 1032b, 1032c (in orientations where the image light 1032a, 1032b, 1032c propagates vertically to the waveguide stack 1020, as illustrated).

[0230] In some embodiments, the color filters 1026 and 1028 may have single-pass attenuation factors of less than about 10% (e.g., less than or equal to about 5%, less than or equal to about 2%, and greater than about 1%) to avoid significant undesired absorption of light propagating through the thickness of the waveguides 1020c, 1020b (e.g., light of the colors of the image light 1032a, 1032b propagating through the waveguides 1020c, 1020b from the ambient environment and/or other waveguides). Various embodiments of the color filters 1024c and 1024b may be configured to have low attenuation factors for the wavelengths that are to be transmitted and high attenuation factors for the wavelengths that are to be absorbed. For example, in some embodiments, the color filter 1024c may be configured to transmit greater than 80%, greater than 90%, or greater than 95%, of incident light having the colors of the image light 1032a, 1032b and absorb greater than 80%, greater than 90%, or greater than 95%, of incident light having the color of the image light 1032c. Similarly, the color filter 1024b may be configured to transmit greater than 80%, greater than 90%, or greater than 95%, of incident light having the color of the image light 1032a and absorb greater than 80%, greater than 90%, or greater than 95%, of incident light having the color of the image light 1032b.
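
To see why a weak single-pass filter of the kind described above can still strongly suppress unintentionally in-coupled light while barely dimming light that crosses the waveguide only once, the sketch below compounds a per-contact attenuation over repeated TIR bounces; the attenuation value and bounce count are illustrative assumptions, not values from this disclosure.

```python
def remaining_fraction(per_contact_attenuation, num_contacts):
    """Fraction of light remaining after contacting an absorptive color filter
    `num_contacts` times, with `per_contact_attenuation` absorbed per contact
    (e.g., 0.05 for 5%). Illustrative model only."""
    return (1.0 - per_contact_attenuation) ** num_contacts

# A 5%-per-pass filter barely affects light crossing the waveguide once...
print(round(remaining_fraction(0.05, 1), 2))   # 0.95
# ...but stray in-coupled light guided by TIR may contact the filter many times:
print(round(remaining_fraction(0.05, 30), 2))  # ~0.21 of the stray light remains
```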

[0231] In some embodiments, the color filters 1026, 1028, 1024c, 1024b may comprise a layer of color selective absorbing material deposited on one or both surfaces of the waveguide 1020c, 1020b, and/or 1020a. The color selective absorbing material may comprise a dye, an ink, or other light absorbing material such as metals, semiconductors, and dielectrics. In some embodiments, the absorption of materials such as metals, semiconductors, and dielectrics may be made color selective by utilizing these materials to form subwavelength gratings (e.g., gratings that do not diffract the light). The gratings may be made of plasmonic materials (e.g., gold, silver, and aluminum) or semiconductors (e.g., silicon, amorphous silicon, and germanium).

[0232] The color selective material may be deposited on the substrate using various deposition methods. For example, the color selective absorbing material may be deposited on the substrate using jet deposition technology (e.g., ink-jet deposition). Ink-jet deposition may facilitate depositing thin layers of the color selective absorbing material. Because ink-jet deposition allows for the deposition to be localized on selected areas of the substrate, ink-jet deposition provides a high degree of control over the thicknesses and compositions of the layers of the color selective absorbing material, including providing for nonuniform thicknesses and/or compositions across the substrate. In some embodiments, the color selective absorbing material deposited using ink-jet deposition may have a thickness between about 10 nm and about 1 micron (e.g., between about 10 nm and about 50 nm, between about 25 nm and about 75 nm, between about 40 nm and about 100 nm, between about 80 nm and about 300 nm, between about 200 nm and about 500 nm, between about 400 nm and about 800 nm, between about 500 nm and about 1 micron, or any value in a range/sub-range defined by any of these values). Controlling the thickness of the deposited layer of the color selective absorbing material may be advantageous in achieving a color filter having a desired attenuation factor. Furthermore, layers having different thicknesses may be deposited in different portions of the substrate. Additionally, different compositions of the color selective absorbing material may be deposited in different portions of the substrate using ink-jet deposition. Such variations in composition and/or thickness may advantageously allow for location-specific variations in absorption. For example, in areas of a waveguide in which transmission of light from the ambient environment (to allow the viewer to see the ambient environment) is not necessary, the composition and/or thickness may be selected to provide high absorption or attenuation of selected wavelengths of light. Other deposition methods such as coating, spin-coating, spraying, etc. may be employed to deposit the color selective absorbing material on the substrate.

[0233] FIG. 17 illustrates an example of a top-down view of the waveguide assemblies of FIGS. 15 and 16. As illustrated, in-coupling optical elements 1022a, 1022b, 1022c spatially overlap. In addition, the waveguides 1020a, 1020b, 1020c, along with each waveguide's associated light distributing element 730, 740, 750 and associated out-coupling optical element 800, 810, 820, may be vertically aligned. The in-coupling optical elements 1022a, 1022b, 1022c are configured to in-couple incident image light 1032a, 1032b, 1032c (FIGS. 15 and 16), respectively, in waveguides 1020a, 1020b, 1020c, respectively, such that the image light propagates towards the associated light distributing element 730, 740, 750 by TIR.

[0234] FIG. 18 illustrates another example of a top-down view of the waveguide assemblies of FIGS. 15 and 16. As in FIG. 17, in-coupling optical elements 1022a, 1022b, 1022c spatially overlap and the waveguides 1020a, 1020b, 1020c are vertically aligned. In place of each waveguide's associated light distributing element 730, 740, 750 and associated out-coupling optical element 800, 810, 820, however, are combined OPE/EPE's 1281, 1282, 1283, respectively. The in-coupling optical elements 1022a, 1022b, 1022c are configured to in-couple incident image light 1032a, 1032b, 1032c (FIGS. 15 and 16), respectively, in waveguides 1020a, 1020b, 1020c, respectively, such that the image light propagates towards the associated combined OPE/EPE's 1281, 1282, 1283 by TIR.

[0235] While FIGS. 15-18 show overlapping in-coupling optical elements for a single-pupil configuration of the display system, it will be appreciated that the display system may have a two-pupil configuration in some embodiments. In such a configuration, where three component colors are utilized, image light for two colors may have overlapping in-coupling optical elements, while image light for a third color may have a laterally-shifted in-coupling optical element. For example, the optical combiner 1050 (FIGS. 11A, 12, 13A-13B) and/or light redirecting structures 1080a, 1080c may be configured to direct image light through the projection optics 1070 such that image light of two colors are incident on directly overlapping areas of the eyepiece 1020 while another color of the image light is incident on an area that is laterally-shifted. For example, the reflective surfaces 1052, 1054 (FIG. 11A) may be angled such that image light of one color follows a common light path with image light from the emissive micro-display 1030b, while image light of another color follows a different light path. In some embodiments, rather than having both light redirecting structures 1080a, 1080c (FIG. 12), one of these light redirecting structures may be omitted, so that only light from one of the micro-displays 1030a, 1030c is angled to provide a different light path from the light emitted by the other two micro-displays.

[0236] FIG. 19A illustrates a side view of an example of an eyepiece having a stack of waveguides with some overlapping and some laterally-shifted in-coupling optical elements. The eyepiece of FIG. 19A is similar to the eyepiece of FIG. 15, except that one of the in-coupling optical elements is laterally shifted relative to the other in-coupling optical elements. In the illustrated orientation of the eyepiece 1020 in which the image light propagates vertically down the page towards the eyepiece 1020, the in-coupling optical elements 1022a, 1022c are vertically aligned with each other (e.g., along an axis parallel to the direction of propagation of the image light 1032a, 1032c) such that they spatially overlap with each other as seen in a head-on view in a direction of the image light 1032a, 1032c propagating to the in-coupling optical elements 1022a, 1022b, 1022c. As seen in the same head-on view (e.g., as seen in a top-down view in the illustrated orientation), the in-coupling optical element 1022b is shifted laterally relative to the other in-coupling optical elements 1022a, 1022c. Light for the in-coupling optical element 1022b is output to the eyepiece 1020 through a different exit pupil than light for the in-coupling optical elements 1022a, 1022c. It will be appreciated that the illustrated waveguide stack comprising the waveguides 1020a, 1020b, 1020c may be utilized in place of the single illustrated waveguide 1020a of FIGS. 13B and 14.

[0237] With continued reference to FIG. 19A, the in-coupling optical element 1022c is configured to in-couple the image light 1032c into the waveguide 1020c such that it propagates through the waveguide 1020c by multiple total internal reflections between the upper and bottom major surfaces of the waveguide 1020c, the in-coupling optical element 1022b is configured to in-couple the image light 1032b into the waveguide 1020b such that it propagates through the waveguide 1020b by multiple total internal reflections between the upper and bottom major surfaces of the waveguide 1020b, and the in-coupling optical element 1022a is configured to in-couple the image light 1032a into the waveguide 1020a such that it propagates through the waveguide 1020a by multiple total internal reflections between the upper and bottom major surfaces of the waveguide 1020a.

[0238] The in-coupling optical element 1022c is preferably configured to in-couple all the incident light 1032c into the associated waveguide 1020c while being transmissive to all the incident light 1032a. On the other hand, the image light 1032b may propagate to the in-coupling optical element 1022b without needing to propagate through any other in-coupling optical elements. This may be advantageous in some embodiments by allowing light, to which the eye is more sensitive, to be incident on a desired in-coupling optical element without any loss or distortion associated with propagation through other in-coupling optical elements. Without being limited by theory, in some embodiments, the image light 1032b is green light, to which the human eye is more sensitive. It will be appreciated that, while the waveguides 1020a, 1020b, 1020c are illustrated arranged in a particular order, in some embodiments, the order of the waveguides 1020a, 1020b, 1020c may differ.

[0239] It will be appreciated that, as discussed herein, the in-coupling optical element 1022c overlying the in-coupling optical element 1022a may not have perfect selectivity. Some of the image light 1032a may undesirably be in-coupled into the waveguide 1020c by the in-coupling optical element 1022c; and some of the image light 1032c may be transmitted through the in-coupling optical element 1022c, after which the image light 1032c may strike the in-coupling optical element 1022a and be in-coupled into the waveguide 1020a. As discussed herein, such undesired in-coupling may be visible as ghosting or crosstalk.

[0240] FIG. 19B illustrates a side view of an example of the eyepiece of FIG. 19A with color filters for mitigating ghosting or crosstalk between waveguides. In particular, color filters 1024c and/or 1026 are added to the structures shown in FIG. 19A. As illustrated, the in-coupling optical element 1022c may unintentionally in-couple a portion of the image light 1032a into the waveguide 1020c. In addition, or alternatively, a portion of the image light 1032c may undesirably be transmitted through the in-coupling optical element 1022c, after which it may unintentionally be in-coupled by the in-coupling optical element 1022a.

[0241] To mitigate unintentionally in-coupled image light 1032a propagating through the waveguide 1020c, absorptive color filters 1026 may be provided on one or both major surfaces of the waveguide 1020c. The absorptive color filters 1026 may be configured to absorb light of the color of the unintentionally in-coupled image light 1032a. As illustrated, the absorptive color filters 1026 are disposed in the general direction of propagation of the image light through the waveguide 1020c. Thus, the absorptive color filters 1026 are configured to absorb image light 1032a as that light propagates through the waveguide 1020c by TIR and contacts the absorptive color filters 1026 while reflecting off one or both of the major surfaces of the waveguide 1020c.

[0242] With continued reference to FIG. 19B, to mitigate image light 1032c which propagates through the in-coupling optical element 1022c without being in-coupled, the absorptive color filter 1024c may be provided forward of the in-coupling optical element 1022a. The absorptive color filter 1024c is configured to absorb light of the color of the image light 1032c, to prevent that light from propagating to the in-coupling optical element 1022a. While illustrated between the waveguides 1020c and 1020b, in some other embodiments, the absorptive color filter 1024c may be disposed between the waveguides 1020b and 1020a. It will be appreciated that further details regarding the composition, formation, and properties of the absorptive color filters 1024c and 1026 are provided in the discussion of FIG. 16.

[0243] It will also be appreciated that, in the embodiments illustrated in FIGS. 16 and 19B, one or more of the color filters 1026, 1028, 1024c, and 1024b may be omitted if one or more of the in-coupling optical elements 1022a, 1022b, 1022c have sufficiently high selectivity for the color of the light that is intended to be in-coupled into the associated waveguide 1020a, 1020b, 1020c, respectively.

[0244] FIG. 20A illustrates an example of a top-down view of the eyepieces of FIGS. 19A and 19B. As illustrated, in-coupling optical elements 1022a, 1022c spatially overlap, while in-coupling optical element 1022b is laterally-shifted. In addition, the waveguides 1020a, 1020b, 1020c, along with each waveguide's associated light distributing element 730, 740, 750 and associated out-coupling optical element 800, 810, 820, may be vertically aligned. The in-coupling optical elements 1022a, 1022b, 1022c are configured to in-couple incident image light 1032a, 1032b, 1032c (FIGS. 15 and 16), respectively, in waveguides 1020a, 1020b, 1020c, respectively, such that the image light propagates towards the associated light distributing element 730, 740, 750 by TIR.

[0245] FIG. 20B illustrates another example of a top-down view of the waveguide assembly of FIGS. 19A and 19B. As in FIG. 20A, in-coupling optical elements 1022a, 1022c spatially overlap, the in-coupling optical element 1022b is laterally shifted, and the waveguides 1020a, 1020b, 1020c are vertically aligned. In place of each waveguide's associated light distributing element 730, 740, 750 and associated out-coupling optical element 800, 810, 820, however, are combined OPE/EPE's 1281, 1282, 1283, respectively. The in-coupling optical elements 1022a, 1022b, 1022c are configured to in-couple incident image light 1032a, 1032b, 1032c (FIGS. 15 and 16), respectively, in waveguides 1020a, 1020b, 1020c, respectively, such that the image light propagates towards the associated combined OPE/EPE's 1281, 1282, 1283 by TIR.

[0246] With reference now to FIG. 21, it will be appreciated that re-bounce of in-coupled light may undesirably occur in waveguides. Re-bounce occurs when in-coupled light propagating along a waveguide strikes the in-coupling optical element a second or subsequent time after the initial in-coupling incidence. Re-bounce may result in a portion of the in-coupled light being undesirably out-coupled and/or absorbed by a material of the in-coupling optical element. The out-coupling and/or light absorption undesirably may cause a reduction in overall in-coupling efficiency and/or uniformity of the in-coupled light.

[0247] FIG. 21 illustrates a side view of an example of re-bounce in a waveguide 1030a. As illustrated, image light 1032a is in-coupled into the waveguide 1030a by in-coupling optical element 1022a. In-coupling optical element 1022a redirects the image light 1032a such that it generally propagates through the waveguide in the direction 1033. Re-bounce may occur when in-coupled image light internally reflects or bounces off a major surface of the waveguide 1030a opposite the in-coupling optical element 1022a and is incident on or experiences a second bounce (a re-bounce) at the in-coupling optical element 1022a. The distance between two neighboring bounces on the same surface of the waveguide 1030a is indicated by spacing 1034.

[0248] Without being limited by theory, it will be appreciated that the in-coupling optical element 1022a may behave symmetrically; that is, it may redirect incident light such that the incident light propagates through the waveguide at TIR angles. However, light that is incident on the diffractive optical elements at TIR angles (such as upon re-bounce) may also be out-coupled. In addition or alternatively, in embodiments where the in-coupling optical element 1022a is coated with a reflective material, it will be understood that the reflection of light off of a layer of material such as metal may also involve partial absorption of the incident light, since reflection may involve the absorption and emission of light from a material. As a result, light out-coupling and/or absorption may undesirably cause loss of in-coupled light. Accordingly, re-bounced light may incur significant losses, as compared with light that interacts only once with the in-coupling optical element 1022a.

[0249] In some embodiments, the in-coupling elements are configured to mitigate in-coupled image light loss due to re-bounce. Generally, re-bounce of in-coupled light occurs towards the end 1023 of the in-coupling optical element 1022a in the propagation direction 1033 of the in-coupled light. For example, light in-coupled at the end of the in-coupling optical element 1022a opposite the end 1023 may re-bounce if the spacing 1034 for that light is sufficiently short. To avoid such re-bounce, in some embodiments, the in-coupling optical element 1022a is truncated at the propagation direction end 1023, to reduce the width 1022w of the in-coupling optical element 1022a along which re-bounce is likely to occur. In some embodiments, the truncation may be a complete truncation of all structures of the in-coupling optical element 1022a (e.g., the metallization and diffractive gratings). In some other embodiments, for example, where the in-coupling optical element 1022a comprises a metalized diffraction grating, a portion of the in-coupling optical element 1022a at the propagation direction end 1023 may not be metalized, such that the propagation direction end 1023 of the in-coupling optical element 1022a absorbs less re-bouncing light and/or out-couples re-bouncing light with a lower efficiency. In some embodiments, a diffractive region of an in-coupling optical element 1022a may have a width along a propagation direction 1033 shorter than its length perpendicular to the propagation direction 1033, and/or may be sized and shaped such that a first portion of image light 1032a is incident on the in-coupling optical element 1022a and a second portion of the beam of light impinges on the waveguide 1030a without being incident on the in-coupling optical element 1022a. While the waveguide 1030a and the light in-coupling optical element 1022a are illustrated alone for clarity, it will be appreciated that re-bounce and the strategies discussed for reducing re-bounce may apply to any of the in-coupling optical elements disclosed herein. It will also be appreciated that the spacing 1034 is related to the thickness of the waveguide 1030a (a larger thickness results in a larger spacing 1034). In some embodiments, the thickness of individual waveguides may be selected to set the spacing 1034 such that re-bounce does not occur. Further details regarding re-bounce mitigation may be found in U.S. Provisional Application No. 62/702,707, filed on Jul. 24, 2018, the entire disclosure of which is incorporated by reference herein.
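
The spacing 1034 between successive bounces on the same major surface follows from simple ray geometry: for light propagating at an angle θ from the waveguide normal in a waveguide of thickness t, the spacing is approximately 2·t·tan(θ). The helper and values below are an illustrative sketch of that relation; the numbers are assumptions, not parameters of this disclosure.

```python
import math

def rebounce_spacing_mm(waveguide_thickness_mm, propagation_angle_deg):
    """Distance between successive TIR bounces on the same major surface of a
    planar waveguide (the spacing 1034), from simple ray geometry."""
    return 2.0 * waveguide_thickness_mm * math.tan(math.radians(propagation_angle_deg))

# Hypothetical values: in a 0.3 mm thick waveguide, light in-coupled at 60 degrees
# from the normal travels ~1.04 mm between bounces on the same surface.
print(round(rebounce_spacing_mm(0.3, 60.0), 2))
# Re-bounce onto the in-coupling optical element is avoided if this spacing
# exceeds the element's remaining width 1022w along the propagation direction 1033.
```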

[0250] FIGS. 22A-23C illustrate examples of top-down views of an eyepiece having in-coupling optical elements configured to reduce re-bounce. In-coupling optical elements 1022a, 1022b, 1022c are configured to in-couple light such that it propagates in a propagation direction towards the associated light distributing elements 730, 740, 750 (FIGS. 22A-22C) or combined OPE/EPE's 1281, 1282, 1283 (FIGS. 23A-23C). As illustrated, the in-coupling optical elements 1022a, 1022b, 1022c may have a shorter dimension along the propagation direction and a longer dimension along the transverse axis. For example, the in-coupling optical elements 1022a, 1022b, 1022c may each be in the shape of a rectangle with a shorter side along the axis of the propagation direction and a longer side along an orthogonal axis. It will be appreciated that the in-coupling optical elements 1022a, 1022b, 1022c may have other shapes (e.g., orthogonal, hexagonal, etc.). In addition, different ones of the in-coupling optical elements 1022a, 1022b, 1022c may have different shapes in some embodiments. Also, preferably, as illustrated, non-overlapping in-coupling optical elements may be positioned such that they are not in the propagation direction of other in-coupling optical elements. For example, as shown in FIGS. 22A, 22B, 23A, and 23B, the non-overlapping in-coupling optical elements may be arranged in a line along an axis crossing (e.g., orthogonal to) the axis of the propagation direction.

[0251] It will be appreciated that the waveguide assemblies of FIGS. 22A-22C are similar, except for the overlap of the in-coupling optical elements 1022a, 1022b, 1022c. For example, FIG. 22A illustrates in-coupling optical elements 1022a, 1022b, 1022c with no overlap. FIG. 22B illustrates overlapping in-coupling optical elements 1022a, 1022c, and a non-overlapping in-coupling optical element 1022b. FIG. 22C illustrates overlap between all the in-coupling optical elements 1022a, 1022b, 1022c.

[0252] The waveguide assemblies of FIGS. 23A-23C are also similar, except for the overlap of the in-coupling optical elements 1022a, 1022b, 1022c. FIG. 23A illustrates in-coupling optical elements 1022a, 1022b, 1022c with no overlap. FIG. 23B illustrates overlapping in-coupling optical elements 1022a, 1022c, and a non-overlapping in-coupling optical element 1022b. FIG. 23C illustrates overlap between all the in-coupling optical elements 1022a, 1022b, 1022c.

[0253] With reference now to FIG. 24A, it will be appreciated that the emissive micro-displays have high etendue, which presents a challenge for efficient light utilization. As discussed herein, the emissive micro-displays may include a plurality of individual light emitters. Each of these light emitters may have a large angular emission profile, e.g., a Lambertian or near-Lambertian emission profile. Undesirably, not all of this light may be captured and directed to the eyepiece of the display system.

[0254] FIG. 24A illustrates an example of angular emission profiles of light emitted by individual light emitters 1044 of an emissive micro-display 1032, and light captured by projection optics 1070. The illustrated emissive micro-display 1032 may correspond to any of the emissive micro-displays disclosed herein, including the emissive micro-displays 1032a, 1032b, 1032c. As illustrated, the projection optics 1070 may be sized such that it will capture light having an angular emission profile 1046. However, the angular emission profiles 1046 of the light emitters 1044 are significantly larger; not all of the light emitted by the light emitters 1044 will be incident on the projection optics 1070, nor necessarily incident at angles at which the light will propagate into and through the projection optics 1070. As a result, some of the light emitted by the light emitters 1044 may undesirably be "wasted" since it is not captured and ultimately relayed to the user's eye to form images. This may result in images that appear darker than would be expected if more of the light outputted by the light emitters 1044 ultimately reached the user's eye.
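To make the scale of this loss concrete, a Lambertian emitter's power within a cone of half-angle θ is sin²θ, so optics that accept only a modest cone capture a small fraction of the emitted light. The snippet below is a minimal sketch of that relationship; the 30° acceptance half-angle is an illustrative assumption, not a value from this disclosure.

```python
import math

def lambertian_fraction_captured(half_angle_deg: float) -> float:
    """Fraction of a Lambertian emitter's total power contained within a cone of
    the given half-angle: sin^2(half_angle)."""
    return math.sin(math.radians(half_angle_deg)) ** 2

print(lambertian_fraction_captured(30.0))  # 0.25 -> three quarters of the light is wasted
print(lambertian_fraction_captured(90.0))  # 1.0  -> the full 180-degree hemisphere
```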

[0255] In some embodiments, one strategy for capturing more of the light emitted by the light emitters 1044 is to increase the size of the projection optics 1070, thereby increasing the numerical aperture of the projection optics 1070 for capturing light. In addition or alternatively, the projection optics 1070 may also be formed with high refractive index materials (e.g., having refractive indices above 1.5) which may also facilitate light collection. In some embodiments, the projection optics 1070 may utilize a lens sized to capture a desired, high proportion of the light emitted by the light emitters 1044. In some embodiments, the projection optics 1070 may be configured to have an elongated exit pupil, e.g., to emit light beams having a cross-sectional profile similar to the shapes of the in-coupling optical elements 1022a, 1022b, 1022c of FIGS. 22A-23C. For example, the projection optics 1070 may be elongated in a dimension corresponding to the elongated dimension of the in-coupling optical elements 1022a, 1022b, 1022c of FIGS. 22A-23C. Without being limited by theory, such elongated in-coupling optical elements 1022a, 1022b, 1022c may mitigate the etendue mismatch between the emissive micro-display and the eyepiece 1020 (FIGS. 22A-23C). In some embodiments, the thickness of the waveguides of the eyepiece 1020 (e.g., FIGS. 11A and 12-23C) may be selected to increase the percentage of light effectively captured, e.g., by reducing re-bounce by increasing the re-bounce spacing, as discussed herein.

[0256] In some embodiments, one or more light collimators may be utilized to reduce or narrow the angular emission profile of light from the light emitters 1044. As a result, more of the light emitted by the light emitters 1044 may be captured by the projection optics 1070 and relayed to the eyes of a user, advantageously increasing the brightness of images and the efficiency of the display system. In some embodiments, the light collimators may allow the light collection efficiency of the projection optics (the percentage of light emitted by the light emitters 1044 that is captured by the projection optics) to reach values of 80% or more, 85% or more, or 90% or more, including about 85-95% or 85-90%. In addition, the angular emission profile of the light from the light emitters 1044 may be reduced to 60.degree. or less, 50.degree. or less, or 40.degree. or less (from, e.g., 180.degree.). In some embodiments, the reduced angular emission profiles may be in the range of about 30-60.degree., 30-50.degree., or 30-40.degree.. It will be appreciated that light from the light emitters 1044 may define the shape of a cone, with the light emitter 1044 at the vertex of the cone. The angular emission profile refers to the angle subtended by the sides of the cone, with the associated light emitter 1044 at the vertex of the angle (as seen in a cross-section taken along a plane extending through the middle of the cone and including the cone apex).

[0257] FIG. 24B illustrates an example of the narrowing of angular emission profiles using an array of light collimators. As illustrated, the emissive micro-display 1032 includes an array of light emitters 1044, which emit light with an angular emission profile 1046. An array 1300 of light collimators 1302 is disposed forward of the light emitters 1044. In some embodiments, each light emitter 1044 is matched 1-to-1 with an associated light collimator 1302 (one light collimator 1302 per light emitter 1044). Each light collimator 1302 redirects incident light from the associated light emitter 1044 to provide a narrowed angular emission profile 1047. Thus, the relatively large angular emission profiles 1046 are narrowed to the smaller angular emission profiles 1047.

[0258] In some embodiments, the light collimators 1302 and array 1300 may be part of the light redirecting structures 1080a, 1080c of FIGS. 12 and 13A. Thus, the light collimators 1302 may narrow the angular emission profile of the light emitters 1044 and also redirect the light such that it propagates into the optical combiner 1050 at the appropriate angles to define multiple light paths and the related multiple exit pupils. It will be appreciated that light may be redirected in particular directions by appropriately shaping the light collimators 1302.

[0259] Preferably, the light collimators 1302 are positioned in close proximity to the light emitters 1044 to capture a large proportion of the light outputted by the light emitters 1044. In some embodiments, there may be a gap between the light collimators 1302 and the light emitters 1044. In some other embodiments, the light collimators 1302 may be in contact with the light emitters 1044. It will be appreciated that the angular emission profile 1046 may define a wide cone of light. Preferably, the entirety or majority of a cone of light from a light emitter 1044 is incident on a single associated light collimator 1302. Thus, in some embodiments, each light emitter 1044 is smaller (occupies a smaller area) than the light receiving face of an associated light collimator 1302. In some embodiments, each light emitter 1044 has a smaller width than the spacing between neighboring light emitters 1044.

[0260] Advantageously, the light collimators 1302 may increase the efficiency of the utilization of light and may also reduce the occurrence of crosstalk between neighboring light emitters 1044. It will be appreciated that crosstalk between light emitters 1044 may occur when light from a neighboring light emitter is captured by a light collimator 1302 not associated with that neighboring light emitter. That captured light may be propagated to the user's eye, thereby providing erroneous image information for a given pixel.

[0261] With reference to FIGS. 24A and 24B, the size of the beam of light captured by the projection optics 1070 may influence the size of the beam of light which exits the projection optics 1070. As shown in FIG. 24A, without the use of light collimators, the exit beam may have a relatively large width 1050. As shown in FIG. 24B, with light collimators 1302, the exit beam may have a smaller width 1052. Thus, in some embodiments, the light collimators 1302 may be used to provide a desired beam size for in-coupling into an eyepiece. For example, the amount that the light collimators 1302 narrow the angular emission profile 1046 may be selected based at least partly upon the size of the in-coupling optical elements in the eyepiece to which the light outputted by the projection optics 1070 is directed.

[0262] It will be appreciated that the light collimators 1302 may take various forms. For example, the light collimators 1302 may be micro-lenses or lenslets, in some embodiments. As discussed herein, each micro-lens preferably has a width greater than the width of an associated light emitter 1044. The micro-lenses may be formed of curved transparent material, such as glass or polymers, including photoresist and resins such as epoxy. In some embodiments, light collimators 1302 may be nano-lenses, e.g., diffractive optical gratings. In some embodiments, light collimators 1302 may be metasurfaces and/or liquid crystal gratings. In some embodiments, light collimators 1302 may take the form of reflective wells.

[0263] It will be appreciated that different light collimators 1302 may have different dimensions and/or shapes depending upon the wavelengths or colors of light emitted by the associated light emitter 1044. Thus, for full-color emissive micro-displays, the array 1300 may include a plurality of light collimators 1302 with different dimensions and/or shapes depending upon the color of light emitted by the associated light emitter 1044. In embodiments where the emissive micro-display is a monochrome micro-display, the array 1300 may be simplified, with each of the light collimators 1302 in the array being configured to redirect light of the same color. With such monochrome micro-displays, the light collimators 1302 may be similar across the array 1300 in some embodiments.

[0264] With continued reference to FIG. 24B, as discussed herein, the light collimators 1302 may have a 1-to-1 association with the light emitters 1044. For example, each light emitter 1044 may have a discrete associated light collimator 1302. In some other embodiments, light collimators 1302 may be elongated such that they extend across multiple light emitters 1044. For example, in some embodiments, the light collimator 1302 may be elongated into the page and extend in front of a row of multiple light emitters 1044. In some other embodiments, a single light collimator 1302 may extend across a column of light emitters 1044. In yet other embodiments, the light collimator 1302 may comprise stacked columns and/or rows of lens structures (e.g., nano-lens structures, micro-lens structures, etc.).

[0265] As noted above, the light collimators 1302 may take the form of reflective wells. FIG. 25A illustrates an example of a side view of an array of tapered reflective wells for directing light to projection optics. As illustrated, the light collimator array 1300 may include a substrate 1301 in which a plurality of light collimators 1302, in the form of reflective wells, may be formed. Each well may include at least one light emitter 1044, which may emit light with a Lambertian angular emission profile 1046. The reflective walls 1303 of the wells of the light collimators 1302 are tapered and reflect the emitted light such that it is outputted from the well with a narrower angular emission profile 1047. As illustrated, reflective walls 1303 may be tapered such that the cross-sectional size increases with distance from the light emitter 1044. In some embodiments, the reflective walls 1303 may be curved. For example, the sides 1303 may have the shape of a compound parabolic concentrator (CPC).

[0266] With reference now to FIG. 25B, an example of a side view of an asymmetric tapered reflective well is illustrated. As discussed herein, e.g., as illustrated in FIGS. 12A-13A, it may be desirable to utilize the light collimators 1302 to steer light in a particular direction not normal to the surface of the light emitter 1044. In some embodiments, as viewed in a side view such as illustrated in FIG. 25B, the light collimator 1302 may be asymmetric, with the upper side 1303a forming a different angle (e.g., a larger angle) with the surface of the light emitter 1044 than the lower side 1303b; for example, the angles of the reflective walls 1303a, 1303b relative to the light emitter 1044 may differ on different sides of the light collimators 1302 in order to direct the light in the particular non-normal direction. Thus, as illustrated, light exiting the light collimator 1302 may propagate generally in a direction 1048 which is not normal to the surface of the light emitter 1044. In some other embodiments, in order to direct light in the direction 1048, the taper of the upper side 1303a may be different from the taper of the lower side 1303b; for example, the upper side 1303a may flare out to a greater extent than the lower side 1303b.

[0267] With continued reference to FIGS. 25A and 25B, the substrate 1301 may be formed of various materials having sufficient mechanical integrity to maintain the desired shape of the reflective walls 1303. Examples of suitable materials include metals, plastics, and glasses. In some embodiments, the substrate 1301 may be a plate of material. In some embodiments, the substrate 1301 is a continuous, unitary piece of material. In some other embodiments, the substrate 1301 may be formed by joining together two or more pieces of material.

[0268] The reflective walls 1303 may be formed in the substrate 1301 by various methods. For example, the walls 1303 may be formed in a desired shape by machining the substrate 1301, or otherwise removing material to define the walls 1303. In some other embodiments, the walls 1303 may be formed as the substrate 1301 is formed. For example, the walls 1303 may be molded into the substrate 1301 as the substrate 1301 is molded into its desired shape. In some other embodiments, the walls 1303 may be defined by rearrangement of material after formation of the substrate 1301. For example, the walls 1303 may be defined by imprinting.

[0269] Once the contours of the walls 1303 are formed, they may undergo further processing to form surfaces having the desired degree of reflection. In some embodiments, the surface of the substrate 1301 may itself be reflective, e.g., where the substrate 1301 is formed of a reflective metal. In such cases, the further processing may include smoothing or polishing the interior surfaces of the walls 1303 to increase their reflectivity. In some other embodiments, the interior surfaces of the walls 1303 may be lined with a reflective coating, e.g., by a vapor deposition process. For example, the reflective layer may be formed by physical vapor deposition (PVD) or chemical vapor deposition (CVD).

[0270] It will be appreciated that the location of a light emitter relative to an associated light collimator may influence the direction of emitted light out of the light collimator. This is illustrated, for example, in FIGS. 26A-26C, which illustrate examples of differences in light paths for light emitters at different positions relative to center lines of overlying, associated light collimators. As shown in FIG. 26A, the emissive micro-display 1030 has a plurality of light emitters 1044a, each having an associated light collimator 1302 which facilitates the output of light having narrowed angular emission profiles 1047. The light passes through the projection optics 1070 (represented as a simple lens for ease of illustration), which converges the light from the various light emitters 1044a onto an area 1402a.

[0271] With continued reference to FIG. 26A, in some embodiments, each of the light collimators 1302 may be symmetric and may have a center line which extends along the axis of symmetry of the light collimator. In the illustrated configuration, the light emitters 1044a are disposed on the center line of each of the light collimators 1302.

[0272] With reference now to FIG. 26B, light emitters 1044b are offset by a distance 1400 from the center lines of their respective light collimators 1302. This offset causes light from the light emitters 1044b to take a different path through the light collimators 1302, which output light from the light emitters 1044b with narrowed angular emission profiles 1047b. The projection optics 1070 then converges the light from the light emitters 1044b onto the area 1402b, which is offset relative to the area 1402a on which light from the light emitters 1044a converges.

[0273] With reference now to FIG. 26C, light emitters 1044c offset from both the light emitters 1044a and 1044b are illustrated. This offset causes light from the light emitters 1044c to take a different path through the light collimators 1302 than light from the light emitters 1044a and 1044b. This causes the light collimators 1302 to output light from the light emitters 1044c with narrowed angular emission profiles that take a different path to the projection optics 1070 than the light from the light emitters 1044a and 1044b. Ultimately, the projection optics 1070 converges the light from the light emitters 1044c onto the area 1402c, which is offset relative to the areas 1402a and 1402b.

[0274] With reference to FIGS. 26A-26C, each triad of light emitters 1044a, 1044b, 1044c may share a common light collimator 1302. In some embodiments, the micro-display 1030 may be a full-color micro-display and each light emitter 1044a, 1044b, 1044c may be configured to emit light of a different component color. Advantageously, the offset areas 1402a, 1402b, 1402c may correspond to the in-coupling optical elements of a waveguide in some embodiments. For example, the areas 1402a, 1402b, 1402c may correspond to the in-coupling optical element 1022a, 1022b, 1022c, respectively, of FIGS. 11A and 12. Thus, the light collimators 1302 and the offset orientations of the light emitters 1044a, 1044b, 1044c may provide an advantageously simple three-pupil projection system 1010 using a full-color emissive micro-display.
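A simple paraxial picture of the behavior shown in FIGS. 26A-26C: if each collimator acts approximately as a thin lenslet with the emitters near its focal plane, an emitter offset laterally from the lenslet's center line produces a collimated beam steered by roughly arctan(offset / focal length). The numbers below (focal length and offsets) are hypothetical and only meant to illustrate why the three emitters of a triad converge onto three laterally shifted areas.

```python
import math

def steering_angle_deg(offset_um: float, focal_length_um: float) -> float:
    """Approximate output-beam angle for an emitter offset from the lenslet axis,
    assuming the emitter sits near the lenslet's focal plane."""
    return math.degrees(math.atan2(offset_um, focal_length_um))

# Hypothetical 50 um focal length lenslet shared by a triad of emitters:
for offset_um in (-10.0, 0.0, 10.0):
    angle = steering_angle_deg(offset_um, 50.0)
    print(f"emitter offset {offset_um:+5.1f} um -> beam steered {angle:+5.1f} deg")
```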

[0275] As noted herein, the light collimator 1302 may also take the form of a nano-lens. FIG. 27 illustrates an example of a side view of individual light emitters 1044 of an emissive micro-display 1030 with an overlying array 1300 of light collimators 1302 which are nano-lenses. As discussed herein, individual ones of the light emitters 1044 may each have an associated light collimator 1302. The light collimators 1302 redirect light from the light emitters 1044 to narrow the large angular emission profile 1046 of the light emitters 1044, to output light with the narrowed angular emission profile 1047.

[0276] With continued reference to FIG. 27, in some embodiments, the light collimators 1302 may be grating structures. In some embodiments, the light collimators 1302 may be gratings formed by alternating elongated discrete expanses (e.g., lines) of material having different refractive indices. For example, expanses of material 1306 may be elongated into and out of the page and may be formed in and separated by material of the substrate 1308. In some embodiments, the elongated expanses of material 1306 may have sub-wavelength widths and pitch (e.g., widths and pitch that are smaller than the wavelengths of light that the light collimators 1302 are configured to receive from the associated light emitters 1044). In some embodiments, the pitch 1304 may be 30-300 nm, the depth of the grating may be 10-1000 nm, the refractive index of the material forming the substrate 1308 may be 1.5-3.5, and the refractive index of the material forming the grating features 1306 may be 1.5-2.5 (and different from the refractive index of the material forming the substrate 1308).
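The paragraph above amounts to a small design envelope for the nano-lens gratings. The sketch below simply encodes those stated ranges and the sub-wavelength condition as a check on a candidate design; the candidate values and the check itself are illustrative assumptions, not a design rule from this disclosure.

```python
# Parameter ranges as stated above; the candidate design below is hypothetical.
RANGES = {
    "pitch_nm": (30, 300),
    "depth_nm": (10, 1000),
    "n_substrate": (1.5, 3.5),
    "n_feature": (1.5, 2.5),
}

def check_nano_lens(design: dict, wavelength_nm: float) -> list:
    """Return a list of problems with a candidate nano-lens design, if any."""
    problems = []
    for key, (lo, hi) in RANGES.items():
        if not lo <= design[key] <= hi:
            problems.append(f"{key}={design[key]} outside {lo}-{hi}")
    if design["pitch_nm"] >= wavelength_nm:
        problems.append("pitch is not sub-wavelength for this color")
    if design["n_feature"] == design["n_substrate"]:
        problems.append("feature and substrate indices must differ")
    return problems

candidate = {"pitch_nm": 250, "depth_nm": 400, "n_substrate": 2.0, "n_feature": 1.7}
print(check_nano_lens(candidate, wavelength_nm=532) or "within the stated ranges")
```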

[0277] The illustrated grating structure may be formed by various methods. For example, the substrate 1308 may be etched or nano-imprinted to define trenches, and the trenches may be filled with material of a different refractive index from the substrate 1308 to form the grating features 1306.

[0278] Advantageously, nano-lens arrays may provide various benefits. For example, the light collection efficiencies of the nano-lenslets may be large, e.g., 80-95%, including 85-90%, with excellent reductions in angular emission profiles, e.g., reductions to 30-40.degree. (from 180.degree.). In addition, low levels of cross-talk may be achieved, since each of the nano-lens light collimators 1302 may have physical dimensions and properties (e.g., pitch, depth, the refractive indices of materials forming the feature 1306 and substrate 1308) selected to act on light of particular colors and possibly particular angles of incidence, while preferably providing high extinction ratios (for wavelengths of light of other colors). In addition, the nano-lens arrays may have flat profiles (e.g., be formed on a flat substrate), which may facilitate integration with micro-displays that may be flat panels, and may also facilitate manufacturing and provide high reproducibility and precision in forming the nano-lens array. For example, highly reproducible trench formation and deposition processes may be used to form each nano-lens. Moreover, these processes allow variations between nano-lenses of an array to be achieved with greater ease and reproducibility than is typical when forming curved lenses with similar variations.

[0279] With reference now to FIG. 28, a perspective view of an example of an emissive micro-display 1030 is illustrated. It will be appreciated that the light collimator arrays 1300 advantageously allow light emitted from a micro-display to be routed as desired. As a result, in some embodiments, the light emitters of a full-color micro-display may be organized as desired, e.g., for ease of manufacturing or implementation in the display device. In some embodiments, the light emitters 1044 may be arranged in rows or columns 1306a, 1306b, 1306c. Each row or column may include light emitters 1044 configured to emit light of the same component color. In displays where three component colors are utilized, there may be groups of three rows or columns which repeat across the micro-display 1030. It will be appreciated that where more component colors are utilized, each repeating group may have that number of rows or columns. For example, where four component colors are utilized, each group may have four rows or four columns, with one row or one column formed by light emitters configured to emit light of a single component color.

[0280] In some embodiments, some rows or columns may be repeated to increase the number of light emitters of a particular component color. For example, light emitters of some component colors may occupy multiple rows or columns. This may facilitate color balancing and/or may be utilized to address differential aging or reductions in light emission intensity over time.

[0281] With reference to FIGS. 27 and 28, in some embodiments, the light emitters 1044 may each have an associated light collimator 1302. In some other embodiments, each line 1306a, 1306b, 1306c of multiple light emitters 1044 may have a single associated light collimator 1302. That single associated light collimator 1302 may extend across substantially the entirety of the associated line 1306a, 1306b, or 1306c. In some other embodiments, the associated light collimator 1302 may be elongated and extend over a plurality of light emitters 1044 forming a portion of an associated line 1306a, 1306b, or 1306c, and multiple similar light collimators 1302 may be provided along each of the associated lines 1306a, 1306b, 1306c.

[0282] With continued reference to FIG. 28, each light emitter 1044 may be elongated along a particular axis (e.g., along the y-axis as illustrated); that is, each light emitter has a length along the particular axis, the length being longer than a width of the light emitter. In addition, a set of light emitters configured to emit light of the same component color may be arranged in a line 1306a, 1306b, or 1306c (e.g. a row or column) extending along an axis (e.g., the x-axis) which crosses (e.g., is orthogonal to) the light emitter 1044's elongate axis. Thus, in some embodiments, light emitters 1044 of the same component color form a line 1306a, 1306b, or 1306c of light emitters, with the line extending along a first axis (e.g., the x-axis), and with individual light emitters 1044 within the line elongated along a second axis (e.g., the y-axis).

[0283] In contrast, it will be appreciated that full-color micro-displays typically include sub-pixels of each component color, with the sub-pixels arranged in particular relatively closely-packed spatial orientations in groups, with these groups reproduced across an array. Each group of sub-pixels may form a pixel in an image. In some cases, the sub-pixels are elongated along an axis, and rows or columns of sub-pixels of the same component color extend along that same axis. It will be appreciated that such an arrangement allows the sub-pixels of each group to be located close together, which may have benefits for image quality and pixel density. In the illustrated arrangement of FIG. 28, however, sub-pixels of different component colors are relatively far apart, due to the elongate shape of the light emitters 1044; that is, the light emitters of the line 1306a are relatively far apart from the light emitters of the line 1306c since the elongated shape of the light emitters of the line 1306b causes the light emitters of the lines 1306a and 1306c to be spaced out more than neighboring light emitters of a given line of light emitters. While this may be expected to provide unacceptably poor image quality if the image formed on the surface of the micro-display 1030 was directly relayed to a user's eye, the use of the light collimator array 1300 advantageously allows light of different colors to be routed as desired to form a high quality image. For example, light of each component color may be used to form separate monochrome images which are then routed to and combined in an eyepiece, such as the eyepiece 1020 (e.g., FIGS. 11A and 12-14).

[0284] With reference to FIGS. 27 and 28, in some embodiments, the light emitters 1044 may each have an associated light collimator 1302. In some other embodiments, each line 1306a, 1306b, 1306c of light emitters 1044 may have a single associated light collimator 1302. That single associated light collimator 1302 may extend across substantially the entirety of the associated line 1306a, 1306b, or 1306c. In some other embodiments, the associated light collimator 1302 may be elongated and extend over a plurality of light emitters 1044 forming a portion of an associated line 1306a, 1306b, or 1306c, and multiple similar light collimators 1302 may be provided along each of the associated lines 1306a, 1306b, 1306c.

[0285] It will be appreciated that the light collimators 1302 may be utilized to direct light along different light paths to form multi-pupil projection systems. For example, the light collimators 1302 may direct light of different component colors to two or three areas, respectively, for light in-coupling.

[0286] FIG. 29 illustrates an example of a wearable display system with the full-color emissive micro-display 1030 of FIG. 28 used to form a multi-pupil projection system 1010. In the illustrated embodiment, the full-color emissive micro-display 1030 emits light of three component colors and forms a three-pupil projection system 1010. The projection system 1010 has three exit pupils through which image light 1032a, 1032b, 1032c of different component colors propagates to three laterally-shifted light in-coupling optical elements 1022a, 1022b, 1022c, respectively, of an eyepiece 1020. The eyepiece 1020 then relays the image light 1032a, 1032b, 1032c to the eye 210 of a user.

[0287] The emissive micro-display 1030 includes an array of light emitters 1044, which may be subdivided into monochrome light emitters 1044a, 1044b, 1044c, which emit the image light 1032a, 1032b, 1032c, respectively. It will be appreciated that the light emitters 1044 emit image light with a broad angular emission profile 1046. The image light propagates through the array 1300 of light collimators, which reduces the angular emission profile to the narrowed angular emission profile 1047.

[0288] In addition, the array 1300 of light collimators is configured to redirect the image light (image light 1032a, 1032b, 1032c) such that the image light is incident on the projection optics 1070 at angles which cause the projection optics 1070 to output the image light such that the image light propagates to the appropriate in-coupling optical element 1022a, 1022b, 1022c. For example, the array 1300 of light collimators is preferably configured to: direct the image light 1032a such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022a; direct the image light 1032b such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022b; and direct the image light 1032c such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022c.

[0289] Since different light emitters 1044 may emit light of different wavelengths, and that light may need to be redirected in different directions to reach the appropriate in-coupling optical element, in some embodiments, the light collimators associated with different light emitters 1044 may have different physical parameters (e.g., different pitches, different widths, etc.). Advantageously, the use of flat nano-lenses as light collimators facilitates the formation of light collimators which vary in physical properties across the array 1300 of light collimators. As noted herein, the nano-lenses may be formed using patterning and deposition processes, which facilitates the formation of structures with different pitches, widths, etc. across a substrate.

[0290] With reference again to FIG. 24A, it will be appreciated that the illustrated display system shows a single emissive micro-display and omits an optical combiner 1050 (FIGS. 11A and 12-13B). In embodiments utilizing an optical combiner 1050, the reflective surfaces 1052, 1054 (FIGS. 11A, 12-13B, and 30B) in the optical combiner 1050 are preferably specular reflectors, and light from the light emitters 1044 would be expected to retain its large angular emission profile after being reflected from the reflective surfaces 1052, 1054. Thus, the problems with wasted light shown in FIG. 24A are similarly present when an optical combiner 1050 is utilized.

[0291] With reference now to FIG. 30A, an example of a wearable display system with an emissive micro-display and an associated array of light collimators is illustrated. FIG. 30A shows additional details regarding the interplay between the light emitters 1044, the light collimators 1302, and the in-coupling optical elements of the eyepiece 1020. The display system includes a micro-display 1030b, which may be a full-color micro-display in some embodiments. In some other embodiments, the micro-display 1030b may be a monochrome micro-display and additional monochrome micro-displays (not shown) may be provided at different faces of the optional optical combiner 1050 (as shown in FIG. 30C).

[0292] With continued reference to FIG. 30A, the micro-display 1030b includes an array of light emitters 1044, each of which emits light with a wide angular emission profile (e.g., a Lambertian angular emission profile). Each light emitter 1044 has an associated, dedicated light collimator 1302 which effectively narrows the angular emission profile to a narrowed angular emission profile 1047. Light beams 1032b with the narrowed angular emission profiles pass through the projection optics 1070, which projects or converges those light beams onto the in-coupling optical element 1022b. It will be appreciated that the light beams 1032b have a certain cross-sectional shape and size 1047a. In some embodiments, the in-coupling optical element 1022b has a size and shape which substantially matches or is larger than the cross-sectional shape and size of the light beam 1032b, when that beam 1032b is incident on that in-coupling optical element 1022b. Thus, in some embodiments, the size and shape of the in-coupling optical element 1022b may be selected based upon the cross-sectional size and shape of the light beam 1032b when incident on the in-coupling optical element 1022b. In some other embodiments, other factors (re-bounce mitigation, or the angles or field of view supported by the in-coupling optical element 1022b) may be utilized to determine the size and shape of the in-coupling optical element 1022b, and the light collimator 1302 may be configured (e.g., sized and shaped) to provide the light beam 1032b with an appropriately sized and shaped cross-section, which is preferably fully or nearly fully encompassed by the size and shape of the in-coupling optical element 1022b. In some embodiments, physical parameters for the light collimator 1302 and the in-coupling optical element 1022b may be mutually modified to provide highly efficient light utilization in conjunction with other desired functionality (e.g., re-bounce mitigation, support for the desired fields of view, etc.). Advantageously, the above-noted light collimation provided by the light collimator 1302, and matching of the cross-sectional size and shape of the light beam 1032b with the size and shape of the in-coupling optical element 1022b, allows the in-coupling optical element 1022b to capture a large percentage of the incident light beam 1032b. The in-coupled light then propagates through the waveguide 1020b and is out-coupled to the eye 210.

[0293] As illustrated, the micro-display 1030b may include an array 1042 of light emitters 1044, each surrounded by non-light-emitting areas 1045 having a total width 1045w. In addition, the light emitters 1044 have a width W and a pitch P. In arrays in which the light emitters 1044 are regularly spaced, each light emitter 1044 and surrounding area 1045 effectively forms a unit cell having the width 1045w, which may be equal to the pitch P.

[0294] In some embodiments, the light collimators 1302 are micro-lenses disposed directly on and surrounding associated light emitters 1044. In some embodiments, the width of the micro-lenses 1302 is equal to 1045w, such that neighboring micro-lenses 1302 nearly contact or directly contact one another. It will be appreciated that light from the light emitters 1044 may fill the associated micro-lens 1302, effectively magnifying the area encompassed by the light emitter 1044. Advantageously, such a configuration reduces the perceptibility of the areas 1045 which do not emit light and may otherwise be visible as dark spaces to a user. That is, because each micro-lens 1302 effectively magnifies the associated light emitter 1044 such that it appears to extend across the entire area of the micro-lens 1302, the areas 1045 may be masked.
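The masking effect described above can be expressed as a simple fill-factor calculation: without a lenslet, the emitting area is only (W/P)² of each unit cell, while a lenslet of width 1045w makes the emitter appear to fill the cell. The emitter width and pitch below are hypothetical example values.

```python
from typing import Optional

def apparent_fill_factor(emitter_width_um: float, pitch_um: float,
                         lens_width_um: Optional[float] = None) -> float:
    """Fraction of the panel that appears to emit light: (aperture / pitch)^2, where
    the aperture is the emitter itself or, with a lenslet, the lenslet width."""
    aperture_um = lens_width_um if lens_width_um is not None else emitter_width_um
    return min(aperture_um / pitch_um, 1.0) ** 2

# Hypothetical 2 um wide emitters on an 8 um pitch:
print(apparent_fill_factor(2.0, 8.0))                     # 0.0625 -> dark gaps visible
print(apparent_fill_factor(2.0, 8.0, lens_width_um=8.0))  # 1.0    -> gaps masked by the lenslets
```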

[0295] With continued reference to FIG. 30A, the relative sizes of the light emitters 1044 and light collimators 1302 may be selected such that light from the light emitters 1044 fills the associated light collimators 1302. For example, the light emitters 1044 may be spaced sufficiently far apart such that micro-lens collimators 1302 having the desired curvature may be formed extending over individual ones of the light emitters 1044. In addition, as noted above, the size and shape of the in-coupling optical element 1022b is preferably selected such that it matches or exceeds the cross-sectional shape and size of the light beam 1032b when incident on that in-coupling optical element 1022b. Consequently, in some embodiments, a width 1025 of the in-coupling optical element 1022b is equal to or greater than the width of the micro-lens 1302 (which may have a width equal to 1045w or P). Preferably, the width 1025 is greater than the width of the micro-lens 1302, or 1045w or P, to account for some spread in the light beam 1032b. As discussed herein, the width 1025 may also be selected to mitigate re-bounce and may be shorter than the length (which is orthogonal to the width) of the in-coupling optical element 1022b. In some embodiments, the width 1025 may extend along the same axis as the direction of propagation of in-coupled light 1032b through the waveguide 1020b before being out-coupled for propagation to the eye 210.

[0296] With reference now to FIG. 30B, an example of a light projection system 1010 with multiple emissive micro-displays 1030a, 1030b, 1030c, and associated arrays 1300a, 1300b, 1300c of light collimators, respectively, is illustrated. The angular emission profiles of light emitted by the micro-displays 1030a, 1030b, 1030c are narrowed by the light collimator arrays 1300a, 1300b, 1300c, thereby facilitating the collection of a large percentage of the emitted light by the projection optics 1070 after the light propagates through the optical combiner 1050. The projection optics 1070 then directs the light to an eyepiece such as the eyepiece 1020 (e.g., FIGS. 11A and 12-14) (not shown).

[0297] FIG. 30C illustrates an example of a wearable display system with multiple emissive micro-displays 1030a, 1030b, 1030c, each with an associated array 1300a, 1300b, 1300c, respectively, of light collimators. The illustrated display system includes a plurality of micro-displays 1030a, 1030b, 1030c for emitting light with image information. As illustrated, the micro-displays 1030a, 1030b, 1030c may be micro-LED panels. In some embodiments, the micro-displays may be monochrome micro-LED panels, each configured to emit a different component color. For example, the micro-display 1030a may be configured to emit light 1032a which is red, the micro-display 1030b may be configured to emit light 1032b which is green, and the micro-display 1030c may be configured to emit light 1032c which is blue.

[0298] Each micro-display 1030a, 1030b, 1030c may have an associated array 1300a, 1300b, 1300c, respectively, of light collimators. The light collimators narrow the angular emission profile of light 1032a, 1032b, 1032c from light emitters of the associated micro-display. In some embodiments, individual light emitters have a dedicated associated light collimator (as shown in FIG. 30A).

[0299] With continued reference to FIG. 30C, the arrays 1300a, 1300b, 1300c of light collimators are between the associated micro-displays 1030a, 1030b, 1030c and the optical combiner 1050, which may be an X-cube. As illustrated, the optical combiner 1050 has internal reflective surfaces 1052, 1054 for reflecting incident light out of an output face of the optical combiner. In addition to narrowing the angular emission profile of incident light, the arrays 1300a, 1300c of light collimators may be configured to redirect light from associated micro-displays 1030a, 1030c such that the light strikes the internal reflective surfaces 1052, 1054 of the optical combiner 1050 at angles appropriate to propagate towards the associated light in-coupling optical elements 1022a, 1022c, respectively. In some embodiments, in order to redirect light in a particular direction, the arrays 1300a, 1300c of light collimators may comprise micro-lenses or reflective wells, which may be asymmetrical and/or the light emitters may be disposed off-center relative to the micro-lenses or reflective wells, as disclosed herein.

[0300] With continued reference to FIG. 30C, projection optics 1070 (e.g., projection lens) is disposed at the output face of the optical combiner 1050 to receive image light exiting from that optical combiner. The projection optics 1070 may comprise lenses configured to converge or focus image light onto the eyepiece 1020. As illustrated, the eyepiece 1020 may comprise a plurality of waveguides, each of which is configured to in-couple and out-couple light of a particular color. For example, waveguide 1020a may be configured to receive red light 1032a from the micro-display 1030a, waveguide 1020b may be configured to receive green light 1032b from the micro-display 1030b, and waveguide 1020c may be configured to receive blue light 1032c from the micro-display 1030c. Each waveguide 1020a, 1020b, 1020c has an associated light in-coupling optical element 1022a, 1022b, 1022c, respectively, for in-coupling light therein. In addition, as discussed herein, the waveguides 1020a, 1020b, 1020c may correspond to the waveguides 670, 680, 690, respectively, of FIG. 9B and may each have associated orthogonal pupil expanders (OPE's) and exit pupil expanders (EPE's), which ultimately out-couple the light 1032a, 1032b, 1032c to a user.

[0301] As discussed herein, the wearable display system incorporating micro-displays is preferably configured to output light with different amounts of wavefront divergence, to provide comfortable accommodation-vergence matching for the user. These different amounts of wavefront divergence may be achieved using out-coupling optical elements with different optical powers. As discussed herein, the out-coupling optical elements may be present on or in waveguides of an eyepiece such as the eyepiece 1020 (e.g., FIGS. 11A and 12-14). In some embodiments, lenses may be utilized to augment the wavefront divergence provided by the out-coupling optical elements or may be used to provide the desired wavefront divergence in configurations where the out-coupling optical elements are configured to output collimated light.

[0302] FIGS. 31A and 31B illustrate examples of eyepieces 1020 having lenses for varying the wavefront divergence of light to a viewer. FIG. 31A illustrates an eyepiece 1020 having a waveguide structure 1032. In some embodiments, as discussed herein, light of all component colors may be in-coupled into a single waveguide, such that the waveguide structure 1032 includes only the single waveguide. This advantageously provides for a compact eyepiece. In some other embodiments, the waveguide structure 1032 may be understood to include a plurality of waveguides (e.g., the waveguides 1032a, 1032b, 1032c of FIGS. 11A and 12-13A), each of which may be configured to relay light of a single component color to a user's eye.

[0303] In some embodiments, the variable focus lens elements 1530, 1540 may be disposed on either side of the waveguide structure 1032. The variable focus lens elements 1530, 1540 may be in the path of image light from the waveguide structure 1032 to the eye 210, and also in the path of light from the ambient environment through the waveguide structure 1032 to the eye 210. The variable focus optical element 1530 may modulate the wavefront divergence of image light outputted by the waveguide structure 1032 to the eye 210. It will be appreciated that the variable focus optical element 1530 may have optical power which may distort the eye 210's view of the world. Consequently, in some embodiments, a second variable focus optical element 1540 may be provided on the world side of the waveguide structure 1032. The second variable focus optical element 1540 may provide optical power opposite to that of the variable focus optical element 1530 (or opposite to the net optical power of the optical element 1530 and the waveguide structure 1032, where the waveguide structure 1032 has optical power), so that the net optical power of the variable focus lens elements 1530, 1540 and the waveguide structure 1032 is substantially zero.

[0304] Preferably, the optical power of the variable focus lens elements 1530, 1540 may be dynamically altered, for example, by applying an electrical signal thereto. In some embodiments, the variable focus lens elements 1530, 1540 may comprise a transmissive optical element such as a dynamic lens (e.g., a liquid crystal lens, an electro-active lens, a conventional refractive lens with moving elements, a mechanical-deformation-based lens, an electrowetting lens, an elastomeric lens, or a plurality of fluids with different refractive indices). By altering the variable focus lens elements' shape, refractive index, or other characteristics, the wavefront of incident light may be changed. In some embodiments, the variable focus lens elements 1530, 1540 may comprise a layer of liquid crystal sandwiched between two substrates. The substrates may comprise an optically transmissive material such as glass, plastic, acrylic, etc.

[0305] In some embodiments, in addition or as an alternative to providing variable amounts of wavefront divergence for placing virtual content on different depth planes, the variable focus lens elements 1530, 1540 and waveguide structure 1032 may advantageously provide a net optical power equal to the user's prescription optical power for corrective lenses. Thus, the eyepiece 1020 may serve as a substitute for lenses used to correct for refractive errors, including myopia, hyperopia, presbyopia, and astigmatism. Further details regarding the use of variable focus lens elements as substitutes for corrective lenses may be found in U.S. application Ser. No. 15/481,255, filed Apr. 6, 2017, the entire disclosure of which is incorporated by reference herein.
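Treating the stack paraxially, the powers of the two variable focus elements and the waveguide structure simply add, which gives the conditions described above: the world-side element cancels the eye-side power for an undistorted view, or the stack can be biased to a user's prescription. The sketch below assumes thin-lens power addition and uses illustrative diopter values.

```python
def world_side_power_diopters(eye_side_power: float,
                              waveguide_power: float = 0.0,
                              target_net_power: float = 0.0) -> float:
    """Power for the world-side element (e.g., element 1540) so that the net power of
    the eye-side element, waveguide structure, and world-side element equals the target."""
    return target_net_power - (eye_side_power + waveguide_power)

# Placing virtual content at ~2 m corresponds to roughly -0.5 D on the eye side:
print(world_side_power_diopters(-0.5))                         # +0.5 D -> net zero, undistorted world view
print(world_side_power_diopters(-0.5, target_net_power=-1.0))  # -0.5 D -> net equals a -1 D prescription
```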

[0306] With reference now to FIG. 31B, in some embodiments, the eyepiece 1020 may include static, rather than variable, lens elements. As with FIG. 31A, the waveguide structure 1032 may include a single waveguide (e.g., which may relay light of different colors) or a plurality of waveguides (e.g., each of which may relay light of a single component color). Similarly, the waveguide structure 1034 may include a single waveguide (e.g., which may relay light of different colors) or a plurality of waveguides (e.g., each of which may relay light of a single component color). One or both of the waveguide structures 1032, 1034 may have optical power and may output light with particular amounts of wavefront divergence, or may simply output collimated light.

[0307] With continued reference to FIG. 31B, the eyepiece 1020 may include static lens elements 1532, 1534, 1542 in some embodiments. Each of these lens elements is disposed in the path of light from the ambient environment through waveguide structures 1032, 1034 into the eye 210. In addition, the lens element 1532 is between the waveguide structure 1032 and the eye 210. The lens element 1532 modifies a wavefront divergence of light outputted by the waveguide structure 1032 to the eye 210.

[0308] The lens element 1534 modifies a wavefront divergence of light outputted by the waveguide structure 1034 to the eye 210. It will be appreciated that the light from the waveguide structure 1034 also passes through the lens element 1532. Thus, the wavefront divergence of light outputted by the waveguide structure 1034 is modified by both the lens element 1534 and the lens element 1532 (and the waveguide structure 1032 in cases where the waveguide structure 1032 has optical power). In some embodiments, the lens elements 1532, 1534 and the waveguide structure 1032 provide a particular net optical power for light outputted from the waveguide structure 1034.

[0309] The illustrated embodiment provides two different levels of wavefront divergence, one for light outputted from the waveguide structure 1032 and a second for light outputted by a waveguide structure 1034. As a result, virtual objects may be placed on two different depth planes, corresponding to the different levels of wavefront divergence. In some embodiments, an additional level of wavefront divergence and, thus, an additional depth plane may be provided by adding an additional waveguide structure between lens element 1532 and the eye 210, with an additional lens element between the additional waveguide structure and the eye 210. Further levels of wavefront divergence may be similarly added, by adding further waveguide structures and lens elements.

[0310] With continued reference to FIG. 31B, it will be appreciated that the lens elements 1532, 1534 and the waveguide structures 1032, 1034 provide a net optical power that may distort the user's view of the world. As a result, lens element 1542 may be used to counter the optical power and distortion of ambient light. In some embodiments, the optical power of the lens element 1542 is set to negate the aggregate optical power provided by the lens elements 1532, 1534 and the waveguide structures 1032, 1034. In some other embodiments, the net optical power of the lens element 1542; the lens elements 1532, 1534; and the waveguide structures 1032, 1034 is equal to a user's prescription optical power for corrective lenses.

Display System with Low Motion-to-Photon Latency

[0311] As described herein, an LCoS panel may be utilized as a spatial light modulator in display systems. However, due to, e.g., the time needed to change the orientation of the liquid crystals of the LCoS, the LCoS panel may be limited to a relatively low maximum refresh rate. As described above, this maximum refresh rate may be about 330 Hz in some cases. Thus, and as described herein, this maximum refresh rate may cause undesirable visible display artifacts.

[0312] As an example, with respect to motion-to-photon latency, the virtual content may be configured such that it is perceived as being placed within a real world. The display system may use information generated by one or more orientation sensors (e.g., inertial measurement units (IMUs)) to determine, at least in part, a head pose associated with the user. A head pose may inform an orientation of the user's head within three-dimensional space. This head pose may thus inform generation of virtual content. For example, as the user rotates his/her head about an axis, the virtual content should be adjusted accordingly such that the virtual content does not appear to move. As described above, motion-to-photon latency may indicate a time from which a user's pose is determined to a time at which light forming virtual content adjusted based on the movement is outputted to the user's eyes. The maximum refresh rate may thus limit an extent to which this time may be reduced. Therefore, as the user moves his/her head, motion-to-photon latency may be perceptible.

[0313] As another example, there may be perceptible motion blur associated with the presented virtual content as perceived by a user. As described above, it may be appreciated that persistence of presented virtual content may relate to motion blur. Persistence may indicate the time for which a frame of virtual content is output to a user, within the interval from the start of output of that frame to the output of a subsequent frame. As utilized herein, a duty cycle may indicate a percentage of time for which a backlight (e.g., LEDs) outputs light. Thus, the duty cycle may be based on the persistence and frame rate associated with presentation of the virtual content. For example, the duty cycle may be substantially similar to the persistence divided by the time for each frame. Increasing persistence may increase a perceived brightness, for example because the duty cycle correspondingly increases. However, increasing persistence may have the deleterious effect of increasing motion blur. Thus, it may be advantageous to decrease persistence. However, with an LCoS panel, the reduction in brightness that accompanies a decreased persistence may render the presented virtual content not lifelike.
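The brightness/motion-blur trade-off above follows directly from the stated relationship of duty cycle being approximately persistence divided by frame time. A minimal sketch, with illustrative numbers only:

```python
def duty_cycle(persistence_ms: float, frame_rate_hz: float) -> float:
    """Duty cycle as persistence divided by the frame time, per the relationship above."""
    frame_time_ms = 1000.0 / frame_rate_hz
    return persistence_ms / frame_time_ms

# Illustrative numbers: at 120 Hz the frame time is ~8.3 ms.
print(duty_cycle(persistence_ms=8.3, frame_rate_hz=120.0))  # ~1.0  -> bright, but more motion blur
print(duty_cycle(persistence_ms=2.0, frame_rate_hz=120.0))  # ~0.24 -> less blur, dimmer unless peak output rises
```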

[0314] Described herein are examples of display systems which overcome, at least, the above-described example problems. In the example embodiments herein, the display systems are described as utilizing micro-LEDs for ease of discussion. As described above, micro-LEDs may be capable of switching (e.g., turning on and off) at high speeds (e.g., 2000 Hz, 2500 Hz, and so on). Additionally, micro-LEDs may be emissive. In some embodiments, each pixel of a virtual content frame may be separately addressable. Thus, micro-LEDs may replace an LCoS panel. While micro-LEDs are described in some particular embodiments, it will be understood that additional display technologies may be leveraged. For example, digital light processing (DLP) displays, organic LED (OLED) technology, and so on, may optionally be utilized. In some embodiments, the spatial light modulator may be a DLP panel, an OLED array, etc. as discussed herein.

[0315] FIG. 32 illustrates a block diagram of an example spatial light modulator 3200 according to some embodiments. It will be appreciated that the spatial light modulator 3200 may include optical elements for providing spatially modulated light and electronics for, e.g., operating the optical elements and various other processing, as disclosed herein. The spatial light modulator 3200 may be included in a display system (e.g., the display system 60, FIG. 9E), for example as part of the display system worn on a user's head (e.g., the display unit 70, FIG. 9E). In some embodiments, the spatial light modulator 3200 may take the form of a panel comprising an array of pixels. As illustrated, the spatial light modulator 3200 preferably includes an orientation sensor 3202 (e.g., an inertial measurement unit (IMU), eye tracking cameras, and the like), a warp engine 3204, and on-panel control logic 3206. For example, these various elements may share a common substrate, e.g., a common circuit board or other support with electrical interconnections. The orientation sensor 3202, warp engine 3204, and on-panel control logic 3206 will be described in more detail below, however, it will be understood that one or more of these elements may be physically separate from the spatial light modulator 3200 in some embodiments. For example, the elements may be included within other parts of the display system. In this example, the elements may communicate with the modulator 3200 via one or more connections.

[0316] With continued reference to FIG. 32, the local processing & data module 140 (FIG. 9E) may generate rendered content 3222 for presentation via a display unit. The local processing & data module 140 may include processing elements, such as a graphics processing unit, a central processing unit, and so on. Using, for example, a graphics processing unit 3220, the local processing & data module 140 may generate a rendered frame of virtual content for presentation to a user. As illustrated, an eye 3210 of the user receives spatially-modulated light 3212 encoded with image information corresponding to the virtual content. The spatially modulated light 3212 is provided based on operations of the spatial light modulator 3200. While not illustrated, it will be understood that the light 3212 may be routed to the eye 3210 through one or more optical elements (e.g., combiners, collimating optics, focus manipulating optics, and so on). Additionally, emissive display technologies may be utilized to generate the spatially-modulated light 3212, such as micro-LEDs.

[0317] The local processing & data module 140 may be separate from the display unit, and in communication with the display unit via a data link 130. As an example, the data link 130 may represent a physical connection between the modulator 3200 and module 140 (e.g., via one or more cables). As another example, the data link 130 may be a wireless connection, for example provided via WiFi (e.g., 802.11ad, 802.11ay), and so on.

[0318] Thus, the spatial light modulator 3200 and local processing & data module 140 may utilize the data link 130 to route information between each other. For example, the spatial light modulator 3200 may provide orientation information 3208 to the local processing & data module 140. The orientation information 3208 may be generated based on an inertial measurement unit 3202, eye tracking cameras, and the like. As described above, the orientation information 3208 may inform a head pose associated with the display unit, eye gaze of the user, and the like. As an example, the orientation information 3208 may be utilized to determine a translation or rotation about one or more axes. The local processing & data module 140 may utilize this orientation information 3208 to generate virtual content.

[0319] For example, the graphics processing unit 3220 may generate virtual content for presentation within a virtual or real-world environment of the user. The virtual content may be configured for a particular placement relative to the user, and the graphics processing unit 3220 may render each frame based on a head pose of the user. It will be appreciated that the graphics processing unit 3220 may render content 3222 at a particular frame rate (e.g., 60 Hz, 330 Hz), or up to the particular frame rate. The particular frame rate may be based on, e.g., constraints regarding power usage, heat generation, and so on of the display system. Additionally, the rendered content 3222 may be rendered at high quality, for example with realistic lighting, shading, polygons, and so on. The particular frame rate may be selected to balance rendering high quality virtual content with the constraints indicated above. Thus, the graphics processing unit 3220 may utilize the received orientation information 3208 to periodically generate rendered content 3222. The rendered content 3222 may then be provided via the data link 130 to the spatial light modulator 3200.

[0320] As described above, a motion-to-photon latency may indicate a time from which a particular orientation or pose of the user is determined (e.g., via the orientation sensor 3202) to a time at which light 3212 forming virtual content which incorporates the detected movement is presented to the user's eye 3210. Since the graphics processing unit 3220 may output rendered content 3222 at the particular frame rate described above, the motion-to-photon latency may be noticeable to the user. With respect to the example of the rendered content 3222 being provided at 60 Hz, the motion-to-photon latency may be 16 milliseconds or more.
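It will be appreciated that the relationship above between render frame rate and worst-case latency is simple frame-period arithmetic. The following is a minimal, purely illustrative sketch of that arithmetic; the function name and printed values are assumptions for illustration and not part of the disclosed embodiments.

```python
def frame_period_ms(frame_rate_hz: float) -> float:
    """Return the frame period in milliseconds for a given frame rate."""
    return 1000.0 / frame_rate_hz

# Example render frame rates discussed above; at 60 Hz the period is ~16.7 ms,
# consistent with a motion-to-photon latency of roughly 16 ms or more when new
# poses are only incorporated once per rendered frame.
for rate_hz in (60.0, 330.0):
    print(f"{rate_hz:>5.0f} Hz -> {frame_period_ms(rate_hz):.2f} ms per rendered frame")
```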

[0321] Per the example above, after a frame of rendered content 3222 is provided to the eye 3210, the user's eye 3210 may thus not receive a subsequent frame for 16 ms. As will be appreciated, the user may have moved his/her body about one or more axes prior to receipt of the subsequent frame. For example, the user may have rotated his/her head, stepped closer to or farther from virtual content, and so on. Thus, to incorporate more recent orientation information, the example frame may be presented two or more times to the eye 3210. Each presentation may vary based on updated orientation information. As described above, the rendered content 3222 may be presented at a render frame rate (e.g., 60 Hz, 330 Hz) from the graphics processing unit 3220 via the data link 130. Each rendered frame may be adjusted one or more times based on orientation information. The same image information included in each rendered frame may therefore be output to the user's eye 3210 two or more times.

[0322] As discussed herein, where there is a change in user pose, each rendered frame of rendered content 3222 associated with the timing of that change in pose may be warped according to updated orientation information generated by the orientation sensor 3202. The orientation sensor 3202 may generate updated orientation information at greater than a threshold frequency (e.g., 2000 Hz, 3000 Hz, 5000 Hz, and so on). In the illustrated example, a warp engine 3204 is included in the spatial light modulator 3200. This warp engine 3204 may represent a processing element which performs a warp process on a received rendered frame. For example, the warp engine 3204 may be a hardware ASIC or field programmable gate array (FPGA) designed to perform the warp process. As another example, the warp engine 3204 may represent software executing on one or more processors forming part of the spatial light modulator 3200. The warp engine 3204 may thus generate warped frames by updating a rendered frame based on information received from the orientation sensor 3202. The warp engine 3204 may then output the warped frames at a warp frame rate. For example, the warp frame rate may be substantially higher than the render frame rate (e.g., 640 Hz, 666 Hz, 1000 Hz, 2000 Hz, and so on).
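It will be appreciated that the warp process itself may take many forms. The following is an illustrative, non-limiting sketch (not the hardware ASIC implementation) in which a small rotational change in head pose is approximated as a two-dimensional shift of the rendered frame; the function name, the pixels-per-radian scale, and the use of a simple shift are assumptions made here for illustration only.

```python
import numpy as np

def warp_frame(frame: np.ndarray,
               yaw_delta_rad: float,
               pitch_delta_rad: float,
               px_per_rad: float) -> np.ndarray:
    """Approximate a small rotational reprojection as a 2D pixel shift.

    A production warp engine could instead apply a full homography or
    per-pixel reprojection; this shift is illustrative only.
    """
    dx = int(round(yaw_delta_rad * px_per_rad))    # horizontal shift from yaw change
    dy = int(round(pitch_delta_rad * px_per_rad))  # vertical shift from pitch change
    return np.roll(frame, shift=(dy, dx), axis=(0, 1))
```

In such a sketch, the deltas would be the difference between the head pose used to render the frame and the most recent pose reported by the orientation sensor 3202.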

[0323] Advantageously, the warp engine 3204 may be positioned proximate to on-panel control logic 3206 associated with the display technologies described herein. As an example, the on-panel control logic 3206 may address particular micro-LEDs and cause the addressed micro-LEDs to output light 3212. This placement of the warp engine 3204 may provide various advantages in comparison to placing the warp engine 3204 at the local processing & data module 140.

[0324] For example, in such a scheme with the warp engine 3204 at the local processing & data module 140, the local processing & data module 140 may need to output warped content over the data link 130 at the substantially higher warp frame rate. Thus, the data link 130 may require a corresponding increase in bandwidth between the module 140 and the spatial light modulator 3200. This increase in bandwidth may limit the flexibility of the module 140. For example, the display system (e.g., display system 60) may utilize more power, since the power expended by the display system to maintain the data link 130 at the required speeds would similarly increase. Indeed, a cable connecting the local processing & data module 140 and the display unit may be required to be powered itself. In this way, such a placement may reduce the battery life of the display system. Similarly, such a placement may require an increased battery size, which may add weight, cost, and so on, to the display system.

[0325] Including the warp engine 3204 as part of the spatial light modulator 3200 also provides advantages over locating the local processing & data module 140 physically closer to the display unit. Such relocation may undesirably reduce usability of the display unit described herein. As an example, the local processing & data module 140 may be placed on the display unit 70 (FIG. 9E) itself. This may add weight, bulk, and so on to the piece worn on the user's head. As another example, a cable connecting the local processing & data module 140 and the display unit may be required to be thicker. Additionally, the cable may be more fragile, such that the display unit may be less usable to end-users.

[0326] Since the warp engine 3204, as described above, may optionally be a hardware ASIC, the warp engine 3204 may have thermal design power (TDP) below a threshold. The placement of the warp engine 3204 within the display unit may therefore avoid decreases in usability of the display unit due to heat, requirement of fans or increased fans, and so on. Thus, the placement may advantageously address the problems described herein, while enabling substantial reductions in motion-to-photon latency.

[0327] As described herein, the warp engine 3204 may generate warped frames for output via the eyepiece 270 of the display unit 70 (FIG. 9E). In some embodiments, the warp engine 3204 may generate full image frames. For example, the full image frames may include warped image information associated with a rendered frame of the rendered content 3222. These full image frames may then be utilized by the spatial light modulator 3200 to output light 3212 which forms the full image frames in the eye 3210 of the user. In some embodiments, the warp engine 3204 may instead generate information indicating adjustments to be made to a rendered frame of the rendered content 3222. An example adjustment may include shifts to be made to pixels within the rendered frame. As described below with respect to FIGS. 33A-33B, the on-panel control logic may thus implement the adjustments. In this way, the bandwidth required between the warp engine 3204 and on-panel control logic 3206 may be reduced.

[0328] FIG. 33A illustrates a block diagram of another example spatial light modulator 3302 according to some embodiments. In this example, a display unit 3300 includes an orientation sensor 3202, such as an inertial measurement unit (IMU), and further includes the spatial light modulator 3302. As described in FIG. 32, the spatial light modulator 3302 may include on-panel control logic 3206 which includes a warp engine 3204 that performs an example warp process. The spatial light modulator 3302 may warp received rendered frames of rendered content 3222 using the warp engine 3204. The spatial light modulator 3302 may then control display elements, such as micro-LEDs, to output light 3212 forming the directly warped frames.

[0329] In this example, the on-panel control logic 3206 may store a currently received rendered frame of rendered content 3222 from the local processing & data module 140. As described above, the local processing & data module 140 may generate rendered frames of rendered content 3222 at 60 Hz, 330 Hz, and so on. Thus, the on-panel control logic 3206 may store a current rendered frame for about 16.7 ms, 3 ms, and so on, respectively. The on-panel control logic 3206 may receive periodic updates from the orientation sensor 3202 regarding orientation information of the display unit 3300. As described above, the orientation information may be utilized to warp the current rendered frame.

[0330] For example, the on-panel control logic may include a warp engine 3204. The warp engine 3204 may utilize the information from the orientation sensor 3202 to generate information sufficient to adjust pixels of the current rendered frame. This generated information may reflect shifted pixel values. The generated information may also reflect one or more transforms to be applied to pixels of the current rendered frame. In some embodiments, the generated information may comprise a table indicating shifts, or adjustments, to be made to particular pixels or groups of pixels (e.g., referenced according to a sub-region of the current rendered frame, and so on).
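As an illustrative, non-limiting sketch of the kind of adjustment information described above, the warp engine 3204 might emit a compact per-region shift table rather than full warped pixel data. The tile granularity, field names, and the uniform shift used here are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TileShift:
    """Shift to apply to one sub-region (tile) of the current rendered frame."""
    tile_row: int
    tile_col: int
    dx_px: int
    dy_px: int

def build_shift_table(yaw_delta_rad: float, pitch_delta_rad: float,
                      tiles_x: int, tiles_y: int,
                      px_per_rad: float) -> List[TileShift]:
    """Build per-tile shifts from a change in head orientation.

    Here every tile receives the same rotational shift; a real table could
    vary per region, e.g., to fold in optical distortion corrections.
    """
    dx = int(round(yaw_delta_rad * px_per_rad))
    dy = int(round(pitch_delta_rad * px_per_rad))
    return [TileShift(r, c, dx, dy)
            for r in range(tiles_y) for c in range(tiles_x)]
```

Transmitting such a table to the on-panel control logic 3206 for each orientation update would involve far less data than retransmitting the full frame, consistent with the bandwidth reduction noted in paragraph [0327].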

[0331] The on-panel control logic 3206 may utilize the information generated by the warp engine 3204 to manipulate the above-described current rendered frame. As an example, for one or more periodic updates received from the orientation sensor, the warp engine 3204 may generate adjustment information. The on-panel control logic 3206 may then warp the current rendered frame accordingly. In this way, the on-panel control logic may generate a multitude of warped frames.

[0332] FIG. 33B illustrates another block diagram of the example spatial light modulator 3302. In this example, the display unit 3300 includes a gaze predictor 3304. The gaze predictor 3304 may receive information from one or more cameras monitoring the eyes of the user. The cameras may obtain periodic images of the eyes, and utilizing computer vision or machine learning based techniques, determine an orientation associated with the pupils of the eyes. Utilizing the orientation, the gaze predictor 3304 may determine a three-dimensional fixation point associated with a gaze of the user. The three-dimensional fixation point may represent an intersection in three-dimensional space of vectors extending from the pupils. The gaze predictor 3304 may thus monitor a gaze of the user.
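A common way to estimate such a three-dimensional fixation point, sketched below for illustration only (and not necessarily the method used by the gaze predictor 3304), is to take the midpoint of closest approach between the two gaze rays; the function name and fallback handling are assumptions.

```python
import numpy as np

def fixation_point(origin_l, dir_l, origin_r, dir_r):
    """Estimate a 3D fixation point from two gaze rays (one per eye).

    Returns the midpoint of the segment of closest approach between the rays.
    Directions need not be normalized.
    """
    o1, o2 = np.asarray(origin_l, float), np.asarray(origin_r, float)
    d1, d2 = np.asarray(dir_l, float), np.asarray(dir_r, float)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # nearly parallel gaze rays (looking far away)
        t1, t2 = 0.0, e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2
    return (p1 + p2) / 2.0
```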

[0333] The warp engine 3204 may use information received from the gaze predictor 3304 (e.g., determined gazes) to inform warping of rendered content 3222. For example, the warp engine 3204 may utilize outputs of the gaze predictor 3304 and the inertial measurement unit 3202 as respective signals. These signals may be aggregated, for example according to one or more stored models. Determining gaze may provide increased accuracy with respect to warping of the rendered content 3222, since changes in the gaze of the user may change the perspective from which the user views virtual content and, thus, may change the desired warping of the rendered content 3222.

[0334] FIG. 34A illustrates a diagram of an example scheme to update pixels of a spatial light modulator according to some embodiments. As described above, a spatial light modulator may modulate light outputted to a user by, e.g., changing the intensity of that light at different locations or pixels across the spatial light modulator. As a result, an image of virtual content may be output to a user. As described herein, an example display technology to generate such light may include micro-LEDs. It will be appreciated that micro-LEDs may be capable of switching at high speed. For example, a micro-LED may be capable of being refreshed at 2000 Hz or more. As will be described below, the spatial light modulator may utilize different schemes that leverage this high speed to output light to a user.

[0335] In some embodiments, each pixel of a frame of virtual content may be associated with one or more micro-LEDs. As an example, there may be a plurality of micro-LEDs, e.g., three micro-LEDs, for each pixel (e.g., red, green, blue). To generate light which forms the frame, the spatial light modulator (e.g., on-panel control logic 3206) may provide information to the micro-LEDs associated with each pixel. The provided information may be utilized to control one or more of the brightness of each micro-LED, the duration of each micro-LED being turned on, and so on. In some embodiments, the spatial light modulator may separately address each pixel and its associated micro-LEDs.

[0336] With continued reference to FIG. 34A, on-panel control logic 3206 is illustrated as providing a global update 3402 to the array of pixels of the spatial light modulator. As described above, the on-panel control logic 3206 may separately address micro-LEDs which output light to form pixels of a rendered frame. Thus, in this example, the on-panel control logic 3206 may trigger micro-LEDs, which form pixels of a frame of virtual content, to be globally updated based on the rendered frame. The global update may simultaneously update each of the pixels of the spatial light modulator. The update may change, for example, the intensity of light emitted by the micro-LED and/or the duration of this light emission. This updating may be performed at a particular refresh rate, such as at 2000 Hz. In some embodiments, each pixel may be associated with a plurality of micro-LEDs, e.g., three micro-LEDs. Thus, each pixel may be globally updated at 2000 Hz and the resulting virtual content may be presented at the global update rate divided by the number of micro-LEDs. Where there are three micro-LEDs, the virtual content is effectively updated at a refresh rate of about 666 Hz. In some embodiments, the update may only apply to pixels associated with virtual content and may skip pixels that are not associated with virtual content. For example, it will be appreciated that virtual content may not occupy an entire frame. In some embodiments, only pixels in a pixel array providing the virtual content are updated.
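The effective-rate arithmetic above, and the option of skipping pixels without virtual content during a global update, may be sketched as follows for illustration only; the helper names and the use of a boolean content mask are assumptions.

```python
import numpy as np

def effective_refresh_hz(global_update_hz: float, leds_per_pixel: int) -> float:
    """E.g., 2000 Hz global updates with 3 color micro-LEDs per pixel -> ~666.7 Hz frames."""
    return global_update_hz / leds_per_pixel

def global_update(panel: np.ndarray, frame: np.ndarray, content_mask: np.ndarray) -> None:
    """Update all panel pixels at once, but only where the frame carries virtual content."""
    panel[content_mask] = frame[content_mask]

print(effective_refresh_hz(2000.0, 3))  # approximately 666.7
```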

[0337] Advantageously, performing a global update 3402 may limit visual artifacts associated with the presentation of virtual content to a user. As will be described below with respect to FIG. 34B, in some embodiments the on-panel control logic 3206 may cause a scanning update to be performed. In such a scanning update, the micro-LEDs may be updated sequentially. This updating may introduce visible scanning artifacts, which may decrease the visual fidelity or viewing comfort of presented virtual content. In the example of FIG. 34A, the global update 3402 may avoid these scanning artifacts.

[0338] While the global updating 3402 may avoid such artifacts, the bandwidth required for communicating with the on-panel control logic 3206 may, in some embodiments, be substantial. For example, and as described with respect to FIGS. 32-33B, the on-panel control logic 3206 may present warped frames of virtual content at a warp frame rate (e.g., 666 Hz, 2000 Hz, and so on). Thus, the on-panel control logic 3206 may require a bandwidth equivalent to the image information in each frame multiplied by the warp frame rate. This large bandwidth may undesirably utilize a large amount of power. In some embodiments, the on-panel control logic 3206 may utilize the scanning updates shown in FIG. 34B to reduce the utilization of system resources.

[0339] As discussed regarding FIGS. 33A-33B, the on-panel control logic 3206 may also include functionality to warp frames of rendered content. For example, the on-panel control logic 3206 may receive a frame of rendered content (e.g., from the local processing & data module 140) as described above. The on-panel control logic 3206 may then receive updates regarding user pose from, at least, an orientation sensor. Using these updates, the on-panel control logic 3206 may warp the received rendered frame a threshold number of times before receipt of a subsequent frame of rendered content. Relative to providing rendered content at a rate equivalent to the rate at which warped frames are generated, this on-panel warping lowers the bandwidth required for providing content to the on-panel control logic 3206. In some embodiments, the bandwidth may be similar to the image information included in a rendered frame of content multiplied by a render frame rate (e.g., 60 Hz, 330 Hz).
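The bandwidth comparison above can be made concrete with a short, purely illustrative calculation; the resolution, bit depth, and rates below are assumptions chosen for this example and are not taken from the disclosure.

```python
def raw_link_bandwidth_gbps(width_px: int, height_px: int,
                            bits_per_pixel: int, frame_rate_hz: float) -> float:
    """Raw (uncompressed) bandwidth needed to carry frames at a given rate."""
    return width_px * height_px * bits_per_pixel * frame_rate_hz / 1e9

# Assumed example parameters: 1920x1080 frames at 24 bits per pixel.
W, H, BPP = 1920, 1080, 24

send_warped_frames = raw_link_bandwidth_gbps(W, H, BPP, 2000.0)  # warp off-panel, ship 2000 Hz frames
send_rendered_only = raw_link_bandwidth_gbps(W, H, BPP, 60.0)    # warp on-panel, ship 60 Hz frames

print(f"warp off-panel, 2000 Hz frames over link: {send_warped_frames:.1f} Gbps")
print(f"warp on-panel,    60 Hz frames over link: {send_rendered_only:.2f} Gbps")
```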

[0340] Due to the fast switching capability of micro-LEDs, the persistence 3406 of the micro-LEDs may be low (e.g., 0.4 ms, 0.5 ms, or 0.6 ms). As illustrated in FIG. 34A, a micro-LED 3404 may quickly turn on (e.g., the LED rise time 3408), be on for the persistence 3406, and then quickly turn off (e.g., the LED fall time 3410). As described above, the persistence 3406 may represent a time for which the pixels are on when presenting a frame of virtual content. Since the user is receiving frames of virtual content at a rapid rate (e.g., the warp frame rate, such as at 2000 Hz), the persistence 3406 may be short.

[0341] Advantageously, due to the low persistence 3406, the techniques described herein may reduce motion blur associated with viewing virtual content. It will be appreciated that an increase in persistence may cause a corresponding increase in motion blur. Previous techniques have utilized a reduction in persistence to combat motion blur, such as for LCoS panels. However, reducing the persistence may also reduce the duty cycle of the panel. Due to this reduced duty cycle, the brightness perceived by a user will undesirably be reduced.

[0342] In contrast, the micro-LEDs described herein may have low persistence while maintaining a high duty cycle. In the example of FIG. 34A, the persistence may be about 0.5 milliseconds and the duty cycle may be 99%. Thus, the user may be presented with virtual content that is advantageously perceived to be bright. Additionally, the high duty cycle provides a desirably efficient conversion of power to brightness.
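For illustration only, the relationship between persistence, frame period, and duty cycle implied above may be sketched as simple arithmetic; the function name and the comparison case are assumptions.

```python
def duty_cycle(persistence_ms: float, frame_period_ms: float) -> float:
    """Fraction of each frame period during which the emitters are on."""
    return persistence_ms / frame_period_ms

# ~0.5 ms persistence at a ~0.5 ms frame period (2000 Hz warp frame rate) keeps the
# duty cycle near 100%, whereas the same 0.5 ms persistence at a 60 Hz frame period
# (~16.7 ms) would leave the emitters on only ~3% of the time, dimming the image.
print(f"{duty_cycle(0.5, 0.5):.0%}")        # ~100%
print(f"{duty_cycle(0.5, 1000 / 60):.0%}")  # ~3%
```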

[0343] FIG. 34B illustrates a diagram of additional example schemes to update pixels of a spatial light modulator according to some embodiments. In these examples, the on-panel control logic 3206 may update pixels of a panel 3422 via sequential scanning updates 3420. It will be appreciated that the panel 3422 may be a spatial light modulator comprising an array of pixels. The panel 3422 may be controlled via the on-panel control logic 3206 and pixels of the panel may comprise micro-LEDs in some embodiments.

[0344] In the illustrated example, two types of scanning updates are presented. With respect to scanning update A 3424, the on-panel control logic 3206 may cause single pixels of the panel 3422 to be sequentially updated. As an example, the on-panel control logic 3206 may cause a pixel at an upper left of the panel 3422 (e.g., as illustrated) to update. The logic 3206 may then cause an adjacent pixel (e.g., to the right, such as in a same row) to be updated. The logic 3206 may thus scan across a row, and then descend to a next row. Optionally, the on-panel control logic 3206 may skip pixels which are not associated with virtual content. For example, it will be appreciated that virtual content may not occupy an entire frame. In some embodiments, virtual content may be sparse such that only certain pixels in a pixel array of the panel 3422, the pixels corresponding to virtual content, are utilized and updated.

[0345] Optionally, for the scanning update A 3424, the on-panel control logic 3206 may scan based on a foveated region associated with a frame of virtual content. For example, the display system may determine a fixation point at which a user is fixating. Virtual content falling within a threshold angular distance of this fixation point may be identified as falling on a fovea of the user. It will be appreciated that the user may have heightened visual acuity for content falling on the fovea. In some embodiments, the display system may be configured to preferentially update pixels for this content falling on the fovea. In some embodiments, pixels for content falling on the fovea may be updated at a higher rate than pixels for content falling on peripheral regions of the retina. In some other embodiments, only pixels for content falling on the fovea are updated. In some embodiments, the on-panel control logic 3206 may initiate scanning at a pixel included in the foveated region. For example, the logic 3206 may initiate scanning at an upper left pixel of the foveated region.

[0346] With respect to scanning update B 3426, the on-panel control logic 3206 may cause sequential updates in different locations of the panel 3422 at the same time. For example, the on-panel control logic 3206 may update the panel in multiple waves. In this parlance, a single wave may thus represent scanning update A, with scanning update B having multiple simultaneous instances of scanning update A. As illustrated, the on-panel control logic 3206 is updating the panel 3422 in ten waves. It will be appreciated that fewer than, or more than, this number of waves may be utilized in some embodiments. The on-panel control logic 3206 may optionally assign each pixel of the panel 3422 to a particular wave. The on-panel control logic 3206 may then sequentially update the pixels assigned to a same wave. As illustrated, the waves may be updated in parallel. Due to the multiple waves, scanning update B may cause less evident scanning artifacts as compared to scanning update A.
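As an illustrative, non-limiting sketch of the two scanning schemes (a simulation of the update order rather than panel driver code), scanning update A may be modeled as a single row-major scan that skips pixels without virtual content, and scanning update B as several horizontal bands, one per wave, advanced in lockstep; the band-based wave assignment and helper names are assumptions.

```python
def scan_update_a(panel, frame, content_mask, update_pixel):
    """Scanning update A: sequentially update pixels row by row,
    skipping pixels that carry no virtual content."""
    rows, cols = frame.shape[:2]
    for r in range(rows):
        for c in range(cols):
            if content_mask[r, c]:
                update_pixel(panel, r, c, frame[r, c])

def scan_update_b(panel, frame, content_mask, update_pixel, n_waves: int = 10):
    """Scanning update B: split the panel into n_waves horizontal bands ("waves")
    and advance one pixel in every wave per step, so the waves scan in parallel."""
    rows, cols = frame.shape[:2]
    band = (rows + n_waves - 1) // n_waves
    # Row-major scan order within each wave's band of rows.
    orders = [[(r, c) for r in range(w * band, min((w + 1) * band, rows))
               for c in range(cols)] for w in range(n_waves)]
    for step in range(max(len(order) for order in orders)):
        for order in orders:                  # one step of every wave per iteration
            if step < len(order):
                r, c = order[step]
                if content_mask[r, c]:
                    update_pixel(panel, r, c, frame[r, c])
```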

[0347] Similar to the above discussion regarding foveated regions, the on-panel control logic 3206 may cause certain pixels assigned to each wave to be updated based on their inclusion in a foveated region. Optionally, the on-panel control logic 3206 may rapidly assign, and de-assign, pixels from a wave. For example, the on-panel control logic 3206 may increase a number of waves in a foveated region. Thus, if the pixels in a foveated region were assigned to a first number of waves, the logic 3206 may increase this to a second, higher, number of waves. Thus, the pixels in the foveated region may be updated more quickly. In some embodiments, pixels for content falling on the fovea may be updated at a higher rate than pixels for content falling on peripheral regions of the retina. In some other embodiments, only pixels for content falling on the fovea are updated in waves.

[0348] In the scanning updates 3420 described herein, the on-panel control logic 3206 may scan at a particular scanning rate. This particular scanning rate may, as an example, be higher than the maximum refresh rate described above. As an example, the maximum refresh rate may be 2000 Hz. Thus, each pixel (e.g., the associated micro-LEDs) may be updated no faster than every 0.5 milliseconds (e.g., 2000 Hz). However, since the pixels are sequentially scanned, the on-panel control logic 3206 may provide information to two adjacent pixels within a shorter time span than 0.5 milliseconds. As a result, the scanning updates 3420 may still achieve the example benefits described herein. For example, motion-to-photon latency may be reduced. As another example, motion blur may be reduced as discussed herein.

[0349] FIG. 35 illustrates a flowchart of an example process 3500 for outputting a warped frame of rendered content according to the techniques described herein. For convenience, the process 3500 will be described as being performed by a display system comprising one or more processors (e.g., the wearable display system 60, FIG. 9E).

[0350] At block 3502, the display unit receives a rendered frame of virtual content. As described above, with respect to at least FIG. 32, the display system may utilize a graphics processing unit (GPU) to generate frames of virtual content. For example, the GPU may output frames at 60 Hz, 330 Hz, and so on.

[0351] At block 3504, the display unit determines an updated head pose of a user of the display unit. The display unit may utilize an orientation sensor (e.g., an inertial measurement unit (IMU)), optionally with a gaze detector, to determine head poses of the user. As the user moves and changes their pose, the virtual content may thus be updated based on the movement.

[0352] At block 3506, the display unit warps the rendered frame based on the determined head pose. As described above, the display unit may utilize a processor, hardware ASIC, and so on, to warp the rendered frame. The warped rendered frame may thus be warped according to a most recent determination as to head pose.

[0353] At block 3508, the display unit outputs or presents the warped frame to the user. The display unit may output the warped frame to the user according to the techniques described herein. For example, the display unit may utilize micro-LEDs to output spatially modulated light to display the warped frame.

[0354] Blocks 3504, 3506, 3508 may be repeated one or more times before the display unit receives another rendered frame of virtual content. Thus, the display unit may determine a new head pose, warp the rendered frame based on the new head pose, and output the new warped frame one or more times before receiving another rendered frame. The display unit may thus output warped frames at greater than a threshold frequency (e.g., 666 Hz, 2000 Hz, and so on), which may be higher than the rate that rendered frames are provided to the display unit.
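For illustration only, blocks 3502-3508 and their repetition may be sketched as the following loop; the helper callables and the warps-per-frame count are assumptions, not the disclosed hardware behavior.

```python
def run_display_loop(receive_rendered_frame, read_head_pose, warp, present,
                     warps_per_rendered_frame: int) -> None:
    """Blocks 3502-3508: warp and present each rendered frame several times,
    using progressively fresher head poses, before the next rendered frame arrives."""
    while True:
        frame, render_pose = receive_rendered_frame()   # block 3502 (e.g., ~60 Hz)
        for _ in range(warps_per_rendered_frame):       # e.g., ~33 iterations at a 2000 Hz warp rate
            pose = read_head_pose()                     # block 3504 (IMU, optionally gaze)
            warped = warp(frame, render_pose, pose)     # block 3506
            present(warped)                             # block 3508 (micro-LED output)
```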

Additional Considerations

[0355] The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of specific embodiments are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the described embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense.

[0356] The various aspects, implementations, or features of the described embodiments may be used separately or in any combination. Various aspects of the described embodiments may be implemented by software, hardware or a combination of hardware and software. The described embodiments may also be embodied as computer readable code on a computer readable medium for controlling manufacturing operations or as computer readable code on a computer readable medium for controlling a manufacturing line. The computer readable medium is any data storage device that may store data, which may thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, HDDs, DVDs, magnetic tape, and optical data storage devices. The computer readable medium may also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.

[0357] Thus, each of the processes, methods, and algorithms described herein and/or depicted in the figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions. For example, computing systems may include computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth. A code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language. In some embodiments, particular operations and methods may be performed by circuitry that is specific to a given function.

[0358] Further, certain embodiments of the functionality of the present disclosure are sufficiently mathematically, computationally, or technically complex that application-specific hardware or one or more physical computing devices (utilizing appropriate specialized executable instructions) may be necessary to perform the functionality, for example, due to the volume or complexity of the calculations involved or to provide results substantially in real-time. For example, a video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.

[0359] Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like. In some embodiments, the non-transitory computer-readable medium may be part of one or more of the local processing and data module (140), the remote processing module (150), and remote data repository (160). The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.

[0360] Any processes, blocks, states, steps, or functionalities in flow diagrams described herein and/or depicted in the attached figures will be understood as potentially representing code modules, segments, or portions of code which include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical) or steps in the process. The various processes, blocks, states, steps, or functionalities may be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto may be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. Moreover, the separation of various system components in the embodiments described herein is for illustrative purposes and should not be understood as requiring such separation in all embodiments. It will be understood that the described program components, methods, and systems may generally be integrated together in a single computer product or packaged into multiple computer products.

[0361] It will be appreciated that the systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible or required for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure.

[0362] Certain features that are described in this specification in the context of separate embodiments also may be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment also may be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. No single feature or group of features is necessary or indispensable to each and every embodiment.

[0363] It will be appreciated that conditional language used herein, such as, among others, "can," "could," "might," "may," "e.g.," and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term "or" is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list. In addition, the articles "a," "an," and "the" as used in this application and the appended claims are to be construed to mean "one or more" or "at least one" unless specified otherwise. Similarly, while operations may be depicted in the drawings in a particular order, it is to be recognized that such operations need not be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flowchart. However, other operations that are not depicted may be incorporated in the example methods and processes that are schematically illustrated. For example, one or more additional operations may be performed before, after, simultaneously, or between any of the illustrated operations. Additionally, the operations may be rearranged or reordered in other embodiments. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it will be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products. Additionally, other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims may be performed in a different order and still achieve desirable results.

[0364] Accordingly, the claims are not intended to be limited to the embodiments shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.
