Facebook Patent | Pixel Cell With Multiple Photodiodes

Patent: Pixel Cell With Multiple Photodiodes

Publication Number: 20200075652

Publication Date: 2020-03-05

Applicants: Facebook

Abstract

In one example, an apparatus comprises: a semiconductor substrate including a plurality of pixel cells, each pixel cell including at least four photodiodes; a plurality of filter arrays, each filter array including a filter element overlaid on each photodiode of the pixel cell, at least two of the filter elements of the each filter array having different wavelength passbands; and a plurality of microlenses, each microlens overlaid on the each filter array and configured to direct light from a spot of a scene via each filter element of the each filter array to each photodiode of the each pixel cell.

RELATED APPLICATIONS

[0001] This patent application claims priority to U.S. Provisional Patent Application Ser. No. 62/727,343, filed Sep. 5, 2018, entitled “PIXEL STRUCTURE WITH REDUCED CROSSTALK BETWEEN MULTIPLE PHOTODIODES,” which is assigned to the assignee hereof and is incorporated herein by reference in its entirety for all purposes.

BACKGROUND

[0002] The disclosure relates generally to image sensors, and more specifically to a pixel cell that includes multiple photodiodes.

[0003] A typical pixel cell in an image sensor includes a photodiode to sense incident light by converting photons into charge (e.g., electrons or holes). The charge can be temporarily stored in the photodiode during an exposure period. For improved noise and dark current performance, a pinned photodiode can be included in the pixel to convert the photons into charge. The pixel cell may further include a capacitor (e.g., a floating diffusion) to collect the charge from the photodiode and to convert the charge to a voltage. An image sensor typically includes an array of pixel cells. The pixel cells can be configured to detect light of different wavelength ranges to generate 2D and/or 3D image data.

SUMMARY

[0004] The present disclosure relates to image sensors. More specifically, and without limitation, this disclosure relates to a pixel cell configured to perform collocated sensing of light of different wavelengths.

[0005] In one example, an apparatus is provided. The apparatus includes a semiconductor substrate including a plurality of pixel cells, each pixel cell including at least a first photodiode, a second photodiode, a third photodiode, and a fourth photodiode. The apparatus further includes a plurality of filter arrays, each filter array including at least a first filter element, a second filter element, a third filter element, and a fourth filter element, the first filter element of the each filter array overlaid on the first photodiode of the each pixel cell, the second filter element of the each filter array overlaid on the second photodiode of the each pixel cell, the third filter element of the each filter array overlaid on the third photodiode of the each pixel cell, the fourth filter element of the each filter array overlaid on the fourth photodiode of the each pixel cell, at least two of the first, second, third, and fourth filter elements of the each filter array having different wavelength passbands. The apparatus further includes a plurality of microlenses, each microlens overlaid on the each filter array and configured to direct light from a spot of a scene via the first filter element, the second filter element, the third filter element, and the fourth filter element of the each filter array to, respectively, the first photodiode, the second photodiode, the third photodiode, and the fourth photodiode of the each pixel cell.

[0006] In one aspect, the first filter element and the second filter element of the each filter array are aligned along a first axis. The first photodiode and the second photodiode of the each pixel cell are aligned along the first axis underneath a light receiving surface of the semiconductor substrate. The first filter element is overlaid on the first photodiode along a second axis perpendicular to the first axis. The second filter element is overlaid on the second photodiode along the second axis. The each microlens is overlaid on the first filter element and the second filter element of the each filter array along the second axis.

[0007] In one aspect, the apparatus further comprises a camera lens overlaid on the plurality of microlenses along the second axis. A surface of the each filter array facing the camera lens and an exit pupil of the camera lens are positioned at conjugate positions of the each microlens.

[0008] In one aspect, the first filter element and the second filter element overlaid on the each pixel cell are configured to pass different color components of visible light to, respectively, the first photodiode and the second photodiode of the each pixel cell.

[0009] In one aspect, the first filter element and the second filter element of each filter array are arranged based on a Bayer pattern.

[0010] In one aspect, the first filter element is configured to pass one or more color components of visible light. The second filter element is configured to pass an infra-red light.

[0011] In one aspect, the first filter elements of the plurality of filter arrays are arranged based on a Bayer pattern.
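
As a rough illustration of the filter arrangement described in [0008]-[0011], the sketch below lays out a small array of pixel cells in which each cell's first filter element carries a visible color drawn from a Bayer pattern while the remaining elements pass infra-red light. The 2x2-elements-per-cell layout, the labels, and the choice of which elements pass infra-red are illustrative assumptions, not taken from the patent figures.

```python
import numpy as np

# Hypothetical sketch: lay out filter elements for a small array of pixel cells.
# Each pixel cell holds a 2x2 group of filter elements. The "first" filter
# element of each cell carries a visible color chosen from a Bayer pattern
# repeated across neighboring cells, while the remaining elements pass
# infra-red (IR). Names and layout are illustrative only.

BAYER = [["G", "R"],
         ["B", "G"]]  # 2x2 Bayer tile repeated across pixel cells

def filter_layout(cells_y, cells_x):
    """Return a (2*cells_y, 2*cells_x) array of filter labels."""
    layout = np.empty((2 * cells_y, 2 * cells_x), dtype=object)
    for cy in range(cells_y):
        for cx in range(cells_x):
            color = BAYER[cy % 2][cx % 2]       # Bayer color for this cell's first element
            layout[2*cy,     2*cx]     = color  # first filter element: visible color
            layout[2*cy,     2*cx + 1] = "IR"   # remaining elements: infra-red
            layout[2*cy + 1, 2*cx]     = "IR"
            layout[2*cy + 1, 2*cx + 1] = "IR"
    return layout

print(filter_layout(2, 2))
```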

[0012] In one aspect, the first filter element comprises a first filter and a second filter forming a stack along the second axis.

[0013] In one aspect, the apparatus further comprises a separation wall between adjacent filter elements overlaid on a pixel cell and between adjacent filter elements overlaid on adjacent pixel cells.

[0014] In one aspect, the separation wall is configured to reflect light that enters a filter element of the each filter array from the each microlens towards the photodiode on which the filter element is overlaid.

[0015] In one aspect, the separation wall includes a metallic material.

[0016] In one aspect, the apparatus further comprises an optical layer interposed between the plurality of filter arrays and the semiconductor substrate. The optical layer includes at least one of: an anti-reflection layer, or a pattern of micro-pyramids configured to direct infra-red light to at least one of the first photodiode or the second photodiode.

[0017] In one aspect, the apparatus further comprises an isolation structure interposed between adjacent photodiodes of the each pixel cell and adjacent photodiodes of adjacent pixel cells.

[0018] In one aspect, the isolation structure comprises a deep trench isolation (DTI), the DTI comprising insulator layers and a metallic filling layer sandwiched between the insulator layers.

[0019] In one aspect, the first photodiode and the second photodiode of the each pixel cell are pinned photodiodes.

[0020] In one aspect, a back side surface of the semiconductor substrate is configured as a light receiving surface from which the first photodiode and the second photodiode of the each pixel cell receive light. The semiconductor substrate further comprises, in the each pixel cell, floating drains configured to store charge generated by the first photodiode and the second photodiode of the each pixel cell. The apparatus further comprises polysilicon gates formed on a front side surface of the semiconductor substrate opposite to the back side surface to control flow of the charge from the first photodiode and the second photodiode to the floating drains of the each pixel cell.

[0021] In one aspect, a front side surface of the semiconductor substrate is configured as a light receiving surface from which the first photodiode and the second photodiode of the each pixel cell receive light. The semiconductor substrate further comprises, in the each pixel cell, floating drains configured to store charge generated by the first photodiode and the second photodiode of the each pixel cell. The apparatus further comprises polysilicon gates formed on the front side surface of the semiconductor substrate to control flow of the charge from the first photodiode and the second photodiode to the floating drains of the each pixel cell.

[0022] In one aspect, the semiconductor substrate is a first semiconductor substrate. The apparatus further comprises a second semiconductor substrate comprising a quantizer to quantize charge generated by the first photodiode and the second photodiode of the each pixel cell. The first semiconductor substrate and the second semiconductor substrate form a stack.

[0023] In one aspect, the second semiconductor substrate further includes an imaging module configured to: generate a first image based on the quantized charge of the first photodiode of the each pixel cell; and generate a second image based on the quantized charge of the second photodiode of the each pixel cell. Each pixel of the first image corresponds to each pixel of the second image.

[0024] In one aspect, each pixel of the first image and each pixel of the second image are generated based on charge generated by the first photodiode and the second photodiode within an exposure period.

BRIEF DESCRIPTION OF THE DRAWINGS

[0025] Illustrative embodiments are described with reference to the following figures:

[0026] FIG. 1A and FIG. 1B are diagrams of an embodiment of a near-eye display.

[0027] FIG. 2 is an embodiment of a cross section of the near-eye display.

[0028] FIG. 3 illustrates an isometric view of an embodiment of a waveguide display.

[0029] FIG. 4 illustrates a cross section of an embodiment of the waveguide display.

[0030] FIG. 5 is a block diagram of an embodiment of a system including the near-eye display.

[0031] FIG. 6 illustrates an example of an image sensor including a multi-photodiode pixel cell.

[0032] FIG. 7A, FIG. 7B, and FIG. 7C illustrate examples of operations of the image sensor of FIG. 6.

[0033] FIG. 8A and FIG. 8B illustrate example components of the image sensor of FIG. 6.

[0034] FIG. 9A and FIG. 9B illustrate additional example components of the image sensor of FIG. 6.

[0035] FIG. 10A, FIG. 10B, FIG. 10C, and FIG. 10D illustrate additional example components of the image sensor of FIG. 6.

[0036] FIG. 11A, FIG. 11B, and FIG. 11C illustrate additional example components of the pixel cells of image sensor of FIG. 6.

[0037] FIG. 12 illustrates an example circuit schematic of the image sensor of FIG. 6.

[0038] The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated may be employed without departing from the principles, or benefits touted, of this disclosure.

[0039] In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.

DETAILED DESCRIPTION

[0040] In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of certain inventive embodiments. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive.

[0041] A typical image sensor includes an array of pixel cells. Each pixel cell may have a photodiode to sense incident light by converting photons into charge (e.g., electrons or holes). For improved noise and dark current performance, a pinned photodiode can be included in the pixel to convert the photons into charge. The charge can be sensed by a charge sensing device, such as a floating drain region and/or other capacitors, which can convert the charge to a voltage. A pixel value can be generated based on the voltage and can represent an intensity of light received by the pixel cell. An image comprising an array of pixels can be derived from the digitized voltage outputs of the array of pixel cells.

[0042] An image sensor can be used to perform different modes of imaging, such as 2D and 3D sensing. The 2D and 3D sensing can be performed based on light of different wavelength ranges. For example, visible light can be used for 2D sensing, whereas invisible light (e.g., infra-red light) can be used for 3D sensing. An image sensor may include an optical filter array to pass visible light of different optical wavelength ranges and colors (e.g., red, green, and blue) to a first set of pixel cells assigned for 2D sensing, and invisible light to a second set of pixel cells assigned for 3D sensing.

[0043] To perform 2D sensing, a photodiode at a pixel cell can generate charge at a rate that is proportional to an intensity of visible light incident upon the pixel cell, and the quantity of charge accumulated in an exposure period can be used to represent the intensity of visible light (or a certain color component of the visible light). The charge can be stored temporarily at the photodiode and then transferred to a capacitor (e.g., a floating diffusion) to develop a voltage.

[0044] The voltage can be sampled and quantized by an analog-to-digital converter (ADC) to generate an output corresponding to the intensity of visible light. An image pixel value can be generated based on the outputs from multiple pixel cells configured to sense different color components of the visible light (e.g., red, green, and blue colors).
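
As a minimal numeric sketch of the 2D sensing chain described in [0043]-[0044], the snippet below integrates charge over an exposure period, converts it to a voltage on a floating diffusion, and quantizes the voltage with an ADC. The quantum efficiency, capacitance, and ADC parameters are assumed values chosen only to make the arithmetic concrete.

```python
# Minimal numeric sketch of the 2D sensing chain in [0043]-[0044]:
# photon flux -> accumulated charge -> floating-diffusion voltage -> ADC code.
# All constants (quantum efficiency, capacitance, ADC range) are illustrative
# assumptions, not values from the disclosure.

Q_E = 1.602e-19  # elementary charge, coulombs

def pixel_output(photon_rate, exposure_s, qe=0.6,
                 fd_capacitance=1.6e-15, adc_bits=10, v_ref=1.0):
    electrons = photon_rate * exposure_s * qe                # charge integrated during exposure
    voltage = min(electrons * Q_E / fd_capacitance, v_ref)   # charge-to-voltage on the floating diffusion
    code = round(voltage / v_ref * (2 ** adc_bits - 1))      # quantized by the ADC
    return electrons, voltage, code

print(pixel_output(photon_rate=1e6, exposure_s=5e-3))
```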

[0045] Moreover, to perform 3D sensing, light of a different wavelength range (e.g., infra-red light) can be projected onto an object, and the reflected light can be detected by the pixel cells. The light can include structured light, light pulses, etc. The pixel cell outputs can be used to perform depth sensing operations based on, for example, detecting patterns of the reflected structured light, measuring a time-of-flight of the light pulse, etc. To detect patterns of the reflected structured light, a distribution of quantities of charge generated by the pixel cells during the exposure time can be determined, and pixel values can be generated based on the voltages corresponding to the quantities of charge. For time-of-flight measurement, the timing of generation of the charge at the photodiodes of the pixel cells can be determined to represent the times when the reflected light pulses are received at the pixel cells. Time differences between when the light pulses are projected to the object and when the reflected light pulses are received at the pixel cells can be used to provide the time-of-flight measurement.
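
The time-of-flight relationship in [0045] can be made concrete with a small worked example: the depth is the round-trip delay multiplied by the speed of light and divided by two. The timing values below are illustrative.

```python
# Worked example of the time-of-flight relation described in [0045]:
# the round-trip delay between emitting a light pulse and receiving its
# reflection gives the distance as d = c * dt / 2.

C = 299_792_458.0  # speed of light, m/s

def depth_from_time_of_flight(t_emit_s, t_receive_s):
    round_trip = t_receive_s - t_emit_s
    return C * round_trip / 2.0

# A reflection arriving 10 ns after the pulse was projected corresponds to
# an object roughly 1.5 m away.
print(depth_from_time_of_flight(0.0, 10e-9))  # ~1.499 m
```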

[0046] A pixel cell array can be used to generate information of a scene. In some examples, a subset (e.g., a first set) of the pixel cells within the array can be used to perform 2D sensing of the scene, and another subset (e.g., a second set) of the pixel cells within the array can be used to perform 3D sensing of the scene. The fusion of 2D and 3D imaging data is useful for many applications that provide virtual-reality (VR), augmented-reality (AR) and/or mixed reality (MR) experiences. For example, a wearable VR/AR/MR system may perform a scene reconstruction of an environment in which the user of the system is located. Based on the reconstructed scene, the VR/AR/MR system can generate display effects to provide an interactive experience. To reconstruct a scene, a subset of pixel cells within a pixel cell array can perform 3D sensing to, for example, identify a set of physical objects in the environment and determine the distances between the physical objects and the user. Another subset of pixel cells within the pixel cell array can perform 2D sensing to, for example, capture visual attributes including textures, colors, and reflectivity of these physical objects. The 2D and 3D image data of the scene can then be merged to create, for example, a 3D model of the scene including the visual attributes of the objects. As another example, a wearable VR/AR/MR system can also perform a head tracking operation based on a fusion of 2D and 3D image data. For example, based on the 2D image data, the VR/AR/MR system can extract certain image features to identify an object. Based on the 3D image data, the VR/AR/MR system can track a location of the identified object relative to the wearable device worn by the user. The VR/AR/MR system can track the head movement based on, for example, tracking the change in the location of the identified object relative to the wearable device as the user’s head moves.

[0047] Using different sets of pixel cells for 2D and 3D imaging, however, can pose a number of challenges. First, because only a subset of the pixel cells of the array is used to perform either 2D imaging or 3D imaging, the spatial resolutions of both the 2D image and the 3D image are lower than the maximum spatial resolution available at the pixel cell array. Although the resolutions can be improved by including more pixel cells, such an approach can lead to increases in the form factor of the image sensor as well as its power consumption, both of which are undesirable, especially for a wearable device.

[0048] Moreover, since pixel cells assigned to measure light of different wavelength ranges (for 2D and 3D imaging) are not collocated, different pixel cells may capture information of different spots of a scene, which can complicate the mapping between 2D and 3D images. For example, a pixel cell that receives a certain color component of visible light (for 2D imaging) and a pixel cell that receives invisible light (for 3D imaging) may capture information of different spots of the scene. The outputs of these pixel cells cannot simply be merged to generate the 2D and 3D images. The lack of correspondence between the outputs of the pixel cells due to their different locations can be worsened when the pixel cell array is capturing 2D and 3D images of a moving object. While there are processing techniques available to correlate different pixel cell outputs to generate pixels for a 2D image, and to correlate between 2D and 3D images (e.g., interpolation), these techniques are typically computation-intensive and can also increase power consumption.

[0049] The present disclosure relates to an image sensor that provides collocated sensing of light of different wavelengths. The image sensor includes a plurality of pixel cells, each pixel cell including a first photodiode and a second photodiode arranged along a first axis (e.g., a horizontal axis). The image sensor further includes a plurality of filter arrays, each filter array including a first filter and a second filter overlaid on the each pixel cell along a second axis perpendicular to the first axis (e.g., along a vertical axis). The first filter of the each filter array is overlaid on the first photodiode of the each pixel cell, whereas the second filter of the each filter array is overlaid on the second photodiode of the each pixel cell. The first filter and the second filter of the each filter array have different wavelength passbands, to enable the first photodiode and the second photodiode of the each pixel cell to sense light of different wavelengths. The image sensor further includes a plurality of microlenses. Each microlens is overlaid on the each filter array (and the each pixel cell) and configured to direct light from a spot of a scene via the first filter and the second filter of the each filter array to, respectively, the first photodiode and the second photodiode of the each pixel cell. Both the first photodiode and the second photodiode can be part of a semiconductor substrate.

[0050] The image sensor further includes a controller to enable the first photodiode of the each pixel cell to generate a first charge representing an intensity of a first light component of a first wavelength received from the spot via the first filter, and to enable the second photodiode of the each pixel cell to generate a second charge representing an intensity of a second light component of a second wavelength received from the spot via the second filter. The first wavelength and the second wavelength may differ among the plurality of pixel cells and are configured by the filter arrays. The image sensor further includes a quantizer to quantize the first charge and the second charge of the each pixel cell to, respectively, a first digital value and a second digital value for a pixel. A first image can be generated based on the first digital values of the pixels, whereas a second image can be generated based on the second digital values of the pixels, with each pixel of the first image and of the second image generated based on, respectively, the first digital value and the second digital value of the same pixel cell.
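
A minimal sketch of the readout described in [0050], assuming the per-cell digital values are delivered as a simple array: because every pixel cell contributes both a first and a second digital value, the two frames can be assembled from the same cell grid and are pixel-aligned by construction. The array shape and value range are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of the readout in [0050]: each pixel cell yields a
# first and a second digital value (e.g., visible and infra-red), so the two
# frames come from the same cell grid and are pixel-aligned by construction.

def split_collocated_frames(readout):
    """readout: array of shape (rows, cols, 2) holding the first and second
    digital value quantized for every pixel cell."""
    first_image = readout[:, :, 0]   # e.g., 2D (visible) frame
    second_image = readout[:, :, 1]  # e.g., 3D (infra-red) frame
    return first_image, second_image

readout = np.random.randint(0, 1024, size=(4, 4, 2))  # 10-bit codes for a 4x4 cell array
vis, ir = split_collocated_frames(readout)
assert vis.shape == ir.shape  # pixel (i, j) of both frames comes from the same cell
```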

[0051] With the examples of the present disclosure, collocated sensing of light of different wavelengths can be performed as both the first photodiode and the second photodiode receive light from the same spot in a scene, which can simplify the mapping/correlation process between the first image and the second image. For example, in a case where the first photodiode senses a visible light component (e.g., one of red, green, blue, or monochrome) whereas the second photodiode senses infra-red light, the image sensor can support collocated 2D and 3D imaging, and the mapping/correlation processing between a 2D image frame (e.g., the first image frame) and a 3D image frame (e.g., the second image frame) can be simplified, as each pixel of both image frames represents light from the same spot of the scene. For similar reasons, in a case where the first and second photodiodes sense different light components of visible light, the mapping/correlation processing of image frames of different visible light components to form a 2D image frame can also be simplified. All these can substantially enhance the performance of the image sensor and the applications that rely on the image sensor outputs.

[0052] The image sensor according to the examples of the present disclosure may include additional features to improve the collocated sensing operations. Specifically, the image sensor can include features to enhance the absorption of light by the first photodiode and the second photodiode of the each pixel cell. For example, the image sensor may include a camera lens overlaid on the plurality of microlenses to collect and focus light from the scene. Each pixel cell can be positioned with respect to the each microlens and the camera lens such that the pixel cell and the exit pupil of the camera lens are at conjugate points of the each microlens. Such an arrangement allows light from a spot of the scene, upon exiting through the exit pupil of the camera lens and being further refracted by the microlens, to be evenly distributed between the first photodiode and the second photodiode. The microlens can also be designed such that its focal point is in front of the filter array, to enable the light to be spread out. Further, a structure, such as an anti-reflection layer (e.g., a layer having a lower refractive index than the semiconductor substrate that includes the photodiodes) or an infra-red absorption-enhancing structure (e.g., a micro-pyramid structured thin film), can be interposed between the filter array and the photodiodes, to reduce reflection of the incident light away from the photodiodes and/or to increase the intensity of the incident light that enters the photodiodes. All of these features can improve the absorption of light by the first photodiode and the second photodiode of the each pixel cell and improve the performance of the image sensor.
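
The conjugate-point condition mentioned above can be checked, to a first approximation, with the thin-lens equation 1/s_o + 1/s_i = 1/f applied to the microlens: if the camera-lens exit pupil sits at object distance s_o, the filter-array surface should sit near the corresponding image distance s_i. The focal length and pupil distance below are illustrative values, not figures from the disclosure.

```python
# Rough check of the conjugate-point condition using the thin-lens equation
# 1/s_o + 1/s_i = 1/f for the microlens. Numbers are purely illustrative.

def conjugate_image_distance(focal_length_um, object_distance_um):
    # Solve 1/s_o + 1/s_i = 1/f for s_i
    return 1.0 / (1.0 / focal_length_um - 1.0 / object_distance_um)

# Example: a microlens with a 3 um focal length and an exit pupil ~5 mm away
# images the pupil ~3 um behind the lens, so light from one scene spot spreads
# across the photodiodes under that microlens.
print(conjugate_image_distance(focal_length_um=3.0, object_distance_um=5000.0))
```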

[0053] In addition, the image sensor may include features to reduce noise in the first charge and in the second charge generated by, respectively, the first photodiode and the second photodiode. The noise refers to a component of the charge generated by the photodiode that is not due to the target light component to be detected by the photodiode. There are various sources of noise, including optical crosstalk between light of different wavelengths, charge leakage between photodiodes, and dark charge. The optical crosstalk may include a light component outside the target wavelength range to be sensed by the photodiode. In the example above, the first photodiode of a pixel cell may be configured, based on the first filter overlaid on the first photodiode, to detect the first light component of the first wavelength. For the first photodiode, the optical crosstalk may include light components of wavelengths other than the first wavelength, which may include the second light component of the second wavelength to be detected by the second photodiode. Likewise, for the second photodiode, the optical crosstalk may include light components of wavelengths other than the second wavelength, which may include the first light component of the first wavelength to be detected by the first photodiode. Moreover, charge leakage may occur due to movement of the first charge from the first photodiode to the second photodiode, or vice versa. Further, dark charge may arise from dark current generated at defects on a surface of the semiconductor substrate that includes the photodiodes.

[0054] In some examples, the image sensor can include features to mitigate the effects of optical crosstalk, charge leakage, and dark charge to reduce noise and to improve the performance of the image sensor. For example, the image sensor may include an optical insulator to separate the first filter and the second filter in each filter array. The optical insulator can be configured as sidewalls that surround the side surfaces of the first filter and the second filter. The optical insulator can be configured as a reflector (e.g., a metallic reflector) to direct the light components passed by a filter only to the photodiode on which the filter is overlaid, and not to other photodiodes. For example, the optical insulator can direct the first light component only to the first photodiode and not to the second photodiode, and direct the second light component only to the second photodiode and not to the first photodiode. Moreover, the semiconductor substrate may include an electrical insulator, such as a deep trench isolation (DTI) structure between the first photodiode and the second photodiode, to prevent charge from moving between the first photodiode and the second photodiode. The DTI structure can also be filled with reflective materials, such as metals, so that it also functions as an optical insulator to reduce optical crosstalk between the photodiodes within the semiconductor substrate. Further, the first photodiode and the second photodiode can be implemented as pinned photodiodes so that they are isolated from the surface defects of the semiconductor substrate, to mitigate the effect of dark current. All of these arrangements can reduce the noise present in the charge generated by each photodiode and improve the performance of the image sensor.
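
As a loose, simplified model of the noise terms listed in [0053], the sketch below treats the charge read from one photodiode as the target signal plus a crosstalk fraction of the neighboring channel, a leakage fraction, and dark charge accumulated over the exposure. The coefficients are invented for illustration and are not parameters from the disclosure.

```python
# Simplified additive model of the noise terms in [0053]: target signal plus
# optical crosstalk from the neighboring channel, charge leaked from the
# adjacent photodiode, and dark charge accumulated over the exposure.
# All coefficients are illustrative assumptions, not measured parameters.

def measured_charge(signal_e, neighbor_signal_e, exposure_s,
                    crosstalk_frac=0.02, leakage_frac=0.01, dark_rate_e_per_s=50.0):
    crosstalk = crosstalk_frac * neighbor_signal_e   # light meant for the other photodiode
    leakage = leakage_frac * neighbor_signal_e       # charge migrating between photodiodes
    dark = dark_rate_e_per_s * exposure_s            # dark current at substrate defects
    return signal_e + crosstalk + leakage + dark

print(measured_charge(signal_e=3000, neighbor_signal_e=2000, exposure_s=5e-3))
```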

[0055] Examples of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.

[0056] FIG. 1A is a diagram of an example of a near-eye display 100. Near-eye display 100 presents media to a user. Examples of media presented by near-eye display 100 include one or more images, video, and/or audio. In some embodiments, audio is presented via an external device (e.g., speakers and/or headphones) that receives audio information from the near-eye display 100, a console, or both, and presents audio data based on the audio information. Near-eye display 100 is generally configured to operate as a virtual reality (VR) display. In some embodiments, near-eye display 100 is modified to operate as an augmented reality (AR) display and/or a mixed reality (MR) display.

[0057] Near-eye display 100 includes a frame 105 and a display 110. Frame 105 is coupled to one or more optical elements. Display 110 is configured for the user to see content presented by near-eye display 100. In some embodiments, display 110 comprises a waveguide display assembly for directing light from one or more images to an eye of the user.

[0058] Near-eye display 100 further includes image sensors 120a, 120b, 120c, and 120d. Each of image sensors 120a, 120b, 120c, and 120d may include a pixel cell array comprising an array of pixel cells and configured to generate image data representing different fields of view along different directions. For example, sensors 120a and 120b may be configured to provide image data representing two fields of view towards a direction A along the Z axis, whereas sensor 120c may be configured to provide image data representing a field of view towards a direction B along the X axis, and sensor 120d may be configured to provide image data representing a field of view towards a direction C along the X axis.

[0059] In some embodiments, sensors 120a-120d can be configured as input devices to control or influence the display content of the near-eye display 100, to provide an interactive VR/AR/MR experience to a user who wears near-eye display 100. For example, sensors 120a-120d can generate physical image data of a physical environment in which the user is located. The physical image data can be provided to a location tracking system to track a location and/or a path of movement of the user in the physical environment. A system can then update the image data provided to display 110 based on, for example, the location and orientation of the user, to provide the interactive experience. In some embodiments, the location tracking system may operate a SLAM algorithm to track a set of objects in the physical environment and within the field of view of the user as the user moves within the physical environment. The location tracking system can construct and update a map of the physical environment based on the set of objects, and track the location of the user within the map. By providing image data corresponding to multiple fields of view, sensors 120a-120d can provide the location tracking system with a more holistic view of the physical environment, which can lead to more objects being included in the construction and updating of the map. With such an arrangement, the accuracy and robustness of tracking the location of the user within the physical environment can be improved.

[0060] In some embodiments, near-eye display 100 may further include one or more active illuminators 130 to project light into the physical environment. The light projected can be associated with different frequency spectrums (e.g., visible light, infra-red light, ultra-violet light, etc.), and can serve various purposes. For example, illuminator 130 may project light and/or light patterns in a dark environment (or in an environment with a low intensity of infra-red light, ultra-violet light, etc.) to assist sensors 120a-120d in capturing 3D images of different objects within the dark environment. The 3D images may include, for example, pixel data representing the distances between the objects and near-eye display 100. The distance information can be used to, for example, construct a 3D model of the scene, track a head movement of the user, track a location of the user, etc. As will be discussed in more detail below, sensors 120a-120d can be operated in a first mode for 2D sensing and in a second mode for 3D sensing at different times. The 2D and 3D image data can be merged and provided to a system to provide more robust tracking of, for example, the location of the user, the head movement of the user, etc.

[0061] FIG. 1B is a diagram of another embodiment of near-eye display 100. FIG. 1B illustrates a side of near-eye display 100 that faces the eyeball(s) 135 of the user who wears near-eye display 100. As shown in FIG. 1B, near-eye display 100 may further include a plurality of illuminators 140a, 140b, 140c, 140d, 140e, and 140f. Near-eye display 100 further includes a plurality of image sensors 150a and 150b. Illuminators 140a, 140b, and 140c may emit light of a certain optical frequency range (e.g., NIR) towards direction D (which is opposite to direction A of FIG. 1A). The emitted light may be associated with a certain pattern, and can be reflected by the left eyeball of the user. Sensor 150a may include a pixel cell array to receive the reflected light and generate an image of the reflected pattern. Similarly, illuminators 140d, 140e, and 140f may emit NIR light carrying the pattern. The NIR light can be reflected by the right eyeball of the user, and may be received by sensor 150b. Sensor 150b may also include a pixel cell array to generate an image of the reflected pattern. Based on the images of the reflected pattern from sensors 150a and 150b, the system can determine a gaze point of the user, and update the image data provided to near-eye display 100 based on the determined gaze point to provide an interactive experience to the user. In some examples, image sensors 150a and 150b may include the same pixel cells as sensors 120a-120d.

[0062] FIG. 2 is an embodiment of a cross section 200 of near-eye display 100 illustrated in FIG. 1. Display 110 includes at least one waveguide display assembly 210. An exit pupil 230 is a location where a single eyeball 220 of the user is positioned in an eyebox region when the user wears the near-eye display 100. For purposes of illustration, FIG. 2 shows the cross section 200 associated with eyeball 220 and a single waveguide display assembly 210, but a second waveguide display is used for the second eye of the user.

[0063] Waveguide display assembly 210 is configured to direct image light to an eyebox located at exit pupil 230 and to eyeball 220. Waveguide display assembly 210 may be composed of one or more materials (e.g., plastic, glass, etc.) with one or more refractive indices. In some embodiments, near-eye display 100 includes one or more optical elements between waveguide display assembly 210 and eyeball 220.

[0064] In some embodiments, waveguide display assembly 210 includes a stack of one or more waveguide displays including, but not restricted to, a stacked waveguide display, a varifocal waveguide display, etc. The stacked waveguide display is a polychromatic display (e.g., a red-green-blue (RGB) display) created by stacking waveguide displays whose respective monochromatic sources are of different colors. The stacked waveguide display is also a polychromatic display that can be projected on multiple planes (e.g., multi-planar colored display). In some configurations, the stacked waveguide display is a monochromatic display that can be projected on multiple planes (e.g., multi-planar monochromatic display). The varifocal waveguide display is a display that can adjust a focal position of image light emitted from the waveguide display. In alternate embodiments, waveguide display assembly 210 may include the stacked waveguide display and the varifocal waveguide display.

[0065] FIG. 3 illustrates an isometric view of an embodiment of a waveguide display 300. In some embodiments, waveguide display 300 is a component (e.g., waveguide display assembly 210) of near-eye display 100. In some embodiments, waveguide display 300 is part of some other near-eye display or other system that directs image light to a particular location.

[0066] Waveguide display 300 includes a source assembly 310, an output waveguide 320, an illuminator 325, and a controller 330. Illuminator 325 can include illuminator 130 of FIG. 1A. For purposes of illustration, FIG. 3 shows the waveguide display 300 associated with a single eyeball 220, but in some embodiments, another waveguide display separate, or partially separate, from the waveguide display 300 provides image light to another eye of the user.

[0067] Source assembly 310 generates image light 355. Source assembly 310 generates and outputs image light 355 to a coupling element 350 located on a first side 370-1 of output waveguide 320. Output waveguide 320 is an optical waveguide that outputs expanded image light 340 to an eyeball 220 of a user. Output waveguide 320 receives image light 355 at one or more coupling elements 350 located on the first side 370-1 and guides received input image light 355 to a directing element 360. In some embodiments, coupling element 350 couples the image light 355 from source assembly 310 into output waveguide 320. Coupling element 350 may be, e.g., a diffraction grating, a holographic grating, one or more cascaded reflectors, one or more prismatic surface elements, and/or an array of holographic reflectors.

[0068] Directing element 360 redirects the received input image light 355 to decoupling element 365 such that the received input image light 355 is decoupled out of output waveguide 320 via decoupling element 365. Directing element 360 is part of, or affixed to, first side 370-1 of output waveguide 320. Decoupling element 365 is part of, or affixed to, second side 370-2 of output waveguide 320, such that directing element 360 is opposed to the decoupling element 365. Directing element 360 and/or decoupling element 365 may be, e.g., a diffraction grating, a holographic grating, one or more cascaded reflectors, one or more prismatic surface elements, and/or an array of holographic reflectors.

[0069] Second side 370-2 represents a plane along an x-dimension and a y-dimension. Output waveguide 320 may be composed of one or more materials that facilitate total internal reflection of image light 355. Output waveguide 320 may be composed of e.g., silicon, plastic, glass, and/or polymers. Output waveguide 320 has a relatively small form factor. For example, output waveguide 320 may be approximately 50 mm wide along x-dimension, 30 mm long along y-dimension and 0.5-1 mm thick along a z-dimension.
