

Patent: Wavelength-tunable optical pattern projector


Publication Number: 20240241375

Publication Date: 2024-07-18

Assignee: Meta Platforms Technologies

Abstract

A wavelength-tunable patterned-beam projector for object imaging includes a broadband light source for emitting light spanning a wavelength band and a liquid crystal tunable filter for spectrally filtering the light and for tuning a center wavelength of the light. A liquid crystal based spatial filter spatially modulates the light to provide the patterned light beam having alternating bright and dark regions in a cross-section of the beam. The projector may be used in a near-eye display to illuminate an eye of the user for eye tracking, and to illuminate surroundings for 3D imaging and/or ranging.

Claims

1. An illuminator comprising:
a light source for emitting light spanning a wavelength band;
a liquid crystal tunable filter (LCTF) for spectrally filtering the light and for tuning a center wavelength of the light within the wavelength band; and
a spatial filter for spatially modulating the light to provide a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam, wherein the spatial filter is located downstream of the LCTF.

2. The illuminator of claim 1, further comprising:
a display system for projecting images to a viewer, wherein in operation, the patterned light beam illuminates an eye of the viewer; and
a camera configured to receive a portion of the patterned light beam reflected from the eye.

3. The illuminator of claim 2, wherein the wavelength band of the light source is in an infrared portion of spectrum.

4. The illuminator of claim 2, further comprising a controller operably coupled to the LCTF and the camera and configured to:
wavelength-tune the LCTF;
obtain eye images captured by the camera in coordination with the wavelength tuning; and
process the eye images to obtain eye tracking information, wherein different ones of the eye images correspond to different eye illuminating wavelengths.

5. The illuminator of claim 4, wherein:
the spatial filter is configured to vary the pattern of alternating bright and dark regions; and
the controller is configured to obtain the eye images captured by the camera at different patterns of the alternating bright and dark regions of the patterned light beam illuminating the eye, and to process the images to obtain eye tracking information.

6. The illuminator of claim 1, further comprising:
a display system for projecting images to a viewer, wherein in operation, the patterned light beam illuminates at least a part of an environment surrounding the viewer;
a camera configured to receive a portion of the patterned light beam reflected from one or more objects in the environment; and
a controller configured to process the images to obtain depth information for the environment.

7. The illuminator of claim 1, wherein the light is incident on the spatial filter as a polarized light beam, and wherein the spatial filter comprises:
a patterned waveplate configured to spatially modulate a polarization state of the polarized light beam, such that the polarization state at an output of the patterned waveplate alternates between first and second orthogonal polarization states in a cross-section of the polarized light beam; and
a polarizer downstream of the patterned waveplate for blocking light in the first polarization state while propagating light in the second polarization state to provide the patterned light beam.

8. The illuminator of claim 7, wherein the patterned waveplate comprises an array of alternating first and second waveplate segments having differing retardance.

9. The illuminator of claim 8, wherein the first waveplate segments have an approximately half-wave retardance, and the second waveplate segments have an approximately zero retardance or an approximately full-wave retardance.

10. The illuminator of claim 1, wherein the spatial filter comprises a spatial light modulator (SLM).

11. The illuminator of claim 10, wherein the SLM comprises an array of liquid crystal pixels having individually tunable retardance, and a polarizer downstream of the array of liquid crystal pixels.

12. The illuminator of claim 1, wherein the LCTF comprises a sequence of serially coupled blocks, each block comprising a tunable waveplate upstream of a polarizer.

13. The illuminator of claim 12, wherein each tunable waveplate comprises an LC layer having a tunable retardance, wherein retardance values of different tunable waveplates of the LCTF are in a binary relationship with one another.

14. A display system for projecting images to a viewer, the display system comprising:
an illuminator for illuminating an eye of the viewer with a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam, the illuminator comprising a spatial filter operable to spatially filter light incident thereon to output the patterned light beam, and a tunable wavelength filter upstream of the spatial filter;
a camera disposed to receive a portion of the patterned light beam reflected from the eye to capture a plurality of eye images; and
a controller configured to receive the eye images captured by the camera;
wherein the controller is further configured to wavelength-tune the tunable wavelength filter, and to process the eye images for obtaining eye tracking information, different ones of the eye images being captured at different eye illuminating wavelengths.

15. The display system of claim 14, wherein the spatial filter is configured to vary the pattern of bright regions and the controller is configured to receive a set of eye images captured by the camera for different patterns of the bright regions, and to process the set of eye images for obtaining the eye tracking information.

16. A method for imaging an object, the method comprising:
filtering light spanning a wavelength band with a liquid crystal tunable filter (LCTF); and
spatially modulating the light with a spatial filter to provide a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam, wherein the spatial filter is located downstream of the LCTF.

17. The method of claim 16, further comprising:
illuminating the object with the patterned light beam;
receiving a portion of the patterned light beam reflected from the object by an image capturing camera;
wavelength-tuning the LCTF to vary a center wavelength of the patterned light beam;
capturing, with the camera, images of the object in coordination with the wavelength tuning, the images comprising at least two images captured for two different center wavelengths of the patterned light beam; and
processing the images to obtain 3D information about the object.

18. The method of claim 17, further comprising varying the pattern of alternating bright and dark regions in coordination with the capturing, so that the images comprise at least two images captured for two different patterns of the patterned light beam.

19. The method of claim 18 for use in a display system for displaying images to a viewer, wherein the object comprises an eye of the viewer, and wherein the processing includes obtaining eye tracking information.

20. The method of claim 18, wherein the illuminating comprises illuminating a scene comprising the object and the processing includes obtaining a 3D image of the scene.

Description

TECHNICAL FIELD

The present disclosure relates to optical components, and in particular to light sources for visual display devices and 3D imaging.

BACKGROUND

Visual displays provide information to viewer(s) including still images, video, data, etc. Visual displays have applications in diverse fields including entertainment, education, engineering, science, professional training, and advertising, to name just a few examples. Some visual displays such as TV sets display images to several users, and some visual display systems such as head-mounted displays (HMDs) and near-eye displays (NEDs) are intended for individual users.

An artificial reality system generally includes an HMD, a NED, or the like (e.g., a headset or a pair of glasses) configured to present content to a user. A NED or an HMD may display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. The displayed VR/AR/MR content can be three-dimensional (3D) to enhance the experience and, possibly, to match virtual objects to real objects when observable by the user. An HMD or NED for VR applications may include ranging means to orient the user in a surrounding environment, as the surrounding scenery may not be directly visible to the user.

Because HMDs and NEDs are usually worn on the head of a user, they can benefit from a compact and efficient configuration, including efficient light sources and illuminators for eye tracking and ranging.

BRIEF DESCRIPTION OF THE DRAWINGS

Exemplary embodiments will now be described in conjunction with the drawings, which are not to scale, in which like elements are indicated with like reference numerals, and in which:

FIG. 1 is a side cross-sectional view of an example tunable patterned-light illuminator;

FIG. 2 is an illustration of a cross-section of a spatially patterned beam provided by the illuminator of FIG. 1 in an example embodiment;

FIG. 3A is a diagram illustrating an example optical spectrum of a broadband light source that may be used in the illuminator of FIG. 1 according to an embodiment;

FIG. 3B is a diagram illustrating a wavelength-tunable light spectrum downstream of a liquid-crystal tunable filter (LCTF) in the illuminator of FIG. 1 according to an embodiment;

FIG. 4 is a schematic cross-sectional view of an example LCTF according to an embodiment;

FIG. 5A is a schematic cross-sectional view of an example spatial filter including a patterned waveplate for providing a patterned light beam;

FIG. 5B is a schematic plan view of the patterned waveplate of the spatial filter of FIG. 5A showing alternating regions of different retardance;

FIG. 6A is a schematic cross-sectional view of a tunable spatial light modulator (SLM) for providing a tunable spatial illumination pattern;

FIG. 6B is a schematic plan view of a 1D array of LC pixels that may be used as a patterned waveplate to spatially modulate light polarization in an embodiment of the SLM of FIG. 6A;

FIG. 6C is a schematic plan view of a 2D array of LC pixels that may be used as a patterned waveplate to spatially modulate light polarization in an embodiment of the SLM of FIG. 6A;

FIGS. 7A, 7B, and 7C are schematic diagrams illustrating alternating waveplate regions of a tunable patterned waveplate of the spatial filter of the illuminator of FIG. 1 for three different duty cycles of the pattern;

FIGS. 7D, 7E, and 7F are schematic diagrams illustrating alternating bright and dark regions of an illumination pattern at the output of the spatial filter for the three waveplate patterns of FIGS. 7A, 7B, and 7C, respectively;

FIG. 8 is a schematic diagram illustrating a cross-sectional side view of a multi-layer LC embodiment of the illuminator of FIG. 1 with the LCTF of FIG. 4;

FIG. 9A is a flowchart of a method for generating a tunable illumination pattern for imaging an object according to an embodiment;

FIG. 9B is a flowchart of a method for imaging an object using the tunable illumination pattern;

FIG. 10A schematically illustrates a set of different images of an object corresponding to tunable illumination patterns generated by the method of FIG. 9A;

FIG. 10B schematically illustrates an enhanced image that may be generated based on the set of images shown in FIG. 10A;

FIG. 11 is a schematic diagram of an eye tracking system in a NED using the illuminator of FIG. 1;

FIG. 12 is a schematic diagram illustrating 3D object imaging using the illuminator of FIG. 1;

FIG. 13 is a view of an augmented reality (AR) display of this disclosure having a form factor of a pair of eyeglasses; and

FIG. 14 is a three-dimensional view of a head-mounted display (HMD) of this disclosure.

DETAILED DESCRIPTION

While the present teachings are described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives and equivalents, as will be appreciated by those of skill in the art. All statements herein reciting principles, aspects, and embodiments of this disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.

As used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method steps does not imply a sequential order of their execution, unless explicitly stated.

Embodiments described herein relate to an apparatus and related method for object imaging, including three-dimensional (3D) imaging of various objects. The imaging may include illuminating an object with structured light of different wavelengths, and detecting reflections of the structured light with a camera to obtain a set of 2D images at different illumination wavelengths. The apparatus may include a structured light projector, or illuminator, that includes a liquid crystal (LC) tunable filter (LCTF) in series with a spatial filter, e.g. an LC spatial light modulator (SLM). The illuminator may be configured to project, upon the object or scene, an illumination pattern composed of alternating bright and dark regions, e.g. a sequence of spaced apart bright stripes or fringes.

The method may include tuning the SLM to provide different illumination patterns, e.g. of different periods and/or duty cycles, and capturing the reflected light for the different patterns to obtain two or more sets of the 2D images. Each of these sets may include one or more of the images captured at different illumination wavelengths. The plurality of images so obtained may then be used, e.g., to generate an enhanced image of the object that has more information about the object than any of the individually captured images. For example, compositionally different parts of the object (e.g. a human eye) may have different reflectivity at different illumination wavelengths, and those parts may be identified by comparing the images taken at different illumination wavelengths.
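By way of illustration only (this sketch is not part of the patent), the wavelength-comparison idea in the preceding paragraph can be expressed as a per-pixel spectral contrast computed over a stack of co-registered images; the function name and the toy data below are assumptions made for the example.

```python
import numpy as np

def spectral_contrast_map(images):
    """Per-pixel contrast across co-registered images captured at
    different illumination center wavelengths. Object parts whose
    reflectance varies strongly with wavelength stand out."""
    stack = np.stack(images, axis=0).astype(float)  # (n_wavelengths, H, W)
    return np.ptp(stack, axis=0) / (stack.mean(axis=0) + 1e-9)

# Toy example: two 4x4 "images" at two wavelengths; the lower-right
# quadrant reflects differently at the second wavelength.
img1 = np.ones((4, 4))
img2 = np.ones((4, 4))
img2[2:, 2:] = 0.3
print(spectral_contrast_map([img1, img2]).round(2))
```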

In some embodiments, images of the object using illumination patterns with alternating bright and dark regions may be used to approximately determine a 3D shape of the object, e.g. based on apparent deformation of the bright regions of the illumination pattern after reflection from the object. The illuminator may be conveniently embodied using a compact and low-weight multi-layer LC structure, which may be useful e.g. in an HMD or a NED, e.g. for eye tracking and/or for ranging and 3D imaging of the surroundings.

An aspect of the present disclosure relates to an illuminator comprising a light source for emitting light spanning a wavelength band, a liquid crystal tunable filter (LCTF) for spectrally filtering the light and for tuning a center wavelength of the light within the wavelength band, and a spatial filter for spatially modulating the light to provide a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam.

Some implementations may comprise a camera and a display system for projecting images to a viewer. In some of such implementations a controller operably coupled to the LCTF and the camera may be provided. In some of such implementations, the patterned light beam illuminates an eye of the viewer; and the camera is configured to receive a portion of the patterned light beam reflected from the eye. In some of such implementations the wavelength band of the light source may be in an infrared portion of spectrum. The controller may be configured to wavelength-tune the LCTF, obtain eye images captured by the camera in coordination with the wavelength tuning, and process the eye images to obtain eye tracking information, wherein different ones of the eye images correspond to different eye illuminating wavelengths. In some of the above implementations, the patterned light beam may illuminate at least a part of an environment surrounding the viewer, the camera may be configured to receive a portion of the patterned light beam reflected from one or more objects in the environment, and the controller may be configured to process the images to obtain depth information for the environment.

In any of the above implementations the spatial filter may be configured to vary the pattern of alternating bright and dark regions. The controller may be configured to obtain the eye images captured by the camera at different patterns of the alternating bright and dark regions of the patterned light beam illuminating the eye, and to process the images to obtain eye tracking information.

In any of the above implementations, the light may be incident on the spatial filter as a polarized light beam, and the spatial filter may comprise a patterned waveplate configured to spatially modulate a polarization state of the polarized light beam, such that the polarization state at an output of the patterned waveplate alternates between first and second orthogonal polarization states in a cross-section of the polarized light beam. The spatial filter may further comprise a polarizer downstream of the patterned waveplate for blocking light in the first polarization state while propagating light in the second polarization state to provide the patterned light beam. In some of such implementations the patterned waveplate may comprise an array of alternating first and second waveplate segments having differing retardance. The first waveplate segments may have e.g. an approximately half-wave retardance, and the second waveplate segments may have e.g. an approximately zero retardance or an approximately full-wave retardance. In some implementations the spatial filter may comprise a spatial light modulator (SLM). The SLM may comprise e.g. an array of liquid crystal pixels having individually tunable retardance, and a polarizer downstream of the array of liquid crystal pixels.

In any of the above implementations the LCTF may comprise a sequence of serially coupled blocks, each block comprising a tunable waveplate upstream of a polarizer. Each tunable waveplate may comprise, e.g., an LC layer having a tunable retardance, wherein retardance values of different tunable waveplates of the LCTF are in a binary relationship with one another.

An aspect of the present disclosure provides a display system for projecting images to a viewer. The display system comprises an illuminator for illuminating an eye of the viewer with a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam, the illuminator comprising a spatial filter operable to spatially filter light incident thereon to output the patterned light beam, and a tunable wavelength filter upstream or downstream of the spatial filter. The display system further comprises a camera disposed to receive a portion of the patterned light beam reflected from the eye to capture a plurality of eye images, and a controller configured to receive the eye images captured by the camera. The controller is further configured to wavelength-tune the tunable wavelength filter, and to process the eye images for obtaining eye tracking information, different ones of the eye images being captured at different eye illuminating wavelengths.

In some implementations of the display system, the spatial filter may be configured to vary the pattern of bright regions and the controller is configured to receive a set of eye images captured by the camera for different patterns of the bright regions, and to process the set of eye images for obtaining the eye tracking information.

An aspect of the present disclosure provides a method for imaging an object. The method comprises: filtering light spanning a wavelength band with a liquid crystal tunable filter (LCTF), and spatially modulating the light to provide a patterned light beam comprising a pattern of alternating bright and dark regions in a cross-section of the patterned light beam.

In some implementations, the method may further comprise illuminating the object with the patterned light beam, receiving a portion of the patterned light beam reflected from the object by an image capturing camera, wavelength-tuning the LCTF to vary a center wavelength of the patterned light beam, capturing, with the camera, images of the object in coordination with the wavelength tuning, the images comprising at least two images captured for two different center wavelengths of the patterned light beam, and processing the images to obtain 3D information about the object.

In some implementations, the method may further comprise varying the pattern of alternating bright and dark regions in coordination with the capturing, so that the images comprise at least two images captured for two different patterns of the patterned light beam.

In some implementations, the method may be used in a display system for displaying images to a viewer, the object may comprise an eye of the viewer, and the processing may include obtaining eye tracking information. In some implementations of the method, the illuminating may comprise illuminating a scene comprising the object and the processing includes obtaining a 3D image of the scene.

FIG. 1 schematically illustrates, in a side cross-sectional view, an example illuminator 100 configured to project a wavelength-tunable optical pattern, e.g. an optical pattern 230 shown in FIG. 2, comprised of alternating bright (231) and dark (233) regions. In the illustrated embodiment, the illuminator 100, which may also be referred to herein as the wavelength-tunable optical pattern projector, includes a wavelength-tunable light source 110 and a spatial filter 130 downstream of the wavelength-tunable light source 110.

The wavelength-tunable light source 110 may include an optical emitter 112 for emitting light 111 spanning a wavelength band Δλ (e.g. a wavelength band 311, FIG. 3A), and a liquid crystal tunable filter (LCTF) 120 for tunably band-pass filtering the light 111 to provide a wavelength-tunable light beam 121. In some embodiments, the light beam 121 may be approximately collimated. In some embodiments, a beam shaping and/or collimating optics 114 may be provided, e.g. in an optical path of the light 111 upstream of the LCTF 120, to shape and/or collimate the light 111. In some embodiments, the light beam 121 may be polarized, e.g. linearly polarized. The spatial filter 130 is configured to spatially modulate, or spatially filter, the light beam 121 to provide a patterned light beam 131. The patterned light beam 131 may include the pattern 230 in a cross-section 240 of the light beam, as illustrated in FIG. 2. In some embodiments, the spatial filter 130 may be positioned upstream of the LCTF 120.

Referring to FIG. 3A, the optical emitter 112 may be a low-coherence, broadband light source, with an optical spectrum 310 of the emitting light spanning a wavelength band 311 that is at least 50 nm wide at the 3 dB level, or at least 100 nm wide, or at least 200 nm wide. In example embodiments, the optical emitter 112 may be a super-luminescent light-emitting diode (SLED) or a supercontinuum laser source. In some embodiments, the optical emitter 112 may be configured to emit infrared light, e.g. including but not limited to the 800 nm-1600 nm wavelength region of a light spectrum. In some embodiments the optical emitter 112 may be configured to emit visible light, e.g. in the 400 nm-700 nm wavelength region. In some embodiments the optical emitter 112 may be configured to emit ultraviolet light.

Turning to FIG. 3B, the LCTF 120 is configured to operate as a tunable pass-band filter to tune a center wavelength λc of a passband 330 thereof across at least a portion of the wavelength band 311 of the emitter 112, thereby also tuning the center wavelength of the light beam 121. FIG. 3B illustrates an example embodiment or scenario where the center wavelength λc of the light beam 121 downstream of the LCTF 120 is tunable, by tuning the LCTF 120, to any one of seven different center wavelengths λc1, λc2, . . . , λc7 within the wavelength band 311 of the optical emitter 112. In various embodiments, the center wavelength λc may be continuously tunable over the emitter wavelength band 311 (FIG. 3A), or switchable between N≥2 fixed center wavelengths λc.

FIG. 4 illustrates an example LCTF 400 that may embody the LCTF 120. The LCTF 400 is configured as an electrically tunable Lyot filter, and includes a sequence of tunable waveplates 4201, 4202, 4203 alternating with optical polarizers 4101, 4102, 4103, and 4104, e.g. suitably oriented linear polarizers. The waveplates 4201, 4202, 4203 may be generally referred to as waveplates 420, while optical polarizers 4101, 4102, 4103, and 4104 may be generally referred to as polarizers 410. Each one of the waveplates 420 is sandwiched between two optical polarizers 410 and has a different thickness and/or optical retardation, with the thickness and/or retardation ratios of the waveplates 420 belonging to a binary set 2, 4, 8, . . . , etc. In other words, the LCTF 400 includes a sequence of serially coupled blocks, each block comprising a tunable waveplate (e.g. the tunable waveplate 4201, 4202, 4203) upstream of a polarizer (e.g. the optical polarizer 4101, 4102, 4103).

In the example embodiment shown, each one of the waveplates 420 is half as thick as the preceding waveplate 420. The thinnest one of the waveplates 420 defines a free spectral range of the LCTF 400, and the thickest one defines the wavelength selectivity, e.g. the spectral bandwidth of the filtering function of the LCTF 400. The filter sections do not necessarily have to be disposed in the order of decreasing/increasing thickness or retardance. It is generally enough that the retardance values of different tunable waveplates of the LCTF are in a binary relationship with one another.

Each of the waveplates 420 may include an LC layer 430, whose retardance may be tuned by varying a voltage applied thereto, e.g. from a source of a control voltage 450. In some embodiments, one or more of the waveplates 420 may further include a fixed-retardance waveplate, e.g. a quartz waveplate 440. Although the LCTF 400 as shown in FIG. 4 includes three tunable waveplates 420, in various embodiments the number of waveplates 420 in the LCTF 400 may vary from two to ten or more.
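For a concrete feel for the Lyot-type tuning described above, the following minimal numerical sketch (not part of the patent; the stage retardances, the tuning-by-scaling mechanism, and the wavelength band are illustrative assumptions) evaluates the idealized transmission T(λ) = Πk cos²(πRk/λ) of a stack of binary-related retarder stages, each stage being a waveplate between parallel polarizers, and shows how changing all retardances together shifts the passband center wavelength.

```python
import numpy as np

def lyot_transmission(wavelengths_nm, retardances_nm):
    """Idealized Lyot-filter transmission: the product of
    cos^2(pi * R_k / lambda) over all stages."""
    t = np.ones_like(wavelengths_nm, dtype=float)
    for r in retardances_nm:
        t *= np.cos(np.pi * r / wavelengths_nm) ** 2
    return t

# Binary stage retardances (each stage half the preceding one);
# the absolute values are chosen only for illustration.
base = 40000.0  # nm, thickest stage
stages = [base, base / 2, base / 4]

lam = np.linspace(800.0, 1600.0, 2000)  # nm, an infrared band
t0 = lyot_transmission(lam, stages)

# Tuning: scaling every stage retardance by a common factor (as a
# voltage-tuned LC layer effectively does) shifts the passband peaks.
detune = 1.02
t1 = lyot_transmission(lam, [detune * r for r in stages])
print(f"peak at {lam[np.argmax(t0)]:.1f} nm -> {lam[np.argmax(t1)]:.1f} nm")
```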

FIG. 5A illustrates, in a cross-sectional side view, a spatial filter 500, which may be used as the spatial filter 130 of the illuminator 100 of FIG. 1. The spatial filter 500 (FIG. 5A) includes a patterned waveplate 510, e.g. a patterned LC waveplate, followed by an optical polarizer 520. FIG. 5B illustrates a plan view of the patterned waveplate 510, e.g. as seen in a cross-section thereof perpendicular to a direction of propagation of an input light beam 501 incident upon the waveplate 510. The input light beam 501 may represent, e.g., the wavelength-tunable light beam 121 as described above with reference to FIG. 1.

The patterned LC waveplate 510 is composed of first waveplate regions 511 alternating with second waveplate regions 512. In the illustrated embodiment, the regions 511 and 512 have the shape of stripes, but they may be shaped differently in other embodiments. The first waveplate regions 511 may have approximately half-wave retardance at the operating wavelengths λ, i.e. generally (2m+1)λ/2 retardance, where m is an integer. In other words, the first waveplate regions 511 may have the retardance of an odd number of half-waves of the input light beam 501. The second waveplate regions 512 may have approximately zero-wave retardance at the operating wavelengths λ, i.e. generally i·λ retardance, where i is an integer. In other words, the second waveplate regions 512 may have the retardance of an integer number of waves of the input light beam 501. Here “approximately” means to within ±0.1λ.

The input light beam 501 may be polarized, e.g. linearly polarized in a plane defined by the orientation of an output polarizer of the LCTF 120, e.g. the polarizer 4104 of the LCTF 400 of FIG. 4. In an embodiment, the patterned waveplate 510 is configured so that the first waveplate regions 511 rotate the polarization of the incident light to an orthogonal polarization state, while the second waveplate regions 512 approximately do not change the polarization of the incident light. The polarizer 520 is configured, e.g. oriented with an optic axis approximately perpendicular or approximately parallel to the polarization direction of the input light beam 501, so as to block light passing through one of the first (511) or the second (512) waveplate regions, while letting the light incident on the other of the regions pass through. As a result, the spatial filter 500 provides an output light beam 503 that has bright regions alternated with dark regions, e.g. as illustrated in FIG. 2. The output light beam 503 may represent the patterned light beam 131 of FIG. 1.
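The polarization logic of the patterned waveplate and polarizer can be checked with a few lines of Jones calculus. The sketch below is illustrative only; the 45° fast-axis orientation and the x-polarized input are assumptions consistent with, but not mandated by, the text above. A half-wave region rotates the input polarization to the orthogonal state, which the polarizer blocks (dark region), while a zero-retardance region leaves it unchanged (bright region).

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def waveplate(retardance_waves, theta):
    """Jones matrix of a waveplate with fast axis at angle theta and
    phase delay Gamma = 2*pi*retardance (in waves)."""
    gamma = 2 * np.pi * retardance_waves
    return rot(theta) @ np.diag([1.0, np.exp(1j * gamma)]) @ rot(-theta)

pol_x = np.array([[1.0, 0.0], [0.0, 0.0]])  # linear polarizer along x
e_in = np.array([1.0, 0.0])                 # x-polarized input beam

for name, ret in [("half-wave region (511)", 0.5),
                  ("zero-wave region (512)", 0.0)]:
    e_out = pol_x @ waveplate(ret, np.pi / 4) @ e_in
    inten = (np.abs(e_out) ** 2).sum()
    print(f"{name}: transmitted intensity = {inten:.2f}")
```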

FIGS. 6A and 6B schematically illustrate an example spatial light modulator (SLM) 600 that may be used as the spatial filter 130 in some embodiments of the illuminator 100 of FIG. 1. The SLM 600, which is shown in FIG. 6A in a cross-sectional side view, includes a pixelated LC waveplate 610 followed by a polarizer 620. The pixelated LC waveplate 610 includes an LC layer 612 sandwiched between transparent electrodes 615, at least one of which is segmented to provide an array of individually addressable regions, or pixels, 611. One of the electrodes 615 may be disposed upon an optically transparent substrate 617, physically supporting the LC layer 612. The electrodes 615 may be made of a material that is transparent to incident light of relevant wavelengths, such as e.g. indium tin oxide (ITO). The retardance of the pixels 611 may be individually electrically tunable by a source of control voltage 630, which is operable to vary an electrical field in the LC layer 612 within a selected pixel or pixels to change the orientation of LC molecules 618.

In an example embodiment, the LC molecules 618 may be tilted relative to a normal to the electrodes 615 in the absence of the electric field, at an angle that may be defined by alignment layers (not shown) at one or both sides of the LC layer 612. The tilted orientation of the LC molecules 618 provides the LC layer with a non-zero retardance, e.g. approximately a half-wave retardance. By applying a suitable voltage between the electrodes 615, the LC molecules may be forced to approximately align along the electric field, as schematically illustrated at 619, which may correspond to an approximately zero retardance.

FIG. 6B shows, in a plan view, an example embodiment of the pixelated LC waveplate 610, referred to herein as waveplate 610A. In this embodiment, at least one of the electrodes 615 is segmented along one direction, e.g. horizontal in FIG. 6B, into an array of conducting bars 613n to form a linear array of waveplate pixels 611n, n=1, . . . , N. By selectively energizing a set of k adjacent pixels 611n, alternating the sets of energized pixels with sets of l adjacent pixels that remain non-energized, wherein k and l are integers that can each vary from 1 to K≤N/2, the SLM 600 may be configured to operate as the patterned waveplate 510 described above, with a variable width and number of first waveplate regions 511 and a corresponding number of second waveplate regions 512, also of a variable width.

FIG. 6C shows, in a plan view, another example embodiment of the pixelated LC waveplate 610, referred to herein as waveplate 610B. In this embodiment, at least one of the electrodes 615 is segmented into a two-dimensional (2D) M×N array of conducting elements to form a corresponding M×N array of waveplate pixels 611n,m, n=1, . . . , N, m=1, . . . , M. The waveplate 610B may be operated similarly to the waveplate 610A, by selectively energizing spaced-apart sets of k adjacent pixel columns of the 2D pixel array, alternating the sets of energized pixel columns with sets of l adjacent pixel columns that remain non-energized. In other embodiments, the waveplate 610B may be operated to provide a variable 2D waveplate pattern, in which waveplate regions of variable size having a first retardance alternate, along two orthogonal directions, with variable-size waveplate regions of a second retardance.

Referring to FIGS. 7A-7C, the waveplate 610A or the waveplate 610B may be operated as a linear array of first waveplate segments 711 alternating with second waveplate segments 712, with different numbers of the segment boundaries per unit area of the array. The first waveplate segments 711 may have an approximately half-wave retardance, while the second waveplate segments 712 may have an approximately zero retardance, e.g. due to being suitably energized by an applied voltage, as described above. For example, FIGS. 7A-7C may correspond to selectively applying a suitable non-zero voltage to spaced-apart sets of three, two, and one consecutive LC pixels 611i, alternating these sets of energized LC pixels 611i with the same or a different number of consecutive non-energized LC pixels 611i.

FIGS. 7D-7F illustrate corresponding patterns of alternating bright (722) and dark (721) regions in a cross-section of the patterned light beam 503 at the output of the polarizer 620. In this embodiment, the light beam 501 may be polarized at 45 degrees to the optic axis of the first waveplate segments 711, and the polarizer 620 may be oriented to transmit the polarization of the light beam 501 and to block the polarization orthogonal thereto. In an embodiment wherein the polarizer 620 is oriented to block the polarization of the light beam 501, regions 721 of the output light beam 503 become bright, and regions 722 become dark.
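The duty-cycle control described for FIGS. 7A-7F amounts to generating a binary energization map for the 1D pixel array. Below is a minimal sketch of that idea (not part of the patent); the function name and the True-means-energized convention are assumptions.

```python
import numpy as np

def stripe_pattern(n_pixels, k_on, l_off):
    """Binary energization map for a 1D LC pixel array: k_on adjacent
    energized pixels (approx. zero retardance) alternating with l_off
    non-energized pixels (approx. half-wave retardance)."""
    period = k_on + l_off
    idx = np.arange(n_pixels)
    return (idx % period) < k_on  # True = energized

# Duty cycles analogous to FIGS. 7A-7C: sets of 3, 2, and 1 energized
# pixels alternating with an equal number of non-energized ones.
for k in (3, 2, 1):
    print(stripe_pattern(12, k, k).astype(int))
```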

In embodiments of the SLM 600 including the 2D-patterned waveplate 610B, the SLM 600 may be operated to provide a variable 2D pattern of light and dark regions in a cross-section of the output light beam 503.

FIG. 8 schematically illustrates an example embodiment in which various layers of an LCTF 810 and of a spatial filter 820 are stacked to form a multi-layer tunable filter block 800. The LCTF 810 may be an embodiment of the LCTF 120 or the LCTF 400 described above. The spatial filter 820 may be an embodiment of the spatial filter 500 or the SLM 600 described above. An embodiment of the illuminator 100 using the tunable filter block 800 in combination with a suitable optical emitter, e.g. the low-coherence, broadband optical emitter 112 may be suitably compact and light-weight for use in a NED or an HMD.

When a non-flat object or a plurality of objects located at different depths is illuminated by an embodiment of the wavelength-tunable optical pattern projector 100 (“illuminator 100”) generating the patterned light beam 131 or 503, the bright regions in the reflected light, e.g. regions 231 or 722, become distorted, i.e. change their shape, with the visible distortion being dependent on the 3D shape of the object or the difference in the location depth of the objects. By receiving the reflections of the patterned light beam from the object(s) with a camera comprising an array of photo-sensitive elements, and processing the corresponding images of the object(s) captured by the camera to detect and analyze the reflection-induced distortions of various illumination patterns, a 3D image or model of the object(s), e.g. a 3D point cloud, may be generated using a suitably programmed computer. By processing the images obtained for different center wavelengths λc of the patterned light beam, object(s) or parts thereof having different material composition may be discerned, and in some embodiments identified, e.g. based on known features of reflectance spectra thereof.
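As a toy illustration of the depth-from-distortion principle (the patent specifies no processing pipeline at code level, so everything below is an assumption), one can model the lateral displacement of a reflected stripe as stereo-like disparity in a projector-camera pair:

```python
import numpy as np

def depth_from_stripe_shift(shift_px, baseline_m, focal_px):
    """Triangulation-style depth estimate: a stripe reflected from a
    surface appears displaced laterally in the camera image; under a
    simple projector-camera parallax model the displacement acts like
    stereo disparity, z = f * b / d."""
    shift_px = np.asarray(shift_px, dtype=float)
    return focal_px * baseline_m / np.maximum(shift_px, 1e-6)

# Hypothetical numbers: 5 cm projector-camera baseline, 600 px focal
# length, observed stripe displacements of 10-40 pixels.
print(depth_from_stripe_shift([10, 20, 40], 0.05, 600.0))
```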

Referring now to FIG. 9A, a method 900A for imaging an object using the illuminator 100 of FIG. 1 may include: filtering (FIG. 9A; 910) light spanning a wavelength band (e.g. wavelength band 311, FIG. 3A) with an LCTF (e.g. LCTF 120, 400, 810) to obtain a wavelength-tunable light beam (e.g. the light beam 121 in FIG. 1, the light beam 501 in FIGS. 5A and 6A). The wavelength-tunable light beam is spatially modulated (FIG. 9A; 920) to provide a patterned light beam (e.g. the light beam 131 in FIG. 1, the light beam 503 in FIGS. 5A and 6A) comprising a pattern of alternating bright and dark regions (e.g. regions 233, 231 in FIG. 2, regions 722, 721 in FIGS. 7D-7F) in a cross-section of the light beam.

The patterned light beam obtained in this way may then be used to illuminate the object, to capture one or more sets of images of the object at various illumination wavelengths and/or illumination patterns. The set(s) of these images may then be processed to obtain an enhanced image of the object, i.e. an image of the object that includes more information about the object than any single one of the images. In some embodiments, the set(s) of these two-dimensional (2D) images may be used to obtain a three-dimensional (3D) model or 3D image of the object.

Referring to FIG. 9B with further reference to FIGS. 10A and 10B, an extension 900B of the method 900A (FIG. 9B, “method 900B”) may further include illuminating (FIG. 9B, 930) the object with the patterned light beam, and receiving (FIG. 9B, 940) a portion of the patterned light beam reflected from the object by an image capturing camera (e.g. camera 1120 in FIG. 11 or 1220 in FIG. 12). The LCTF is wavelength-tuned (FIG. 9B, 950) to vary a center wavelength λc of the patterned light beam, e.g. as illustrated in FIG. 3B.

Images of the object are captured (FIG. 9B, 960) in coordination with the wavelength tuning. By way of a non-limiting illustrative example, each one of the images 1001, 1002, and/or 1003 of FIG. 10A may be captured at each of two different center wavelengths (e.g. λc1 and one of λc2 or λc7 in FIG. 3B) of the light beam.

In some embodiments, the wavelength tuning 950 and capturing 960 of the method 900B of FIG. 9B may further include varying the pattern of alternating bright and dark regions in coordination with the image capturing, so that the sets of the images comprise at least two images, e.g. 1001 and 1003, captured for two different patterns of the patterned light beam. The method may further include processing (FIG. 9B, 970) the images to obtain an enhanced image of the object (e.g. image 1010 schematically represented in FIG. 10B).
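Steps 930-970 of method 900B amount to a nested acquisition loop over illumination wavelengths and stripe patterns. The sketch below is a hypothetical software rendering of that coordination; the illuminator and camera interfaces are invented placeholders, since the patent defines no API.

```python
def capture_image_stack(illuminator, camera, wavelengths_nm, duty_cycles):
    """Coordinate LCTF tuning (step 950) and pattern switching with
    image capture (step 960): one image per (wavelength, pattern) pair."""
    stack = {}
    for lam in wavelengths_nm:
        illuminator.set_center_wavelength(lam)    # tune the LCTF
        for k in duty_cycles:
            illuminator.set_stripe_pattern(k, k)  # reprogram the SLM
            stack[(lam, k)] = camera.capture()
    return stack

# Stub devices so the sketch runs end to end.
class _StubIlluminator:
    def set_center_wavelength(self, lam): print(f"LCTF -> {lam} nm")
    def set_stripe_pattern(self, k, l): print(f"SLM -> {k} on / {l} off")

class _StubCamera:
    def capture(self): return "frame"

stack = capture_image_stack(_StubIlluminator(), _StubCamera(),
                            wavelengths_nm=(850, 940), duty_cycles=(3, 2, 1))
print(sorted(stack))
```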

Turning to FIG. 11, a display system 1100 includes an illuminator 1110, which may be an embodiment of the illuminator 100 described above with reference to FIGS. 1-8. The illuminator 1110 is configured to illuminate an eye 1150 of a viewer with a wavelength-tunable patterned light beam 1111. The display system 1100 further includes a camera 1120 for capturing reflections 1113 of the patterned light beam 1111 from the eye 1150, and a controller 1130 configured to tune the illuminator 1110 to vary at least one of the wavelength and a spatial modulation pattern of the patterned light beam 1111, and to process images of the eye captured by the camera 1120 in coordination with the tuning. The display system 1100 may include e.g. an NED or an HMD.

In a typical embodiment, the patterned light beam 1111 is invisible to the eye 1150, i.e. a beam of infrared light. The display system 1100 may further include an image projecting module 1140 for projecting images to the viewer using visible light. The image projecting module 1140 may include, for example, one or more image conveying waveguides for conveying image-carrying light from an image source (not shown), e.g. a display panel or a scanning projector, to the eye 1150.

In an embodiment, the display system may implement an embodiment of the methods 900A and 900B as described above, for illuminating the eye 1150. In such an embodiment, the controller 1130 is configured to perform, in cooperation with the camera 1120 and the illuminator 1110, the methods 900A, 900B of FIGS. 9A and 9B respectively. The controller 1130 may be configured to obtain eye tracking information from the set(s) of images captured by the camera 1120 for different center wavelengths, and at least in some embodiments, different patterns of alternating bright and dark regions in a cross-section of the beam 1111 illuminating the eye 1150. By processing images of the eye 1150 obtained at different wavelengths and different light/dark patterns of the illuminating light, accurate eye tracking information may be obtained.

By way of an illustrative non-limiting example, the images 1001, 1002, 1003 (FIG. 10A) may be images of the eye 1150, or some portion thereof, captured for three different illumination wavelengths, e.g. the center wavelengths λc1, λc4, and λc7 (FIG. 3B) of the patterned light beam 1111. The fill patterns of the images 1001, 1002, and 1003 are not intended to approximate an actual image but are only to distinguish the images. The controller 1130 may be configured, e.g. programmed, to cause the illuminator 1110 to sequentially tune the center wavelength of the patterned light beam 1111 to each of the center wavelengths λc1, λc4, λc7, saving the set of corresponding images 1001, 1002, 1003 in memory. The controller 1130 may be further configured to cause the illuminator 1110 to switch the pattern of dark and bright regions in the cross-section of the beam 1111 between patterns 710 shown in FIGS. 7D, 7E, and 7F, for one or more of the center wavelengths λc1, λc4, λc7.

The controller may then process the sets of images captured for two or more of the patterns to obtain the eye tracking information, such as the gaze direction and the spatial position of the eye 1150, e.g. of the pupil of the eye 1150. It is to be appreciated that the number of center wavelengths λc of the beam 1111 and the number of different spatial illumination patterns 710 in the description above are by way of example only, and the numbers of illumination wavelengths and illumination patterns used in various embodiments may differ from the above description, e.g. in a range from 2 to 20 wavelengths λc and/or from 2 to 20 illumination patterns 710.

Referring now to FIG. 12, a display system 1200, e.g. a NED, includes an illuminator 1210, which may be an embodiment of the illuminator 100 described above with reference to FIGS. 1-8. The illuminator 1210 is configured to illuminate an outside environment surrounding the viewer and the display system 1200, e.g. an outside scene 1250, with a wavelength-tunable patterned light beam 1211. The outside scene 1250 may include one or more objects, e.g. an object 1205. The display system 1200 further includes a camera 1220 and a controller 1230. The controller 1230 may be configured to control the illuminator 1210 and to receive images captured by the camera 1220. The camera 1220 may be configured to capture reflections 1213 of the patterned light beam 1211 from the object(s) 1205. The controller 1230 may be configured to tune the illuminator 1210 to vary the wavelength and/or a spatial modulation pattern of the light beam 1211, and to process images of the scene 1250 captured by the camera 1220 in coordination with the tuning of the illuminator 1210. The display system 1200 may include e.g. an NED or an HMD.

In some embodiments, the patterned light beam 1211 is invisible to the eye 1150, i.e. a beam of infrared light. In some embodiments, the patterned light beam 1211 is a beam of visible light. The display system 1200 may further include an image projecting module 1240 for projecting images to the eye 1150 of the viewer using visible light. The image projecting module 1240 may include, for example, one or more image conveying waveguides for conveying image-carrying light from an image source (not shown), e.g. a display panel or a scanning projector, to the eye 1150.

In an embodiment, the display system 1200 may be configured to implement an embodiment of the methods 900A and 900B of FIGS. 9A and 9B. The controller 1230 may be configured to perform, in cooperation with the camera 1220 and the illuminator 1210, the methods 900A and/or 900B of FIGS. 9A and 9B respectively to illuminate the scene 1250, or at least the object 1205 within the scene, with the light beam 1211 using different patterns of alternating bright and dark regions in a cross-section of the light beam 1211, and optionally at different illumination wavelengths. By processing the set(s) of images captured by the camera 1220 at different light/dark patterns of the illuminating light and/or different illumination wavelengths, a 3D image or model of the scene 1250, or at least of the object 1205, may be obtained.

Referring to FIG. 13, an augmented reality (AR) near-eye display (NED) 1300 includes a frame 1301 supporting, for each eye: an image projector 1330, e.g. an LC display panel or a scanning projector; and a lightguide 1310 for relaying image light generated by the image projector 1330 to an eyebox 1312 where the eye is located. One or more eyebox illuminators 1306, such as the illuminators 100, 1110, 1210 described above, may be placed to illuminate each eyebox 1312. An eye-tracking camera 1304 may be provided for each eyebox 1312 to receive light of the illuminators 1306 reflected from the eye in the eyebox 1312. The purpose of the cameras 1304 is to determine the position and/or orientation of both eyes of the user. The illuminators 1306 illuminate the eyes at the corresponding eyeboxes 1312 with a sequence of illumination patterns and/or a sequence of illumination wavelengths, allowing the eye-tracking cameras 1304 to obtain and process the images of the eyes for different illumination patterns and/or illumination wavelengths.

Turning to FIG. 14, an HMD 1400 is an example of an AR/VR wearable display system which encloses the user's face, for a greater degree of immersion into the AR/VR environment. The HMD 1400 may generate entirely virtual 3D imagery. The HMD 1400 may include a front body 1402 and a band 1404 that can be secured around the user's head. The front body 1402 is configured for placement in front of eyes of a user in a reliable and comfortable manner. A display system 1480 may be disposed in the front body 1402 for presenting AR/VR imagery to the user. The display system 1480 may include various display devices, e.g. such as display panels and scanning image projectors, and lightguides. Sides 1406 of the front body 1402 may be opaque or transparent.

In some embodiments, the front body 1402 includes an inertial measurement unit (IMU) 1410 for tracking acceleration of the HMD 1400, and position sensors 1412 for tracking position of the HMD 1400. The IMU 1410 is an electronic device that generates data indicating a position of the HMD 1400 based on measurement signals received from one or more of position sensors 1412, which generate one or more measurement signals in response to motion of the HMD 1400. Examples of position sensors 1412 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, one or more other suitable types of sensors that detect motion, a type of sensor used for error correction of the IMU 1410, or some combination thereof. The position sensors 1412 may be located external to the IMU 1410, internal to the IMU 1410, or some combination thereof.

The HMD 1400 may further include an illuminator 1411 and a depth camera assembly (DCA) 1408. The illuminator 1411, which may be an embodiment of the illuminator 100 of FIG. 1 or the illuminator 1210 of FIG. 12, is configured to illuminate the scenery in front of the HMD 1400 at a sequence of wavelengths and/or different illumination patterns, e.g. as described above. The DCA 1408 captures reflections of the illumination light from surrounding objects and processes corresponding images, e.g. to generate a 3D model of the environment in front of the HMD 1400, and/or obtain depth information for the environment surrounding some or all of the HMD 1400. The depth information may be compared with the information from the IMU 1410, for better accuracy of determination of position and orientation of the HMD 1400 in 3D space. In some embodiments, the 3D model of environment may be overlapped with the VR images generated by the display system 1480.

The HMD 1400 may further include an eye tracking system 1414 for determining orientation and position of user's eyes in real time. The eye tracking system 1414 may include a tunable patterned-beam illuminator for illuminating an eye of the user wearing the HMD 1400 with a sequence of illumination wavelengths and/or illumination patterns, and a camera for capturing a sequence of corresponding eye images, e.g. as described above with reference to FIG. 11 or FIG. 13. The images are processed to obtain a position and an orientation of the eye, allowing the HMD 1400 to determine the gaze direction of the user and to adjust the image generated by the display system 1480 accordingly. The determined gaze direction and vergence angle may be used to adjust the display system 1480 to reduce the vergence-accommodation conflict. The gaze direction and vergence may also be used for exit pupil steering of the displays. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided including e.g. a set of small speakers built into the front body 1402.

The functions of the various elements described above as “processors” and/or “controllers” may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional and/or custom, may also be included.

Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses such as visual information, audio, touch (somatosensation) information, acceleration, balance, etc., in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect for the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.

The present disclosure is not to be limited in scope by the specific embodiments described herein. Other various embodiments and modifications, in addition to those described herein, may be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.
