Patent: Spatial Deposition Of Resins With Different Functionality On Different Substrates
Publication Number: 20200355862
Publication Date: 2020-11-12
Applicants: Facebook
Abstract
Techniques disclosed herein relate to optical devices. Resins with different optical properties can be deposited in different areas to provide increased optical functionality. It can be difficult to design a single photopolymer material that meets several technical requirements. Different resins can be deposited on the same substrate to make a single film with spatially varying properties. Different resins can also be applied to different substrates in a stack. By using different resins, an optical component can be made that meets several technical requirements.
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No. 62/845,154, filed on May 8, 2019, the disclosure of which is incorporated by reference in its entirety for all purposes.
[0002] The following two U.S. patent applications (including this one) are being filed concurrently, and the entire disclosure of the other application is incorporated by reference into this application for all purposes:
[0003] application Ser. No. 16/_, filed May 1, 2020, entitled “Spatial Deposition of Resins with Different Functionality”; and
[0004] application Ser. No. 16/_, filed May 1, 2020, entitled “Spatial Deposition of Resins with Different Functionality on Different Substrates.”
BACKGROUND
[0005] An artificial reality system, such as a head-mounted display (HMD) or heads-up display (HUD) system, generally includes a near-eye display system in the form of a headset or a pair of glasses configured to present content to a user via an electronic or optic display positioned within, for example, about 10-20 mm in front of the user’s eyes. The near-eye display system may display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view both images of virtual objects (e.g., computer-generated images (CGIs)) and the surrounding environment by, for example, seeing through transparent display glasses or lenses (often referred to as optical see-through).
[0006] One example of an optical see-through AR system may use a waveguide-based optical display, where light of projected images may be coupled into a waveguide (e.g., a transparent substrate), propagate within the waveguide, and be coupled out of the waveguide at different locations. In some implementations, the light of the projected images may be coupled into or out of the waveguide using a diffractive optical element, such as a holographic grating. In some implementations, the artificial reality systems may employ eye-tracking subsystems that can track the user’s eye (e.g., gaze direction) to modify or generate content based on the direction in which the user is looking, thereby providing a more immersive experience for the user. The eye-tracking subsystems may be implemented using various optical components, such as holographic optical elements.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Illustrative embodiments are described in detail below with reference to the following figures.
[0008] FIG. 1 is a simplified block diagram of an example of an artificial reality system environment including a near-eye display system according to certain embodiments.
[0009] FIG. 2 is a perspective view of an example of a near-eye display system in the form of a head-mounted display (HMD) device for implementing some of the examples disclosed herein.
[0010] FIG. 3 is a perspective view of an example of a near-eye display system in the form of a pair of glasses for implementing some of the examples disclosed herein.
[0011] FIG. 4 illustrates an example of an optical see-through augmented reality system using a waveguide display that includes an optical combiner according to certain embodiments.
[0012] FIG. 5A illustrates an example of a volume Bragg grating. FIG. 5B illustrates the Bragg condition for the volume Bragg grating shown in FIG. 5A.
[0013] FIG. 6A illustrates the recording light beams for recording a volume Bragg grating according to certain embodiments. FIG. 6B is an example of a holography momentum diagram illustrating the wave vectors of recording beams and reconstruction beams and the grating vector of the recorded volume Bragg grating according to certain embodiments.
[0014] FIG. 7 illustrates an example of a holographic recording system for recording holographic optical elements according to certain embodiments.
[0015] FIG. 8 is a simplified diagram of an embodiment of an inkjet depositing a first resin on a substrate.
[0016] FIG. 9 is a simplified diagram of an embodiment of the inkjet depositing a second resin on the substrate.
[0017] FIG. 10 illustrates a two-dimensional map of spatial frequency response of an embodiment of an optical device.
[0018] FIG. 11 is a simplified diagram of an embodiment of a stack having resins with different properties.
[0019] FIG. 12 is a chart of optical absorption of embodiments of different resins of a stack.
[0020] FIG. 13 is a simplified flow chart illustrating an example of a method of applying two materials to one substrate according to certain embodiments.
[0021] FIG. 14 is a simplified flow chart illustrating an example of a method of creating a stacked optical device according to certain embodiments.
[0022] FIG. 15 is a simplified block diagram of an example of an electronic system 1500 of a near-eye display system (e.g., HMD device) for implementing some of the examples disclosed herein according to certain embodiments.
[0023] The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated may be employed without departing from the principles, or benefits touted, of this disclosure.
[0024] In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
DETAILED DESCRIPTION
[0025] Techniques disclosed herein relate generally to optical devices. More specifically, and without limitation, this disclosure relates to optical devices for artificial-reality systems. According to certain embodiments, a grating for an artificial-reality display is described. Various inventive embodiments are described herein, including systems, modules, devices, components, methods, and the like.
[0026] In an artificial reality system, such as an augmented reality (AR) or mixed reality (MR) system, various holographic optical elements may be used for light beam coupling and/or shaping to improve the performance of the system, such as improving the brightness of the displayed images, expanding the eyebox, reducing artifacts, increasing the field of view, and improving user interaction with presented content. A volume Bragg grating can be used in an artificial-reality display (e.g., to couple light out of and/or into a waveguide). It can be difficult to design a single photopolymer material that meets many technical requirements (e.g., high dynamic range, low absorption and haze, good resolution at both high and low spatial frequencies, sensitivity across the visible spectrum, etc.). It can be especially difficult to design a single resin capable of patterning both large-pitch and small-pitch features, due to the reaction/diffusion mechanisms inherent to the materials used. Accordingly, it can be beneficial to design several photopolymer materials that each meet only some requirements but, when combined into a single film or a stack of films, meet all desired requirements. For some embodiments, this specification describes: (A) depositing different resins on the same substrate to make a single film with spatially varying properties (e.g., absorption, spatial frequency response, etc.); and (B) depositing different resins on different substrates and combining the different substrates either before or after exposure to make a single optical device.
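As background for the pitch trade-off noted above (a standard volume-grating relationship, not language from this disclosure), the first-order Bragg condition ties the grating pitch to the wavelength and angle that the grating selects:

\[ \lambda \approx 2\, n\, \Lambda\, \sin\theta_B, \qquad f = \frac{1}{\Lambda}, \]

where \(\lambda\) is the free-space replay wavelength, \(n\) is the average refractive index of the photopolymer, \(\Lambda\) is the grating period (pitch), \(\theta_B\) is the angle (inside the medium) between the replay beam and the grating planes, and \(f\) is the corresponding spatial frequency. Recording low-spatial-frequency (large-pitch) and high-spatial-frequency (small-pitch) fringes in a single resin calls for very different monomer diffusion behavior, which is one way to see why splitting the requirements across several resins or films can be attractive.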
[0027] As used herein, visible light may refer to light with a wavelength between about 380 nm and about 750 nm, between about 400 nm and about 700 nm, or between about 440 nm and about 650 nm. Near infrared (NIR) light may refer to light with a wavelength between about 750 nm and about 2500 nm. The desired infrared (IR) wavelength range may refer to the wavelength range of IR light that can be detected by a suitable IR sensor (e.g., a complementary metal-oxide semiconductor (CMOS) sensor, a charge-coupled device (CCD) sensor, or an InGaAs sensor), such as between 830 nm and 860 nm, between 930 nm and 980 nm, or between about 750 nm and about 1000 nm.
[0028] As also used herein, a substrate may refer to a medium within which light may propagate. The substrate may include one or more types of dielectric materials, such as glass, quartz, plastic, polymer, poly(methyl methacrylate) (PMMA), crystal, or ceramic. At least one type of material of the substrate may be transparent to visible light and NIR light. A thickness of the substrate may range from, for example, less than about 1 mm to about 10 mm or more. As used herein, a material may be “transparent” to a light beam if the light beam can pass through the material with a high transmission rate, such as larger than 60%, 75%, 80%, 90%, 95%, 98%, 99%, or higher, where a small portion of the light beam (e.g., less than 40%, 25%, 20%, 10%, 5%, 2%, 1%, or less) may be scattered, reflected, or absorbed by the material. The transmission rate (i.e., transmissivity) may be represented by either a weighted or an unweighted average transmission rate over a range of wavelengths, or the lowest transmission rate over a range of wavelengths, such as the visible wavelength range.
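As an illustrative aside (a minimal sketch, not part of this disclosure), the aggregate transmission rate described above, whether a weighted or unweighted average over a wavelength range or the minimum over that range, can be computed as follows; the array names and the flat 95% example spectrum are hypothetical, and the weights could hold, for example, a luminosity weighting.

```python
import numpy as np

def transmissivity(transmission, weights=None, mode="average"):
    """Aggregate a per-wavelength transmission curve into one number.

    mode="average": weighted (or unweighted) mean over the range.
    mode="minimum": lowest transmission rate over the range.
    """
    t = np.asarray(transmission, dtype=float)
    if mode == "minimum":
        return float(t.min())
    w = np.ones_like(t) if weights is None else np.asarray(weights, dtype=float)
    return float(np.sum(w * t) / np.sum(w))

wavelengths_nm = np.linspace(380, 750, 38)          # visible-range samples
transmission = np.full_like(wavelengths_nm, 0.95)   # illustrative 95% curve
print(transmissivity(transmission))                 # 0.95: "transparent" at a 90% threshold
print(transmissivity(transmission, mode="minimum"))
```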
[0029] In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of examples of the disclosure. However, it will be apparent that various examples may be practiced without these specific details. For example, devices, systems, structures, assemblies, methods, and other components may be shown as components in block diagram form in order not to obscure the examples in unnecessary detail. In other instances, well-known devices, processes, systems, structures, and techniques may be shown without necessary detail in order to avoid obscuring the examples. The figures and description are not intended to be restrictive. The terms and expressions that have been employed in this disclosure are used as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding any equivalents of the features shown and described or portions thereof. The word “example” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or design described herein as “example” is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
[0030] FIG. 1 is a simplified block diagram of an example of an artificial reality system environment 100 including a near-eye display system 120 in accordance with certain embodiments. Artificial reality system environment 100 shown in FIG. 1 may include near-eye display system 120, an optional imaging device 150, and an optional input/output interface 140 that may each be coupled to an optional console 110. While FIG. 1 shows example artificial reality system environment 100 including one near-eye display system 120, one imaging device 150, and one input/output interface 140, any number of these components may be included in artificial reality system environment 100, or any of the components may be omitted. For example, there may be multiple near-eye display systems 120 monitored by one or more external imaging devices 150 in communication with console 110. In some configurations, artificial reality system environment 100 may not include imaging device 150, optional input/output interface 140, and optional console 110. In alternative configurations, different or additional components may be included in artificial reality system environment 100. In some configurations, near-eye display systems 120 may include imaging device 150, which may be used to track one or more input/output devices (e.g., input/output interface 140), such as a handheld controller.
[0031] Near-eye display system 120 may be a head-mounted display that presents content to a user. Examples of content presented by near-eye display system 120 include one or more of images, video, audio, or some combination thereof. In some embodiments, audio may be presented via an external device (e.g., speakers and/or headphones) that receives audio information from near-eye display system 120, console 110, or both, and presents audio data based on the audio information. Near-eye display system 120 may include one or more rigid bodies, which may be rigidly or non-rigidly coupled to each other. A rigid coupling between rigid bodies may cause the coupled rigid bodies to act as a single rigid entity. A non-rigid coupling between rigid bodies may allow the rigid bodies to move relative to each other. In various embodiments, near-eye display system 120 may be implemented in any suitable form factor, including a pair of glasses. Some embodiments of near-eye display system 120 are further described below. Additionally, in various embodiments, the functionality described herein may be used in a headset that combines images of an environment external to near-eye display system 120 and artificial reality content (e.g., computer-generated images). Therefore, near-eye display system 120 may augment images of a physical, real-world environment external to near-eye display system 120 with generated content (e.g., images, video, sound, etc.) to present an augmented reality to a user.
[0032] In various embodiments, near-eye display system 120 may include one or more of display electronics 122, display optics 124, and an eye-tracking system 130. In some embodiments, near-eye display system 120 may also include one or more locators 126, one or more position sensors 128, and an inertial measurement unit (IMU) 132. Near-eye display system 120 may omit any of these elements or include additional elements in various embodiments. Additionally, in some embodiments, near-eye display system 120 may include elements combining the function of various elements described in conjunction with FIG. 1.
[0033] Display electronics 122 may display or facilitate the display of images to the user according to data received from, for example, console 110. In various embodiments, display electronics 122 may include one or more display panels, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an inorganic light emitting diode (ILED) display, a micro light emitting diode (µLED) display, an active-matrix OLED display (AMOLED), a transparent OLED display (TOLED), or some other display. For example, in one implementation of near-eye display system 120, display electronics 122 may include a front TOLED panel, a rear display panel, and an optical component (e.g., an attenuator, polarizer, or diffractive or spectral film) between the front and rear display panels. Display electronics 122 may include pixels to emit light of a predominant color such as red, green, blue, white, or yellow. In some implementations, display electronics 122 may display a three-dimensional (3D) image through stereo effects produced by two-dimensional panels to create a subjective perception of image depth. For example, display electronics 122 may include a left display and a right display positioned in front of a user’s left eye and right eye, respectively. The left and right displays may present copies of an image shifted horizontally relative to each other to create a stereoscopic effect (i.e., a perception of image depth by a user viewing the image).
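As an illustrative aside (a minimal sketch under standard pinhole-stereo assumptions, not taken from this disclosure), the horizontal shift between the left-eye and right-eye copies of an image maps to a perceived depth roughly as disparity = focal length (in pixels) × interpupillary distance ÷ depth; the interpupillary distance and focal length values below are hypothetical.

```python
# Standard pinhole-stereo relation between the horizontal image shift and
# the perceived depth of a point:
#     disparity_px = focal_length_px * ipd_m / depth_m

def disparity_px(depth_m: float, ipd_m: float = 0.063,
                 focal_length_px: float = 1200.0) -> float:
    """Horizontal pixel shift that places a point at `depth_m` meters."""
    return focal_length_px * ipd_m / depth_m

for depth in (0.5, 1.0, 2.0, 10.0):
    print(f"{depth:5.1f} m -> {disparity_px(depth):6.1f} px shift")
```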
[0034] In certain embodiments, display optics 124 may display image content optically (e.g., using optical waveguides and couplers), magnify image light received from display electronics 122, correct optical errors associated with the image light, and present the corrected image light to a user of near-eye display system 120. In various embodiments, display optics 124 may include one or more optical elements, such as, for example, a substrate, optical waveguides, an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, input/output couplers, or any other suitable optical elements that may affect image light emitted from display electronics 122. Display optics 124 may include a combination of different optical elements as well as mechanical couplings to maintain relative spacing and orientation of the optical elements in the combination. One or more optical elements in display optics 124 may have an optical coating, such as an anti-reflective coating, a reflective coating, a filtering coating, or a combination of different optical coatings.
[0035] Magnification of the image light by display optics 124 may allow display electronics 122 to be physically smaller, weigh less, and consume less power than larger displays. Additionally, magnification may increase a field of view of the displayed content. The amount of magnification of image light by display optics 124 may be changed by adjusting, adding, or removing optical elements from display optics 124. In some embodiments, display optics 124 may project displayed images to one or more image planes that may be further away from the user’s eyes than near-eye display system 120.
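One standard simple-magnifier approximation (an assumption for illustration, not a statement about display optics 124) makes the trade-off concrete: with a panel of width \(w\) placed near the focal plane of optics with focal length \(f\), the collimated image subtends approximately

\[ \mathrm{FOV} \approx 2\arctan\!\left(\frac{w}{2f}\right), \]

so a shorter focal length (stronger magnification) widens the field of view for the same panel, which is why a physically smaller, lighter panel can still fill a large field of view.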
[0036] Display optics 124 may also be designed to correct one or more types of optical errors, such as two-dimensional optical errors, three-dimensional optical errors, or a combination thereof. Two-dimensional errors may include optical aberrations that occur in two dimensions. Example types of two-dimensional errors may include barrel distortion, pincushion distortion, longitudinal chromatic aberration, and transverse chromatic aberration. Three-dimensional errors may include optical errors that occur in three dimensions. Example types of three-dimensional errors may include spherical aberration, comatic aberration, field curvature, and astigmatism.
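As an illustrative aside (a minimal sketch, not from this disclosure), barrel and pincushion distortion are commonly described by a radial polynomial model, and a rendering pipeline can pre-warp the image with the inverse of the lens model so that the optics’ distortion cancels it; the coefficients k1 and k2 below are hypothetical.

```python
# Radial distortion model: r' = r * (1 + k1*r^2 + k2*r^4).
# Negative k1 compresses the periphery (barrel); positive k1 stretches it
# (pincushion). Pre-warping with the inverse of the lens model corrects it.

def radial_distort(x: float, y: float, k1: float = -0.20, k2: float = 0.05):
    """Apply the radial model to a normalized coordinate (x, y) measured
    from the optical center."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

print(radial_distort(0.5, 0.0))   # a point halfway to the edge, on-axis
```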
[0037] Locators 126 may be objects located in specific positions on near-eye display system 120 relative to one another and relative to a reference point on near-eye display system 120. In some implementations, console 110 may identify locators 126 in images captured by imaging device 150 to determine the artificial reality headset’s position, orientation, or both. A locator 126 may be a light emitting diode (LED), a corner cube reflector, a reflective marker, a type of light source that contrasts with an environment in which near-eye display system 120 operates, or some combinations thereof. In embodiments where locators 126 are active components (e.g., LEDs or other types of light emitting devices), locators 126 may emit light in the visible band (e.g., about 380 nm to 750 nm), in the infrared (IR) band (e.g., about 750 nm to 1 mm), in the ultraviolet band (e.g., about 10 nm to about 380 nm), in another portion of the electromagnetic spectrum, or in any combination of portions of the electromagnetic spectrum.
[0038] Imaging device 150 may be part of near-eye display system 120 or may be external to near-eye display system 120. Imaging device 150 may generate slow calibration data based on calibration parameters received from console 110. Slow calibration data may include one or more images showing observed positions of locators 126 that are detectable by imaging device 150. Imaging device 150 may include one or more cameras, one or more video cameras, any other device capable of capturing images including one or more of locators 126, or some combinations thereof. Additionally, imaging device 150 may include one or more filters (e.g., to increase signal to noise ratio). Imaging device 150 may be configured to detect light emitted or reflected from locators 126 in a field of view of imaging device 150. In embodiments where locators 126 include passive elements (e.g., retroreflectors), imaging device 150 may include a light source that illuminates some or all of locators 126, which may retro-reflect the light to the light source in imaging device 150. Slow calibration data may be communicated from imaging device 150 to console 110, and imaging device 150 may receive one or more calibration parameters from console 110 to adjust one or more imaging parameters (e.g., focal length, focus, frame rate, sensor temperature, shutter speed, aperture, etc.).
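A minimal sketch (hypothetical field names, not from this disclosure) of the slow-calibration exchange described above, with console 110 supplying imaging parameters and imaging device 150 returning observed locator positions:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ImagingParameters:           # console 110 -> imaging device 150
    focal_length_mm: float = 4.0
    frame_rate_hz: float = 30.0
    exposure_ms: float = 8.0

@dataclass
class SlowCalibrationSample:       # imaging device 150 -> console 110
    timestamp_s: float
    locator_positions_px: List[Tuple[float, float]] = field(default_factory=list)

def capture(params: ImagingParameters) -> SlowCalibrationSample:
    """Placeholder capture that would detect locators 126 in one frame."""
    return SlowCalibrationSample(timestamp_s=0.0,
                                 locator_positions_px=[(320.0, 240.0)])

sample = capture(ImagingParameters(frame_rate_hz=15.0))
print(sample)
```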
……
……
……