Patent: Stacked liquid crystal structures
Publication Number: 20210080759
Publication Date: 2021-03-18
Applicant: Facebook
Abstract
A first type of stacked LC structure includes at least two liquid crystal (LC) cells arranged in optical series that share a common substrate between adjacent LC cells. A second type of stacked LC structure includes at least two LC cells arranged in optical series that share a common electrode layer between adjacent LC cells. An optical assembly for use in a head mounted display (HMD) may include one or more stacked LC structures configured to transmit light in successive optical stages to provide a varifocal optical display assembly having adjustable optical power. By sharing a common substrate or a common electrode layer between adjacent LC cells, the total thickness of a stacked LC structure may be reduced, which may lead to a corresponding reduction in size and weight and improvement in user comfort for an HMD.
Claims
1. (canceled)
2. (canceled)
3. (canceled)
4. (canceled)
5. The stacked LC structure of claim 12, wherein the first state is associated with application of a first voltage to the common electrically conductive layer and the second state is associated with application of a second voltage to the common electrically conductive layer, wherein the first voltage is different than the second voltage.
6. The stacked LC structure of claim 5, wherein the first voltage is substantially equal to zero.
7. The stacked LC structure of claim 12, wherein the light of the first polarization is right circularly polarized light and the light of the second polarization is left circularly polarized light.
8. The stacked LC structure of claim 12, wherein the common electrically conductive layer is an optically transparent electrically conductive polymer.
9. The stacked LC structure of claim 8, wherein the optically transparent electrically conductive polymer is poly(3,4-ethylenedioxythiophene):polystyrene sulfonate (PEDOT:PSS).
10. The stacked LC structure of claim 12, wherein in the first state the stacked LC structure functions as one of a nominal quarter-wave plate or a nominal half-wave plate.
11. (canceled)
12. A stacked liquid crystal (LC) structure comprising: a bottom substrate; a top substrate; a common electrically conductive layer; a first LC cell disposed between an output surface of the bottom substrate and the common electrically conductive layer; and a second LC cell disposed between an input surface of the top substrate and the common electrically conductive layer; wherein the common electrically conductive layer acts as an electrode for the first LC cell and the second LC cell, wherein the stacked LC structure is configurable to be in a first state or a second state, and wherein: in the first state, the stacked LC structure converts incident light of a first polarization into light of a second polarization; and in the second state, the stacked LC structure transmits incident light without changing polarization of the incident light.
13. (canceled)
14. The stacked LC structure of claim 12, wherein the bottom substrate includes a first electrically conductive layer adjacent to the first LC cell and the top substrate includes a second electrically conductive layer adjacent to the second LC cell, and wherein the first electrically conductive layer and the common electrically conductive layer act as an electrode pair for the first LC cell and the second electrically conductive layer and the common electrically conductive layer act as an electrode pair for the second LC cell.
15. The stacked LC structure of claim 12, wherein the stacked LC structure is configurable to be in the first state or the second state by application of a voltage to the common electrically conductive layer.
16. The stacked LC structure of claim 12, wherein the common electrically conductive layer comprises an electrically conductive polymer.
17. The stacked LC structure of claim 12, further comprising an optical element on an output surface of the top substrate, wherein behavior of the optical element depends on polarization of light incident on the optical element.
18. A head mounted display comprising: a display configured to emit image light; and an optical assembly configured to transmit the image light, wherein the optical assembly comprises: a stacked liquid crystal (LC) structure comprising: a bottom substrate; a common electrically conductive layer; a top substrate; a first LC cell disposed between the bottom substrate and the common electrically conductive layer; and a second LC cell disposed between the common electrically conductive layer and the top substrate, wherein the common electrically conductive layer acts as an electrode for the first LC cell and the second LC cell, wherein the stacked LC structure is configurable to be in a first state or a second state, wherein: in the first state, the stacked LC structure converts incident light of a first polarization into light of a second polarization; and in the second state, the stacked LC structure transmits incident light without changing polarization of the incident light.
19. The head mounted display of claim 18, wherein the common electrically conductive layer is an optically transparent electrically conductive polymer.
20. The head mounted display of claim 18, wherein the stacked LC structure further comprises an optical element on an output surface of the top substrate, wherein behavior of the optical element depends on polarization of light incident on the optical element.
Description
[0001] This application claims the benefit of U.S. Provisional Application No. 62/900,123, filed Sep. 13, 2019, the entire content of which is incorporated by reference herein.
BACKGROUND
[0002] Artificial reality systems have applications in many fields such as computer gaming, health and safety, industry, and education. As a few examples, artificial reality systems are being incorporated into mobile devices, gaming consoles, personal computers, movie theaters, and theme parks. In general, artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivatives thereof.
[0003] Typical artificial reality systems include one or more devices for rendering and displaying content to users. As one example, an artificial reality system may incorporate a head-mounted display (HMD) worn by a user and configured to output artificial reality content to the user. The artificial reality content may consist entirely of content generated by the system or may include generated content combined with real-world content (e.g., pass-through views or captured real-world video and/or images of a user's physical environment). During operation, the user typically interacts with the artificial reality system to select content, launch applications, configure the system, and, in general, experience artificial reality environments.
SUMMARY
[0004] In general, the disclosure describes stacked liquid crystal (LC) structures that may be integrated into an optical assembly of a head mounted display. In accordance with some examples, the disclosure is directed to a stacked liquid crystal (LC) structure comprising a bottom substrate; a common substrate; a top substrate; a first LC cell disposed between the bottom substrate and the common substrate; a second LC cell disposed between the common substrate and the top substrate, wherein the common substrate includes or is coated with at least one electrically conductive layer that acts as an electrode for at least one of the two LC cells, wherein the stacked LC structure is configurable to be in a first state or a second state, wherein in the first state, the stacked LC structure converts incident light of a first polarization into light of a second polarization; and in the second state, the stacked LC structure transmits incident light without changing polarization of the incident light.
[0005] The common substrate may include an input surface and an output surface, and the common substrate may include a first electrically conductive layer disposed on the input surface that acts as an electrode for the first LC cell, and a second electrically conductive layer disposed on the output surface that acts as an electrode for the second LC cell. The bottom substrate may include a third electrically conductive layer adjacent to the first LC cell and the top substrate may include a fourth electrically conductive layer adjacent to the second LC cell, and the first and third electrically conductive layers may act as an electrode pair for the first LC cell and the second and fourth electrically conductive layers may act as an electrode pair for the second LC cell.
[0006] The stacked LC structure may be configurable to be in the first state or the second state by application of a voltage to the at least one electrically conductive layer. The first state may be associated with application of a first voltage to the at least one electrically conductive layer and the second state may be associated with application of a second voltage to the at least one electrically conductive layer, wherein the first voltage is different than the second voltage.
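The switching behavior summarized above can be sketched, under stated assumptions, as a two-state polarization switch driven by the voltage applied to the conductive layer(s). A minimal sketch in Python follows; the class name, the switching threshold, and the drive value used for the second state are illustrative assumptions (the disclosure indicates only that the two voltages differ and that the first voltage may be substantially equal to zero):

    RIGHT_CIRCULAR = "RCP"
    LEFT_CIRCULAR = "LCP"

    class StackedLCStructure:
        """Two LC cells in optical series addressed through shared electrode layers."""

        def __init__(self, switching_threshold_v=2.0):  # threshold is a hypothetical value
            self.switching_threshold_v = switching_threshold_v
            self.applied_voltage_v = 0.0  # the first state may correspond to roughly 0 V

        def apply_voltage(self, volts):
            """Drive the electrically conductive layer(s) addressing both LC cells."""
            self.applied_voltage_v = volts

        def in_first_state(self):
            return self.applied_voltage_v < self.switching_threshold_v

        def transmit(self, polarization):
            """Return the polarization state of light exiting the stacked structure."""
            if self.in_first_state():
                # First state: convert the incident polarization (e.g., RCP <-> LCP).
                return LEFT_CIRCULAR if polarization == RIGHT_CIRCULAR else RIGHT_CIRCULAR
            # Second state: transmit light without changing its polarization.
            return polarization

    stack = StackedLCStructure()
    stack.apply_voltage(0.0)
    assert stack.transmit(RIGHT_CIRCULAR) == LEFT_CIRCULAR   # first state converts
    stack.apply_voltage(5.0)                                 # 5 V is an example value
    assert stack.transmit(RIGHT_CIRCULAR) == RIGHT_CIRCULAR  # second state passes through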
[0007] In accordance with other examples, the disclosure is directed to a stacked liquid crystal (LC) structure comprising a bottom substrate; a top substrate; a common electrically conductive layer; a first LC cell disposed between an output surface of the bottom substrate and the common electrically conductive layer; and a second LC cell disposed between an input surface of the top substrate and the common electrically conductive layer; wherein the stacked LC structure is configurable to be in a first state or a second state, and wherein: in the first state, the stacked LC structure converts incident light of a first polarization into light of a second polarization; and in the second state, the stacked LC structure transmits incident light without changing polarization of the incident light.
[0008] The common electrically conductive layer may act as an electrode for the first LC cell and the second LC cell. The bottom substrate may include a first electrically conductive layer adjacent to the first LC cell and the top substrate may include a second electrically conductive layer adjacent to the second LC cell, wherein the first electrically conductive layer and the common electrically conductive layer act as an electrode pair for the first LC cell and the second electrically conductive layer and the common electrically conductive layer act as an electrode pair for the second LC cell.
[0009] The stacked LC structure may be configurable to be in the first state or the second state by application of a voltage to the common electrically conductive layer.
[0010] In accordance with other examples, the disclosure is directed to a head mounted display comprising a display configured to emit image light; and an optical assembly configured to transmit the image light, wherein the optical assembly comprises: a stacked liquid crystal (LC) structure comprising a bottom substrate; a common substrate; a top substrate; a first LC cell disposed between the bottom substrate and the common substrate; a second LC cell disposed between the common substrate and the top substrate, wherein the common substrate includes or is coated with at least one electrically conductive layer that acts as an electrode for at least one of the two LC cells, wherein the stacked LC structure is configurable to be in a first state or a second state, wherein in the first state, the stacked LC structure converts incident light of a first polarization into light of a second polarization; and in the second state, the stacked LC structure transmits incident light without changing polarization of the incident light.
[0011] In any of the above examples, the stacked LC structure(s) may further include an optical element on an output surface of the top substrate, wherein behavior of the optical element depends on polarization of light incident on the optical element.
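One such optical stage can be illustrated with a short sketch: the stacked LC structure selects the handedness of the light that reaches the polarization-dependent optical element, which then responds with one of two behaviors. Treating that element as a lens with two focal powers is an assumption made here for illustration, and the diopter values below are arbitrary; the disclosure requires only that the element's behavior depend on the incident polarization:

    def stage_optical_power(lc_in_first_state, incident_handedness="RCP",
                            power_rcp_diopters=+1.0, power_lcp_diopters=-1.0):
        """Optical power contributed by one LC switch plus a polarization-dependent element."""
        # The stacked LC structure either flips the handedness (first state) or preserves it.
        if lc_in_first_state:
            handedness = "LCP" if incident_handedness == "RCP" else "RCP"
        else:
            handedness = incident_handedness
        # The downstream element responds differently to each handedness.
        return power_rcp_diopters if handedness == "RCP" else power_lcp_diopters

    print(stage_optical_power(lc_in_first_state=True))   # -1.0 (element sees LCP)
    print(stage_optical_power(lc_in_first_state=False))  # +1.0 (element sees RCP)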
[0012] The details of one or more examples are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] FIG. 1 is an illustration depicting an example artificial reality system in which an optical assembly of a head mounted display (HMD) includes one or more stacked LC structures in accordance with the techniques described in this disclosure.
[0014] FIG. 2A is an illustration depicting an example HMD having an optical assembly that includes one or more stacked LC structures in accordance with techniques described in this disclosure.
[0015] FIG. 2B is an illustration depicting another example HMD, in accordance with techniques described in this disclosure.
[0016] FIG. 3 is a block diagram showing example implementations of a console, an HMD, and a peripheral device of the multi-device artificial reality systems of FIG. 1, in accordance with techniques described in this disclosure.
[0017] FIG. 4 is a block diagram depicting an example in which gesture detection, user interface generation, and virtual surface functions are performed by the HMD of the artificial reality systems of FIG. 1, in accordance with the techniques described in this disclosure.
[0018] FIG. 5 is a block diagram illustrating a more detailed example implementation of a distributed architecture for a multi-device artificial reality system in which one or more devices (e.g., peripheral device and HMD) are implemented using one or more System on a Chip (SoC) integrated circuits within each device, in accordance with the techniques described in this disclosure.
[0019] FIG. 6 illustrates an example stacked LC structure that includes two LC cells configured as Pi cells in accordance with some embodiments.
[0020] FIG. 7 illustrates an example stacked LC cell structure that includes two LC cells with antiparallel alignment in accordance with some embodiments.
[0021] FIG. 8A illustrates an example stacked LC cell structure that includes two LC cells with perpendicular alignment in accordance with some embodiments.
[0022] FIG. 8B illustrates the example stacked LC structure depicted in FIG. 8A in an alternate configuration in accordance with some embodiments.
[0023] FIG. 9 illustrates an example stacked LC structure having a common electrode layer in accordance with some embodiments.
[0024] FIG. 10 illustrates an example stacked LC structure, such as any of those shown in FIGS. 6-9, in combination with an optical element to form an optical stage in accordance with some embodiments.
DETAILED DESCRIPTION
[0025] FIG. 1 is an illustration depicting an example artificial reality system 10 including a head mounted display (HMD) 112, one or more controllers 114A and 114B (collectively, “controller(s) 114”), and a console 106. HMD 112 is typically worn by a user 110 and includes an electronic display and optical assembly for presenting artificial reality content 122 to user 110. The optical assembly of HMD 112 includes one or more stacked LC structures in accordance with the techniques described in this disclosure. For example, the optical assembly of HMD 112 may include one or more stacked LC structures configured to transmit light in successive optical stages as part of a varifocal optical display assembly having adjustable optical power.
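As a rough, illustrative sketch of how successive two-state optical stages can provide adjustable optical power, the snippet below sums per-stage contributions over every combination of stage states, using a thin-element approximation in which powers add; the per-stage diopter values are hypothetical:

    from itertools import product

    def total_power(stage_states, stage_powers):
        """Sum the power each stage contributes for its selected state (0 or 1)."""
        return sum(powers[state] for state, powers in zip(stage_states, stage_powers))

    # Hypothetical two-stage assembly: each stage contributes one of two powers (diopters).
    stage_powers = [(0.0, 0.5), (0.0, 1.0)]
    settings = {states: total_power(states, stage_powers)
                for states in product((0, 1), repeat=len(stage_powers))}
    print(settings)  # {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 0.5, (1, 1): 1.5}

With N such stages, up to 2**N distinct focal settings could be selected in this simplified model.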
[0026] HMD 112 includes one or more sensors (e.g., accelerometers) for tracking motion of HMD 112 and may include one or more image capture devices 138 (e.g., cameras, line scanners) for capturing image data of the surrounding physical environment. Although HMD 112 is illustrated as a head-mounted display, artificial reality system 10 may alternatively, or additionally, include glasses or other display devices for presenting artificial reality content 122 to user 110.
[0027] Each of controller(s) 114 is an input device that user 110 may use to provide input to console 106, HMD 112, or another component of artificial reality system 10. Controller(s) 114 may include one or more presence-sensitive surfaces for detecting user inputs by detecting a presence of one or more objects (e.g., fingers, stylus) touching or hovering over locations of the presence-sensitive surface. In some examples, controller(s) 114 may include an output display, which may be a presence-sensitive display. In some examples, controller(s) 114 may be a smartphone, tablet computer, personal data assistant (PDA), or other hand-held device. In some examples, controller(s) 114 may be a smartwatch, smartring, or other wearable device. Controller(s) 114 may also be part of a kiosk or other stationary or mobile system. Alternatively, or additionally, controller(s) 114 may include other user input mechanisms, such as one or more buttons, triggers, joysticks, D-pads, or the like, to enable a user to interact with and/or control aspects of the artificial reality content 122 presented to user 110 by artificial reality system 10.
[0028] In this example, console 106 is shown as a single computing device, such as a gaming console, a workstation, a desktop computer, or a laptop. In other examples, console 106 may be distributed across a plurality of computing devices, such as a distributed computing network, a data center, or a cloud computing system. Console 106, HMD 112, and sensors 90 may, as shown in this example, be communicatively coupled via network 104, which may be a wired or wireless network, such as Wi-Fi, a mesh network, or a short-range wireless communication medium, or a combination thereof. Although HMD 112 is shown in this example as being in communication with, e.g., tethered to or in wireless communication with, console 106, in some implementations HMD 112 operates as a stand-alone, mobile artificial reality system, and artificial reality system 10 may omit console 106.
[0029] In general, artificial reality system 10 renders artificial reality content 122 for display to user 110 at HMD 112. In the example of FIG. 1, user 110 views the artificial reality content 122 constructed and rendered by an artificial reality application executing on HMD 112 and/or console 106. In some examples, the artificial reality content 122 may be fully artificial, i.e., images not related to the environment in which user 110 is located. In some examples, artificial reality content 122 may comprise a mixture of real-world imagery (e.g., a hand of user 110, controller(s) 114, other environmental objects near user 110) and virtual objects to produce mixed reality and/or augmented reality. In some examples, virtual content items may be mapped (e.g., pinned, locked, placed) to a position within artificial reality content 122, e.g., relative to real-world imagery. A position for a virtual content item may be fixed, for instance relative to a wall or the earth. A position for a virtual content item may be variable, for instance relative to controller(s) 114 or a user. In some examples, the position of a virtual content item within artificial reality content 122 is associated with a position within the real-world, physical environment (e.g., on a surface of a physical object).
[0030] During operation, the artificial reality application constructs artificial reality content 122 for display to user 110 by tracking and computing pose information for a frame of reference, typically a viewing perspective of HMD 112. Using HMD 112 as a frame of reference, and based on a current field of view as determined by a current estimated pose of HMD 112, the artificial reality application renders 3D artificial reality content which, in some examples, may be overlaid, at least in part, upon the real-world, 3D physical environment of user 110. During this process, the artificial reality application uses sensed data received from HMD 112, such as movement information and user commands, and, in some examples, data from any external sensors 90, such as external cameras, to capture 3D information within the real-world, physical environment, such as motion by user 110 and/or feature tracking information with respect to user 110. Based on the sensed data, the artificial reality application determines a current pose for the frame of reference of HMD 112 and, in accordance with the current pose, renders the artificial reality content 122.
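The per-frame flow described above can be sketched roughly as follows; the data layout and the simple orientation integration are assumptions made for illustration and are not a description of the actual tracking or rendering pipeline of HMD 112:

    from dataclasses import dataclass

    @dataclass
    class Pose:
        yaw: float    # radians about the vertical axis
        pitch: float  # radians about the horizontal axis

    def update_pose(pose, gyro_yaw_rate, gyro_pitch_rate, dt):
        """Integrate angular-rate samples from the motion sensors into a new pose estimate."""
        return Pose(pose.yaw + gyro_yaw_rate * dt, pose.pitch + gyro_pitch_rate * dt)

    def render_frame(pose, virtual_objects):
        """Select content for the current viewing perspective (placeholder renderer)."""
        # A real renderer would project 3D content for this pose; here we only record it.
        return {"pose": pose, "objects": list(virtual_objects)}

    # One iteration: sensed data -> current pose -> artificial reality content for that pose.
    pose = Pose(yaw=0.0, pitch=0.0)
    pose = update_pose(pose, gyro_yaw_rate=0.1, gyro_pitch_rate=0.0, dt=1 / 90)
    frame = render_frame(pose, virtual_objects=["virtual content item"])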
[0031] Artificial reality system 10 may trigger generation and rendering of virtual content items based on a current field of view 130 of user 110, as may be determined by real-time gaze tracking of the user, or other conditions. More specifically, image capture devices 138 of HMD 112 capture image data representative of objects in the real-world, physical environment that are within a field of view 130 of image capture devices 138. Field of view 130 typically corresponds with the viewing perspective of HMD 112. In some examples, the artificial reality application presents artificial reality content 122 comprising mixed reality and/or augmented reality. As illustrated in FIG. 1, the artificial reality application may render images of real-world objects, such as the portions of peripheral device 136, hand 132, and/or arm 134 of user 110, that are within field of view 130 alongside the virtual objects within artificial reality content 122. In other examples, the artificial reality application may render virtual representations of the portions of peripheral device 136, hand 132, and/or arm 134 of user 110 that are within field of view 130 (e.g., render real-world objects as virtual objects) within artificial reality content 122. In either example, user 110 is able to view the portions of their hand 132, arm 134, peripheral device 136, and/or any other real-world objects that are within field of view 130 within artificial reality content 122. In other examples, the artificial reality application may not render representations of the hand 132 or arm 134 of the user.
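A simple way to sketch the field-of-view test implied here is an angular check between a capture device's forward direction and the direction to an object; the vectors and the 90-degree field of view below are example values only, not parameters of image capture devices 138:

    import math

    def _norm(v):
        return math.sqrt(sum(c * c for c in v))

    def within_field_of_view(camera_forward, to_object, fov_degrees):
        """True if the direction to the object lies inside the device's cone of view."""
        dot = sum(a * b for a, b in zip(camera_forward, to_object))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (_norm(camera_forward) * _norm(to_object))))))
        return angle <= fov_degrees / 2

    # An object about 30 degrees off the device axis, with a 90-degree field of view.
    print(within_field_of_view((0.0, 0.0, 1.0), (0.5, 0.0, 0.866), 90.0))  # True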
[0032] FIG. 2A is an illustration depicting an example HMD 112. HMD 112 may be part of an artificial reality system, such as artificial reality system 10 of FIG. 1, or may operate as a stand-alone, mobile artificial reality system configured to implement the techniques described herein. HMD 112 includes an optical assembly having one or more stacked LC structures in accordance with the techniques described in this disclosure.
[0033] In this example, HMD 112 includes a front rigid body and a band to secure HMD 112 to a user. In addition, HMD 112 includes an interior-facing electronic display 203 configured to present artificial reality content to the user via an optical assembly 205. Electronic display 203 may use any suitable display technology, such as a liquid crystal display (LCD), quantum dot display, dot matrix display, light-emitting diode (LED) display, organic light-emitting diode (OLED) display, cathode ray tube (CRT) display, e-ink display, or any other type of monochrome or color display capable of generating visual output. In some examples, the electronic display is a stereoscopic display for providing separate images to each eye of the user. In some examples, the known orientation and position of display 203 relative to the front rigid body of HMD 112 are used as a frame of reference, also referred to as a local origin, when tracking the position and orientation of HMD 112 for rendering artificial reality content according to a current viewing perspective of HMD 112 and the user. In other examples, HMD 112 may take the form of other wearable head mounted displays, such as glasses or goggles.
[0034] Optical assembly 205 includes optical elements configured to manage light output by electronic display 203 for viewing by the user of HMD 112 (e.g., user 110 of FIG. 1). The optical elements may include, for example, one or more lenses, one or more diffractive optical elements, one or more reflective optical elements, one or more waveguides, or the like, that manipulate (e.g., focus, defocus, reflect, refract, diffract, or the like) light output by electronic display 203. In accordance with the techniques of the present disclosure, optical assembly 205 includes one or more stacked LC structures. For example, optical assembly 205 may include one or more stacked LC structures configured to transmit light in successive optical stages as part of a varifocal optical display assembly having adjustable optical power. The stacked LC structure may include two LC cells arranged in optical series that share a common substrate between the LC cells. In some previous stacked LC structures, each LC cell is surrounded by corresponding first and second substrates (e.g., one substrate on a first side of the LC cell and one substrate on a second side of the LC cell). By sharing a common, middle substrate, the total thickness of the stacked LC structure may be reduced. This may reduce a size or thickness of optical assembly 205. Reducing the size or thickness of optical assembly 205 may enable a reduction in size and weight of HMD 112, which may improve comfort of user 110.
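A back-of-the-envelope comparison illustrates the thickness saving from sharing the middle substrate. The layer thicknesses below are hypothetical example values chosen only for illustration; the disclosure does not specify dimensions:

    substrate_um = 500.0  # one glass or polymer substrate (assumed value)
    lc_cell_um = 5.0      # one LC layer (assumed value)

    # Two independent LC cells, each bounded by its own pair of substrates.
    separate = 2 * (2 * substrate_um + lc_cell_um)  # 2010.0 um

    # Stacked structure sharing one common middle substrate between the two cells.
    shared = 3 * substrate_um + 2 * lc_cell_um      # 1510.0 um

    print(f"separate: {separate} um, shared: {shared} um, saved: {separate - shared} um")

In this simplified accounting, the shared arrangement saves roughly the thickness of one substrate.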
[0035] As further shown in FIG. 2A, in this example, HMD 112 further includes one or more motion sensors 206, such as one or more accelerometers (also referred to as inertial measurement units or “IMUs”) that output data indicative of current acceleration of HMD 112, GPS sensors that output data indicative of a location of HMD 112, radar or sonar that output data indicative of distances of HMD 112 from various objects, or other sensors that provide indications of a location or orientation of HMD 112 or other objects within a physical environment. Moreover, HMD 112 may include integrated image capture devices 138A and 138B (collectively, “image capture devices 138”), such as video cameras, laser scanners, Doppler radar scanners, depth scanners, or the like, configured to output image data representative of the physical environment. More specifically, image capture devices 138 capture image data representative of objects (including peripheral device 136 and/or hand 132) in the physical environment that are within a field of view 130A, 130B of image capture devices 138, which typically corresponds with the viewing perspective of HMD 112. HMD 112 includes an internal control unit 210, which may include an internal power source and one or more printed-circuit boards having one or more processors, memory, and hardware to provide an operating environment for executing programmable operations to process sensed data and present artificial reality content on display 203.
……
……
……