Facebook Patent | Virtual reality systems and methods
Patent: Virtual reality systems and methods
Publication Number: 20210325683
Publication Date: October 21, 2021
Applicant: Facebook
Abstract
The disclosed systems and methods may include an example assembly for isolating an inertial measurement unit, for example, in the case of a virtual reality system. Additionally, an example system may include shock-absorbing devices, for example, for head-mounted displays. In some examples, the disclosed systems and methods may include a form-in-place gasket for water ingress protection around a flexible printed circuit board. The disclosed systems may also include a system for enhancing remote or virtual social experiences using biosignals. The disclosed systems may additionally include a split device with identity fixed to physical location. Various other related methods and systems are also disclosed.
Claims
-
A method comprising: a processor; a memory device comprising instructions that, when executed by the processor, perform at least one of: a process for enhancing remote or virtual social experiences using biosignals comprising: obtaining biosignals from a first user; obtaining biosignals from a second user; transforming the biosignals of the first user into position information or force information for the first user’s body; transforming the biosignals of the second user into position information or force information for the second user’s body; and updating position information or force information for a virtual object based on at least one of the position information or the force information of the first user’s body or the position information or the force information of the second user’s body; or an additional process for enhancing remote or virtual social experiences using biosignals comprising: hosting a virtual environment for the first user and the second user, the virtual environment having at least one virtual object with which the first user and the second user may simultaneously interact; receiving, while the first user interacts with the at least one virtual object, position information or force information for the first user’s body, the position information or the force information for the first user’s body having been derived from biosignals obtained from the first user’s body; receiving, while the second user interacts with the at least one virtual object, position information or force information for the second user’s body, the position information or the force information for the second user’s body having been derived from biosignals obtained from the second user’s body; and updating position information or force information for the at least one virtual object based on at least one of the position information or the force information of the first user’s body or the position information or the force information of the second user’s body; or a process for configuring data comprising: initializing configuration data stored in a memory of a mount device; installing a configurable device into a mounting portion of the mount device; and configuring the configurable device using the configuration data.
-
A system comprising at least one of: a shock absorbing device comprising: a shock absorbing material and an adhesive material, wherein the shock absorbing material is shaped and configured to partially surround an image sensor and the adhesive material is positioned and configured to secure the shock absorbing material to the image sensor; or a mount device comprising: a physical memory storing configuration data associated with a location of the mount device; a mounting portion for holding a configurable device; and a communication module for communicatively coupling the mount device and the configurable device; or a circuit board enclosure device comprising: a flexible printed circuit board comprising at least one section and a gasket bonded to the section, the gasket enclosing the section of the flexible printed circuit board and being formed by: placing the section of the flexible printed circuit board in a molding fixture; injecting a fluid into the molding fixture; curing the fluid; and removing the molding fixture from the section of the flexible printed circuit board.
-
The system of claim 2, wherein, when an impact force is imparted to the image sensor, the shock-absorbing material is configured to transfer the impact force to a base of the image sensor.
-
The system of claim 2, wherein the shock-absorbing material is configured to absorb an impact force imparted to the image sensor.
-
The system of claim 4, wherein absorbing the impact force imparted to the image sensor comprises distributing the impact force across the shock-absorbing material.
-
The system of claim 2, wherein the shock-absorbing material is configured to substantially maintain a structural integrity of the image sensor when an impact force is imparted to the image sensor.
-
The system of claim 2, wherein the shock-absorbing material comprises at least one of: a polymer material; an elastomer; a plastic; a polyethylene material; a polycarbonate material; an acrylonitrile butadiene styrene material; a visco-elastic polymer material; a polymer matrix composite material; a fiber-reinforced polymer composite material; a polyurethane material; a butyl rubber material; or a neoprene rubber material.
-
The system of claim 2, wherein the image sensor is integrated into a head-mounted display.
-
A system comprising at least one of: an inertial measurement unit assembly system comprising: a circuit board; an inertial measurement unit coupled to the circuit board; a frame; and an isolation assembly disposed between the circuit board and the frame, the isolation assembly configured to reduce vibrations in at least a portion of the circuit board adjacent to the inertial measurement unit; or an additional system for a shock-absorbing head-mounted display comprising: a head-mounted display; an image sensor; and a shock absorbing device, wherein the shock absorbing device comprises a shock absorbing material, the shock absorbing device is secured to the image sensor by an adhesive material, and the shock absorbing material is shaped and configured to partially surround the image sensor; or a system for enhancing remote or virtual social experiences using biosignals comprising: at least one physical processor and physical memory storing computer-executable instructions that, when executed by the physical processor, cause the physical processor to: obtain biosignals from a first user; obtain biosignals from a second user; transform the biosignals of the first user into position information or force information for the first user’s body; transform the biosignals of the second user into position information or force information for the second user’s body; and update position information or force information for a virtual object based on at least one of the position information or the force information of the first user’s body or the position information or the force information of the second user’s body; or an additional system for enhancing remote or virtual social experiences using biosignals comprising: at least one additional physical processor and additional physical memory storing computer-executable instructions that, when executed by the additional physical processor, cause the additional physical processor to: host a virtual environment for the first user and the second user, the virtual environment having at least one virtual object with which the first user and the second user simultaneously interact; receive, while the first user and the second user interact with the at least one virtual object, position information or force information for the first user’s body, the position information or the force information for the first user’s body having been derived from biosignals obtained from the first user’s body; receive, while the first user and the second user interact with the at least one virtual object, position information or force information for the second user’s body, the position information or the force information for the second user’s body having been derived from biosignals obtained from the second user’s body; and update position information or force information for the at least one virtual object based on the position information or the force information of the first user’s body and/or the position information or the force information of the second user’s body; or a system comprising a configurable device and a mount device comprising: a physical memory storing configuration data associated with a location of the mount device; a mounting portion for holding the configurable device; and a communication module for communicatively coupling the mount device and the configurable device.
-
The system of claim 9, wherein the isolation assembly comprises one or more of: a rigid piece; or a compressible foam layer.
-
The system of claim 9, wherein the configuration data comprises at least one of a location name, location data, or application data.
-
The system of claim 9, wherein the configuration data comprises data for initializing the configurable device.
-
The system of claim 12, wherein the configuration data comprises data for initializing an application on the configurable device.
-
The system of claim 12, wherein the configuration data is static data.
-
The system of claim 9, wherein the communication module comprises an electrical connector.
-
The system of claim 15, wherein the electrical connector is integrated with the mounting portion.
-
The system of claim 16, wherein the mounting portion comprises spring fingers for connecting with the configurable device.
-
The system of claim 9, wherein the communication module comprises a wireless connection.
-
The system of claim 9, wherein the configurable device comprises at least one of a digital sign, a booking system, a wayfinder device, or a smart bus-stop sign.
-
The system of claim 9, wherein the configurable device comprises a non-removable battery.
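To make the biosignal-driven interaction recited in claim 1 and claim 9 easier to picture, the following is a minimal, hypothetical Python sketch: biosignal windows from two users are mapped to force estimates, which then update a shared virtual object. The rectified-amplitude mapping, the push-along-one-axis physics, and every function and variable name are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the claimed flow: biosignals -> per-user force
# estimates -> update of a shared virtual object. Not the patent's method.
import numpy as np

def biosignals_to_force(emg_window: np.ndarray, gain: float = 0.05) -> float:
    """Map a window of raw biosignal samples to a scalar grip-force estimate."""
    return gain * float(np.mean(np.abs(emg_window)))  # rectified mean amplitude

def update_virtual_object(position: np.ndarray,
                          force_user1: float,
                          force_user2: float,
                          dt: float = 1 / 60) -> np.ndarray:
    """Nudge the shared object along +x/-x according to each user's force."""
    net_force = force_user1 - force_user2          # users push from opposite sides
    return position + np.array([net_force, 0.0, 0.0]) * dt

# One simulation step with synthetic biosignal windows for two users.
rng = np.random.default_rng(0)
emg_a, emg_b = rng.normal(size=200), 0.5 * rng.normal(size=200)
obj_pos = update_virtual_object(np.zeros(3),
                                biosignals_to_force(emg_a),
                                biosignals_to_force(emg_b))
print(obj_pos)
```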
Description
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application No. 63/120,452, filed Dec. 2, 2020, U.S. Provisional Application No. 63/073,795, filed Sep. 2, 2020, U.S. Provisional Application No. 63/111,854, filed Nov. 10, 2020, U.S. Provisional Application No. 63/132,235, filed Dec. 30, 2020, and U.S. Provisional Application No. 63/152,813, filed Feb. 23, 2021, the disclosures of each of which are incorporated, in their entirety, by this reference.
BRIEF DESCRIPTION OF DRAWINGS
[0002] The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
[0003] FIG. 1 is an illustration of exemplary augmented-reality glasses that may be used in connection with embodiments of this disclosure.
[0004] FIG. 2 is an illustration of an exemplary virtual-reality headset that may be used in connection with embodiments of this disclosure.
[0005] FIG. 3 is an illustration of an exemplary system for isolating an IMU that may be used in connection with embodiments of this disclosure.
[0006] FIG. 4 is an illustration of a perspective view of the system for isolating an IMU that may be used in connection with embodiments of this disclosure.
[0007] FIG. 5 is an illustration of a cross-sectional side view of the system for isolating an IMU that may be used in connection with embodiments of this disclosure.
[0008] FIG. 6 is a table including examples of isolation assembly configurations and materials.
[0009] FIG. 7 illustrates an example head-mounted display with integrated image sensors, according to at least one embodiment of the present disclosure.
[0010] FIG. 8 is a cross-sectional view of an example image sensor, according to at least one embodiment of the present disclosure.
[0011] FIG. 9 is another cross-sectional view of an example image sensor, according to at least one embodiment of the present disclosure.
[0012] FIG. 10 is a plan view of an example image sensor, according to at least one embodiment of the present disclosure.
[0013] FIG. 11 is a perspective view of an example image sensor, according to at least one embodiment of the present disclosure.
[0014] FIG. 12 is an illustration of example haptic devices that may be used in connection with embodiments of this disclosure.
[0015] FIG. 13 is an illustration of an example virtual-reality environment according to embodiments of this disclosure.
[0016] FIG. 14 is an illustration of an example augmented-reality environment according to embodiments of this disclosure.
[0017] FIG. 15 is an illustration of an example top view of a gasket placed over a section of a flexible printed circuit board.
[0018] FIG. 16 is an illustration of an example top view of a flexible printed circuit board that shows wings included in a section of the flexible printed circuit board that may be enclosed by a gasket.
[0019] FIG. 17 is an illustration of an example top view of a molding fixture showing a section of a flexible printed circuit board placed inside of the molding fixture.
[0020] FIG. 18 is an illustration of an example cross-sectional side view of a section of a flexible printed circuit board enclosed by a gasket and incorporated into an enclosure of a device.
[0021] FIG. 19 is an illustration of an example wearable electronic device.
[0022] FIG. 20 is a schematic diagram of components of an exemplary biosignal sensing system in accordance with some embodiments of the technology described herein.
[0023] FIG. 21 is a block diagram of an exemplary system for enhancing remote or virtual social experiences using biosignals.
[0024] FIG. 22 is a flow diagram of an exemplary computer-implemented method for enhancing remote or virtual social experiences using biosignals.
[0025] FIG. 23 is a sequence diagram of an exemplary system for enhancing remote or virtual social experiences using biosignals.
[0026] FIG. 24 is a diagram of an exemplary virtual object being simultaneously interacted with by two users.
[0027] FIG. 25A is a block diagram of an exemplary system for a split device with identity fixed to a physical location.
[0028] FIG. 25B is a block diagram of the exemplary system of FIG. 25A mounted with a configurable device.
[0029] FIG. 25C is a block diagram of the exemplary system of FIG. 25A mounted with the configurable device removed.
[0030] FIG. 25D is a block diagram of the exemplary system of FIG. 25A mounted with a replacement device.
[0031] FIG. 26 is a flow diagram of an exemplary method for using a split device with identity fixed to a physical location.
[0032] FIGS. 27-39B provide examples of accelerometer readings showing results both with and without the IMU isolation assembly.
[0033] Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within this disclosure.
DETAILED DESCRIPTION
[0034] Example Assembly for Isolating an Inertial Measurement Unit (IMU)
[0035] Many conventional artificial-reality systems include a headset that uses video and audio to aid in augmenting a user’s perception of reality or in immersing a user in an artificial-reality experience. Often, a traditional artificial-reality headset will include speakers coupled to or otherwise integrated with the headset. In order to track rotational movements, angular rate, and acceleration (to maintain a user’s position, point of view, and the like in an artificial-reality world, for example), conventional artificial-reality headsets may include a sensor such as an inertial measurement unit (IMU). Traditional artificial-reality headsets will typically have the IMU coupled to the headset to obtain relevant accelerometer and gyroscopic data and aid in presenting artificial-reality worlds and augmented scenarios to a user.
[0036] Proximity to the speakers, however, may introduce interference with the IMU’s readings and cause inaccuracies in the IMU data, especially at high audio volume levels. The effects of this interference could include gyroscope drift, in which the initial position, or zero reading, of the IMU changes over time. As a result, the user’s virtual experience is degraded, as sound, video, and even haptic feedback may be inaccurately conveyed to the user.
[0037] The present disclosure is generally directed to an assembly for isolating an IMU from vibrations that would otherwise cause gyroscopic drift. The IMU may therefore still be located on the headset near the cameras, speakers, and other equipment where the IMU can collect the most relevant data without much of the interference caused by surrounding components.
[0038] Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
[0039] Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial reality systems may be designed to work without near-eye displays (NEDs). Other artificial reality systems may include an NED that also provides visibility into the real world (such as, e.g., augmented-reality system 100 in FIG. 1) or that visually immerses a user in an artificial reality (such as, e.g., virtual-reality system 200 in FIG. 2). While some artificial-reality devices may be self-contained systems, other artificial-reality devices may communicate and/or coordinate with external devices to provide an artificial-reality experience to a user. Examples of such external devices include handheld controllers, mobile devices, desktop computers, devices worn by a user, devices worn by one or more other users, and/or any other suitable external system.
[0040] Turning to FIG. 1, augmented-reality system 100 may include an eyewear device 102 with a frame 110 configured to hold a left display device 115(A) and a right display device 115(B) in front of a user’s eyes. Display devices 115(A) and 115(B) may act together or independently to present an image or series of images to a user. While augmented-reality system 100 includes two displays, embodiments of this disclosure may be implemented in augmented-reality systems with a single NED or more than two NEDs.
[0041] In some embodiments, augmented-reality system 100 may include one or more sensors, such as sensor 140. Sensor 140 may generate measurement signals in response to motion of augmented-reality system 100 and may be located on substantially any portion of frame 110. Sensor 140 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 100 may or may not include sensor 140 or may include more than one sensor. In embodiments in which sensor 140 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 140. Examples of sensor 140 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
[0042] In some examples, augmented-reality system 100 may also include a microphone array with a plurality of acoustic transducers 120(A)-120(J), referred to collectively as acoustic transducers 120. Acoustic transducers 120 may represent transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 120 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in FIG. 1 may include, for example, ten acoustic transducers: 120(A) and 120(B), which may be designed to be placed inside a corresponding ear of the user, acoustic transducers 120(C), 120(D), 120(E), 120(F), 120(G), and 120(H), which may be positioned at various locations on frame 110, and/or acoustic transducers 120(I) and 120(J), which may be positioned on a corresponding neckband 105.
[0043] In some embodiments, one or more of acoustic transducers 120(A)-(J) may be used as output transducers (e.g., speakers). For example, acoustic transducers 120(A) and/or 120(B) may be earbuds or any other suitable type of headphone or speaker.
[0044] The configuration of acoustic transducers 120 of the microphone array may vary. While augmented-reality system 100 is shown in FIG. 1 as having ten acoustic transducers 120, the number of acoustic transducers 120 may be greater or less than ten. In some embodiments, using higher numbers of acoustic transducers 120 may increase the amount of audio information collected and/or the sensitivity and accuracy of the audio information. In contrast, using a lower number of acoustic transducers 120 may decrease the computing power required by an associated controller 150 to process the collected audio information. In addition, the position of each acoustic transducer 120 of the microphone array may vary. For example, the position of an acoustic transducer 120 may include a defined position on the user, a defined coordinate on frame 110, an orientation associated with each acoustic transducer 120, or some combination thereof.
[0045] Acoustic transducers 120(A) and 120(B) may be positioned on different parts of the user’s ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Alternatively, there may be additional acoustic transducers 120 on or surrounding the ear in addition to acoustic transducers 120 inside the ear canal. Having an acoustic transducer 120 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 120 on either side of a user’s head (e.g., as binaural microphones), augmented-reality device 100 may simulate binaural hearing and capture a 3D stereo sound field around a user’s head. In some embodiments, acoustic transducers 120(A) and 120(B) may be connected to augmented-reality system 100 via a wired connection 130, and in other embodiments acoustic transducers 120(A) and 120(B) may be connected to augmented-reality system 100 via a wireless connection (e.g., a Bluetooth connection). In still other embodiments, acoustic transducers 120(A) and 120(B) may not be used at all in conjunction with augmented-reality system 100.
[0046] Acoustic transducers 120 on frame 110 may be positioned in a variety of different ways, including along the length of the temples, across the bridge, above or below display devices 115(A) and 115(B), or some combination thereof. Acoustic transducers 120 may also be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 100. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 100 to determine relative positioning of each acoustic transducer 120 in the microphone array.
[0047] In some examples, augmented-reality system 100 may include or be connected to an external device (e.g., a paired device), such as neckband 105. Neckband 105 generally represents any type or form of paired device. Thus, the following discussion of neckband 105 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.
[0048] As shown, neckband 105 may be coupled to eyewear device 102 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eyewear device 102 and neckband 105 may operate independently without any wired or wireless connection between them. While FIG. 1 illustrates the components of eyewear device 102 and neckband 105 in example locations on eyewear device 102 and neckband 105, the components may be located elsewhere and/or distributed differently on eyewear device 102 and/or neckband 105. In some embodiments, the components of eyewear device 102 and neckband 105 may be located on one or more additional peripheral devices paired with eyewear device 102, neckband 105, or some combination thereof.
[0049] Pairing external devices, such as neckband 105, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 100 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 105 may allow components that would otherwise be included on an eyewear device to be included in neckband 105 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 105 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 105 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 105 may be less invasive to a user than weight carried in eyewear device 102, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial reality environments into their day-to-day activities.
[0050] Neckband 105 may be communicatively coupled with eyewear device 102 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 100. In the embodiment of FIG. 1, neckband 105 may include two acoustic transducers (e.g., 120(I) and 120(J)) that are part of the microphone array (or potentially form their own microphone subarray). Neckband 105 may also include a controller 125 and a power source 135.
[0051] Acoustic transducers 120(I) and 120(J) of neckband 105 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of FIG. 1, acoustic transducers 120(I) and 120(J) may be positioned on neckband 105, thereby increasing the distance between the neckband acoustic transducers 120(I) and 120(J) and other acoustic transducers 120 positioned on eyewear device 102. In some cases, increasing the distance between acoustic transducers 120 of the microphone array may improve the accuracy of beamforming performed via the microphone array. For example, if a sound is detected by acoustic transducers 120(C) and 120(D) and the distance between acoustic transducers 120(C) and 120(D) is greater than, e.g., the distance between acoustic transducers 120(D) and 120(E), the determined source location of the detected sound may be more accurate than if the sound had been detected by acoustic transducers 120(D) and 120(E).
[0052] Controller 125 of neckband 105 may process information generated by the sensors on neckband 105 and/or augmented-reality system 100. For example, controller 125 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 125 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 125 may populate an audio data set with the information. In embodiments in which augmented-reality system 100 includes an inertial measurement unit, controller 125 may compute all inertial and spatial calculations from the IMU located on eyewear device 102. A connector may convey information between augmented-reality system 100 and neckband 105 and between augmented-reality system 100 and controller 125. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 100 to neckband 105 may reduce weight and heat in eyewear device 102, making it more comfortable to the user.
[0053] Power source 135 in neckband 105 may provide power to eyewear device 102 and/or to neckband 105. Power source 135 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 135 may be a wired power source. Including power source 135 on neckband 105 instead of on eyewear device 102 may help better distribute the weight and heat generated by power source 135.
[0054] As noted, some artificial reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user’s sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 200 in FIG. 2, that mostly or completely covers a user’s field of view. Virtual-reality system 200 may include a front rigid body 202 and a band 204 shaped to fit around a user’s head. Virtual-reality system 200 may also include output audio transducers 206(A) and 206(B). Furthermore, while not shown in FIG. 2, front rigid body 202 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial-reality experience.
[0055] Artificial reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 100 and/or virtual-reality system 200 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic LED (OLED) displays, digital light processing (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user’s refractive error. Some of these artificial reality systems may also include optical subsystems having one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay (to, e.g., the viewer’s eyes) light. These optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
[0056] In addition to or instead of using display screens, some of the artificial reality systems described herein may include one or more projection systems. For example, display devices in augmented-reality system 100 and/or virtual-reality system 200 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user’s pupil and may enable a user to simultaneously view both artificial reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
[0057] The artificial reality systems described herein may also include various types of computer vision components and subsystems. For example, augmented-reality system 100 and/or virtual-reality system 200 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
[0058] The artificial reality systems described herein may also include one or more input and/or output audio transducers. Output audio transducers may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.
[0059] In some embodiments, the artificial reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independent of other artificial reality devices, within other artificial reality devices, and/or in conjunction with other artificial reality devices.
[0060] By providing haptic sensations, audible content, and/or visual content, artificial reality systems may create an entire virtual experience or enhance a user’s real-world experience in a variety of contexts and environments. For instance, artificial reality systems may assist or extend a user’s perception, memory, or cognition within a particular environment. Some systems may enhance a user’s interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user’s artificial reality experience in one or more of these contexts and environments and/or in other contexts and environments.
[0061] Some augmented-reality systems may map a user’s and/or device’s environment using techniques referred to as “simultaneous localization and mapping” (SLAM). SLAM mapping and location identifying techniques may involve a variety of hardware and software tools that can create or update a map of an environment while simultaneously keeping track of a user’s location within the mapped environment. SLAM may use many different types of sensors to create a map and determine a user’s position within the map.
[0062] SLAM techniques may, for example, implement optical sensors to determine a user’s location. Radios, including Wi-Fi, Bluetooth, global positioning system (GPS), cellular, or other communication devices, may also be used to determine a user’s location relative to a radio transceiver or group of transceivers (e.g., a Wi-Fi router or group of GPS satellites). Acoustic sensors such as microphone arrays or 2D or 3D sonar sensors may also be used to determine a user’s location within an environment. Augmented-reality and virtual-reality devices (such as systems 100 and 200 of FIGS. 1 and 2, respectively) may incorporate any or all of these types of sensors to perform SLAM operations such as creating and continually updating maps of the user’s current environment. In at least some of the embodiments described herein, SLAM data generated by these sensors may be referred to as “environmental data” and may indicate a user’s current environment. This data may be stored in a local or remote data store (e.g., a cloud data store) and may be provided to a user’s AR/VR device on demand.
[0063] When the user is wearing an augmented-reality headset or virtual-reality headset in a given environment, the user may be interacting with other users or other electronic devices that serve as audio sources. In some cases, it may be desirable to determine where the audio sources are located relative to the user and then present the audio sources to the user as if they were coming from the location of the audio source. The process of determining where the audio sources are located relative to the user may be referred to as “localization,” and the process of rendering playback of the audio source signal to appear as if it is coming from a specific direction may be referred to as “spatialization.”
[0064] Localizing an audio source may be performed in a variety of different ways. In some cases, an augmented-reality or virtual-reality headset may initiate a DOA analysis to determine the location of a sound source. The DOA analysis may include analyzing the intensity, spectra, and/or arrival time of each sound at the artificial-reality device to determine the direction from which the sounds originated. The DOA analysis may include any suitable algorithm for analyzing the surrounding acoustic environment in which the artificial-reality device is located.
[0065] For example, the DOA analysis may be designed to receive input signals from a microphone and apply digital signal processing algorithms to the input signals to estimate the direction of arrival. These algorithms may include, for example, delay and sum algorithms where the input signal is sampled, and the resulting weighted and delayed versions of the sampled signal are averaged together to determine a direction of arrival. A least mean squared (LMS) algorithm may also be implemented to create an adaptive filter. This adaptive filter may then be used to identify differences in signal intensity, for example, or differences in time of arrival. These differences may then be used to estimate the direction of arrival. In another embodiment, the DOA may be determined by converting the input signals into the frequency domain and selecting specific bins within the time-frequency (TF) domain to process. Each selected TF bin may be processed to determine whether that bin includes a portion of the audio spectrum with a direct-path audio signal. Those bins having a portion of the direct-path signal may then be analyzed to identify the angle at which a microphone array received the direct-path audio signal. The determined angle may then be used to identify the direction of arrival for the received input signal. Other algorithms not listed above may also be used alone or in combination with the above algorithms to determine DOA.
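To make the delay-and-sum approach concrete, the sketch below steers a uniform linear microphone array across candidate angles, applies the corresponding per-microphone delays, sums the aligned signals, and picks the angle with the greatest output power. The array geometry, sample rate, and integer-sample delay approximation are assumptions made for illustration and are not taken from the disclosure.

```python
# Minimal delay-and-sum DOA sketch for a uniform linear array (illustrative only).
import numpy as np

def estimate_doa(mic_signals: np.ndarray, spacing_m: float, fs: int,
                 c: float = 343.0) -> float:
    """mic_signals: (num_mics, num_samples) array. Returns DOA in degrees."""
    num_mics = mic_signals.shape[0]
    best_angle, best_power = 0.0, -np.inf
    for angle in range(-90, 91):
        # Per-microphone propagation delay (in samples) for a plane wave
        # arriving from this candidate angle.
        delays = np.arange(num_mics) * spacing_m * np.sin(np.radians(angle)) / c
        delay_samples = np.round(delays * fs).astype(int)
        # Undo the delays so the candidate direction adds coherently, then sum.
        aligned = [np.roll(sig, -d) for sig, d in zip(mic_signals, delay_samples)]
        power = float(np.mean(np.sum(aligned, axis=0) ** 2))
        if power > best_power:
            best_angle, best_power = float(angle), power
    return best_angle

# Synthetic check: a 1 kHz tone arriving from ~30 degrees on a 4-microphone array.
fs, spacing, true_angle = 48_000, 0.04, 30.0
t = np.arange(0, 0.05, 1 / fs)
true_delays = np.arange(4) * spacing * np.sin(np.radians(true_angle)) / 343.0
mics = np.stack([np.sin(2 * np.pi * 1000 * (t - d)) for d in true_delays])
print(estimate_doa(mics, spacing, fs))  # expected to land near 30
```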
[0066] In some embodiments, different users may perceive the source of a sound as coming from slightly different locations. This may be the result of each user having a unique head-related transfer function (HRTF), which may be dictated by a user’s anatomy including ear canal length and the positioning of the ear drum. The artificial-reality device may provide an alignment and orientation guide, which the user may follow to customize the sound signal presented to the user based on their unique HRTF. In some embodiments, an artificial-reality device may implement one or more microphones to listen to sounds within the user’s environment. The augmented-reality or virtual-reality headset may use a variety of different array transfer functions (e.g., any of the DOA algorithms identified above) to estimate the direction of arrival for the sounds. Once the direction of arrival has been determined, the artificial-reality device may play back sounds to the user according to the user’s unique HRTF. Accordingly, the DOA estimation generated using the array transfer function (ATF) may be used to determine the direction from which the sounds are to be played from. The playback sounds may be further refined based on how that specific user hears sounds according to the HRTF.
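As a rough illustration (not the disclosed implementation) of playing sounds back according to a user's HRTF, the sketch below selects the nearest-azimuth head-related impulse response (HRIR) pair for an estimated direction of arrival and convolves a mono source with it. The HRIR bank here is a synthetic placeholder; a real system would use measured, per-user responses.

```python
# Illustrative HRTF-style rendering: pick the nearest HRIR pair for the
# estimated DOA and convolve the source with it. All data are placeholders.
import numpy as np

def render_with_hrtf(mono: np.ndarray, doa_deg: float, hrir_bank: dict) -> np.ndarray:
    """Return a (num_samples, 2) stereo signal spatialized toward doa_deg."""
    # Pick the HRIR pair whose measurement azimuth is closest to the DOA.
    nearest_az = min(hrir_bank, key=lambda az: abs(az - doa_deg))
    hrir_left, hrir_right = hrir_bank[nearest_az]
    left = np.convolve(mono, hrir_left)[: len(mono)]
    right = np.convolve(mono, hrir_right)[: len(mono)]
    return np.stack([left, right], axis=1)

# Placeholder HRIR bank: a crude per-azimuth level difference, for demo only.
bank = {az: (np.array([1.0 - az / 180.0]), np.array([1.0 + az / 180.0]))
        for az in range(-90, 91, 15)}
tone = np.sin(np.linspace(0.0, 2 * np.pi * 40, 4800))  # short mono test tone
print(render_with_hrtf(tone, 30.0, bank).shape)        # (4800, 2)
```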
[0067] In addition to or as an alternative to performing a DOA estimation, an artificial-reality device may perform localization based on information received from other types of sensors. These sensors may include cameras, IR sensors, heat sensors, motion sensors, GPS receivers, or in some cases, sensors that detect a user’s eye movements. For example, as noted above, an artificial-reality device may include an eye tracker or gaze detector that determines where the user is looking. Often, the user’s eyes will look at the source of the sound, if only briefly. Such clues provided by the user’s eyes may further aid in determining the location of a sound source. Other sensors such as cameras, heat sensors, and IR sensors may also indicate the location of a user, the location of an electronic device, or the location of another sound source. Any or all of the above methods may be used individually or in combination to determine the location of a sound source and may further be used to update the location of a sound source over time.
[0068] Some embodiments may implement the determined DOA to generate a more customized output audio signal for the user. For instance, an “acoustic transfer function” may characterize or define how a sound is received from a given location. More specifically, an acoustic transfer function may define the relationship between parameters of a sound at its source location and the parameters by which the sound signal is detected (e.g., detected by a microphone array or detected by a user’s ear). An artificial-reality device may include one or more acoustic sensors that detect sounds within range of the device. A controller of the artificial-reality device may estimate a DOA for the detected sounds (using, e.g., any of the methods identified above) and, based on the parameters of the detected sounds, may generate an acoustic transfer function that is specific to the location of the device. This customized acoustic transfer function may thus be used to generate a spatialized output audio signal where the sound is perceived as coming from a specific location.
[0069] Indeed, once the location of the sound source or sources is known, the artificial-reality device may re-render (i.e., spatialize) the sound signals to sound as if coming from the direction of that sound source. The artificial-reality device may apply filters or other digital signal processing that alter the intensity, spectra, or arrival time of the sound signal. The digital signal processing may be applied in such a way that the sound signal is perceived as originating from the determined location. The artificial-reality device may amplify or subdue certain frequencies or change the time that the signal arrives at each ear. In some cases, the artificial-reality device may create an acoustic transfer function that is specific to the location of the device and the detected direction of arrival of the sound signal. In some embodiments, the artificial-reality device may re-render the source signal in a stereo device or multi-speaker device (e.g., a surround sound device). In such cases, separate and distinct audio signals may be sent to each speaker. Each of these audio signals may be altered according to the user’s HRTF and according to measurements of the user’s location and the location of the sound source to sound as if they are coming from the determined location of the sound source. Accordingly, in this manner, the artificial-reality device (or speakers associated with the device) may re-render an audio signal to sound as if originating from a specific location.
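The per-ear filtering and timing adjustments described above can be approximated with a simple interaural time difference (ITD) and interaural level difference (ILD) model. The sketch below uses a Woodworth-style ITD formula and a crude level-difference law; both are textbook approximations chosen for illustration rather than the patent's actual signal chain.

```python
# Illustrative ITD/ILD stereo re-rendering (not the disclosed signal chain).
import numpy as np

def spatialize_itd_ild(mono: np.ndarray, azimuth_deg: float, fs: int,
                       head_radius_m: float = 0.0875, c: float = 343.0) -> np.ndarray:
    """Return (num_samples, 2) stereo with per-ear delay and gain for azimuth_deg."""
    az = np.radians(azimuth_deg)
    itd_s = head_radius_m / c * (abs(az) + abs(np.sin(az)))   # Woodworth-style ITD
    delay = int(round(itd_s * fs))                            # extra delay at the far ear
    far_gain = 10 ** (-abs(azimuth_deg) / 90.0 * 6.0 / 20.0)  # up to ~6 dB quieter far ear
    near = mono
    far = far_gain * np.concatenate([np.zeros(delay), mono])[: len(mono)]
    # Positive azimuth = source on the right, so the right ear is the near ear.
    left, right = (far, near) if azimuth_deg >= 0 else (near, far)
    return np.stack([left, right], axis=1)

fs = 48_000
tone = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)  # 1 s, 440 Hz test signal
print(spatialize_itd_ild(tone, 45.0, fs).shape)      # (48000, 2)
```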
[0070] FIG. 3 depicts one embodiment of a system 300 for isolating an IMU. In this example, system 300 isolates the IMU from mechanical and audio vibrations, such as those occurring as a result of high audio volume on headset speakers. FIG. 3 depicts a camera frame 302 or other supporting structure of an artificial-reality system such as the system 100 of FIG. 1 or the system 200 of FIG. 2. For example, the camera frame 302 may be coupled to and/or include a supporting structure for components of the eyewear device 102 of FIG. 1 or the virtual-reality system 200 of FIG. 2. The camera frame 302 may include metal or other rigid material and may include a circuit board coupled to the camera frame 302. Any suitable circuit board may be used such as, for example, a motherboard. FIG. 3 also depicts a motherboard 304 coupled to the camera frame 302 and an IMU 306 coupled to the motherboard 304.
[0071] The IMU 306, being coupled to the camera frame 302 via the motherboard 304 as depicted, may be suitably positioned in the system 100 or 200 to generate gyroscopic data but may also be prone to gyroscopic drift due to audio and mechanical vibrations of nearby components such as audio speakers. Therefore, the system 300 includes an isolation assembly 308 disposed between the motherboard 304 and a base surface 310 of the camera frame 302. Referring also to FIG. 4, which shows a perspective view of the system 300, the isolation assembly 308 may be located in a region of the motherboard 304 on which the IMU 306 is coupled (e.g., the depicted embodiment shows the isolation assembly 308 disposed between the motherboard 304 and base surface 310 in a corner of the motherboard 304). The isolation assembly 308 may serve to isolate the IMU 306 from interfering vibrations by, for example, stiffening the nearby region of the motherboard 304 and/or reducing vibrations in at least a portion of the motherboard adjacent to the IMU.
[0072] Referring to FIG. 5, which shows a cross-sectional side view of the system 300, the isolation assembly 308 may be sandwiched between the motherboard 304 and the camera frame 302. The isolation assembly 308 may be adhered (e.g., using pressure sensitive adhesive) to an underside of the motherboard 304. Referring to FIG. 4 and FIG. 5, in one embodiment, the isolation assembly 308 may include a rigid piece 312. The rigid piece 312 may be adhered (e.g., using pressure sensitive adhesive) to an underside of the motherboard 304. Any suitable rigid piece 312 may be used in a variety of compositions and configurations and in varying degrees of rigidity. Some examples of materials that may be used include, without limitation, plastic, composite, or metal. In some examples, as used herein, the term “shim” refers to the rigid piece 312.
[0073] The depicted isolation assembly 308 also includes a compressible foam layer 314 disposed between the rigid piece 312 and the base surface 310 of the camera frame 302 such that the foam layer 314 and the rigid piece 312 are sandwiched between the motherboard 304 and the base surface 310. The foam layer 314 may be compressed against the base surface 310 to further absorb and reduce vibration that could affect the IMU 306. The foam layer 314 may be adhered (e.g., via a pressure sensitive adhesive) to an underside of the rigid piece 312. In certain examples, the foam layer 314 occupies a gap of approximately 1.5 mm maximum between the base surface 310 and the rigid piece 312, although any suitable gap may be used. In other examples, the foam layer 314 may be approximately 1 mm in height with approximately 50% compression. Although the depicted isolation assembly 308 includes both a rigid piece 312 and a foam layer 314, in other embodiments, the isolation assembly 308 may include a foam layer 314 alone. Any suitable material and configuration may be used for the foam layer 314. Because certain foams may dampen noise at targeted frequencies, a type of foam may be selected depending on the desired frequencies to target. The foam may have, in some examples, an open and/or closed cell structure. Some examples of foam materials may include, without limitation, polyurethane foams (e.g., polyurethane foams sold under the trademark PORON® from Rogers Corporation, polyurethane foams sold under the trademark E-A-R™ from 3M Corporation, polyurethane foams sold by General Plastics Manufacturing Company, etc.). Example materials and configurations of the isolation assembly will be referenced below in a detailed description of FIG. 6.
[0074] Referring back to FIG. 3 and FIG. 4, in the depicted embodiment, the motherboard 304 includes a slot 316 to allow the rigid piece 312 to partially surround the IMU 306 by way of a protruding sidewall 318 positioned in the slot 316 of the motherboard 304. Although the depicted embodiment of the rigid piece 312 includes a protruding sidewall 318, in certain examples, the rigid piece 312 lacks a sidewall. Furthermore, the depicted embodiment has a single protruding sidewall 318; however, in other embodiments, the rigid piece 312 may include one or more other protruding sidewalls 318 extending along peripheral edges of portions of the motherboard 304. In some examples, to achieve the desired rigidity while minimizing or eliminating the use of sidewalls, a more rigid material for the rigid piece 312 may be used.
[0075] The isolation assembly 308 may serve to locally isolate and stiffen an area of the motherboard 304 around the IMU 306, shifting the resonance and vibration frequencies at the IMU 306 (caused, for example, by audio from nearby speakers) out of the IMU frequency recording range used for tracking and data acquisition purposes. This may improve the functional quality of the augmented-reality system 100 or virtual-reality system 200 and improve the user experience.
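One way to see why local stiffening helps is to model the isolated region as a single-degree-of-freedom mass-spring system: its natural frequency is f_n = (1/2π)√(k/m), and the undamped transmissibility at an excitation frequency f is 1/|1 − (f/f_n)²|. Raising the stiffness k pushes f_n, and therefore the resonance peak, above the band the IMU records for tracking. The mass, stiffness, and frequency values in the sketch below are invented purely for illustration and are not taken from the disclosure.

```python
# Back-of-the-envelope isolator model; all numeric values are assumptions.
import math

def natural_frequency_hz(stiffness_n_per_m: float, mass_kg: float) -> float:
    """f_n = 1 / (2*pi) * sqrt(k / m) for a single-degree-of-freedom isolator."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

def transmissibility(excitation_hz: float, natural_hz: float) -> float:
    """Undamped base-excitation transmissibility 1 / |1 - (f / f_n)**2|."""
    ratio = excitation_hz / natural_hz
    return 1.0 / abs(1.0 - ratio ** 2)

mass_kg = 0.002                 # assumed 2 g of board mass local to the IMU
for stiffness in (2e4, 2e5):    # assumed softer vs. stiffer isolation path (N/m)
    fn = natural_frequency_hz(stiffness, mass_kg)
    print(f"k = {stiffness:.0e} N/m -> f_n = {fn:5.0f} Hz, "
          f"T at 300 Hz = {transmissibility(300.0, fn):.2f}")
```

With the stiffer (larger k) path in this toy model, the resonance sits well above the in-band excitation frequency and the in-band transmissibility stays near one, which is the qualitative effect the isolation assembly is intended to achieve.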
[0076] FIG. 6 depicts a table 600 having several examples of materials and configurations of the isolation assembly. Table 600 includes columns for foam type, foam thickness, a gap (which may be at least partially occupied by the compressed foam), the compression percentage, whether the rigid piece 312 (referred to as a “shim”) has a sidewall, and the corresponding drawing in FIGS. 27-39B showing sample testing results of the particular configuration.
Example Embodiments
Example 1
[0077] A system may include a circuit board, an inertial measurement unit (“IMU”) coupled to the circuit board, a frame, and an isolation assembly disposed between the circuit board and the frame, with the isolation assembly configured to reduce vibrations in at least a portion of the circuit board adjacent to the IMU.
Example 2
[0078] The system of Example 1, where the isolation assembly may include one or more of a rigid piece or a compressible foam layer.
Example Shock-Absorbing Devices and Related Systems and Methods
[0079] A traditional electronic device (e.g., an image sensor) may include components configured to protect the electronic device from ingress of foreign substances such as dust, water, etc. For example, a traditional image sensor may include seals, coatings, housings, mountings, etc., that are configured and positioned to protect the image sensor and ensure its reliability. However, these traditional image sensor components may be deficient in preventing damage to the image sensor from forces associated with a shock impact. Systems, devices, and methods of the present disclosure may overcome these deficiencies. For example, embodiments of the present disclosure may include a shock-absorbing device that is configured to surround an image sensor and absorb a shock impact. The absorption of the impact force by the shock-absorbing device may substantially maintain a structural integrity of the image sensor when the image sensor is subjected to the impact.
[0080] Artificial-reality systems often include a head-mounted display (HMD) that can be worn by a user while playing a video game or carrying out some other artificial-reality activity. Due to the active nature of many artificial-reality games or activities, the user may accidentally drop the HMD. The user may also accidentally drop the HMD while holding the HMD, putting the HMD on, or taking the HMD off. In some embodiments, an artificial-reality system may include an image sensor mounted on and protruding from a surface of the HMD. Given the possibility that the HMD may be dropped, the instant disclosure identifies and addresses a need for mounting and configuring the image sensors on the HMD in such a way as to prevent the image sensors from experiencing impact damage when the HMD is dropped. In some examples, these image sensors may include a compressible shock-absorbing device mounted on the image sensor to prevent damage to the image sensor when the HMD is dropped.
[0081] The following will provide, with reference to FIGS. 7-11, detailed descriptions of systems and devices for protecting electronic devices, such as image sensors, by disposing a shock-absorbing device substantially around a portion of the image sensor.
[0082] FIG. 7 illustrates an example HMD 700 including image sensors 702 mounted to (e.g., extending from) HMD 700, according to at least one embodiment of the present disclosure. In some embodiments, image sensors 702 may be mounted on and protrude from a surface (e.g., a front surface, a corner surface, etc.) of HMD 700. HMD 700 may include virtual-reality system 1300 of FIG. 13 or HMD 102 of FIG. 1. Image sensors 702 may include sensor 140 of FIG. 1. A compressible shock-absorbing device may be mounted on image sensors 702. As will be described with reference to FIGS. 8, 9, and 11 below, the shock-absorbing device may be configured to substantially maintain the structural integrity of image sensors 702 in case an impact force is imparted on image sensors 702. In some embodiments, image sensors 702 may protrude from a surface (e.g., the front surface) of HMD 700 so as to increase a field of view of image sensors 702. In some examples, image sensors 702 may be pivotally and/or translationally mounted to HMD 700 to pivot image sensors 702 at a range of angles and/or to allow for translation in multiple directions in response to an impact. For example, image sensors 702 may protrude from the front surface of HMD 700 so as to give image sensors 702 a 180-degree field of view of objects (e.g., a surrounding real-world environment).
[0083] FIG. 8 is a cross-sectional view of an image sensor 800, according to at least one embodiment of the present disclosure. Image sensor 800 may include a lens, a lens ring, a shock-absorbing device, a barrel, and a flexible connector. When image sensor 800 experiences a shock impact, the shock-absorbing device may absorb, distribute, transfer, dampen, and/or reduce the shock impact force such that the components of image sensor 800 remain structurally and functionally intact. The shock-absorbing device may prevent the transfer of impact energy between the components of image sensor 800 by absorbing and/or dampening the impact energy. The shock-absorbing device may dampen the impact energy by dispersing or disrupting the energy caused by the shock’s impact forces. The shock-absorbing device may absorb the energy from the shock by decreasing the amplitude (strength) of the shock wave or by changing the shock wave’s frequency. The absorption of impact energy may reduce or eliminate adverse effects or damage to the components (e.g., the barrel) of image sensor 800 caused by the shock, thereby retaining the structural integrity of image sensor 800.
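One way to picture this attenuation is to model the image sensor and its shock-absorbing device as a damped mass-spring system. The following equations are a generic illustrative model rather than a characterization of any particular embodiment; the mass m, damping coefficient c, and stiffness k are assumed effective parameters.

% Illustrative lumped model: sensor of effective mass m supported by the
% shock-absorbing device with effective stiffness k and damping coefficient c.
\[
m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) = 0,
\qquad
\omega_n = \sqrt{k/m},
\qquad
\zeta = \frac{c}{2\sqrt{k\,m}}.
\]
% For an underdamped response (0 < \zeta < 1), the displacement amplitude
% decays exponentially, corresponding to the decrease in shock-wave amplitude
% described above:
\[
x(t) = X_0\, e^{-\zeta\omega_n t}\,
\cos\!\left(\omega_n\sqrt{1-\zeta^2}\;t + \phi\right).
\]
% The energy removed from the shock by the damping element over a time T is
\[
E_{\mathrm{dissipated}} = \int_0^{T} c\,\dot{x}(t)^2\,dt.
\]

In this simplified view, increasing the damping contribution of the shock-absorbing material reduces the peak displacement and the energy transferred to the barrel and other components.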
[0084] Image sensor 800 may include image sensor 702 of FIG. 7 that is integrated into HMD 700. In some embodiments, image sensor 800 may include a shock-absorbing device comprising a resilient material that is configured to compress inwards towards the lens ring of image sensor 800 when image sensor 800 experiences a shock force, such as resulting from dropping of the HMD. The resilient material may be configured to return to its original shape once the force is removed so as to return image sensor 800 to a position protruding forward from the front surface of the HMD. This configuration may improve durability of the coupling of image sensor 800 to the HMD so as to reduce the possibility of image sensor 800 being damaged or separated from the HMD upon force of impact.
[0085] In some examples, the shock-absorbing device of image sensor 800 may include a shock-absorbing material with a structure capable of distributing an applied stress (e.g., stress resulting from a shock force acting on image sensor 800 when the HMD is dropped). In some embodiments, the shock-absorbing material may include a material capable of converting the kinetic energy of the shock into another form of energy, for example heat energy, which is then dissipated. The shock-absorbing device may transfer the impact energy to another component of image sensor 800 and/or to another component of the HMD. For example, the shock-absorbing device may transfer the impact energy to a base of image sensor 800.
[0086] The shock-absorbing material may include, without limitation, a polymer material, an elastomer, a plastic, a polyethylene material, a polycarbonate material, an acrylonitrile butadiene styrene (ABS) material, a visco-elastic polymer material, a polymer matrix composite material, a fiber-reinforced polymer composite material, a polyurethane material, a butyl rubber material, a neoprene rubber material, or a combination thereof. This configuration has the advantage that the shock-absorbing material will compress upon receiving the initial shock and then return to its original shape and configuration when the stress is removed. This flexibility allows the shock-absorbing device to reversibly compress and/or extend. In some examples, the shock-absorbing material may have a compressive modulus in the range of 1-2, 2-3, 3-4, 4-5, 5-6, 6-7, or 7-8.
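For a rough sense of how a material’s compressive modulus relates to how far it deflects under an impact load, the short Python sketch below applies the basic linear stress-strain relation. The numeric force, contact area, thickness, and modulus values are hypothetical placeholders chosen purely for illustration and are not values specified by this disclosure.

# Illustrative only: estimate how far a ring of shock-absorbing material
# compresses under an impact load using the linear relation
#   strain = stress / E,  deflection = strain * thickness.
# All numeric values below are hypothetical placeholders.

def estimate_compression(force_n, contact_area_m2, modulus_pa, thickness_m):
    """Return the estimated compression (in meters) of the material."""
    stress = force_n / contact_area_m2          # Pa
    strain = stress / modulus_pa                # dimensionless
    return strain * thickness_m                 # m

if __name__ == "__main__":
    # Hypothetical drop impact: 50 N spread over a 1 cm^2 contact patch,
    # acting on a 2 mm thick elastomer ring with an assumed 5 MPa modulus.
    deflection = estimate_compression(
        force_n=50.0,
        contact_area_m2=1e-4,
        modulus_pa=5e6,
        thickness_m=2e-3,
    )
    print(f"Estimated compression: {deflection * 1e3:.2f} mm")

A softer material (lower modulus) compresses further for the same load, which spreads the deceleration of the sensor over a longer travel and lowers the peak force it experiences.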
[0087] In some examples, image sensor 800 may include a flexible connector and/or flexible cable for data communications with a processor of the HMD. In the event that the HMD is dropped, the flexible connector and/or flexible cable may be configured to flex accordingly to prevent a disconnection of the electrical connection between image sensor 800 and the HMD.
[0088] FIG. 9 is a cross-sectional view of an image sensor 900, according to at least one embodiment of the present disclosure. Image sensor 900 may include a lens, a lens ring, a shock-absorbing device 901, a barrel, an adhesive 903, a sleeve 905, and a flexible connector. When image sensor 900 experiences a shock impact, shock-absorbing device 901 may absorb, distribute, transfer, dampen, and/or reduce the shock impact force such that the components of image sensor 900 remain structurally and functionally intact. Shock-absorbing device 901 may prevent the transfer of impact energy between the components of image sensor 900 by absorbing and/or dampening the impact energy. Shock-absorbing device 901 may dampen the impact energy by dispersing or disrupting the energy caused by the shock’s impact forces. Shock-absorbing device 901 may absorb the energy from the shock by decreasing the amplitude (strength) of the shock wave or by changing the shock wave’s frequency. The absorption of impact energy may reduce or eliminate adverse effects or damage to the components of image sensor 900 caused by the shock, thereby retaining the structural integrity of image sensor 900.
[0089] Image sensor 900 may include image sensor 702 of FIG. 7 that is integrated into HMD 700. In some embodiments, image sensor 900 may include shock-absorbing device 901 that includes a resilient material configured to compress (e.g., compress inwards towards the lens ring of image sensor 900) when image sensor 900 experiences a shock impact resulting from dropping of the HMD. The resilient material may be configured to return to its original shape once the impact force is removed so as to return image sensor 900 to a position protruding forward from the front surface of the HMD. This configuration may improve durability of the coupling of image sensor 900 to the HMD to reduce the possibility of image sensor 900 being damaged or separated from the HMD upon force of impact.
[0090] In some examples, shock-absorbing device 901 of image sensor 900 may include a shock-absorbing material with a structure capable of distributing an applied stress (e.g., stress resulting from a shock or impact force acting on image sensor 900 when the HMD is dropped). In some embodiments, the shock-absorbing material may include a material capable of converting the kinetic energy of the shock into another form of energy, for example heat energy, which is then dissipated. Shock-absorbing device 901 may transfer the impact energy to another component of image sensor 900 and/or to another component of the HMD. For example, shock-absorbing device 901 may transfer the impact energy to a base of image sensor 900. Image sensor 900 may include sleeve 905. Sleeve 905 may be positioned around the entire perimeter of image sensor 900, positioned around a portion of the perimeter of image sensor 900, or positioned in proximity to image sensor 900. Sleeve 905 may include any type of rigid material including, without limitation, metal, ABS plastic, ceramics, carbides, or a combination thereof. In some examples, shock-absorbing device 901 may transfer the impact energy to sleeve 905 of image sensor 900. Additionally or alternatively, sleeve 905 may absorb, distribute, transfer, dampen, and/or reduce the shock impact force such that the components of image sensor 900 remain structurally and functionally intact. In some examples, shock-absorbing device 901 may be assembled on image sensor 900 after image sensor 900 is installed in an HMD (e.g., HMD 700 of FIG. 7). For example, shock-absorbing device 901 may be installed on image sensor 900 by expanding a radius of shock-absorbing device 901 and fitting shock-absorbing device 901 around a portion (e.g., the barrel) of image sensor 900. In some examples, shock-absorbing device 901 may be adhered to image sensor 900 using an adhesive 903 disposed between shock-absorbing device 901 and image sensor 900. In some examples, the combination of sleeve 905, shock-absorbing device 901, and adhesive 903 may provide structural support to image sensor 900.
[0091] FIG. 10 is a plan view of an image sensor 1000, according to at least one embodiment of the present disclosure. Image sensor 1000 may include shock-absorbing device 1001. Shock-absorbing device 1001 may be positioned to substantially surround a portion (e.g., the barrel) of image sensor 1000. In some examples, when image sensor 1000 experiences a shock impact, shock-absorbing device 1001 may absorb, distribute, transfer, dampen, and/or reduce the shock impact force such that the components of image sensor 1000 remain structurally and functionally intact. Shock-absorbing device 1001 may prevent the transfer of impact energy between the components of image sensor 1000 by absorbing and/or dampening the impact energy. Shock-absorbing device 1001 may dampen the impact energy by dispersing or disrupting the energy caused by the shock’s impact forces. Shock-absorbing device 1001 may absorb the energy from the shock by decreasing the amplitude (strength) of the shock wave or by changing the shock wave’s frequency. The absorption of impact energy may reduce or eliminate adverse effects or damage to the components of image sensor 1000 caused by the shock, thereby retaining the structural integrity of image sensor 1000. In some examples, shock-absorbing device 1001 may be assembled on image sensor 1000 after image sensor 1000 is installed in an HMD (e.g., HMD 700 of FIG. 7). For example, shock-absorbing device 1001 may be installed on image sensor 1000 by expanding a radius of shock-absorbing device 1001 and fitting shock-absorbing device 1001 around a portion of image sensor 1000. In some examples, shock-absorbing device 1001 may include tabs and/or wings to assist in the process of installing shock-absorbing device 1001 onto image sensor 1000. In some examples, shock-absorbing device 1001 may be adhered to image sensor 1000 using an adhesive disposed between shock-absorbing device 1001 and image sensor 1000.
[0092] FIG. 11 is a perspective view of an image sensor 1100, according to at least one embodiment of the present disclosure. Image sensor 1100 may include shock-absorbing device 1101. Shock-absorbing device 1101 may be positioned to substantially surround a portion of image sensor 1100. For example, shock-absorbing device 1101 may be positioned to substantially surround a barrel portion of image sensor 1100. In some examples, when image sensor 1100 experiences a shock impact, shock-absorbing device 1101 may absorb, distribute, transfer, dampen, and/or reduce the shock impact force such that the components of image sensor 1100 remain structurally and functionally intact. Shock-absorbing device 1101 may prevent the transfer of impact energy between the components of image sensor 1100 by absorbing and/or dampening the impact energy. Shock-absorbing device 1101 may dampen the impact energy by dispersing or disrupting the energy caused by the shock’s impact forces. Shock-absorbing device 1101 may absorb the energy from the shock by decreasing the amplitude (strength) of the shock wave or by changing the shock wave’s frequency. The absorption of impact energy may reduce or eliminate adverse effects or damage to the components of image sensor 1100 caused by the shock, thereby retaining the structural integrity of image sensor 1100. In some examples, shock-absorbing device 1101 may be installed on image sensor 1100 after image sensor 1100 is installed in an HMD (e.g., HMD 700 of FIG. 7). For example, shock-absorbing device 1101 may be installed on image sensor 1100 by expanding a radius of shock-absorbing device 1101 and fitting shock-absorbing device 1101 around a portion of image sensor 1100. Shock-absorbing device 1101 may be configured as a “c-clip” that partially expands at opening 1105 in order to facilitate assembly of shock-absorbing device 1101 onto image sensor 1100. Shock-absorbing device 1101 may include tabs and/or wings positioned adjacent to opening 1105 to assist in the process of installing shock-absorbing device 1101 onto image sensor 1100. In some examples, shock-absorbing device 1101 may be adhered to image sensor 1100 using an adhesive disposed between shock-absorbing device 1101 and image sensor 1100.
[0093] Embodiments of the present disclosure may include a system that includes a head-mounted display, an image sensor, and a shock-absorbing device. The shock-absorbing device may be shaped in the form of a ring (e.g., a C-clip) that includes a shock-absorbing material. The shock-absorbing device may be secured to the image sensor by an adhesive material. The shock-absorbing device may be shaped and configured to partially surround a portion of the image sensor. The image sensor may be integrated into the front portion of an HMD. When an impact force is imparted to the image sensor, the shock-absorbing device may be configured to transfer the impact force to a base of the image sensor, thereby maintaining the structural integrity of the image sensor and preventing damage to the image sensor.
[0094] As noted, the artificial-reality systems 100 and 200 may be used with a variety of other types of devices to provide a more compelling artificial-reality experience. These devices may be haptic interfaces with transducers that provide haptic feedback and/or that collect haptic information about a user’s interaction with an environment. The artificial-reality systems disclosed herein may include various types of haptic interfaces that detect or convey various types of haptic information, including tactile feedback (e.g., feedback that a user detects via nerves in the skin, which may also be referred to as cutaneous feedback) and/or kinesthetic feedback (e.g., feedback that a user detects via receptors located in muscles, joints, and/or tendons).
[0095] Haptic feedback may be provided by interfaces positioned within a user’s environment (e.g., chairs, tables, floors, etc.) and/or interfaces on articles that may be worn or carried by a user (e.g., gloves, wristbands, etc.). As an example, FIG. 12 illustrates a vibrotactile system 1200 in the form of a wearable glove (haptic device 1210) and wristband (haptic device 1220). The haptic device 1210 and the haptic device 1220 are shown as examples of wearable devices that include a flexible, wearable textile material 1230 that is shaped and configured for positioning against a user’s hand and wrist, respectively. This disclosure also includes vibrotactile systems that may be shaped and configured for positioning against other human body parts, such as a finger, an arm, a head, a torso, a foot, or a leg. By way of example and not limitation, vibrotactile systems according to various embodiments of the present disclosure may also be in the form of a glove, a headband, an armband, a sleeve, a head covering, a sock, a shirt, or pants, among other possibilities. In some examples, the term “textile” may include any flexible, wearable material, including woven fabric, non-woven fabric, leather, cloth, a flexible polymer material, a composite material, etc.
[0096] One or more vibrotactile devices 1240 may be positioned at least partially within one or more corresponding pockets formed in the textile material 1230 of the vibrotactile system 1200. The vibrotactile devices 1240 may be positioned in locations to provide a vibrating sensation (e.g., haptic feedback) to a user of the vibrotactile system 1200. For example, the vibrotactile devices 1240 may be positioned to be against the user’s finger(s), thumb, or wrist, as shown in FIG. 12. The vibrotactile devices 1240 may, in some examples, be sufficiently flexible to conform to or bend with the user’s corresponding body part(s).
[0097] A power source 1250 (e.g., a battery) for applying a voltage to the vibrotactile devices 1240 for activation thereof may be electrically coupled to the vibrotactile devices 1240, such as via conductive wiring 1252. In some examples, each of the vibrotactile devices 1240 may be independently electrically coupled to the power source 1250 for individual activation. In some embodiments, a processor 1260 may be operatively coupled to the power source 1250 and configured (e.g., programmed) to control activation of the vibrotactile devices 1240.
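As one way to picture the independent electrical activation described above, the following Python sketch models a controller (in the role of processor 1260) that drives each vibrotactile device through its own channel. The class and method names are illustrative assumptions, not an interface defined by this disclosure.

# Minimal sketch of a processor independently activating vibrotactile
# devices that share a power source. The VibrotactileChannel interface is
# hypothetical; set_intensity() stands in for applying a voltage.

class VibrotactileChannel:
    """Hypothetical wrapper around one independently wired vibrotactor."""

    def __init__(self, channel_id: int):
        self.channel_id = channel_id
        self.duty_cycle = 0.0  # 0.0 = off, 1.0 = full-strength vibration

    def set_intensity(self, duty_cycle: float) -> None:
        # Clamp to the valid range before "applying the voltage".
        self.duty_cycle = max(0.0, min(1.0, duty_cycle))


class HapticController:
    """Activates individual vibrotactors or groups of them."""

    def __init__(self, num_channels: int):
        self.channels = [VibrotactileChannel(i) for i in range(num_channels)]

    def activate(self, channel_id: int, intensity: float) -> None:
        self.channels[channel_id].set_intensity(intensity)

    def deactivate_all(self) -> None:
        for channel in self.channels:
            channel.set_intensity(0.0)


# Example: buzz two finger-mounted vibrotactors at different levels.
controller = HapticController(num_channels=5)
controller.activate(channel_id=0, intensity=0.8)   # e.g., thumb
controller.activate(channel_id=1, intensity=0.4)   # e.g., index finger
controller.deactivate_all()

Because each device has its own channel, individual vibrotactors can be energized at different intensities or switched off without affecting the others, which mirrors the independent coupling to the power source described above.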
[0098] The vibrotactile system 1200 may be implemented in a variety of ways. In some examples, the vibrotactile system 1200 may be a standalone system with integral subsystems and components for operation independent of other devices and systems. As another example, the vibrotactile system 1200 may be configured for interaction with another device or system 1270. For example, the vibrotactile system 1200 may, in some examples, include a communications interface 1280 for receiving and/or sending signals to the other device or system 1270. The other device or system 1270 may be a mobile device, a gaming console, an artificial-reality (e.g., virtual-reality, augmented-reality, mixed-reality) device, a personal computer, a tablet computer, a network device (e.g., a modem, a router, etc.), a handheld controller, etc. The communications interface 1280 may enable communications between the vibrotactile system 1200 and the other device or system 1270 via a wireless (e.g., Wi-Fi, Bluetooth, cellular, radio, etc.) link or a wired link. If present, the communications interface 1280 may be in communication with the processor 1260, such as to provide a signal to the processor 1260 to activate or deactivate one or more of the vibrotactile devices 1240.
[0099] The vibrotactile system 1200 may optionally include other subsystems and components, such as touch-sensitive pads 1290, pressure sensors, motion sensors, position sensors, lighting elements, and/or user interface elements (e.g., an on/off button, a vibration control element, etc.). During use, the vibrotactile devices 1240 may be configured to be activated for a variety of different reasons, such as in response to the user’s interaction with user interface elements, a signal from the motion or position sensors, a signal from the touch-sensitive pads 1290, a signal from the pressure sensors, a signal from the other device or system 1270, etc.
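Because the vibrotactile devices may be triggered by several different sources (user interface elements, motion or position sensors, the touch-sensitive pads, pressure sensors, or a signal from the other device or system 1270), one simple way to organize the activation logic is an event-to-response mapping. The Python sketch below is illustrative only; the event names and the print-based stand-in for activation are hypothetical.

# Illustrative routing of hypothetical trigger events to vibrotactile
# activation. In a real system, activate() would drive the hardware; here it
# only prints what it would do.

from typing import Callable, Dict

def activate(channel_id: int, intensity: float) -> None:
    # Stand-in for energizing one vibrotactile device at a given intensity.
    print(f"vibrotactor {channel_id} -> intensity {intensity:.1f}")

def make_event_router() -> Dict[str, Callable[[], None]]:
    """Map trigger sources to haptic responses (all names hypothetical)."""
    return {
        "ui_button_pressed": lambda: activate(0, 1.0),            # user interface element
        "touch_pad_contact": lambda: activate(1, 0.5),            # touch-sensitive pad
        "pressure_threshold_exceeded": lambda: activate(2, 0.7),  # pressure sensor
        "remote_activation_signal": lambda: activate(3, 0.3),     # other device or system
    }

# Dispatch one incoming trigger.
router = make_event_router()
router["touch_pad_contact"]()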
[0100] Although the power source 1250, the processor 1260, and the communications interface 1280 are illustrated in FIG. 12 as being positioned in the haptic device 1220, the present disclosure is not so limited. For example, one or more of the power source 1250, the processor 1260, or the communications interface 1280 may be positioned within the haptic device 1210 or within another wearable textile.
[0101] Haptic wearables, such as those shown in and described in connection with FIG. 12, may be implemented in a variety of types of artificial-reality systems and environments. FIG. 13 shows an example artificial-reality environment 1300 including one head-mounted virtual-reality display and two haptic devices (i.e., gloves); in other embodiments, any number and/or combination of these and other components may be included in an artificial-reality system. For example, in some embodiments there may be multiple head-mounted displays, each having an associated haptic device, with each head-mounted display and each haptic device communicating with the same console, portable computing device, or other computing system.
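Where several head-mounted displays and haptic devices share one console, the console needs some record of which haptic device belongs to which display so that feedback reaches the right user. The Python sketch below shows one hypothetical way such pairings could be tracked; the class name and identifiers are assumptions made for illustration only.

# Hypothetical registry a shared console might keep so that haptic feedback
# generated for one user's head-mounted display is sent to that user's
# paired haptic device. Names and structure are illustrative only.

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ConsoleRegistry:
    # Maps an HMD identifier to its paired haptic-device identifier.
    pairings: Dict[str, str] = field(default_factory=dict)

    def pair(self, hmd_id: str, haptic_id: str) -> None:
        self.pairings[hmd_id] = haptic_id

    def haptic_for(self, hmd_id: str) -> str:
        return self.pairings[hmd_id]

# Example: two users sharing one console, each with an HMD and a glove.
registry = ConsoleRegistry()
registry.pair("hmd-user-a", "glove-user-a")
registry.pair("hmd-user-b", "glove-user-b")
assert registry.haptic_for("hmd-user-b") == "glove-user-b"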
[0102] Head-mounted display 1302 generally represents any type or form of virtual-reality system, such as the virtual-reality system 200 in FIG. 2. Haptic device 1304 generally represents any type or form of wearable device, worn by a user of an artificial-reality system, that provides haptic feedback to the user to give the user the perception that he or she is physically engaging with a virtual object. In some embodiments, the haptic device 1304 may provide haptic feedback by applying vibration, motion, and/or force to the user. For example, the haptic device 1304 may limit or augment a user’s movement. To give a specific example, the haptic device 1304 may limit a user’s hand from moving forward so that the user has the perception that his or her hand has come in physical contact with a virtual wall. In this specific example, one or more actuators within the haptic device may achieve the physical-movement restriction by pumping fluid into an inflatable bladder of the haptic device. In some examples, a user may also use the haptic device 1304 to send action requests to a console. Examples of action requests include, without limitation, requests to start an application and/or end the application and/or requests to perform a particular action within the application.
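The paragraph above describes two directions of interaction: the haptic device sending action requests to a console, and the console causing the device to restrict movement (e.g., by inflating a bladder) when the user’s hand reaches a virtual wall. The Python sketch below illustrates both with hypothetical message formats and values; it is not a protocol defined by this disclosure.

# Illustrative sketch of the two interactions described above. The message
# formats and the bladder-pressure value are hypothetical placeholders.

import json

def build_action_request(action: str, payload: dict) -> str:
    """Serialize an action request (e.g., start or end an application)."""
    return json.dumps({"type": "action_request", "action": action, **payload})

def handle_console_command(command: str, set_bladder_pressure) -> None:
    """Apply a console command, e.g., inflate a bladder to resist motion."""
    message = json.loads(command)
    if message.get("type") == "limit_motion":
        # Inflating the bladder restricts forward hand movement, giving the
        # perception of contact with a virtual wall.
        set_bladder_pressure(message.get("pressure_kpa", 0.0))

# Usage with stand-in transport and actuator functions:
request = build_action_request("start_application", {"app_id": "demo"})
print(request)
handle_console_command(
    json.dumps({"type": "limit_motion", "pressure_kpa": 35.0}),
    set_bladder_pressure=lambda p: print(f"bladder pressure set to {p} kPa"),
)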
[0103] While haptic interfaces may be used with virtual-reality systems, as shown in FIG. 13, haptic interfaces may also be used with augmented-reality systems, as shown in FIG. 14. FIG. 14 is a perspective view of a user 1410 interacting with an augmented-reality system 1400. In this example, the user 1410 may wear a pair of augmented-reality glasses 1420 that have one or more displays 1422 and that are paired with a haptic device 1430. The haptic device 1430 may be a wristband that includes a plurality of band elements 1432 and a tensioning mechanism 1434 that connects the band elements 1432 to one another.
[0104] One or more of the band elements 1432 may include any type or form of actuator suitable for providing haptic feedback. For example, one or more of the band elements 1432 may be configured to provide one or more of various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. To provide such feedback, the band elements 1432 may include one or more of various types of actuators. In one example, each of the band elements 1432 may include a vibrotactor (e.g., a vibrotactile actuator) configured to vibrate in unison or independently to provide one or more of various types of haptic sensations to a user. Alternatively, only a single band element or a subset of band elements may include vibrotactors.
[0105] The haptic devices 1210, 1220, 1304, and 1430 may include any suitable number and/or type of haptic transducer, sensor, and/or feedback mechanism. For example, the haptic devices 1210, 1220, 1304, and 1430 may include one or more mechanical transducers, piezoelectric transducers, and/or fluidic transducers. The haptic devices 1210, 1220, 1304, and 1430 may also include various combinations of different types and forms of transducers that work together or independently to enhance a user’s artificial-reality experience. In one example, each of the band elements 1432 of the haptic device 1430 may include a vibrotactor (e.g., a vibrotactile actuator) configured to vibrate in unison or independently to provide one or more of various types of haptic sensations to a user.
[0106] By way of non-limiting examples, the following embodiments are included in the present disclosure.
Example Embodiments
……
……
……