Apple Patent | Haptic output system
Publication Number: 20210176548
Publication Date: 2021-06-10
Applicant: Apple
Abstract
A method of providing a haptic output includes detecting a condition; determining if a head-mounted haptic accessory comprising an array of two or more haptic actuators is being worn by a user; determining an actuation pattern for the array of haptic actuators; and in response to detecting the condition and determining that the head-mounted haptic accessory is being worn by the user, initiating the actuation pattern to produce a directional haptic output that is configured to direct the user’s attention along a direction.
Claims
1-20. (canceled)
21. A head-mounted electronic system comprising: a display device configured to display a virtual-reality environment, the virtual-reality environment including a virtual object; an audio device configured to produce audio outputs associated with the virtual-reality environment; and a haptic output system configured to produce a haptic output associated with the virtual-reality environment, wherein the haptic output indicates a virtual position of the virtual object within the virtual-reality environment.
22. The head-mounted electronic system of claim 21, wherein: the head-mounted electronic system further comprises: a sensor system configured to determine an orientation of a wearer’s head; and a processor configured to determine an actuation pattern for the haptic output system based at least in part on the virtual position of the virtual object and the orientation of the wearer’s head; and the actuation pattern is configured to direct the wearer’s attention toward the virtual position of the virtual object.
23. The head-mounted electronic system of claim 22, wherein: the haptic output system comprises a first haptic actuator and a second haptic actuator; and the actuation pattern comprises: a first actuation of the first haptic actuator; and a second actuation of the second haptic actuator.
24. The head-mounted electronic system of claim 23, wherein the first actuation occurs prior to the second actuation.
25. The head-mounted electronic system of claim 21, wherein, prior to the production of the haptic output, the virtual object is outside the wearer’s field of view of the virtual-reality environment.
26. The head-mounted electronic system of claim 21, wherein the virtual object is a notification indication.
27. The head-mounted electronic system of claim 21, wherein the virtual object is an audio source.
28. A head-mounted electronic system comprising: a head-mounted accessory comprising: a display configured to display a computer-generated virtual object to a wearer; and an array of haptic actuators configured to produce a directional haptic output that is configured to direct the wearer’s attention along a direction; and an audio system configured to produce an audio output.
29. The head-mounted electronic system of claim 28, wherein the array of haptic actuators comprises: a first haptic actuator configured to produce a first tactile sensation to a first area of the wearer’s body; and a second haptic actuator configured to produce a second tactile sensation to a second area of the wearer’s body.
30. The head-mounted electronic system of claim 29, wherein the directional haptic output comprises: a first actuation of the first haptic actuator; and a second actuation of the second haptic actuator, the second actuation produced after the first actuation of the first haptic actuator.
31. The head-mounted electronic system of claim 28, wherein the directional haptic output comprises a pattern of haptic actuations produced by the array of haptic actuators.
32. The head-mounted electronic system of claim 28, wherein: the computer-generated virtual object has a virtual position within a virtual-reality environment; and the directional haptic output is configured to direct the wearer’s attention towards the virtual position of the computer-generated virtual object in the virtual-reality environment.
33. The head-mounted electronic system of claim 32, wherein: the virtual object corresponds to an audio source; and the audio output is associated with the virtual object.
34. The head-mounted electronic system of claim 28, wherein: the head-mounted electronic system further comprises a sensing system configured to determine a change in orientation of the wearer’s head; and the directional haptic output changes in accordance with the change in the orientation of the wearer’s head.
35. The head-mounted electronic system of claim 28, wherein the audio system comprises a pair of earbuds.
36. A wearable electronic system comprising: a display configured to display a computer-generated virtual object; a first head-mounted haptic actuator configured to impart a first portion of a pattern of haptic outputs on a first area of a wearer’s head; a second head-mounted haptic actuator configured to impart a second portion of the pattern of haptic outputs on a second area of the wearer’s head, the second area different than the first area; and a head-mounted audio device configured to produce an audio output detectable by the wearer and associated with the display of the computer-generated virtual object.
37. The wearable electronic system of claim 36, wherein: the first head-mounted haptic actuator is positioned on a first side of the wearer’s head; and the second head-mounted haptic actuator is positioned on a second side of the wearer’s head, the second side opposite the first side.
38. The wearable electronic system of claim 37, further comprising a third head-mounted haptic actuator configured to impart a third portion of the pattern of haptic outputs on a third area of the wearer’s head, the third area different than the first area and the second area.
39. The wearable electronic system of claim 36, wherein the pattern of haptic outputs is configured to direct the wearer’s attention to a virtual position of the computer-generated virtual object.
40. The wearable electronic system of claim 36, wherein: the wearable electronic system further comprises: a first temple piece configured to support the wearable electronic system on a first ear of the wearer; and a second temple piece configured to support the wearable electronic system on a second ear of the wearer; and the first head-mounted haptic actuator is positioned on the first temple piece; and the second head-mounted haptic actuator is positioned on the second temple piece.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of U.S. patent application Ser. No. 16/191,373, filed Nov. 14, 2018, which is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/736,354, filed Sep. 25, 2018, the disclosures of which are hereby incorporated herein by reference in their entirety.
FIELD
[0002] The described embodiments relate generally to wearable electronic devices, and, more particularly, to wearable electronic devices that produce haptic outputs that can be felt by wearers of the electronic devices.
BACKGROUND
[0003] Wearable electronic devices are increasingly ubiquitous in modern society. For example, wireless audio devices (e.g., headphones, earbuds) are worn to provide convenient listening experiences for music and other audio. Head-mounted displays are worn to provide virtual or augmented reality environments to users for gaming, productivity, entertainment, and the like. Wrist-worn devices, such as smart watches, provide convenient access to various types of information and applications, including weather information, messaging applications, activity tracking applications, and the like. Some wearable devices, such as smart watches, may use haptic outputs to provide tactile alerts to the wearer, such as to indicate that a message has been received or that an activity goal has been reached.
SUMMARY
[0004] A method of providing a haptic output includes detecting a condition, determining if a head-mounted haptic accessory comprising an array of two or more haptic actuators is being worn by a user, determining an actuation pattern for the array of haptic actuators, and in response to detecting the condition and determining that the head-mounted haptic accessory is being worn by the user, initiating the actuation pattern to produce a directional haptic output that is configured to direct the user’s attention along a direction.
[0005] The head-mounted haptic accessory may include a pair of earbuds, each earbud including an earbud body, a speaker positioned within the earbud body, and a haptic actuator positioned within the earbud body and configured to impart a haptic output to the user’s ear. Detecting the condition may include detecting a presence of an audio source in an audio signal that is sent to the pair of earbuds. The method may further include determining a virtual position of the audio source relative to the user. Initiating the actuation pattern may include initiating a first haptic output at a first earbud of the pair of earbuds and subsequently initiating a second haptic output at a second earbud of the pair of earbuds. The directional haptic output may be configured to direct the user’s attention toward the direction, which corresponds to the virtual position of the audio source. The audio signal may correspond to audio of a teleconference having multiple participants, the audio source may correspond to a participant of the multiple participants, and each respective participant of the multiple participants may have a distinct respective virtual position relative to the user.
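The teleconference scenario above can be sketched in code. The following Python sketch is illustrative only: the participant names, the azimuth convention (0 rad straight ahead, positive to the listener's left), and the even angular spacing are assumptions, not details specified by the application. It assigns each participant a distinct virtual position and picks the earbud firing order that sweeps toward the active speaker:

```python
import math

def assign_virtual_positions(participants):
    """Spread participants evenly across the frontal half-circle,
    from -90 degrees (right) to +90 degrees (left)."""
    n = len(participants)
    return {
        p: math.radians(-90 + 180 * (i + 0.5) / n)
        for i, p in enumerate(participants)
    }

def earbud_sequence(azimuth_rad):
    """Return the order in which the two earbud actuators fire.

    Firing the far-side earbud first and the near-side earbud second
    produces a sensation that 'moves' toward the speaker's side.
    """
    if azimuth_rad > 0:          # speaker is to the wearer's left
        return ["right", "left"]
    elif azimuth_rad < 0:        # speaker is to the wearer's right
        return ["left", "right"]
    return ["left", "right"]     # straight ahead: either order works

positions = assign_virtual_positions(["Ana", "Ben", "Chloe"])
sequence = earbud_sequence(positions["Ana"])  # Ana sits to the right
```

With three participants, "Ana" lands at -60 degrees (right side), so the left earbud fires first and the right earbud second, nudging the wearer's attention rightward.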
[0006] The head-mounted haptic accessory may include an earbud including an earbud body and a haptic actuator positioned within the earbud body and comprising a movable mass, and initiating the actuation pattern may cause the haptic actuator to move the movable mass along an actuation direction that is configured to impart a reorientation force on the user.
[0007] Detecting the condition may include detecting a presence of an audio source in an audio signal that is sent to the pair of earbuds. The method may further include determining a virtual position of the audio source relative to the user, after initiating the actuation pattern, determining the user’s orientation relative to the virtual position of the audio source, and increasing a volume of an audio output corresponding to the audio signal as the user’s orientation becomes aligned with the virtual position of the audio source.
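One way to realize the volume-increase-with-alignment behavior is a smooth gain curve over the angle between the wearer's heading and the source's virtual bearing. This is a minimal sketch under assumed conventions (azimuths in radians, a cosine falloff, and an arbitrary `min_gain` floor), not the method the application mandates:

```python
import math

def alignment_gain(heading, source_bearing, min_gain=0.2):
    """Volume gain in [min_gain, 1.0] that rises as the wearer's
    heading aligns with the virtual position of the audio source."""
    # Signed angular difference wrapped to [-pi, pi].
    diff = (source_bearing - heading + math.pi) % (2 * math.pi) - math.pi
    # Cosine falloff: 1 when aligned, 0 when facing directly away.
    alignment = (1 + math.cos(diff)) / 2
    return min_gain + (1 - min_gain) * alignment

g_aligned = alignment_gain(0.0, 0.0)        # facing the source: full volume
g_opposite = alignment_gain(0.0, math.pi)   # facing away: floor volume
```

As the wearer turns toward the source, the gain climbs monotonically from the floor toward 1.0, reinforcing the directional haptic cue with an audible reward.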
[0008] Detecting the condition may include detecting a notification associated with a graphical object. The graphical object may have a virtual position in a virtual environment being presented to the user, and the directional haptic output may be configured to direct the user’s attention toward the direction, which corresponds to the virtual position of the graphical object.
[0009] Detecting the condition may include detecting an interactive object in a virtual environment being presented to the user. The interactive object may have a virtual position within the virtual environment, and the directional haptic output may be configured to direct the user’s attention toward the direction, which corresponds to the virtual position of the interactive object.
[0010] An electronic system may include an earbud comprising an earbud body configured to be received at least partially within an ear of a user, a speaker positioned within the earbud body and configured to output sound into an ear canal of the user’s ear, and a haptic actuator positioned within the earbud body and configured to impart a haptic output to the user’s ear. The haptic actuator may be a linear resonant actuator having a linearly translatable mass that is configured to produce the haptic output.
[0011] The electronic system may further include a processor communicatively coupled with the haptic actuator and configured to detect a condition, determine an actuation pattern for the haptic actuator, and in response to detecting the condition, initiate the haptic output in accordance with the actuation pattern. The electronic system may further include a portable electronic device in wireless communication with the earbud, and the processor may be within the portable electronic device.
[0012] The electronic system may further include an additional earbud comprising an additional earbud body, an additional speaker positioned within the additional earbud body, and an additional haptic actuator positioned within the additional earbud body. The haptic actuator may include a mass configured to move along a horizontal direction when the earbud is worn in the user’s ear, and the mass may be configured to produce an impulse that is perceptible as a force acting on the user’s ear in a single direction.
[0013] A method of providing a haptic output may include detecting an audio feature in audio data, determining a characteristic frequency of the audio feature, causing a wearable electronic device to produce an audio output corresponding to the audio data and including the audio feature, and while the audio feature is being outputted, causing a haptic actuator of the wearable electronic device to produce a haptic output at a haptic frequency that corresponds to the characteristic frequency of the audio feature. The haptic frequency may be a harmonic or subharmonic of the characteristic frequency. The haptic output may be produced for an entire duration of the audio feature.
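The harmonic/subharmonic selection described above can be sketched as a simple search. Haptic actuators such as linear resonant actuators respond well only within a narrow band, so the sketch walks down the subharmonics (f/2, f/3, ...) of the feature's characteristic frequency until one fits; the band limits here are illustrative assumptions, not values from the application:

```python
ACTUATOR_BAND = (50.0, 300.0)  # assumed usable actuator range in Hz

def haptic_frequency(characteristic_hz, band=ACTUATOR_BAND):
    """Pick a haptic drive frequency that is the characteristic
    frequency itself, or a harmonic/subharmonic of it, inside `band`."""
    lo, hi = band
    if lo <= characteristic_hz <= hi:
        return characteristic_hz      # drive at the feature frequency
    if characteristic_hz > hi:
        # Subharmonics f/2, f/3, ... until one fits under the band ceiling.
        n = 2
        while characteristic_hz / n > hi:
            n += 1
        f = characteristic_hz / n
        return f if f >= lo else lo
    # Feature below the band: harmonics 2f, 3f, ... up to the floor.
    n = 2
    while characteristic_hz * n < lo:
        n += 1
    f = characteristic_hz * n
    return f if f <= hi else hi

f = haptic_frequency(880.0)  # an A5 note maps to a subharmonic in band
```

For an 880 Hz feature, f/2 = 440 Hz still exceeds the assumed ceiling, so the sketch settles on f/3 ≈ 293 Hz, which the actuator can reproduce.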
[0014] Detecting the audio feature may include detecting a triggering event in the audio data, and the triggering event may correspond to a rate of change of volume of the audio output that satisfies a threshold. Detecting the audio feature may include detecting audio content within a target frequency range.
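The rate-of-change triggering event can be illustrated with a per-frame volume scan. The frame rate and threshold below are assumed values chosen only to make the sketch concrete:

```python
def detect_onsets(volumes, frame_rate=100.0, threshold=5.0):
    """Return indices of frames where the volume slope (volume units
    per second) meets or exceeds `threshold`."""
    onsets = []
    for i in range(1, len(volumes)):
        slope = (volumes[i] - volumes[i - 1]) * frame_rate
        if slope >= threshold:
            onsets.append(i)
    return onsets

# A quiet passage with one sudden swell at frame 3.
frames = [0.10, 0.11, 0.10, 0.60, 0.58, 0.57]
onsets = detect_onsets(frames)
```

Only the jump from 0.10 to 0.60 produces a slope above the threshold, so a single triggering event is reported; gradual changes and decays are ignored.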
[0015] The method may further include determining a variation in an audio characteristic of the audio feature and varying a haptic characteristic of the haptic output in accordance with the variation in the audio characteristic of the audio feature. The variation in the audio characteristic of the audio feature may be a variation in an amplitude of the audio feature, and varying a component of the haptic output in accordance with the variation in the audio characteristic of the audio feature may include varying an intensity of the haptic output in accordance with the variation in the amplitude.
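Varying haptic intensity with the audio amplitude can be as simple as tracking a normalized envelope. This sketch assumes a 0-to-1 actuator drive scale, which is an illustrative convention rather than a specified one:

```python
def haptic_intensities(envelope):
    """Map an audio amplitude envelope to haptic drive levels in [0, 1],
    with the loudest frame mapped to full actuator drive."""
    peak = max(envelope) or 1.0   # guard against an all-zero envelope
    return [a / peak for a in envelope]

# A swelling-then-fading audio feature.
envelope = [0.2, 0.5, 1.0, 0.5, 0.1]
drive = haptic_intensities(envelope)
```

The haptic output then swells and fades in lockstep with the audio feature, which is the behavior paragraph [0015] describes.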
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
[0017] FIGS. 1A-1B depict an example electronic system in use by a user.
[0018] FIGS. 2A-2B depict an example head-mounted haptic accessory.
[0019] FIGS. 3A-3B depict another example head-mounted haptic accessory.
[0020] FIGS. 4A-4B depict another example head-mounted haptic accessory.
[0021] FIG. 5 depicts an example process for producing a haptic output.
[0022] FIG. 6A depicts an example directional haptic output produced by a head-mounted haptic accessory.
[0023] FIG. 6B depicts additional examples of directional haptic outputs produced by a head-mounted haptic accessory.
[0024] FIGS. 7A-7B depict an additional example directional haptic output produced by a head-mounted haptic accessory.
[0025] FIG. 8 depicts an example haptic output scheme.
[0026] FIG. 9 depicts an example chart showing differences between various head-mounted haptic accessories.
[0027] FIGS. 10A-10B depict participants in a teleconference.
[0028] FIG. 11 depicts participants in a teleconference.
[0029] FIGS. 12A-12B depict a user engaged in a virtual-reality environment.
[0030] FIG. 13A depicts an example audio feature in audio data.
[0031] FIG. 13B depicts an example haptic output associated with the audio feature of FIG. 13A.
[0032] FIGS. 14A-14B depict a spatial arrangement of a user and two audio sources.
DETAILED DESCRIPTION
[0033] Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
[0034] The embodiments herein are generally directed to wearable electronic devices that include haptic actuators, and more particularly, to haptic outputs that are coordinated with a position of a virtual object (which may correspond to or represent a person, an audio source, an instrument, a graphical object, etc.) relative to the wearer of the electronic device. The wearable electronic devices may include an array of haptic actuators (e.g., two or more haptic actuators) that can be actuated according to an actuation pattern in order to direct the wearer’s attention in a particular direction. For example, an array of haptic actuators in contact with various locations on a wearer’s head may be actuated in a pattern that produces a sensation having a distinct directional component. More particularly, the user may feel the pattern moving left or right. The user may then be motivated to turn his or her head or body in the direction indicated by the haptic pattern.
[0035] Indicating a direction via directional haptic outputs may be used to enhance various types of interactions with audio and/or visual content, and in particular to enhance interaction with content that has a real or virtual position relative to the wearer, and/or content that has a visual or audible component. For example, and as described in greater detail herein, directional haptic outputs may be used to direct a wearer’s attention along a direction towards a virtual location of a participant in a multi-party telephone conference. As another example, a directional haptic output may be used to direct a user’s attention towards the position of a graphical object in a virtual or augmented reality environment.
[0036] Haptic outputs provided via a wearable electronic device may also be used to enhance an experience of consuming audio or video content. For example, haptic outputs may be synchronized with certain audio features in a musical work or with audio or visual features of video content. In the context of music, the haptic outputs may be synchronized with notes from a certain instrument or notes having a certain prominence in the music. In some cases, the position of the wearer relative to a virtual position of an instrument may also affect the haptic output provided to the user. In the context of video, the haptic outputs may be synchronized with some visual and/or audio content of the video, such as by initiating a haptic output when an object appears to move towards or near the viewer.
[0037] These and other haptic outputs may be imparted to the user via various types of wearable devices. For example, a pair of earbuds, such as those that are conventionally used to provide audio to a user, may include haptic actuators that can produce haptic or tactile sensations to a user’s ear. As used herein, the term ear may refer to any portion of an ear of a person, including the outer ear, middle ear, and/or inner ear. The outer ear of a person may include the auricle or pinna (e.g., the visible part of the ear that is external to a person’s head) and the ear canal. Earbuds may reside at least partially in the ear canal, and may contact portions of the ear canal and/or the auricle of the ear. Accordingly, haptic actuators in earbuds may produce haptic or tactile sensations on the auricle and/or ear canal of a person’s ear.
[0038] As another example, a pair of glasses may include haptic actuators (e.g., on the temple pieces and/or nose bridge). As yet another example, a headband, hat, or other head-worn object may include haptic actuators. In some cases, these wearable device(s) include an array of two or more haptic actuators, which may facilitate the production of directional haptic outputs by using different types of actuation patterns for the various actuators in the array.
[0039] FIGS. 1A-1B illustrate right and left sides, respectively, of a user 100 using an electronic system 101. The electronic system 101 may include a head-mounted haptic accessory 102 and a processing system 104, and may define or be referred to as a haptic output system. For example, the head-mounted haptic accessory 102 and the portions of the processing system 104 that interact with the head-mounted haptic accessory 102 (or otherwise provide functionality relating to producing haptic outputs via the head-mounted haptic accessory 102) may define the haptic output system.
[0040] The head-mounted haptic accessory 102 is shown as a pair of earbuds that are configured to be positioned within an ear of the user 100. The head-mounted haptic accessory 102 may include an array of two or more haptic actuators. For example, in the case of the earbuds shown in FIGS. 1A-1B, each earbud may include a haptic actuator to define an array of two haptic actuators in contact with the user 100 (e.g., with the user’s ears). In other embodiments, as described herein, the head-mounted haptic accessory may be another type of wearable, head-mounted device, such as over-ear or on-ear headphones, in-ear monitors, a pair of glasses, a headband, a hat, a head-mounted display, etc. In some cases, the head-mounted haptic accessory 102 may also include one or more speakers that produce audio outputs.
[0041] The electronic system 101 may include a processing system 104, which may be a device that is separate from the head-mounted haptic accessory 102 (as shown in FIG. 1A), or it may be integrated with the head-mounted haptic accessory 102. The processing system 104 is depicted in FIG. 1A as a portable electronic device, such as a mobile phone or smartphone, however, this merely represents one type or form factor for the processing system 104. In other cases, the processing system 104 may be another type of portable electronic device, such as a tablet computer, a wearable electronic device (e.g., a smart watch, a head-mounted display), a notebook computer, or any other suitable portable electronic device. In some cases, the processing system 104 may be another type of electronic or computing device, such as a desktop computer, a gaming console, a voice-activated digital assistant, or any other suitable electronic device. The processing system 104 may perform various operations of the electronic system 101, including for example determining whether a head-mounted haptic accessory 102 is being worn, determining when haptic outputs are to be produced via the head-mounted haptic accessory 102, determining actuation patterns for the haptic actuators of the head-mounted haptic accessory 102, and the like. The processing system 104 may also provide audio signals to the head-mounted haptic accessory 102 (such as where the head-mounted haptic accessory 102 is a pair of headphones or earbuds). Audio signals may be digital or analog, and may be processed by the processing system 104 and/or the head-mounted haptic accessory 102 to produce an audio output (e.g., audible sound). Audio signals may correspond to, include, or represent audio data from various different sources, such as teleconference voice data, an audio portion of a real-time video stream, an audio track of a recorded video, an audio recording (e.g., music, podcast, spoken word, etc.), or the like. 
The processing system 104 may also perform other operations of the electronic system 101 as described herein.
[0042] FIG. 2A is a side view of a user 200 wearing a head-mounted haptic accessory that includes earbuds 202 each having a haptic actuator positioned within an earbud body. FIG. 2B is a schematic top view of the user 200, illustrating how the earbuds 202 define an array of haptic actuation points 204 on the head of the user 200. Because the earbuds 202 (or another pair of headphones or head-worn audio device) are positioned on or in the ears of the user 200, the haptic actuation points are on opposite lateral sides of the user’s head.
[0043] FIG. 3A is a side view of a user 300 wearing a head-mounted haptic accessory embodied as a pair of glasses 302 that includes haptic actuators 303 positioned at various locations on the glasses 302. For example, an actuator may be positioned on each temple piece, and another may be positioned on a nose bridge segment of the glasses 302. FIG. 3B is a schematic top view of the user 300, illustrating how the glasses 302, and more particularly the actuators 303 of the glasses 302, define an array of haptic actuation points 304 on the head of the user 300. As shown in FIG. 3B, two haptic actuation points are positioned on opposite lateral sides of the head, and one is positioned on the center of the head (e.g., on or near the bridge of the user’s nose). In some cases, more or fewer haptic actuators may be included in the glasses 302. For example, the actuator on the nose bridge segment may be omitted.
[0044] FIG. 4A is a side view of a user 400 wearing a head-mounted haptic accessory embodied as a headband 402 that includes haptic actuators 403 positioned at various locations along the headband 402. For example, eight actuators 403 may be positioned at various locations around the headband 402, though more or fewer actuators 403 are also contemplated. FIG. 4B is a schematic top view of the user 400, illustrating how the headband 402, and more particularly the actuators 403 of the headband 402, define an array of haptic actuation points 404 on the head of the user 400. As shown in FIG. 4B, the actuation points 404 are positioned equidistantly around the circumference of the user’s head, though this is merely one example arrangement. Further, while FIGS. 4A-4B illustrate the head-mounted haptic accessory as a headband, this embodiment may equally represent any head-worn clothing, device, or accessory that wraps around some or all of the user’s head, including but not limited to hats, caps, head-mounted displays, hoods, visors, helmets, and the like.
[0045] The arrays of haptic actuators shown and described with respect to FIGS. 2A-4B illustrate examples in which the haptic actuators define a radial array of actuators that at least partially encircle or surround a user’s head. The radial array configurations may help convey directionality to the user via the haptic outputs. For example, the haptic actuators of the various head-mounted haptic accessories may be initiated in accordance with an actuation pattern that is recognizable as indicating a particular direction to a user. Such directional haptic outputs can be used to direct a user’s attention in a particular direction, such as towards a virtual position of a virtual audio source. By directing the user’s attention in this way, the user may be subtly directed to move his or her head to face the position of the virtual audio source, which may increase engagement of the wearer with the audio source, especially where multiple audio sources (and thus multiple positions) are active. Additional details of example actuation patterns and particular use cases for producing the actuation patterns are described herein.
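The way a radial array can convey directionality may be sketched as a "sweep" of actuations ending at the point nearest a target bearing, in the spirit of the headband of FIGS. 4A-4B. The eight-actuator layout, three-step sweep, and angle convention (0 rad straight ahead, increasing counterclockwise from above) are illustrative assumptions:

```python
import math

def actuator_angles(n):
    """Azimuths of n actuation points spaced evenly around the head."""
    return [2 * math.pi * i / n for i in range(n)]

def sweep_sequence(target_bearing, n=8, steps=3):
    """Return the indices of `steps` actuators fired in order, sweeping
    around the ring and ending at the actuator nearest `target_bearing`."""
    angles = actuator_angles(n)

    def dist(a):
        d = abs(a - target_bearing) % (2 * math.pi)
        return min(d, 2 * math.pi - d)  # shortest angular distance

    end = min(range(n), key=lambda i: dist(angles[i]))
    return [(end - steps + 1 + k) % n for k in range(steps)]

# Target 90 degrees to the wearer's left on an 8-actuator headband.
seq = sweep_sequence(math.radians(90))
```

The wearer feels three taps marching around the head and stopping at the actuator closest to the target, suggesting where to turn.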
[0046] FIG. 5 is an example flow chart of a method 500 of operating an electronic system that produces directional haptic outputs, as described herein. At operation 502, a condition is detected (e.g., by the electronic system 101). The condition may be any suitable condition that is a triggering event for initiating a haptic output (e.g., a directional haptic output) via a wearable haptic device (e.g., a head-mounted haptic accessory 102). For example, detecting the condition may include or correspond to detecting a presence of an audio source in an audio signal, where the audio source may be associated with a virtual position relative to the user. More particularly, as described in greater detail with respect to FIGS. 10A-10B, if the user is engaged in a conference call with multiple participants, each participant may have an assigned virtual location relative to the user. In this case, detecting the condition may include detecting that one of the participants is speaking or otherwise producing audio. Detecting the condition may also include detecting whether a characteristic of a signal, including but not limited to a volume or amplitude of an audio output corresponding to an audio signal, has satisfied a threshold value. For example, in the context of a multi-party conference call, detecting the condition may include detecting that an audio output associated with one of the participants has satisfied a threshold value (e.g., a threshold volume).
[0047] As another example, detecting the condition may include or correspond to detecting a notification indicating that the user has received a message, or that a graphical object (or audio message) has been received or is otherwise available in a virtual environment. As yet another example, detecting the condition may include or correspond to detecting the presence of an interactive object or affordance in a virtual environment. As used herein, an interactive object may correspond to or be associated with a graphical object in a virtual environment and that a user can interact with in a manner beyond mere viewing. For example, a user may be able to select the interactive object, virtually manipulate the interactive object, provide inputs to the interactive object, or the like. As one specific example, where the virtual environment corresponds to a gaming application, an interactive object may be an item that the user may select and add to his or her inventory. As another specific example, where the virtual environment corresponds to a word processing application, the interactive object may be a selectable icon that controls a program setting of the application.
[0048] At operation 504, it is determined whether a wearable haptic accessory is being worn by a user. For example, a processing system 104 may detect whether a head-mounted haptic accessory 102 is being worn by a user. In some cases, the head-mounted haptic accessory 102 may determine whether it is being worn by either sensing the presence of the user (using, for example, a proximity sensor), or by inferring from an orientation or motion of the head-mounted haptic accessory 102 that it is being worn (using, for example, an accelerometer, magnetometer, or other motion sensor). The head-mounted haptic accessory 102 may report to the processing system 104 whether it is or is not being worn. If the processing system 104 cannot communicate with a head-mounted haptic accessory, the processing system 104 may assume that no head-mounted haptic accessory is available.
[0049] If it is determined that a head-mounted haptic accessory is being worn by a user, a directional component for a haptic output may be determined at operation 506. The directional component for the haptic output may correspond to a direction that a user must turn his or her head or body in order to be facing a desired position or location. For example, if a user is not facing a virtual position or location of an audio source, the directional component for the haptic output may be a direction that the user must turn his or her head or body in order to face the virtual position or location. In some cases, the determination of the directional component for the haptic output may be based at least in part on an orientation of the wearer of the head-mounted haptic accessory. Such information may be determined by the head-mounted haptic accessory, such as via sensors (e.g., accelerometers, magnetometers, gyroscopes, orientation sensors) incorporated with the head-mounted haptic accessory. Such information may be reported to the processing system 104, which may then determine the directional component. Determining the directional component may also include determining an actuation pattern for an array of actuators on the head-mounted haptic accessory. For example, if the directional component indicates that the user needs to turn his or her head 30 degrees to the left, the pattern may cause the haptic actuators to fire in a sequence that moves across the user’s body from right to left.
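The core of operation 506 is a signed-angle computation: how far, and in which rotational direction, must the wearer turn to face the target. This sketch adopts an assumed convention (positive angles mean "turn left"); it is one plausible implementation, not the only one:

```python
import math

def turn_angle(head_yaw, target_bearing):
    """Signed angle (radians) from the current heading to the target,
    wrapped to (-pi, pi]. Positive means the target is to the left."""
    d = (target_bearing - head_yaw) % (2 * math.pi)
    return d - 2 * math.pi if d > math.pi else d

def sequence_direction(head_yaw, target_bearing):
    """Choose which way the haptic sequence travels across the body:
    a leftward turn is cued by actuators firing right-to-left."""
    angle = turn_angle(head_yaw, target_bearing)
    return "right_to_left" if angle > 0 else "left_to_right"

# Wearer faces 0 rad; target is 30 degrees to the left.
direction = sequence_direction(0.0, math.radians(30))
```

Matching the example in paragraph [0049], a target 30 degrees to the left yields a right-to-left firing sequence.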
[0050] At operation 508, in response to detecting the condition, determining that the haptic accessory is being worn by the user, and determining the directional component for the haptic output (e.g., determining the actuation pattern), the haptic output may be produced. As described herein, this may include sending a signal to the haptic accessory that will cause the haptic accessory to produce the haptic output in accordance with the directional component. As described in greater detail herein, the haptic output may produce a sensation that has an identifiable directional component or that otherwise suggests a particular direction to a user. For example, a sequence of haptic outputs may travel around a user’s head from left to right, indicating that the user should direct his or her orientation along that direction (e.g., to the right). As another example, a haptic output may produce a tugging or pulling sensation that suggests the direction that a user should move (e.g., rotate) his or her head.
[0051] In some cases, a signal defining or containing the actuation may be sent to the haptic accessory from the processing system. In other cases, data defining haptic patterns is stored in the haptic accessory, and the processing system sends a message (and optionally an identifier of a particular actuation pattern) to the haptic accessory that causes the haptic accessory to produce the haptic output.
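The two delivery options in paragraph [0051] can be contrasted with a small sketch. This is illustrative only: the message fields and function name are assumptions, not taken from the patent.

```python
def build_haptic_message(pattern_id=None, envelope=None):
    """Build the message the processing system sends to the accessory.

    If the accessory stores its own haptic patterns, only a pattern
    identifier need be sent; otherwise the message carries the full
    actuation data (here, per-actuator intensity samples)."""
    if pattern_id is not None:
        return {"type": "trigger_stored", "pattern_id": pattern_id}
    if envelope is not None:
        return {"type": "play_envelope", "envelope": envelope}
    raise ValueError("must supply a stored-pattern id or a full envelope")

# Accessory with stored patterns: a short trigger message suffices.
msg = build_haptic_message(pattern_id="sweep_right")
```

Sending only an identifier keeps the over-the-air message small, which may matter for a low-power wireless accessory such as a pair of earbuds.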
[0052] FIG. 5 describes a general framework for the operation of an electronic system as described herein. It will be understood that certain operations described herein may correspond to operations explicitly described with respect to FIG. 5, while other operations may be included instead of or in addition to operations described with respect to FIG. 5.
[0053] As described above, haptic outputs delivered via a head-mounted haptic accessory may include a directional component or may otherwise be configured to direct the user’s attention along a particular direction. In order to indicate a direction to a user, an actuation pattern or sequence may be used to produce a tactile sensation that suggests a particular direction to the wearer. Actuation patterns where haptic outputs are triggered or produced sequentially (e.g., at different times) may be referred to as a haptic sequence or actuation sequence.
[0054] FIGS. 6A-6B are schematic top views of a user wearing various types of head-mounted haptic accessories, as well as example actuation patterns that may produce the intended tactile sensation. FIG. 6A illustrates a schematic top view of a user 600 having a head-mounted haptic accessory with two actuation points 602-1, 602-2. The head-mounted haptic accessory may correspond to a pair of earbuds or other headphones that are worn on, in, or around the user’s ears. Alternatively, the head-mounted haptic accessory may be any device that defines two haptic actuation points.
[0055] FIGS. 6A-6B provide an example of how a haptic output may be configured to orient a user toward a virtual object or direct the user’s attention along a particular direction. For example, in order to produce a haptic output to direct the user 600 to turn to the right (indicated by arrow 604), the electronic system may initiate a haptic sequence 605 that causes an actuator associated with the first actuation point 602-1 to produce a haptic output 606 that decreases in intensity over a time span. (Arrow 610 in FIG. 6A indicates a time axis of the actuation sequence.) After, or optionally overlapping with, the first haptic output 606, a haptic actuator associated with the second actuation point 602-2 may produce a haptic output 608 that increases in intensity over a time span. This haptic sequence may produce a tactile sensation that is indicative or suggestive of a right-hand direction, which may signal to the wearer that he or she should turn his or her head to the right.
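The two-point sequence of paragraph [0055] amounts to a crossfade: one actuation point fades out while the other fades in, the ramps overlapping in time. A minimal sketch, assuming normalized intensities from 0.0 to 1.0 sampled at uniform steps (the function name and step count are illustrative):

```python
def crossfade_sequence(steps=5):
    """Intensity envelopes for a two-point rightward cue: the first
    actuation point ramps down while the second ramps up, so the
    perceived output migrates from one point to the other."""
    first = [1.0 - i / (steps - 1) for i in range(steps)]
    second = [i / (steps - 1) for i in range(steps)]
    return first, second

first, second = crossfade_sequence(5)
# first:  1.00, 0.75, 0.50, 0.25, 0.00  (actuation point 602-1 fading out)
# second: 0.00, 0.25, 0.50, 0.75, 1.00  (actuation point 602-2 fading in)
```

Because the two envelopes sum to a constant, the overall intensity stays steady while its apparent location shifts, which is one way to suggest a direction with only two actuation points.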
[0056] The intensity of a haptic output may correspond to any suitable characteristic or combination of characteristics of a haptic output that contribute to the perceived intensity of the haptic output. For example, changing an intensity of a haptic output may be achieved by changing an amplitude of a vibration of the haptic actuator, by changing a frequency of a vibration of the haptic actuator, or a combination of these actions. In some cases, higher intensity haptic outputs may be associated with relatively higher amplitudes and relatively lower frequencies, whereas lower intensity haptic outputs may be associated with relatively lower amplitudes and relatively higher frequencies.
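The amplitude/frequency scheme in paragraph [0056] can be expressed as a simple mapping. In this sketch the amplitude and frequency ranges are illustrative placeholders, not values from the patent; only the direction of the mapping (higher intensity, higher amplitude, lower frequency) follows the text.

```python
def drive_parameters(intensity, amp_range=(0.1, 1.0), freq_range_hz=(80.0, 250.0)):
    """Map a normalized intensity (0.0-1.0) to actuator drive parameters:
    amplitude increases with intensity while frequency decreases."""
    amp_lo, amp_hi = amp_range
    f_lo, f_hi = freq_range_hz
    amplitude = amp_lo + intensity * (amp_hi - amp_lo)
    frequency = f_hi - intensity * (f_hi - f_lo)
    return amplitude, frequency
```

Combining both characteristics, rather than varying amplitude alone, can widen the perceptual range between the weakest and strongest outputs a small actuator can produce.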
[0057] FIG. 6B illustrates a schematic top view of a user 611 having a head-mounted haptic accessory with three actuation points 612-1, 612-2, and 612-3. The head-mounted haptic accessory may correspond to a pair of glasses (e.g., the glasses 302, FIG. 3A), a headband (e.g., the headband 402, FIG. 4A), or any other suitable head-mounted haptic accessory.
[0058] In order to produce a haptic output that is configured to direct the user’s attention along a given direction, and more particularly to direct the user 611 to turn to the right (indicated by arrow 614), the electronic system may initiate an actuation sequence 615. The actuation sequence 615 may cause an actuator associated with the first actuation point 612-1 to produce a first haptic output 616, then cause an actuator associated with the second actuation point 612-2 to produce a second haptic output 618, and then cause an actuator associated with the third actuation point 612-3 to produce a third haptic output 620. (Arrow 622 in FIG. 6B indicates a time axis of the actuation sequence.) The actuation sequence 615 thus produces a series of haptic outputs that move along the user’s head from left to right. This haptic sequence may produce a tactile sensation that is indicative or suggestive of a right-hand direction, which may signal to the wearer that he or she should turn his or her head to the right. As shown, the haptic outputs 616, 618, 620 do not overlap, though in some implementations they may overlap.
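The non-overlapping three-point sequence of paragraph [0058] can be sketched as a simple schedule of start and end times. The pulse and gap durations below are illustrative assumptions, as is the function name; reversing the actuator list would produce the opposite (leftward) cue.

```python
def sweep_schedule(actuator_ids, pulse_ms=120, gap_ms=40):
    """Return (actuator, start_ms, end_ms) tuples for a sequential,
    non-overlapping sweep across the listed actuation points. Each
    pulse ends before the next begins, matching FIG. 6B's sequence."""
    schedule = []
    t = 0
    for actuator in actuator_ids:
        schedule.append((actuator, t, t + pulse_ms))
        t += pulse_ms + gap_ms
    return schedule

# Left-to-right sweep across three actuation points (rightward cue).
schedule = sweep_schedule(["612-1", "612-2", "612-3"])
```

Allowing the pulses to overlap, as the paragraph notes some implementations may, would amount to setting a negative gap or crossfading adjacent pulses as in the two-point example.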