Patent: Virtual wearables

Publication Number: 20210157149

Publication Date: 20210527

Applicant: Intel

Abstract

A mechanism is described for dynamically facilitating virtual wearables according to one embodiment. A method of embodiments, as described herein, includes detecting a wearable area. The wearable area may represent a human body part of a primary user. The method may further include scanning the wearable area to facilitate suitability of the wearable area for projection of a virtual wearable, and projecting the virtual wearable on the wearable area using a primary wearable device of the primary user such that the projecting is performed via a projector of the primary wearable device.

Claims

1.-24. (canceled)

  25. At least one storage device comprising instructions that, when executed, cause one or more processors to: cause a virtual wearable accessory to be projected on a portion of a body of a user, the virtual wearable accessory to display first content; detect a gesture performed by the user based on one or more signals output by at least one sensor associated with the body of the user; determine whether the gesture corresponds to a command for the virtual wearable accessory; in response to determining that the gesture corresponds to the command for the virtual wearable accessory, cause the virtual wearable accessory to display second content, the second content associated with the command, the second content different than the first content; and in response to determining that the gesture does not correspond to the command for the virtual wearable accessory, take no action with respect to the virtual wearable accessory.

  26. The at least one storage device of claim 25, wherein the instructions, when executed, cause the one or more processors to: determine an orientation of a face of the user relative to the virtual wearable accessory; and verify that the gesture corresponds to the command for the virtual wearable accessory based on the orientation of the face of the user.

  27. The at least one storage device of claim 25, wherein (a) in response to the gesture occurring in a first orientation relative to the virtual wearable accessory, the gesture is to correspond to a first interaction with the second content and (b) in response to the gesture occurring in a second orientation relative to the virtual wearable accessory, the gesture is to correspond to a second interaction with the second content, the second orientation different than the first orientation.

  28. The at least one storage device of claim 25, wherein the second content includes a virtual object and the instructions, when executed, cause the one or more processors to: determine a velocity of the gesture based on one or more of image data including the user or the one or more signals output by the at least one sensor; and cause the display of the virtual object to move relative to the virtual wearable accessory based on the velocity of the gesture.

  29. The at least one storage device of claim 25, wherein the command is an authentication command to enable access to the first content and the second content.

  30. The at least one storage device of claim 25, wherein the instructions, when executed, cause the one or more processors to: detect a presence of a physical wearable accessory worn on the portion of the body of the user based on one or more of image data including the user or the one or more signals output by the at least one sensor; and cause the virtual wearable accessory to be projected on a location of the portion of the body of the user proximate to the physical wearable accessory.

  31. An apparatus comprising: memory; and at least one processor to execute instructions to: cause a virtual wearable accessory to be projected on a portion of a body of a user, the virtual wearable accessory to display first content; detect a gesture performed by the user based on one or more signals output by at least one sensor associated with the body of the user; determine whether the gesture corresponds to a command for the virtual wearable accessory; in response to determining that the gesture corresponds to the command for the virtual wearable accessory, cause the virtual wearable accessory to display second content, the second content associated with the command, the second content different than the first content; and in response to determining that the gesture does not correspond to the command for the virtual wearable accessory, take no action with respect to the virtual wearable accessory.

  32. The apparatus of claim 31, wherein the instructions, when executed, cause the one or more processors to: determine an orientation of a face of the user relative to the virtual wearable accessory; and verify that the gesture corresponds to the command for the virtual wearable accessory based on the orientation of the face of the user.

  33. The apparatus of claim 32, wherein (a) in response to the gesture occurring in a first orientation relative to the virtual wearable accessory, the gesture is to correspond to a first interaction with the second content and (b) in response to the gesture occurring in a second orientation relative to the virtual wearable accessory, the gesture is to correspond to a second interaction with the second content, the second orientation different than the first orientation.

  34. The apparatus of claim 31, wherein the second content includes a virtual object and the instructions, when executed, cause the one or more processors to: determine a velocity of the gesture based on one or more of image data including the user or the one or more signals output by the at least one sensor; and cause the display of the virtual object to move relative to the virtual wearable accessory based on the velocity of the gesture.

  35. The apparatus of claim 31, wherein the command is an authentication command to enable access to the first content and the second content.

  36. The apparatus of claim 31, wherein the instructions, when executed, cause the one or more processors to: detect a presence of a physical wearable accessory worn on the portion of the body of the user based on one or more of image data including the user or the one or more signals output by the at least one sensor; and cause the virtual wearable accessory to be projected on a location of the portion of the body of the user proximate to the physical wearable accessory.

  37. An apparatus comprising: means for projecting; means for sensing movement of a user; and at least one processor to: cause the projecting means to project a virtual wearable accessory on a portion of a body of the user, the virtual wearable accessory to display first content; detect a gesture performed by the user based on one or more signals output by the sensing means; determine whether the gesture corresponds to a command for the virtual wearable accessory; in response to determining that the gesture corresponds to the command for the virtual wearable accessory, cause the virtual wearable accessory to display second content, the second content associated with the command, the second content different than the first content; and in response to determining that the gesture does not correspond to the command for the virtual wearable accessory, take no action with respect to the virtual wearable accessory.

  38. The apparatus of claim 37, wherein the projecting means is carried by a head-mounted device to be worn by the user.

  39. The apparatus of claim 37, wherein the sensing means includes an accelerometer.

  40. The apparatus of claim 37, further including a camera to output image data including the portion of the body of the user, the at least one processor to cause the projecting means to project the virtual wearable accessory on the portion of the body of the user based on the image data.

  41. The apparatus of claim 40, wherein the at least one processor is to: detect a presence of a physical wearable accessory worn on the portion of the body of the user based on the image data; and cause the virtual wearable accessory to be projected on a location of the portion of the body of the user proximate to the physical wearable accessory.

  42. The apparatus of claim 40, wherein the second content includes a virtual object and the at least one processor is to: estimate a velocity of the gesture based on one or more of the image data or the one or more signals output by the sensing means; and cause the display of the virtual object to move relative to the virtual wearable accessory based on the velocity of the gesture.

  43. The apparatus of claim 37, wherein the at least one processor is to: determine an orientation of a face of the user relative to the virtual wearable accessory; and verify that the gesture corresponds to the command for the virtual wearable accessory based on the orientation of the face of the user.

  44. The apparatus of claim 37, wherein the command is an authentication command to enable access to the first content and the second content.

Description

FIELD

[0001] Embodiments described herein generally relate to computers. More particularly, embodiments relate to dynamically facilitating virtual wearables.

BACKGROUND

[0002] With the growth of mobile computing devices, wearable devices are also gaining popularity and noticeable traction in becoming a mainstream technology. However, today's wearable devices are physical devices that are to be attached to or worn on the user's body. Further, these conventional physical wearable devices vary in their functionalities and uses, such that one wearable device may be needed for tracking health indicators and another for playing games. The physical nature of these wearable devices and their inability to perform varying tasks make them inflexible and inefficient. Other conventional techniques require additional external hardware that is expensive, cumbersome, impractical, and unstable and provides an unsatisfying user experience, while yet other conventional techniques require intrusive marks that provide for inflexible configuration and a lack of privacy.

BRIEF DESCRIPTION OF THE DRAWINGS

[0003] Embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.

[0004] FIG. 1 illustrates a computing device employing a dynamic virtual wearable mechanism according to one embodiment.

[0005] FIG. 2 illustrates a dynamic virtual wearable mechanism according to one embodiment.

[0006] FIG. 3A illustrates a method for facilitating virtual wearables according to one embodiment.

[0007] FIG. 3B illustrates a method for facilitating access to virtual wearables via secondary wearable devices according to one embodiment.

[0008] FIG. 4 illustrates a computer system suitable for implementing embodiments of the present disclosure according to one embodiment.

[0009] FIG. 5 illustrates a computer environment suitable for implementing embodiments of the present disclosure according to one embodiment.

[0010] FIG. 6A illustrates a computing device having an architectural placement of a selective set of components according to one embodiment.

[0011] FIG. 6B illustrates a virtual wearable according to one embodiment.

[0012] FIG. 6C illustrates tracking points associated with wearable areas according to one embodiment.

[0013] FIGS. 6D and 6E illustrate scanning techniques for determining and securing wearable areas according to one embodiment.

[0014] FIG. 6F illustrates sharing of virtual wearables according to one embodiment.

[0015] FIG. 6G illustrates a scanned target wearable area according to one embodiment.

DETAILED DESCRIPTION

[0016] In the following description, numerous specific details are set forth. However, embodiments, as described herein, may be practiced without these specific details. In other instances, well-known circuits, structures and techniques have not been shown in detail in order not to obscure the understanding of this description.

[0017] Embodiments provide for virtual wearables (also referred to as “virtual wearable computers” or “virtual wearable devices”). In one embodiment, a virtual wearable may be achieved by combining one or more wearable devices (e.g., head-mounted devices, such as wearable glasses (e.g., Google® Glass™, etc.)) with one or more portable micro-projectors, wherein the virtual wearable may be augmented to be presented on any number and type of sites or areas, such as various human body parts (e.g., front/back of a hand, arm, knee, etc.), wherein the virtual wearable may be accessed and used by the user.

[0018] Embodiments further provide for virtual wearables that are (without limitation): 1) secure and private (such that the user may see and decide who else can view their virtual wearable); 2) configurable (such that the user may be given the ability and option to change, download, and/or share various designs); 3) flexibly designed; 4) configurable to use a single wearable device, such as a head-mounted display, to present other wearables and their features and functionalities; 5) low in power consumption (e.g., a single wearable as opposed to several); 6) enhanced to provide a better user experience; and 7) accurate.

[0019] FIG. 1 illustrates a computing device 100 employing a dynamic virtual wearable mechanism 110 according to one embodiment. Computing device 100 serves as a host machine for hosting dynamic virtual wearable mechanism (“virtual mechanism”) 110 that includes any number and type of components, as illustrated in FIG. 2, to efficiently employ one or more components to dynamically facilitate virtual wearables as will be further described throughout this document.

[0020] Computing device 100 may include any number and type of communication devices, such as large computing systems, such as server computers, desktop computers, etc., and may further include set-top boxes (e.g., Internet-based cable television set-top boxes, etc.), global positioning system (GPS)-based devices, etc. Computing device 100 may include mobile computing devices serving as communication devices, such as cellular phones including smartphones, personal digital assistants (PDAs), tablet computers, laptop computers (e.g., Ultrabook™ system, etc.), e-readers, media internet devices (MIDs), media players, smart televisions, television platforms, intelligent devices, computing dust, head-mounted displays (HMDs) (e.g., optical head-mounted display (e.g., wearable glasses, such as Google® Glass™), head-mounted binoculars, gaming displays, military headwear, etc.), and other wearable devices (e.g., smartwatches, bracelets, smartcards, jewelry, clothing items, etc.), etc.

[0021] As aforementioned, computing device 100 may include any number and type of computing devices, and embodiments are not limited to merely HMDs, other wearable devices, or any other particular type of computing device. However, in one embodiment, computing device 100 may include a head-mounted display or another form of wearable device and thus, throughout this document, “HMD”, “head-mounted display”, and/or “wearable device” may be interchangeably referenced as computing device 100 and used as an example for brevity, clarity, and ease of understanding.

[0022] Computing device 100 may include an operating system (OS) 106 serving as an interface between hardware and/or physical resources of computing device 100 and a user. Computing device 100 further includes one or more processors 102, memory devices 104, network devices, drivers, or the like, as well as input/output (I/O) sources 108, such as touchscreens, touch panels, touch pads, virtual or regular keyboards, virtual or regular mice, etc.

[0023] It is to be noted that terms like “node”, “computing node”, “server”, “server device”, “cloud computer”, “cloud server”, “cloud server computer”, “machine”, “host machine”, “device”, “computing device”, “computer”, “computing system”, and the like, may be used interchangeably throughout this document. It is to be further noted that terms like “application”, “software application”, “program”, “software program”, “package”, “software package”, “code”, “software code”, and the like, may be used interchangeably throughout this document. Also, terms like “job”, “input”, “request”, “message”, and the like, may be used interchangeably throughout this document. It is contemplated that the term “user” may refer to an individual or a group of individuals using or having access to computing device 100.

[0024] FIG. 2 illustrates a dynamic virtual wearable mechanism 110 according to one embodiment. In one embodiment, virtual mechanism 110 may include any number and type of components, such as (without limitation): detection/reception logic 201; authentication/permission logic 203; area scanning/tracking logic 205; area-based model creation logic 207; adjustment/activation logic 209; interaction and recognition logic 209; sharing logic 211; and communication/compatibility logic 213. Computing device 100 may further include any number and type of other components, such as capturing/sensing components 221, output components 223, and micro-projector 225, etc.
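
For orientation only, the following is a minimal Python sketch of one way the listed components could be sequenced in software; all class, method, and parameter names are hypothetical illustrations of the flow described in the paragraphs that follow, not the actual embodiment.

    # Hypothetical orchestration of the components listed above; the names
    # mirror the description but the structure is an illustrative assumption.
    class VirtualWearableMechanism:
        def __init__(self, detector, scanner, modeler, adjuster, projector):
            self.detector = detector    # detection/reception logic
            self.scanner = scanner      # area scanning/tracking logic
            self.modeler = modeler      # area-based model creation logic
            self.adjuster = adjuster    # adjustment/activation logic
            self.projector = projector  # micro-projector interface

        def present_virtual_wearable(self, frame, depth):
            area = self.detector.detect_wearable_area(frame, depth)  # e.g., a forearm
            surface = self.scanner.scan(area, depth)                 # curves, edges, etc.
            model = self.modeler.create_model(surface)               # 3D area model
            model = self.adjuster.fit(model, area)                   # align within the area
            self.projector.project(model)                            # display the wearable
            return model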

[0025] Capturing/sensing components 221 may include any number and type of capturing/sensing devices, such as one or more sensing and/or capturing devices (e.g., cameras, microphones, biometric sensors, chemical detectors, signal detectors, wave detectors, force sensors (e.g., accelerometers), illuminators, etc.) that may be used for capturing any amount and type of visual data, such as images (e.g., photos, videos, movies, audio/video streams, etc.), and non-visual data, such as audio streams (e.g., sound, noise, vibration, ultrasound, etc.), radio waves (e.g., wireless signals, such as wireless signals having data, metadata, signs, etc.), chemical changes or properties (e.g., humidity, body temperature, etc.), biometric readings (e.g., fingerprints, etc.), environmental/weather conditions, maps, etc. It is contemplated that “sensor” and “detector” may be referenced interchangeably throughout this document. It is further contemplated that one or more capturing/sensing components 221 may further include one or more supporting or supplemental devices for capturing and/or sensing of data, such as illuminators (e.g., infrared (IR) illuminators), light fixtures, generators, sound blockers, etc. It is to be noted that “visual data” may be referred to as “visual” or “visuals”, while “non-visual data” may be referred to as “non-visual” or “non-visuals” throughout this document.

[0026] It is further contemplated that in one embodiment, capturing/sensing components 221 may further include any number and type of sensing devices or sensors (e.g., linear accelerometer) for sensing or detecting any number and type of contexts (e.g., estimating horizon, linear acceleration, etc., relating to a mobile computing device, etc.). For example, capturing/sensing components 221 may include any number and type of sensors, such as (without limitations): accelerometers (e.g., linear accelerometer to measure linear acceleration, etc.); inertial devices (e.g., inertial accelerometers, inertial gyroscopes, micro-electro-mechanical systems (MEMS) gyroscopes, inertial navigators, etc.); and gravity gradiometers to study and measure variations in gravitational acceleration due to gravity, etc.

[0027] For example, capturing/sensing components 221 may further include (without limitations): audio/visual devices (e.g., cameras, microphones, speakers, etc.); context-aware sensors (e.g., temperature sensors, facial expression and feature measurement sensors working with one or more cameras of audio/visual devices, environment sensors (such as to sense background colors, lights, etc.), biometric sensors (such as to detect fingerprints, etc.), calendar maintenance and reading device), etc.; global positioning system (GPS) sensors; resource requestor; and trusted execution environment (TEE) logic. TEE logic may be employed separately or be part of resource requestor and/or an I/O subsystem, etc.

[0028] Computing device 100 may further include one or more output components 223 to remain in communication with one or more capturing/sensing components 221 and one or more components of virtual mechanism 110 to facilitate displaying of images, playing or visualization of sounds, displaying visualization of fingerprints, presenting visualization of touch, smell, and/or other sense-related experiences, etc. For example and in one embodiment, output components 223 may include (without limitation) one or more of light sources, display devices or screens, audio speakers, bone conducting speakers, olfactory or smell visual and/or non-visual presentation devices, haptic or touch visual and/or non-visual presentation devices, animation display devices, biometric display devices, X-ray display devices, etc.

[0029] Computing device 100 may be in communication with one or more repositories or databases over one or more networks, where any amount and type of data (e.g., real-time data, historical contents, metadata, resources, policies, criteria, rules and regulations, upgrades, etc.) may be stored and maintained. Similarly, computing device 100 may be in communication with any number and type of other computing devices, such as HMDs, wearable devices, mobile computers (e.g., smartphone, a tablet computer, etc.), desktop computers, laptop computers, etc., over one or more networks (e.g., cloud network, the Internet, intranet, Internet of Things (“IoT”), proximity network, Bluetooth, etc.).

[0030] In the illustrated embodiment, computing device 100 is shown as hosting virtual mechanism 110; however, it is contemplated that embodiments are not limited as such and that in another embodiment, virtual mechanism 110 may be entirely or partially hosted by multiple or a combination of computing devices; however, throughout this document, for the sake of brevity, clarity, and ease of understanding, virtual mechanism 110 is shown as being hosted by computing device 100.

[0031] It is contemplated that computing device 100 may include one or more software applications (e.g., device applications, hardware components applications, business/social application, websites, etc.) in communication with virtual mechanism 110, where a software application may offer one or more user interfaces (e.g., web user interface (WUI), graphical user interface (GUI), touchscreen, etc.) to work with and/or facilitate one or more operations or functionalities of virtual mechanism 110.

[0032] In one embodiment, using virtual mechanism 110, a virtual wearable may be facilitated via computing device 100, such as a wearable device, to serve as an augmented display wraparound on an area of any shape or form, such as a user’s body part (e.g., hand, knee, arm, etc.). For example and in one embodiment, a virtual wearable may be a well-positioned wraparound over the user’s hand or other body parts, such as limbs, providing high-resolution displays (e.g., first and/or second displays) that may be allocated and designed according to one or more models.

[0033] In some embodiments, a virtual wearable may be fully configurable, via communication/compatibility logic 213, to allow hardware designers and software programmers to use the virtual wearable as a platform to produce virtual wearable devices for augmented reality. Further, it is contemplated that virtual mechanism 110 may serve both users (such as end-users using/wearing wearable devices, such as computing device 100) and software developers, programmers, hardware designers, etc.; for example, a developer may use virtual wearables to enable easy-to-use media creation platforms for differentiating their product or matching other products’ capabilities. Similarly, for example, a virtual wearable may provide a convenient interface, via output components 223, for users to allow them to determine whether and which parts of their personal data may be shared and which are to remain private.

[0034] In one embodiment, virtual mechanism 110 facilitates virtual wearables to provide an enhanced user experience (UX) for users of various wearable devices (e.g., HMDs), such as computing device 100, enabling the users to create and wear virtual wearables that extend other devices (e.g., wearable devices) or stand on their own. Further, for example and in one embodiment, computing device 100 may include a wearable device (e.g., HMD) and its capturing/sensing components 221 may include, for example, a three-dimensional (3D) camera that may then be used with one or more components of virtual mechanism 110, such as area-based model creation logic 207, adjustment/activation logic 209, etc., to facilitate display of augmented reality data in a realistic manner where, for example, the user of computing device 100 may see a 3D augmented world.

[0035] Similarly, for example and in one embodiment, the 3D camera may be further used for detection and capture of various objects in 3D (e.g., occlusion) as facilitated by detection/reception logic 201, as will be further described below. It is contemplated that occlusion support may be used to provide an enhanced illusion experience for the user when experiencing a virtual wearable; for example, using the depth data from the camera, computing device 100 may capture the depth data of moving objects and occlude the virtual objects as necessitated or desired. It is contemplated that embodiments are not limited to any particular component (such as 3D cameras) or technique (such as occlusion) and that any number and type of components and/or techniques may be applied or modified to achieve varying results and facilitate an enhanced user experience with virtual wearables.
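
A minimal sketch of the depth-based occlusion described above, assuming the 3D camera supplies a per-pixel depth map and the renderer knows the depth at which each virtual pixel would sit; the NumPy-based approach and array names are assumptions for illustration.

    import numpy as np

    def occlude_virtual_layer(virtual_rgba, virtual_depth, camera_depth):
        # virtual_rgba:  (H, W, 4) rendered virtual-wearable image with an alpha channel
        # virtual_depth: (H, W) depth, in meters, at which each virtual pixel is placed
        # camera_depth:  (H, W) depth, in meters, measured by the 3D camera
        # A real object (e.g., the user's other hand) that is closer to the camera
        # than the virtual surface should occlude it, so those pixels are made
        # transparent before the layer is composited or projected.
        occluded = camera_depth < virtual_depth
        out = virtual_rgba.copy()
        out[occluded, 3] = 0
        return out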

[0036] In some embodiments, a virtual wearable, as facilitated by virtual mechanism 110 and computing device 100 (e.g., a wearable device, such as an HMD), may be displayed at one or more areas (also referred to as “wearable areas” or “wearable body areas”) as chosen or preferred by the user of computing device 100. For example and in some embodiments, a display area for a virtual wearable may include various parts of the human body, such as the user’s body, such that the virtual wearable may be virtually worn by the user and kept mobile and accessible while the user continues with other activities (e.g., running, eating, sitting, dancing, etc.). It is contemplated and to be noted that embodiments are not limited to merely body parts and that any number and type of areas (such as screens, walls, floors, canvases, holes, rocks, beach sand, non-human body parts, plants, trees, etc.) may be used to serve as wearable areas; however, for the sake of brevity, clarity, and ease of understanding, human body areas are used as examples and discussed throughout this document.

[0037] To find and use a body part (e.g., front or back of a hand, wrist, knee, knuckles, etc.) that is to serve as a wearable area for the user to wear a virtual wearable, in one embodiment, detection/reception logic 201 may be used to detect the body part and, in another embodiment, one or more wearable accessories or marks may be detected by detection/reception logic 201. For example, a detected accessory may be a predefined worn accessory, such as a watch or bracelet, etc., that the user may choose to have extended via the virtual wearable. For example, the user may have a smart accessory, such as a smartwatch, on the wrist and choose to have a virtual wearable displayed on a body area (e.g., wrist, arm, hand, etc.) next to the smartwatch such that the smartwatch may be extended into a larger device via the virtual wearable. In another example, an accessory may be a dumb accessory, such as a regular jewelry bracelet, a wrist band, a knee brace, etc.
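
As a rough sketch of placing the virtual wearable next to a detected accessory, the helper below offsets from the accessory's bounding box within the detected forearm region; the box format, pixel units, and default gap are illustrative assumptions rather than part of the described embodiment.

    def area_next_to_accessory(accessory_box, forearm_box, gap=10):
        # Boxes are (x, y, width, height) in image pixels. The wearable area is
        # placed just past the accessory (e.g., a watch) and clipped to the
        # visible forearm, so the projection appears as an extension of it.
        ax, ay, aw, ah = accessory_box
        fx, fy, fw, fh = forearm_box
        x = ax + aw + gap
        width = max(0, fx + fw - x)
        return (x, fy, width, fh)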

[0038] In one embodiment, as will be further described below, once the initial detection of the body part and/or wearable accessory has been performed by detection/reception logic 201, a virtual wearable model may then be generated to be loaded and snapped onto the wearable area at the user's body part, where the virtual wearable model may be a 3D model that is specifically tailored for the wearable area of the user's body, such as tailored around the curved surface of the body part and/or the wearable accessory which may be next to or aligned with the virtual wearable. Embodiments provide for 3D virtual wearables that are properly aligned with the curves of human body areas and/or the edges of wearable accessories so that the abilities extended by these virtual wearables are experienced in a realistic manner.

[0039] As further illustrated with respect to FIGS. 6D-6E, in one embodiment, a camera (e.g., 3D camera) of capturing/sensing components 221 may be used to capture an image of the wearable area (whether it be an independent body area or next to a wearable accessory, etc.), where the camera and/or one or more depth sensors of capturing/sensing components 221 may be used to scan and map the wearable area as facilitated by area scanning/tracking logic 205. For example, scanning/tracking logic 205 may facilitate the aforementioned camera and/or one or more depth sensors to scan the entire wearable area and track its nooks and corners, curves and edges, highs and lows, etc.

[0040] Once the wearable area has been successfully scanned and mapped via scanning/tracking logic 205, in one embodiment, area-based model creation logic 207 may be used to generate an area model of the wearable area where a highly-fitted virtual wearable may be projected via micro-projector 225 upon activation by adjustment/activation logic 209 and as communicated by communication/compatibility logic 213 of virtual mechanism 110.
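
One plausible way to turn the scanned depth data into an area model is a smooth surface fit, as in the sketch below; the quadratic-surface choice, NumPy usage, and function name are assumptions for illustration rather than the specific model created by area-based model creation logic 207.

    import numpy as np

    def fit_area_model(depth_patch):
        # Fit a smooth quadratic surface z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f
        # to the scanned wearable area, so projected content can be warped onto
        # the curved body surface. depth_patch is an (H, W) array of depth values.
        h, w = depth_patch.shape
        ys, xs = np.mgrid[0:h, 0:w]
        x, y, z = xs.ravel(), ys.ravel(), depth_patch.ravel()
        design = np.column_stack([x * x, y * y, x * y, x, y, np.ones_like(x)])
        coeffs, *_ = np.linalg.lstsq(design, z, rcond=None)
        return coeffs  # six surface coefficients describing the area model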

[0041] In some embodiments, prior to activating the virtual wearable and displaying it on the wearable area, adjustment/activation logic 209 may be used to perform various adjustments, as necessitated or desired, to the virtual wearable such that it is appropriately aligned with and within the wearable area and/or alongside one or more wearable accessories, etc. Any adjustment is performed to the virtual wearable and/or the wearable area to achieve as perfect a fit between the virtual wearable and the wearable area as possible based on the available scanning, tracking, and 3D model information, etc.

[0042] As aforementioned, once any necessary or desired adjustment has been made, adjustment/activation logic 209 may activate the 3D model of the virtual wearable to be displayed at and/or within the wearable area, where the virtual wearable is then displayed via communication/compatibility logic 213 to then be used and accessed by the user. In one embodiment, the displaying of the virtual wearable may include projecting the 3D virtual wearable onto the wearable area and/or alongside one or more wearable accessories via micro-projector 225 of computing device 100.

[0043] Further, to make the access and use of the virtual wearable both secure and as natural as using any other computing device, interaction and recognition logic 209 may be employed to facilitate one or more techniques, such as touch interaction, gesture recognition, etc. It is contemplated that other such techniques may be employed and that embodiments are not merely limited to touch interaction and gesture recognition.

[0044] In one embodiment, using interaction and recognition logic 209, touch interaction may be employed once the wearable area has been initially detected as facilitated by detection/reception logic 201 and the target wearable area has been scanned and tracked as facilitated by area scanning/tracking logic 205. For example, it is contemplated that there may be various anomalies or jumps in the wearable area which may be detected using a histogram of the depth data of the wearable area via touch interaction as facilitated by interaction and recognition logic 209. As illustrated with reference to FIG. 6G, the y-axis represents the average depth value of the potential wearable area that is scanned from right to left.
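
A minimal sketch of the depth-profile analysis described above: averaging each column of the scanned depth patch gives the right-to-left profile of FIG. 6G, and abrupt jumps between neighboring columns flag anomalies such as a touching fingertip. The threshold value and function name are illustrative assumptions.

    import numpy as np

    def detect_touch_columns(depth_patch, jump_threshold=0.02):
        # depth_patch: (H, W) depth values (in meters) over the wearable area.
        profile = depth_patch.mean(axis=0)[::-1]   # average depth, scanned right to left
        jumps = np.abs(np.diff(profile)) > jump_threshold
        return np.flatnonzero(jumps)               # column offsets where the depth jumps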

[0045] In some embodiments, touch interaction may be used for user verification and authentication purposes, such that the user's touch or fingerprints, etc., may serve as a password to allow or deny the user access to the virtual wearable. For example, in one embodiment, after the virtual wearable has been projected on the wearable area, touch interaction may be triggered by interaction and recognition logic 209 to detect and accept the user's touch (e.g., fingerprints) to identify and verify the user's credentials so that the user may be authenticated and accordingly allowed or denied access to the virtual wearable. It is contemplated that touch interaction may be based on any number and type of touch interaction techniques.

[0046] In another embodiment, gesture recognition may be employed by interaction and recognition logic 209, where the user may perform any number and type of gestures which may be captured by a camera and detected by one or more sensors of capturing/sensing components 221. In one embodiment, gesture recognition may allow the user to perform various gestures to interact with the wearable device, such as computing device 100. For example, the user may make various gestures, such as a thumbs up, a wave, snapping fingers, etc., which may be predetermined, to communicate with the user's wearable device, such as computing device 100, to perform certain tasks that may or may not be directly related to the virtual wearable being projected on the wearable area. For example, the user may snap fingers to trigger a camera of capturing/sensing components 221 to take a picture, give a thumbs up to trigger computing device 100 to brighten the view of the virtual wearable, or wave to allow a home security application on computing device 100 to lock the doors of the user's house.
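
A minimal sketch of mapping recognized gestures to commands as described above; the gesture labels and device methods are hypothetical placeholders, and, mirroring the claims, a gesture with no matching command results in no action being taken.

    def handle_gesture(gesture, device):
        # Predetermined gesture-to-command table; unknown gestures are ignored.
        commands = {
            "snap_fingers": device.take_picture,            # trigger the camera
            "thumbs_up": device.brighten_virtual_wearable,  # brighten the projected view
            "wave": device.lock_home_doors,                 # home-security application
        }
        action = commands.get(gesture)
        if action is None:
            return False   # gesture does not correspond to a command: take no action
        action()
        return True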

[0047] Similarly, as mentioned above with reference to touch interaction, gesture recognition may be used for security or authentication purposes; for example, the user may perform a certain gesture, such as showing the index finger, which may be used as a password to allow or deny the user access to the virtual wearable. Like touch interaction, it is contemplated that gesture recognition may be based on any number and type of gesture recognition techniques (e.g., Intel® RealSense™ Technology, etc.).

[0048] In some embodiments, the user, such as a primary user, of the virtual wearable may choose to share access to the virtual wearable with one or more other users, such as one or more target users, as further discussed with reference to FIGS. 6F and 3B. Embodiments provide for management of secured connections with one or more target users, where the primary user may decide which target users may view and/or access the virtual wearable and which ones may not do so. This may be performed upon an invitation from the primary user to a target user and/or in response to a request from the target user.

[0049] For example, a target user may place a request to view/access the virtual wearable, where this request may be received at detection/reception logic 201. The request, along with the target user and/or the target user's wearable device (e.g., HMD), may be authenticated, and permission to view/access the virtual wearable may be granted or denied via authentication/permission logic 203. If the permission is denied, the target user may not view or access the virtual wearable of the primary user. On the other hand, if the permission is granted, the target user may be allowed to view and/or access the primary user's virtual wearable directly through the target user's wearable device. It is contemplated that the target user's wearable device may be a participating wearable device that satisfies the minimum compatibility and communication protocols and standards to be able to participate in the sharing of the virtual wearable.
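
A minimal sketch of the grant/deny flow described above, assuming an authenticate() callable stands in for whatever recognition or pairing step is actually used; the class and method names are illustrative only.

    class SharingManager:
        def __init__(self, authenticate):
            # authenticate(target_user, target_device) -> bool, e.g., backed by
            # face recognition or secure device pairing.
            self.authenticate = authenticate
            self.allowed_devices = set()

        def handle_request(self, target_user, target_device, primary_approves):
            # The primary user decides, and the target is also authenticated,
            # before the shared view of the virtual wearable is unblocked.
            if primary_approves and self.authenticate(target_user, target_device):
                self.allowed_devices.add(target_device)
                return True
            return False

        def may_view(self, target_device):
            return target_device in self.allowed_devices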

[0050] In some embodiments, for sharing purposes, any number and type of identification and authentication techniques, such as face recognition techniques (e.g., Face.com™, etc.) and pairing techniques (e.g., Bluetooth secure seamless pairing, etc.), may be employed such that target users and their corresponding target wearable devices may be recognized and authenticated. Similarly, upon deciding whether the target user is to be granted or denied permission to access the virtual wearable, one or more other techniques (e.g., user account control (UAC) technique, etc.) may be employed to show or block the view of the virtual wearable to the target wearable device associated with the target user.

[0051] Communication/compatibility logic 213 may be used to facilitate dynamic communication and compatibility between computing device 100 and any number and type of other computing devices (such as wearable computing devices, mobile computing devices, desktop computers, server computing devices, etc.), processing devices (e.g., central processing unit (CPU), graphics processing unit (GPU), etc.), capturing/sensing components 221 (e.g., non-visual data sensors/detectors, such as audio sensors, olfactory sensors, haptic sensors, signal sensors, vibration sensors, chemical detectors, radio wave detectors, force sensors, weather/temperature sensors, body/biometric sensors, scanners, etc., and visual data sensors/detectors, such as cameras, etc.), user/context-awareness components and/or identification/verification sensors/devices (such as biometric sensors/detectors, scanners, etc.), memory or storage devices, databases and/or data sources (such as data storage devices, hard drives, solid-state drives, hard disks, memory cards or devices, memory circuits, etc.), networks (e.g., cloud network, the Internet, intranet, cellular network, proximity networks, such as Bluetooth, Bluetooth low energy (BLE), Bluetooth Smart, Wi-Fi proximity, Radio Frequency Identification (RFID), Near Field Communication (NFC), Body Area Network (BAN), etc.), wireless or wired communications and relevant protocols (e.g., Wi-Fi®, WiMAX, Ethernet, etc.), connectivity and location management techniques, software applications/websites (e.g., social and/or business networking websites, business applications, games and other entertainment applications, etc.), programming languages, etc., while ensuring compatibility with changing technologies, parameters, protocols, standards, etc.

[0052] Throughout this document, terms like “logic”, “component”, “module”, “framework”, “engine”, “tool”, and the like, may be referenced interchangeably and include, by way of example, software, hardware, and/or any combination of software and hardware, such as firmware. Further, any use of a particular brand, word, term, phrase, name, and/or acronym, such as “physical wearable”, “virtual wearable”, “wearable device”, “head-mounted display” or “HMD”, “3D model”, “3D camera”, “augmented reality” or “AR”, etc., should not be read to limit embodiments to software or devices that carry that label in products or in literature external to this document.

[0053] It is contemplated that any number and type of components may be added to and/or removed from virtual mechanism 110 to facilitate various embodiments including adding, removing, and/or enhancing certain features. For brevity, clarity, and ease of understanding of virtual mechanism 110, many of the standard and/or known components, such as those of a computing device, are not shown or discussed here. It is contemplated that embodiments, as described herein, are not limited to any particular technology, topology, system, architecture, and/or standard and are dynamic enough to adopt and adapt to any future changes.

[0054] Referring now to FIG. 6A, a virtual wearable 651 is illustrated according to one embodiment. For brevity, many of the details discussed with reference to FIGS. 1 and 2 may not be discussed or repeated hereafter. In the illustrated embodiment, virtual wearable 651 is shown to be displayed on a user's arm such that virtual wearable 651 is projected by micro-projector 225 at and within wearable area 653 on the user's arm. For illustration purposes, the user is shown to be wearing wearable accessory 655 (e.g., a watch, bracelet, etc.), which may be smart or dumb. If, for example, wearable accessory 655 includes a dumb wearable accessory, it may be used as a tracking point for tracking and scanning of wearable area 653, as is further shown with reference to FIG. 6C. If, for example, wearable accessory 655 includes a smart wearable accessory (e.g., smart watch, smart bracelet, etc.), the smart wearable accessory may be made part of virtual wearable 651, such that virtual wearable 651 may be made and projected as an extension of the smart wearable accessory.

[0055] As further discussed with reference to FIG. 2, computing device 100 may include a wearable device, such as a head-mounted display, which hosts virtual mechanism 110 along with any number and type of other components, such as micro-projector 225. As further discussed with reference to FIG. 2, it is contemplated and to be noted that although in this and subsequent illustrations, virtual wearable 651 is shown to be projected on a human arm, embodiments are not so limited.

[0056] FIG. 6B illustrates a virtual wearable 651 according to one embodiment. In the illustrated embodiment, virtual wearable 651 is shown from a different angle where, in some embodiments, virtual wearable 651 may appear as a wraparound if the user’s arm is moved in a particular direction. In other embodiments, virtual wearable 651 may not be a wraparound.

[0057] FIG. 6C illustrates tracking points 657A-B associated with wearable areas according to one embodiment. As previously discussed with reference to FIG. 2, in one embodiment, various tracking points, such as tracking points 657A-657B, may be tracked, monitored, and noted as reference points to then be used to determine the corresponding potential wearable areas. These tracking points 657A-B may have been caused by any number and type of reasons, such as the wearing of accessories, etc. In another embodiment, an object (e.g., wearable accessory 655) may be used to determine a tracking point, such that the edges and boundaries of wearable accessory 655 (e.g., watch, bracelet, wristband, etc.) may serve as reference points to determine the potential wearable area.

……
……
……
