
Patent: Systems And Methods For Augmented Reality

Publication Number: 20180239144

Publication Date: 20180823

Applicants: Magic Leap

Abstract

Methods and systems for triggering presentation of virtual content based on sensor information. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergences. The system may monitor information detected via the sensors, and based on the monitored information, trigger access to virtual content identified in the sensor information. Virtual content can be obtained, and presented as augmented reality content via the display system. The system may monitor information detected via the sensors to identify a QR code, or a presence of a wireless beacon. The QR code or wireless beacon can trigger the display system to obtain virtual content for presentation.

PRIORITY CLAIM

[0001] This application claims priority to U.S. Provisional Patent App. No. 62/459,802 titled “SYSTEMS AND METHODS FOR AUGMENTED REALITY,” which was filed on Feb. 16, 2017, and which is hereby incorporated by reference in its entirety for all purposes.

BACKGROUND

Field

[0002] The present disclosure relates to systems and methods to localize position and orientation of one or more objects in the context of augmented reality systems.

Description of the Related Art

[0003] Modern computing and display technologies have facilitated the development of systems for so called “virtual reality” or “augmented reality” experiences, wherein digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user.

[0004] For example, referring to FIG. 1, an augmented reality scene (4) is depicted wherein a user of an AR technology sees a real-world park-like setting (6) featuring people, trees, buildings in the background, and a concrete platform (1120). In addition to these items, the user of the AR technology also perceives that he “sees” a robot statue (1110) standing upon the real-world platform (1120), and a cartoon-like avatar character (2) flying by which seems to be a personification of a bumble bee, even though these elements (2, 1110) do not exist in the real world. As it turns out, the human visual perception system is very complex, and producing a VR or AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements is challenging.

[0005] For instance, head-worn AR displays (or helmet-mounted displays, or smart glasses) typically are at least loosely coupled to a user’s head, and thus move when the user’s head moves. If the user’s head motions are detected by the display system, the data being displayed can be updated to take the change in head pose into account.

[0006] As an example, if a user wearing a head-worn display views a virtual representation of a three-dimensional (3D) object on the display and walks around the area where the 3D object appears, that 3D object can be re-rendered for each viewpoint, giving the user the perception that he or she is walking around an object that occupies real space. If the head-worn display is used to present multiple objects within a virtual space (for instance, a rich virtual world), measurements of head pose (e.g., the location and orientation of the user’s head) can be used to re-render the scene to match the user’s dynamically changing head location and orientation and provide an increased sense of immersion in the virtual space.
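By way of non-limiting illustration, the following is a minimal sketch of how a measured head pose might drive such re-rendering. The pose representation (a position vector plus a rotation matrix), the NumPy-based math, and the function names are assumptions made for illustration only, not an implementation taken from this disclosure.

```python
# Illustrative sketch only: re-rendering a virtual object from a measured head pose.
# The pose format (position + 3x3 rotation) and function names are assumptions.
import numpy as np

def view_matrix(head_position, head_rotation):
    """Build a 4x4 world-to-view matrix from a head position (3,) and rotation (3, 3)."""
    view = np.eye(4)
    view[:3, :3] = head_rotation.T               # inverse of a rotation is its transpose
    view[:3, 3] = -head_rotation.T @ head_position
    return view

def render_from_pose(object_vertices_world, head_position, head_rotation):
    """Transform world-space vertices into the viewer's frame for the current head pose."""
    v = view_matrix(head_position, head_rotation)
    ones = np.ones((len(object_vertices_world), 1))
    homogeneous = np.hstack([object_vertices_world, ones])
    return (v @ homogeneous.T).T[:, :3]          # vertices as seen from the new viewpoint

# Each time head tracking reports a new pose, the scene is re-projected so a virtual
# object appears fixed in real space while the user walks around it.
```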

[0007] In AR systems, detection or calculation of head pose can enable the display system to render virtual objects such that they appear to occupy a space in the real world in a manner that makes sense to the user. In addition, detection of the position and/or orientation of a real object, such as a handheld device (which also may be referred to as a “totem”), haptic device, or other real physical object, in relation to the user’s head or AR system may also facilitate the display system in presenting display information to the user to enable the user to interact with certain aspects of the AR system efficiently. As the user’s head moves around in the real world, the virtual objects may be re-rendered as a function of head pose, such that the virtual objects appear to remain stable relative to the real world. At least for AR applications, placement of virtual objects in spatial relation to physical objects (e.g., presented to appear spatially proximate a physical object in two or three dimensions) may be a non-trivial problem. For example, head movement may significantly complicate placement of virtual objects in a view of an ambient environment. Such is true whether the view is captured as an image of the ambient environment and then projected or displayed to the end user, or whether the end user perceives the view of the ambient environment directly. For instance, head movement will likely cause a field of view of the end user to change, which will likely require an update to where various virtual objects are displayed in the field of view of the end user. Additionally, head movements may occur within a large variety of ranges and speeds. Head movement speed may vary not only between different head movements, but within or across the range of a single head movement. For instance, head movement speed may initially increase (e.g., linearly or not) from a starting point, and may decrease as an ending point is reached, reaching a maximum speed somewhere between the starting and ending points of the head movement. Rapid head movements may even exceed the ability of the particular display or projection technology to render images that appear uniform and/or as smooth motion to the end user.

[0008] Head tracking accuracy and latency (e.g., the elapsed time between when the user moves his or her head and the time when the image gets updated and displayed to the user) have been challenges for VR and AR systems. Especially for display systems that fill a substantial portion of the user’s visual field with virtual elements, it is critical that the accuracy of head-tracking is high and that the overall system latency is very low from the first detection of head motion to the updating of the light that is delivered by the display to the user’s visual system. If the latency is high, the system can create a mismatch between the user’s vestibular and visual sensory systems, and generate a user perception scenario that can lead to motion sickness or simulator sickness. If the system latency is high, the apparent location of virtual objects will appear unstable during rapid head motions.
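As a hedged illustration of the latency concern above, the sketch below timestamps the interval from sampling a new head pose to presenting the updated image (sometimes called motion-to-photon latency). The loop structure, function names, and the 20 ms budget are assumptions chosen for illustration, not values specified in this disclosure.

```python
# Illustrative sketch only: tracking the elapsed time between detecting a head movement
# and updating the displayed image. The 20 ms budget is an assumed example value.
import time

LATENCY_BUDGET_S = 0.020  # example budget; high latency risks vestibular/visual mismatch

def frame_loop(read_head_pose, render_and_display):
    while True:
        t_motion = time.perf_counter()       # moment the new head pose is sampled
        pose = read_head_pose()
        render_and_display(pose)
        t_photon = time.perf_counter()       # moment the updated image is presented
        latency = t_photon - t_motion
        if latency > LATENCY_BUDGET_S:
            # A real system might predict the pose forward in time or shed rendering
            # work to keep virtual objects stable during rapid head motion.
            print(f"motion-to-photon latency {latency * 1000:.1f} ms exceeds budget")
```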

[0009] In addition to head-worn display systems, other display systems can benefit from accurate and low latency head pose detection. These include head-tracked display systems in which the display is not worn on the user’s body, but is, e.g., mounted on a wall or other surface. The head-tracked display acts like a window onto a scene, and as a user moves his head relative to the “window” the scene is re-rendered to match the user’s changing viewpoint. Other systems include a head-worn projection system, in which a head-worn display projects light onto the real world.

[0010] Additionally, in order to provide a realistic augmented reality experience, AR systems may be designed to be interactive with the user. For example, multiple users may play a ball game with a virtual ball and/or other virtual objects. One user may “catch” the virtual ball, and throw the ball back to another user. In another embodiment, a first user may be provided with a totem (e.g., a real bat communicatively coupled to the AR system) to hit the virtual ball. In other embodiments, a virtual user interface may be presented to the AR user to allow the user to select one of many options. The user may use totems, haptic devices, wearable components, or simply touch the virtual screen to interact with the system.

[0011] Detecting head pose and orientation of the user, and detecting a physical location of real objects in space, enable the AR system to display virtual content in an effective and enjoyable manner. However, although these capabilities are key to an AR system, they are difficult to achieve. In other words, the AR system must recognize a physical location of a real object (e.g., the user’s head, a totem, a haptic device, a wearable component, the user’s hand, etc.) and correlate the physical coordinates of the real object to virtual coordinates corresponding to one or more virtual objects being displayed to the user. This requires highly accurate sensors and sensor recognition systems that track the position and orientation of one or more objects at rapid rates. Current approaches do not perform localization at satisfactory speed or precision.
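For illustration only, the following sketch shows one way the physical coordinates of a tracked real object might be correlated to the virtual coordinate frame via a rigid transform. The transform representation and the names used are assumptions for illustration, not the localization method of this disclosure.

```python
# Illustrative sketch only: mapping a tracked real object (head, totem, hand) from the
# physical (tracker) frame into the virtual (render) frame with a rigid transform.
import numpy as np

def to_virtual_frame(point_physical, rotation_phys_to_virt, translation_phys_to_virt):
    """Map a 3D point from the physical frame into the virtual frame."""
    return rotation_phys_to_virt @ point_physical + translation_phys_to_virt

# Example: a totem tracked 1 m in front of the sensor origin is placed in the virtual
# frame so that virtual content can be rendered attached to it.
R = np.eye(3)                        # frames assumed aligned in this toy example
t = np.array([0.0, 0.0, 0.0])
totem_physical = np.array([0.0, 0.0, 1.0])
totem_virtual = to_virtual_frame(totem_physical, R, t)
```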

[0012] Thus, there is a need for a better localization system in the context of AR and VR devices.

SUMMARY

[0013] In some embodiments, a display system is provided. The display system comprises a head-mounted augmented reality display device configured to be worn by a user, and configured to present virtual content to the user. The display system comprises one or more sensors. The display system comprises one or more processors, and computer storage media storing instructions that, when executed by the display system, cause the display system to perform operations. The operations comprise monitoring information detected via the sensors of the system, and based on the monitored information, triggering access to virtual content identified in the sensor information. Virtual content to be presented via the system is obtained via the triggered access, with the virtual content being presented as augmented reality content visible with an ambient environment. The virtual content is presented via the augmented reality display device.
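A minimal sketch of the operations described above follows, assuming hypothetical helper functions for sensor monitoring, content identification, retrieval, and presentation; it is illustrative only and not the recited implementation.

```python
# Illustrative sketch only: monitor sensor information, and when virtual content is
# identified in it, trigger access, obtain the content, and present it as AR.
# All function names are hypothetical placeholders.
def run_display_system(sensor_readings, identify_content, obtain_virtual_content, present_as_ar):
    for reading in sensor_readings:                        # e.g. camera frames, radio scans
        content_identifier = identify_content(reading)     # content identified in sensor info
        if content_identifier is None:
            continue
        virtual_content = obtain_virtual_content(content_identifier)  # triggered access
        present_as_ar(virtual_content)                     # shown blended with the ambient environment
```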

[0014] In some other embodiments, an augmented reality display device is provided. The augmented reality display device is configured to be worn by a user and present virtual content in an ambient environment of the user. The augmented reality display device comprises a plurality of stacked waveguides forming a display area and providing a view of the ambient environment through the display area, wherein at least some waveguides of the plurality of waveguides are configured to output light with different wavefront divergence than other waveguides, each waveguide being associated with a depth at which virtual content appears in focus. The augmented reality display device comprises one or more cameras configured to obtain images of the ambient environment. The augmented reality display device comprises one or more processors. The one or more processors are configured to obtain at least one image of the ambient environment, the at least one image being determined to include a QR code. The QR code is decoded, and an indication of a network location is obtained. A request for virtual content is provided to the network location. Virtual content received in response to the request is presented via the stacked waveguides.
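The sketch below illustrates the QR-code flow described above, using the third-party pyzbar and requests libraries as stand-ins for the device's own image-analysis and networking components; the display-side call is a hypothetical placeholder.

```python
# Illustrative sketch only: decode a QR code found in a camera image, treat its payload
# as a network location, request virtual content, and hand it to the display pipeline.
import requests
from pyzbar.pyzbar import decode as decode_qr

def fetch_content_from_qr(camera_image, present_via_waveguides):
    results = decode_qr(camera_image)                       # camera_image: PIL image or numpy array
    if not results:
        return
    network_location = results[0].data.decode("utf-8")      # e.g. a URL encoded in the QR code
    response = requests.get(network_location, timeout=5)    # request virtual content
    response.raise_for_status()
    present_via_waveguides(response.content)                # present via the stacked waveguides
```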

[0015] In yet other embodiments, a method for sharing content is provided. The method comprises receiving an activation gateway indicating an interaction with virtual content viewed by a first user. The activation gateway is transmitted to at least one second user. An acceptance of the activation gateway is received from the at least one second user. The virtual content viewed by the first user is transmitted to the at least one second user. In some embodiments, a display system is provided comprising: a head-mounted augmented reality display device configured to be worn by a user, and configured to present virtual content to the user; one or more processors; and computer storage media storing instructions that, when executed by the display system, cause the display system to perform the method for sharing content of this paragraph.
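For illustration, a minimal sketch of the sharing handshake described above; the ActivationGateway structure and the message-passing helpers are hypothetical names, not elements defined in this disclosure.

```python
# Illustrative sketch only: transmit an activation gateway to second users and, upon
# acceptance, transmit the virtual content viewed by the first user.
from dataclasses import dataclass

@dataclass
class ActivationGateway:
    first_user: str
    content_id: str          # identifies the virtual content the first user is viewing

def share_content(gateway, second_users, send, receive_acceptance, transmit_content):
    for user in second_users:
        send(user, gateway)                              # transmit the activation gateway
        if receive_acceptance(user, gateway):            # acceptance received from the second user
            transmit_content(user, gateway.content_id)   # send the viewed virtual content
```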

[0016] In some embodiments, a method of sharing content is provided. The method comprises receiving an activation gateway indicating an interaction with virtual content viewed by a first user. The activation gateway is transmitted to at least one second user. An acceptance of the activation gateway is received from the at least one second user. A sharing credential of the at least one second user is transmitted to enable the first user to transmit directly to the at least one second user. In some embodiments, a display system is provided comprising: a head-mounted augmented reality display device configured to be worn by a user, and configured to present virtual content to the user; one or more processors; and computer storage media storing instructions that, when executed by the display system, cause the display system to perform the method of sharing content of this paragraph.
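A similarly hedged sketch of the credential-based variant above: instead of the content itself, the second user returns a sharing credential that lets the first user transmit to them directly. All names here are hypothetical.

```python
# Illustrative sketch only: after acceptance, obtain the second user's sharing credential
# and use it to transmit the first user's virtual content directly to that user.
# `gateway` is any object carrying an identifier for the viewed content (content_id).
def share_via_credential(gateway, second_users, send, receive_acceptance,
                         receive_credential, transmit_directly):
    for user in second_users:
        send(user, gateway)                                   # transmit the activation gateway
        if receive_acceptance(user, gateway):
            credential = receive_credential(user)             # sharing credential of the second user
            transmit_directly(user, credential, gateway.content_id)
```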

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] FIG. 1 illustrates a user’s view of augmented reality (AR) through an AR device.

[0018] FIGS. 2A-2D illustrate an example of a wearable display system.

[0019] FIG. 3 is a schematic illustrating coordination between example cloud computing assets and example local processing assets.

[0020] FIG. 4 illustrates an example system diagram of an electromagnetic tracking system.

[0021] FIG. 5 illustrates an example flowchart describing the functioning of an example electromagnetic tracking system.

[0022] FIG. 6 illustrates an electromagnetic tracking system that may be incorporated with an AR system.

[0023] FIG. 7 illustrates an example flow chart describing the electromagnetic tracking system in the context of AR devices.

[0024] FIG. 8 illustrates a system configuration featuring example sensing components.

[0025] FIGS. 9A-9F illustrate various aspects of an example control and quick release module.

[0026] FIG. 10 illustrates a minimized component/feature set.

[0027] FIG. 11A illustrates an electromagnetic sensing coil assembly coupled to a head mounted component.

[0028] FIG. 11B illustrates individual coils integrated into example structures of the head mounted component.

[0029] FIGS. 12A-12E illustrate various configurations featuring a ferrite core coupled to an electromagnetic sensor to increase field sensitivity.

[0030] FIG. 13A illustrates a conventional local data processing configuration for a 3-coil electromagnetic receiver sensor.

[0031] FIG. 13B illustrates a transmitter configuration utilizing time division multiplexing.

[0032] FIG. 13C illustrates a receiver configuration utilizing time division multiplexing.

[0033] FIG. 14 illustrates a flowchart for tracking user head pose and handheld pose.

[0034] FIG. 15 illustrates another flowchart for tracking user head pose and handheld pose.

[0035] FIG. 16A illustrates a system configuration featuring example sensing components, including example depth sensors.

[0036] FIG. 16B illustrates a partial orthogonal view of the configuration of FIG. 16A.

[0037] FIG. 17A illustrates an example resonant circuit used to create resonance.

[0038] FIG. 17B illustrates simulated data.

[0039] FIG. 17C illustrates example current plotted versus frequency.

[0040] FIG. 17D illustrates an embodiment of a dynamically tunable configuration.

[0041] FIG. 17E illustrates an example of a tunable circuit.

[0042] FIG. 17F illustrates simulated data.

[0043] FIG. 17G illustrates example current data.

[0044] FIG. 18A illustrates example noise in usable frequencies for electromagnetic tracking systems.

[0045] FIG. 18B illustrates a block diagram for a noise cancelling configuration for electromagnetic tracking interference.

[0046] FIG. 18C illustrates a plot of an example of how a signal can be inverted and added to cancel an interferer.

[0047] FIG. 19 illustrates a known pattern that may be utilized to assist in calibration of vision systems.

[0048] FIGS. 20A-20C illustrate a configuration with a summing amplifier to simplify circuitry between two subsystems or components of a wearable computing configuration.

[0049] FIG. 21 illustrates electromagnetic tracking update rates.

[0050] FIG. 22A illustrates a configuration with a single electromagnetic sensor device that may be coupled to a wearable component.

[0051] FIG. 22B illustrates another embodiment of the configuration.

[0052] FIG. 22C illustrates another embodiment of the configuration.

[0053] FIG. 23A illustrates coils on a transmitter being energized with a burst of sinewaves.

[0054] FIG. 23B illustrates a receiver being configured to receive EM waves using sensor coils.

[0055] FIG. 23C illustrates an example graph.

[0056] FIG. 24A illustrates an embodiment of an augmented reality system featuring a camera.

[0057] FIG. 24B illustrates an embodiment of an augmented reality system featuring a depth sensor.

[0058] FIGS. 24C-D illustrate determining position in space.

[0059] FIGS. 25A-B illustrate inherent ambiguities associated with electromagnetic tracking systems.

[0060] FIG. 26 illustrates a wearable computing device that comprises two outward-facing cameras.

[0061] FIG. 27 illustrates a flowchart of an example process for vision based pose calculations.

[0062] FIG. 28A illustrates use of an Extended Kalman Filter.

[0063] FIGS. 28B-F illustrate how data from one source at a higher update frequency may be combined with the data from another source at a lower update frequency.

[0064] FIG. 29 illustrates a deep learning network.

[0065] FIG. 30A illustrates a Helmholtz coil configuration.

[0066] FIG. 30B illustrates an example magnetic field.

[0067] FIG. 30C illustrates a three-axis Helmholtz coil configuration.

[0068] FIG. 30D illustrates a head mounted component being placed within a known magnetic field volume of a Helmholtz coil pair.

[0069] FIG. 30E illustrates optical fiducials.

[0070] FIG. 31A illustrates an example inner structure of a head mounted wearable component.

[0071] FIG. 31B illustrates an example inner structure of a head mounted wearable component.

[0072] FIG. 32A illustrates a layered configuration of a composite member.

[0073] FIGS. 32B-32C illustrate asymmetries of various types that may be engineered into composite constructs to transfer heat.

[0074] FIG. 33 illustrates a subject system that may be configured to assist a user in virtually experiencing a map of the solar system in an indoor environment.

[0075] FIGS. 34A-34D illustrate virtual experiences for mapping, teaching, and “Street View”®-type functionalities.

[0076] FIGS. 35A-35B illustrate various features of example driving assistance configurations.

[0077] FIG. 36 illustrates a street use scenario with various users wearing head mounted system components.

[0078] FIG. 37 illustrates an embodiment featuring virtual highlighting of a selected route.

[0079] FIG. 38 illustrates an embodiment featuring virtual location assistance pertinent to identified friends of a user who may be in a crowd and otherwise difficult to visualize.

[0080] FIGS. 39A-39D illustrate various users wearing head mounted components in indoor environments.

[0081] FIG. 40 illustrates various sensors in various locations within a space to assist in locating and monitoring a person or other objects.

[0082] FIGS. 41A-41C illustrate one embodiment of an audio and/or video conferencing configuration.

[0083] FIGS. 42A-42D illustrate aspects of an “emojibomb” functionality.

[0084] FIGS. 43A-43D illustrate images or features presented in three dimensions to users.

[0085] FIGS. 44A-44D illustrate aspects of an “emojibomb” functionality.

[0086] FIGS. 45A-45D illustrate aspects of a “multiple emojibomb”.

[0087] FIGS. 46A-46D illustrate aspects of a music listening and engagement functionality.

[0088] FIGS. 47A-47B and 48A-48B illustrate selecting a person and virtually associating or “sticking” certain images or artwork to that person.

[0089] FIG. 49 illustrates a user wearing a head mounted component to interpret and translate sign language.

[0090] FIGS. 50A-50B illustrate translation being utilized to assist a user in understanding signage in the local environment.

[0091] FIG. 51 illustrates a pair of users wearing their head mounted components to experience a virtually-presented three-dimensional movie presentation.

[0092] FIGS. 52A-52E illustrate embodiments wherein users wearing head mounted components are able to experience highly augmented visual presentations.

[0093] FIG. 53 illustrates a theme park configuration.

[0094] FIG. 54 illustrates a couple wearing head mounted components who are able to enjoy a tabletop presentation of a theater show.

[0095] FIGS. 55A-55B illustrate a configuration wherein users with their head mounted components are able to step between room features.

[0096] FIGS. 56A-56H illustrate various gaming instantiations.

[0097] FIGS. 57A-57I illustrate an example game.

[0098] FIGS. 58A-58C illustrate a user scenario wherein a person utilizing a head mounted component may read and configure presentation of a book.

[0099] FIG. 59 illustrates an augmented birthday card configuration.

[0100] FIGS. 60A-60B and 67A-67B illustrate various aspects of a configuration wherein a user may customize presentation of images within a picture frame.

[0101] FIGS. 61A-62B illustrate various aspects of tabletop or desktop presentation of augmented reality.

[0102] FIGS. 63A-63F and 64A-64C illustrate various aspects of an example augmented reality document examination and/or gaming scenario.

[0103] FIGS. 65A-66C and 69A-69B illustrate various views of embodiments of the subject system wherein users wearing head mounted components are able to engage in activities.

[0104] FIGS. 68A-68B illustrate that various filters or overlays may be utilized to customize the presentation of virtual objects or images.

[0105] FIGS. 70A-71 illustrate participation in sports.

[0106] FIGS. 72A-75 illustrate users with head mounted components participating in artistic activities.

[0107] FIGS. 76A-76C and 78A-78C illustrate augmented reality for workers in various work environments.

[0108] FIG. 77 illustrates an augmented view of one embodiment of a particular user’s computing desktop.

[0109] FIGS. 79A-79B illustrate one embodiment of a file manipulation configuration.

[0110] FIG. 80 illustrates one embodiment of an augmented reality whiteboard configuration which may be shared.

[0111] FIG. 81 illustrates virtual objects augmenting an example tablet computer.
……
……
……
