Patent: Virtual personal interface for control and travel between virtual worlds

Publication Number: 20230419617

Publication Date: 2023-12-28

Assignee: Meta Platforms Technologies

Abstract

Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
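
The split the abstract describes, between an application's 3D world-building portion (running in the world runtime) and its 2D interface portion (running in the personal interface's separate runtime), can be pictured with a small type sketch. This is a hypothetical TypeScript model for illustration only; the names (XRApplication, World3D, Panel2D) are assumptions, not an API defined by this disclosure.

```typescript
// Hypothetical model of the split application described in the abstract.
// All names are illustrative, not taken from the patent.

interface World3D {
  worldId: string;
  render(): void;                 // draws the immersive scene (first runtime)
}

interface Panel2D {
  appId: string;
  render(): void;                 // draws flat UI on the personal interface (second runtime)
}

interface XRApplication {
  appId: string;
  buildWorld(destinationId: string): World3D;  // 3D world-building portion
  buildPanel(): Panel2D;                       // 2D interface portion shown via the personal interface
}
```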

Claims

I/We claim:

1. A method of navigating multiple virtual worlds in artificial reality, the method comprising:
providing a personal interface with 2D interfaces to multiple applications, wherein each application has a 3D world building portion that controls one of the virtual worlds executed in a first runtime, and a 2D interface portion that controls one of the 2D interfaces executed in a second runtime of the personal interface; and
transitioning to a subsequent virtual world by:
receiving a selection of an application, of the multiple applications, corresponding to the subsequent virtual world;
displaying, via the second runtime of the personal interface, the 2D interface portion of the selected application, wherein the 2D interface portion of the selected application comprises one or more travel cards, each travel card defining a link to a respective travel destination for the subsequent virtual world;
receiving a selection of a travel destination via the displayed 2D interface portion of the selected application; and
causing, in response to the selection, the selected application to generate and display a 3D world, in the first runtime, corresponding to the selected travel destination.
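
Illustrative sketch (not part of the claims): one way to read the transition of claim 1 is as a small handshake between the personal interface's runtime, which lists travel cards, and the selected application's world runtime, which builds the destination. The names and signatures below are assumptions made for illustration.

```typescript
// Hypothetical sketch of the claim 1 transition flow; all names are illustrative.

type TravelCard = { label: string; destinationId: string };

interface SelectedApp {
  // 2D interface portion: runs in the personal interface's (second) runtime
  // and exposes travel cards linking to destinations in the app's world.
  listTravelCards(): TravelCard[];
  // 3D world-building portion: runs in the first (world) runtime and
  // generates the world for a chosen destination.
  buildWorld(destinationId: string): void;
}

// Transition directly from the current world to the selected app's world,
// without passing through a home lobby.
function travel(app: SelectedApp, pickCard: (cards: TravelCard[]) => TravelCard): void {
  const cards = app.listTravelCards();    // displayed via the personal interface
  const chosen = pickCard(cards);         // user selects a travel destination
  app.buildWorld(chosen.destinationId);   // app generates and displays the 3D world
}
```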

2. The method of claim 1, wherein the receiving the selection of the application corresponding to the subsequent virtual world occurs concurrently with display for a current virtual world.

3. The method of claim 1, wherein the personal interface appears with consistent controls in both a current virtual world and the subsequent virtual world.

4. The method of claim 1, wherein the travel destinations correspond to one or more of (a) places within the subsequent virtual world, (b) events within the subsequent virtual world, (c) people within the subsequent virtual world, or (d) any combination thereof.

5. The method of claim 1 further comprising:
receiving a selection of an item, in the subsequent virtual world, that corresponds to a controller;
accessing content, from the controller, via a deeplink associated with the selected item; and
presenting the content, from the controller, on the personal interface in the second runtime of the personal interface.
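
Illustrative sketch (not part of the claims): the deeplink flow of claim 5 could look roughly like the following, where the deeplink is modeled as an ordinary HTTPS URL that resolves to the selected item's controller. The type and function names are assumptions.

```typescript
// Hypothetical sketch of the claim 5 deeplink flow; names and the URL handling
// are assumptions made for illustration.

type WorldItem = { itemId: string; deeplink: string };  // e.g. an HTTPS URL served by the item's controller

async function showItemContent(
  item: WorldItem,
  presentOnPanel: (content: string) => void,  // renders in the personal interface's (second) runtime
): Promise<void> {
  const response = await fetch(item.deeplink);  // access content from the controller via the deeplink
  const content = await response.text();
  presentOnPanel(content);                      // shown on the personal interface, not in the world runtime
}
```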

6. The method of claim 1 further comprising:
generating, in the second runtime of the personal interface exclusive of the first runtime, 3D content by the personal interface in response to receiving a 3D content trigger action for an item in the subsequent virtual world or on the personal interface; and
displaying the generated 3D content by the personal interface according to a display option, wherein the display option specifies one of 3D content being displayed A) at a specified location relative to the personal interface, B) with the personal interface as a window into another 3D world which is the 3D content, or C) with the personal interface being a portal to a volume containing the 3D content.
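
Illustrative sketch (not part of the claims): the three display options enumerated in claim 6 can be modeled as a simple choice passed to whatever renders the interface-owned 3D content. Names and the placement offset are assumptions.

```typescript
// Hypothetical encoding of claim 6's display options for 3D content generated
// by the personal interface's own runtime; names are illustrative.

enum DisplayOption {
  AnchoredToInterface,  // A) at a specified location relative to the personal interface
  WindowIntoWorld,      // B) the interface acts as a window into another 3D world
  PortalToVolume,       // C) the interface is a portal to a volume containing the content
}

type Vec3 = { x: number; y: number; z: number };

interface InterfaceContent3D {
  drawAt(offset: Vec3): void;   // render floating near the personal interface
  drawThroughSurface(): void;   // render as seen "through" the interface surface
}

// The personal interface, not the current world's application, owns this content;
// it is generated and drawn in the second runtime only.
function display3DContent(content: InterfaceContent3D, option: DisplayOption): void {
  switch (option) {
    case DisplayOption.AnchoredToInterface:
      content.drawAt({ x: 0, y: 0.3, z: 0 });  // e.g. just above the interface panel
      break;
    case DisplayOption.WindowIntoWorld:
    case DisplayOption.PortalToVolume:
      content.drawThroughSurface();            // window and portal both render through the panel
      break;
  }
}
```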

7. A computing system for navigating multiple virtual worlds in artificial reality, the computing system comprising:
one or more processors; and
one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising:
providing a personal interface with 2D interfaces to multiple applications, wherein each application has a 3D world building portion that controls one of the virtual worlds executed in a first runtime, and a 2D interface portion that controls one of the 2D interfaces executed in a second runtime of the personal interface; and
transitioning to a subsequent virtual world by:
receiving a selection of an application, of the multiple applications, corresponding to the subsequent virtual world,
displaying, via the second runtime of the personal interface, the 2D interface portion of the selected application,
receiving a selection of a travel destination via the displayed 2D interface portion of the selected application, and
causing, in response to the selection, the selected application to generate and display a 3D world, in the first runtime, corresponding to the selected travel destination.

8. The computing system of claim 7, wherein the 2D interface portion of the selected application comprises one or more travel cards, each travel card defining a link to a respective travel destination for the subsequent virtual world.

9. The computing system of claim 8, wherein the travel destinations correspond to one or more of (a) places within the subsequent virtual world, (b) events within the subsequent virtual world, (c) people within the subsequent virtual world, or (d) any combination thereof.

10. The computing system of claim 7, wherein the receiving the selection of the application corresponding to the subsequent virtual world occurs concurrently with display for a current virtual world.

11. The computing system of claim 7, wherein the personal interface appears with consistent controls in both a current virtual world and the subsequent virtual world.

12. The computing system of claim 7, wherein the process further comprises:
receiving a selection of an item, in the subsequent virtual world, that corresponds to a controller;
accessing content, from the controller, via a deeplink associated with the selected item; and
presenting the content, from the controller, on the personal interface in the second runtime of the personal interface.

13. The computing system of claim 7, wherein the process further comprises generating, in the second runtime of the personal interface exclusive of the first runtime, 3D content by the personal interface in response to receiving a 3D content trigger action for an item in the subsequent virtual world or on the personal interface; and
displaying the generated 3D content by the personal interface according to a display option, wherein the display option specifies one of 3D content being displayed A) at a specified location relative to the personal interface, B) with the personal interface as a window into another 3D world which is the 3D content, or C) with the personal interface being a portal to a volume containing the 3D content.

14. A machine-readable storage medium having machine-executable instructions stored thereon that, when executed by one or more processors, cause the one or more processors to perform a method for navigating multiple virtual worlds in artificial reality, the method comprising:
providing a personal interface with 2D interfaces to multiple applications, wherein each application has a 3D world building portion that controls one of the virtual worlds executed in a first runtime, and a 2D interface portion that controls one of the 2D interfaces executed in a second runtime of the personal interface; and
transitioning to a subsequent virtual world by:
receiving a selection of an application, of the multiple applications, corresponding to the subsequent virtual world,
displaying, via the second runtime of the personal interface, the 2D interface portion of the selected application,
receiving a selection of a travel destination via the displayed 2D interface portion of the selected application, and
causing, in response to the selection, the selected application to generate and display a 3D world, in the first runtime, corresponding to the selected travel destination.

15. The machine-readable storage medium of claim 14, wherein the 2D interface portion of the selected application comprises one or more travel cards, each travel card defining a link to a respective travel destination for the subsequent virtual world.

16. The machine-readable storage medium of claim 15, wherein the travel destinations correspond to one or more of (a) places within the subsequent virtual world, (b) events within the subsequent virtual world, (c) people within the subsequent virtual world, or (d) any combination thereof.

17. The machine-readable storage medium of claim 14, wherein the receiving the selection of the application corresponding to the subsequent virtual world occurs concurrently with display for a current virtual world.

18. The machine-readable storage medium of claim 14, wherein the personal interface appears with consistent controls in both a current virtual world and the subsequent virtual world.

19. The machine-readable storage medium of claim 14, wherein the method further comprises receiving a selection of an item, in the subsequent virtual world, that corresponds to a controller;
accessing content, from the controller, via a deeplink associated with the selected item; and
presenting the content, from the controller, on the personal interface in the second runtime of the personal interface.

20. The machine-readable storage medium of claim 14, wherein the method further comprises generating, in the second runtime of the personal interface exclusive of the first runtime, 3D content by the personal interface in response to receiving a 3D content trigger action for an item in the subsequent virtual world or on the personal interface; and
displaying the generated 3D content by the personal interface according to a display option, wherein the display option specifies one of 3D content being displayed A) at a specified location relative to the personal interface, B) with the personal interface as a window into another 3D world which is the 3D content, or C) with the personal interface being a portal to a volume containing the 3D content.

Description

TECHNICAL FIELD

The present disclosure is directed to methods and systems for controlling and navigating between multiple virtual worlds in artificial reality without having to transition in and out of a home lobby.

BACKGROUND

Artificial reality systems offer users a wealth of opportunities to experience what it might be like to visit desired places, participate in certain events, interact with particular individuals, and so on. Often, these activities occur within the context of a virtual world delivered by an artificial reality application designed to simulate real-life encounters. For instance, such a virtual world can depict scenes of particular locations and can be controlled to immerse a user in the happenings within the world as if the user were actually there.

In some scenarios, a user may wish to travel to a subsequent virtual world after being intrigued by activities or things experienced in a current world. Alternatively, such a desire to travel may result from the user having concluded a virtual world event or simply becoming interested in a wider variety of artificial reality offerings.

In an artificial reality environment, some objects a user sees and interacts with are “virtual objects,” i.e., computer generated object representations. Virtual objects can be presented, e.g., by a head-mounted display, mobile device, projection system, etc. Often, users can interact with virtual objects using controllers and/or hand gestures. In systems that include hand tracking, images of the user's hands can be interpreted to create 3D models of the user's hands or to otherwise estimate hand postures. In some systems, user hand gestures can perform “interactions” with virtual objects that can include selecting, moving, rotating, resizing, actuating controls, changing colors or skins, defining interactions between real or virtual objects, setting virtual forces to act on virtual objects, or any other action on or change to an object that a user can imagine.
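
As a concrete, hypothetical illustration of the last point, a recognized hand gesture might be dispatched to an interaction on the targeted virtual object roughly as follows; the gesture names and object interface are assumptions for illustration, not part of this disclosure.

```typescript
// Minimal sketch, assuming a hand-tracking layer that reports recognized
// gestures and per-frame hand displacement; names are illustrative only.

type Gesture = "pinch" | "grab" | "point";

interface VirtualObject {
  select(): void;                                   // e.g. highlight or open a detail view
  move(dx: number, dy: number, dz: number): void;   // translate the object in the scene
}

// Map a recognized gesture to an interaction on the virtual object it targets.
function handleGesture(
  gesture: Gesture,
  target: VirtualObject,
  handDelta: [number, number, number],  // tracked hand movement since the last frame
): void {
  if (gesture === "grab") {
    target.move(...handDelta);   // drag the object with the hand
  } else {
    target.select();             // pinch or point selects the targeted object
  }
}
```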

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram illustrating an overview of devices on which some implementations of the present technology can operate.

FIG. 2A is a wire diagram illustrating a virtual reality headset which can be used in some implementations of the present technology.

FIG. 2B is a wire diagram illustrating a mixed reality headset which can be used in some implementations of the present technology.

FIG. 2C is a wire diagram illustrating controllers which, in some implementations, a user can hold in one or both hands to interact with an artificial reality environment.

FIG. 3 is a block diagram illustrating an overview of an environment in which some implementations of the present technology can operate.

FIG. 4 is a block diagram illustrating components which, in some implementations, can be used in a system employing the disclosed technology.

FIG. 5 is a conceptual block diagram illustrating an exemplary artificial reality (XR) application which, in some implementations, can be used to separately generate and control personal interface and 3D world content.

FIG. 6 is a flow diagram illustrating a process used in some implementations of the present technology for using a personal interface of an artificial reality (XR) navigation system to directly navigate between multiple virtual worlds.

FIG. 7 is a flow diagram illustrating a process used in some implementations of the present technology for using the personal interface to present content from a controller corresponding to a selected item for a virtual world.

FIG. 8 is a flow diagram illustrating a process used in some implementations of the present technology for generating and displaying 3D content via the personal interface.

FIG. 9 is a flow diagram illustrating a process used in some implementations of the present technology for populating a dedicated space within a virtual world with content sourced by an application external to that virtual world.

FIG. 10 is a diagram illustrating an exemplary personal interface.

FIG. 11 is a diagram illustrating an exemplary personal interface depicting virtual world destinations which can be executable by a user upon selection of an XR application on the personal interface.

FIG. 12A is an exemplary diagram illustrating the personal interface providing 3D content separate from a world currently traveled by a user.

FIG. 12B is an exemplary diagram illustrating the personal interface providing a window into a virtual world other than a world currently traveled by a user.

FIG. 12C is an exemplary diagram illustrating the personal interface providing a portal to view and interact with 3D content that is separate from a world currently traveled by a user.

FIG. 13 is an exemplary diagram illustrating populating a dedicated space within a virtual world with content sourced by an application external to that virtual world.

The techniques introduced here may be better understood by referring to the following Detailed Description in conjunction with the accompanying drawings, in which like reference numerals indicate identical or functionally similar elements.