
Patent: Prompting Creation Of A Networking System Communication With Augmented Reality Elements In A Camera Viewfinder Display

Publication Number: 20180300916

Publication Date: 20181018

Applicants: Facebook

Abstract

The present disclosure is directed toward systems and methods for utilizing augmented reality elements in connection with a camera viewfinder display of a mobile computing device. For example, systems and methods described herein detect characteristics of the mobile computing device and provide augmented reality elements that correspond to the detected characteristics directly in the camera viewfinder display. Thus, a user can interact with the provided augmented reality elements in the camera viewfinder display to compose a networking system post, view a friend’s location, order and pay for merchandise, and so forth.

BACKGROUND

[0001] Networking systems are increasingly reliant on visual media. For example, networking system users frequently include digital photographs and videos in networking system posts in order to make their posts more eye-catching and engaging. For instance, a networking system user may upload a networking system post including a picture of a dish from a new restaurant along with text detailing how the user enjoyed the dish. In another example, a networking system user may send pictures of his current location to his social networking “friends.” In another example, a third party (e.g., a news outlet, a sports broadcaster, a business or vendor) may upload media related to an event to the networking system such that networking system users can read additional information, be directed to a website to order event merchandise, listen to event commentary, and so forth.

[0002] Relying on pictures and videos within networking system posts to convey information inevitably leads to a disconnect between the information that is accessible within the networking system and what a networking system user experiences in real life. For example, if a networking system user is at a baseball game, he has to access the networking system in order to read other networking system users’ posts related to the baseball game. Accordingly, the user must divide his attention between the baseball game and his computing device (e.g., a mobile phone, tablet, smart watch, etc.). In another example, when a group of friends are utilizing a social networking system to interact with each other while at a crowded club, they must continually view and send networking system messages, thus drawing their attention away from their current surroundings or companions.

[0003] Thus, there is a need for a system that enables a networking system user to experience networking system information and features in a way that does not distract the user from real-life events.

SUMMARY

[0004] One or more embodiments described herein provide benefits and/or solve one or more of the foregoing or other problems in the art with systems and methods for providing networking system content within augmented reality elements displayed in the camera viewfinder display of a user’s mobile computing device. For example, systems and methods described herein generate augmented reality elements representing networking system content that is relevant to what a user is viewing through the camera viewfinder display of his mobile computing device. Thus, in one or more embodiments, the user can view networking system content in connection with a real-life scene through his camera viewfinder display.

[0005] Furthermore, one or more embodiments described herein provide benefits and/or solve one or more of the foregoing or other problems in the art with systems and methods for enabling a networking system user to create networking system augmented reality elements through the camera viewfinder display of the user’s mobile computing device. For example, instead of simply writing a networking system post related to a location, systems and methods described herein enable a networking system user to create an augmented reality element related to the location. Thus, systems and methods described herein can provide the user’s augmented reality element to other networking system users who are utilizing their mobile computing device camera at the same location.

[0006] Additional features and advantages of the present application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary embodiments. The features and advantages of such embodiments may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary embodiments as set forth hereinafter.

BRIEF DESCRIPTION OF THE DRAWINGS

[0007] The disclosure describes one or more embodiments with additional specificity and detail through the use of the accompanying drawings, as briefly described below.

[0008] FIG. 1 illustrates a schematic diagram of an augmented reality system in accordance with one or more embodiments.

[0009] FIG. 2 illustrates a detailed schematic diagram of the augmented reality system in accordance with one or more embodiments.

[0010] FIGS. 3A-3C illustrate a series of graphical user interfaces illustrating various features of one embodiment of the augmented reality system.

[0011] FIGS. 4A-4D illustrate a series of graphical user interfaces illustrating various features of one embodiment of the augmented reality system.

[0012] FIG. 5 illustrates a graphical user interface illustrating various features of one embodiment of the augmented reality system.

[0013] FIGS. 6A-6D illustrate a series of graphical user interfaces illustrating various features of one embodiment of the augmented reality system.

[0014] FIGS. 7A-7B illustrate a series of graphical user interfaces illustrating various features of one embodiment of the augmented reality system.

[0015] FIG. 8 illustrates a flowchart of a series of acts in a method of composing a networking system post utilizing augmented reality elements in accordance with one or more embodiments.

[0016] FIG. 9 illustrates a flowchart of a series of acts in a method of providing augmented reality elements representing networking system content in accordance with one or more embodiments.

[0017] FIG. 10 illustrates a flowchart of a series of acts in a method of displaying augmented reality elements representing networking system content in accordance with one or more embodiments.

[0018] FIG. 11 illustrates a block diagram of an exemplary computing device in accordance with one or more embodiments.

[0019] FIG. 12 is an example network environment of a social networking system in accordance with one or more embodiments.

[0020] FIG. 13 illustrates a social graph in accordance with one or more embodiments.

DETAILED DESCRIPTION

[0021] One or more embodiments described herein provide benefits and/or solve one or more of the foregoing or other problems in the art with systems and methods for utilizing augmented reality elements in connection with a camera viewfinder display of a mobile computing device to represent and/or create networking system content (e.g., social networking posts or messages). For example, by utilizing the augmented reality system, a networking system user can view and interact with augmented reality elements associated with the networking system directly through the camera viewfinder display of his or her mobile computing device. By utilizing these augmented reality elements within the camera viewfinder display, the user can generate a networking system post, interact with other networking system users, view networking system content, create additional augmented reality elements, and more.

[0022] As used herein, “augmented reality” refers to a system that creates a composite view for a user including computer-generated elements in association with the user’s real-life view. For example, in one or more embodiments, the augmented reality system overlays computer-generated elements on a display of a user’s real-life surroundings as captured by a camera of the user’s computing device (e.g., mobile device). Also as used herein, an “augmented reality element” refers to the computer-generated elements utilized by the augmented reality system described herein. In one or more embodiments, an augmented reality element may be a digital photograph, a digital video, a computer-generated image (e.g., in two or three dimensions), a sound recording, a text scroller, a speech bubble, an interactive element (e.g., a text input box), an animation, a sticker, and so forth. In at least one embodiment, the augmented reality system “anchors” or maps an augmented reality element to a point within a camera viewfinder display associated with a location, person, or object such that if the location, person, or object moves within the display, the augmented reality element moves as well.

[0023] In one example, the augmented reality system described herein detects various characteristics associated with a networking system user and the networking system user’s mobile computing device. In response to detecting these various characteristics, the augmented reality system identifies augmented reality elements and provides the identified augmented reality elements to the user’s mobile computing device as a camera viewfinder display overlay. As used herein, a “camera viewfinder display” refers to a display presented by a user’s mobile computing device that includes a stream of image frames provided by the camera of the mobile computing device. For example, the camera viewfinder display illustrates in real time what the mobile computing device camera is “looking at.”

[0024] In one or more embodiments, the augmented reality system may detect characteristics of the networking system user including the user’s gender, occupation, hobbies, networking system activity history, networking system profile information, etc. Further, the augmented reality system may detect characteristics of the user’s mobile computing device including the location of the mobile computing device (e.g., based on GPS data, Wi-Fi data, etc.), an orientation of the mobile computing device (e.g., based on the mobile computing device’s gyroscope or camera), etc. Additionally, if the camera of the mobile computing device is activated, the augmented reality system can also utilize computer vision techniques to analyze and determine characteristics of images captured by the camera (e.g., to detect objects, people, and so forth).
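
By way of illustration only, the following sketch shows one shape such a characteristic record could take; all class, field, and value names here are hypothetical, chosen for this example rather than taken from the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class DeviceCharacteristics:
        # Hypothetical record of the characteristics described above.
        user_id: str                  # networking system unique identifier
        latitude: float               # from GPS or Wi-Fi positioning
        longitude: float
        orientation: str              # "portrait" or "landscape", from the gyroscope
        detected_objects: list = field(default_factory=list)  # from computer vision

    characteristics = DeviceCharacteristics(
        user_id="user-123",
        latitude=39.0968,
        longitude=-120.0324,
        orientation="landscape",
        detected_objects=["lake", "person", "person"],
    )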

[0025] In response to detecting these user and mobile computing device characteristics, the augmented reality system can identify augmented reality elements that correspond to the detected characteristics. For example, in response to detecting that a networking system user is male, in his late twenties, a baseball enthusiast, and that his mobile computing device is located at a baseball stadium, the augmented reality system may identify augmented reality elements that prompt the user to compose a networking system post about the baseball game he is attending. The augmented reality system can then present the identified augmented reality elements within the camera viewfinder display of the user’s mobile computing device. Thus, the user can interact with the provided augmented reality elements to compose and submit a networking system post regarding the baseball game.

[0026] In addition to providing augmented reality elements to a networking system user, the augmented reality system can also enable the networking system user to create augmented reality elements. For example, the augmented reality system can provide a series of selectable elements through the user’s camera viewfinder display that assist the user in creating an augmented reality element that other networking system users can interact with and view. For instance, in an illustrative embodiment, the networking system user may wish to recommend a particular restaurant via the networking system. The augmented reality system can provide the user with interactive elements within the user’s camera viewfinder display that enable the user to create an augmented reality element that embodies the user’s recommendation for that restaurant. Later, when another networking system user (e.g., one of the user’s networking system “friends”) comes to the restaurant, the augmented reality system can provide the created augmented reality element to that networking system user.

[0027] In another example, the augmented reality system makes it possible for networking system users to easily find each other in a crowded location. For example, the augmented reality system can generate an augmented reality element that appears as a user avatar (e.g., a computer-generated representation of the user). In one or more embodiments, the augmented reality system can display the avatar in the camera viewfinder display of a networking system user such that the avatar appears where the associated user is located in a crowded space. Thus, when one of the user’s networking system friends pans his or her camera viewfinder display across the crowded space, the friend can easily see the avatar and locate the associated user.

[0028] In addition to providing networking system content via augmented reality elements overlaid on a user’s camera viewfinder display, the augmented reality system also provides partnered third-party content. For example, the augmented reality system can generate a camera viewfinder display overlay including augmented reality elements from a third party that are relevant to the user’s location. To illustrate, in response to determining that the networking system user is at a baseball game, the augmented reality system can identify third-party content from a sports broadcaster. The augmented reality system can then generate augmented reality elements including the third-party content and create a camera viewfinder display including the generated elements such that the augmented reality elements enhance the user’s view of the baseball game through his camera viewfinder display.

[0029] In a further embodiment, the augmented reality system can automatically generate augmented reality elements in response to user actions. For example, in one or more embodiments, the augmented reality system can detect a gesture made by a networking system user captured through a camera of a mobile computing device. In response to the detected gesture, the augmented reality system can generate an augmented reality element and can then anchor the generated element to the user for a predetermined amount of time. Thus, anytime the user appears in a camera viewfinder display or in a photograph or video during that predetermined amount of time, the augmented reality system will add the generated augmented reality element to the display or captured media.
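
As a rough sketch of this behavior, the following assumes a simple record with a creation timestamp and an expiration check; the one-hour lifetime and the field names are illustrative assumptions, not values from the disclosure.

    import time

    ELEMENT_LIFETIME_SECONDS = 3600  # assumed one-hour window for illustration

    def create_gesture_element(user_id, now=None):
        """Record an element generated in response to a detected gesture."""
        created = time.time() if now is None else now
        return {
            "owner": user_id,
            "created_at": created,
            "expires_at": created + ELEMENT_LIFETIME_SECONDS,
        }

    def element_is_active(element, now=None):
        """While active, the element is added to any display or captured
        media in which the owner appears."""
        current = time.time() if now is None else now
        return current < element["expires_at"]

    element = create_gesture_element("user-123")
    print(element_is_active(element))                          # True
    print(element_is_active(element, now=time.time() + 7200))  # False two hours later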

[0030] FIG. 1 illustrates an example block diagram of an environment for implementing the augmented reality system 100. As illustrated in FIG. 1, the augmented reality system 100 includes the mobile computing devices 102a, 102b, the server device(s) 106, and the third-party server 112, which are communicatively coupled through a network 110. As shown in FIG. 1, the mobile computing devices 102a, 102b include the networking system applications 104a, 104b, respectively. As further shown in FIG. 1, the server device(s) 106 includes the networking system 108.

[0031] The mobile computing devices 102a, 102b, the server device(s) 106, and the third-party server 112 communicate via the network 110, which may include one or more networks and may use one or more communication platforms or technologies suitable for transmitting data and/or communication signals. In one or more embodiments, the network 110 includes the Internet or World Wide Web. The network 110, however, can include various other types of networks that use various communication technologies and protocols, such as a corporate intranet, a virtual private network (“VPN”), a local area network (“LAN”), a wireless local area network (“WLAN”), a cellular network, a wide area network (“WAN”), a metropolitan area network (“MAN”), or a combination of two or more such networks. Although FIG. 1 illustrates a particular arrangement of the mobile computing devices 102a, 102b, the server device(s) 106, the third-party server 112, and the network 110, various additional arrangements are possible. For example, the mobile computing devices 102a, 102b may communicate directly with the networking system 108, bypassing the network 110. Additional details relating to the network 110 are explained below with reference to FIG. 12.

[0032] In one or more embodiments, the mobile computing devices 102a, 102b are one or more of various types of computing devices. For example, in one or more embodiments, the mobile computing devices 102a, 102b include a mobile device such as a mobile telephone, a smartphone, a PDA, a tablet, or a laptop. In alternative embodiments, the mobile computing devices 102a, 102b may include other computing devices such as a desktop computer, a server, or another type of computing device. Additional details with respect to the mobile computing devices 102a, 102b are discussed below with respect to FIG. 11.

[0033] In at least one embodiment, the users of the mobile computing devices 102a, 102b are co-users via the networking system 108. For example, in at least one embodiment, the users of the mobile computing devices 102a, 102b are “friends” via the networking system 108 such that the networking system 108 adds posts submitted by the user of mobile computing device 102a to the newsfeed of the user of mobile computing device 102b, and vice versa. In one or more embodiments, the users of the mobile computing devices 102a, 102b interact with the networking system 108 via the networking system applications 104a, 104b installed on the mobile computing devices 102a, 102b respectively.

[0034] As discussed above, the systems and methods laid out with reference to FIG. 1 facilitate the use of augmented reality elements via the networking system 108. FIG. 2 illustrates a detailed schematic diagram illustrating an example embodiment of the augmented reality system 100. As shown in FIG. 2, the augmented reality system 100 includes, but is not limited to, the mobile computing devices 102a, 102b, the server device(s) 106, and the third-party server 112. In one or more embodiments, the mobile computing devices 102a, 102b include networking system applications 104a, 104b, respectively. As shown in FIG. 2, the networking system application 104a, 104b includes an augmented reality manager 202a, 202b, a display manager 204a, 204b, a user input detector 206a, 206b, and a data storage 208a, 208b including networking system data 210a, 210b.

[0035] Additionally, the server device(s) 106 hosts the networking system 108. In one or more embodiments, the networking system 108 includes a communication manager 212, an augmented reality element identifier 214, an augmented reality element generator 216, and a data storage 218 including augmented reality element data 220.

[0036] In at least one embodiment, the augmented reality system 100 accesses the networking system 108 in order to identify and analyze networking system user data. Accordingly, the networking system 108 includes a social graph 222 for representing a plurality of users, actions, and concepts. In one or more embodiments, the social graph 222 includes node information 224 and edge information 226. Node information 224 of the social graph 222 stores information including, for example, nodes for users and nodes for repositories. Edge information 226 of the social graph 222 stores information including relationships between nodes and/or actions occurring within the networking system 108. Further details regarding the networking system 108, the social graph 222, edges, and nodes are presented below with respect to FIGS. 12 and 13.

[0037] Each of the components 212-226 of the networking system 108, and the components 202a, 202b through 210a, 210b of the networking system applications 104a, 104b, can be implemented using a computing device including at least one processor executing instructions that cause the augmented reality system 100 to perform the processes described herein. In some embodiments, the networking system components described herein can be implemented by the server device(s) 106, or across multiple server devices. Additionally or alternatively, a combination of one or more server devices and one or more mobile computing devices can implement the components of the networking system 108 and/or the networking system applications 104a, 104b. Additionally or alternatively, the components described herein can comprise a combination of computer-executable instructions and hardware.

[0038] In one or more embodiments, the networking system application 104a, 104b is a native application installed on the mobile computing device 102a, 102b. For example, the networking system application 104a, 104b can be a mobile application that installs and runs on a mobile device, such as a smart phone or a tablet computer. Alternatively, the networking system application 104a, 104b can be a desktop application, widget, or other form of a native computer program. Furthermore, the networking system application 104a, 104b may be a remote application accessed by the mobile computing device 102a, 102b, respectively. For example, the networking system application 104a, 104b may be a web application that is executed within a web browser of the mobile computing device 102a, 102b, respectively.

[0039] As mentioned above, and as shown in FIG. 2, the networking system application 104a, 104b includes an augmented reality manager 202a, 202b. In one or more embodiments, the augmented reality manager 202a, 202b interacts with the networking system 108 in order to provide augmented reality elements via a camera viewfinder display of the mobile computing device 102a, 102b. For example, in at least one embodiment and as will be described in greater detail below, the networking system 108 maintains and/or generates a repository of augmented reality elements. Accordingly, in response to receiving data related to characteristics of the mobile computing device 102a, 102b from the augmented reality manager 202a, 202b, the networking system 108 provides a set of augmented reality elements to the augmented reality manager 202a, 202b. For a variety of reasons (e.g., display restrictions), the augmented reality manager 202a, 202b may not be able to display every augmented reality element provided by the networking system 108. Thus, in at least one embodiment, the augmented reality manager 202a, 202b then performs an analysis to determine a subset of the provided set of augmented reality elements to present to the user via the camera viewfinder display of the mobile computing device 102a, 102b.

[0040] Accordingly, in one or more embodiments, the augmented reality manager 202a, 202b collects characteristic data associated with the mobile computing device 102a, 102b, respectively. For example, the augmented reality manager 202a, 202b collects information detailing the location of the mobile computing device 102a, 102b. In at least one embodiment, the augmented reality manager 202a, 202b collects location information including GPS information and/or Wi-Fi information.

[0041] Additionally, the augmented reality manager 202a, 202b collects characteristic data that is related to the user of the mobile computing device 102a, 102b. For example, in at least one embodiment, the user of the mobile computing device 102a, 102b is logged onto the networking system 108 via the networking system application 104a, 104b in order to utilize any of the features of the augmented reality system 100. Accordingly, in at least one embodiment, the augmented reality manager 202a, 202b identifies the user’s unique networking system user identifier. Additionally, the augmented reality manager 202a, 202b can collect additional user information including, but not limited to, application usage history, mobile computing device usage logs, contact information, and so forth. In at least one embodiment, the augmented reality manager 202a, 202b only collects user information in response to the user specifically opting into those features of the augmented reality system 100 so as to protect the user’s privacy.

[0042] Furthermore, the augmented reality manager 202a, 202b collects characteristic data associated with a camera of the mobile computing device 102a, 102b. For example, in one or more embodiments, the augmented reality manager 202a, 202b collects information regarding the orientation of the camera (e.g., portrait or landscape based on a gyroscope of the mobile computing device 102a, 102b). Additionally, the augmented reality manager 202a, 202b can regularly (e.g., at predetermined intervals) collect an image frame from the camera viewfinder image feed.

[0043] After collecting the characteristic information described above, the augmented reality manager 202a, 202b provides the collected characteristic information to the networking system 108. As will be described in greater detail below, the networking system 108 utilizes the provided characteristic information to identify a set of augmented reality elements to send back to the mobile computing device 102a, 102b. Accordingly, in one or more embodiments, the augmented reality manager 202a, 202b receives the set of augmented reality elements from the networking system 108. In at least one embodiment, and as will be described in greater detail below, the networking system 108 provides metadata along with each augmented reality element that includes, but is not limited to: demographic information for users who frequently interact with the augmented reality element; geographic information for where the augmented reality element is most commonly used; networking system information identifying, for the augmented reality element, any networking system users who are “friends” of the user of the mobile computing device 102a, 102b; and mapping rules for the augmented reality element (i.e., rules dictating where an augmented reality element should be displayed within a camera viewfinder display).
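
One plausible shape for such per-element metadata is sketched below; the class and field names are hypothetical and are included only to make the structure concrete.

    from dataclasses import dataclass, field

    @dataclass
    class ElementMetadata:
        # Hypothetical shape for the per-element metadata described above.
        demographics: dict = field(default_factory=dict)      # users who frequently interact
        common_locations: list = field(default_factory=list)  # where the element is most used
        friends_who_used: list = field(default_factory=list)  # the user's networking system friends
        mapping_rules: dict = field(default_factory=dict)     # where to display in the viewfinder

    scoreboard_metadata = ElementMetadata(
        demographics={"age_range": (20, 35), "interest": "baseball"},
        common_locations=["baseball stadium"],
        friends_who_used=["friend-42"],
        mapping_rules={"background": "solid", "min_area_fraction": 0.10},
    )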

[0044] Due to various constraints of the mobile computing device 102a, 102b (e.g., the size and resolution of the camera viewfinder display, whether the camera viewfinder display is overly crowded, etc.), the augmented reality manager 202a, 202b may not be able to present all of the augmented reality elements provided by the networking system 108. Accordingly, in at least one embodiment, the augmented reality manager 202a, 202b determines a subset of the provided augmented reality elements to present via the camera viewfinder display of the mobile computing device 102a, 102b. In one or more embodiments, the augmented reality manager 202a, 202b determines the subset of the provided augmented reality elements based on an analysis of a variety of display factors.

[0045] For example, in one or more embodiments, the augmented reality manager 202a, 202b determines the subset of augmented reality elements based on an analysis of the size of each provided augmented reality element relative to the camera viewfinder display. For instance, the augmented reality manager 202a, 202b may not select an augmented reality element that is too large or too small compared to the size of the camera viewfinder display. In at least one embodiment, the augmented reality manager 202a, 202b utilizes a heuristic that mandates that a single augmented reality element must be viewable but cannot take up more than a predetermined amount of viewable space in the camera viewfinder display.
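
A minimal sketch of such a size heuristic follows; the specific thresholds are assumptions for illustration, since the disclosure refers only to a “predetermined amount” of viewable space.

    # Assumed thresholds for illustration; the disclosure does not give values.
    MIN_VIEWABLE_FRACTION = 0.01  # element must cover at least 1% of the display
    MAX_ALLOWED_FRACTION = 0.25   # and no more than 25% of the display

    def element_fits(element_w, element_h, display_w, display_h):
        """Return True if the element is viewable without dominating the display."""
        fraction = (element_w * element_h) / float(display_w * display_h)
        return MIN_VIEWABLE_FRACTION <= fraction <= MAX_ALLOWED_FRACTION

    # A 200x200 element on a 1080x1920 viewfinder covers about 1.9% of the display.
    print(element_fits(200, 200, 1080, 1920))  # True
    print(element_fits(900, 900, 1080, 1920))  # False: roughly 39% of the display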

[0046] Additionally, the augmented reality manager 202a, 202b determines the subset of augmented reality elements based on an analysis of one or more image frames taken from the image feed presented on the camera viewfinder display of the mobile computing device 102a, 102b. For example, in at least one embodiment, the augmented reality manager 202a, 202b analyzes the image frame to determine whether the image frame is “crowded” or “un-crowded.” For instance, an image frame may be crowded if it includes several people grouped together for a “group selfie” (i.e., the people are huddled close together leaving little space in the image frame that is not occupied by a face). Conversely, an image frame may be un-crowded if it includes a landscape picture of a grassy hill against a blue sky. In one or more embodiments, the augmented reality manager 202a, 202b utilizes a heuristic that mandates that the number and/or size of augmented reality elements included in a camera viewfinder display is inversely proportional to the level of crowdedness in an image frame taken from the image feed displayed on the camera viewfinder display (e.g., the less crowded an image frame, the more augmented reality elements can be included).
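
The inverse-proportionality heuristic could be sketched as follows, assuming a crowdedness measure expressed as the fraction of the frame occupied by detected faces or objects; the ceiling of six elements is an illustrative assumption.

    def max_elements_for_frame(occupied_fraction, ceiling=6):
        """Return an element budget inversely related to frame crowdedness.

        occupied_fraction is the portion of the image frame covered by
        detected faces or objects, in [0, 1]; the ceiling of 6 is assumed.
        """
        occupied_fraction = min(max(occupied_fraction, 0.0), 1.0)
        return round(ceiling * (1.0 - occupied_fraction))

    print(max_elements_for_frame(0.1))  # un-crowded landscape frame -> 5 elements
    print(max_elements_for_frame(0.8))  # crowded "group selfie" frame -> 1 element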

[0047] Furthermore, the augmented reality manager 202a, 202b determines the subset of augmented reality elements based on an analysis of networking system information associated with the user of the mobile computing device 102a, 102b. For example, as mentioned above, the networking system 108 provides metadata with each augmented reality element in the set of provided augmented reality elements. Accordingly, the augmented reality manager 202a, 202b can determine augmented reality elements in the set that are used by other networking system users who are demographically similar to the user of the mobile computing device 102a, 102b. Further, the augmented reality manager 202a, 202b can determine augmented reality elements from the set that are being used at or near the location of the mobile computing device 102a, 102b.

[0048] The augmented reality manager 202a, 202b can also determine augmented reality elements that are being or have been used by social networking friends of the user of the mobile computing device 102a, 102b. For example, the augmented reality manager 202a, 202b can identify augmented reality elements used by the user’s social networking friends with whom the user has a high relationship coefficient. In other words, in one or more embodiments, the augmented reality manager 202a, 202b operates under a heuristic that the user of the mobile computing device 102a, 102b is more likely to interact with augmented reality elements that have been used by social networking friends with whom the user is relatively close (e.g., the user is likely closer to a spouse than to an old high school friend).

[0049] Additionally, when determining augmented reality elements to provide to the user, the augmented reality manager 202a, 202b can also take into account the user’s past augmented reality element interactions. For example, if the user has previously interacted several times with a particular type of augmented reality element, the augmented reality manager 202a, 202b will likely provide that type of augmented reality element again instead of a different type of augmented reality element. Thus, in one or more embodiments, the augmented reality manager 202a, 202b operates under an overarching heuristic that the user of the mobile computing device 102a, 102b will likely want to be provided with augmented reality elements with which he is likely to interact.

[0050] In at least one embodiment, the augmented reality manager 202a, 202b determines which augmented reality elements to provide to the user by calculating a score for each augmented reality element provided by the networking system 108. For example, the augmented reality manager 202a, 202b may calculate the score by assigning a weighted value to each of the variety of display factors described above. Thus, certain display factors may carry a heavier weight than others. For instance, the size of a particular augmented reality element relative to the camera viewfinder display may carry a heavier weight than whether the user of the mobile computing device 102a, 102b has used the particular augmented reality element previously. Accordingly, in one or more embodiments, the augmented reality manager 202a, 202b determines the subset of augmented reality elements to provide via the camera viewfinder display by identifying a threshold number of top-scoring augmented reality elements.
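
By way of example, the following sketch implements one possible version of this weighted scoring; the factor names and weight values are assumptions, since the disclosure states only that some display factors (such as size fit) may carry heavier weights than others.

    # Illustrative weights; not values specified by the disclosure.
    WEIGHTS = {
        "size_fit": 0.40,           # fits well within the camera viewfinder display
        "friend_usage": 0.25,       # used by close networking system friends
        "demographic_match": 0.20,  # used by demographically similar users
        "past_interaction": 0.15,   # the user interacted with this type before
    }

    def score_element(factors):
        """factors maps factor names to values in [0, 1]."""
        return sum(WEIGHTS[name] * factors.get(name, 0.0) for name in WEIGHTS)

    def select_subset(candidates, top_n=3):
        """Return the identifiers of the top-scoring augmented reality elements."""
        ranked = sorted(candidates, key=lambda e: score_element(candidates[e]),
                        reverse=True)
        return ranked[:top_n]

    candidates = {
        "virtual_scoreboard": {"size_fit": 0.9, "friend_usage": 0.2},
        "tag_this_person": {"size_fit": 0.7, "past_interaction": 0.8},
        "rate_this_dish": {"size_fit": 0.3, "demographic_match": 0.5},
    }
    print(select_subset(candidates, top_n=2))  # the two highest-scoring elements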

[0051] In addition to determining a subset of augmented reality elements to provide via the camera viewfinder display of the mobile computing device 102a, 102b, the augmented reality manager 202a, 202b also maps each of the subset of augmented reality elements to a point or area within the camera viewfinder display. For example, in one or more embodiments, mapping rules may require that a certain augmented reality element be associated with a displayed person (e.g., a “tag this person” type of augmented reality element), be associated with a displayed object (e.g., a “rate this dish” type of augmented reality element), or be overlaid on a certain type of background (e.g., a “virtual scoreboard” requires a certain amount of solid-colored background over which it can be overlaid). As mentioned above, the networking system 108 may provide mapping rules for each augmented reality element as part of the metadata for each augmented reality element.

[0052] Accordingly, in order to map an augmented reality element to the correct point or area within the camera viewfinder display, the augmented reality manager 202a, 202b can analyze an image frame taken from the camera viewfinder display to find the optimal location for the augmented reality element. For example, if the mapping rules for an augmented reality element specify that the augmented reality element should be mapped to a blank (e.g., solid colored) space of a particular size, the augmented reality manager 202a, 202b can analyze the image frame to identify an area that corresponds to that requirement. The augmented reality manager 202a, 202b can then map that identified area within the image frame to the corresponding augmented reality element. Once the correct mapping for an augmented reality element is established, the augmented reality manager 202a, 202b anchors the augmented reality element to that location within the camera viewfinder display.
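
A simplified sketch of this kind of placement search is shown below: it scans a grayscale image frame in fixed-size blocks and treats a low-variance block as a solid-colored candidate area. A production system would presumably use more robust computer vision; the block size and variance threshold are assumptions.

    import numpy as np

    def find_blank_region(frame, block=64, max_variance=25.0):
        """frame: 2-D grayscale array. Return (row, col) of the first block
        whose pixel variance is low enough to count as solid-colored, or None."""
        h, w = frame.shape
        for r in range(0, h - block + 1, block):
            for c in range(0, w - block + 1, block):
                if frame[r:r + block, c:c + block].var() <= max_variance:
                    return (r, c)  # anchor the augmented reality element here
        return None

    sky = np.full((480, 640), 200, dtype=np.uint8)  # a uniform "blue sky" frame
    print(find_blank_region(sky))  # (0, 0): the first block is already blank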

[0053] As mentioned above, and as shown in FIG. 2, the networking system application 104a, 104b includes a display manager 204a, 204b. The display manager 204a, 204b provides, manages, and/or controls a graphical user interface that allows the user of the mobile computing device 102a, 102b to interact with features of the augmented reality system 100. For example, in response to the augmented reality manager 202a, 202b anchoring an augmented reality element to a location within the camera viewfinder display of the mobile computing device 102a, 102b, the display manager 204a, 204b maintains the location of the augmented reality element relative to other objects displayed within the camera viewfinder display.

[0054] To illustrate, a feature of some embodiments of the augmented reality system 100 is that a displayed augmented reality element remains in a single location relative to a displayed object in the camera viewfinder display, even when the user of the mobile computing device 102a, 102b moves the camera. Thus, when the user pans the camera of the mobile computing device 102a, 102b across a scene, the augmented reality element appears anchored to a stationary object within the camera viewfinder display. In one or more embodiments, the display manager 204a, 204b utilizes simultaneous localization and mapping (“SLAM”) techniques to construct and/or update a virtual map of the environment displayed in a camera viewfinder display while tracking the location of the mobile computing device 102a, 102b within that environment. In at least one embodiment, SLAM enables the display manager 204a, 204b to determine distances between objects, degrees of rotation, rate of movement, and so forth. Accordingly, in one example, the display manager 204a, 204b updates the camera viewfinder display of the mobile computing device 102a, 102b such that as the user points the camera at an object in real life, an augmented reality element anchored to that object remains in place relative to the object, even when the user pans the camera of the mobile computing device 102a, 102b.
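
While a full SLAM pipeline is well beyond a short example, the anchoring idea can be suggested with a basic pinhole projection: given a device pose estimated by a tracker, a world-space anchor point is re-projected into viewfinder coordinates each frame, so the overlay stays fixed to its object as the camera pans. The camera model and values below are assumptions for illustration.

    import numpy as np

    def project_anchor(anchor_world, rotation, translation,
                       focal=800.0, center=(540.0, 960.0)):
        """Map a 3-D world anchor point into 2-D viewfinder coordinates."""
        cam = rotation @ anchor_world + translation  # world -> camera frame
        if cam[2] <= 0:
            return None  # the anchor is behind the camera, so draw nothing
        u = focal * cam[0] / cam[2] + center[0]
        v = focal * cam[1] / cam[2] + center[1]
        return (u, v)

    # With an identity pose, an anchor 2 meters straight ahead projects to the
    # center of an assumed 1080x1920 viewfinder.
    print(project_anchor(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3)))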

[0055] In addition to enabling the display of one or more augmented reality elements within a camera viewfinder display, the display manager 204a, 204b also facilitates the display of graphical user interfaces that enable the user of the mobile computing device 102a, 102b to interact with the networking system 108. For example, the display manager 204a, 204b may compose a graphical user interface of a plurality of graphical components, objects, and/or elements that allow a user to engage in networking system activities. More particularly, the display manager 204a, 204b may direct the mobile computing device 102a, 102b to display a group of graphical components, objects, and/or elements that enable a user to interact with various features of the networking system 108.

[0056] In addition, the display manager 204a, 204b directs the mobile computing device 102a, 102b to display one or more graphical objects, controls, or elements that facilitate user input for interacting with various features of the networking system 108. To illustrate, the display manager 204a, 204b provides a graphical user interface that allows the user of the mobile computing device 102a, 102b to input one or more types of content into a networking system post or electronic message.

[0057] The display manager 204a, 204b also facilitates the input of text or other data for the purpose of interacting with one or more features of the networking system 108. For example, the display manager 204a, 204b provides a user interface that includes a touch display keyboard. A user can interact with the touch display keyboard using one or more touch gestures to input text to be included in a social networking system post or electronic message. For example, a user can use the touch display keyboard to compose a message. In addition to text, the graphical user interface including the touch display keyboard can facilitate the input of various other characters, symbols, icons, or other information. In at least one embodiment, the display manager 204a, 204b provides the touch display keyboard in connection with a camera viewfinder display of the mobile computing device 102a, 102b.

[0058] Furthermore, the display manager 204a, 204b is capable of transitioning between two or more graphical user interfaces. For example, in one embodiment, the user of the mobile computing device 102a, 102b may interact with one or more augmented reality elements within the camera viewfinder display. Then in response to a touch gesture from the user (e.g., a swipe left touch gesture), the display manager 204a, 204b can transition to a graphical user interface including the user’s newsfeed.

[0059] As further illustrated in FIG. 2, the networking system application 104a, 104b includes a user input detector 206a, 206b. In one or more embodiments, the user input detector 206a, 206b detects, receives, and/or facilitates user input in any suitable manner. In some examples, the user input detector 206a, 206b detects one or more user interactions with respect to the camera viewfinder display (e.g., a user interaction with an augmented reality element within the camera viewfinder display). As referred to herein, a “user interaction” means a single interaction, or combination of interactions, received from a user by way of one or more input devices.

[0060] For example, the user input detector 206a, 206b detects a user interaction from a keyboard, mouse, touch pad, touch screen, and/or any other input device. In the event the mobile computing device 102a, 102b includes a touch screen, the user input detector 206a, 206b detects one or more touch gestures (e.g., swipe gestures, tap gestures, pinch gestures, reverse pinch gestures) from a user that form a user interaction. In some examples, a user can provide the touch gestures in relation to and/or directed at one or more graphical objects or graphical elements (e.g., augmented reality elements) of a user interface.

[0061] The user input detector 206a, 206b may additionally, or alternatively, receive data representative of a user interaction. For example, the user input detector 206a, 206b may receive one or more user configurable parameters from a user, one or more commands from the user, and/or any other suitable user input. The user input detector 206a, 206b may receive input data from one or more components of the networking system 108, or from one or more remote locations.

[0062] The networking system application 104a, 104b performs one or more functions in response to the user input detector 206a, 206b detecting user input and/or receiving other data. Generally, a user can control, navigate within, and otherwise use the networking system application 104a, 104b by providing one or more user inputs that the user input detector 206a, 206b can detect. For example, in response to the user input detector 206a, 206b detecting user input, one or more components of the networking system application 104a, 104b allow the user of the mobile computing device 102a, 102b to select an augmented reality element, scroll through a newsfeed, input text into a networking system post composer, and so forth.

[0063] As shown in FIG. 2, and as mentioned above, the networking system application 104a, 104b also includes the data storage 208a, 208b. The data storage 208a, 208b includes networking system data 210a, 210b. In one or more embodiments, the networking system data 210a, 210b is representative of networking system information (e.g., augmented reality element information, networking system activity information, etc.), such as described herein.

[0064] Also as shown in FIG. 2, and as mentioned above, the server device(s) 106 hosts the networking system 108. The networking system 108 provides augmented reality elements, networking system posts, electronic messages, and so forth to one or more users of the networking system 108 (e.g., by way of a camera viewfinder display, a newsfeed, a communication thread, a messaging inbox, a timeline, a “wall,” or any other type of graphical user interface). For example, one or more embodiments provide a user with a networking system newsfeed containing posts from one or more co-users associated with the user.

[0065] In one or more embodiments, a networking system user scrolls through the networking system newsfeed in order to view recent networking system posts submitted by the one or more co-users associated with the user via the networking system application 104a, 104b. In one embodiment, the networking system 108 organizes the networking system posts chronologically in a user’s networking system newsfeed. In alternative embodiments, the networking system 108 organizes the networking system posts geographically, by interest groups, according to a relationship coefficient between the user and the co-user, etc.

[0066] The networking system 108 also enables the user to engage in all other types of networking system activity. For example, the networking system 108 enables a networking system user to scroll through newsfeeds, click on posts and hyperlinks, compose and submit electronic messages and posts, and so forth. As used herein, a “structured object” is a displayed communication (e.g., an offer, a post, etc.) that includes structured data. In at least one embodiment, the networking system 108 treats augmented reality elements as structured objects.

[0067] Also as illustrated in FIG. 2, the networking system 108 includes a communication manager 212. In one or more embodiments, the communication manager 212 sends and receives communications to and from the networking system applications 104a, 104b, and the third-party server 112. For example, the communication manager 212 receives characteristic information from the networking system application 104a, 104b, and provides this characteristic information to the augmented reality element identifier 214 and/or the augmented reality element generator 216. In response to receiving a set of augmented reality elements, the communication manager 212 sends this set back to the networking system application 104a, 104b.

[0068] In addition to augmented reality element data, the communication manager 212 sends and receives information related to networking system activities. For example, the communication manager 212 receives information associated with networking system activities engaged in by one or more networking system users. To illustrate, the communication manager 212 receives information from the networking system application 104a, 104b detailing the clicks, scrolls, keyboard inputs, hovers, and so forth engaged in by the user of the mobile computing device 102a, 102b in association with features of the networking system 108 and/or the augmented reality system 100. In at least one embodiment, the networking system 108 utilizes this information to determine various characteristics of the user of the mobile computing device 102a, 102b.

[0069] Furthermore, the communication manager 212 also receives information associated with the user’s interactions with one or more augmented reality elements. For example, some augmented reality elements are interactive and allow the user to perform various networking system activities directly through a camera viewfinder display. Accordingly, when a user interacts with an augmented reality element, the networking system application 104a, 104b provides information related to the interaction to the communication manager 212.

[0070] Moreover, in some embodiments, the networking system 108 partners with one or more third parties in order to provide additional third-party augmented reality elements and functionality to networking system users. Accordingly, the communication manager 212 sends and receives information to and from the third-party server 112 in order to facilitate those interactions. For example, the augmented reality system 100 may determine that a user is at a baseball stadium where a hot dog vendor has partnered with the networking system 108 in order for the augmented reality system 100 to provide an augmented reality element that allows the user to have a custom hot dog delivered right to his seat. Thus, when the user interacts with the augmented reality element, the communication manager 212 receives information about the interaction, and relays that information on to the third-party server 112. Then, when the third-party server 112 responds with an order acknowledgement and a delivery status update, the communication manager 212 can send that information back to the networking system application 104a, 104b. In at least one embodiment, the communication manager 212 can also relay payment information to the third-party server 112 such that the user can pay for his hot dog through the augmented reality system 100.

[0071] As mentioned above, and as illustrated in FIG. 2, the networking system 108 includes an augmented reality element identifier 214. As discussed above, the networking system application 104a, 104b collects characteristic information related to the mobile computing device 102a, 102b and the user of the mobile computing device 102a, 102b, and sends this characteristic information to the networking system 108. In response to receiving this characteristic information, the augmented reality element identifier 214 identifies a set of augmented reality elements and/or corresponding content based on the provided characteristic information.

[0072] In order to identify a set of augmented reality elements that correspond with the provided characteristic information, the augmented reality element identifier 214 begins by analyzing that information to determine the location of the mobile computing device 102a, 102b. For example, the augmented reality element identifier 214 can analyze provided GPS information, Wi-Fi information, networking system information, and Internet searches in order to determine where the mobile computing device 102a, 102b is located and what is currently occurring at that location. For example, from provided GPS coordinates of the mobile computing device 102a, 102b, the augmented reality element identifier 214 can determine that the user of the mobile computing device 102a, 102b is currently attending a rock concert in Central Park. In another example, from the provided GPS coordinates of the mobile computing device 102a, 102b, the augmented reality element identifier 214 can determine that the user of the mobile computing device 102a, 102b is camping in the Smoky Mountains in the rain.

[0073] Additionally, the augmented reality element identifier 214 analyzes the provided characteristic information to determine user information. For example, the augmented reality element identifier 214 can determine the user’s demographic information, the user’s profile information, the user’s networking system activity history, the networking system activity history of the user’s networking system friends, the demographic information of the user’s networking system friends, and so forth. In at least one embodiment, in order to protect the user’s privacy, the augmented reality element identifier 214 requires that the user specifically opts in to this level of analysis.

[0074] Furthermore, the augmented reality element identifier 214 can also analyze an image frame taken from the camera viewfinder display of the mobile computing device 102a, 102b in order to determine additional characteristics of the mobile computing device 102a, 102b. For example, the augmented reality element identifier 214 can utilize computer vision techniques to identify objects, backgrounds, text, and people within the image frame. Further, in response to identifying a person in the image frame, the augmented reality element identifier 214 can utilize facial recognition technology in combination with networking system information to identify networking system users within the image frame.

[0075] After analyzing the provided characteristic information to determine the exact location, conditions, and circumstances under which the mobile computing device 102a, 102b is currently situated, the augmented reality element identifier 214 can identify a set of augmented reality elements that correspond with the mobile computing device 102a, 102b. In at least one embodiment, the augmented reality element identifier 214 begins by identifying augmented reality elements that correspond with the location of the mobile computing device 102a, 102b. In some embodiments, this is the minimum level of analysis required by the augmented reality element identifier 214. Accordingly, the augmented reality element identifier 214 may simply provide the set of augmented reality elements that correspond to the location of the mobile computing device 102a, 102b.

[0076] In additional embodiments, the augmented reality element identifier 214 may broaden or narrow the set of augmented reality elements that correspond to the location of the mobile computing device 102a, 102b based on the additional characteristic information. For example, the augmented reality element identifier 214 can add or remove an augmented reality element from the collected set based on whether the augmented reality element corresponds with the user’s demographic information, whether the user has previously used that augmented reality element, whether the user’s friends have used that augmented reality element, and so forth. Additionally, the augmented reality element identifier 214 can add or remove an augmented reality element from the collected set based on the analysis of the image frame. For example, the augmented reality element identifier 214 can add or remove an augmented reality element based on whether the augmented reality element corresponds with the objects or person in the image frame, whether the augmented reality element corresponds with the lighting conditions displayed in the image frame, whether the augmented reality element corresponds with the circumstances depicted in the image frame, and so forth. In one or more embodiments, the augmented reality element identifier 214 utilizes machine learning in making the determinations described above in order to collect the resulting set of augmented reality elements.

[0077] In one or more embodiments, the augmented reality element identifier 214 may utilize a scoring scheme in order to identify augmented reality elements to include in the set. For example, the augmented reality element identifier 214 may utilize machine learning to calculate a score that reflects how strongly an augmented reality element corresponds with the characteristic information. In that case, the augmented reality element identifier 214 may include augmented reality elements that score above a threshold value. Additionally or alternatively, the augmented reality element identifier 214 may include only a threshold number of augmented reality elements in the set, in order to keep from overwhelming the mobile computing device 102a, 102b.
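
A minimal sketch of this server-side selection, assuming the scoring itself (e.g., a learned model) is computed elsewhere, might look like the following; the threshold and cap values are illustrative.

    def build_element_set(scored_elements, min_score=0.5, max_count=10):
        """scored_elements maps element id -> relevance score in [0, 1]."""
        passing = [(score, element) for element, score in scored_elements.items()
                   if score >= min_score]
        passing.sort(reverse=True)          # highest-scoring elements first
        return [element for _, element in passing[:max_count]]

    print(build_element_set({"hot_dog_order": 0.9, "scoreboard": 0.7,
                             "unrelated_sticker": 0.2}))
    # ['hot_dog_order', 'scoreboard']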

[0078] In one or more embodiments, the augmented reality system 100 enables the generation of augmented reality elements. For example, the augmented reality system 100 can enable a user to generate a “leave-behind” augmented reality element that the augmented reality system 100 anchors to a particular location. Thus, when other networking system users later access the augmented reality system at that particular location, they can discover the leave-behind augmented reality element. In another example, the augmented reality system 100 can generate customized augmented reality elements for a variety of purposes.

[0079] Accordingly, as shown in FIG. 2, the networking system 108 includes the augmented reality element generator 216. In one or more embodiments, the augmented reality element generator 216 receives information, either from the networking system application 104a, 104b or from the networking system 108, and generates an augmented reality element embodying the received information. To illustrate, the augmented reality element generator 216 may receive information from the networking system application 104a, 104b including a digital video of the user describing how much fun he is having at the theme park where he is currently located. In at least one embodiment, the augmented reality element generator 216 can generate an augmented reality element including the user’s digital video, and anchor the generated augmented reality element to the location of the theme park. Then when other networking system users later visit the theme park, the augmented reality element identifier 214 may identify and provide the generated augmented reality element to those networking system users.

[0080] In one or more embodiments, the augmented reality element generator 216 can associate various rules with a leave-behind augmented reality element. For example, a user creating a leave-behind augmented reality element can specify that the augmented reality element may only be viewed by his or her networking system friends, by a group of his or her networking system friends, or by a single networking system friend. Alternatively, the creator of a leave-behind augmented reality element can specify that any user of the augmented reality system may view the augmented reality element. In at least one embodiment, the creator can also specify additional rules such as an expiration date and time for the leave-behind augmented reality element after which the element may no longer be viewed, a time of day during which the element may be viewed, a background requirement against which the element must be displayed, etc. Accordingly, the augmented reality element generator 216 can associate one or more of these rules with the generated augmented reality element as metadata.
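
The following sketch suggests how such rules might be evaluated when another user encounters a leave-behind element; the rule names (audience, expiration, viewing hours) follow the description above, but the exact representation is an assumption.

    import datetime

    def can_view(element, viewer_id, creator_friends, now):
        """Return True only if the viewer satisfies every rule on the element."""
        rules = element.get("rules", {})
        audience = rules.get("audience", "everyone")  # "everyone", "friends", or a list of ids
        if audience == "friends" and viewer_id not in creator_friends:
            return False
        if isinstance(audience, list) and viewer_id not in audience:
            return False
        expires_at = rules.get("expires_at")
        if expires_at is not None and now >= expires_at:
            return False
        view_hours = rules.get("view_hours")          # e.g., (18, 23) for evenings only
        if view_hours is not None and not (view_hours[0] <= now.hour < view_hours[1]):
            return False
        return True

    recommendation = {"rules": {"audience": "friends",
                                "expires_at": datetime.datetime(2025, 1, 1)}}
    print(can_view(recommendation, "friend-42", {"friend-42"},
                   datetime.datetime(2024, 6, 1)))  # True: a friend, before expiry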

[0081] In another example, the augmented reality element generator 216 may receive information from the networking system 108 including the networking system user identifiers for networking system users identified in an image frame provided (e.g., within a camera viewfinder) by the networking system application 104a, 104b. In at least one embodiment, the augmented reality element generator 216 can generate customized augmented reality elements specific to each of the identified networking system users. For example, the customized augmented reality element can include the networking system user’s name, the networking system user’s profile picture, the networking system user’s avatar, and so forth.

[0082] As shown in FIG. 2, and as mentioned above, the networking system 108 also includes the data storage 218. The data storage 218 includes augmented reality element data 220. In one or more embodiments, the augmented reality element data 220 is representative of augmented reality element information (e.g., the display characteristics of the augmented reality elements, the metadata associated with each augmented reality element, etc.), such as described herein.

[0083] As will be described in more detail below, the components of the augmented reality system 100 can provide one or more graphical user interfaces (“GUIs”) and/or GUI elements. In particular, as described above, the augmented reality system 100 provides one or more augmented reality elements as an overlay within the camera viewfinder display of the mobile computing device 102a, 102b. FIGS. 3A-7B and the description that follows illustrate various example embodiments of the features of the augmented reality system 100 in accordance with the general principles described above.

[0084] As described above, the augmented reality system 100 provides augmented reality elements within a camera viewfinder display of a mobile computing device. Accordingly, FIG. 3A illustrates a mobile computing device 300 where the camera viewfinder display 304 is active on the touch screen 302 of the mobile computing device 300. Additionally, as shown in FIG. 3A, the camera viewfinder display 304 includes a shutter button 306 (i.e., to capture a digital photograph or video), a digital photograph control 308, and a digital video control 310 (i.e., to select the type of multimedia to capture). Although the embodiments described herein include a smartphone mobile computing device, in additional embodiments, the mobile computing device 300 may be a tablet computer, a laptop computer, an augmented reality or virtual reality headset, or any other type of computing device suitable for interacting with the features of the augmented reality system 100.

[0085] In one or more embodiments, upon detecting the activation of the camera viewfinder display 304, the augmented reality system 100 collects characteristic data and identifies augmented reality elements through the methods and processes described herein. In response to receiving the identified augmented reality elements, the augmented reality system 100 provides the augmented reality elements via the camera viewfinder display 304. For example, as illustrated in FIG. 3B, in response to collecting and determining the characteristic information associated with the mobile computing device, the augmented reality system 100 provides the augmented reality elements 312a-312e.
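The activation-to-overlay flow described in this paragraph might be sketched, in simplified form, as follows; the `device`, `identify`, and `overlay` interfaces are hypothetical stand-ins rather than disclosed APIs.

```python
# A minimal client-side flow, assuming hypothetical interfaces; none of
# these names come from the disclosure.
def on_viewfinder_activated(device, identify, overlay):
    """Collect characteristic data on activation, request matching AR
    elements, and overlay whatever comes back on the viewfinder."""
    characteristics = {
        "gps": device.gps(),                   # device location
        "frame": device.sample_frame(),        # image frame from the viewfinder
        "user_id": device.networking_user_id,  # networking system identifier
    }
    for element in identify(characteristics):
        overlay(element)
```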

[0086] In the embodiment illustrated in FIGS. 3A-3C, the user of the mobile computing device 300 is spending the day with two friends at Lake Tahoe. Accordingly, the augmented reality system 100 collects and analyzes characteristic information including the GPS location of the mobile computing device 300, an image frame taken from the camera viewfinder display 304, the networking system unique identifier associated with the user of the mobile computing device 300, and the other characteristic information described above. From this analysis, in one or more embodiments, the augmented reality system 100 identifies the augmented reality elements 312a-312e that correspond with the characteristic information.

[0087] For example, as shown in FIG. 3B, the augmented reality system 100 provides the augmented reality element 312a in response to determining that the mobile computing device 300 is located at a GPS location corresponding to a business (e.g., “Tahoe Adventures”). Accordingly, the augmented reality system 100 identifies a logo associated with the business (e.g., from a web search for the business, from a networking system page associated with the business, etc.), and generates the augmented reality element 312a including the identified logo. In one or more embodiments, in response to detecting a selection of the augmented reality element 312a, the augmented reality system 100 can open a browser window on the touch screen 302 of the mobile computing device 300 and direct the browser to a website associated with “Tahoe Adventures.” Additionally, in one or more embodiments, the augmented reality system 100 may add animation to the augmented reality element 312a (e.g., spinning, color changes, etc.). In yet further embodiments, user interaction with the augmented reality element 312a triggers creation of a “check-in” at the business or another post associated with the business. Accordingly, the user can create a networking system post by interacting with one or more of the augmented reality elements.

[0088] Furthermore, as shown in FIG. 3B, the augmented reality system 100 performs facial recognition in connection with an image frame taken from the camera viewfinder display 304 in order to identify a networking system user portrayed therein (i.e., “Dave S.”). Accordingly, in response to accessing the networking system account associated with the identified networking system user, the augmented reality system 100 can generate and/or provide the augmented reality element 312b representing the identified user. As shown in FIG. 3B, the augmented reality element 312b can include a screen name (e.g., “Dave S.”), and an avatar or profile picture associated with the identified networking system user. In one or more embodiments, the augmented reality system 100 identifies the components of the augmented reality element 312b from a networking system profile associated with the identified networking system user. In alternative embodiments, the augmented reality system 100 can identify the components of the augmented reality element 312b from Internet searches, or other data sources.
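One illustrative shape for the recognition-to-element step is sketched below. The profile store, the assumption that face recognition has already produced a user identifier and bounding box, and every name here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class UserElement:
    user_id: str
    screen_name: str
    avatar_url: str
    anchor: tuple  # (x, y) point in the frame where the element is drawn

# Hypothetical profile store standing in for the networking system's
# account data (screen name, avatar or profile picture, and so on).
PROFILES = {
    "user-123": {"screen_name": "Dave S.",
                 "avatar_url": "https://example.com/avatars/dave.png"},
}

def element_for_detected_face(user_id: str, face_box: tuple) -> UserElement:
    """Build an AR element for a user recognized in the image frame; the
    recognition step is assumed to have already produced `user_id` and
    the face's (x, y, width, height) bounding box."""
    profile = PROFILES[user_id]
    x, y, w, h = face_box
    return UserElement(user_id=user_id,
                       screen_name=profile["screen_name"],
                       avatar_url=profile["avatar_url"],
                       anchor=(x + w // 2, y))  # label centered above the face
```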

[0089] In one or more embodiments, in response to detecting a selection of the augmented reality element 312b, the augmented reality system 100 can redirect the touch screen 302 to display a graphical user interface provided by the networking system application installed on the mobile computing device 300. The networking system application can then provide a display of the networking system homepage associated with the networking system user associated with the augmented reality element 312b. Alternatively, in response to detecting a selection of the augmented reality element 312b, the augmented reality system 100 can redirect the touch screen 302 to display a message composer graphical user interface wherein the user of the mobile computing device 300 can compose a networking system message to the networking system user associated with the augmented reality element 312b. In yet further embodiments, in response to a selection of the augmented reality element 312b, the augmented reality system 100 can tag the identified user in a post being created by the user of the mobile computing device 300. As such, the user can interact with the augmented reality element 312b and one or more other augmented reality elements to create a post in which the identified user is tagged.

[0090] Additionally, as shown in FIG. 3B, the augmented reality system 100 analyzes networking system activity information (e.g., to determine that the user of the mobile computing device 300 frequently visits Lake Tahoe and enjoys time there, etc.) to identify an overall mood associated with the image frame taken from the camera viewfinder display 304. In response to identifying the likely overall mood, the augmented reality system 100 generates and provides the augmented reality element 312c. As shown in FIG. 3B, the augmented reality element 312c includes selectable emoticon elements that allow the user of the mobile computing device 300 to express an emotion. For example, in response to detecting a selection of the first emoticon element, the augmented reality system 100 can determine that the user of the mobile computing device 300 is feeling “chill.” In one or more embodiments, the augmented reality system 100 utilizes this selection in a resulting networking system post or message.

[0091] Also shown in FIG. 3B, in response to analyzing networking system activity information (e.g., to determine that the user of the mobile computing device 300 is celebrating a birthday), the augmented reality system 100 can generate and provide the augmented reality element 312d. In one or more embodiments, the augmented reality element 312d is associated with a filter that corresponds with the characteristics analyzed by the augmented reality system 100. For example, in response to detecting a selection of the augmented reality element 312d, the augmented reality system 100 can add filter elements (e.g., animations, stickers, borders, color changes, etc.) to the camera viewfinder display 304. For example, a selection of the augmented reality element 312d causes streamers and party balloons to appear overlaid on the camera viewfinder display 304 and any resulting digital pictures or videos captured from the camera viewfinder display 304.

[0092] Additionally, as shown in FIG. 3B, the augmented reality system 100 can provide standard augmented reality elements to the camera viewfinder display 304. For example, the augmented reality system 100 can provide the augmented reality element 312e as a matter of course to all networking system users who opt in to the features and functionality of the augmented reality system 100. In response to detecting a selection of the augmented reality element 312e, the augmented reality system 100 can overlay a touch screen keyboard on the camera viewfinder display 304 and can convert the augmented reality element 312e to a text box wherein the user of the mobile computing device 300 can input a message. In one or more embodiments, the augmented reality system 100 utilizes the message provided by the user of the mobile computing device 300 in a networking system post or message.

[0093] In one or more embodiments, the user of the mobile computing device 300 can remove any of the augmented reality elements 312a-312e from the camera viewfinder display 304. For example, if the augmented reality system 100 has incorrectly identified the location of the mobile computing device 300, the user of the mobile computing device 300 can remove the augmented reality element 312a from the camera viewfinder display 304 by pressing and swiping the augmented reality element 312a up off the camera viewfinder display 304, or by pressing and holding the augmented reality element 312a. In this way, the user has control over what is included in a resulting networking system post or message.

[0094] In one or more embodiments, as discussed above, the user of the mobile computing device 300 can compose a networking system post or message directly from the camera viewfinder display 304. For example, as shown in FIGS. 3B and 3C, in response to a single interaction from the user, the augmented reality system 100 can compose and send a post or message to the networking system 108 for distribution to one or more additional networking system users. In one embodiment, in response to detecting interactions with one or more of the augmented reality elements 312a-312e and a swipe touch gesture across the camera viewfinder display 304, the augmented reality system 100 can capture a digital picture from the camera viewfinder display 304 and compose a networking system post including the digital picture and elements/content that correspond to the one or more augmented reality elements 312a-312e with which the user interacted. Alternatively or additionally, the augmented reality system 100 can perform these same steps in response to detecting an interaction with the shutter button 306. In additional embodiments, the augmented reality system 100 can perform these steps in response to detecting other types of interactions with the mobile computing device 300 (e.g., a tilt, a shake, a verbal command, etc.).
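A simplified sketch of the single-interaction composition step follows; the element model and post fields are assumptions chosen to mirror the elements of FIG. 3B, not the actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical element model; kinds mirror the elements in FIG. 3B.
@dataclass
class Element:
    kind: str                 # "user", "business", "filter", "text"
    selected: bool = False
    user_id: Optional[str] = None
    place_id: Optional[str] = None
    filter_id: Optional[str] = None
    text: str = ""

def compose_post(frame_bytes: bytes, elements: list) -> dict:
    """Build a post from the captured frame plus every element the user
    interacted with before the swipe (or shutter press)."""
    chosen = [el for el in elements if el.selected]
    return {
        "photo": frame_bytes,
        "tagged_users": [el.user_id for el in chosen if el.kind == "user"],
        "check_in": next((el.place_id for el in chosen
                          if el.kind == "business"), None),
        "filters": [el.filter_id for el in chosen if el.kind == "filter"],
        "text": next((el.text for el in chosen if el.kind == "text"), ""),
    }

# Usage: a swipe handler would capture a frame, compose, then send.
post = compose_post(b"<jpeg bytes>", [
    Element(kind="user", selected=True, user_id="dave-s"),
    Element(kind="business", selected=True, place_id="tahoe-adventures"),
])
```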

[0095] For example, as shown in FIG. 3C, in response to detecting a swipe gesture in connection with the camera viewfinder display 304, the augmented reality system 100 can compose and send the post 318 to the networking system 108. FIG. 3C illustrates a networking system GUI 314 including a newsfeed 316 associated with a networking system user (e.g., the user of the mobile computing device 300, or another networking system user who is friends with the user of the mobile computing device 300 via the networking system 108). As shown, the post 318 includes a digital photograph 320 overlaid with the filter associated with the augmented reality element 312d. Additionally, the post 318 includes additional elements corresponding to other augmented reality elements provided to the user of the mobile computing device 300 (e.g., “Dave Smith” corresponding to the augmented reality element 312b, “Tahoe Adventures” corresponding to the augmented reality element 312a, “Birthday at the lake!” corresponding to input entered in connection with the augmented reality element 312e).

[0096] In an alternative embodiment, in response to detecting a swipe gesture in connection with the camera viewfinder display 304 in FIG. 3B, the augmented reality system 100 can provide a composer GUI wherein the user can view a composed post before the augmented reality system 100 sends the post to the networking system 108. For example, from the composer GUI, the user can edit tagged users, check-in locations, the digital picture or video that will be included with the post, and so forth. The user of the mobile computing device 300 can also specify privacy settings indicating one or more networking system friends who will receive the resulting post. If the user only selects one networking system friend, the networking system 108 will send the resulting post as an electronic message directly to that person. If the user selects a subgroup of his networking system friends, the networking system 108 may send the resulting post as a group electronic message.
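The recipient-based routing rule described above reduces to a small decision, sketched here with hypothetical labels:

```python
def route_post(selected_friend_ids: list) -> str:
    """Routing rule from this paragraph: exactly one selected friend
    yields a direct message, a subgroup yields a group message, and no
    explicit selection falls back to an ordinary feed post."""
    if len(selected_friend_ids) == 1:
        return "direct_message"
    if len(selected_friend_ids) > 1:
        return "group_message"
    return "feed_post"

assert route_post(["dave-s"]) == "direct_message"
assert route_post(["dave-s", "tom-n"]) == "group_message"
```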

[0097] Another embodiment of the augmented reality system 100 is illustrated in FIGS. 4A-4D. For example, as discussed above, the augmented reality system 100 enables a user to create “leave-behind” augmented reality elements. To illustrate the process of creating a leave-behind augmented reality element, FIG. 4A shows the camera viewfinder display 304 on the touch screen 302 of the mobile computing device 300. As shown, the user of the mobile computing device 300 is directing the camera of the mobile computing device 300 at a restaurant menu. In one or more embodiments, in response to determining that an image frame taken from the camera viewfinder display 304 includes a menu, the augmented reality system 100 provides the augmented reality element 312f that enables the user to leave a recommendation related to the restaurant.

[0098] In response to detecting a selection of the augmented reality element 312f, the augmented reality system 100 can utilize optical character recognition and other computer vision techniques to generate the augmented reality elements 312g (e.g., selectable boxes around each of the menu items). In response to detecting a selection of one of the augmented reality elements 312g, the augmented reality system 100 can provide additional augmented reality elements that enable the user to leave a recommendation for the selected menu item that is embodied in a leave-behind augmented reality element. In one or more embodiments, after generating the leave-behind augmented reality element associated with the menu item, the augmented reality system 100 anchors the leave-behind augmented reality element to the location where the element was generated (e.g., the location of the restaurant).
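By way of illustration only, the following sketch uses the open-source Tesseract OCR engine (via the pytesseract package) to derive selectable bounding boxes from a menu photograph. The confidence cutoff and the line-grouping heuristic are deliberate simplifications, not the disclosed computer vision pipeline.

```python
from PIL import Image
import pytesseract

def menu_item_boxes(image_path: str, min_conf: float = 60.0) -> list:
    """Return (text, bounding_box) pairs for lines of text found in a
    menu photo; each box can back a selectable AR overlay."""
    img = Image.open(image_path)
    data = pytesseract.image_to_data(img, output_type=pytesseract.Output.DICT)
    lines = {}
    for i, word in enumerate(data["text"]):
        if not word.strip() or float(data["conf"][i]) < min_conf:
            continue  # skip empty cells and low-confidence detections
        key = (data["block_num"][i], data["line_num"][i])
        x, y = data["left"][i], data["top"][i]
        w, h = data["width"][i], data["height"][i]
        text, box = lines.get(key, ("", (x, y, x + w, y + h)))
        x0, y0, x1, y1 = box
        # Grow the line's box to cover each additional word on that line.
        lines[key] = (f"{text} {word}".strip(),
                      (min(x0, x), min(y0, y), max(x1, x + w), max(y1, y + h)))
    return list(lines.values())
```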

[0099] By utilizing the various tools provided by the augmented reality system 100, the user of the mobile computing device 300 can create various types of leave-behind augmented reality elements. For example, after detecting a selection of “Pad Thai,” the augmented reality system 100 can provide a display of images of Pad Thai from which the user can select a particular image. In another example, the augmented reality system 100 can enable the user to take a video or photograph of his order of Pad Thai or of himself reacting to his order of Pad Thai. In yet another example, the augmented reality system 100 can utilize SLAM technology, described above, to create a 3D model based on a scan of the user’s order of Pad Thai.

[0100] Later, when another networking system user views his camera viewfinder display at the location where the augmented reality system 100 has anchored the leave-behind augmented reality element, the augmented reality system 100 can provide the augmented reality element on that camera viewfinder display. For example, as shown in FIG. 4C, the user of the mobile computing device 300’ is a networking system friend of the user of the mobile computing device 300. When the user of the mobile computing device 300’ opens the camera viewfinder display 304’, the augmented reality system 100 determines that the location of the mobile computing device 300’ is the same restaurant where the user of the mobile computing device 300 created the leave-behind augmented reality element, discussed above. Accordingly, the augmented reality system 100 provides the leave-behind augmented reality element on the camera viewfinder display 304’. In one or more embodiments, the augmented reality system 100 provides the leave-behind augmented reality element in response to an analysis of characteristics associated with the leave-behind augmented reality element (e.g., whether the creator of the leave-behind augmented reality element specified that it should be generally available to networking system users, or only available to networking system friends, etc.), and an analysis of the characteristics associated with mobile computing device 300’ and its user (e.g., whether there is a threshold relationship coefficient between the user of the mobile computing device 300’ and the user of the mobile computing device 300, etc.).

[0101] As shown in FIG. 4C, the augmented reality system 100 can provide multiple leave-behind augmented reality elements (e.g., the augmented reality elements 312h and 312i) on the camera viewfinder display 304’. For example, the user of the mobile computing device 300’ may have several networking system friends who have visited the same restaurant and have left behind augmented reality elements. Accordingly, when the user of the mobile computing device 300’ opens the camera viewfinder display 304’ and directs it at the same restaurant menu, the augmented reality system 100 provides the augmented reality elements 312h and 312i. In one or more embodiments, the augmented reality system 100 may provide the augmented reality elements 312h and 312i after determining that the relationship coefficient between the user of the mobile computing device 300’ and the networking system users associated with the augmented reality elements 312h and 312i exceeds a threshold value. Furthermore, in one or more embodiments, there may be additional augmented reality elements that the augmented reality system 100 does not display due to display limitations, insufficient relationship coefficients, and so forth.
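Combining the visibility and display-space considerations from the preceding two paragraphs, a gating check might look like the following sketch; the coefficient function, threshold, and display cap are hypothetical.

```python
COEFFICIENT_THRESHOLD = 0.4  # hypothetical minimum relationship coefficient
MAX_DISPLAYED = 3            # hypothetical display-space limit

def visible_leave_behinds(leave_behinds, viewer_id, coefficient):
    """Filter anchored leave-behind elements to those the viewer may
    see, strongest relationships first, capped by display space."""
    eligible = []
    for lb in leave_behinds:
        c = coefficient(viewer_id, lb["creator_id"])
        if c >= COEFFICIENT_THRESHOLD:
            eligible.append((c, lb))
    eligible.sort(key=lambda pair: pair[0], reverse=True)
    return [lb for _, lb in eligible[:MAX_DISPLAYED]]

# Usage with a toy affinity table:
AFFINITY = {("me", "adam"): 0.9, ("me", "tom-n"): 0.7, ("me", "stranger"): 0.1}
shown = visible_leave_behinds(
    [{"creator_id": "adam"}, {"creator_id": "tom-n"}, {"creator_id": "stranger"}],
    "me", lambda a, b: AFFINITY.get((a, b), 0.0))
# shown -> Adam's and Tom's elements; the stranger's is withheld.
```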

[0102] Also shown in FIG. 4C and as discussed above, the augmented reality system 100 can enable a networking system user to create various types of leave-behind augmented reality elements. For example, the augmented reality element 312h is a 3D model of a bowl of noodles with chopsticks. Additionally, the augmented reality element 312h includes a 5-star rating and text detailing “Adam recommends the PAD THAI!” In one or more embodiments, the augmented reality system 100 generates the augmented reality element 312h in response to the user of the mobile computing device 300 selecting the augmented reality element 312g associated with the “Pad Thai” menu item, and providing a description and rating, as discussed above. Furthermore, the augmented reality element 312h is anchored to the portion of the menu corresponding to the “Pad Thai” menu item. In particular, upon detecting the anchored portion of the menu within the image being displayed, the mobile computing device 300’ displays the augmented reality element 312h at a location corresponding to the anchored portion of the menu and/or with visual elements indicating the connection of the augmented reality element 312h to the anchored portion of the menu.

[0103] Additionally, as shown in FIG. 4C, the augmented reality element 312i includes a digital video window playing a previously recorded digital video, along with a networking system user avatar and text (e.g., “Loving this Massaman!”). In one or more embodiments, a user (e.g., “Tom N.”) previously selected an augmented reality element associated with the “Massaman Curry” menu item and provided the augmented reality system 100 with a digital video of himself along with a description. Accordingly, in at least one embodiment, the digital video window included in the augmented reality element 312i can auto-play the digital video. Furthermore, as shown in FIG. 4C, in one or more embodiments, the augmented reality system 100 provides directional lines connecting the augmented reality elements 312h and 312i to their associated menu items.

[0104] In one or more embodiments, the augmented reality system 100 can provide a combination of personal leave-behind augmented reality elements, non-personal leave-behind augmented reality elements, general augmented reality elements, and third-party augmented reality elements. For example, as shown in FIG. 4D, the user of the mobile computing device 300 has accessed the camera viewfinder display 304 after walking into a bar. Accordingly, the augmented reality system 100 has determined the location of the mobile computing device 300 as well as other characteristic information associated with the mobile computing device 300 and the user of the mobile computing device 300, and has identified and provided the augmented reality elements 312j, 312k, 312l, 312m, and 312n.

[0105] In one or more embodiments, the augmented reality element 312j is a personal leave-behind augmented reality element that has been generated specifically for the user of the mobile computing device 300 by another networking system user who is meeting the user of the mobile computing device 300 at the bar. As shown in FIG. 4D, the augmented reality element 312j informs the user of the mobile computing device 300 where his group is located. In at least one embodiment, the networking system user who created the augmented reality element 312j specified the appearance and content of the augmented reality element 312j, the mapping position of the augmented reality element 312j (e.g., over the door), and the identity of the networking system user(s) to whom the augmented reality system 100 should provide the augmented reality element 312j. Thus, in at least one embodiment, the augmented reality system 100 only provides the augmented reality element 312j to the user of the mobile computing device 300 and to no one else.

[0106] In one or more embodiments, the augmented reality element 312l is a non-personal leave-behind augmented reality element that has been generated for any networking system user who is friends with the creator of the augmented reality element 312l. For example, as shown in FIG. 4D, the augmented reality element 312l is a digital photograph of two people that was taken by a friend of the user of the mobile computing device 300 at the bar where the mobile computing device 300 is currently located. Accordingly, in response to determining that the relationship coefficient between the user of the mobile computing device 300 and the networking system user who created the augmented reality element 312l is sufficiently high, the augmented reality system 100 provides the augmented reality element 312l on the camera viewfinder display 304. In additional embodiments, the augmented reality system 100 may have provided the augmented reality element 312l in response to also determining that there was sufficient display space within the image shown in the camera viewfinder display 304.

[0107] In one or more embodiments, the augmented reality element 312k is a general augmented reality element, generated by the augmented reality system 100 and anchored to the location of the bar. For example, in one embodiment, the augmented reality system 100 generates the augmented reality element 312k in response to determining that the bar is located near a university campus where many patrons watch university sports. Accordingly, while a game is ongoing, the augmented reality system 100 may generate and continually update the augmented reality element 312k to reflect the score of the university team’s game. Furthermore, in at least one embodiment, the augmented reality system 100 may only provide the augmented reality element 312k to networking system users whose networking system activity history reflects an interest in university sports. Thus, as shown in FIG. 4D, the augmented reality system 100 may have provided the augmented reality element 312k because the user of the mobile computing device 300 has frequently posted to the networking system 108 regarding university sports.
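Purely as an illustration of the interest gating and continual updating described here, consider the following sketch; the interest heuristic, the polling loop, and both callables are assumptions, not disclosed mechanisms.

```python
import time

def user_interested(activity_history: list, topic: str = "university sports") -> bool:
    """Gate the element on networking system activity history that
    reflects an interest in the topic (e.g., frequent related posts)."""
    related = sum(1 for post in activity_history if topic in post.lower())
    return related >= 3  # hypothetical interest threshold

def run_score_element(get_score, render, polls: int = 3, interval: float = 1.0):
    """While the game is ongoing, poll the score feed and refresh the
    anchored element whenever the score changes."""
    last = None
    for _ in range(polls):
        score = get_score()  # stand-in for a live sports-data feed
        if score != last:
            render(f"Game: {score}")  # stand-in for the element-update path
            last = score
        time.sleep(interval)
```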

[0108] In one or more embodiments, the augmented reality elements 312m and 312n are third-party augmented reality elements. For example, in order to gather more data and engage more fully with users, the third party associated with the augmented reality element 312m may have partnered with the augmented reality system 100 to provide the augmented reality element 312m to networking system users who visit various locations. Accordingly, when the augmented reality system 100 detects that the mobile computing device 300 is located in one of the locations of interest to that third party, the augmented reality system 100 provides the augmented reality element 312m. As shown in FIG. 4D, the augmented reality element 312m is interactive and allows the user of the mobile computing device 300 to see the average review (e.g., provided by the third-party) for the bar, and to leave his personal review of the bar.

[0109] Furthermore, in one or more embodiments, the augmented reality system 100 can provide third-party content absent a partnership with the third party. For example, the augmented reality element 312n includes a weather warning and is provided by the augmented reality system 100 in response to determining that the mobile computing device 300 is not located at “home” (e.g., the location where the mobile computing device 300 spends the night). In other words, in one or more embodiments, the augmented reality system 100 monitors various third-party information services (e.g., the National Weather Service, various news sources, the Amber Alert System, etc.) in order to alert augmented reality system users of events and occurrences that might impact them.

[0110] In another embodiment, the augmented reality system 100 can enable a user to generate an augmented reality element that is anchored to the location of the user’s mobile computing device rather than to a stationary location. For example, as shown in FIG. 5, the user of the mobile computing device 300 may be attempting to meet up with a couple of friends at a crowded baseball stadium. Accordingly, the augmented reality system 100 can enable a user to create an augmented reality element that includes a personal avatar. For example, as shown in FIG. 5, the augmented reality elements 312o and 312p include personal avatars as well as the users’ names. In one or more embodiments, the augmented reality system 100 anchors each augmented reality element 312o, 312p to the location of the mobile computing device associated with each respective user. Thus, if the user associated with the augmented reality element 312o leaves the grandstands to pick up food from the concession stand, the augmented reality system 100 will cause her augmented reality element 312o to move with her (e.g., assuming she takes her mobile computing device along). Accordingly, when the user of the mobile computing device 300 accesses the camera viewfinder display 304, the augmented reality system 100 provides the augmented reality elements 312o, 312p, which help the user of the mobile computing device 300 quickly and easily locate his friends. In one or more embodiments, the users associated with the augmented reality elements 312o, 312p can specify who may see their avatars and locations, a duration of time during which their avatars may be presented, and so forth, thus preserving the users’ privacy and/or providing the users with various levels of privacy settings.
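A device-anchored element of this kind might be rendered along the following lines; the `project`, `draw`, and `may_view` callables are hypothetical simplifications of the mapping, rendering, and privacy layers.

```python
# Sketch of device-anchored (rather than stationary) elements: each
# avatar's anchor is re-read from the friend's latest device location on
# every frame, so the element follows the friend through the venue.
def render_friend_avatars(friend_locations, camera_pose, may_view, project, draw):
    """friend_locations: user_id -> latest (lat, lon) reported by that
    friend's device; `project` maps a world location to viewfinder pixel
    coordinates, returning None when the location is off-screen."""
    for user_id, location in friend_locations.items():
        if not may_view(user_id):        # honor the friend's privacy settings
            continue
        screen_xy = project(location, camera_pose)
        if screen_xy is not None:
            draw(user_id, screen_xy)     # avatar + name at the projected point
```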

……
……
……
