Sony Patent | Information Processing System, Information Processing Method, And Program

Publication Number: 20200097749

Publication Date: 2020-03-26

Applicants: Sony

Abstract

There is provided an information processing system capable of appropriately assisting in creation of a marker, an information processing method, and a program. The information processing system includes an acquisition part configured to acquire a recognition result of a marker that a user is making, and a display control part configured to cause a display face to display assistance information for assisting in creation of the marker depending on the recognition result in association with the marker in process of creation.

TECHNICAL FIELD

[0001] The present disclosure relates to an information processing system, an information processing method, and a program.

BACKGROUND ART

[0002] Various augmented reality (AR) technologies have been conventionally developed. AR enables additional information associated with an object in an environment where a user is present to be presented to the user.

[0003] For example, Patent Document 1 describes a technology for analyzing a captured image thereby to detect a marker, and calling a function associated with the detected marker.

CITATION LIST

Patent Document

[0004] Patent Document 1: Japanese Patent Application Laid-Open No. 2015-90524

SUMMARY OF THE INVENTION

Problems to be Solved by the Invention

[0005] However, the technology described in Patent Document 1 does not consider assisting in creation of a marker when a user creates the marker.

[0006] Thus, the present disclosure proposes a novel and improved information processing system capable of appropriately assisting in creation of a marker, an information processing method, and a program.

Solutions to Problems

[0007] According to the present disclosure, there is provided an information processing system including an acquisition part configured to acquire a recognition result of a marker that a user is making, and a display control part configured to cause a display face to display assistance information for assisting in creation of the marker depending on the recognition result in association with the marker in process of creation.

[0008] Further, according to the present disclosure, there is provided an information processing method including acquiring a recognition result of a marker that a user is making, and causing, by a processor, a display face to display assistance information for assisting in creation of the marker depending on the recognition result in association with the marker in process of creation.

[0009] Further, according to the present disclosure, there is provided a program for causing a computer to function as an acquisition part configured to acquire a recognition result of a marker that a user is making, and a display control part configured to cause a display face to display assistance information for assisting in creation of the marker depending on the recognition result in association with the marker in process of creation.
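The architecture recited above reduces to two cooperating parts: an acquisition part that obtains the recognition result of the marker being created, and a display control part that selects assistance information depending on that result. The following is a minimal illustrative sketch of that division of roles; all class names, method names, and the score threshold are assumptions for illustration and are not taken from the disclosure.

```python
# Illustrative sketch (names and threshold are assumptions, not from the
# disclosure): an acquisition part supplies a marker recognition result,
# and a display control part maps it to assistance information to be
# displayed in association with the marker in process of creation.

from dataclasses import dataclass


@dataclass
class RecognitionResult:
    marker_id: str   # best-matching registered marker, if any
    score: float     # recognition confidence, 0.0 to 1.0


class AcquisitionPart:
    """Acquires the recognition result of a marker the user is making."""

    def __init__(self, recognizer):
        self._recognizer = recognizer

    def acquire(self, captured_image) -> RecognitionResult:
        return self._recognizer.recognize(captured_image)


class DisplayControlPart:
    """Selects assistance information depending on the recognition result."""

    def __init__(self, threshold: float = 0.8):
        self._threshold = threshold

    def assistance_for(self, result: RecognitionResult) -> str:
        if result.score >= self._threshold:
            return f"Marker recognizable as '{result.marker_id}'."
        return "Marker not yet distinctive enough; add more detail."
```

In this sketch the recognizer is injected into the acquisition part, so the same control flow works whether recognition runs on a captured camera image or on any other input.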

Effects of the Invention

[0010] As described above, according to the present disclosure, it is possible to appropriately assist in creation of a marker. Additionally, the effect described herein is not restrictive, and may be any effect described in the present disclosure.

BRIEF DESCRIPTION OF DRAWINGS

[0011] FIG. 1 is an explanatory diagram illustrating an exemplary configuration of an information processing system 10 common in the respective embodiments of the present disclosure.

[0012] FIG. 2 is an explanatory diagram illustrating another exemplary configuration of the information processing system 10 common in the respective embodiments.

[0013] FIG. 3 is an explanatory diagram illustrating another exemplary configuration of the information processing system 10 common in the respective embodiments.

[0014] FIG. 4 is an explanatory diagram illustrating another exemplary configuration of the information processing system 10 common in the respective embodiments.

[0015] FIG. 5 is a functional block diagram illustrating an exemplary functional configuration of the information processing system 10 according to a first embodiment of the present disclosure.

[0016] FIG. 6 is a diagram illustrating, by way of example, how a score calculation part 104 according to the first embodiment calculates a score of a marker in process of creation.

[0017] FIG. 7 is a diagram illustrating an exemplary configuration of an assistance information DB 128 according to the first embodiment.

[0018] FIG. 8A is a diagram illustrating how a user creates a marker.

[0019] FIG. 8B is a diagram illustrating exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8A.

[0020] FIG. 8C is a diagram illustrating how the user additionally draws the marker in process of creation illustrated in FIG. 8A.

[0021] FIG. 8D is a diagram illustrating exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8C.

[0022] FIG. 8E is a diagram illustrating how the user additionally draws the marker in process of creation illustrated in FIG. 8C.

[0023] FIG. 8F is a diagram illustrating exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8E.

[0024] FIG. 9A is a diagram illustrating another exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8A.

[0025] FIG. 9B is a diagram illustrating another exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8C.

[0026] FIG. 9C is a diagram illustrating another exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8E.

[0027] FIG. 10A is a diagram illustrating another exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 8A.

[0028] FIG. 10B is a diagram illustrating exemplary display of part or background candidates addable to the marker in process of creation illustrated in FIG. 8A.

[0029] FIG. 10C is a diagram illustrating exemplary display of assistance information depending on recognition of the marker when a candidate selected on the display screen illustrated in FIG. 10B is added to the marker in process of creation illustrated in FIG. 8A.

[0030] FIG. 11 is a flowchart illustrating an overall flow of processing according to the first embodiment.

[0031] FIG. 12 is a flowchart illustrating a flow of “assistance information display processing” according to the first embodiment.

[0032] FIG. 13 is a diagram illustrating an exemplary configuration of a marker information DB 130 according to a second embodiment of the present disclosure.

[0033] FIG. 14 is a diagram illustrating an exemplary configuration of the assistance information DB 128 according to the second embodiment.

[0034] FIG. 15A is a diagram illustrating how the user creates a marker.

[0035] FIG. 15B is a diagram illustrating exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 15A.

[0036] FIG. 15C is a diagram illustrating exemplary display of assistance information when a details button illustrated in FIG. 15B is selected.

[0037] FIG. 15D is a diagram illustrating exemplary display of assistance information depending on recognition of a user-redrawn marker.

[0038] FIG. 16A is a diagram illustrating exemplary display of assistance information depending on recognition of a marker that the user is creating.

[0039] FIG. 16B is a diagram illustrating exemplary display of assistance information when the details button illustrated in FIG. 16A is selected.

[0040] FIG. 16C is a diagram illustrating exemplary display of assistance information depending on recognition of the marker when the user redraws the marker illustrated in FIG. 16A.

[0041] FIG. 17A is a diagram illustrating how the user is creating a marker.

[0042] FIG. 17B is a diagram illustrating exemplary display of assistance information depending on recognition of the marker in process of creation illustrated in FIG. 17A.

[0043] FIG. 17C is a diagram illustrating exemplary display of assistance information when a correction candidate button illustrated in FIG. 17B is selected.

[0044] FIG. 17D is a diagram illustrating exemplary display of assistance information depending on recognition of a marker corresponding to a candidate selected on the display screen illustrated in FIG. 17C.

[0045] FIG. 18A is a diagram illustrating how the user draws a picture on an object by use of an IR paint pen.

[0046] FIG. 18B is a diagram illustrating how the user draws a picture on the object by use of the IR paint pen.

[0047] FIG. 19A is a diagram illustrating how the user rubs an IR transfer seal on the object.

[0048] FIG. 19B is a diagram illustrating how the user rubs the IR transfer seal on the object.

[0049] FIG. 20 is a diagram illustrating how the user applies part of an IR paint sheet on the object.

[0050] FIG. 21 is a diagram illustrating an exemplary fastest fingers first game for answering a country corresponding to a card arranged on a screen 20.

[0051] FIG. 22A is a diagram illustrating an example in which a word associated with a marker card is displayed on the screen 20 when the marker card is placed on the screen 20.

[0052] FIG. 22B is a diagram illustrating exemplary display of a video when another marker card is additionally placed on the screen 20 in the situation illustrated in FIG. 22A.

[0053] FIG. 23A is a diagram illustrating how the user arranges a 3D marker 30 on the screen 20 according to a sixth embodiment.

[0054] FIG. 23B is a diagram illustrating exemplary projection on the 3D marker 30 according to the sixth embodiment.

[0055] FIG. 24A is a diagram illustrating an exemplary configuration of the marker information DB 130 according to the sixth embodiment.

[0056] FIG. 24B is a diagram illustrating an exemplary configuration of the assistance information DB 128 according to the sixth embodiment.

[0057] FIG. 24C is a diagram illustrating exemplary display of alarm information in a case where a calculated score is less than a predetermined threshold according to the sixth embodiment.

[0058] FIG. 24D is a diagram illustrating exemplary display of information indicating that symmetry is appropriate in a case where a calculated score is equal to or higher than the predetermined threshold according to the sixth embodiment.

[0059] FIG. 25A is a diagram illustrating an exemplary configuration of the assistance information DB 128 according to a seventh embodiment.

[0060] FIG. 25B is a diagram illustrating exemplary display of alarm information in a case where another 3D marker matches a group of characteristic points at a predetermined threshold or more.

[0061] FIG. 26 is an explanatory diagram illustrating an exemplary hardware configuration of the information processing system 10 common in the respective embodiments.

MODE FOR CARRYING OUT THE INVENTION

[0062] Preferred embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings. Additionally, the components having substantially the same functional configuration are denoted with the same reference numeral and a repeated description thereof will be omitted in the present specification and the drawings.

[0063] Further, a plurality of components having substantially the same functional configuration may be discriminated by different alphabets after the same reference numeral in the present specification and the drawings. For example, a plurality of components having substantially the same functional configuration such as marker 30a and marker 30b is discriminated as needed. However, in a case where each of a plurality of components having substantially the same functional configuration does not need to be particularly discriminated, they are denoted with only the same reference numeral. For example, in a case where the marker 30a and the marker 30b do not need to be particularly discriminated, they are simply denoted as marker 30.

[0064] Further, “MODE FOR CARRYING OUT THE INVENTION” will be described in the following item order.

[0065] 1. Configuration of information processing system

[0066] 2. First Embodiment

[0067] 3. Second Embodiment

[0068] 4. Third Embodiment

[0069] 5. Fourth Embodiment

[0070] 6. Fifth Embodiment

[0071] 7. Sixth Embodiment

[0072] 8. Seventh Embodiment

[0073] 9. Hardware configuration

[0074] 10. Variants

1. CONFIGURATION OF INFORMATION PROCESSING SYSTEM

[0075] An exemplary configuration of an information processing system 10 common in the respective embodiments of the present disclosure will be first described. FIG. 1 is an explanatory diagram illustrating an exemplary configuration of the information processing system 10.

[0076] Additionally, in the present specification, a system can mean a configuration for performing predetermined processing. A system may include one apparatus or a plurality of apparatuses. Further, the information processing system 10 according to the present embodiments may also be configured to perform predetermined processing as the entire information processing system 10, and any component in the information processing system 10 may be regarded as one apparatus.

[0077] With reference to FIG. 1, an information processing system 10a common in the respective embodiments of the present disclosure includes an input part 120a and a display part 124a.

<1-1. Display Part 124>

[0078] The display part 124a displays various items of information on a table 90a. The display part 124a can be a projection part (projector). For example, the display part 124a can be arranged above the table 90a, suspended from the ceiling apart from the table 90a by a predetermined distance or more, as illustrated in FIG. 1. In this case, the display part 124a projects information onto the top of the table 90a. For example, the display part 124a may be a pendant light or a desk light. A system for displaying information on the top of the table 90a from above in this way is also called "projection type". Further, the top of the table 90 may be denoted as screen 20 below. The screen 20 includes a face (display face) onto which the display part 124 projects.

[0079] For example, the display part 124a displays a virtual display object under control of a display processing part 122 described below. The display object is, for example, a window, a UI object, or the like. The UI object is a predetermined image (still image or moving image) for receiving various user operations (such as selection or input). For example, the UI object is an image including a graphical user interface (GUI) part (such as a button, a slider, a checkbox, a textbox, or a software keyboard). Further, a UI object can be arranged within a window.

<1-2. Input Part 120>

[0080] The input part 120a includes, for example, a camera that captures the table 90a with a single lens. Alternatively, the input part 120a can include a stereo camera capable of recording depth information by capturing the table 90a with two lenses. The stereo camera can employ, for example, a visible-ray camera or an invisible-ray camera capable of detecting an invisible ray such as an infrared ray. Further, the input part 120a may also include a voice input apparatus such as a microphone for collecting the user's voice or environment sounds of the surrounding environment.

[0081] In a case where the input part 120a employs a camera that captures the table 90a with a single lens, the information processing system 10a analyzes the image captured by the camera (shot image), thereby detecting the position of an object (such as the user's hand) positioned on the screen 20. Further, in a case where the input part 120a employs a stereo camera, the information processing system 10a analyzes the image captured by the stereo camera, thereby acquiring depth information of an object in addition to the position information of the object positioned on the screen 20. The information processing system 10a can detect contact or approach of the user's hand onto the screen 20, or release thereof from the screen 20, on the basis of the depth information. Additionally, the input part 120a may have a depth sensor (such as a time-of-flight sensor or a structured-light sensor) instead of the stereo camera. In this case, the depth sensor can obtain the depth information of an object positioned on the screen 20.
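The contact/approach/release decision described above reduces to comparing the depth measured at the hand position with the known depth of the screen surface. The following is a minimal sketch of that comparison; the depth values, tolerances, and function name are assumptions for illustration, not values from the disclosure.

```python
# Sketch of contact/approach/release classification from depth information
# (all distances and names are illustrative assumptions).

SCREEN_DEPTH_MM = 1000.0   # assumed distance from the depth sensor to the tabletop
TOUCH_TOLERANCE_MM = 10.0  # hand within this gap of the surface -> contact
HOVER_TOLERANCE_MM = 50.0  # hand within this gap of the surface -> approach


def classify_hand_state(hand_depth_mm: float) -> str:
    """Classify the user's hand relative to the screen from its measured depth."""
    gap = SCREEN_DEPTH_MM - hand_depth_mm  # height of the hand above the surface
    if gap <= TOUCH_TOLERANCE_MM:
        return "contact"
    if gap <= HOVER_TOLERANCE_MM:
        return "approach"
    return "released"
```

A real system would evaluate this per frame and debounce across frames before reporting a touch event, but the core decision is this one-dimensional threshold test.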

[0082] In each embodiment, the position of an operator (such as the user's hand or an operation member such as a stylus pen) on the screen 20 is detected on the basis of an image captured by the input part 120a, and various items of information can be input on the basis of the detected position of the operator. That is, the user can input various operations by moving the operator on the screen 20. For example, when contact of the user's hand on a window or a UI object is detected, an operation on the window or the UI object is input.
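Dispatching an operation from the detected operator position amounts to a hit test against the on-screen windows and UI objects. A sketch under assumed names, modeling each object as an axis-aligned rectangle (the disclosure does not specify this representation):

```python
# Sketch of hit-testing a detected operator position against UI objects
# (rectangle model and all names are illustrative assumptions).

from typing import List, Optional


class UIObject:
    def __init__(self, name: str, x: float, y: float, w: float, h: float):
        self.name = name
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def hit_test(objects: List[UIObject], px: float, py: float) -> Optional[UIObject]:
    """Return the topmost UI object under the operator position, if any.

    Later entries in `objects` are treated as drawn on top, so the list is
    scanned in reverse draw order.
    """
    for obj in reversed(objects):
        if obj.contains(px, py):
            return obj
    return None
```

The object returned by the hit test is what receives the input operation; a miss (no object under the operator) is simply ignored.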

[0083] Further, a camera included in the input part 120a may capture not only the top of the table 90a but also a user present around the table 90a. In this case, the information processing system 10a can detect the position of the user around the table 90a on the basis of the image captured by the input part 120a. Further, the information processing system 10a may extract physical characteristics capable of specifying an individual user (such as the size of the face or body) on the basis of the captured image, thereby performing personal recognition of the user.

[0084] Not limited to the above example, a user operation may be input by another method. For example, the input part 120a may be installed as a touch panel on the top (screen 20a) of the table 90a, and a user operation may be detected by contact of the user's finger or the like on the touch panel. Further, a user operation may be detected as a gesture toward a camera included in the input part 120a.

<1-3. Variants>

[0085] The configuration of the information processing system 10a common in the respective embodiments has been described above. Additionally, the configuration of the information processing system common in the respective embodiments is not limited to the example illustrated in FIG. 1, and may be, for example, any of those illustrated in FIG. 2 to FIG. 4.