Patent: Information Processing Device, Information Processing Method, And Computer Program

Publication Number: 20200143774

Publication Date: 2020-05-07

Applicants: Sony

Abstract

[Problem] To provide an information processing device, an information processing method, and a computer program. [Solution] To provide an information processing device including a display control unit that controls display so that, of a first real object and a second real object present in a real space that are recognized as candidates for an object to be operated, in a case in which the first real object is selected by a user as the object to be operated, a first virtual object corresponding to the first real object is displayed at a first position in the real space corresponding to a position of the first real object based on the selection made by the user, and in a case in which the second real object is selected by the user as the object to be operated, a second virtual object corresponding to the second real object is displayed at a second position in the real space corresponding to a position of the second real object based on the selection made by the user.

FIELD

[0001] The present disclosure relates to an information processing device, an information processing method, and a computer program.

BACKGROUND

[0002] In recent years, a technique of superimposing a virtual object on a real space to be presented to a user, which is called Augmented Reality (AR), has been attracting attention. For example, by using a projector or a Head Mounted Display (hereinafter, also referred to as an “HMD”) including a display that is positioned in front of the eyes of the user when being worn on a head part of the user, a virtual object is enabled to be displayed while being superimposed on a real space.

[0003] In such an AR technique, the virtual object is displayed based on information of a real object present in the real space. For example, a virtual object corresponding to the information of the real object that is recognized based on an image taken by a camera is displayed to be superimposed on the recognized real object. The following Patent Literature 1 discloses a technique of determining a display region of a virtual object to be displayed on a display surface in accordance with information of a real object present on the display surface.

CITATION LIST

Patent Literature

[0004] Patent Literature 1: WO 2014/171200

SUMMARY

Technical Problem

[0005] In a case of displaying the virtual object based on the information of the real object as described above, a virtual object desirable for a user is not necessarily displayed.

[0006] Thus, the present disclosure provides a new and improved information processing device, information processing method, and computer program that enable a virtual object more desirable for a user to be displayed by controlling display based on a selection made by the user.

Solution to Problem

[0007] According to the present disclosure, an information processing device is provided that includes: a display control unit configured to control display so that, of a first real object and a second real object that are present in a real space and recognized as candidates for an object to be operated, in a case in which the first real object is selected by a user as the object to be operated, a first virtual object corresponding to the first real object is caused to be displayed at a first position in the real space corresponding to a position of the first real object based on the selection made by the user, and in a case in which the second real object is selected by the user as the object to be operated, a second virtual object corresponding to the second real object is caused to be displayed at a second position in the real space corresponding to a position of the second real object based on the selection made by the user.

[0008] Moreover, according to the present disclosure, an information processing method is provided that includes: controlling display by a processor so that, of a first real object and a second real object present in a real space that are recognized as candidates for an object to be operated, in a case in which the first real object is selected by a user as the object to be operated, a first virtual object corresponding to the first real object is displayed at a first position in the real space corresponding to a position of the first real object based on the selection made by the user, and in a case in which the second real object is selected by the user as the object to be operated, a second virtual object corresponding to the second real object is displayed at a second position in the real space corresponding to a position of the second real object based on the selection made by the user.

[0009] Moreover, according to the present disclosure, a computer program is provided that causes a computer to implement a function of controlling display so that, of a first real object and a second real object present in a real space that are recognized as candidates for an object to be operated, in a case in which the first real object is selected by a user as the object to be operated, a first virtual object corresponding to the first real object is displayed at a first position in the real space corresponding to a position of the first real object based on the selection made by the user, and in a case in which the second real object is selected by the user as the object to be operated, a second virtual object corresponding to the second real object is displayed at a second position in the real space corresponding to a position of the second real object based on the selection made by the user.

Advantageous Effects of Invention

[0010] As described above, according to the present disclosure, a virtual object more desirable for a user can be displayed by controlling display based on a selection made by the user.

[0011] The effects described above are not limiting; in addition to or in place of them, any of the effects disclosed herein, or another effect that may be grasped from the present description, may be exhibited.

BRIEF DESCRIPTION OF DRAWINGS

[0012] FIG. 1 is a diagram for explaining an outline of an information processing device 1 according to an embodiment of the present disclosure.

[0013] FIG. 2 is a block diagram illustrating a configuration example of the information processing device 1 according to the embodiment.

[0014] FIG. 3 is a flowchart illustrating a processing procedure performed by the information processing device 1 according to the embodiment.

[0015] FIG. 4 is an explanatory diagram for explaining an example of a specific operation of the information processing device 1 according to the embodiment.

[0016] FIG. 5 is an explanatory diagram illustrating a hardware configuration example.

DESCRIPTION OF EMBODIMENTS

[0017] The following describes a preferred embodiment of the present disclosure in detail with reference to the attached drawings. In the present description and the drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numeral, and redundant description will not be repeated.

[0018] In the present description and drawings, a plurality of constituent elements having substantially the same functional configuration are distinguished from each other by appending different letters to the same reference numeral in some cases. However, in a case in which constituent elements having substantially the same functional configuration need not be distinguished from each other, only the same reference numeral is given.

[0019] The description will be made in the following order.

[0020] 1. Outline

[0021] 2. Configuration

[0022] 3. Operation

[0023] 3-1. Processing procedure

[0024] 3-2. Specific example

[0025] 4. Modification

[0026] 4-1. First modification

[0027] 4-2. Second modification

[0028] 4-3. Third modification

[0029] 5. Hardware configuration example

[0030] 6. Conclusion

1. OUTLINE

[0031] First, the following describes an outline of an information processing device according to an embodiment of the present disclosure. FIG. 1 is a diagram for explaining an outline of the information processing device 1 according to the embodiment. As illustrated in FIG. 1, the information processing device 1 according to the embodiment is implemented by, for example, a spectacle-type Head Mounted Display (HMD) worn on the head of a user U. The display units 13, corresponding to the spectacle lens portions positioned in front of the eyes of the user U when the device is worn, may be of a transmissive type or a non-transmissive type. The information processing device 1 can present a virtual object ahead of the line of sight of the user U by displaying the virtual object on the display units 13. The HMD, as an example of the information processing device 1, does not necessarily present an image to both eyes, and may present an image to only one eye. For example, the HMD may be a monocular type in which the display unit 13 presents an image to one eye.

[0032] The information processing device 1 includes an outward camera 110 that, when the device is worn, images the direction of the line of sight of the user U, that is, the outward direction. Additionally, although not illustrated in FIG. 1, the information processing device 1 also includes various sensors, such as an inward camera that images the eyes of the user U when the device is worn, and a microphone (hereinafter referred to as a “mic”). A plurality of outward cameras 110 and inward cameras may be disposed. In a case in which a plurality of outward cameras 110 are disposed, a depth image (distance image) can be obtained based on parallax information, and the surrounding environment can be sensed three-dimensionally. Even in a case in which only one outward camera 110 is used, depth information (distance information) can be estimated from a plurality of images.
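The parallax-based depth sensing mentioned above can be sketched as simple triangulation over a rectified stereo pair. This is a minimal illustration only; the focal length, baseline, and disparity values below are invented for the example, not device specifications from the disclosure.

```python
# Minimal sketch of depth estimation from parallax with two outward
# cameras, assuming a rectified stereo pair. The focal length, baseline,
# and disparity values below are illustrative, not device specifications.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulate distance Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 6 cm baseline, 35 px measured disparity.
distance = depth_from_disparity(700.0, 0.06, 35.0)
print(f"estimated distance: {distance:.2f} m")  # estimated distance: 1.20 m
```

Note that a nearer object produces a larger disparity, so estimated depth decreases as disparity grows.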

[0033] The shape of the information processing device 1 is not limited to the example illustrated in FIG. 1. For example, the information processing device 1 may be a headband-type HMD (worn with a band wound around the entire circumference of the head; in some cases, a band may also pass over the top of the head in addition to the temporal regions) or a helmet-type HMD (in which the visor portion of the helmet serves as the display). The information processing device 1 may also be implemented by a wearable device of a wristband type (for example, a smart watch, with or without a display), a headphone type (without a display), a neckphone type (a neck-hanging type, with or without a display), or the like.

[0034] An operation input for a wearable device worn by the user, such as the information processing device 1 according to the embodiment, may be performed based on a movement or voice of the user sensed by a sensor such as the camera described above. For example, an operation input using a virtual object, such as a gesture of touching a virtual object displayed on the display unit 13, is conceivable. However, because the virtual object is not real, it has been difficult for the user to make such an operation input as intuitively as an operation input performed with a real controller, for example.

[0035] Thus, the information processing device 1 according to the embodiment receives an operation input using a real object present in the real space. For example, the information processing device 1 according to the embodiment may receive, as the operation input, movement of the real object, rotation of the real object, or touching of the real object performed by the user. With this configuration, an operation input more intuitive for the user than the operation input using the virtual object may be implemented. In the following description, the real object used for the operation input in the embodiment may be referred to as an object to be operated in some cases.
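As a rough illustration of this idea, the sketch below maps sensed manipulations of the object to be operated (movement, rotation, touch) to application commands. The event names and the command mapping are assumptions invented for illustration; the disclosure does not define such an interface.

```python
# Hedged sketch: mapping sensed manipulations of the object to be
# operated to operation inputs. The event names and commands here are
# invented for illustration, not defined by the disclosure.

from enum import Enum, auto

class Manipulation(Enum):
    MOVE = auto()    # the user moves the real object
    ROTATE = auto()  # the user rotates the real object
    TOUCH = auto()   # the user touches the real object

# Hypothetical mapping from manipulations to application commands.
COMMANDS = {
    Manipulation.MOVE: "scroll",
    Manipulation.ROTATE: "adjust_volume",
    Manipulation.TOUCH: "select",
}

def handle_manipulation(event: Manipulation) -> str:
    """Translate a sensed manipulation into an operation input."""
    return COMMANDS.get(event, "ignore")

print(handle_manipulation(Manipulation.ROTATE))  # adjust_volume
```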

[0036] In the embodiment, the object to be operated is not limited to a dedicated controller prepared in advance or a specific real object determined in advance, and may be any of various real objects present in the real space. For example, the object to be operated according to the embodiment may be any nearby real object, such as a writing tool, a can, a book, a clock, or an eating utensil. With this configuration, convenience for the user is improved.

[0037] As described above, the object to be operated is not limited to a dedicated controller prepared in advance or a specific real object determined in advance, so it is desirable to notify the user of which of the real objects present nearby is the object to be operated. Thus, the information processing device 1 according to the embodiment may display a virtual object indicating that a real object is an object to be operated that can receive the operation input from the user (an example of information about the operation input using the object to be operated). The virtual object is displayed at a position corresponding to the position of the object to be operated; for example, it may be superimposed on the object to be operated or displayed in its vicinity.

[0038] In this case, when a real object is automatically assigned as the object to be operated, a real object among those present nearby that does not suit the user’s preference (for example, one on which the operation input is difficult to perform) may be assigned. Moreover, if all real objects that are present nearby and can be utilized as objects to be operated are assumed to be objects to be operated, and virtual objects corresponding to them are displayed, a virtual object not desirable for the user may be displayed. Specifically, if a virtual object corresponding to a real object other than the object to be operated that the user actually uses for the operation input is kept displayed, it may obstruct the user’s operation input.

[0039] Thus, the information processing device 1 according to the embodiment performs assignment of the object to be operated and display of the virtual object based on the selection made by the user, thereby implementing assignment of an object to be operated more preferred by the user and display of a virtual object desired by the user. Specifically, based on the selection made by the user, the information processing device 1 specifies the object to be operated from among the real objects that are present in the real space and recognized as candidates for the object to be operated, and causes the virtual object corresponding to the specified object to be operated to be displayed.
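The selection-driven display control described above might be sketched as follows. The class names, the `controls:` label, and the position handling are assumptions made for illustration only; only the overall flow (candidates → user selection → virtual object at the corresponding position) comes from the disclosure.

```python
# Illustrative sketch of selection-based display control: among recognized
# candidate real objects, only the one the user selects becomes the object
# to be operated, and a virtual object is placed at a position
# corresponding to it. The names and fields here are assumptions.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RealObject:
    name: str
    position: Tuple[float, float, float]  # (x, y, z) in the real space

@dataclass
class VirtualObject:
    label: str
    position: Tuple[float, float, float]

def display_for_selection(candidates: List[RealObject],
                          selected_name: str) -> Optional[VirtualObject]:
    """Return the virtual object to display for the user's selection."""
    for obj in candidates:
        if obj.name == selected_name:
            # Display at (or near) the selected real object's position.
            return VirtualObject(label=f"controls:{obj.name}",
                                 position=obj.position)
    return None  # nothing selected yet, so nothing is displayed

candidates = [RealObject("can", (0.2, 0.0, 0.5)),
              RealObject("book", (-0.1, 0.0, 0.6))]
shown = display_for_selection(candidates, "book")
print(shown.label, shown.position)  # controls:book (-0.1, 0.0, 0.6)
```

If the user later selects the other candidate, the same routine yields a virtual object at that object's position instead, which matches the first-object/second-object behavior recited in the claims.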

2.* CONFIGURATION*

[0040] The outline of the information processing device 1 according to the embodiment has been described above. Subsequently, the following describes a configuration of the information processing device 1 according to the embodiment with reference to FIG. 2. FIG. 2 is a block diagram illustrating a configuration example of the information processing device 1 according to the embodiment. As illustrated in FIG. 2, the information processing device 1 includes a sensor unit 11, a control unit 12, a display unit 13, a speaker 14, a communication unit 15, an operation input unit 16, and a storage unit 17.

[0041] Sensor Unit 11

[0042] The sensor unit 11 has a function of acquiring various kinds of information about the user or the peripheral environment. For example, the sensor unit 11 includes the outward camera 110, an inward camera 111, a mic 112, a gyro sensor 113, an acceleration sensor 114, an azimuth sensor 115, a position measuring unit 116, and a biosensor 117. The specific configuration of the sensor unit 11 described here is merely an example, and the embodiment is not limited thereto. A plurality of each type of sensor may be disposed.

[0043] Each of the outward camera 110 and the inward camera 111 includes a lens system including an imaging lens, a diaphragm, a zoom lens, a focus lens, and the like; a driving system that causes the lens system to perform focus and zoom operations; a solid-state imaging element array that photoelectrically converts the imaging light obtained by the lens system to generate an imaging signal; and the like. The solid-state imaging element array may be implemented by, for example, a Charge Coupled Device (CCD) sensor array or a Complementary Metal Oxide Semiconductor (CMOS) sensor array.

[0044] The mic 112 collects the user’s voice and surrounding environmental sound, and outputs them to the control unit 12 as voice data.

[0045] The gyro sensor 113 is implemented by a triaxial gyro sensor, for example, and detects angular velocity (rotational speed).
