Patent: Information Processing Apparatus And Method, Display Control Apparatus And Method, Reproducing Apparatus And Method, And Information Processing System

Publication Number: 20200120386

Publication Date: 20200416

Applicants: Sony

Abstract

The present technology relates to an information processing apparatus and method, a display control apparatus and method, a reproducing apparatus and method, a program, and an information processing system that transmit viewers' responses, acquired in a more natural way, to the place where content is captured and present them in an easy-to-see way. The information processing apparatus of one aspect of the present technology receives motion information indicating motions of users watching video content and information indicating attributes of the users, and generates an excitement image by arranging information that visually indicates a degree of excitement of each user, determined on the basis of the motion information transmitted from a plurality of reproducing apparatuses, at a position according to the attribute of each user. The present technology is applicable to a server that processes information transmitted from the reproducing apparatuses of content distributed in real time via a network.

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] The present application is a continuation application of U.S. patent application Ser. No. 15/323,289, filed Dec. 30, 2016, which is a National Stage of PCT/JP2015/069382, filed Jul. 6, 2015, and claims the benefit of priority from prior Japanese Patent Application JP 2014-147597, filed Jul. 18, 2014, the entire content of which is hereby incorporated by reference.

TECHNICAL FIELD

[0002] The present technology relates to an information processing apparatus and method, a display control apparatus and method, a reproducing apparatus and method, a program, and an information processing system, and in particular to an information processing apparatus and method, a display control apparatus and method, a reproducing apparatus and method, a program, and an information processing system that transmit viewers' responses, acquired in a more natural way, to the place where the content is captured and present them in an easy-to-see way.

BACKGROUND ART

[0003] In recent years, various services to distribute moving pictures of concerts or sport games in real time have been provided. Viewers can watch content of the moving pictures distributed from a server by using their terminals such as PCs and smartphones.

[0004] There is also a technology to transmit a reaction of viewers who are watching content via the Internet to a venue and to present the reaction to spectators or the like who are in the venue.

CITATION LIST

Patent Document

[0005] Patent Document 1: Japanese Patent Application Laid-Open No. 2005-339479

[0006] Patent Document 2: Japanese Patent Application Laid-Open No. 2011-182109

SUMMARY OF THE INVENTION

Problems to be Solved by the Invention

[0007] With the above-described technology, viewers of the content need to take active operations, such as pushing buttons displayed on the terminals used for watching, in order to transmit their reactions to the venue. Sending a reaction therefore interrupts the viewers' immersion in the content.

[0008] The present technology has been made in view of such a situation and aims to transmit viewers' responses, acquired in a more natural way, to the place where the content is captured, and to present them in an easy-to-see way.

Solutions to Problems

[0009] An information processing apparatus according to a first aspect of the present technology includes: a receiving unit configured to receive motion information indicating a motion of a user who is watching video content and information indicating an attribute of the user, transmitted from a reproducing apparatus that receives and reproduces real-time video content while switching the display range, within the range of the entire captured video, following the motion of the user who is a viewer; a generation unit configured to generate an excitement image by arranging information that visually indicates a degree of excitement of each of the users, determined on the basis of the motion information transmitted from a plurality of reproducing apparatuses, at a position according to the attribute of each of the users; and a transmitting unit configured to transmit data of the excitement image to a display control apparatus that causes a display apparatus installed in the space where the video of the video content is captured to display the excitement image.
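
As a rough illustration of the generation unit described above, the following hypothetical Python sketch aggregates per-user motion information into an excitement score and places each score in a 2-D "excitement image" grid at a position derived from a user attribute (here, an assumed "seat block" attribute). The names, grid size, and scoring heuristic are illustrative assumptions, not the implementation disclosed in the patent.

```python
# Hypothetical sketch of the "generation unit": motion info -> excitement image.
# The attribute-to-position mapping and the scoring heuristic are assumptions.
import numpy as np

GRID_W, GRID_H = 64, 16  # resolution of the excitement image (assumed)

def excitement_score(motion_samples):
    """Estimate a degree of excitement from head-motion samples.

    motion_samples: list of (yaw_deg, pitch_deg) per frame. The mean absolute
    frame-to-frame change is used as a proxy for how much the viewer is
    moving (an assumed heuristic), normalized to [0, 1].
    """
    if len(motion_samples) < 2:
        return 0.0
    diffs = [abs(a[0] - b[0]) + abs(a[1] - b[1])
             for a, b in zip(motion_samples[1:], motion_samples[:-1])]
    return min(sum(diffs) / len(diffs) / 30.0, 1.0)

def generate_excitement_image(reports):
    """reports: list of dicts like
    {"attribute": {"block_x": int, "block_y": int}, "motion": [(yaw, pitch), ...]}
    Returns a GRID_H x GRID_W float image holding per-cell mean excitement."""
    acc = np.zeros((GRID_H, GRID_W))
    cnt = np.zeros((GRID_H, GRID_W))
    for r in reports:
        x = r["attribute"]["block_x"] % GRID_W
        y = r["attribute"]["block_y"] % GRID_H
        acc[y, x] += excitement_score(r["motion"])
        cnt[y, x] += 1
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)

if __name__ == "__main__":
    demo = [{"attribute": {"block_x": 3, "block_y": 2},
             "motion": [(0, 0), (15, 5), (40, 10)]},
            {"attribute": {"block_x": 3, "block_y": 2},
             "motion": [(0, 0), (1, 0), (2, 1)]}]
    print(generate_excitement_image(demo)[2, 3])  # mean excitement of one cell
```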

[0010] A display control apparatus according to a second aspect of the present technology includes: a receiving unit configured to receive motion information indicating a motion of a user who is watching video content and information indicating an attribute of the user, transmitted from a reproducing apparatus that receives and reproduces real-time video content while switching the display range, within the range of the entire captured video, following the motion of the user who is a viewer, and to receive data of an excitement image transmitted from an information processing apparatus that generates the excitement image by arranging information that visually indicates a degree of excitement of each of the users, determined on the basis of the motion information transmitted from a plurality of reproducing apparatuses, at a position according to the attribute of each of the users; and a display control unit configured to cause a display apparatus installed in the space where the video of the video content is captured to display the excitement image on the basis of the data of the excitement image.
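
As an illustrative complement on the display control side, the hypothetical sketch below turns received excitement values into a simple blue-to-red heat-map bitmap that could be handed to whatever renders on the venue's large display. The color ramp and image handling are assumed choices for illustration, not the patent's method.

```python
# Hypothetical sketch of the display control side: excitement values -> RGB
# heat-map pixels. The blue-to-red ramp is an assumed visualization choice.
import numpy as np

def colorize(excitement):
    """excitement: 2-D float array in [0, 1].
    Returns an (H, W, 3) uint8 array: low values blue, high values red."""
    e = np.clip(np.asarray(excitement, dtype=float), 0.0, 1.0)
    rgb = np.zeros(e.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (e * 255).astype(np.uint8)          # red grows with excitement
    rgb[..., 2] = ((1.0 - e) * 255).astype(np.uint8)  # blue fades out
    return rgb

def upscale(rgb, factor):
    """Nearest-neighbour upscale so each grid cell becomes a visible block."""
    return np.repeat(np.repeat(rgb, factor, axis=0), factor, axis=1)

if __name__ == "__main__":
    grid = np.array([[0.1, 0.9], [0.5, 0.0]])
    image = upscale(colorize(grid), 100)  # 200 x 200 pixels for a 2 x 2 grid
    print(image.shape, image[0, 150])     # top-right block is mostly red
```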

[0011] A reproducing apparatus according to a third aspect of the present technology includes: a receiving unit configured to receive real-time video content whose display range is switched, within the range of the entire captured video, following a motion of a user who is a viewer; a reproducing unit configured to reproduce the video content; a detection unit configured to detect the motion of the user; a display unit configured to display the video of the display range according to the motion of the user; and a transmitting unit configured to transmit motion information indicating the motion of the user who is watching the video content, together with information indicating an attribute of the user, to an information processing apparatus that aggregates the motion information.
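
To make the transmitting unit of the reproducing apparatus concrete, here is a minimal, hypothetical sketch that packages the latest head-motion samples together with the user attribute and posts them to an aggregation endpoint. The endpoint URL, JSON schema, and field names are assumptions used only for illustration.

```python
# Hypothetical sketch of the reproducing apparatus's transmitting unit.
# The endpoint URL and JSON schema are illustrative assumptions.
import json
import time
import urllib.request

AGGREGATION_URL = "http://aggregation.example/api/motion"  # assumed endpoint

def build_report(user_attribute, motion_samples):
    """Package motion information and the user attribute into one report."""
    return {
        "timestamp": time.time(),
        "attribute": user_attribute,   # e.g. {"region": "JP", "age_group": "20s"}
        "motion": motion_samples,      # e.g. [(yaw_deg, pitch_deg), ...]
    }

def send_report(report):
    """Transmit the report to the aggregation server (fire-and-forget sketch)."""
    data = json.dumps(report).encode("utf-8")
    req = urllib.request.Request(
        AGGREGATION_URL, data=data,
        headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=2)
    except OSError:
        pass  # in this sketch, reports the server did not receive are dropped

if __name__ == "__main__":
    report = build_report({"region": "JP", "age_group": "20s"},
                          [(0.0, 0.0), (12.5, 3.0)])
    print(json.dumps(report, indent=2))  # send_report(report) would POST it
```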

Effects of the Invention

[0012] The present technology enables viewers' responses, acquired in a more natural way, to be transmitted to the place where the content is captured and to be presented there in an easy-to-see way.

[0013] It is to be noted that the effects described here are not necessarily limiting, and the effect may be any of the effects described in the present disclosure.

BRIEF DESCRIPTION OF DRAWINGS

[0014] FIG. 1 is a diagram illustrating a configuration example of an information processing system according to one embodiment of the present technology.

[0015] FIG. 2 is a plan view illustrating an example of an event venue.

[0016] FIG. 3 is a diagram illustrating an example of an angle of view of a video.

[0017] FIG. 4 is a diagram illustrating another example of the angle of view of the video.

[0018] FIGS. 5A, 5B, and 5C are diagrams illustrating an example of the video.

[0019] FIG. 6 is a diagram illustrating an example of a visual field.

[0020] FIG. 7 is a diagram illustrating an example of a heat map image.

[0021] FIG. 8 is a block diagram illustrating a configuration example of an HMD.

[0022] FIG. 9 is a block diagram illustrating a functional configuration example of a control unit.

[0023] FIG. 10 is a diagram illustrating an example of information included in user attribute information.

[0024] FIG. 11 is a block diagram illustrating a configuration example of an aggregation server.

[0025] FIG. 12 is a block diagram illustrating a functional configuration example of the aggregation server.

[0026] FIG. 13 is a block diagram illustrating a functional configuration example of a distribution server.

[0027] FIG. 14 is a block diagram illustrating a configuration example of a display control apparatus.

[0028] FIG. 15 is a block diagram illustrating a functional configuration example of the display control apparatus.

[0029] FIG. 16 is a flowchart illustrating content distribution processing of the distribution server.

[0030] FIG. 17 is a flowchart illustrating reproduction processing of a client terminal.

[0031] FIG. 18 is a flowchart illustrating excitement aggregation processing of the aggregation server.

[0032] FIG. 19 is a flowchart illustrating heat map image display processing of the display control apparatus.

[0033] FIG. 20 is a flowchart illustrating the heat map image display processing of the client terminal.

[0034] FIG. 21 is a diagram illustrating a display example of a display.

[0035] FIG. 22 is a diagram illustrating another example of the heat map image.

[0036] FIG. 23 is a diagram illustrating an example of arrangement of information indicating a degree of excitement.

[0037] FIG. 24 is a diagram illustrating an example of switching of display of the heat map image.

[0038] FIGS. 25A, 25B, and 25C are diagrams illustrating an example of a form of the HMD.

MODE FOR CARRYING OUT THE INVENTION

[0039] Hereinafter, a form for implementing the present technology will be described. The description is made in the following order.

1. Configuration of information processing system
2. Configuration of each device
3. Operation of each device
4. Example of heat map image
5. Variations

[0040] FIG. 1 is a diagram illustrating a configuration example of an information processing system according to one embodiment of the present technology.

[0041] The information processing system of FIG. 1 is a system that distributes moving pictures obtained by capturing concerts, sport games, and the like in real time.

[0042] The information processing system of FIG. 1 includes apparatuses on an event venue side where a concert or the like is held, a distribution server 11, an aggregation server 12, and head mounted displays (HMDs) 21 to 24 used by viewers of live content.

[0043] While FIG. 1 illustrates four HMDs as client terminals, more client terminals may be provided. The distribution server 11 and the aggregation server 12 may also be installed in the event venue, and the functions of both servers may be implemented by a single server. The respective apparatuses are connected via a network such as the Internet.

[0044] A capturing control apparatus 1, a display control apparatus 2, and a large display 3 are provided in the event venue. In the example of FIG. 1, a concert is held in the event venue, and three performers and the spectators of the concert are in the venue. The large display 3 is installed at a position that both the performers and the spectators can see.

[0045] The capturing control apparatus 1 controls cameras installed in the event venue and controls capturing of the concert. The capturing control apparatus 1 transmits a captured moving picture to the distribution server 11 over the network.

[0046] The distribution server 11 generates live content for distribution on the basis of the moving picture transmitted from the capturing control apparatus 1, and then transmits the live content to the HMDs 21 to 24. The live content transmitted from the distribution server 11 is, for example, content including videos captured at a plurality of positions, each with an angle of view of 360 degrees in at least one of the horizontal and vertical directions.

[0047] For example, a user of the HMD 21 who receives the live content transmitted from the distribution server 11 can select a visual point and watch the video from the selected visual point while changing the visual field. The live content transmitted from the distribution server 11 is so-called free-visual-point content, whose visual point can be changed.

[0048] Each of the HMDs 21 to 24 is equipped with sensors for head tracking, such as an acceleration sensor and an angular velocity sensor. Each of the HMDs 21 to 24 detects the posture of the head of the user wearing it, and switches the display range of the video according to the direction of the line of sight estimated from the posture of the head. Of the entire 360-degree angle of view, the user sees the range of the video that lies in the direction in which the user faces.
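
One conventional way to realize the head tracking described here is to integrate the angular velocity sensor and correct the accumulating drift with the gravity direction measured by the acceleration sensor (a complementary filter). The sketch below is a hypothetical, single-axis illustration of such a filter, not the HMDs' actual implementation; the sensor values, blend factor, and sample rate are assumed.

```python
# Hypothetical complementary-filter sketch for head tracking (pitch axis only).
# Sensor values, the blend factor, and the sample rate are assumptions.
import math

ALPHA = 0.98  # how much to trust the integrated gyro over the accelerometer

def accel_pitch_deg(ax, ay, az):
    """Pitch estimated from the gravity vector measured by the accelerometer."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def update_pitch(prev_pitch_deg, gyro_pitch_rate_dps, accel_sample, dt):
    """Blend gyro integration (smooth, but drifts) with the accelerometer
    estimate (noisy, but drift-free)."""
    gyro_estimate = prev_pitch_deg + gyro_pitch_rate_dps * dt
    return ALPHA * gyro_estimate + (1.0 - ALPHA) * accel_pitch_deg(*accel_sample)

if __name__ == "__main__":
    pitch = 0.0
    # Simulated head held still at 30 degrees pitch, sampled at 100 Hz for
    # 3 seconds: the gyro reports only a 1 deg/s drift bias while the
    # accelerometer reads about 30 degrees, so the estimate settles near 30.
    for _ in range(300):
        pitch = update_pitch(pitch, 1.0, (-0.5, 0.0, 0.866), 0.01)
    print(round(pitch, 1))
```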

[0049] Here, “the visual point” is the standpoint of the user who sees an object. “The visual field” is the range the user sees, and corresponds to the range of the video displayed on a display (the display range). “The line of sight” is the direction of the user's visual field, and corresponds to the direction of the display range of the video relative to a predetermined reference direction in the capturing space.

[0050] FIG. 2 is a plan view illustrating an example of the event venue.

[0051] A stage #1 is provided in the event venue, and a spectator floor #2 is provided in front of the stage #1 (in the lower part of FIG. 2). Three singers, persons H1, H2, and H3, are on the stage #1. In addition, many spectators are on the spectator floor #2, behind which the large display 3 is installed. The subject of capturing is the scene of the entire event venue.

[0052] Positions P1 to P3 are the capturing positions of the videos. A camera capable of capturing a video with a 360-degree angle of view in at least one of the horizontal and vertical directions is installed at each of the positions P1 to P3. Alternatively, a 360-degree video for each of the positions P1 to P3 may be generated by installing a plurality of cameras with different capturing ranges at each capturing position and compositing the videos captured by those cameras.

[0053] Hollow arrows A1 to A3 indicate the reference directions at the positions P1 to P3, respectively. In the example of FIG. 2, the direction of the stage #1 is the reference direction.

[0054] FIG. 3 is a diagram illustrating an example of the angle of view of the videos to be captured at the positions P1 to P3.

[0055] For example, when a wide-angle lens pointed straight up is used for capturing at each of the positions P1 to P3, as illustrated in FIG. 3, a video of a half-celestial-sphere range is captured in which the optical axis L1, drawn with an alternate long and short dash line, passes through the zenith. The angle of view in FIG. 3 is 360 degrees in the horizontal direction and 180 degrees in the vertical direction.

[0056] As illustrated in FIG. 4, a video of an entire-celestial-sphere range, with an angle of view of 360 degrees in both the horizontal and vertical directions, may be captured at each of the positions P1 to P3. When represented in latitude and longitude using the equidistant cylindrical projection, the entire-celestial-sphere video can also be expressed as 360 degrees in the horizontal direction and 180 degrees in the vertical direction; here, to distinguish it from the half-celestial-sphere range of FIG. 3, the angle of view is described as 360 degrees in both directions.
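
To make the latitude/longitude representation concrete, the short sketch below converts a viewer's line of sight (yaw and pitch in degrees) into pixel coordinates of an equidistant cylindrical (equirectangular) frame and derives the rectangle of the display range centered there. The frame size and field of view are assumed values used only for illustration.

```python
# Hypothetical sketch: line of sight -> equirectangular pixel, and the
# display range (crop rectangle) centered on it. Sizes and FOV are assumptions.
FRAME_W, FRAME_H = 3840, 1920        # equirect frame covering 360 x 180 degrees
FOV_H_DEG, FOV_V_DEG = 100.0, 60.0   # assumed field of view of the HMD

def line_of_sight_to_pixel(yaw_deg, pitch_deg):
    """yaw measured from the reference direction, pitch in [-90, 90];
    returns (x, y) in the equirectangular frame."""
    x = (yaw_deg % 360.0) / 360.0 * FRAME_W
    y = (90.0 - pitch_deg) / 180.0 * FRAME_H
    return int(x) % FRAME_W, min(max(int(y), 0), FRAME_H - 1)

def display_range(yaw_deg, pitch_deg):
    """Crop rectangle (left, top, width, height); the horizontal crop may
    wrap around the 360-degree seam, which a real renderer must handle."""
    cx, cy = line_of_sight_to_pixel(yaw_deg, pitch_deg)
    w = int(FOV_H_DEG / 360.0 * FRAME_W)
    h = int(FOV_V_DEG / 180.0 * FRAME_H)
    return (cx - w // 2) % FRAME_W, max(cy - h // 2, 0), w, h

if __name__ == "__main__":
    # Looking 45 degrees to the right of the stage and slightly upward.
    print(display_range(45.0, 10.0))
```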

[0057] For convenience of description, the following describes a case where the video captured at each of the positions P1 to P3 is the video of the half celestial sphere range illustrated in FIG. 3.

[0058] FIGS. 5A, 5B, and 5C are diagrams illustrating examples of the videos captured at the positions P1 to P3, respectively.

[0059] The half celestial spheres of FIGS. 5A, 5B, and 5C indicate the entire videos of one frame of the moving pictures captured at the positions P1 to P3, respectively.
