
Sony Patent | Information processing device, drawing control method, and recording medium recording program of the same

Patent: Information processing device, drawing control method, and recording medium recording program of the same

Publication Number: 20210327101

Publication Date: 20211021

Applicant: Sony

Assignee: Sony Corporation

Abstract

Deterioration in usability due to a delay is improved. An information processing device includes a control unit (10) that controls drawing of a picture displayed on a real object according to delay information based on a result of displaying of a picture on the real object.

Claims

  1. An information processing device comprising a control unit that controls drawing of a picture displayed on a real object according to delay information based on a result of displaying of a picture on the real object.

  2. The information processing device according to claim 1, wherein the control unit measures the delay information on the basis of a captured image in which the real object and the picture are imaged.

  3. The information processing device according to claim 1, wherein the control unit measures the delay information on the basis of a first captured image in which the real object and a first picture are imaged, and a second captured image in which the real object and a second picture different from the first picture are imaged.

  4. The information processing device according to claim 3, wherein the control unit identifies a first distance between a first figure of the real object in the first captured image and a second figure of the first picture in the first captured image, identifies a second distance, for which the real object moves while the picture displayed on the real object is switched from the first picture to the second picture, on the basis of the first figure and a second figure of the real object in the second captured image, and calculates the delay information on the basis of the first distance and the second distance.

  5. The information processing device according to claim 2, further comprising a delay measurement camera that acquires the captured image in which the real object and the picture are imaged.

  6. The information processing device according to claim 1, wherein the control unit measures the delay information on the basis of a time difference between a start or stop of movement of the real object and a display of the picture on the real object.

  7. The information processing device according to claim 6, further comprising: a first sensor that is provided in the real object and that detects the start or stop of the movement of the real object; a second sensor that detects that the picture is displayed on the real object; and a measurement unit that measures elapsed time from when the first sensor detects the start or stop of the movement of the real object until the second sensor detects that the picture is displayed on the real object, wherein the control unit sets, as the delay information, the elapsed time measured by the measurement unit.

  8. The information processing device according to claim 7, wherein the first sensor is an inertial measurement unit (IMU) sensor, and the second sensor is an optical sensor.

  9. The information processing device according to claim 1, further comprising a detection unit that detects a position of the real object, wherein the control unit controls a position of drawing a picture, which is superimposed on the real object in and after a next frame, on the basis of the position of the real object which position is detected by the detection unit, and the delay information.

  10. The information processing device according to claim 9, wherein the control unit predicts a position, in which a picture superimposed on the real object in and after a next frame is drawn, on the basis of the position of the real object which position is detected by the detection unit and the delay information, and draws the picture in and after the next frame in the predicted position.

  11. The information processing device according to claim 10, wherein the detection unit includes an imaging unit that images the real object, and the control unit detects a position of the real object on the basis of an image acquired by the imaging unit.

  12. The information processing device according to claim 11, wherein the detection unit further includes a reflection marker that is provided on the real object and that reflects light of a specific wavelength, and a light source that projects the light of the specific wavelength onto the real object, and the imaging unit detects the light of the specific wavelength which light is reflected by the reflection marker.

  13. The information processing device according to claim 12, wherein the light of the specific wavelength is infrared light.

  14. The information processing device according to claim 1, further comprising an output unit that outputs a picture drawn by the control unit.

  15. The information processing device according to claim 5, further comprising an output unit that outputs a picture drawn by the control unit, wherein a first frame rate at which the delay measurement camera acquires a captured image is equivalent to or higher than a second frame rate at which the output unit outputs the picture.

  16. The information processing device according to claim 15, wherein the first frame rate is a multiple of the second frame rate.

  17. The information processing device according to claim 14, wherein the output unit is a projector or a display.

  18. The information processing device according to claim 14, further comprising a plurality of the output units, wherein the control unit controls drawing of a picture, which is output from each of the plurality of output units, on the basis of a display position of a picture displayed on the real object by each of the plurality of output units and a position of the real object.

  19. A drawing control method comprising controlling drawing of a picture displayed on a real object according to delay information based on a result of displaying of a picture on the real object.

  20. A recording medium recording a program for causing a computer to execute a step of controlling drawing of a picture displayed on a real object according to delay information based on a result of displaying of a picture on the real object.

Description

FIELD

[0001] The present disclosure relates to an information processing device, a drawing control method, and a recording medium on which a program thereof is recorded.

BACKGROUND

[0002] Recently, a touch panel on which an input can be performed with a finger or a pen, and an interactive projector on which an input can be performed with a pen-type device, have been productized. Also, productization and research of glasses-type augmented reality (AR) devices that can superimpose a virtual object on the real world are being actively conducted.

CITATION LIST

Patent Literature

[0003] Patent Literature 1: Japanese Laid-open Patent Publication No. 2016-151612 A

SUMMARY

Technical Problem

[0004] In such a device that superimposes a picture on a real object, the processing time from detection of the object or of an input by the user until superimposition of the picture appears as a delay. When the delay is too large, a deviation of the superimposition position becomes noticeable. As a result, the value of the experience is impaired and usability deteriorates.

[0005] Thus, the present disclosure proposes an information processing device and a drawing control method that can improve deterioration in usability due to a delay, and a recording medium on which a program thereof is recorded.

Solution to Problem

[0006] To solve the above-described problem, an information processing device according to one aspect of the present disclosure comprises a control unit that controls drawing of a picture displayed on a real object according to delay information based on a result of displaying of a picture on the real object.

[0007] (Action) According to the information processing device of one aspect of the present disclosure, drawing of the picture projected in the next frame is controlled on the basis of delay information indicating the amount of delay actually generated in one frame. Accordingly, even in a case where the delay information changes due to a change in the system configuration or a change in the processing time of an application, it becomes possible to dynamically change the prediction amount and compensate for the delay. As a result, since the positional deviation between the picture projected in the next frame and the real object is decreased, it is possible to improve deterioration in usability due to a delay.

Advantageous Effects of Invention

[0008] According to the present disclosure, it becomes possible to improve deterioration in usability due to a delay. Note that the effect described here is not necessarily limited thereto, and may be any of the effects described in the present disclosure.

BRIEF DESCRIPTION OF DRAWINGS

[0009] FIG. 1 is a timing chart illustrating a flow of a series of processing to superimpose a picture on a real object.

[0010] FIG. 2 is a view illustrating an example of a case where delay time changes due to an increase in time required for imaging.

[0011] FIG. 3 is a view illustrating an example of a case where delay time changes due to an increase in time required for recognition.

[0012] FIG. 4 is a view illustrating an example of a case where delay time changes due to an increase in time required for drawing.

[0013] FIG. 5 is a schematic diagram illustrating a schematic configuration example of a projection system according to a first embodiment.

[0014] FIG. 6 is a functional block diagram illustrating the schematic configuration example of the projection system according to the first embodiment.

[0015] FIG. 7 is a flowchart illustrating a schematic example of a projection operation executed by the projection system according to the first embodiment.

[0016] FIG. 8 is a flowchart illustrating a schematic example of a total delay time measurement operation executed by the projection system according to the first embodiment.

[0017] FIG. 9 is a view illustrating an example of a processing flow of when “detection” of a position of an object is performed, the flow being according to the first embodiment.

[0018] FIG. 10 is a view illustrating an example of a processing flow of when a prediction point is calculated, the flow being according to the first embodiment.

[0019] FIG. 11 is a view illustrating a flow of measuring total delay time, the flow being according to the first embodiment.

[0020] FIG. 12 is a view for describing an example of timing, at which the total delay time is reflected, according to the first embodiment.

[0021] FIG. 13 is a view for describing another example of timing, at which total delay time is reflected, according to the first embodiment.

[0022] FIG. 14 is a view illustrating a schematic configuration example of a projection system according to a second embodiment.

[0023] FIG. 15 is a functional block diagram illustrating the schematic configuration example of the projection system according to the second embodiment.

[0024] FIG. 16 is a flowchart illustrating a schematic example of a total delay time measurement operation according to the second embodiment.

[0025] FIG. 17 is a view for describing a flow of measuring total delay time, the flow being according to the second embodiment.

[0026] FIG. 18 is a sequence diagram illustrating an operation example of a modification example of an output device according to the first or second embodiment.

[0027] FIG. 19 is a schematic diagram illustrating an example of a case where the projection system according to the first embodiment is applied to AR glasses.

[0028] FIG. 20 is a schematic diagram illustrating an example of a case where the projection system according to the second embodiment is applied to a head-mounted display-type VR device.

[0029] FIG. 21 is a schematic diagram illustrating an example of a case where the first or second embodiment is applied to a configuration in which a picture is displayed on an object placed on a display.

[0030] FIG. 22 is a schematic diagram illustrating an example of a case where the first or second embodiment is applied to an interactive projector.

[0031] FIG. 23 is a block diagram illustrating an example of a hardware configuration of an information processing device according to the first or second embodiment.

DESCRIPTION OF EMBODIMENTS

[0032] Hereinafter, an embodiment of the present disclosure will be described in detail on the basis of the drawings. Note that in the following embodiment, redundant description is omitted by assigning the same reference sign to identical parts.

[0033] Also, the present disclosure will be described in the following order of items.

[0034] 1. Introduction

[0035] 2. First embodiment

[0036] 2.1 Schematic configuration example of projection system

[0037] 2.2 Operation example of projection system

[0038] 2.2.1 Projection operation

[0039] 2.2.2 Total delay time measurement operation

[0040] 2.2.2.1 Object position detection processing

[0041] 2.2.2.2 Prediction point calculation processing

[0042] 2.2.2.3 Measurement of total delay time

[0043] 2.2.2.4 Calculation of prediction amount

[0044] 2.3 Reflection timing of prediction amount setting value

[0045] 2.4 Action/effect

[0046] 3. Second embodiment

[0047] 3.1 Schematic configuration example of projection system

[0048] 3.2 Operation example of projection system

[0049] 3.2.1 Total delay time measurement operation

[0050] 3.2.1.1 Measurement of total delay time

[0051] 3.3 Action/effect

[0052] 4. Modification example

[0053] 4.1 Modification example related to prediction amount calculation

[0054] 4.2 Modification example related to object position detection

[0055] 4.3 Modification example related to output device

[0056] 5. Application example

[0057] 5.1 AR glasses

[0058] 5.2 Virtual reality (VR) device/video see-through device

[0059] 5.3 Display

[0060] 5.4 Interactive projector

[0061] 5.5 Other application examples

[0062] 6. Hardware configuration

1. INTRODUCTION

[0063] FIG. 1 is a timing chart illustrating a flow of a series of processing to superimpose an image or a moving image (hereinafter, referred to as a picture) on a real object. As illustrated in FIG. 1, to superimpose a picture on a real object, a series of processing, namely "imaging" S1 of the real object, "recognition" S2 of the real object by analysis of the captured image, "drawing" S3 of the superimposed picture, and "output" S4 of the drawn picture, needs to be performed within one frame. However, the delay information indicating the amount of delay, such as the delay time generated between the "imaging" S1 and the "output" S4, varies when the imaging device, projection device, or display is changed, or when the processing load during execution of an application changes. FIG. 2 to FIG. 4 are views illustrating examples in which the delay time from "imaging" to "output" changes: FIG. 2 illustrates a case where the time required for the imaging S1 increases (imaging S1'), FIG. 3 a case where the time required for the recognition S2 increases (recognition S2'), and FIG. 4 a case where the time required for the drawing S3 increases (drawing S3').

[0064] In such a manner, the factors that change the delay information are (1) a change in imaging time (see FIG. 2), (2) a change in recognition processing time (see FIG. 3), (3) a change in drawing processing time (see FIG. 4), (4) a change in output time, and the like. The following items are examples of the factors that cause each change.

[0065] (1) Factor to Change Imaging Time

[0066] Change of an imaging device

[0067] Change in a camera frame rate

[0068] Change in exposure time or shutter speed

[0069] (2) Factor to Change Recognition Processing Time

[0070] Increase in a processing cost required for recognition due to an increase in the number of times of detection/tracking of objects (hereinafter, referred to as recognition cost)

[0071] (3) Factor to Change Drawing Processing Time

[0072] Increase in a processing cost required for drawing due to an increase in the number of computer graphics (CG) objects to be drawn (hereinafter, referred to as drawing cost)

[0073] (4) Factor to Change Output Time

[0074] Change of a projector to a projector with a different delay due to internal processing (for example, a case where a low image-quality or low-resolution projector is changed to a high image-quality or super-resolution projector)

[0075] Change to a projector or display with a different refresh rate (for example, a case where a projector with a frame rate of 60 Hertz (Hz) (display interval=16.7 ms) is changed to a projector with a frame rate of 120 Hz (display interval=8.3 ms))

[0076] As described above, factors that change the delay information are conceivable in each of the elements "imaging" S1, "recognition" S2, "drawing" S3, and "output" S4. In such a situation, if the "drawing" S3 is executed with the delay information treated as a fixed numerical value, the prediction amount (such as prediction time) for eliminating a deviation of the superimposition position (hereinafter, simply referred to as a positional deviation) may be insufficient, so that the picture is displayed behind the position of the real object, or may be too large, so that the picture is displayed ahead of the position of the real object.

[0077] Thus, in the following embodiment, a mechanism for measuring the delay generated between the "imaging" S1 and the "output" S4 (hereinafter, referred to as the total delay) is introduced into the system, and the prediction amount is dynamically changed on the basis of delay information such as the total delay time. This makes it possible to improve deterioration in usability due to a delay even in a case where the delay information changes.
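As a rough illustration of this feedback loop, the following Python sketch shows one imaging-recognition-drawing-output cycle in which the prediction amount used for the next "drawing" is replaced by the most recently measured total delay. It is a minimal sketch, not the patented implementation; all the callables (capture, recognize, draw, output, measure_total_delay) are hypothetical stand-ins for the blocks S1 to S4 and the measurement mechanism described above.

```python
# Minimal sketch of the dynamic-prediction idea described above, not the
# patented implementation. The prediction amount used when drawing the next
# frame is continuously replaced by the most recently measured total delay.
class PredictionAmountStore:
    """Holds the latest prediction amount setting value in milliseconds."""

    def __init__(self, initial_ms: float = 0.0):
        self.value_ms = initial_ms

    def update(self, measured_total_delay_ms: float) -> None:
        # Reflect the measured total delay on the prediction amount so that
        # the next "drawing" predicts that far into the future.
        self.value_ms = measured_total_delay_ms


def run_one_frame(store, capture, recognize, draw, output, measure_total_delay):
    """One imaging -> recognition -> drawing -> output cycle (S1 to S4).

    `capture`, `recognize`, `draw`, `output` and `measure_total_delay` are
    hypothetical callables standing in for the blocks described in the text.
    """
    image = capture()                           # "imaging" S1
    position = recognize(image)                 # "recognition" S2
    picture = draw(position, store.value_ms)    # "drawing" S3 uses the prediction amount
    output(picture)                             # "output" S4

    delay_ms = measure_total_delay()            # measured from the displayed result
    if delay_ms is not None:                    # measurement may fail (e.g. timeout)
        store.update(delay_ms)
```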

2. FIRST EMBODIMENT

[0078] First, the first embodiment will be described in detail with reference to the drawings.

[0079] 2.1 Schematic Configuration Example of Projection System

[0080] FIG. 5 is a schematic diagram illustrating a schematic configuration example of a projection system according to the present embodiment. FIG. 6 is a functional block diagram illustrating the schematic configuration example of the projection system according to the present embodiment.

[0081] As illustrated in FIG. 5 and FIG. 6, a projection system 1 includes an information processing device 10, a sensor 20, and an output device (output unit) 30. In addition, as a part of a configuration to “recognize” a real object on which a picture is projected (hereinafter, referred to as object), the projection system 1 also includes a retroreflective marker 42 provided on an object 40. That is, an infrared projector (light source) 33, the retroreflective marker 42, and an infrared camera (imaging unit) 22 function as detection units to detect a position of the object 40.

[0082] The sensor 20 includes the infrared camera 22, which recognizes the object 40 and detects its position, and a delay measurement camera 21. In the present embodiment, a camera that cuts visible light and observes only infrared light is used as the infrared camera 22. However, in a case where the object 40 is recognized from a color or a feature in a captured image, a color camera or a grayscale camera may be used.

[0083] The delay measurement camera 21 is a camera for measuring the total delay time, as delay information, from the positional deviation between the object 40 and a picture, and may be, for example, a visible-light camera that acquires an image in the visible light region. Also, the frame rate of the delay measurement camera 21 may be, for example, equivalent to or higher than the frame rate of the infrared camera 22 or of the projector 31 (described later). At that time, when the frame rate of the delay measurement camera 21 is set to a multiple (including 1) of the frame rate of the projector 31, the time difference from the timing at which the projector 31 starts or completes the output of a picture until the delay measurement camera 21 starts or completes imaging of the output picture can be made constant. Thus, it becomes possible to improve the measurement accuracy of the total delay time (described later). Note that the present description illustrates a case where the total delay time is used as the delay information indicating the amount of delay generated between the "imaging" S1 and the "output" S4. However, the delay information is not limited to time information; any information that expresses a delay as a processable value, such as distance information or a count value, can be used.
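The benefit of making the delay measurement camera's frame rate an integer multiple of the projector's frame rate can be checked with a small piece of arithmetic. The sketch below is illustrative only (the function name and frame counts are assumptions); it computes the offset from each projector frame start to the next camera capture start and shows that the offset stays constant only when the camera rate is a multiple.

```python
from fractions import Fraction

def capture_offsets_ms(projector_fps: int, camera_fps: int, n_frames: int = 5):
    """Offset (ms) from each projector frame start to the next camera frame start."""
    proj_period = Fraction(1000, projector_fps)
    cam_period = Fraction(1000, camera_fps)
    offsets = []
    for k in range(n_frames):
        t_out = k * proj_period                                # projector frame start
        next_capture = -(-t_out // cam_period) * cam_period    # ceiling to the next camera frame
        offsets.append(float(next_capture - t_out))
    return offsets

print(capture_offsets_ms(60, 120))  # 120 Hz is a multiple of 60 Hz -> constant offset
print(capture_offsets_ms(60, 100))  # 100 Hz is not -> the offset drifts from frame to frame
```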

[0084] The output device 30 includes the projector 31 to project a picture, and an infrared projector 33 to project light of a specific wavelength (for example, infrared light in the present embodiment) onto the object 40. Also, the output device 30 may include a speaker 32 or the like to output sound effects or the like.

[0085] The projector 31 is not necessarily a single projector; there may be a plurality of projectors 31. In the present embodiment, a general speaker is assumed as the speaker 32, but an ultrasonic speaker having high directivity, or the like, may be used. Also, a fixed projection-type projector is assumed in the present embodiment; however, the projector 31 may be configured to project a picture in an arbitrary direction or place by providing a drive or movement mechanism in the output device 30.

[0086] Moreover, in the present embodiment, a display device such as a display may be used instead of the projector 31 or together with the projector 31. In a case where a display is used, an object 40 placed on the display may be detected and a visual expression or effect may be displayed at or around its position. That is, the "superimposition of a picture" in the present description includes not only projection of a picture onto the object 40 but also display of a picture at or around a position corresponding to the object 40.

[0087] In the present embodiment, the object 40 is, for example, a real object that can slide on a table 50, such as a puck of an air-hockey game machine. However, the object 40 is not limited to this, and any real object that can move on a plane or in space can be used. Also, in a case where the projector 31 is movable, a fixed object can serve as the object 40. That is, any real object whose positional relationship with the device that projects a picture can change can be set as the object 40.

[0088] The retroreflective marker 42 for detecting the position of the object 40 is fixed to the object 40. The retroreflective marker 42 reflects light of a specific wavelength (infrared light in the present example) projected from the infrared projector 33. Note that in the present embodiment, the infrared light projected from the infrared projector 33 and reflected by the retroreflective marker 42 is detected in order to detect the position of the object 40. However, a light-emitting unit that emits light of a specific wavelength (such as an infrared light-emitting diode (LED)) may be mounted on the object 40 instead.

[0089] Alternatively, in a case where a color camera is used instead of the infrared camera 22, the position of the object 40 can be detected by extracting a color marker provided on the object 40 with the color camera, or by extracting a feature of the object 40 from the captured image.

[0090] The information processing device 10 may be, for example, a personal computer (PC) that includes an information processing unit such as a central processing unit (CPU) as a control unit. However, this is not a limitation, and various electronic devices capable of information processing, such as a server (including a cloud server), can be used.

[0091] On the basis of information input from the sensor 20, the information processing device 10 generates picture and sound data to be projected on the object 40 and outputs the generated picture data and sound data to the output device 30.

[0092] Thus, as illustrated in FIG. 6, the information processing device 10 includes a total delay time measurement unit 11, a prediction amount determination unit 12, a prediction amount storage unit 13, an object position detection unit 14, an object position prediction unit 15, a picture data generation unit 16, a sound data generation unit 17, and an interface (I/F) unit 18. At least a part of the total delay time measurement unit 11, the prediction amount determination unit 12, the prediction amount storage unit 13, the object position detection unit 14, the object position prediction unit 15, the picture data generation unit 16, and the sound data generation unit 17 may be realized when the CPU (control unit) in the information processing device 10 reads a predetermined program from a recording unit (not illustrated) and executes it.

[0093] The total delay time measurement unit 11 fires a measurement starting event and a measurement ending event on the basis of images acquired from the delay measurement camera 21, and measures the total delay time from the images at those times. Note that the measurement starting event and the measurement ending event will be described later.

[0094] When the total delay time measurement unit 11 succeeds in measuring the total delay time, the prediction amount determination unit 12 calculates a prediction amount from the total delay time. Then, the prediction amount determination unit 12 updates a prediction amount setting value stored in the prediction amount storage unit 13 with the calculated prediction amount.

[0095] Note that the prediction amount and the prediction amount setting value represent a future prediction time (msec); they are time information for reducing a positional deviation between the object 40 and the projected picture, and are values on which the measured total delay time is reflected. For example, in a case where the prediction amount setting value is set to zero regardless of the existence or non-existence of a positional deviation, the position of the object 40 in the next frame is predicted on the basis of the positions of the object 40 detected up to the current frame. Thus, when the picture to be superimposed on the object 40 is "drawn" and "output" with respect to this predicted position in the next frame, a positional deviation corresponding to the total delay time generated between the "imaging" and "output" in one frame is generated between the object 40 and the picture. Note that one frame indicates the period or time from the "imaging" to the "output". Thus, as the prediction amount setting value, time information to reduce the positional deviation between the object 40 and the projected picture is determined on the basis of the total delay time when the position in which to draw the picture to be superimposed on the object 40 in the "drawing" of the next frame (that is, the predicted position of the object 40) is determined. For example, in a case where the picture is projected at a position delayed from the object 40 in the current frame, positive time information corresponding to the amount of the positional deviation is set as the prediction amount setting value in order to project the picture at a position further ahead. On the other hand, in a case where the picture is projected at a position ahead of the object 40 in the current frame, negative time information corresponding to the amount of the positional deviation is set as the prediction amount setting value in order to delay the projection position of the picture.
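As a concrete illustration of the sign convention described above, the following sketch converts an observed positional deviation into a signed prediction amount in milliseconds: a picture lagging behind the object yields a positive value, and a picture ahead of it a negative one. The function and its arguments (a signed deviation along the direction of motion and the object's speed) are assumptions for illustration, not part of the patent text.

```python
# Minimal sketch (not the patent's algorithm) of deriving a signed prediction
# amount in milliseconds from an observed positional deviation. `deviation_px`
# is signed along the object's direction of motion (positive = picture lagging
# behind the object); `speed_px_per_ms` is the object's current speed. Both
# names are hypothetical.
def prediction_amount_ms(deviation_px: float, speed_px_per_ms: float) -> float:
    if speed_px_per_ms <= 0.0:
        return 0.0  # object not moving: no basis for a time estimate
    return deviation_px / speed_px_per_ms

# Example: the picture trails the object by 12 px while the object moves at
# 0.4 px/ms -> set the prediction amount to +30 ms.
print(prediction_amount_ms(12.0, 0.4))
```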

[0096] The object position detection unit 14 detects, from an image captured by the infrared camera 22, a position of the retroreflective marker 42 on the object 40 as coordinates. Then, the object position detection unit 14 converts the detected coordinates from a coordinate system of the infrared camera 22 (hereinafter, referred to as camera coordinate system) to a coordinate system of the projector 31 (hereinafter, referred to as projector display coordinate system) by using a projection matrix.
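A minimal sketch of this coordinate conversion is shown below, assuming the camera-to-projector projection matrix is a 3x3 homography H obtained beforehand by calibration (the text only states that a projection matrix is used; the function name and the use of a homography are assumptions).

```python
import numpy as np

# Minimal sketch of the camera-to-projector conversion, assuming the projection
# matrix is a 3x3 homography H obtained beforehand by calibration (for example
# with cv2.findHomography on corresponding points). The function name is an
# assumption for illustration.
def camera_to_projector(point_xy, H):
    """Map a point from the camera coordinate system to the projector display coordinate system."""
    x, y = point_xy
    p = H @ np.array([x, y, 1.0])       # homogeneous multiplication
    return p[0] / p[2], p[1] / p[2]     # perspective divide

# Usage with a placeholder identity matrix standing in for a calibrated H.
H = np.eye(3)
print(camera_to_projector((120.5, 88.0), H))
```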

[0097] For example, the prediction amount storage unit 13 stores, as the prediction amount setting value, the latest prediction amount calculated by the prediction amount determination unit 12. Also, the prediction amount storage unit 13 stores a history of the coordinates of the object 40 detected in the past by the object position detection unit 14 (hereinafter, referred to as a detection history).

[0098] The object position prediction unit 15 predicts a future position of the object 40 (hereinafter, referred to as a prediction point) by using the position (coordinates) of the object 40 detected this time and the history of positions (coordinates) of the object 40 detected in the past (the detection history). At that time, the object position prediction unit 15 calculates the prediction point with reference to the latest prediction amount setting value stored in the prediction amount storage unit 13. Note that the prediction point may be, for example, the position at which the picture to be superimposed on the object 40 is "drawn" in and after the next frame.

[0099] After the prediction point is calculated, picture data and sound data of the picture to be projected are respectively generated by the picture data generation unit 16 and the sound data generation unit 17. The generated picture data and sound data are transmitted and output to the projector 31 and the speaker 32 via the I/F unit 18.

[0100] 2.2 Operation Example of Projection System

[0101] Next, an operation of the projection system 1 according to the present embodiment will be described in detail with reference to the drawings.

[0102] 2.2.1 Projection Operation

[0103] FIG. 7 is a flowchart illustrating a schematic example of a projection operation executed by the projection system according to the present embodiment. As illustrated in FIG. 7, in this operation, first, infrared light is projected by the infrared projector 33 and “imaging” is executed by the infrared camera 22 in order to detect a position of an object 40 (Step S110). An image acquired in this manner is transmitted to the information processing device 10 and input to the object position detection unit 14 via the I/F unit 18.

[0104] Next, Steps S121 to S123 corresponding to "recognition" of the object 40 are executed. In Step S121, the object position detection unit 14 executes object position detection processing to detect the position of the object 40 included in the image by analyzing the input image. More specifically, the object position detection unit 14 detects a figure of the retroreflective marker 42 in the image.

[0105] In a case where the object position detection unit 14 fails to detect the position of the object 40 (NO in Step S122), this operation returns to Step S110. On the other hand, in a case where the object position detection unit 14 succeeds in detecting the position of the object 40 (YES in Step S122), the object position prediction unit 15 executes prediction point calculation processing to calculate a future prediction point (such as the position of the object 40 at the next projection timing) by using the result of detection by the object position detection unit 14 this time, the history of positions (coordinates) of the object 40 detected in the past (the detection history), and the latest prediction amount setting value stored in the prediction amount storage unit 13 (Step S123). Note that the results of the object position detection processing executed in the past may be, for example, the results for an immediately preceding predetermined number of times (for example, three times).

[0106] After the future prediction point is calculated in such a manner, the picture data generation unit 16 then performs “drawing” of data of a picture to be projected from the projector 31 onto the object 40 (picture data) (Step S130). At that time, when necessary, the sound data generation unit 17 may generate data of sound to be output from the speaker 32 (sound data).

[0107] Next, the picture data generated by the picture data generation unit 16 is transmitted to the projector 31 and the projector 31 reproduces and “outputs” the picture data, whereby a picture is projected onto the object 40 (Step S140).

[0108] Subsequently, it is determined whether to end this operation (Step S150). In a case where it is determined to end the operation (YES in Step S150), this operation is ended. On the other hand, in a case where it is determined not to end the operation (NO in Step S150), this operation returns to Step S110.

[0109] 2.2.2 Total Delay Time Measurement Operation

[0110] Next, the total delay time measurement operation according to the present embodiment will be described in detail with reference to the drawings. FIG. 8 is a flowchart illustrating a schematic example of the total delay time measurement operation executed by the projection system according to the present embodiment. As illustrated in FIG. 8, in this operation, the total delay time measurement unit 11 first waits until an event for starting measurement of the total delay time (hereinafter, referred to as a measurement starting event) fires (NO in Step S201), and starts the measurement of the total delay time (Step S202) in a case where the measurement starting event fires (YES in Step S201). At that time, the total delay time measurement unit 11 measures the elapsed time from the start of the measurement by using, for example, a measurement unit (not illustrated) implemented in software or hardware, such as a counter that counts clocks. Note that the measurement starting event may fire, for example, when the projection position of the picture projected from the projector 31, detected by analyzing an image acquired by the delay measurement camera 21, changes from the projection position detected in the previous image analysis, on the condition that measurement of the total delay time has not yet started.

[0111] Next, the total delay time measurement unit 11 determines whether a certain time (such as 50 milliseconds (ms)) or more has elapsed from the start of the measurement of the total delay time (Step S203). In a case where the certain time or more has elapsed (YES in Step S203), the total delay time measurement unit 11 determines that the measurement of the total delay time has failed, resets the measurement time measured by the measurement unit (not illustrated) (Step S204), and proceeds to Step S209.

[0112] On the other hand, in a case where the certain time has not yet elapsed (NO in Step S203), the total delay time measurement unit 11 determines whether an event for ending the measurement of the total delay time (hereinafter, referred to as a measurement ending event) fires (Step S205). In a case where the measurement ending event does not fire (NO in Step S205), the total delay time measurement unit 11 returns to Step S203 and executes the operations in and after that step. Note that the measurement ending event may fire, for example, when the projection position of the picture projected from the projector 31, detected by analyzing an image acquired by the delay measurement camera 21, changes from the projection position detected in the previous image analysis, on the condition that measurement of the total delay time has already started.

[0113] In a case where the measurement ending event fires (YES in Step S205), the total delay time measurement unit 11 ends the measurement of the total delay time (Step S206). Subsequently, the total delay time measurement unit 11 calculates the total delay time from the image at the time when the measurement starting event was determined to have fired and the image at the time when the measurement ending event was determined to have fired (Step S207). Note that the measurement of the total delay time in Steps S202 to S207 will be described later in detail.

[0114] When the total delay time is measured in such a manner, the prediction amount determination unit 12 then calculates a prediction amount from the measured total delay time (Step S208). Subsequently, the prediction amount determination unit 12 updates the prediction amount setting value in the prediction amount storage unit 13 with the calculated prediction amount (Step S209). As a result, the prediction amount setting value in the prediction amount storage unit 13 is updated to the latest value.

[0115] Subsequently, it is determined in Step S210 whether to end this operation. In a case where it is determined to end the operation (YES in Step S210), this operation is ended. On the other hand, in a case where it is determined not to end the operation (NO in Step S210), this operation returns to Step S201 and the operations in and after that step are executed.
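A simplified sketch of the measurement loop of FIG. 8 (Steps S201 to S207, without the surrounding prediction amount update) is given below. The callables starting_event_fired, ending_event_fired, and compute_total_delay_ms are hypothetical stand-ins for the event detection and the delay calculation described above; the 50 ms timeout follows the example value mentioned in the text.

```python
import time

# Simplified sketch of the measurement loop of FIG. 8. The callables
# `starting_event_fired`, `ending_event_fired` and `compute_total_delay_ms`
# are hypothetical stand-ins for the event detection and the calculation in
# Step S207; the 50 ms timeout follows the example value in the text.
TIMEOUT_MS = 50.0

def measure_total_delay_once(starting_event_fired, ending_event_fired, compute_total_delay_ms):
    """Return the measured total delay in ms, or None if the measurement failed."""
    while not starting_event_fired():            # S201: wait for the measurement starting event
        time.sleep(0.001)
    start = time.perf_counter()                  # S202: start measuring elapsed time
    while True:
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        if elapsed_ms >= TIMEOUT_MS:             # S203/S204: timed out -> measurement failed
            return None
        if ending_event_fired():                 # S205/S206: measurement ending event fired
            return compute_total_delay_ms()      # S207: total delay from the two images
        time.sleep(0.001)
```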

[0116] 2.2.2.1 Object Position Detection Processing

[0117] Here, the object position detection processing illustrated in Step S121 in FIG. 7 will be described. The detection of the position of the object 40 includes, for example, "detection", which identifies the position of the object 40 by "recognizing" the image of the current frame, and "tracking", which identifies the position of the object 40 by "recognizing" the image of the current frame on the basis of the position of the object 40 identified in the previous frame.

[0118] More specifically, in the “detection”, for example, a position of a bright spot corresponding to a figure of the retroreflective marker 42 that is in an image photographed by the infrared camera 22 is identified, and coordinates thereof are set as coordinates of the object 40.

[0119] FIG. 9 is a view illustrating an example of a processing flow of when "detection" of the position of the object 40 is performed. As illustrated in FIG. 9, in the "detection" of the position of the object 40, first, an image G1, acquired by cutting visible light and observing only the figure of infrared light, is input from the infrared camera 22 (Step S161). In this image G1, for example, a bright spot K1 corresponding to the figure of the retroreflective marker 42 is expressed substantially in gray scale. The object position detection unit 14 performs binarization processing on the input image G1 by using a previously set threshold (Step S162). As a result, for example, a binary image G2 is acquired in which each pixel in a region K2 corresponding to the retroreflective marker 42 has a bit value of 1 and pixels in the other regions have a bit value of 0. Subsequently, the object position detection unit 14 extracts a contour K3 of the region K2 from the binary image G2 (Step S163), then calculates barycentric coordinates K4 of the region corresponding to the retroreflective marker 42 from the extracted contour K3, and sets the calculated barycentric coordinates K4 as the position of the object 40 (Step S164).
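For illustration, Steps S162 to S164 (binarization, contour extraction, and barycenter calculation) can be written with OpenCV roughly as follows. This is a generic sketch assuming OpenCV 4 and a grayscale infrared image; the threshold value and the choice of the largest blob are assumptions, since the text only says that a previously set threshold is used.

```python
import cv2
import numpy as np

# Rough OpenCV (version 4) sketch of Steps S162 to S164: binarize the infrared
# image, extract a contour, and take its barycenter as the object position.
# The threshold value and the choice of the largest blob are assumptions.
def detect_marker_position(gray_ir_image: np.ndarray, threshold: int = 200):
    """Return the (x, y) barycenter of the marker blob, or None if not found."""
    _, binary = cv2.threshold(gray_ir_image, threshold, 255, cv2.THRESH_BINARY)          # S162
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)   # S163
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)   # assume the marker is the largest bright blob
    m = cv2.moments(largest)                       # S164: barycentric coordinates K4
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```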

[0120] Note that a case where the grayscale image G1 is used is illustrated above, but this is not a limitation. As described above, in a case where a color camera is used instead of the infrared camera 22, for example, various modifications may be made: a color marker provided on the object 40 may be detected with the color camera, or the position of the object 40 may be detected by capturing a feature thereof, such as an edge or a feature amount, from an image captured by the color camera.

[0121] On the other hand, in the "tracking", when the contour K3 is extracted in Step S163 in FIG. 9, a region in the vicinity of the barycentric coordinates K4 of the object 40 detected in the image of the previous frame is searched in the binary image G2 of the current frame, whereby the bright spot K1 corresponding to the figure of the retroreflective marker 42 is detected. Alternatively, a motion vector is calculated from the history of the barycentric coordinates K4 over the last several frames and the binary image G2 of the current frame is searched in the direction of that vector, whereby the bright spot K1 corresponding to the figure of the retroreflective marker 42 is detected. Then, similarly to FIG. 9, the contour K3 of the region K2 corresponding to the retroreflective marker 42 is extracted from the detected bright spot K1 (Step S163), the barycentric coordinates K4 of the region K2 are then calculated from the extracted contour K3, and the calculated barycentric coordinates K4 are set as the position of the object 40 (Step S164).
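A sketch of the "tracking" variant, reusing the detect_marker_position sketch above, could restrict the search to a window around the centroid found in the previous frame and fall back to full-frame "detection" when nothing is found there. The window size is an assumption, not a value taken from the patent.

```python
# Rough sketch of the "tracking" variant: search only a window around the
# centroid found in the previous frame, reusing detect_marker_position from the
# sketch above, and fall back to full-frame "detection" on a miss. The window
# size is an assumption.
def track_marker_position(gray_ir_image, prev_xy, window: int = 64, threshold: int = 200):
    h, w = gray_ir_image.shape[:2]
    px, py = int(prev_xy[0]), int(prev_xy[1])
    x0, y0 = max(px - window, 0), max(py - window, 0)
    x1, y1 = min(px + window, w), min(py + window, h)
    hit = detect_marker_position(gray_ir_image[y0:y1, x0:x1], threshold)
    if hit is None:
        return detect_marker_position(gray_ir_image, threshold)   # fall back to "detection"
    return hit[0] + x0, hit[1] + y0    # map ROI coordinates back to full-image coordinates
```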

[0122] 2.2.2.2 Prediction Point Calculation Processing

[0123] Next, the prediction point calculation processing illustrated in Step S123 in FIG. 7 will be described. As described above, in the prediction point calculation processing, a prediction point is calculated from the position (coordinates) of the object 40 detected this time, the history of positions (coordinates) of the object 40 detected in the past (the detection history), and the latest prediction amount setting value stored in the prediction amount storage unit 13. An example of a processing flow of when a prediction point is calculated is illustrated in FIG. 10. Note that FIG. 10 illustrates a case of calculating, as a prediction point, the position of the object 40 that is n (n is a natural number) frames ahead (position Q at timing t+1 in FIG. 10) by reflecting the change in acceleration a (accelerations a1 to a3 in FIG. 10) on the prediction point, from the history of coordinates of the object 40 detected by the object position detection unit 14 (the detection history). In this case, n prediction points can be acquired. Note that in FIG. 10, ΔT indicates the time for one frame, and ΔV0 to ΔV2 indicate the moving speed (vector or scalar) of the object 40 in each frame.
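A generic constant-acceleration extrapolation consistent with FIG. 10 can be sketched as follows: velocity and acceleration are estimated from the last three detected positions and the position n frames ahead is extrapolated. This stands in for the prediction described in the text; the variable names are hypothetical.

```python
# Generic constant-acceleration extrapolation consistent with FIG. 10: velocity
# and acceleration are estimated from the last three detected positions and the
# position n frames ahead is extrapolated. Variable names are hypothetical.
def predict_position(history, n: int, dt: float):
    """history: last three (x, y) positions, oldest first; dt: seconds per frame (1 / F)."""
    (x0, y0), (x1, y1), (x2, y2) = history[-3:]
    vx1, vy1 = (x1 - x0) / dt, (y1 - y0) / dt     # speed between the two older frames
    vx2, vy2 = (x2 - x1) / dt, (y2 - y1) / dt     # speed between the two newer frames
    ax, ay = (vx2 - vx1) / dt, (vy2 - vy1) / dt   # acceleration estimate
    t = n * dt                                    # prediction horizon
    return (x2 + vx2 * t + 0.5 * ax * t * t,
            y2 + vy2 * t + 0.5 * ay * t * t)

# Example: three detections at 60 fps, moving right and accelerating slightly.
print(predict_position([(0.0, 0.0), (2.0, 0.0), (4.5, 0.0)], n=2, dt=1 / 60))
```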

[0124] As described above, the prediction amount setting value p represents a future prediction time (msec). Thus, in the calculation of a prediction point, it is necessary to determine the number of points to be predicted. Here, assuming that the imaging frame rate of the infrared camera 22 is F (frames per second (fps)), the prediction time p' (msec) when prediction is performed n' points ahead is calculated by the following equation (1).

p' = (1000 / F) × n'   (1)

[0125] Thus, in the present embodiment, n' is incremented by 1 starting from 1, and the value of n' at which p' first exceeds the prediction amount setting value p is adopted as n; the corresponding point Qn of the frame that is n frames ahead is used as the prediction point.
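Equation (1) and the increment rule can be sketched as a small function: n' is increased from 1 until p' = (1000 / F) × n' first exceeds the prediction amount setting value p (the function name is an assumption).

```python
# Small sketch of equation (1) and the increment rule: n' is increased from 1
# until p' = (1000 / F) * n' first exceeds the prediction amount setting value
# p. The function name is an assumption.
def frames_ahead(p_ms: float, camera_fps: float) -> int:
    n = 1
    while 1000.0 / camera_fps * n <= p_ms:   # p' = (1000 / F) * n'
        n += 1
    return n

# Example: a 60 fps camera (about 16.7 ms per frame) and p = 40 ms give n = 3.
print(frames_ahead(40.0, 60.0))
```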

[0126] 2.2.2.3 Measurement of Total Delay Time

……
……
……
