Sony Patent | Calibration apparatus, calibration method, and program

Patent: Calibration apparatus, calibration method, and program

Publication Number: 20210033712

Publication Date: 2021-02-04

Applicant: Sony

Abstract

Information acquisition units 11-1 and 11-2 (11-2a) acquire peripheral object information, and information processing units 12-1 and 12-2 (12-2a) generate point cloud data relating to a feature point of a peripheral object on the basis of the peripheral object information. A weight setting unit 13 sets a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired. A calibration processing unit 15 uses the point cloud data, the weight, and external parameters stored in a parameter storage unit 14 to calculate new external parameters that minimize an error of the external parameters, on the basis of a cost indicating the error. A parameter update unit 16 updates the external parameters stored in the parameter storage unit 14 using the calculated new external parameters. Since highly accurate external parameters are stored in the parameter storage unit 14, the calibration can be performed stably.

Claims

  1. A calibration apparatus comprising a calibration processing unit that calculates parameters relating to positions and attitudes of a plurality of information acquisition units, using point cloud data relating to a feature point of a peripheral object generated on a basis of peripheral object information acquired by the plurality of information acquisition units, and a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

  2. The calibration apparatus according to claim 1, wherein the calibration processing unit calculates a cost indicating an error of the parameters, using the point cloud data acquired for the feature point by the plurality of information acquisition units, the weight, and the parameters stored in advance, and calculates parameters that minimize the error, on a basis of the calculated cost.

  3. The calibration apparatus according to claim 2, wherein the peripheral object information is acquired a plurality of times within a predetermined period.

  4. The calibration apparatus according to claim 3, wherein the calibration processing unit sets the weight according to a moving speed of a moving body provided with the plurality of information acquisition units for each acquisition of the peripheral object information, and reduces the weight as the moving speed increases.

  5. The calibration apparatus according to claim 3, wherein the calibration processing unit sets the weight according to a motion vector of the feature point, and reduces the weight as the motion vector increases.

  6. The calibration apparatus according to claim 3, wherein the predetermined period is a preset period from a start of movement of a moving body provided with the plurality of information acquisition units or a preset period until an end of movement of the moving body.

  7. The calibration apparatus according to claim 2, wherein the calibration processing unit sets the weight according to a distance from the plurality of information acquisition units to the feature point, and reduces the weight as the distance increases.

  8. The calibration apparatus according to claim 2, further comprising a parameter update unit that updates the stored parameters using parameters calculated by the calibration processing unit.

  9. The calibration apparatus according to claim 8, wherein the parameter update unit updates the parameters from when movement of a moving body provided with the plurality of information acquisition units is stopped or when movement of the moving body ends until when movement of the moving body starts next time.

  10. The calibration apparatus according to claim 1, wherein the plurality of information acquisition units each acquires at least a captured image of the peripheral object as the peripheral object information.

  11. The calibration apparatus according to claim 10, comprising, as the plurality of information acquisition units, an information acquisition unit that acquires a captured image of the peripheral object as the peripheral object information, and an information acquisition unit that quantifies a distance to each position of the peripheral object using a ranging sensor to treat a quantification result as the peripheral object information.

  12. The calibration apparatus according to claim 11, further comprising an information processing unit that performs a registration process on a quantification result for a distance to each position of the peripheral object acquired by the information acquisition unit, and generates point cloud data for each position of the peripheral object as point cloud data for each feature point.

  13. The calibration apparatus according to claim 10, comprising, as the plurality of information acquisition units, information acquisition units that each acquires a captured image of the peripheral object as the peripheral object information.

  14. The calibration apparatus according to claim 10, further comprising an information processing unit that performs feature point detection using a captured image of the peripheral object acquired by the information acquisition unit, and generates point cloud data for each feature point by a registration process for the detected feature point of the peripheral object.

  15. A calibration method comprising calculating, by a calibration processing unit, parameters relating to positions and attitudes of a plurality of information acquisition units, using point cloud data relating to a feature point of a peripheral object generated on a basis of peripheral object information acquired by the plurality of information acquisition units, and a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

  16. A program for performing calibration on a computer, the program causing the computer to execute: a procedure of acquiring point cloud data relating to a feature point of a peripheral object generated on a basis of peripheral object information acquired by a plurality of information acquisition units; and a procedure of calculating parameters relating to positions and attitudes of the plurality of information acquisition units, using a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

Description

TECHNICAL FIELD

[0001] This technology relates to a calibration apparatus, a calibration method, and a program, and allows calibration to be performed stably.

BACKGROUND ART

[0002] Conventionally, an object in a peripheral area is recognized using a ranging apparatus. For example, in Patent Document 1, a moving body is provided with a distance measuring sensor that measures a distance to a structure and a sensor position measuring apparatus that measures a three-dimensional position of the distance measuring sensor, and a three-dimensional position of the structure is calculated using a measurement result of the distance measuring sensor and a measurement result of the sensor position measuring apparatus. Furthermore, calibration is performed for the mounting position and mounting attitude of the distance measuring sensor.

CITATION LIST

Patent Document

[0003] Patent Document 1: Japanese Patent Application Laid-Open No. 2011-027598

SUMMARY OF THE INVENTION

Problems to be Solved by the Invention

[0004] Incidentally, the sensor used for recognizing an object in a peripheral area is not restricted to the distance measuring sensor described in Patent Document 1. For example, three-dimensional measurement is also performed on the basis of a captured image acquired by an imaging apparatus. In three-dimensional measurement based on captured images, for example, the principle of triangulation is applied to captured images acquired by two imaging apparatuses whose relative positions and attitudes are known. Furthermore, in order to enhance the reliability of three-dimensional measurement, a ranging apparatus is used in addition to the imaging apparatus. To perform three-dimensional measurement using a plurality of imaging apparatuses, or an imaging apparatus and a ranging apparatus, the relative positions and attitudes between the imaging apparatuses, or between the imaging apparatus and the ranging apparatus, need to be calibrated beforehand. However, in a case where the calibration is performed using point cloud data acquired by the ranging apparatus and point cloud data based on feature points detected from the captured image, the image of a foreground object may be blurred when a distant object is in focus, and the ranging accuracy of the ranging apparatus may deteriorate as the object becomes more distant, so the calibration cannot be performed stably. In addition, if the imaging apparatus and the ranging apparatus are not synchronized, the difference between the positions of observation points can increase as the moving speed becomes higher, and the calibration again cannot be performed stably.

[0005] Thus, an object of this technology is to provide a calibration apparatus, a calibration method, and a program capable of performing the calibration stably.

Solutions to Problems

[0006] A first aspect of this technology is

[0007] a calibration apparatus including

[0008] a calibration processing unit that calculates parameters relating to positions and attitudes of a plurality of information acquisition units, using point cloud data relating to a feature point of a peripheral object generated on the basis of peripheral object information acquired by the plurality of information acquisition units, and a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

[0009] In this technology, the plurality of information acquisition units acquires the peripheral object information a plurality of times in a predetermined period, for example, in a preset period from a start of movement of a moving body provided with the plurality of information acquisition units or a preset period until an end of movement of the moving body. Furthermore, the plurality of information acquisition units is configured to each acquire at least a captured image of the peripheral object as the peripheral object information. For example, the information acquisition units are constituted by a plurality of information acquisition units that each acquires a captured image of the peripheral object, or an information acquisition unit that acquires a captured image of the peripheral object and an information acquisition unit that quantifies a distance to each position of the peripheral object using a ranging sensor to treat a quantification result as the peripheral object information. An information processing unit performs a registration process on the quantification result for a distance to each position of the peripheral object acquired by the information acquisition unit, and generates point cloud data for each position of the peripheral object as point cloud data for each feature point. In addition, the information processing unit performs feature point detection using a captured image of the peripheral object acquired by the information acquisition unit, and generates point cloud data for each feature point by a registration process for the detected feature point of the peripheral object.
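
As a rough illustration of the image-side processing described above (the patent does not name a particular feature detector or registration algorithm; OpenCV's ORB detector and a brute-force matcher are used here purely as stand-ins), feature points can be detected from a captured image and associated between acquisitions as follows.

```python
# Hypothetical sketch of the image-side information processing unit.
# ORB and brute-force Hamming matching are illustrative choices only;
# the patent does not specify a detector or a registration method.
import cv2

def detect_feature_points(image_gray):
    """Detect feature points and descriptors in one captured image."""
    orb = cv2.ORB_create(nfeatures=500)
    return orb.detectAndCompute(image_gray, None)

def match_feature_points(desc_a, desc_b):
    """Associate feature points between two acquisitions (a simple registration step)."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)
```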

[0010] The calibration processing unit calculates new external parameters using the point cloud data relating to the feature point of the peripheral object, the weight relating to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired, and the parameters (external parameters) relating to positions and attitudes of the plurality of information acquisition units stored in advance. As the weight relating to a situation between the peripheral object and the information acquisition units, relative speed and distance between the peripheral object and the information acquisition units, and a motion vector of the feature point are used. The calibration processing unit sets the weight according to a moving speed of a moving body provided with the plurality of information acquisition units for each acquisition of the peripheral object information, and reduces the weight as the moving speed increases. Furthermore, the calibration processing unit sets the weight according to a distance between the peripheral object and each of the information acquisition units, and reduces the weight as the distance increases. Moreover, in setting the weight, the weight is set according to the motion vector of the feature point, and the weight is reduced as the motion vector increases. The calibration processing unit calculates a cost indicating an error of the parameters for each acquisition of the peripheral object information, using the weight, the point cloud data, and the parameters stored in advance, and calculates new parameters that minimize the error, on the basis of an accumulated value of the cost for each acquisition. Additionally, a parameter update unit updates the stored parameters to the parameters calculated by the calibration processing unit from when movement of a moving body provided with the plurality of information acquisition units is stopped or when movement of the moving body ends until when movement of the moving body starts next time.
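
As one hypothetical way to make this cost concrete (the patent text does not give an explicit formula), the accumulated weighted cost over the predetermined period could be written as

$$C(R, t) = \sum_{k} w_k \sum_{i} \bigl\| p^{A}_{k,i} - \bigl( R\, p^{B}_{k,i} + t \bigr) \bigr\|^{2},$$

where $R$ and $t$ are the rotation and translation represented by the external parameters, $p^{A}_{k,i}$ and $p^{B}_{k,i}$ are corresponding feature points obtained by the two information acquisition units at the $k$-th acquisition of the peripheral object information, and $w_k$ is the weight set for that acquisition. The new external parameters would then be the $R$ and $t$ that minimize $C$.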

[0011] A second aspect of this technology is

[0012] a calibration method including

[0013] calculating, by a calibration processing unit, parameters relating to positions and attitudes of a plurality of information acquisition units, using point cloud data relating to a feature point of a peripheral object generated on the basis of peripheral object information acquired by the plurality of information acquisition units, and a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

[0014] A third aspect of this technology is

[0015] a program for performing calibration on a computer,

[0016] the program causing the computer to execute:

[0017] a procedure of acquiring point cloud data relating to a feature point of a peripheral object generated on the basis of peripheral object information acquired by a plurality of information acquisition units; and

[0018] a procedure of calculating parameters relating to positions and attitudes of the plurality of information acquisition units, using a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired.

[0019] Note that the program according to the present technology can be provided, for example, to a general-purpose computer capable of executing a variety of program codes, by a storage medium or a communication medium that provides the program in a computer-readable format, for example, a storage medium such as an optical disc, a magnetic disk, or a semiconductor memory, or a communication medium such as a network. By providing such a program in a computer-readable format, a process according to the program is implemented on the computer.

Effects of the Invention

[0020] According to this technology, external parameters between a plurality of information acquisition units are calculated using point cloud data relating to a feature point of a peripheral object generated on the basis of peripheral object information acquired by the plurality of information acquisition units, and a weight according to a situation between the peripheral object and the information acquisition units when the peripheral object information was acquired. Consequently, the calibration can be performed stably. Note that the effects described in the present description are merely examples and are not to be construed as limiting. There may be additional effects as well.

BRIEF DESCRIPTION OF DRAWINGS

[0021] FIG. 1 is a diagram exemplifying a configuration of a calibration apparatus.

[0022] FIG. 2 is a diagram exemplifying a configuration of a first embodiment.

[0023] FIG. 3 is a diagram exemplifying a relationship between a speed and a weight.

[0024] FIG. 4 is a diagram exemplifying feature points.

[0025] FIG. 5 is a flowchart exemplifying working of the first embodiment.

[0026] FIG. 6 is a diagram illustrating a working example of the first embodiment.

[0027] FIG. 7 is a diagram exemplifying a configuration of a second embodiment.

[0028] FIG. 8 is a diagram exemplifying a relationship between a distance and a weight.

[0029] FIG. 9 is a flowchart exemplifying working of the second embodiment.

[0030] FIG. 10 is a diagram illustrating a working example of the second embodiment.

[0031] FIG. 11 is a diagram exemplifying a configuration of a third embodiment.

[0032] FIG. 12 is a diagram exemplifying a relationship between a magnitude of a motion vector and a weight.

[0033] FIG. 13 is a flowchart exemplifying working of the third embodiment.

[0034] FIG. 14 is a diagram exemplifying a configuration of a fourth embodiment.

[0035] FIG. 15 is a flowchart exemplifying working of the fourth embodiment.

[0036] FIG. 16 is a diagram exemplifying a configuration of a fifth embodiment.

[0037] FIG. 17 is a flowchart exemplifying working of the fifth embodiment.

[0038] FIG. 18 is a diagram exemplifying a configuration of a sixth embodiment.

[0039] FIG. 19 is a flowchart exemplifying working of the sixth embodiment.

[0040] FIG. 20 is a block diagram illustrating an example of a schematic configuration of a vehicle control system.

[0041] FIG. 21 is an explanatory diagram illustrating an example of installation positions of vehicle exterior information detecting parts and imaging units.

MODE FOR CARRYING OUT THE INVENTION

[0042] Hereinafter, modes for carrying out the present technology will be described. Note that the description will be given in the following order.

[0043] 1. Configuration of Calibration Apparatus

[0044] 2. First Embodiment

[0045] 3. Second Embodiment

[0046] 4. Third Embodiment

[0047] 5. Fourth Embodiment

[0048] 6. Fifth Embodiment

[0049] 7. Sixth Embodiment

[0050] 8. Other Embodiments

[0051] 9. Application Examples

  1. Configuration of Calibration Apparatus

[0052] FIG. 1 exemplifies a configuration of a calibration apparatus according to the present technology. The calibration apparatus 10 is configured using a plurality of information acquisition units 11-1 and 11-2 (11-2a) and information processing units 12-1 and 12-2 (12-2a), a weight setting unit 13, a parameter storage unit 14, a calibration processing unit 15, and a parameter update unit 16. Note that the calibration apparatus 10 is not restricted to a case where the blocks illustrated in FIG. 1 are provided as a unified body, but may have a configuration in which some blocks are provided separately.

[0053] The information acquisition units 11-1 and 11-2 (11-2a) acquire peripheral object information. The peripheral object information is information that enables the acquisition of information regarding a feature point of a peripheral object, and is, for example, a captured image in which a peripheral object is imaged, ranging data to each position of a peripheral object, or the like. The information processing unit 12-1 generates point cloud data of feature points in the peripheral object on the basis of the peripheral object information acquired by the information acquisition unit 11-1, and outputs the generated point cloud data to the calibration processing unit 15. Similarly, the information processing unit 12-2 (12-2a) generates point cloud data of feature points in the peripheral object on the basis of the peripheral object information acquired by the information acquisition unit 11-2 (11-2a), and outputs the generated point cloud data to the calibration processing unit 15.

[0054] The weight setting unit 13 sets a weight according to a situation between the peripheral object and the information acquisition units, which affects the accuracy of the calibration. The weight setting unit 13 outputs the set weight to the calibration processing unit 15.

[0055] The parameter storage unit 14 holds parameters (hereinafter, referred to as “external parameters”) relating to the positions and attitudes of the plurality of information acquisition units. The parameter storage unit 14 outputs the held external parameters to the calibration processing unit 15. Furthermore, in a case where external parameters are supplied from the parameter update unit 16, the parameter storage unit 14 updates the held external parameters to the external parameters supplied from the parameter update unit 16.

[0056] The calibration processing unit 15 calculates a cost according to an error of the external parameters on the basis of a cost function, using the point cloud data for a predetermined period supplied from the information processing units 12-1 and 12-2 (12-2a), the weight set by the weight setting unit 13, and the external parameters acquired from the parameter storage unit 14. Furthermore, the calibration processing unit 15 calculates new external parameters that minimize the accumulated value of the cost for the predetermined period, and outputs the calculated new external parameters to the parameter update unit 16.
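
A minimal sketch of this weighted cost minimization is given below, assuming the external parameters are represented as a rotation vector plus translation and that corresponding point pairs between the two information acquisition units have already been established. The function names, the residual form, and the use of SciPy are assumptions for illustration, not part of the patent.

```python
# Hypothetical sketch of the calibration processing unit's cost minimization.
# point_pairs: list of (pts_a, pts_b) arrays of corresponding 3D points, one
# pair of arrays per acquisition; weights: one weight per acquisition.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, point_pairs, weights):
    """params: 3 rotation-vector components followed by 3 translation components."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    t = params[3:]
    res = []
    for (pts_a, pts_b), w in zip(point_pairs, weights):
        diff = pts_a - (pts_b @ R.T + t)          # alignment error per point
        res.append(np.sqrt(w) * diff.ravel())     # weight scales the squared error
    return np.concatenate(res)

def calibrate(point_pairs, weights, initial_params):
    """Return external parameters minimizing the accumulated weighted cost."""
    result = least_squares(residuals, initial_params, args=(point_pairs, weights))
    return result.x
```

Calling calibrate() with the point pairs and weights gathered over the predetermined period would yield the new external parameters, corresponding to what the parameter update unit 16 writes back to the parameter storage unit 14.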

[0057] The parameter update unit 16 outputs the new external parameters calculated by the calibration processing unit 15 to the parameter storage unit 14, such that the parameter storage unit 14 holds the external parameters that allow the calibration to be performed stably.

  2. First Embodiment

[0058] Next, a first embodiment will be described. FIG. 2 exemplifies a configuration of the first embodiment. In the first embodiment, two information acquisition units 11-1 and 11-2 are used. The information acquisition unit 11-1 is configured using an imaging apparatus and acquires a captured image. The information acquisition unit 11-2 is configured using a ranging apparatus, for example, a time-of-flight (TOF) camera, light detection and ranging or laser imaging detection and ranging (LIDAR), or the like, and acquires point cloud data indicating ranging values. Furthermore, the weight setting unit 13 sets a weight according to a situation between a peripheral object and the information acquisition units. The weight setting unit 13 uses a moving speed as the situation between the peripheral object and the information acquisition units. Here, the moving speed is, for example, the moving speed of the information acquisition units 11-1 and 11-2 with respect to the peripheral object.
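
To illustrate the kind of speed-dependent weight the weight setting unit 13 might apply (FIG. 3 indicates only that the weight decreases as the moving speed increases; the thresholds and the linear shape below are illustrative assumptions), a simple weighting function could look like this:

```python
def speed_weight(moving_speed, v_low=1.0, v_high=15.0):
    """Weight that decreases as the moving speed increases (cf. FIG. 3).
    v_low and v_high (in m/s) are illustrative values, not taken from the patent."""
    if moving_speed <= v_low:
        return 1.0
    if moving_speed >= v_high:
        return 0.0
    return (v_high - moving_speed) / (v_high - v_low)
```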

……
……
……
