Sony Patent | Varied density grid camera calibration chart


Publication Number: 20260065509

Publication Date: 2026-03-05

Assignee: Sony Interactive Entertainment Inc

Abstract

A chart for camera calibration is tilted away from the camera. The part of the chart that is further from the camera has a relatively sparse grid to make finding grid corners easier, whereas the part of the chart closer to the camera has a relatively dense grid to provide more data for better calibration.

Claims

What is claimed is:

1. A method, comprising: providing a substrate; tilting the substrate at an oblique angle with respect to a line of sight of a camera; and calibrating the camera using one and only one image of the substrate, wherein the substrate comprises a relatively sparse grid to make finding grid corners easier in a region of the substrate that is further from the camera when the substrate is tilted relative to the camera and a relatively dense grid in a region of the substrate that is closer to the camera when the substrate is tilted relative to the camera to provide more data for calibration.

2. The method of claim 1, comprising providing at least one quick response (QR) code on the substrate configured for automated pose estimation to establish a search area for grid corners.

3. The method of claim 2, wherein the substrate comprises one and only one QR code.

4. The method of claim 2, wherein the substrate comprises plural QR codes.

5. The method of claim 1, wherein the sparse grid and dense grid comprise respective squares.

6. The method of claim 5, wherein the squares of the sparse grid are four times larger than at least some squares in a first region of the dense grid.

7. The method of claim 6, wherein the squares of the sparse grid are sixteen times larger than at least some squares in a second region of the dense grid.

8. The method of claim 7, comprising at least one quick response (QR) code in at least one square of the first region of the dense grid and no QR codes in the second region of the dense grid or in the sparse grid.

9. An assembly, comprising: at least one substrate comprising at least first and second regions of subdivisions, the subdivisions of the first region being of larger size than the subdivisions of the second region; and at least one camera positioned to generate at least one image of the substrate to calibrate at least one parameter of the camera.

10. The assembly of claim 9, comprising one and only one camera, the camera being positioned with its optical axis defining an oblique angle to a surface of the substrate with the first region of the substrate being further from the camera than the second region of the substrate.

11. The assembly of claim 10, comprising at least one quick response (QR) code on the substrate configured for automated pose estimation to establish a search area for grid corners on the substrate.

12. The assembly of claim 10, wherein the substrate comprises one and only one QR code.

13. The assembly of claim 10, wherein the substrate comprises plural QR codes.

14. The assembly of claim 9, wherein the subdivisions comprise respective squares.

15. The assembly of claim 14, wherein the squares of the first region are four times larger than at least some squares in a first sub-region of the second region.

16. The assembly of claim 15, wherein the squares of the first region are sixteen times larger than at least some squares in a second sub-region of the second region.

17. The assembly of claim 16, comprising at least one quick response (QR) code in at least one square of the first sub-region of the second region and no QR codes in the second sub-region or in the first region, the first sub-region being between the first region and the second sub-region.

18. An assembly, comprising: at least one camera; and at least one substrate positionable at an oblique angle relative to an optical axis of the camera and comprising a far region relative to the camera and having subdivisions of a first size and a near region relative to the camera and having subdivisions of a second size smaller than the first size, the camera being configured to calibrate at least one parameter of the camera based on at least one image of the substrate.

19. The assembly of claim 18, wherein the camera is configured to calibrate at least one parameter of the camera based on one and only one image of the substrate.

20. The assembly of claim 18, wherein the subdivisions comprise squares and squares in the far region are twice as large in a linear dimension as squares in the near region.

Description

FIELD

The present application relates to varied density grid camera calibration charts.

BACKGROUND

Charts with grids may be used to calibrate cameras. Camera calibration may be particularly important for virtual reality (VR) applications, computer vision applications, and other applications.

Generally, multiple calibration images must be taken to provide high diversity, in part because a chart directly facing the camera exhibits redundant information; e.g., a point above the chart midline at a first distance from the camera gives calibration information redundant to that given by a point directly below it, below the midline, at the same distance. As understood herein, multiple calibration images complicate automated calibration.

SUMMARY

Accordingly, a method includes providing a substrate and tilting the substrate at an oblique angle with respect to a line of sight of a camera. The method includes calibrating the camera using one and only one image of the substrate. The substrate includes a relatively sparse grid to make finding grid corners easier in a region of the substrate that is further from the camera when the substrate is tilted relative to the camera and a relatively dense grid in a region of the substrate that is closer to the camera when the substrate is tilted relative to the camera to provide more data for calibration.

In some examples the method may include providing at least one quick response (QR) code on the substrate configured for automated pose estimation to establish a search area for grid corners. In some examples the substrate can have one and only one QR code, whereas in other embodiments the substrate has plural QR codes.

In non-limiting implementations the sparse grid and dense grid include respective squares. The squares of the sparse grid can be four times larger than at least some squares in a first region of the dense grid. The squares of the sparse grid may be sixteen times larger than at least some squares in a second region of the dense grid. At least one QR code may be in at least one square of the first region of the dense grid and no QR codes may appear in the second region of the dense grid or in the sparse grid.

In another aspect, an assembly includes at least one substrate with at least first and second regions of subdivisions. The subdivisions of the first region are of larger size than the subdivisions of the second region. At least one camera is positioned to generate at least one image of the substrate to calibrate at least one parameter of the camera.

In another aspect, an assembly includes at least one camera and at least one substrate positionable at an oblique angle relative to an optical axis of the camera. The substrate includes a far region relative to the camera that has subdivisions of a first size and a near region relative to the camera that has subdivisions of a second size smaller than the first size. The camera is configured to calibrate at least one parameter of the camera based on at least one image of the substrate.

The details of the present application, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a schematic view of a first system for calibrating a single 2D or 3D camera;

FIG. 2 illustrates a schematic view of a second system for calibrating plural 2D or 3D cameras;

FIG. 3 illustrates a block diagram of an example camera;

FIG. 4 illustrates a calibration substrate not tilted relative to a camera (rendered as an image of an eye in the figure);

FIG. 5 illustrates a calibration substrate that is tilted relative to a camera;

FIG. 6 illustrates example logic in example flow chart format consistent with present principles for QR code processing;

FIG. 7 illustrates an example calibration substrate consistent with present principles;

FIG. 8 illustrates example logic in example flow chart format consistent with present principles for calibration; and

FIG. 9 illustrates another example calibration substrate.

DETAILED DESCRIPTION

This disclosure relates generally to computer ecosystems including aspects of consumer electronics (CE) devices, including cameras and CE device networks such as but not limited to computer game networks. A system herein may include server and client components which may be connected over a network such that data may be exchanged between the client and server components. The client components may include one or more cameras and/or one or more computing devices including game consoles such as Sony PlayStation® or a game console made by Microsoft or Nintendo or other manufacturer, extended reality (XR) headsets such as virtual reality (VR) headsets, augmented reality (AR) headsets, portable televisions (e.g., smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, Linux operating systems, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple, Inc., or Google, or a Berkeley Software Distribution or Berkeley Standard Distribution (BSD) OS including descendants of BSD. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access websites hosted by the Internet servers discussed below. Also, an operating environment according to present principles may be used to execute one or more computer game programs.

Servers and/or gateways may be used that may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or a client and server can be connected over a local intranet or a virtual private network. A server or controller may be instantiated by a game console such as a Sony PlayStation®, a personal computer, etc.

Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storages, proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implements methods of providing a secure community such as an online social website or gamer network to network members.

A processor may be a single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers. A processor including a digital signal processor (DSP) may be an embodiment of circuitry. A processor system may include one or more processors.

Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged, or excluded from other embodiments.

“A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together.

FIG. 1 illustrates that a single two-dimensional (2D) or three-dimensional (3D) camera 10 may be calibrated using a substrate, configured as a calibration chart 12 that is tilted at an oblique angle 14 relative to the vertical 16 (because the camera is assumed to be aimed along the horizontal). In general, the substrate is tilted at an oblique angle relative to the line of sight or optical axis of the camera. Details of an example substrate are given further herein. Camera intrinsics including camera field of view and/or focal length and/or depth of field may be calibrated using the chart. Distortion calibration also may be effected using the chart. These calibrations may be implemented on cameras with variable focus lenses and variable zoom lenses, for example, and may involve 3D depth sensing.

FIG. 2 illustrates that a substrate implemented in one example by the chart 12, tilted as described above, may be used to calibrate plural cameras 200, 202 for stereoscopic calibration to track objects in 3D space using two simultaneous views.

The cameras herein, in non-limiting embodiments, may be implemented by Sony Alpha 1 or Alpha 6600 cameras (which may be trademarked by Sony). FIG. 3 illustrates a camera 300 consistent with present principles that includes a lens 302 such as a telephoto lens with, for example, a field of view of about 2.5°. The camera may include one or more processors 304 accessing one or more storages 306 and communicating with one or more imagers 308 that receive light from the lens 302 through, if desired, a shutter 310. The focal length of the lens 302 may be altered by a focal length controller 312 responsive to control signals from the processor 304. In some embodiments the lens 302 may be movable by a position controller 314 responsive to control signals from the processor 304. The position controller 314 may be implemented by, e.g., a motion stabilization device or other device coupled to the lens 302 (including coupled to the hood of the lens).

Refer now to FIGS. 4 and 5, which use an image of an eye 400 to represent a camera to be calibrated. For spherical camera lenses, data points “A” and “B” on a substrate 402 in FIG. 4 oriented perpendicular to the line of sight of the eye 400 are similar and in fact provide redundant information because once “A” is known, “B” provides no additional information. On the other hand, as shown in FIG. 5, a substrate 402 oriented at an oblique angle to the line of sight and having data points “C” and “D” which have different distances to the camera provides diversity in the Z-axis data when the 3-dimensional coordinates of the data points are provided to the calibration algorithm.

FIG. 6 illustrates preliminary logic for using a QR code on the substrate that is imaged at state 600 to look up its known 3D coordinates on the substrate at state 602. Moving to state 604, the four corners around the QR code can be detected and used at state 606 to estimate the orientation and scale of the substrate. This is called pose estimation. With an estimated pose, the position of all other points can be estimated at state 608 using reprojection (placing 3D positions into 2D positions in a camera view).

Once the reprojection has been done, the estimated corners and estimates for the other grid point locations are used at state 610 to establish a search region for the actual corners in the grid. The actual corners are identified at state 612 using, e.g., machine vision and then used at state 614 to calibrate the camera.
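The pose-estimation and reprojection steps above (states 600 through 608) can be sketched as follows. This is a simplified, hypothetical illustration rather than the patented implementation: a production pipeline would detect the QR code and estimate full camera pose with a vision library (for example, an OpenCV QR detector plus `solvePnP`), whereas here a planar homography fitted to the four QR-code corners stands in for pose estimation, which is enough to reproject other points on a flat chart.

```python
# Hypothetical sketch: fit a planar homography H to the four known
# board-coordinate corners of the QR code and their detected image
# positions (states 600-606), then reproject any other board point into
# the image to seed its corner search (state 608). Pure Python for
# self-containment; a real system would use a vision library.

def solve_homography(board_pts, image_pts):
    """Fit H (3x3, bottom-right entry fixed to 1) mapping board (x, y)
    to image (u, v) from exactly four correspondences (direct linear
    transform)."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(board_pts, image_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    # Solve the 8x8 linear system by Gaussian elimination with pivoting.
    n = 8
    M = [row + [b] for row, b in zip(rows, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]

def reproject(H, x, y):
    """Project board point (x, y) through H into image coordinates."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Demo with a made-up ground-truth mapping (illustrative numbers only).
H_true = [[1.2, 0.1, 5.0], [0.05, 0.9, 3.0], [0.001, 0.0005, 1.0]]
qr_corners_board = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
qr_corners_image = [reproject(H_true, x, y) for x, y in qr_corners_board]
H = solve_homography(qr_corners_board, qr_corners_image)
estimate = reproject(H, 2.0, 2.0)  # estimated image position of board point (2, 2)
```

In this sketch, `estimate` plays the role of the estimated grid-point location that seeds the search region of state 610; the corner actually detected near it is what feeds calibration at state 614.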

As understood herein, a single QR code may be used even though plural QR codes are shown in ensuing figures, since one code provides four corners from which all positions of other points can be reprojected.

As understood herein, more data points on the substrate mean more frame coverage and more accurate calibration, but accuracy of the estimates is poor in regions of the substrate that are closer to the camera (the grid in those closer regions thus being tighter and more dense in the image of the substrate). Accuracy in more distant regions of the substrate is better, but the estimated points in those regions appear to be very close together and cannot be used for precise and effective calibration. However, the estimated positions can help establish a search region in which the actual grid corner can be located with high accuracy, which then can be used for calibration.

For example, if a search is conducted around the estimate for a corner, the search region needs to be large, but if the search regions of adjacent grid subdivisions overlap, there will be potential for error. Use of a smaller search area avoids overlap but can result in the search never finding the corner point.

Accordingly, as recognized herein, by lowering the density of the grid in the region farther away from the camera, the search area can be increased with high confidence of finding the correct corner. In the region close to the camera, the corners are already far enough apart to avoid the search area problem, and if the grid is made denser in this region, more data points are available to the calibration algorithm, resulting in a more accurate calibration.
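The trade-off just described can be made concrete with a small calculation (the pixel figures below are invented for illustration): the search window radius must stay below half the projected spacing between neighboring corners to avoid overlap, yet above the worst-case reprojection error so the true corner still falls inside the window.

```python
# Illustrative sketch with hypothetical numbers: a search window around an
# estimated corner must be smaller than half the projected distance to the
# nearest neighboring corner (else windows overlap and may capture the
# wrong corner), but larger than the worst-case pose-estimation error
# (else the true corner falls outside the window).

def safe_search_radius(projected_pitch_px, max_estimate_error_px):
    """Return a usable window radius in pixels, or None if no radius
    can both avoid overlap and still contain the true corner."""
    upper = projected_pitch_px / 2.0   # overlap limit between neighbors
    lower = max_estimate_error_px      # must still catch the true corner
    if lower >= upper:
        return None                    # grid too dense for this error budget
    return (lower + upper) / 2.0       # any value in (lower, upper) works

# Far region of a tilted chart: corners project only ~6 px apart with a
# dense grid, but ~24 px apart once the grid pitch is made 4x sparser.
assert safe_search_radius(6, 4) is None   # dense far grid: no safe window
assert safe_search_radius(24, 4) == 8.0   # sparse far grid: 8 px window works
```

This is why the sparse grid belongs in the far region: sparsity raises the overlap limit where foreshortening compresses the projected pitch.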

Note that a single QR code can be used to locate four starting corners to use for pose estimation, which is required to estimate other reprojected corner points that don't have QR codes beside them. Although more QR codes means better pose estimation, they do not contribute to calibration quality and thus may be omitted.

Now refer to FIG. 7, illustrating a substrate 700 consistent with principles above. As shown, the substrate 700 includes at least first and second regions 702, 704 having respective grids of subdivisions of varying size. More specifically, in the region 702 furthest away from the camera when the substrate is tilted, a relatively sparse grid 706 is on the substrate 700 to make finding grid corners easier. On the other hand, the region 704 is closer to the camera than the region 702 when the substrate is tilted and so it includes a relatively dense grid 708 of subdivisions.

Furthermore, in the example shown a third region 710 may be disposed between the first and second regions 702, 704 and may include a grid having subdivisions of sizes in between the sizes of the first and second regions 702, 704. In the example shown, a respective QR code 712 is disposed in every subdivision of the third region 710, although as stated above some or even all but one of the QR codes shown in FIG. 7 may be omitted.

In the example shown, the grid subdivisions are square for simplicity of calculation. Other shapes may be used.

In the example shown, the squares of the sparse grid 706 are four times larger than at least some squares in the third region, i.e., are a factor of two larger in a linear dimension for simplicity of calculation, it being understood that other multiples of size differentials may be used.

In the example shown, the squares of the sparse grid 706 are sixteen times larger than at least some squares in the dense grid 708 of the second region 704, i.e., are a factor of four larger in a linear dimension for simplicity of calculation, it being understood that other multiples of size differentials may be used.
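As a quick check of the stated multiples (illustrative arithmetic only, with an arbitrary unit side length): the "four times" and "sixteen times" figures are area ratios, corresponding to linear scale factors of two and four.

```python
# The stated size multiples are area ratios; the linear factor is the
# square root. Side lengths below are in arbitrary units.
side_dense = 1.0             # side of a dense-grid (second region) square
side_mid = 2 * side_dense    # third-region square: 4x the area
side_sparse = 4 * side_dense # sparse-grid square: 16x the area

assert (side_mid / side_dense) ** 2 == 4      # sparse:mid linear factor 2
assert (side_sparse / side_dense) ** 2 == 16  # sparse:dense linear factor 4
assert (side_sparse / side_mid) ** 2 == 4     # sparse vs mid: 4x area again
```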

Accordingly, it may now be appreciated that during calibration, the substrate 700 may be tilted obliquely relative to the camera optical axis as shown in FIGS. 1, 2, and 5 for more data diversity. More data, in the form of diversity from tilt and use of smaller and larger grid subdivisions, enables the use of one and only one image of the substrate for calibration instead of multiple images. The one or more QR code markers provide automated pose estimation to reproject estimated points and establish a search area for the remaining grid corners. The grid is relatively less dense in the region of the substrate more distant from the camera when the substrate is tilted, to make it easier to search for grid corners, whereas the grid is relatively more dense in regions of the substrate closer to the camera when the substrate is tilted, to provide more data for better calibration. QR codes are omitted where they are not needed, because an unneeded QR code might confuse grid corner detection.

FIG. 8 illustrates general calibration logic, expanding on state 614 of FIG. 6. State 800 indicates that the actual corner positions as imaged from FIG. 6 are compared to ground truth real world coordinates of the grid corners as known beforehand. The difference between ground truth and imaged grid corners establishes a reprojection error at state 802, which is used at state 804 to estimate the errors or deviations in the intrinsic parameters of the camera that led to the reprojection error. The errors or deviations in the intrinsic parameters of the camera may be used, for example, to establish coefficients that can be applied to subsequent images at state 806 to correct for the errors or deviations in the intrinsic parameters of the camera.
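The error metric of states 800 through 804 can be sketched as follows. This is a minimal, hypothetical illustration with invented coordinates; a full calibrator (for example, OpenCV's `calibrateCamera`) iteratively adjusts the intrinsic parameters to minimize this quantity rather than merely reporting it.

```python
# Sketch of states 800-802 (simplified, hypothetical data): compare the
# imaged (detected) corner positions to corners reprojected from known
# ground-truth board geometry, and summarize the reprojection error. A
# real calibrator minimizes this error over the camera intrinsics.
import math

def rms_reprojection_error(detected, reprojected):
    """Root-mean-square pixel distance between detected corners and the
    corners reprojected from ground-truth board coordinates."""
    assert len(detected) == len(reprojected)
    sq = sum((ux - vx) ** 2 + (uy - vy) ** 2
             for (ux, uy), (vx, vy) in zip(detected, reprojected))
    return math.sqrt(sq / len(detected))

# Invented sample data: three corners, each detected within ~0.3 px of
# its reprojected ground-truth position.
detected    = [(100.2, 50.1), (200.0, 49.8), (99.9, 150.3)]
reprojected = [(100.0, 50.0), (200.0, 50.0), (100.0, 150.0)]
err = rms_reprojection_error(detected, reprojected)
assert err < 1.0  # sub-pixel error suggests a usable calibration
```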

FIG. 9 illustrates a substrate 900 having relatively large alternating black and white blocks 902 in two rows with no QR codes, followed by seven rows of smaller blocks 904 (sides half the length of the sides of the blocks 902), every other one of which contains a QR code, followed in turn by seventeen rows of still-smaller alternating black and white blocks 906 (sides half the length of the sides of the blocks 904) with no QR codes.

While particular techniques are herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.
