

Patent: Information processing device, information processing method, and program


Publication Number: 20210092284

Publication Date: 2021-03-25

Applicant: Sony

Abstract

An information processing device according to the present technology includes a controller. The controller is configured to acquire first position-and-pose information related to a position and a pose of hardware that is used to calibrate a line of sight, and second position-and-pose information related to a position and a pose of the hardware, the second position-and-pose information being different from the first position-and-pose information; to control a display section to display a plurality of presented gaze points including a first presented gaze point and a second presented gaze point, the first presented gaze point being presented at a first position, the second presented gaze point being presented at a second position different from the first position; to control an image-capturing section to acquire a first image of an eyeball of a user gazing at the first presented gaze point, and a first image of the eyeball of the user gazing at the second presented gaze point; to generate a plurality of pieces of calibration data using the first position-and-pose information, the second position-and-pose information, and the first images; and to perform a correction related to an optical-axis vector and a visual-axis vector with respect to the eyeball of the user, using the plurality of pieces of calibration data.

Claims

  1. An information processing device comprising a controller configured to acquire first position-and-pose information related to a position and a pose of hardware that is used to calibrate a line of sight, and second position-and-pose information related to a position and a pose of the hardware, the second position-and-pose information being different from the first position-and-pose information, control a display section to display a plurality of presented gaze points including a first presented gaze point and a second presented gaze point, the first presented gaze point being presented at a first position, the second presented gaze point being presented at a second position different from the first position, control an image-capturing section to acquire a first image of an eyeball of a user gazing at the first presented gaze point, and a first image of the eyeball of the user gazing at the second presented gaze point, generate a plurality of pieces of calibration data using the first position-and-pose information, the second position-and-pose information, and the first images, and perform a correction related to an optical-axis vector and a visual-axis vector with respect to the eyeball of the user, using the plurality of pieces of calibration data.

  2. The information processing device according to claim 1, wherein the hardware is at least one of the image-capturing section, the display section, or a light source that irradiates the eyeball of the user with light.

  3. The information processing device according to claim 1, wherein the plurality of pieces of calibration data includes a first piece of calibration data based on the first position-and-pose information, and a second piece of calibration data based on the second position-and-pose information, the first piece of calibration data including information related to the optical-axis vector and the visual-axis vector, the second piece of calibration data including information related to the optical-axis vector and the visual-axis vector, and the controller is configured to perform the correction using a piece of calibration data having a smaller difference between a certain difference between an optical-axis vector and a visual-axis vector, and another difference between an optical-axis vector and a visual-axis vector, the piece of calibration data being the first piece of calibration data or the second piece of calibration data, the certain difference corresponding to the first presented gaze point, the other difference corresponding to the second presented gaze point.

  4. The information processing device according to claim 3, wherein the controller is configured to perform the correction using a piece of calibration data having a smaller variance of differences between an optical-axis vector and a visual-axis vector, the piece of calibration data being the first piece of calibration data or the second piece of calibration data, the differences corresponding to respective presented gaze points of the plurality of presented gaze points.

  5. The information processing device according to claim 1, wherein the first presented gaze point is a presented gaze point different from the second presented gaze point.

  6. The information processing device according to claim 1, wherein the second presented gaze point is the first presented gaze point displaced to the second position from the first position.

  7. The information processing device according to claim 1, wherein the first presented gaze point and the second presented gaze point are a virtual object situated at a fixed position in a space using pieces of position-and-pose information related to a position and a pose, the first presented gaze point and the second presented gaze point being associated with the respective pieces of position-and-pose information by the information processing device.

  8. The information processing device according to claim 3, wherein the controller is configured to control the image-capturing section to acquire a second image of the eyeball of the user, estimate the optical-axis vector using the second image, estimate the visual-axis vector by performing correction with respect to the estimated optical-axis vector according to a correction amount, and calculate the correction amount using the piece of calibration data having a smaller difference between the certain difference and the other difference, the piece of calibration data being the first piece of calibration data or the second piece of calibration data.

  9. The information processing device according to claim 8, wherein the controller is configured to calculate the correction amount using a similarity between the estimated optical-axis vector and the optical-axis vector included in the piece of calibration data having a smaller difference between the certain difference and the other difference, the piece of calibration data being the first piece of calibration data or the second piece of calibration data.

  10. The information processing device according to claim 9, wherein the first piece of calibration data and the second piece of calibration data each further include at least one of information regarding a three-dimensional position of a center of a corneal curvature of the eyeball of the user, or information regarding a three-dimensional position of a center of a pupil or a center of an eyeball rotation of the eyeball of the user, and the controller is configured to estimate, using the second image, at least one of the three-dimensional position of the center of a corneal curvature, or the three-dimensional position of the center of a pupil or the center of an eyeball rotation, and calculate the correction amount using at least one of a similarity between the estimated three-dimensional position of the center of a corneal curvature and another three-dimensional position of the center of a corneal curvature, or a similarity between the estimated three-dimensional position of the center of a pupil or the center of an eyeball rotation, and another three-dimensional position of the center of a pupil or the center of an eyeball rotation, the other three-dimensional position of the center of a corneal curvature and the other three-dimensional position of the center of a pupil or the center of an eyeball rotation being included in the piece of calibration data having a smaller difference between the certain difference and the other difference, the piece of calibration data being the first piece of calibration data or the second piece of calibration data.

  11. The information processing device according to claim 1, wherein the controller is configured to acquire the first position-and-pose information and the second position-and-pose information depending on a movement of the hardware in a real space.

  12. The information processing device according to claim 11, wherein the controller is configured to control the display section to display guidance indicating a method for moving the hardware.

  13. The information processing device according to claim 12, wherein the controller is configured to acquire the first position-and-pose information and the second position-and-pose information after the guidance is displayed.

  14. The information processing device according to claim 11, further comprising an adjustment section configured to adjust at least one of a position or a pose of the hardware, wherein the controller is configured to acquire the first position-and-pose information and the second position-and-pose information depending on the adjustment.

  15. The information processing device according to claim 14, wherein the adjustment section is configured to automatically perform the adjustment using a drive section, and the controller drives the adjustment section according to a result of the correction.

  16. The information processing device according to claim 14, wherein the adjustment section is configured to adjust an interpupillary distance of a head-mounted display or a head-up display, and the controller is configured to acquire the first position-and-pose information and the second position-and-pose information depending on the adjustment of the interpupillary distance.

  17. The information processing device according to claim 14, wherein the adjustment section is a mechanism for a head-mounted display or a head-up display that is capable of manually performing the adjustment.

  18. The information processing device according to claim 1, wherein the information processing device is a head-mounted display that further includes the display section and the image-capturing section.

  19. An information processing method comprising: acquiring first position-and-pose information related to a position and a pose of hardware that is used to calibrate a line of sight, and second position-and-pose information related to a position and a pose of the hardware, the second position-and-pose information being different from the first position-and-pose information; controlling a display section to display a plurality of presented gaze points including a first presented gaze point and a second presented gaze point, the first presented gaze point being presented at a first position, the second presented gaze point being presented at a second position different from the first position; controlling an image-capturing section to acquire a first image of an eyeball of a user gazing at the first presented gaze point, and a first image of the eyeball of the user gazing at the second presented gaze point; generating a plurality of pieces of calibration data using the first position-and-pose information, the second position-and-pose information, and the first images; and performing a correction related to an optical-axis vector and a visual-axis vector with respect to the eyeball of the user, using the plurality of pieces of calibration data.

  20. A program that causes a computer to perform a process comprising: acquiring first position-and-pose information related to a position and a pose of hardware that is used to calibrate a line of sight, and second position-and-pose information related to a position and a pose of the hardware, the second position-and-pose information being different from the first position-and-pose information; controlling a display section to display a plurality of presented gaze points including a first presented gaze point and a second presented gaze point, the first presented gaze point being presented at a first position, the second presented gaze point being presented at a second position different from the first position; controlling an image-capturing section to acquire a first image of an eyeball of a user gazing at the first presented gaze point, and a first image of the eyeball of the user gazing at the second presented gaze point; generating a plurality of pieces of calibration data using the first position-and-pose information, the second position-and-pose information, and the first images; and performing a correction related to an optical-axis vector and a visual-axis vector with respect to the eyeball of the user, using the plurality of pieces of calibration data.

Description

TECHNICAL FIELD

[0001] The present technology relates to an information processing device that includes a line-of-sight detection device, an information processing method, and a program.

BACKGROUND ART

[0002] In recent years, a technology has been developed that calibrates a line of sight of a user before detecting the line of sight (for example, refer to Patent Literature 1). The calibration includes instructing a user to direct his or her line of sight in a specified direction, and acquiring a state of the eye of the user whose line of sight has been adjusted according to the instruction. Accordingly, it becomes possible to correct a result of detecting a line of sight of a user according to the state of his or her eye acquired upon performing the calibration.

CITATION LIST

Patent Literature

[0003] Patent Literature 1: Japanese Patent Application Laid-open No. 2009-183473

DISCLOSURE OF INVENTION

Technical Problem

[0004] An object of the present technology is to provide an information processing device, an information processing method, and a program that make it possible to improve a performance in a detection of a line of sight.

Solution to Problem

[0005] An information processing device according to an aspect of the present technology includes a controller. The controller is configured to acquire first position-and-pose information related to a position and a pose of hardware that is used to calibrate a line of sight, and second position-and-pose information related to a position and a pose of the hardware, the second position-and-pose information being different from the first position-and-pose information; to control a display section to display a plurality of presented gaze points including a first presented gaze point and a second presented gaze point, the first presented gaze point being presented at a first position, the second presented gaze point being presented at a second position different from the first position; to control an image-capturing section to acquire a first image of an eyeball of a user gazing at the first presented gaze point, and a first image of the eyeball of the user gazing at the second presented gaze point; to generate a plurality of pieces of calibration data using the first position-and-pose information, the second position-and-pose information, and the first images; and to perform a correction related to an optical-axis vector and a visual-axis vector with respect to the eyeball of the user, using the plurality of pieces of calibration data.

Advantageous Effects of Invention

[0006] As described above, the present technology makes it possible to improve a performance in a detection of a line of sight.

BRIEF DESCRIPTION OF DRAWINGS

[0007] FIG. 1 illustrates a configuration of an information processing device 100 according to a first embodiment of the present technology.

[0008] FIG. 2 conceptually illustrates a configuration of a line-of-sight detection device 10 included in the information processing device 100 illustrated in FIG. 1.

[0009] FIG. 3 is a block diagram of a hardware configuration of the line-of-sight detection device 10 illustrated in FIG. 2.

[0010] FIG. 4A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when an image-capturing section 12 in a typical line-of-sight detection device is arranged offset in an optical-axis direction y of the image-capturing section 12.

[0011] FIG. 4B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 4A, the calculated relationship being obtained by performing a line-of-sight calibration.

[0012] FIG. 5A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when the position of a line-of-sight-estimation-target eyeball 1 of a user moves in a direction orthogonal to an optical-axis direction y of the image-capturing section 12 with respect to a position upon performing a line-of-sight calibration, in which there is an assembling error in the image-capturing section 12 with respect to the optical-axis direction y.

[0013] FIG. 5B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 5A, the calculated relationship being obtained by performing the line-of-sight calibration.

[0014] FIG. 6A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when the position of the line-of-sight-estimation-target eyeball 1 of the user moves in a direction other than the direction orthogonal to the optical-axis direction y of the image-capturing section 12 with respect to the position upon performing a line-of-sight calibration.

[0015] FIG. 6B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 6A, the calculated relationship being obtained by performing the line-of-sight calibration.

[0016] FIG. 7A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when there is an assembling error with respect to a pose of the image-capturing section 12 in the typical line-of-sight detection device.

[0017] FIG. 7B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 7A, the calculated relationship being obtained by performing a line-of-sight calibration.

[0018] FIG. 8A illustrates a relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when the position of the line-of-sight-estimation-target eyeball 1 of the user moves in a direction of a radius of a position of the image-capturing section 12 with respect to the position upon performing a line-of-sight calibration.

[0019] FIG. 8B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 8A, the calculated relationship being obtained by performing the line-of-sight calibration.

[0020] FIG. 9A illustrates a relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. when the position of the line-of-sight-estimation-target eyeball 1 of the user moves in a direction other than the direction of the radius of the position of the image-capturing section 12 with respect to the position upon performing a line-of-sight calibration.

[0021] FIG. 9B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 9A, the calculated relationship being obtained by performing the line-of-sight calibration.

[0022] FIG. 10A illustrates actual relationships between an optical-axis vector V.sub.opt.sup.n.fwdarw. and a visual-axis vector V.sub.vis.sup.n.fwdarw. with respect to N respective presented gaze points.

[0023] FIG. 10B illustrates calculated relationships between the optical-axis vector V.sub.opt.sup.n.fwdarw. and the visual-axis vector V.sub.vis.sup.n.fwdarw. replacing the relationships illustrated in FIG. 10A, the calculated relationships being obtained by performing a line-of-sight calibration.

[0024] FIG. 11 is a flowchart illustrating a procedure of a method for calculating an amount of correction between an optical axis and a visual axis according to the first embodiment of the present technology, in which an assembling error in hardware is taken into consideration.

[0025] FIG. 12 is a diagram explaining a movement of the image-capturing section 12 when the method of FIG. 11 for calculating an amount of correction between an optical axis and a visual axis, in which an assembling error in hardware is taken into consideration, is performed.

[0026] FIG. 13 is a flowchart illustrating a procedure of a method for calculating an amount of correction between an optical axis and a visual axis in which an assembling error in a display section 13 is taken into consideration.

[0027] FIG. 14 is a flowchart illustrating a procedure of a method for calculating an amount of correction between an optical axis and a visual axis in which an assembling error in an active light source 11 is taken into consideration.

[0028] FIG. 15 illustrates an outline of a method for estimating a three-dimensional position of a center of a corneal curvature in a sensor-origin coordinate system.

[0029] FIG. 16 illustrates an outline of a method for estimating a three-dimensional position of a center of a pupil in the sensor-origin coordinate system.

[0030] FIG. 17 is a block diagram of a hardware configuration of a line-of-sight detection device that does not include the active light source.

MODE(S) FOR CARRYING OUT THE INVENTION

[0031] Embodiments according to the present technology will now be described below with reference to the drawings.

[Hardware Configuration of Information Processing Device]

[0032] FIG. 1 illustrates a configuration of an information processing device 100 according to a first embodiment of the present technology.

[0033] The information processing device 100 is a device wearable on the head of a user U to cover the eyes of the user U, and is a so-called head-mounted display. The information processing device 100 includes a display section. A video such as a virtual reality (VR) video or an augmented reality (AR) video is displayed on a screen of the display section.

[0034] The information processing device 100 is provided with a line-of-sight detection device that detects where in a screen space of the display section the line of sight of the user U exists. In the information processing device 100, for example, a control of a display video is performed using data regarding a line of sight of the user U that is detected by the line-of-sight detection device.

[0035] [Line-of-Sight Detection Device]

[0036] Next, a configuration of the line-of-sight detection device is described using FIGS. 2 and 3.

[0037] FIG. 2 conceptually illustrates a configuration of a line-of-sight detection device 10.

[0038] FIG. 3 is a block diagram of a hardware configuration of the line-of-sight detection device 10. The line-of-sight detection device 10 includes an active light source 11, an image-capturing section 12, a display section 13, a storage 15, and a controller 16.

[0039] The active light source 11 is a light source that irradiates light for a line-of-sight calibration and a line-of-sight estimation onto an eyeball 1 of a user who is looking at an arbitrary position P.sub.gt.sup.n on a screen of the display section 13. For example, an infrared light emitting diode (LED) is used as the active light source 11.

[0040] The image-capturing section 12 is, for example, a camera, and is a device that captures an image of light reflected from the eyeball 1 of the user, the eyeball 1 being irradiated with light from the active light source 11. The image-capturing section 12 includes, for example, a lens, an imaging element, and a signal processing circuit. Examples of the lens include an imaging lens that forms an optical image of a subject on an image-capturing surface of the imaging element. The imaging element includes, for example, an image sensor such as a complementary-metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled-device (CCD) image sensor. The signal processing circuit includes, for example, an automatic-gain-control (AGC) circuit and an analog-to-digital converter (ADC), and converts an analog signal generated by the imaging element into a digital signal (image data). Further, the signal processing circuit performs various processes related to, for example, RAW image development. Moreover, the signal processing circuit may perform various signal processes such as a white balance process, a color-tone correction process, a gamma process, a YCbCr conversion process, and an edge enhancement process.

[0041] The display section 13 is, for example, a liquid crystal display or an organic electro-luminescence (EL) display (also referred to as an organic light emitting diode (OLED) display). When a line-of-sight calibration is performed that estimates a difference between an optical-axis vector and a visual-axis vector of a user, a presented gaze point is displayed on the screen of the display section 13 by a control being performed by the controller 16, the presented gaze point being a point to be gazed at by the user.

[0042] The storage 15 includes a volatile memory and a nonvolatile memory. The volatile memory is, for example, a random access memory (RAM), and stores therein, for example, control data, such as a program and an operational parameter, that is used by the controller 16, and data of an image captured by the image-capturing section 12.

[0043] The nonvolatile memory stores therein, for example, various measurement data related to the eyeball 1 for each user and data used to perform measurement. For example, a flash memory is used as the nonvolatile memory. The nonvolatile memory may be removable from the body of the information processing device 100.

[0044] The controller 16 includes, for example, a central processing unit (CPU). The controller 16 performs arithmetic processing to control the information processing device 100 including the line-of-sight detection device 10, using the control data stored in the volatile memory and the various data stored in the nonvolatile memory.

[0045] [Line-of-Sight Estimation Method]

[0046] Next, a line-of-sight estimation method performed by the line-of-sight detection device 10 is described.

[0047] As the line-of-sight estimation method, there exists, for example, a method for estimating a line of sight using a three-dimensional model of an eyeball. Typically, first, the controller 16 estimates an optical-axis vector of an eyeball using an image (second image) of the eyeball that is captured by the image-capturing section 12. As a method for estimating an optical-axis vector, there exists, for example, a corneal reflection method that uses a corneal light reflex obtained by light from the active light source 11 being reflected off a cornea. In this method, first, a pupil is detected from a captured image. Next, the controller 16 sets, according to a position of the pupil that is specified from the captured image, a range in which a corneal light reflex exists, and estimates a three-dimensional position of a center of a corneal curvature and a three-dimensional position of a center of a pupil, using a position of observing the corneal light reflex and the pupil in the image. The controller 16 obtains, as a result of estimating an optical-axis vector, a vector that connects the estimated three-dimensional position of the center of a corneal curvature with the estimated three-dimensional position of the center of a pupil, and is oriented outward from the eyeball.
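To make the geometry above concrete, the following is a minimal Python sketch of the optical-axis estimation step, assuming the three-dimensional positions of the center of the corneal curvature and the center of the pupil have already been estimated from the corneal light reflex. The function and variable names are illustrative and are not taken from the patent.

```python
import numpy as np

def estimate_optical_axis(cornea_center: np.ndarray,
                          pupil_center: np.ndarray) -> np.ndarray:
    """Unit vector from the center of corneal curvature through the center
    of the pupil, oriented outward from the eyeball."""
    v = pupil_center - cornea_center
    return v / np.linalg.norm(v)

# Example in a sensor-origin coordinate system (the coordinates are invented):
v_opt = estimate_optical_axis(np.array([0.0, 0.0, 30.0]),
                              np.array([1.0, 2.0, 25.0]))
```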

[0048] On the other hand, an orientation vector indicating the direction in which a person subjectively feels that he or she is looking is called a visual-axis vector. The visual-axis vector is a vector that connects the fovea of an eyeball with the center of the cornea of the eyeball, and thus there exists a deviation (difference) of a visual-axis vector from an optical-axis vector. Thus, a visual-axis vector is estimated by correcting an optical-axis vector estimated as described above using a difference (an amount of correction) between an optical-axis vector and a visual-axis vector that is obtained by performing a line-of-sight calibration described later. The difference used to estimate a visual-axis vector from an optical-axis vector may be hereinafter referred to as an “amount of correction between an optical axis and a visual axis (a correction amount)” as appropriate.
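A hedged sketch of this correction step follows: the estimated optical-axis vector is rotated by a calibrated correction amount to obtain the visual-axis vector. Representing the correction as a single axis-angle rotation is an assumption made here for illustration; the patent does not prescribe this parameterization.

```python
import numpy as np

def rotate_about_axis(v: np.ndarray, axis: np.ndarray,
                      angle_rad: float) -> np.ndarray:
    """Rodrigues' rotation of vector v about a unit axis by angle_rad."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(angle_rad)
            + np.cross(axis, v) * np.sin(angle_rad)
            + axis * np.dot(axis, v) * (1.0 - np.cos(angle_rad)))

def estimate_visual_axis(v_opt: np.ndarray, correction_axis: np.ndarray,
                         correction_angle_rad: float) -> np.ndarray:
    # Apply the calibrated optical-axis-to-visual-axis correction amount.
    return rotate_about_axis(v_opt, correction_axis, correction_angle_rad)
```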

[0049] [Line-of-Sight Calibration]

[0050] An outline of a typical line-of-sight calibration is described using FIG. 2.

[0051] In a typical line-of-sight calibration, a presented gaze point P.sub.gt.sup.n is presented on the screen of the display section 13 to be gazed at by a user. Here, the controller 16 calculates a difference between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw., the optical-axis vector V.sub.opt.fwdarw. connecting a three-dimensional position of a center P.sub.e.sup.n of a corneal curvature with a three-dimensional position of a center of a pupil, the three-dimensional position of the center P.sub.e.sup.n of a corneal curvature and the three-dimensional position of the center of a pupil being estimated using an image captured by the image-capturing section 12, the visual-axis vector V.sub.vis.fwdarw. connecting the three-dimensional position P.sub.e.sup.n of the center of a corneal curvature with a three-dimensional position P.sub.gt.sup.n of the presented gaze point, the three-dimensional position P.sub.e.sup.n of the center of a corneal curvature being estimated using the image captured by the image-capturing section 12.
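The per-gaze-point quantity computed during this calibration can be expressed compactly. The sketch below (Python, illustrative names) returns the angle between the optical-axis vector and the visual-axis vector implied by a presented gaze point, both anchored at the estimated center of the corneal curvature.

```python
import numpy as np

def axis_difference_deg(cornea_center: np.ndarray, pupil_center: np.ndarray,
                        gaze_point: np.ndarray) -> float:
    """Angle between V_opt (cornea center -> pupil center) and
    V_vis (cornea center -> presented gaze point), in degrees."""
    v_opt = pupil_center - cornea_center
    v_vis = gaze_point - cornea_center
    cos_angle = np.dot(v_opt, v_vis) / (np.linalg.norm(v_opt)
                                        * np.linalg.norm(v_vis))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))
```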

[0052] However, due to an error (an assembling error) in an assembling state such as a position and a pose of a device such as the image-capturing section 12, the display section 13, and the active light source 11, there may occur a decrease in the accuracy in estimating, using a line-of-sight calibration, a difference between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw., that is, an amount of correction between an optical axis and a visual axis.

[0053] [Effects That Assembling Error Has on Line-of-Sight Calibration]

[0054] Effects that an assembling error of a device, and particularly, an assembling error of the image-capturing section 12, has on a line-of-sight calibration, are discussed below.

[0055] FIGS. 4A and 4B each illustrate a relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. with respect to a presented gaze point when the image-capturing section 12 is arranged offset in an optical-axis direction y of the image-capturing section 12. Here, FIG. 4A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw.. FIG. 4B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 4A, the calculated relationship being obtained by performing a line-of-sight calibration (in the case of there being no assembling error).

[0056] In the presence of such an assembling error, when a line-of-sight calibration is performed while a user is gazing at a presented gaze point displayed at an intersection of the screen of the display section 13 and an optical axis y of the image-capturing section 12, a sum of an actual difference .theta.r between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw., and a difference .theta.a due to the assembling error with respect to the y-axis direction of the image-capturing section 12, is calculated as a calculated difference obtained by performing the line-of-sight calibration, as illustrated in FIG. 4B.

[0057] Regarding this difference (.theta.r+.theta.a), as illustrated in FIGS. 5A and 5B, when the position of a line-of-sight-estimation-target eyeball 1 of a user moves in a direction orthogonal to the optical-axis direction y of the image-capturing section 12 with respect to a position upon performing a line-of-sight calibration (indicated by a dotted line), this results in obtaining a correct result of estimating a line of sight, in which there is an assembling error in the image-capturing section 12 with respect to the optical-axis direction y. Thus, there occurs no problem. However, as illustrated in FIGS. 6A and 6B, when the position of the line-of-sight-estimation-target eyeball 1 of the user moves in a direction (for example, the optical-axis direction y) other than the direction orthogonal to the optical-axis direction y of the image-capturing section 12 with respect to the position upon performing a line-of-sight calibration, this results in a mismatch between a corrected gaze point and a presented gaze point, that is, this results in not obtaining a correct result of estimating a line of sight.

[0058] Next, the case of there being an assembling error with respect to a pose of the image-capturing section 12, is discussed.

[0059] FIGS. 7A and 7B each illustrate a relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw. with respect to a presented gaze point when there is an assembling error with respect to a pose of the image-capturing section 12. Here, FIG. 7A illustrates an actual relationship between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw.. FIG. 7B illustrates a calculated relationship between the optical-axis vector V.sub.opt.fwdarw. and the visual-axis vector V.sub.vis.fwdarw. replacing the relationship illustrated in FIG. 7A, the calculated relationship being obtained by performing a line-of-sight calibration (in the case of there being no assembling error).

[0060] When a line-of-sight calibration is performed in the presence of an assembling error with respect to the pose of the image-capturing section 12, as described above, a sum of an actual difference .theta.r between an optical-axis vector V.sub.opt.fwdarw. and a visual-axis vector V.sub.vis.fwdarw., and a difference .theta.b due to an assembling error with respect to the pose of the image-capturing section 12, is calculated as a calculated difference obtained by performing the line-of-sight calibration, as illustrated in FIG. 7B.

[0061] Regarding this difference (.theta.r+.theta.b), as illustrated in FIGS. 8A and 8B, when the position of an eyeball 1 moves in a direction of a radius of a position of the image-capturing section 12 with respect to a position upon performing a line-of-sight calibration (indicated by a dotted line), this results in obtaining a correct result of estimating a line of sight. Thus, there occurs no problem. However, as illustrated in FIGS. 9A and 9B, when the position of the eyeball 1 moves in a direction other than the direction of the radius of the position of the image-capturing section 12 with respect to the position upon performing a line-of-sight calibration, this results in a mismatch between a corrected gaze point and a presented gaze point, that is, this results in not obtaining a correct result of estimating a line of sight.

[0062] As described above, when there is an assembling error in the image-capturing section 12, a line of sight is not correctly estimated except for a specific state. Likewise, when there is an assembling error in other hardware such as the display section 13 or the active light source 11, a line of sight is not correctly estimated except for a specific state.

[0063] [Line-of-Sight Calibration with Respect to N Presented Gaze Points]

[0064] Next, a typical method for estimating an amount of correction between an optical axis and a visual axis by performing a line-of-sight calibration with respect to N (a plurality of) presented gaze points, is described.

[0065] FIGS. 10A and 10B each illustrate relationships between an optical-axis vector V.sub.opt.sup.n.fwdarw. and a visual-axis vector V.sub.vis.sup.n.fwdarw. with respect to N respective presented gaze points. FIG. 10A illustrates actual relationships between an optical-axis vector V.sub.opt.sup.n.fwdarw. and a visual-axis vector V.sub.vis.sup.n.fwdarw. with respect to N (three in this example) respective presented gaze points. FIG. 10B illustrates calculated relationships between the optical-axis vector V.sub.opt.sup.n.fwdarw. and the visual-axis vector V.sub.vis.sup.n.fwdarw. with respect to the N respective presented gaze points, the calculated relationships replacing the relationships illustrated in FIG. 10A, the calculated relationships being obtained by performing a line-of-sight calibration (in the case of there being no assembling error).

[0066] Anatomically, a geometric relationship between an optical axis and a visual axis of a user remains unchanged regardless of the direction of his or her line of sight. However, for example, when there is an assembling error in hardware such as the image-capturing section 12, a variation in an amount of correction between an optical axis and a visual axis occurs depending on the direction of a line of sight. In other words, an amount of correction between an optical axis and a visual axis with respect to a presented gaze point 1 is a sum of an actual difference .theta.r between an optical-axis vector V.sub.opt.sup.1.fwdarw. and a visual-axis vector V.sub.vis.sup.1.fwdarw., and a difference .theta.c1 due to an assembling error in the image-capturing section 12. An amount of correction between an optical axis and a visual axis with respect to a presented gaze point 2 is a sum of an actual difference .theta.r between an optical-axis vector V.sub.opt.sup.2.fwdarw. and a visual-axis vector V.sub.vis.sup.2.fwdarw., and a difference .theta.c2 due to the assembling error in the image-capturing section 12. An amount of correction between an optical axis and a visual axis with respect to a presented gaze point 3 is a sum of an actual difference .theta.r between an optical-axis vector V.sub.opt.sup.3.fwdarw. and a visual-axis vector V.sub.vis.sup.3.fwdarw., and a difference .theta.c3 due to the assembling error in the image-capturing section 12. Since the three differences .theta.c1, .theta.c2, and .theta.c3 described above, which are due to the assembling error in the image-capturing section 12, differ from one another depending on the direction of a line of sight, a variation in the amount of correction between an optical axis and a visual axis occurs depending on the direction of the line of sight.
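A small numeric illustration of this point (the numbers are invented purely for illustration): the anatomical offset .theta.r is constant across gaze directions, so any nonzero variance in the per-gaze-point correction amounts comes entirely from the assembling-error terms .theta.c1, .theta.c2, and .theta.c3.

```python
import numpy as np

theta_r = 5.0                         # constant anatomical offset (degrees)
theta_c = np.array([0.8, -0.3, 1.1])  # per-gaze-point assembling-error terms (degrees)
corrections = theta_r + theta_c       # correction amounts for gaze points 1..3
print(np.var(corrections))            # nonzero variance reveals the assembling error
```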

[0067] As described above, regarding amounts of correction between an optical axis and a visual axis that are respectively obtained by performing a line-of-sight calibration with respect to N respective presented gaze points, a variation occurs depending on the direction of a line of sight due to an assembling error in hardware. Thus, typically, for example, a method has been discussed that includes estimating, using an image, an optical-axis start point and an optical-axis vector of a line-of-sight-estimation-target eyeball of a user; and calculating an amount of correction between an optical axis and a visual axis using a similarity between the estimated optical-axis start point and an optical-axis start point saved as line-of-sight-calibration data, and a similarity between the estimated optical-axis vector and an optical-axis vector saved as the line-of-sight-calibration data, and using a correction amount for each presented gaze point included in the line-of-sight-calibration data. However, this method does not explicitly take the precision of the hardware assembly into consideration, so it is still difficult to efficiently suppress the occurrence of an error in an amount of correction between an optical axis and a visual axis due to an assembling error in hardware.
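The typical similarity-based method discussed above might be sketched as follows, with the entry layout and similarity measures assumed here for illustration: each saved calibration entry is weighted by how close its stored optical-axis start point and optical-axis vector are to the current estimates, and the stored correction amounts are blended accordingly.

```python
import numpy as np

def correction_from_similarity(est_start, est_axis, calib_entries):
    """Blend stored per-gaze-point correction amounts by similarity.

    calib_entries: iterable of dicts with keys "start" (3D optical-axis
    start point), "axis" (unit optical-axis vector), and "correction"
    (correction amount in degrees).
    """
    weights, corrections = [], []
    for entry in calib_entries:
        pos_sim = 1.0 / (1.0 + np.linalg.norm(est_start - entry["start"]))
        dir_sim = max(0.0, float(np.dot(est_axis, entry["axis"])))
        weights.append(pos_sim * dir_sim)
        corrections.append(entry["correction"])
    weights = np.asarray(weights)
    return float(np.dot(weights, corrections) / weights.sum())
```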

[0068] Thus, in the information processing device 100 according to the first embodiment of the present technology, the controller 16 changes the position of a presented gaze point to display the presented gaze point at a plurality of different positions on the display section 13, according to control data such as a program stored in the storage 15. Then, using the image-capturing section 12, the controller 16 captures, for each displayed presented gaze point, an image of an eyeball of a user who is gazing at the displayed presented gaze point, and generates a plurality of pieces of line-of-sight-calibration data corresponding to the respective presented gaze points. Then, using the plurality of pieces of line-of-sight-calibration data, the controller 16 estimates a position and a pose of an arbitrary piece of hardware included in the information processing device 100 for which a variance of (or a difference between) the differences between an optical-axis vector and a visual-axis vector for the respective presented gaze points is minimum in the line-of-sight-calibration data. The estimated position and pose of the arbitrary piece of hardware can be considered a position and a pose in which an assembling error in the piece of hardware has been reflected. In other words, an effect of an assembling error in an arbitrary piece of hardware is reduced by calculating, using an estimated position and an estimated pose of the arbitrary piece of hardware, an amount of correction between an optical axis and a visual axis in which the assembling error in the arbitrary piece of hardware has been reflected. Note that, for example, one of the image-capturing section 12, the display section 13, and the active light source 11, or a combination of two or more thereof, may be selected as the arbitrary piece of hardware. This is described in detail below.
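One way to realize this selection step is a brute-force search over hardware pose hypotheses. The sketch below assumes candidate camera position offsets only (pose offsets would be searched the same way); recompute_differences() is a hypothetical stand-in for re-running the eyeball-model geometry with the camera shifted by the given offset.

```python
import numpy as np
from itertools import product

def best_position_offset(calib_images, gaze_points, recompute_differences,
                         search_mm=(-2.0, 0.0, 2.0)):
    """Return the candidate camera offset whose per-gaze-point optical/visual
    axis differences have minimum variance across the calibration data."""
    best_var, best_offset = None, None
    for offset in product(search_mm, repeat=3):   # candidate x/y/z shifts
        diffs = recompute_differences(calib_images, gaze_points,
                                      np.array(offset))
        var = float(np.var(diffs))
        if best_var is None or var < best_var:
            best_var, best_offset = var, np.array(offset)
    return best_offset   # offset that best explains the assembling error
```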

[0069] Note that, for the camera, the display, and the light source that irradiates the eyeball of the user with light, values from design information acquired in advance are used for the position and the pose of each piece of hardware. However, since an assembling error actually causes a deviation from the design information, the controller 16 estimates that deviation to obtain the position and the pose after the assembling error has occurred. The original position in this case is the position stated in the design information.

[0070] [Calculation of Amount of Correction Between Optical Axis and Visual Axis in Which Assembling Error in Hardware is Taken into Consideration]

[0071] FIG. 11 is a flowchart illustrating a procedure of a method for calculating an amount of correction between an optical axis and a visual axis according to the first embodiment, in which an assembling error in hardware is taken into consideration.

[0072] Note that it is assumed that there is an assembling error in the image-capturing section 12 that is the assembling error in hardware.

  1. First, the controller 16 controls the display section 13 such that a first presented gaze point from among N presented gaze points is displayed at a specified position (a first position) on the screen of the display section 13 (Step S101).

[0074] Here, the presented gaze point is a point (a virtual object) at which a user has been instructed to gaze, and is, for example, a round shape displayed on the display section 13.

  2. Next, using the image-capturing section 12, the controller 16 repeatedly captures, L times, an image of the eyeball 1 of a user who is gazing at the first presented gaze point, while changing at least one of a position or a pose of the image-capturing section 12. Accordingly, the controller 16 acquires L images (first images) of the eyeball that correspond to the first presented gaze point (Step S102). The value of L is limited by the processing speed of the information processing device 100, but is favorably as large as possible for accuracy.

[0076] Note that, as a method for changing at least one of a position or a pose of the image-capturing section 12, there exist a method for virtually changing at least one of the position or the pose of the image-capturing section 12 in a virtual space, and a method for actually changing at least one of the position or the pose of the image-capturing section 12 in a real space. When virtually changing at least one of the position or the pose of the image-capturing section 12, the controller 16 changes a calculated position or a calculated pose of the image-capturing section 12 in the virtual space. On the other hand, when actually changing at least one of the position or the pose of the image-capturing section 12, an adjustment section is added to the head-mounted display.

[0077] The adjustment section is capable of adjusting at least one of a position or a pose of the image-capturing section 12. Note that the adjustment section may also be capable of adjusting at least one of a position or a pose of the display section 13 and at least one of a position or a pose of the active light source 11, in addition to adjusting at least one of the position or the pose of the image-capturing section 12 (the cases of adjusting the position and the pose of the display section 13 and of the active light source 11 will be described later).

[0078] The adjustment section may be capable of automatically performing the adjustment described above using a drive section such as an actuator. Alternatively, the adjustment section may be capable of manually performing the adjustment described above (for example, a dialing mechanism).

[0079] When the adjustment described above is manually performed, the controller 16 controls the display section 13 to display, to the user, guidance indicating how to move the image-capturing section 12 and at what speed to move it. For example, guidance, such as “Move more slowly”, “Move more broadly”, “Move similarly to the first movement”, and “Move again”, is displayed to be reported to the user. Note that audio guidance may be reported to the user instead.

[0080] The controller 16 may determine which of the various pieces of guidance described above is to be presented to the user, according to a condition such as whether a specified number (L) of pieces of line-of-sight-calibration data have been acquired with respect to the movement of the image-capturing section 12 in a certain range. Alternatively, the controller 16 may determine which of the various pieces of guidance is to be presented, according to a condition such as whether the image-capturing section 12 has been moved along the same route for each presented gaze point.
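An illustrative condition check for selecting the guidance message, with the thresholds and names assumed here rather than taken from the patent:

```python
def choose_guidance(num_samples: int, required_samples: int,
                    moved_range_mm: float, min_range_mm: float) -> str | None:
    """Pick a guidance message based on the conditions described above."""
    if moved_range_mm < min_range_mm:
        return "Move more broadly"
    if num_samples < required_samples:
        return "Move again"
    return None  # enough calibration data acquired over a sufficient range
```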

Using the image-capturing section 12, the controller 16 captures an image of the eyeball 1 of the user (after the guidance is displayed) while changing at least one of a position or a pose of the image-capturing section 12, and acquires position-and-pose information related to a position and a pose of the image-capturing section 12 in a virtual space or in a real space. The position-and-pose information may be considered information depending on a change in at least one of the position or the pose of the image-capturing section 12, the information including first position-and-pose information and second position-and-pose information that are different from each other. Then, the controller 16 associates the position-and-pose information regarding a position and a pose of the image-capturing section 12 with an image (first image) of the eyeball that is captured in a state in which the image-capturing section 12 is at that position and in that pose, and saves them in the storage 15.
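One possible record layout (an assumption for illustration, not the patent's data format) for associating each captured first image with the camera position and pose under which it was taken:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CalibrationSample:
    gaze_point_index: int        # which of the N presented gaze points
    camera_position: np.ndarray  # 3D position of the image-capturing section
    camera_pose: np.ndarray      # pose, e.g. a 3x3 rotation matrix
    eye_image: np.ndarray        # the captured first image

# L samples are accumulated per presented gaze point before saving to storage.
samples: list[CalibrationSample] = []
```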

……
