Sony Patent | Information processing apparatus, information processing method, and program
Publication Number: 20210181836
Publication Date: 2021-06-17
Applicant: Sony
Abstract
[Problem] To realize line-of-sight estimation with high accuracy in accordance with an individual characteristic. [Solution] Provided is an information processing apparatus that includes an arithmetic processing unit that performs arithmetic processing related to estimation of a line of sight of a user by using an eyeball model. The arithmetic processing unit dynamically estimates an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space. Further provided is an information processing method for performing, by a processor, arithmetic processing related to estimation of a line of sight of a user by using an eyeball model. The performing of the arithmetic processing includes dynamically estimating an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
Claims
1.
An information processing apparatus comprising: an arithmetic processing unit that performs arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein the arithmetic processing unit dynamically estimates an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
2.
The information processing apparatus according to claim 1, wherein the structure includes two spherical structures and a pupil.
3.
The information processing apparatus according to claim 1, wherein the individual parameter includes a cornea-pupil distance that is a distance between a pupil center and a corneal curvature center, and the arithmetic processing unit estimates the line of sight of the user by using the cornea-pupil distance.
4.
The information processing apparatus according to claim 3, wherein the arithmetic processing unit estimates a position of the pupil center in a three-dimensional space by using the cornea-pupil distance.
5.
The information processing apparatus according to claim 3, wherein the arithmetic processing unit calculates the cornea-pupil distance with which an error between a target point that is gazed at by the user and at least one of a visual axis and an optical axis is minimized.
6.
The information processing apparatus according to claim 5, wherein the error includes one of a distance and an angle between a vector that extends from a corneal curvature center to the target point and at least one of a line-of-sight vector and an optical axis vector.
7.
The information processing apparatus according to claim 5, wherein the arithmetic processing unit calculates the cornea-pupil distance with which the error is minimized, on the basis of a vector that extends from the corneal curvature center to the target point and a vector that extends from the corneal curvature center to the pupil center.
8.
The information processing apparatus according to claim 5, wherein the arithmetic processing unit calculates the cornea-pupil distance on the basis of input information on the single target point.
9.
The information processing apparatus according to claim 5, wherein the arithmetic processing unit calculates the cornea-pupil distance on the basis of a single eyeball image that is obtained when the user gazes at the target point.
10.
The information processing apparatus according to claim 5, wherein the arithmetic processing unit calculates the cornea-pupil distance by using a closed form.
11.
The information processing apparatus according to claim 5, wherein the arithmetic processing unit calculates the cornea-pupil distance by using an evaluation function that minimizes the error.
12.
The information processing apparatus according to claim 11, wherein the arithmetic processing unit calculates the cornea-pupil distance by differential operation on the evaluation function.
13.
The information processing apparatus according to claim 12, wherein the arithmetic processing unit calculates the cornea-pupil distance with which a derivative function of the evaluation function reaches zero.
14.
The information processing apparatus according to claim 1, wherein the individual parameter includes a corneal curvature radius, and the arithmetic processing unit estimates the line of sight of the user by using the estimated corneal curvature radius.
15.
The information processing apparatus according to claim 1, wherein the arithmetic processing unit estimates the line of sight of the user by a corneal reflection method.
16.
The information processing apparatus according to claim 1, further comprising: an image acquiring unit that acquires an image including a bright point on a cornea of the user.
17.
The information processing apparatus according to claim 5, further comprising: a display unit that displays the target point.
18.
The information processing apparatus according to claim 1, wherein the information processing apparatus is a device that is worn on a head of the user.
19.
An information processing method comprising: performing, by a processor, arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein performing the arithmetic processing includes dynamically estimating an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
20.
A program that causes a computer to function as an information processing apparatus that includes: an arithmetic processing unit that performs arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein the arithmetic processing unit dynamically estimates an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
Description
FIELD
[0001] The present disclosure relates to an information processing apparatus, an information processing method, and a program.
BACKGROUND
[0002] In recent years, a technology for estimating a user’s line of sight and using the estimated line of sight for various kinds of operation has been in widespread use. Further, a number of technologies for improving accuracy of line-of-sight detection have been developed. For example, Patent Literature 1 discloses a technology for improving detection accuracy of a corneal reflection image on a cornea in line-of-sight estimation using a corneal reflection method.
CITATION LIST
Patent Literature
[0003] Patent Literature 1: Japanese Laid-open Patent Publication No. 2016-106668
SUMMARY
Technical Problem
[0004] Meanwhile, if line-of-sight estimation using an eyeball model, such as the corneal reflection method, is performed, it is important to use an eyeball model that closely resembles an eyeball structure unique to a user in order to obtain a line-of-sight estimation result with high accuracy. However, in the technology described in Patent Literature 1, the line-of-sight estimation is performed by using an eyeball model that is determined in advance, so that it may be difficult to deal with a new user in some cases.
[0005] Therefore, the present disclosure proposes an information processing apparatus, an information processing method, and a program that are novel, improved, and capable of realizing line-of-sight estimation with high accuracy in accordance with an individual characteristic.
Solution to Problem
[0006] According to the present disclosure, an information processing apparatus is provided that includes: an arithmetic processing unit that performs arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein the arithmetic processing unit dynamically estimates an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
[0007] Moreover, according to the present disclosure, an information processing method is provided that includes: performing, by a processor, arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein performing the arithmetic processing includes dynamically estimating an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
[0008] Moreover, according to the present disclosure, a program is provided that causes a computer to function as an information processing apparatus that includes: an arithmetic processing unit that performs arithmetic processing related to estimation of a line of sight of a user by using an eyeball model, wherein the arithmetic processing unit dynamically estimates an individual parameter related to the eyeball model for each of users, and the individual parameter includes information on a relative position of a structure constituting an eyeball in a three-dimensional space.
Advantageous Effects of Invention
[0009] As described above, according to the present disclosure, it is possible to realize line-of-sight estimation with high accuracy in accordance with an individual characteristic.
[0010] Meanwhile, the effects described above are not restrictive. In addition to or in place of the above effects, any of the effects described in this specification or other effects that can be recognized from this specification may be achieved.
BRIEF DESCRIPTION OF DRAWINGS
[0011] FIG. 1 is a diagram for explaining a flow of line-of-sight estimation using a corneal reflection method.
[0012] FIG. 2 is a diagram for explaining factors to reduce line-of-sight estimation accuracy in the corneal reflection method.
[0013] FIG. 3 is a diagram illustrating an arrangement example of hardware in a case where an information processing apparatus according to one embodiment of the present disclosure is a wearable device.
[0014] FIG. 4 is a schematic side view illustrating a positional relationship between an eyeball of a user and the information processing apparatus in a case where the information processing apparatus according to the embodiment is worn on a head of the user.
[0015] FIG. 5 is a block diagram illustrating a functional configuration example of the information processing apparatus according to the embodiment.
[0016] FIG. 6 is a diagram illustrating an example of reduction of the line-of-sight estimation accuracy due to an individual difference of an eyeball structure.
[0017] FIG. 7 is a diagram for explaining minimization of error variance in a target coordinate system according to one embodiment of the present disclosure.
[0018] FIG. 8 is a diagram for explaining an effect of the line-of-sight estimation using an individual parameter that is estimated by a full search method according to the embodiment.
[0019] FIG. 9 is a diagram for explaining a calculation of a cornea-pupil distance d by a closed-form solution according to the embodiment.
[0020] FIG. 10 is a diagram illustrating the degree of influence of each factor that reduces the line-of-sight estimation accuracy when the individual parameter estimation according to the embodiment and calibration are performed.
[0021] FIG. 11 is a flowchart illustrating a flow of a line-of-sight estimation process according to the embodiment.
[0022] FIG. 12 is a flowchart illustrating a flow of individual parameter estimation using a plurality of target points according to the embodiment.
[0023] FIG. 13 is a flowchart illustrating a flow of individual parameter estimation using the closed-form solution according to the embodiment.
[0024] FIG. 14 is a diagram illustrating a hardware configuration example according to one embodiment of the present disclosure.
DESCRIPTION OF EMBODIMENTS
[0025] Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In this specification and the drawings, structural elements that have substantially the same functions and configurations will be denoted by the same reference symbols, and repeated explanation of the structural elements will be omitted.
[0026] In addition, hereinafter, explanation will be given in the following order.
[0027] 1. Embodiment
[0028] 1.1. Overview
[0029] 1.2. Arrangement example of hardware related to information processing apparatus 10
[0030] 1.3. Functional configuration example of information processing apparatus 10
[0031] 1.4. Details of line-of-sight estimation
[0032] 1.5. Flow of processes
[0033] 2. Hardware configuration example
[0034] 3. Conclusion
1. Embodiment
[0035] <<1.1. Overview>>
[0036] One example of a flow of line-of-sight estimation using a corneal reflection method will be described below. The corneal reflection method (also referred to as a pupil-corneal reflection method) is a method of applying light from a light source to an eyeball of a user and detecting reflected light and a position of a pupil on a corneal surface, to thereby estimate a user’s line-of-sight direction.
[0037] FIG. 1 is a diagram for explaining the flow of the line-of-sight estimation using the corneal reflection method. As illustrated in a lower left part in FIG. 1, an information processing apparatus that performs the line-of-sight estimation using the corneal reflection method first applies light from a light source 103 to an eyeball E of a user and causes an imaging unit 104 to capture an image including a corneal reflection image (also referred to as a Purkinje image or a bright point) on a corneal surface.
[0038] In an upper left part in FIG. 1, an eyeball image I that is obtained through the procedure as described above is illustrated. Subsequently, the information processing apparatus detects a pupil PU and a bright point s from the eyeball image I through image processing. In this case, the information processing apparatus may detect the pupil PU and the bright point s by using a statistical method, such as machine learning.
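The pupil and bright-point detection step described above can be illustrated with a minimal numpy-only sketch on a synthetic eye image. This is an assumption-laden toy (thresholds, image size, and pixel values are all illustrative); a real implementation would use robust image processing or the statistical methods mentioned in the paragraph, not plain thresholding:

```python
import numpy as np

def detect_pupil_and_glint(image, pupil_thresh=50):
    """Estimate the pupil center PU as the centroid of dark pixels, and the
    bright point s as the brightest pixel.  Thresholds are illustrative only."""
    ys, xs = np.nonzero(image < pupil_thresh)   # pupil candidate pixels
    pupil_center = (xs.mean(), ys.mean())       # centroid of dark region
    gy, gx = np.unravel_index(np.argmax(image), image.shape)
    return pupil_center, (gx, gy)               # both as (x, y)

# Synthetic eye image: gray background, dark pupil disk at (40, 30),
# one bright corneal glint at (45, 28)
img = np.full((60, 80), 128, dtype=np.uint8)
yy, xx = np.mgrid[0:60, 0:80]
img[(xx - 40) ** 2 + (yy - 30) ** 2 < 10 ** 2] = 20   # pupil disk
img[28, 45] = 255                                      # corneal glint
pupil, glint = detect_pupil_and_glint(img)
```

On this synthetic input, the dark-pixel centroid lands very close to the true pupil center and the brightest pixel recovers the glint position exactly.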
[0039] Then, the information processing apparatus calculates a line-of-sight vector of the user by using the detected pupil PU, the detected bright point s, and a three-dimensional eyeball model (hereinafter, also simply referred to as an eyeball model). An overview of a line-of-sight vector calculation using the eyeball model is illustrated on a right side in FIG. 1.
[0040] The information processing apparatus estimates a three-dimensional position of a corneal curvature center c that corresponds to a center of a cornea when the cornea is assumed as a spherical structure, on the basis of a position of the detected bright point and a position of the light source 103, for example. In this case, the information processing apparatus may obtain the three-dimensional position of the corneal curvature center c by using a corneal curvature radius r that is one of parameters (hereinafter, also referred to as eyeball parameters) related to the eyeball model.
[0041] Subsequently, the information processing apparatus estimates a three-dimensional position of a pupil center p, on the basis of the three-dimensional position of the corneal curvature center c and a cornea-pupil distance d that is one of the eyeball parameters. Meanwhile, the cornea-pupil distance d is an eyeball parameter that indicates a distance between the pupil center p and the corneal curvature center c.
[0042] Then, the information processing apparatus estimates an optical axis from the corneal curvature center c and the pupil center p that are estimated through the procedures as described above. For example, the information processing apparatus estimates a straight line connecting the corneal curvature center c and the pupil center p as the optical axis, and estimates a vector that extends from the corneal curvature center c through the pupil center p as an optical axis vector OA. In the corneal reflection method, the optical axis vector OA is detected as a line-of-sight direction of the user.
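The steps of paragraphs [0041] and [0042] (pupil center p from the corneal curvature center c and the cornea-pupil distance d, then the optical axis vector OA) can be sketched geometrically. One common realization, assumed here for illustration, intersects the camera ray toward the observed pupil image with a sphere of radius d around c; corneal refraction is ignored and all numbers are toy millimetre values, not the patent's full model:

```python
import numpy as np

def pupil_center_from_ray(c, ray_origin, ray_dir, d):
    """Return the nearer intersection of the camera ray toward the pupil
    image with a sphere of radius d (cornea-pupil distance) centered at the
    corneal curvature center c, taken as the 3-D pupil center p."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    oc = ray_origin - c
    b = np.dot(ray_dir, oc)
    disc = b * b - (np.dot(oc, oc) - d * d)
    if disc < 0:
        raise ValueError("ray misses the sphere of radius d around c")
    t = -b - np.sqrt(disc)              # nearer intersection (camera side)
    return ray_origin + t * ray_dir

def optical_axis(c, p):
    """Optical axis vector OA: unit vector from c through the pupil center p."""
    v = p - c
    return v / np.linalg.norm(v)

# Toy geometry: camera at the origin looking along +z, eyeball 30 mm away
c = np.array([0.0, 0.0, 30.0])          # corneal curvature center
ray = np.array([0.0, 0.0, 1.0])         # camera ray toward the pupil image
p = pupil_center_from_ray(c, np.zeros(3), ray, d=4.5)
oa = optical_axis(c, p)
```

With the ray aimed straight at c, the pupil center comes out 4.5 mm in front of c and OA points back toward the camera, as expected.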
[0043] However, the optical axis vector OA estimated as described above and an actual line-of-sight direction of the user (a line-of-sight vector VA) deviate from each other. A point of gaze (target point M) that is actually gazed at by the user is located on a visual axis connecting a central fovea f and the corneal curvature center c, and, in general, a difference (deviation) of about 4° to 8° occurs between the optical axis vector OA and the line-of-sight vector VA. Therefore, in the line-of-sight estimation using the corneal reflection method, it is common to perform calibration to correct the deviation between the optical axis vector OA and the line-of-sight vector VA in order to improve accuracy of the line-of-sight estimation.
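One simple calibration model for the optical-to-visual axis deviation just described treats it as a constant per-user angular offset in yaw and pitch, recoverable from a single calibration target. This constant-offset model is an assumption chosen for illustration; the patent does not prescribe this particular correction:

```python
import numpy as np

def vector_to_angles(v):
    """Convert a unit gaze vector to (yaw, pitch) in radians."""
    yaw = np.arctan2(v[0], v[2])
    pitch = np.arcsin(np.clip(v[1] / np.linalg.norm(v), -1.0, 1.0))
    return yaw, pitch

def angles_to_vector(yaw, pitch):
    """Inverse of vector_to_angles; returns a unit vector."""
    return np.array([np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch),
                     np.cos(pitch) * np.cos(yaw)])

def calibrate_offset(oa, va):
    """Per-user (yaw, pitch) offset between the observed optical-axis vector
    OA and the known visual-axis vector VA for one calibration target."""
    oy, op = vector_to_angles(oa)
    vy, vp = vector_to_angles(va)
    return vy - oy, vp - op

def correct(oa, offset):
    """Apply the stored offset to map an optical axis to a visual axis."""
    y, p = vector_to_angles(oa)
    return angles_to_vector(y + offset[0], p + offset[1])

# Example: visual axis deviates 5 degrees horizontally from the optical axis
oa = angles_to_vector(np.deg2rad(10.0), 0.0)
va = angles_to_vector(np.deg2rad(15.0), 0.0)
off = calibrate_offset(oa, va)
corrected = correct(oa, off)
```

After calibration, applying the offset to the optical axis reproduces the visual axis; in practice the offset would be averaged over several targets.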
[0044] In the above explanation using FIG. 1, the example has been described in which the information processing apparatus uses the vector connecting the pupil center p and the corneal curvature center c as an estimation result, but the information processing apparatus may use a vector connecting the corneal curvature center c and an eyeball center O (rotation center) as the estimation result, for example.
[0045] Further, in the line-of-sight estimation using the corneal reflection method, it is known that various factors lead to reduction of line-of-sight estimation accuracy. FIG. 2 is a diagram for explaining factors to reduce the line-of-sight estimation accuracy in the corneal reflection method. In FIG. 2, a horizontal axis represents the factors to reduce the line-of-sight estimation accuracy, and a vertical axis represents a magnitude of an angular error that occurs due to each of the factors.
[0046] With reference to FIG. 2, the factors for reduction are largely classified into three kinds. In other words, the factors include a detection error due to image processing including pupil detection and bright point detection, an error of the eyeball parameter including the cornea-pupil distance and the corneal curvature radius, and an error due to a hardware mounting position including an LED position, a camera position, and camera posture.
[0047] Further, with reference to FIG. 2, among the above-described three kinds of factors for reduction, it is indicated that the error of the eyeball parameters, including the cornea-pupil distance, has the highest degree of influence. The error due to the eyeball parameters as described above is caused by a difference between the eyeball structure of the user and the eyeball model used in the line-of-sight estimation. In general, human eyeball structures vary among individuals, and the cornea-pupil distance and the corneal curvature radius differ for each user. Therefore, if the line-of-sight estimation is performed using an average eyeball model, the actual eyeball structure of a certain user may differ greatly from the average eyeball model, and the line-of-sight estimation accuracy may be reduced as a result.
[0048] A technical idea according to one embodiment of the present disclosure is conceived in view of the foregoing point and makes it possible to realize the line-of-sight estimation with high accuracy in accordance with an individual characteristic. Therefore, one of features of the information processing apparatus, the information processing method, and the program according to one embodiment of the present disclosure is to dynamically estimate an individual parameter related to an eyeball model for each of users. Further, the individual parameter as described above is an eyeball parameter that is unique to a user and that is related to the eyeball model, and may include information on a relative position of a structure of an eyeball in a three-dimensional space.
[0049] With the feature of the information processing apparatus, the information processing method, and the program according to one embodiment of the present disclosure as described above, it is possible to perform the line-of-sight estimation using a highly-accurate eyeball model for each of users, so that it is possible to improve accuracy of the line-of-sight estimation.
[0050] The above-described feature of the information processing apparatus, the information processing method, and the program according to the embodiment and effects achieved by the feature will be described in detail below.
[0051] <<1.2. Arrangement Example of Hardware Related to Information Processing Apparatus 10>>
[0052] One example of arrangement of hardware of an information processing apparatus 10 according to the embodiment of the present disclosure will be described below. The information processing apparatus 10 according to the embodiment may be, for example, a head-mounted display to be worn on a head of a user or a glasses-type wearable device. FIG. 3 is a diagram illustrating a hardware arrangement example in a case where the information processing apparatus 10 according to the embodiment is a wearable device. Further, FIG. 4 is a schematic side view illustrating a positional relationship between an eyeball E of a user and the information processing apparatus 10 in a case where the information processing apparatus 10 is worn on a head of the user.
[0053] FIG. 3 illustrates a configuration of the information processing apparatus 10 when viewed from a side that faces eyes of the user. With reference to FIG. 3, the information processing apparatus 10 according to the embodiment includes displays 102R and 102L at positions corresponding to a right eye and a left eye of the user. As illustrated in FIG. 3, the displays 102R and 102L according to the embodiment may be formed in approximately rectangular shapes. Further, in a body 101, a recessed portion 101a at which a nose of the user is located may be formed between the displays 102R and 102L.
[0054] The displays 102R and 102L according to the embodiment may be, for example, liquid crystal displays, organic electroluminescence (EL) displays, or lenses on which information is displayed by a projection device.
[0055] Four light sources 103Ra to 103Rd are arranged at approximate centers of the four sides around the display 102R. Similarly, four light sources 103La to 103Ld are arranged at approximate centers of the four sides around the display 102L. The light sources 103Ra to 103Rd and 103La to 103Ld according to the embodiment may be, for example, infrared light emitting diodes (IR LEDs) that emit infrared light. The light sources 103Ra to 103Rd and 103La to 103Ld emit infrared light toward the opposing right eye and left eye of the user.
[0056] Meanwhile, the light sources 103Ra to 103Rd and 103La to 103Ld according to the embodiment need not always be IR LEDs, but may be any light sources that emit light with a wavelength suitable for detecting the bright point.
[0057] Further, imaging units 104R and 104L that capture images of the eyeballs E of the user are arranged in the peripheries of the displays 102R and 102L. The imaging units 104R and 104L are arranged on lower sides of the displays 102R and 102L (below the light sources 103Rc and 103Lc) as illustrated in FIG. 3, for example.
[0058] Furthermore, as illustrated in FIG. 4, the imaging units 104R and 104L are arranged such that at least the pupils PU of the eyeballs E to be captured are included in the imaging ranges. For example, the imaging units 104R and 104L may be arranged at a predetermined elevation angle θ. The elevation angle θ may be, for example, about 30°.
[0059] Meanwhile, the information processing apparatus 10 is configured such that the displays 102R and 102L are separated from the eyeballs E of the user by a predetermined distance when the information processing apparatus is worn by the user. With this configuration, the user who has worn the information processing apparatus 10 is able to view the display regions of the displays 102R and 102L in the user's visual field without feeling discomfort. In this case, the distance between each of the displays 102R and 102L and the eyeballs E of the user may be determined such that, even when the user wears glasses G, the information processing apparatus 10 can be worn over the glasses G. The imaging units 104R and 104L are arranged such that the pupils PU of the eyeballs E of the user are included in the imaging ranges in the above-described state.
[0060] Thus, the arrangement example of the hardware of the information processing apparatus 10 according to the embodiment has been described above. While the example has been described above in which the information processing apparatus 10 according to the embodiment is implemented as a wearable device to be worn on the head of the user, the information processing apparatus 10 according to the embodiment is not limited to this example. The information processing apparatus 10 according to the embodiment may be a server, a general-purpose computer, a smartphone, a tablet, or the like that performs arithmetic processing based on a captured image. The information processing apparatus 10 according to the embodiment may be various apparatuses that perform arithmetic processing related to the line-of-sight estimation.
[0061] <<1.3. Functional Configuration Example of Information Processing Apparatus 10>>
[0062] Next, a functional configuration example of the information processing apparatus 10 according to the embodiment will be described. FIG. 5 is a block diagram illustrating the functional configuration example of the information processing apparatus 10 according to the embodiment. With reference to FIG. 5, the information processing apparatus 10 according to the embodiment includes an illumination unit 110, an image acquiring unit 120, an arithmetic processing unit 130, a display unit 140, and a storage unit 150.
[0063] (Illumination Unit 110)
[0064] The illumination unit 110 according to the embodiment has a function to emit light to the eyeballs E of the user who has worn the information processing apparatus 10. To cope with this, the illumination unit 110 according to the embodiment includes the light source 103 that has been described above with reference to FIG. 3. The illumination unit 110 may emit light on the basis of control performed by the arithmetic processing unit 130.
[0065] (Image Acquiring Unit 120)
[0066] The image acquiring unit 120 according to the embodiment captures images of the eyeballs E of the user who has worn the information processing apparatus 10. More specifically, the image acquiring unit 120 acquires images of the eyeballs E including the bright points on the corneas of the user. To cope with this, the image acquiring unit 120 according to the embodiment includes the imaging unit 104 that has been described above with reference to FIG. 3. The image acquiring unit 120 may capture the images of the eyeballs E on the basis of control performed by the arithmetic processing unit 130.
[0067] (Arithmetic Processing Unit 130)
[0068] The arithmetic processing unit 130 according to the embodiment has a function to perform arithmetic processing related to the line-of-sight estimation for the user by using the three-dimensional eyeball model. Further, the arithmetic processing unit 130 may function as a control unit that controls each of the components included in the information processing apparatus 10. With the arithmetic processing unit 130 according to the embodiment, it is possible to realize the line-of-sight estimation with high accuracy by estimating an individual parameter related to the eyeball model for each of users. Meanwhile, the individual parameter according to the embodiment indicates an eyeball parameter that is unique to the user and that depends on a characteristic of an eyeball structure. The functions of the arithmetic processing unit 130 according to the embodiment will be described in detail later.
[0069] (Display Unit 140)
[0070] The display unit 140 according to the embodiment has a function to display visual information. The display unit 140 may display, for example, visual information corresponding to a user’s line of sight that is estimated by the arithmetic processing unit 130. Further, the display unit 140 according to the embodiment displays a target point that is a target to be gazed at by the user, on the basis of control performed by the arithmetic processing unit 130. The display unit 140 according to the embodiment includes the display 102 that has been described above with reference to FIG. 3.
[0071] (Storage Unit 150)
[0072] The storage unit 150 according to the embodiment stores therein various kinds of information that are used for the line-of-sight estimation by the arithmetic processing unit 130. The storage unit 150 stores therein, for example, the eyeball parameters (individual parameters), such as the cornea-pupil distance and the corneal curvature radius, that are estimated by the arithmetic processing unit 130, various programs, calculation results, and the like.
[0073] Thus, the functional configuration of the information processing apparatus 10 according to the embodiment has been described above. The configuration as described above with reference to FIG. 5 is one example, and the functional configuration of the information processing apparatus 10 according to the embodiment is not limited to this example. For example, the information processing apparatus 10 according to the embodiment need not include the illumination unit 110, the image acquiring unit 120, the display unit 140, or the like. As described above, the information processing apparatus 10 according to the embodiment may be a server that performs arithmetic processing related to the line-of-sight estimation on the basis of an image that is captured by a different device, such as a wearable device. The functional configuration of the information processing apparatus 10 according to the embodiment may be flexibly modified depending on specifications or operation.
[0074] <<1.4. Details of Line-of-Sight Estimation>>
[0075] Details of the line-of-sight estimation performed by the arithmetic processing unit 130 according to the embodiment will be described below. As has been described above with reference to FIG. 2, in the line-of-sight estimation using the corneal reflection method, an individual difference related to the eyeball parameters, such as the cornea-pupil distance and the corneal curvature radius, may become large factors to reduce the line-of-sight estimation accuracy.
[0076] An example of the reduction of the line-of-sight estimation accuracy due to the individual difference of the eyeball structure will be described below. FIG. 6 is a diagram illustrating an example of the reduction of the line-of-sight estimation accuracy due to the individual difference of the eyeball structure. The upper part of FIG. 6 illustrates a positional relationship between a target point M and an estimated viewpoint position ep in a case where the eyeball structure of the user coincides with a general eyeball model when the line-of-sight estimation is performed using the general eyeball model. The lower part of FIG. 6 illustrates the positional relationship between the target point M and the estimated viewpoint position ep in a case where the eyeball structure of the user does not coincide with the general eyeball model. Meanwhile, in each of the upper part and the lower part of FIG. 6, the positional relationship obtained when the user gazes in a horizontal direction, the positional relationship obtained when the user gazes in an upward direction, the positional relationship obtained when the user gazes in a downward direction, and the error obtained by normalization using target coordinates are illustrated from left to right.
[0077] Here, it is assumed that the corneal curvature radius r is 7.7 mm and the cornea-pupil distance d is 4.5 mm in the general eyeball model as described above. Further, the target point M is visual information that is displayed on the display unit 140 as a point to be gazed at by the user at the time of calibration. Meanwhile, for simplicity of explanation, FIG. 6 schematically illustrates a case in which it is assumed that there is no difference (deviation) between the optical axis and the visual axis.
[0078] Focusing on the upper part of FIG. 6, in the case of a user whose corneal curvature radius r and cornea-pupil distance d coincide with those of the general eyeball model, the estimated viewpoint position ep coincides with the target point M regardless of the direction in which the gazed-at target point M is displayed. In this manner, if the eyeball model coincides with the eyeball structure of the user, it is possible to realize the line-of-sight estimation with high accuracy.
[0079] In contrast, focusing on the lower part of FIG. 6, in the case of a user whose cornea-pupil distance d (4.0 mm) differs from that of the eyeball model, a large error occurs in the estimated viewpoint position ep when the user gazes at a target point M, such as one displayed above or below, for which an angular difference exists with respect to the front direction. In this manner, if an eyeball parameter of the eyeball model, such as the cornea-pupil distance d or the corneal curvature radius r, differs from the eyeball structure of the user, the error between the target point M and the estimated viewpoint position ep increases and the line-of-sight estimation accuracy is significantly reduced.
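The effect described above can be reproduced with a simplified geometric sketch (not taken from the patent; the pinhole camera at the origin, the 40 mm camera-to-eye distance, and the exact recovery of the corneal curvature center c are all simplifying assumptions). When the 3-D pupil center is recovered by intersecting the camera ray through the observed pupil with a sphere of radius d around c, a wrong model d leaves frontal gaze unaffected but bends the estimated optical axis for upward or downward gaze, as in the lower part of FIG. 6:

```python
import numpy as np

def estimated_optical_axis(c, gaze_dir, d_true, d_model):
    """Estimate the optical axis when the pupil center is recovered by
    intersecting the camera ray toward the true pupil with a sphere of
    radius d_model around the corneal curvature center c (camera at origin)."""
    p_true = c + d_true * gaze_dir           # actual 3-D pupil center
    ray = p_true / np.linalg.norm(p_true)    # camera ray toward the pupil
    # Solve |t*ray - c|^2 = d_model^2 for the nearest intersection t.
    b = ray @ c
    t = b - np.sqrt(b * b - (c @ c - d_model ** 2))
    p_est = t * ray                          # estimated pupil center
    axis = p_est - c
    return axis / np.linalg.norm(axis)

c = np.array([0.0, 0.0, 40.0])   # corneal curvature center, 40 mm from camera (mm)
d_true, d_model = 4.0, 4.5       # user's actual d vs. the general model's d

for deg in (0.0, 20.0):          # gaze straight at the camera vs. 20 degrees up
    a = np.radians(deg)
    g = np.array([0.0, np.sin(a), -np.cos(a)])  # unit gaze direction
    est = estimated_optical_axis(c, g, d_true, d_model)
    err = np.degrees(np.arccos(np.clip(est @ g, -1.0, 1.0)))
    print(f"gaze {deg:4.1f} deg up -> optical-axis error {err:.2f} deg")
```

With these illustrative numbers, the frontal gaze yields no axis error, while the 20-degree upward gaze yields an error of a few degrees, mirroring the behavior described for FIG. 6.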
[0080] To cope with this, the arithmetic processing unit 130 according to the embodiment realizes the line-of-sight estimation with high accuracy by dynamically estimating, for each user, the eyeball parameter (individual parameter) unique to that user. In other words, the arithmetic processing unit 130 according to the embodiment is able to eliminate the factor that has the largest influence on the reduction of the line-of-sight estimation accuracy by performing the line-of-sight estimation using a unique eyeball model that matches the characteristic of the eyeball structure of each user.
[0081] Meanwhile, as one of the features of the individual parameter estimated by the arithmetic processing unit 130 according to the embodiment, the individual parameter includes information on a relative position, in a three-dimensional space, of a structure constituting the eyeball. Here, the above-described structure includes two spherical structures and a pupil. Further, the two spherical structures include a cornea and an eyeball body including a corpus vitreum (vitreous body).
[0082] For example, the arithmetic processing unit 130 according to the embodiment may estimate, as the individual parameter, the cornea-pupil distance d, which is the distance between the pupil center p and the corneal curvature center c, that is, the center of the cornea when the cornea is regarded as a spherical structure, and may estimate the user's line of sight by using the cornea-pupil distance d.
[0083] Further, for example, if a vector connecting the corneal curvature center c and an eyeball center O, which is the center of the eyeball body, is used as the estimation result, the arithmetic processing unit 130 according to the embodiment may estimate, as the individual parameter, the distance between the corneal curvature center c and the eyeball center O.
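The eyeball model sketched in the preceding paragraphs can be summarized as a small data structure. This is a minimal illustrative sketch, not the patent's implementation: the class and field names are invented, the 7.7 mm and 4.5 mm values come from the general model in the text, and the 5.3 mm cornea-to-eyeball-center distance is a made-up placeholder, since the text gives no value for it:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class EyeballModel:
    """Per-user eyeball model: two spherical structures (cornea and
    eyeball body) plus the pupil, described by relative distances."""
    corneal_curvature_radius: float   # r, in mm (cornea treated as a sphere)
    cornea_pupil_distance: float      # d, in mm: corneal curvature center c to pupil center p
    cornea_eyeball_distance: float    # in mm: corneal curvature center c to eyeball center O

    def pupil_center(self, c: np.ndarray, gaze_dir: np.ndarray) -> np.ndarray:
        """Pupil center p lies d along the optical axis in front of c."""
        return c + self.cornea_pupil_distance * gaze_dir

    def eyeball_center(self, c: np.ndarray, gaze_dir: np.ndarray) -> np.ndarray:
        """Eyeball center O lies behind c along the optical axis."""
        return c - self.cornea_eyeball_distance * gaze_dir

# The "general" model from the text vs. a user-specific one:
general = EyeballModel(7.7, 4.5, 5.3)   # 5.3 mm is an illustrative placeholder
user    = EyeballModel(7.7, 4.0, 5.3)   # user with a shorter cornea-pupil distance
```

Dynamically estimating the individual parameter then amounts to replacing the fixed values of `general` with per-user estimates such as those of `user`.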
[0084] Meanwhile, in the following, the explanation will be continued using the example in which the arithmetic processing unit 130 according to the embodiment estimates the cornea-pupil distance d as the individual parameter. By using the estimated cornea-pupil distance d to estimate the position of the pupil center in the three-dimensional space, the arithmetic processing unit 130 according to the embodiment is able to calculate the optical axis vector.
[0085] In this case, the arithmetic processing unit 130 according to the embodiment may calculate the cornea-pupil distance d or the corneal curvature radius that minimizes an error between the target point M that is gazed at by the user and the visual axis or the optical axis. Here, the above-described error may be a distance or an angle between a vector that extends from the corneal curvature center c to the target point and the line-of-sight vector or the optical axis vector.
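The calibration idea in the paragraph above can be sketched as a one-dimensional search: among candidate values of the cornea-pupil distance d, choose the one that minimizes the angle between the vector from the corneal curvature center c to the calibration target point M and the optical axis estimated with that candidate d. Everything below is an illustrative assumption (scene geometry, function names, target positions, candidate range); it ignores the optical-axis/visual-axis deviation, as the text does for FIG. 6:

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two 3-D vectors."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0)))

def estimate_pupil_center(c, pupil_ray, d):
    """Intersect the camera ray toward the pupil image with a sphere of
    radius d around the corneal curvature center c (camera at origin)."""
    b = pupil_ray @ c
    t = b - np.sqrt(b * b - (c @ c - d ** 2))   # nearest intersection
    return t * pupil_ray

def fit_cornea_pupil_distance(c, pupil_rays, targets, candidates):
    """Return the candidate d minimizing the mean angle between the
    estimated optical axis and the vector from c to each target M."""
    def mean_err(d):
        return np.mean([angle_deg(estimate_pupil_center(c, ray, d) - c, M - c)
                        for ray, M in zip(pupil_rays, targets)])
    return min(candidates, key=mean_err)

# Simulated check: observations of a user whose true d is 4.0 mm.
c = np.array([0.0, 0.0, 40.0])                  # corneal curvature center (mm)
d_true = 4.0
targets = [np.array([0.0, 50.0, -260.0]),       # illustrative target points M
           np.array([0.0, -50.0, -260.0]),
           np.array([30.0, 0.0, -260.0])]
rays = []
for M in targets:
    g = (M - c) / np.linalg.norm(M - c)         # gaze direction toward M
    p = c + d_true * g                          # true pupil center
    rays.append(p / np.linalg.norm(p))          # camera ray toward the pupil
best = fit_cornea_pupil_distance(c, rays, targets, np.arange(3.5, 5.51, 0.1))
print(best)
```

In this noise-free simulation the search recovers a value close to the true 4.0 mm; in practice the same criterion could be minimized over the corneal curvature radius as well, as the text notes.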
……
……
……