Meta Patent | Eye tracking via dense point cloud scanning using a light beam source and a reflective and/or diffractive surface

Publication Number: 20250291411

Publication Date: 2025-09-18

Assignee: Meta Platforms Technologies

Abstract

Augmented reality and/or virtual reality (AR/VR) near-eye display devices that implement eye tracking via dense point cloud scanning are disclosed. In examples, an eye tracking system for an AR/VR display device comprises a light beam emission and sensor element located in a first location of the display device to emit a light beam. The eye tracking system may comprise a redirection element located in a second location of the display device to redirect the light beam to illuminate one of a line-of-sight and a field-of-view (FOV) of the display device.

Claims

1. An eye tracking system for an augmented reality (AR)/virtual reality (VR) display device, the eye tracking system comprising:
a light beam emission and sensor element located in a first location of the display device to emit a light beam; and
a redirection element located in a second location of the display device to redirect the light beam to illuminate one of a line-of-sight and a field-of-view (FOV) of the display device.

2. The eye tracking system of claim 1, wherein the light beam emission and sensor element includes one or more of a light beam source, a micro-electromechanical system (MEMS), and a photodetector (PD).

3. The eye tracking system of claim 1, wherein the light beam emission and sensor element further includes a lens to broaden a scope of space over which the light beam is to scan.

4. The eye tracking system of claim 1, wherein the first location is a temple arm of the display device.

5. The eye tracking system of claim 1, wherein the second location is a display lens of the display device.

6. The eye tracking system of claim 5, wherein the redirection element includes a diffractive surface.

7. The eye tracking system of claim 5, wherein the redirection element includes a reflective surface.

8. The eye tracking system of claim 1, wherein the light beam emission and sensor element is to acquire data associated with at least one of intensity, depth, and velocity for eye tracking.

9. The eye tracking system of claim 1, wherein the redirection element is further to collimate the light beam.

10. The eye tracking system of claim 1, wherein the redirection element includes a thin, transparent substrate having a pattern of rectangular grooves arranged to diffract display light.

11. The eye tracking system of claim 1, wherein the redirection element includes a reflective film, wherein the reflective film includes one or more multi-layer dielectric films tuned to reflect a particular wavelength band and a particular polarization.

12. The eye tracking system of claim 1, wherein the redirection element includes an anti-reflection coating for visible light.

13. The eye tracking system of claim 1, wherein the redirection element includes a holographic optical element (HOE) film.

14. A method for implementing eye tracking via dense point cloud scanning, the method comprising:
positioning a light beam emission and sensor element at a first location of a display device;
positioning a redirection element at a second location of the display device to redirect light to illuminate one of a line-of-sight and a field-of-view (FOV) of the display device;
emitting, via a first light beam emission and sensor element, a first light beam towards a user's eye;
emitting, via a second light beam emission and sensor element, a second light beam towards the redirection element; and
redirecting the second light beam towards the user's eye.

15. The method of claim 14, wherein the redirection element includes a diffractive surface.

16. The method of claim 14, wherein the redirection element includes a reflective surface.

17. The method of claim 14, wherein the redirection element includes a thin, transparent substrate having a pattern of rectangular grooves arranged to diffract display light.

18. A non-transitory computer readable medium configured to store program code instructions that, when executed by a processor, cause the processor to perform steps comprising:
emitting, via a first light beam emission and sensor element, a first light beam towards a user's eye;
emitting, via a second light beam emission and sensor element, a second light beam towards a redirection element; and
redirecting the second light beam towards the user's eye.

19. The non-transitory computer readable medium of claim 18, wherein the instructions, when executed by the processor, cause the processor to collimate, via the redirection element, at least one of the first light beam and the second light beam.

20. The non-transitory computer readable medium of claim 18, wherein the instructions, when executed by the processor, cause the processor to adjust a lens of the first light beam emission and sensor element to broaden a scope of space over which the first light beam is to scan.

Description

PRIORITY

The present application claims priority to U.S. provisional patent application Ser. No. 63/566,740, filed on Mar. 18, 2024, which is incorporated by reference in its entirety.

TECHNICAL FIELD

The present application relates generally to augmented reality (AR) and/or virtual reality (VR) near-eye display devices, and in particular, to implementing eye tracking via dense point cloud scanning using a light beam source and a reflective and/or diffractive surface.

BACKGROUND

Many augmented reality (AR) and virtual reality (VR) devices implement eye tracking. For example, in some instances, eye tracking technologies may monitor (or “sense”) a user's “gaze” during use of an augmented reality (AR) and virtual reality (VR) device.

In some instances, self-mixing interferometry (SMI) may be implemented as a sensing technique. In some examples, self-mixing interferometry (SMI) may utilize a vertical-cavity surface-emitting laser (VCSEL) and a photodetector (PD).
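The self-mixing interferometry (SMI) principle referenced above can be illustrated with a minimal simulation: light reflected by a moving target (e.g., the eye surface) re-enters the laser cavity and modulates the laser's output power at the Doppler beat frequency f_D = 2v/λ, from which the target's velocity can be recovered. The sketch below is illustrative only; the VCSEL wavelength, sampling rate, and target velocity are assumed values for demonstration, not parameters from this disclosure.

```python
import numpy as np

WAVELENGTH = 850e-9   # m, typical near-infrared VCSEL wavelength (assumption)
TRUE_VELOCITY = 0.01  # m/s, assumed target velocity
FS = 200_000          # Hz, assumed photodetector sampling rate
N = 8192              # number of samples

t = np.arange(N) / FS
f_beat = 2 * TRUE_VELOCITY / WAVELENGTH  # Doppler beat frequency, f_D = 2v/lambda

# Photodetector signal: DC level plus small SMI modulation plus noise
signal = 1.0 + 0.05 * np.cos(2 * np.pi * f_beat * t)
signal += 0.005 * np.random.default_rng(0).standard_normal(N)

# Recover the velocity from the dominant spectral peak (DC removed first)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(N, 1 / FS)
f_est = freqs[np.argmax(spectrum)]
v_est = f_est * WAVELENGTH / 2
```

The estimated velocity `v_est` matches `TRUE_VELOCITY` to within the FFT's frequency resolution, showing how an SMI sensor can derive velocity (and, with frequency-modulated variants, depth) from a single laser-plus-photodetector element.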

In some instances, these self-mixing interferometry (SMI) elements may be included in an optical “stack” (or “pancake stack”) of a display device. Moreover, in some instances, the optical stack may be located in a field-of-view (FOV) of a display device, such as being embedded in a display lens of the display device. In some instances, this may lead to issues in visibility and design aesthetic. Moreover, in some instances, issues with integration of these elements (e.g., trace placement, component bonding, etc.) may be complex, and may lead to increased costs.

BRIEF DESCRIPTION OF DRAWINGS

Features of the present disclosure are illustrated by way of example, and are not limited in the following figures, in which like numerals indicate like elements. One skilled in the art will readily recognize from the following that alternative examples of the structures and methods illustrated in the figures can be employed without departing from the principles described herein.

FIG. 1 illustrates a block diagram of an artificial reality (AR) system environment including a near-eye display, according to an example.

FIGS. 2A-2C illustrate various views of a near-eye display device in the form of a head-mounted display (HMD) device, according to examples.

FIG. 3 illustrates a perspective view of a near-eye display in the form of a pair of glasses, according to an example.

FIG. 4 illustrates an optical system for eye tracking via a dense point cloud scanning using a light beam source and a reflective and/or diffractive surface, according to an example.

FIG. 5 illustrates an optical system for eye tracking via a dense point cloud scanning using a light beam source and a reflective and/or diffractive surface, according to an example.

FIG. 6 illustrates diffractive elements of a diffractive surface on a display lens, according to an example.

FIG. 7 illustrates an optical system for eye tracking via a dense point cloud scanning using a light beam source and a reflective and/or diffractive surface, according to an example.

FIG. 8 illustrates a flow diagram for a method for implementing eye tracking via a dense point cloud scanning using a light beam source and a reflective and/or diffractive surface, according to some examples.
