

Patent: Polarization capture device, system, and method


Publication Number: 20210084284

Publication Date: 2021-03-18

Applicant: Facebook

Abstract

A device includes a first lens. The device also includes a first polarized image sensor coupled with the first lens and configured to capture, from a first perspective, a first set of image data in a plurality of polarization orientations. The device also includes a second lens disposed apart from the first lens. The device further includes a second polarized image sensor coupled with the second lens and configured to capture, from a second perspective different from the first perspective, a second set of image data in the plurality of polarization orientations.

Claims

  1. A device, comprising: a first lens; a first polarized image sensor coupled with the first lens and configured to capture, from a first perspective, a first set of image data in a plurality of polarization orientations; a second lens disposed apart from the first lens; and a second polarized image sensor coupled with the second lens and configured to capture, from a second perspective different from the first perspective, a second set of image data in the plurality of polarization orientations.

  2. The device of claim 1, wherein at least one of the first polarized image sensor or the second polarized image sensor comprises: a microlens array; a pixel array coupled with the microlens array; and a polarizer array disposed between the microlens array and the pixel array.

  3. The device of claim 2, wherein the polarizer array comprises a plurality of polarizers having different transmission axis orientations arranged in a predetermined pattern.

  4. The device of claim 3, wherein: the plurality of polarizers comprise one or more arrays, each array comprising: a linear horizontal polarizer configured to allow a light having a horizontal polarization orientation to transmit through; a linear vertical polarizer configured to allow a light having a vertical polarization orientation to transmit through; a linear 45-degree polarizer configured to allow a light having a 45° polarization orientation to transmit through; and a linear 135-degree polarizer configured to allow a light having a 135° polarization orientation to transmit through.

  5. The device of claim 2, wherein at least one of the first polarized image sensor or the second polarized image sensor further comprises: a color filter array disposed between the microlens array and the pixel array and including a plurality of color filters arranged in a predetermined color filter pattern.

  6. The device of claim 5, wherein: the plurality of color filters comprise at least one of a red color filter, a green color filter, or a blue color filter, and the predetermined color filter pattern is a Bayer filter pattern.

  7. The device of claim 1, further comprising: a processor coupled to the first polarized image sensor and the second polarized image sensor, and configured to: construct a polarization color image corresponding to a polarization orientation of the plurality of polarization orientations based on at least one of the first set of image data or the second set of image data obtained in the plurality of polarization orientations; and determine one or more polarization parameters based on the polarization color image, the one or more polarization parameters including one or more of a Stokes parameter, a degree of linear polarization (“DOLP”), and an angle of linear polarization (“AOLP”).

  8. The device of claim 7, wherein the processor is further configured to: obtain polarization image data corresponding to the polarization orientation of the plurality of polarization orientations based on at least one of the first set of image data or the second set of image data through a polarization interpolation; and construct the polarization color image corresponding to the polarization orientation of the plurality of polarization orientations based on the corresponding polarization image data through a color interpolation.

  9. The device of claim 7, wherein the processor is further configured to: determine one or more Stokes parameters based on one or more optical powers of a light corresponding to the plurality of polarization orientations.

  10. The device of claim 9, wherein the processor is further configured to: determine a DOLP value for each pixel based on the one or more Stokes parameters; and determine an AOLP value for each pixel based on the one or more Stokes parameters.

  11. The device of claim 7, wherein the processor is further configured to: determine depth information of an object based on the first set of image data and the second set of image data.

  12. A system, comprising: a first polarization camera configured to capture a first set of image data from a first perspective in a plurality of polarization orientations; and a second polarization camera configured to capture a second set of image data from a second perspective different from the first perspective in the plurality of polarization orientations.

  13. The system of claim 12, wherein at least one of the first polarization camera or the second polarization camera comprises: a lens; and a polarized image sensor optically coupled to the lens.

  14. The system of claim 13, wherein the polarized image sensor comprises: a microlens array; a pixel array coupled with the microlens array; a color filter array disposed between the microlens array and the pixel array and including a plurality of color filters arranged in a predetermined color filter pattern; and a polarizer array disposed between the microlens array and the color filter array, and including a plurality of polarizers associated with different polarization orientations arranged in a predetermined pattern.

  15. The system of claim 13, wherein at least one of the first polarization camera or the second polarization camera further comprises: a processor coupled to the polarized image sensor and configured to: construct a polarization color image corresponding to a polarization orientation of the plurality of polarization orientations based on at least one of the first set of image data or the second set of image data obtained in the plurality of polarization orientations; and determine one or more polarization parameters based on the polarization color image, the one or more polarization parameters including one or more of a Stokes parameter, a degree of linear polarization (“DOLP”), and an angle of linear polarization (“AOLP”).

  16. The system of claim 15, wherein the processor is further configured to: determine depth information of an object based on a disparity of the object in the first set of image data and the second set of image data.

  17. A method, comprising: obtaining a first set of image data in a plurality of polarization orientations through a first polarization camera in a first perspective; obtaining a second set of image data in the plurality of polarization orientations through a second polarization camera in a second perspective different from the first perspective; and determining, through a processor, multi-modal data based on the first set of image data and the second set of image data, the multi-modal data comprising data for a plurality of polarization color images, one or more polarization parameters, and depth information.

  18. The method of claim 17, wherein obtaining the multi-modal data comprises: constructing, through the processor, a polarization color image of the plurality of polarization color images based on at least one of the first set of image data or the second set of image data through a polarization interpolation and a color interpolation; and determining, through the processor, the one or more polarization parameters based on the polarization color image, the one or more polarization parameters including one or more of a Stokes parameter, a degree of linear polarization (“DOLP”), and an angle of linear polarization (“AOLP”).

  19. The method of claim 18, wherein obtaining the multi-modal data comprises: determining a DOLP value for each pixel based on one or more Stokes parameters; and determining an AOLP value for each pixel based on the one or more Stokes parameters.

  20. The method of claim 18, wherein obtaining the multi-modal data further comprises: determining, through the processor, depth information of an object based on a disparity of the object in the first set of image data and the second set of image data.

Description

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of priority to U.S. Provisional Patent Application No. 62/901,452, filed on Sep. 17, 2019, the entire content of which is incorporated by reference.

TECHNICAL FIELD

[0002] The present disclosure generally relates to imaging technologies and, more specifically, to a polarization capture device, system, and method.

BACKGROUND

[0003] Cameras have been widely used in a large variety of devices, such as mobile phones, augmented reality (“AR”) devices, virtual reality (“VR”) devices, vehicles, drones, and detecting systems for various applications in atmospheric science, remote sensing, facial recognition, eye-tracking, machine vision, and the like. An object may produce polarized features that are related to the nature of the object when reflecting, diffracting, transmitting, refracting, and/or scattering an incoming light. Therefore, polarization information may be used to determine various properties of the object. Polarization cameras have been used to capture images of objects that include such polarization information.

SUMMARY

[0004] One aspect of the present disclosure provides a device that includes a first lens. The device also includes a first polarized image sensor coupled with the first lens and configured to capture, from a first perspective, a first set of image data in a plurality of polarization orientations. The device also includes a second lens disposed apart from the first lens. The device further includes a second polarized image sensor coupled with the second lens and configured to capture, from a second perspective different from the first perspective, a second set of image data in the plurality of polarization orientations.

[0005] Another aspect of the present disclosure provides a system. The system includes a first polarization camera configured to capture a first set of image data from a first perspective in a plurality of polarization orientations. The system also includes a second polarization camera configured to capture a second set of image data from a second perspective different from the first perspective in the plurality of polarization orientations.

[0006] Another aspect of the present disclosure provides a method. The method includes obtaining a first set of image data in a plurality of polarization orientations through a first polarization camera in a first perspective. The method also includes obtaining a second set of image data in the plurality of polarization orientations through a second polarization camera in a second perspective different from the first perspective. The method further includes determining, through a processor, multi-modal data based on the first set of image data and the second set of image data. The multi-modal data include data for a plurality of polarization color images, one or more polarization parameters, and depth information.

[0007] Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] The following drawings are provided for illustrative purposes according to various disclosed embodiments and are not intended to limit the scope of the present disclosure. In the drawings:

[0009] FIGS. 1A and 1B schematically illustrate a polarization capture device, according to an embodiment of the present disclosure;

[0010] FIG. 2 schematically illustrates a structure of a polarized image sensor, according to an embodiment of the present disclosure;

[0011] FIG. 3 schematically illustrates an example pattern of a polarizer array, according to an embodiment of the present disclosure;

[0012] FIG. 4 schematically illustrates an image processing unit, according to an embodiment of the present disclosure;

[0013] FIGS. 5A and 5B illustrate steps of a method of constructing a plurality of polarization color images, according to an embodiment of the present disclosure;

[0014] FIGS. 6A, 6B, and 6C illustrate an RGB (red, green, and blue) image of an eye, a degree of polarization image of the eye, and an angle of polarization image of the eye, respectively;

[0015] FIGS. 7A, 7B, and 7C illustrate an RGB (red, green, and blue) image of a house with windows, a degree of polarization image of the house with windows, and an angle of polarization image of the house with windows, respectively;

[0016] FIG. 8 schematically illustrates a method of calculating a depth of an object, according to an embodiment of the present disclosure;

[0017] FIG. 9 schematically illustrates a polarization capture system, according to an embodiment of the present disclosure;

[0018] FIG. 10 is a flowchart illustrating a method for obtaining multi-modal data, according to an embodiment of the present disclosure;

[0019] FIG. 11A illustrates a schematic diagram of a near-eye display (“NED”) including a polarization capture device, according to an embodiment of the present disclosure;

[0020] FIG. 11B illustrates a schematic diagram of a cross section view of a half of the NED shown in FIG. 11A, according to an embodiment of the present disclosure; and

[0021] FIG. 12 is a flowchart illustrating a method according to an embodiment of the present disclosure.

DETAILED DESCRIPTION

[0022] Embodiments consistent with the present disclosure will be described with reference to the accompanying drawings, which are merely examples for illustrative purposes and are not intended to limit the scope of the present disclosure. Wherever possible, the same reference numbers are used throughout the drawings to refer to the same or similar parts, and a detailed description thereof may be omitted.

[0023] Further, in the present disclosure, the disclosed embodiments and the features of the disclosed embodiments may be combined. The described embodiments are some but not all of the embodiments of the present disclosure. Based on the disclosed embodiments, persons of ordinary skill in the art may derive other embodiments consistent with the present disclosure. For example, modifications, adaptations, substitutions, additions, or other variations may be made based on the disclosed embodiments. Such variations of the disclosed embodiments are still within the scope of the present disclosure. Accordingly, the present disclosure is not limited to the disclosed embodiments. Instead, the scope of the present disclosure is defined by the appended claims.

[0024] As used herein, the terms “couple,” “coupled,” “coupling,” or the like may encompass an optical coupling, a mechanical coupling, an electrical coupling, an electromagnetic coupling, or any combination thereof. An “optical coupling” between two optical elements refers to a configuration in which the two optical elements are arranged in an optical series, and a light beam output from one optical element may be directly or indirectly received by the other optical element. An optical series refers to optical positioning of a plurality of optical elements in a light beam path, such that a light beam output from one optical element may be transmitted, reflected, diffracted, converted, modified, or otherwise processed or manipulated by one or more of other optical elements. In some embodiments, the sequence in which the plurality of optical elements are arranged may or may not affect an overall output of the plurality of optical elements. A coupling may be a direct coupling or an indirect coupling (e.g., coupling through an intermediate element).

[0025] The phrase “at least one of A or B” may encompass all combinations of A and B, such as A only, B only, or A and B. Likewise, the phrase “at least one of A, B, or C” may encompass all combinations of A, B, and C, such as A only, B only, C only, A and B, A and C, B and C, or A and B and C. The phrase “A and/or B” may be interpreted in a manner similar to that of the phrase “at least one of A or B.” For example, the phrase “A and/or B” may encompass all combinations of A and B, such as A only, B only, or A and B. Likewise, the phrase “A, B, and/or C” has a meaning similar to that of the phrase “at least one of A, B, or C.” For example, the phrase “A, B, and/or C” may encompass all combinations of A, B, and C, such as A only, B only, C only, A and B, A and C, B and C, or A and B and C.

[0026] When a first element is described as “attached,” “provided,” “formed,” “affixed,” “mounted,” “secured,” “connected,” “bonded,” “recorded,” or “disposed,” to, on, at, or at least partially in a second element, the first element may be “attached,” “provided,” “formed,” “affixed,” “mounted,” “secured,” “connected,” “bonded,” “recorded,” or “disposed,” to, on, at, or at least partially in the second element using any suitable mechanical or non-mechanical manner, such as depositing, coating, etching, bonding, gluing, screwing, press-fitting, snap-fitting, clamping, etc. In addition, the first element may be in direct contact with the second element, or there may be an intermediate element between the first element and the second element. The first element may be disposed at any suitable side of the second element, such as left, right, front, back, top, or bottom.

[0027] When the first element is shown or described as being disposed or arranged “on” the second element, the term “on” is merely used to indicate an example relative orientation between the first element and the second element. The description may be based on a reference coordinate system shown in a figure, or may be based on a current view or example configuration shown in a figure. For example, when a view shown in a figure is described, the first element may be described as being disposed “on” the second element. It is understood that the term “on” may not necessarily imply that the first element is over the second element in the vertical, gravitational direction. For example, when the assembly of the first element and the second element is turned 180 degrees, the first element may be “under” the second element (or the second element may be “on” the first element). Thus, it is understood that when a figure shows that the first element is “on” the second element, the configuration is merely an illustrative example. The first element may be disposed or arranged at any suitable orientation relative to the second element (e.g., over or above the second element, below or under the second element, to the left of the second element, to the right of the second element, behind the second element, in front of the second element, etc.).

[0028] When the first element is described as being disposed “on” the second element, the first element may be directly or indirectly disposed on the second element. The first element being directly disposed on the second element indicates that no additional element is disposed between the first element and the second element. The first element being indirectly disposed on the second element indicates that one or more additional elements are disposed between the first element and the second element.

[0029] The term “processor” used herein may encompass any suitable processor, such as a central processing unit (“CPU”), a graphics processing unit (“GPU”), an application-specific integrated circuit (“ASIC”), a programmable logic device (“PLD”), or any combination thereof. Other processors not listed above may also be used. A processor may be implemented as software, hardware, firmware, or any combination thereof.

[0030] The term “controller” may encompass any suitable electrical circuit, software, or processor configured to generate a control signal for controlling a device, a circuit, an optical element, etc. A “controller” may be implemented as software, hardware, firmware, or any combination thereof. For example, a controller may include a processor, or may be included as a part of a processor.

[0031] The term “non-transitory computer-readable medium” may encompass any suitable medium for storing, transferring, communicating, broadcasting, or transmitting data, signal, or information. For example, the non-transitory computer-readable medium may include a memory, a hard disk, a magnetic disk, an optical disk, a tape, etc. The memory may include a read-only memory (“ROM”), a random-access memory (“RAM”), a flash memory, etc.

[0032] The term “communicatively coupled” or “communicatively connected” indicates that related items are coupled or connected through an electrical and/or electromagnetic coupling or connection, such as a wired or wireless communication connection, channel, or network.

[0033] The wavelength ranges, spectra, or bands mentioned in the present disclosure are for illustrative purposes. The disclosed optical device, system, element, assembly, and method may be applied to a visible wavelength range, as well as other wavelength ranges, such as an ultraviolet (“UV”) wavelength range, an infrared (“IR”) wavelength range, or a combination thereof.

[0034] The present disclosure provides a polarization capture device or system configured to capture color information, polarization information, and depth information of an object under natural light. The polarization capture device or system may be a stereo polarization capture device or system. The stereo polarization capture system may be implemented in a large variety of devices, such as mobile phones, augmented reality (“AR”) devices, virtual reality (“VR”) devices, mixed reality (“MR”) devices, vehicles, drones, and detecting systems for various applications in atmospheric science, remote sensing, facial recognition, eye-tracking, machine vision, and the like. The color information, polarization information, and depth information extracted from images of the object may be used to realize other functions, obtain other physical properties of the object, and/or determine an operation state of the object.

[0035] FIGS. 1A and 1B schematically illustrate a polarization capture device 100, according to an embodiment of the present disclosure. The polarization capture device 100 may be a stereo polarization capture system, which may include two or more lenses (e.g., camera lenses) and two or more polarized image sensors. For example, the stereo polarization capture device 100 may include at least two (e.g., at least a pair of) camera lenses 110 and 111 (e.g., a first camera lens 110 and a second camera lens 111), and at least two (e.g., at least a pair of) polarized image sensors 120 and 121 (e.g., a first polarized image sensor 120 and a second polarized image sensor 121). In some embodiments, the first camera lens 110 and the first polarized image sensor 120 may be disposed apart, e.g., horizontally apart, from the second camera lens 111 and the second polarized image sensor 121. That is, the first camera lens 110 may be disposed side by side with the second camera lens 111, and the first polarized image sensor 120 may be disposed side by side with the second polarized image sensor 121. The first polarized image sensor 120 may be coupled with the first camera lens 110 in an optical series, and the second polarized image sensor 121 may be coupled with the second camera lens 111 in an optical series. The first camera lens 110 may be configured to guide light onto the first polarized image sensor 120, and the second camera lens 111 may be configured to guide light onto the second polarized image sensor 121. Together, the two lens-sensor pairs may be configured to capture two sets of image data (e.g., a first set of image data and a second set of image data) representing two images of an object, or of a scene including an object, from two different perspectives (e.g., a first perspective and a second perspective different from, and non-parallel with, the first perspective). The polarization capture device 100 may also include a processor 130 coupled to the first and second polarized image sensors 120 and 121. The processor 130 may be configured to process the two sets of image data captured by the first and second polarized image sensors 120 and 121. The polarization capture device 100 may further include a memory 140 coupled to the processor 130. The memory 140 may be configured to store computer-executable code or instructions.

[0036] Each of the first camera lens 110 and the second camera lens 111 may include one or more lenses. In some embodiments, the first polarized image sensor 120 may be arranged at a focal plane of the first camera lens 110, and the second polarized image sensor 121 may be arranged at a focal plane of the second camera lens 111. The first and second camera lenses 110 and 111 may be fixedly attached or removably mounted to a housing of the polarization capture device 100.

[0037] In some embodiments, the first and second camera lenses 110 and 111 and the first and second polarized image sensors 120 and 121 may be integrated in a single camera. In some embodiments, the first camera lens 110 and the first polarized image sensor 120 may be arranged in a first camera, and the second camera lens 111 and the second polarized image sensor 121 may be arranged in a second camera.

[0038] As shown in FIGS. 1A and 1B, the object may be illuminated by a natural light source, e.g., the sun. The first and second polarized image sensors 120 and 121 may receive, from different perspectives, incoming lights reflected, scattered, diffracted, transmitted, and/or refracted by the object. The reflection, scattering, diffraction, transmission, and/or refraction of lights by the object may be collectively referred to as deflection for discussion convenience. The polarization capture device 100 may be configured to process the light from the object 185. In some embodiments, as shown in FIG. 1A, the optical axes of the first and second camera lenses 110 and 111 may be parallel to each other. In some embodiments, as shown in FIG. 1B, the optical axes of the first and second camera lenses 110 and 111 may be tilted with respect to each other, i.e., the optical axes may cross each other.

[0039] As shown in FIGS. 1A and 1B, the first and second polarized image sensors 120 and 121 may be disposed apart from each other (e.g., in a horizontal direction). A distance between the first and second polarized image sensors 120 and 121 may be referred to as a baseline distance. In some embodiments, the baseline distance can be fixed (or constant). In some embodiments, the baseline distance may be adjustable to satisfy requirements of different applications.
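
For orientation, the depth recovery illustrated later in FIG. 8 follows the standard stereo triangulation relation; the symbols below are conventional textbook notation, not the patent's:

$$ Z = \frac{f \, B}{d} $$

where $Z$ is the distance from the cameras to the object point, $f$ is the focal length of the camera lenses, $B$ is the baseline distance, and $d$ is the disparity, i.e., the offset of the object's image position between the first and second images. For a given depth, a longer baseline produces a larger disparity, which is one reason an adjustable baseline can help satisfy the requirements of different applications, and is consistent with the disparity-based depth determination recited in claims 11, 16, and 20.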

[0040] FIG. 2 schematically illustrates a structure of a polarized image sensor, which may be an embodiment of the first and second polarized image sensors 120 and 121, according to an embodiment of the present disclosure. As shown in FIG. 2, the polarized image sensor 120 (or 121) may include a microlens array 1201 arranged at a top layer of the polarized image sensor 120 (or 121), a pixel array 1202 arranged at a bottom layer, and a polarizer array 1203 disposed between the microlens array 1201 and the pixel array 1202. The microlens array 1201 may be located closer to the camera lens 110 (or 111) than the pixel array 1202 and the polarizer array 1203. The microlens array 1201 may include a plurality (or array) of microlenses 1211. The pixel array 1202 may include a plurality (or array) of photosensors 1212, such as photodiodes. In some embodiments, the polarized image sensor 120 (or 121) may further include a cover (e.g., a glass cover) arranged on the microlens array 1201 to protect the microlens array 1201 from dust and scratches. The microlens array 1201, the polarizer array 1203, and the pixel array 1202 may be optically coupled (e.g., arranged in a stacked configuration in an optical series). In some embodiments, an incoming light may sequentially propagate through the microlens array 1201, the polarizer array 1203, and the pixel array 1202.

[0041] The polarizer array 1203 may include a plurality (or array) of polarizers arranged in a repeating pattern. The plurality of polarizers may have different transmission axis (or polarization axis) orientations. The plurality of polarizers may be configured to filter an incoming light based on a polarization orientation of the incoming light and the transmission axis orientations of the polarizers. A polarization orientation of the incoming light refers to the orientation of the oscillations of the electric field of the incoming light, perpendicular to the propagating direction of the incoming light. Each of the plurality of polarizers may be configured to allow an incoming light of a predetermined polarization orientation to transmit through. The plurality of polarizers may include, but are not limited to, dichroic polarizers, crystalline polarizers, wire grid polarizers, or the like. In some embodiments, the polarizer array 1203 may include nano wire-grid polarizers coated with an anti-reflection material that suppresses flaring and ghosting. The incoming light may be filtered by the polarizer array 1203 based on the polarization orientation before being received by the pixel array 1202. As such, the polarized image sensors 120 and 121 may output two sets of image data, each set associated with the plurality of polarization orientations. The two sets of image data output from the polarized image sensors 120 and 121 may represent two images.
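
For context, the filtering just described is governed by Malus's law, which the patent does not state explicitly but which underlies the per-pixel measurements: a linearly polarized incoming light of intensity $I_{in}$ and polarization orientation $\theta$ passing through a polarizer with transmission axis orientation $\varphi$ is transmitted with intensity

$$ I = I_{in} \cos^2(\theta - \varphi) $$

so each of the four polarizer orientations in the array samples the same incoming light at a different projection, which is what later allows the polarization parameters to be recovered.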

[0042] FIG. 3 illustrates an example repeating pattern 300 of the polarizer array 1203, according to an embodiment of the present disclosure. The polarizer array 1203 may include any suitable number of the repeating patterns 300. The repeating pattern 300 may include four different polarizers associated with four different polarization axis orientations (e.g., 90°, 45°, 135° (or -45°), and 0°) corresponding to four pixels in the pixel array 1202, respectively. In the embodiment shown in FIG. 3, the repeating pattern 300 may include a linear horizontal polarizer (i.e., 0-degree polarizer) 1213-a, a linear vertical polarizer (i.e., 90-degree polarizer) 1213-b, a linear 45-degree polarizer 1213-c, and a linear 135-degree polarizer (or -45-degree polarizer) 1213-d arranged side by side in a 2×2 array. The linear horizontal polarizer 1213-a may be configured to allow an incoming light having a horizontal polarization orientation (i.e., a 0° polarization orientation) to transmit through. The linear vertical polarizer 1213-b may be configured to allow an incoming light having a vertical polarization orientation (i.e., a 90° polarization orientation) to transmit through. The linear 45-degree polarizer 1213-c may be configured to allow an incoming light having a 45° polarization orientation to transmit through. The linear 135-degree polarizer 1213-d may be configured to allow an incoming light having a 135° (i.e., -45°) polarization orientation to transmit through. That is, the repeating pattern 300 in the 2×2 array format may correspond to a four-pixel area (e.g., a 2×2 pixel array) in the pixel array 1202.

[0043] FIG. 3 merely shows one example layout of the repeating pattern 300 in the 2×2 array format, in which the linear vertical polarizer 1213-b is arranged at the upper-left pixel, the linear horizontal polarizer 1213-a is arranged at the lower-right pixel, the 135° (or -45°) linear polarizer 1213-d is arranged at the upper-right pixel, and the 45° linear polarizer 1213-c is arranged at the lower-left pixel. In other embodiments, the polarizers 1213-a, 1213-b, 1213-c, and 1213-d may be arranged in any other suitable layout.
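
As a concrete illustration of the layout just described, the following Python sketch maps a pixel coordinate to its polarizer orientation. The indexing convention (row 0 at the top, the FIG. 3 layout tiling the whole sensor) is an assumption for illustration, not something the patent fixes:

```python
# Orientation of each pixel in the example 2x2 repeating pattern of FIG. 3:
# upper-left = vertical (90), upper-right = 135, lower-left = 45,
# lower-right = horizontal (0). Angles are in degrees.
PATTERN = {
    (0, 0): 90,   # linear vertical polarizer 1213-b
    (0, 1): 135,  # linear 135-degree polarizer 1213-d
    (1, 0): 45,   # linear 45-degree polarizer 1213-c
    (1, 1): 0,    # linear horizontal polarizer 1213-a
}

def polarizer_orientation(row: int, col: int) -> int:
    """Return the transmission-axis orientation (degrees) of the polarizer
    over the pixel at (row, col), assuming the FIG. 3 layout tiles the
    entire pixel array."""
    return PATTERN[(row % 2, col % 2)]
```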

[0044] In some embodiments, as shown in FIG. 2, the polarized image sensor 120 (or 121) may further include a color filter array 1204 configured to enable capture of a plurality of polarization color images. In some embodiments, the color filter array 1204 may be disposed between the microlens array 1201 and the polarizer array 1203. In some embodiments, the color filter array 1204 may be disposed between the polarizer array 1203 and the pixel array 1202. The color filter array 1204 may include a plurality of color filters arranged in a predetermined color filter pattern. The color filters may include, for example, a red color (“R”) filter, a green color (“G”) filter, and a blue color (“B”) filter. The predetermined color filter pattern may be a Bayer filter pattern (e.g., BGGR, RGBG, GRGB, or RGGB). For example, the Bayer filter pattern may have a 2×2 array format. The microlens array 1201, the pixel array 1202, the polarizer array 1203, and the color filter array 1204 may have the same number of 2×2 arrays.

[0045] FIG. 4 illustrates an image processing unit 400, according to an embodiment of the present disclosure. The image processing unit 400 may be included in the first and second polarized image sensors 120 and 121. The image processing unit 400 may include a layer of a polarizer array (e.g., the polarizer array 1203) and a layer of a color filter array (e.g., the color filter array 1204). FIG. 4 shows a schematic illustration of the two layers combined together. The layer of the polarizer array and the layer of the color filter array may be stacked as shown in FIG. 2 to form a pixel filter array. That is, the polarizer array 1203 and the color filter array 1204 may form a pixel filter array. The polarizer array 1203 may include a plurality of the repeating patterns 300 shown in FIG. 3. The color filter array may include a suitable number of Bayer filter patterns. As shown in FIG. 4, an RGGB Bayer filter may be combined with repeating 2×2 polarizer arrays to form a 16-pixel image processing unit 400. In FIG. 4, “R+H” represents a pixel filter formed by a stacked combination of a red color (“R”) filter and a horizontal polarizer, “R+V” an R filter and a vertical polarizer, “R+45” an R filter and a 45° polarizer, and “R+135” an R filter and a 135° polarizer; “G+H” represents a stacked combination of a green color (“G”) filter and a horizontal polarizer, “G+V” a G filter and a vertical polarizer, “G+45” a G filter and a 45° polarizer, and “G+135” a G filter and a 135° polarizer; and “B+H” represents a stacked combination of a blue color (“B”) filter and a horizontal polarizer, “B+V” a B filter and a vertical polarizer, “B+45” a B filter and a 45° polarizer, and “B+135” a B filter and a 135° polarizer.
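
The 16-pixel unit can be written out programmatically. The sketch below builds the 4×4 label grid under two assumptions flagged in the comments: an RGGB Bayer cell at the 2×2 super-pixel level, and the FIG. 3 polarizer layout within each super-pixel. The patent does not fix either choice, so treat this as one possible arrangement:

```python
# Hypothetical construction of the 16-pixel image processing unit:
# an RGGB Bayer cell where each color covers a 2x2 block of polarizer
# orientations (V, 135 / 45, H, per FIG. 3). Both layouts are assumptions
# for illustration; the patent allows other arrangements.
BAYER = [["R", "G"], ["G", "B"]]      # color per 2x2 super-pixel
POLAR = [["V", "135"], ["45", "H"]]   # orientation within a super-pixel

unit = [
    [f"{BAYER[r // 2][c // 2]}+{POLAR[r % 2][c % 2]}" for c in range(4)]
    for r in range(4)
]
for row in unit:
    print(row)
# ['R+V', 'R+135', 'G+V', 'G+135']
# ['R+45', 'R+H',   'G+45', 'G+H']
# ['G+V', 'G+135', 'B+V', 'B+135']
# ['G+45', 'G+H',   'B+45', 'B+H']
```

Counting labels in this grid reproduces the tally in the next paragraph: one R and one B sample per orientation, and two G samples per orientation.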

[0046] For the image processing unit 400, the image data may include an R value from the horizontal polarizer, an R value from the vertical polarizer, an R value from the 45° polarizer, an R value from the 135° polarizer, two G values from the horizontal polarizer, two G values from the vertical polarizer, two G values from the 45° polarizer, two G values from the 135° polarizer, a B value from the horizontal polarizer, a B value from the vertical polarizer, a B value from the 45° polarizer, and a B value from the 135° polarizer.

[0047] Referring back to FIGS. 1A and 1B, the processor 130 may be further configured to construct a plurality of polarization color images based on the image data from the plurality of pixel filters (i.e., combinations of the polarizer array and the color filter array). In some embodiments, the processor 130 may be configured to construct two sets of polarization color images from the two sets of image data captured by the polarized image sensors 120 and 121. That is, a first set of polarization color images may be constructed based on the first set of image data captured by a first image processing unit (an embodiment of the image processing unit 400) included in the first polarized image sensor 120, and a second set of polarization color images may be constructed based on the second set of image data captured by a second image processing unit (an embodiment of the image processing unit 400) included in the second polarized image sensor 121. In some embodiments, the processor 130 may be configured to construct one of the first set or the second set of polarization color images based on the image data captured by one of the first polarized image sensor 120 or the second polarized image sensor 121.

[0048] FIG. 5A illustrates a method of converting a set of original image data (e.g., raw image data) 500 to polarization image data through polarization interpolation, according to an embodiment of the present disclosure. The image data 500 may be obtained by either of the polarized image sensors 120 and 121. As shown in FIG. 5A, each pixel may capture a single color value (an R, G, or B value) through a single one of the polarizers associated with a predetermined polarization orientation, as shown in FIG. 3. The processor 130 may be configured to obtain polarization image data corresponding to each polarization orientation from the raw RGB image data via polarization interpolation. One polarization interpolation method is the nearest neighbor method, in which, for a pixel with missing image data, the image data of the nearest neighboring pixel is used as the missing image data. As such, polarization image data corresponding to each polarization orientation may be obtained. As shown in FIG. 5A, the raw RGB image data (in the form of an array) 500 may be processed with the polarization interpolation to obtain polarization image data for each polarization orientation, e.g., polarization image data 505 for the vertical polarization orientation (represented by a “V array”), polarization image data 506 for the 135° polarization orientation (represented by a “135 array”), polarization image data 507 for the 45° polarization orientation (represented by a “45 array”), and polarization image data 508 for the horizontal polarization orientation (represented by an “H array”).
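
A minimal sketch of this nearest-neighbor step in NumPy follows. It assumes the FIG. 3 tiling (vertical polarizer at the upper-left of each 2×2 cell) and uses one simple nearest-neighbor convention, replicating each measured sample into its own 2×2 cell; a full demosaicing pipeline would also handle the color channels:

```python
import numpy as np

# Offsets of each orientation within the assumed 2x2 cell (FIG. 3 layout).
OFFSETS = {"V": (0, 0), "135": (0, 1), "45": (1, 0), "H": (1, 1)}

def interpolate_orientations(raw: np.ndarray) -> dict:
    """Nearest-neighbor reconstruction: subsample the pixels under each
    polarizer orientation, then replicate every sample into its 2x2
    neighborhood so each output plane matches the raw mosaic's size."""
    planes = {}
    for name, (r0, c0) in OFFSETS.items():
        sub = raw[r0::2, c0::2]
        planes[name] = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)
    return planes

planes = interpolate_orientations(np.random.rand(8, 8))  # toy mosaic
```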

[0049] In addition to the nearest neighbor method, other interpolation algorithms may be used, such as bilinear interpolation, bicubic interpolation, bicubic spline interpolation, gradient-based interpolation, residual interpolation, or Newton’s polynomial interpolation. The interpolation method or algorithm may be selected according to the desired accuracy and/or the implementation and computational complexity. For example, a nearest neighbor algorithm may be used to reduce the computational burden on the processor 130 of the polarization capture device 100. In some embodiments, a residual interpolation algorithm or a Newton’s polynomial interpolation algorithm may be used to achieve a higher accuracy.
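
The interpolated orientation planes are what claims 7 through 10 consume when determining Stokes parameters, DOLP, and AOLP. The patent's own formulas are not reproduced in this excerpt, so the sketch below uses the standard polarimetry definitions; treat it as illustrative rather than as the patent's method:

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135, eps=1e-12):
    """Per-pixel linear Stokes parameters, DOLP, and AOLP from the four
    interpolated intensity planes (conventional polarimetry definitions)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical balance
    s2 = i45 - i135                      # 45-degree vs. 135-degree balance
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)  # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)      # angle of linear polarization, radians
    return s0, s1, s2, dolp, aolp
```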

……
……
……
