
Sony Patent | Observation device, observation method, and observation system

Publication Number: 20210271067

Publication Date: 2021-09-02

Applicant: Sony

Abstract

To obtain a more accurate image by improving the utilization efficiency of light energy while, with a simpler method, suppressing distortion that may occur in an inline hologram when a plurality of lights having different wavelengths are used, an observation device (1) according to the present disclosure includes: a light source part (11) in which a plurality of light emitting diodes (101) having different light emission wavelengths, with a length of each light emission point being smaller than 100λ (λ: light emission wavelength), are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ; and an image sensor (13) installed so as to be opposed to the light source part with respect to an observation target object.

Claims

  1. An observation device comprising: a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); and an image sensor installed so as to be opposed to the light source part with respect to an observation target object.

  2. The observation device according to claim 1, wherein a length of the separation distance is equal to or smaller than five times the length of the light emission point.

  3. The observation device according to claim 1, wherein a bandpass filter setting a transmission wavelength band to a peak wavelength of each of the plurality of light emitting diodes is installed between the observation target object and the light source part.

  4. The observation device according to claim 1, further comprising a calculation processing part for executing calculation processing for obtaining an image of the observation target object by using a photographed image for each light emission wavelength, the photographed image being generated by the image sensor, wherein the calculation processing part comprises: a preprocessing part for executing, for the photographed image for each light emission wavelength, preprocessing including at least shift correction of the image that depends on a positional relationship among the plurality of light emitting diodes; and a reconstruction processing part for reconstructing the image of the observation target object by using the preprocessed photographed image.

  5. The observation device according to claim 4, wherein the preprocessing part is configured to execute the shift correction so as to cancel a positional deviation between the photographed images due to positions at which the respective light emitting diodes are installed.

  6. The observation device according to claim 4, wherein the preprocessing part is configured to: select one light emitting diode serving as a reference from among the plurality of light emitting diodes; and shift spatial coordinates of the photographed images which are photographed by using the remaining light emitting diodes other than the light emitting diode serving as the reference in a direction of the photographed image which is photographed by using the light emitting diode serving as the reference among the plurality of light emitting diodes.

  7. The observation device according to claim 4, wherein the light source part includes the three light emitting diodes having different light emission wavelengths arranged in one row, and the preprocessing part is configured to shift spatial coordinates of the photographed images which are photographed by using the light emitting diodes positioned at both ends in a direction of the photographed image which is photographed by using the light emitting diode positioned at a center by a correction amount δ calculated by the following expression (1):

  δ = pZ / (L − Z)   expression (1)

where, in the expression (1), δ represents a correction amount, L represents a distance between the light source part and the image sensor, Z represents a distance between the observation target object and the image sensor, and p represents a distance between the light emitting diodes.
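As an illustrative sketch (not part of the claimed subject matter), expression (1) can be evaluated numerically and applied as a nearest-pixel shift; the numeric geometry values, function names, and the NumPy-based shift below are assumptions for illustration only:

```python
import numpy as np

def correction_amount(p, L, Z):
    """Correction amount delta = p * Z / (L - Z) from expression (1).

    p: distance between the light emitting diodes
    L: distance between the light source part and the image sensor
    Z: distance between the observation target object and the image sensor
    """
    return p * Z / (L - Z)

def shift_image(img, delta, pixel_size):
    """Shift a photographed image by delta (rounded to the nearest pixel)
    along the axis in which the LEDs are arranged."""
    shift_px = int(round(delta / pixel_size))
    return np.roll(img, shift_px, axis=1)

# Assumed example geometry: p = 60 um, L = 50 mm, Z = 1 mm.
delta = correction_amount(p=60e-6, L=50e-3, Z=1e-3)
print(delta)  # roughly 1.22e-06 m, i.e. about 1.22 um
```

Because L is typically much larger than Z, the correction amount comes out far smaller than the LED pitch p, which is consistent with the idea that a simple shift correction suffices.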

  8. The observation device according to claim 4, wherein the light source part includes the three light emitting diodes having different light emission wavelengths arranged in a triangle, and the preprocessing part is configured to shift spatial coordinates of the photographed images which are photographed by using any two of the light emitting diodes in a direction of the photographed image which is photographed by using the one remaining light emitting diode.

  9. The observation device according to claim 1, wherein the observation target object is a biomaterial.

  10. An observation method comprising: applying light to an observation target object for each light emission wavelength by a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); and photographing an image of the observation target object for each light emission wavelength by an image sensor installed so as to be opposed to the light source part with respect to the observation target object.

  11. An observation system comprising: a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); an image sensor installed so as to be opposed to the light source part with respect to an observation target object; and a calculation processing part for executing calculation processing of obtaining an image of the observation target object by using a photographed image for each light emission wavelength which is generated by the image sensor.

Description

TECHNICAL FIELD

[0001] The present disclosure relates to an observation device, an observation method, and an observation system.

BACKGROUND ART

[0002] Hitherto, there has been proposed, as a small and low-cost microscope, a lensless microscope (also called a “lensfree microscope”) that does not use an optical lens. Such a lensless microscope includes an image sensor and a coherent light source. In the lensless microscope, the coherent light source emits light, and a plurality of inline holograms, which are formed by light diffracted by an observation target object such as a biomaterial and the light directly emitted by the coherent light source, are photographed while changing a condition such as a distance or a wavelength. After that, an amplitude image and a phase image of the observation target object are reconstructed by light propagation calculation, and those images are provided to a user.
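The light propagation calculation mentioned here is commonly implemented with the angular spectrum method; the sketch below is a generic illustration under assumed sampling parameters (wavelength, pixel pitch, propagation distance), not the specific implementation of any cited work:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, z):
    """Propagate a complex optical field by a distance z using the
    angular spectrum method (evanescent components are dropped)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Propagating spatial frequencies satisfy fx^2 + fy^2 < 1 / wavelength^2.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)
```

Propagating a recorded hologram back by the object-to-sensor distance (a negative z) is the basic step used to reconstruct an image of the object; propagating forward and then backward by the same distance recovers the original field for band-limited inputs.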

[0003] In such a lensless microscope, hitherto, a combination of a light emitting diode (LED) and a space aperture (e.g., pinhole or single-core optical fiber) has been used as the coherent light source. For example, NPL 1 described below discloses a lensless microscope using a coherent light source that is a combination of a light emitting diode and a pinhole.

CITATION LIST

Non Patent Literature

[0004] [NPL 1] [0005] O. Mudanyali et al., Lab Chip, 2010, 10, pp. 1417-1428.

SUMMARY

Technical Problem

[0006] However, in the combination of an LED and a space aperture as disclosed in NPL 1 described above, a large proportion of light emitted by the LED cannot pass through the space aperture, leading to low energy utilization efficiency. As a result, the cost of a power source part or the like increases, and an original advantage of the lensless microscope cannot be obtained sufficiently.

[0007] Further, when an inline hologram is obtained by changing a separation distance between an image sensor and an observation target object, the lensless microscope as disclosed in NPL 1 described above performs control of changing a position of a stage at which the observation target object is placed, for example. However, when the accuracy of determining the position of the stage is low, a deviation in position of the stage causes an error, resulting in decrease in accuracy of the obtained image.

[0008] Further, when a plurality of lights having different wavelengths are used, a difference in angle of a ray becomes larger as a distance from a light emission point becomes larger, leading to a concern that distortion occurs in the recorded inline hologram and reconstruction of the image has an error. In order to prevent distortion due to the difference in angle of a ray, it is conceivable to adopt a solution such as introducing the plurality of lights through the same optical fiber or combining them by using a dichroic mirror. However, when such a solution is used, the entire microscope becomes larger and the cost increases, which contradicts the advantage that the lensless microscope is small and low in cost.

[0009] In view of the above-mentioned circumstances, the present disclosure proposes an observation device, an observation method, and an observation system capable of obtaining a more accurate image by improving the utilization efficiency of light energy while, with a simpler method, suppressing distortion that may occur in an inline hologram when a plurality of lights having different wavelengths are used.

Solution to Problem

[0010] According to the present disclosure, there is provided an observation device including: a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); and an image sensor installed so as to be opposed to the light source part with respect to an observation target object.

[0011] Further, according to the present disclosure, there is provided an observation method including: applying light to an observation target object for each light emission wavelength by a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); and photographing an image of the observation target object for each light emission wavelength by an image sensor installed so as to be opposed to the light source part with respect to the observation target object.

[0012] Further, according to the present disclosure, there is provided an observation system including: a light source part in which a plurality of light emitting diodes having different light emission wavelengths with a length of each light emission point being smaller than 100λ (λ: light emission wavelength) are arranged such that a separation distance between the adjacent light emitting diodes is equal to or smaller than 100λ (λ: light emission wavelength); an image sensor installed so as to be opposed to the light source part with respect to an observation target object; and a calculation processing part for executing calculation processing of obtaining an image of the observation target object by using a photographed image for each light emission wavelength, which is generated by the image sensor.

[0013] According to the present disclosure, a light source part including a plurality of light emitting diodes installed so as to satisfy a predetermined condition applies light to an observation target object, and an inline hologram that is caused by the applied light is photographed by an image sensor installed so as to be opposed to the light source part with respect to the observation target object.

Advantageous Effects of Invention

[0014] As described above, according to the present disclosure, it is possible to obtain a more accurate image by improving the utilization efficiency of light energy while, with a simpler method, suppressing distortion that may occur in an inline hologram when a plurality of lights having different wavelengths are used.

[0015] The above-mentioned effect is not necessarily limitative, and in addition to or instead of the above-mentioned effect, any effect shown in this specification or other effects that may be grasped from this specification may be exhibited.

BRIEF DESCRIPTION OF DRAWINGS

[0016] FIG. 1A is an explanatory diagram schematically illustrating an example of a configuration of an observation device according to an embodiment of the present disclosure.

[0017] FIG. 1B is an explanatory diagram schematically illustrating another example of the configuration of the observation device according to the embodiment.

[0018] FIG. 2A is an explanatory diagram schematically illustrating an example of a configuration of a light source part included in the observation device according to the embodiment.

[0019] FIG. 2B is an explanatory diagram schematically illustrating another example of the configuration of the light source part included in the observation device according to the embodiment.

[0020] FIG. 3 is a block diagram illustrating an example of a configuration of a calculation processing part included in the observation device according to the embodiment.

[0021] FIG. 4 is a block diagram illustrating an example of a configuration of an image calculation part included in the calculation processing part according to the embodiment.

[0022] FIG. 5 is a block diagram illustrating an example of a configuration of a preprocessing part included in the image calculation part according to the embodiment.

[0023] FIG. 6 is an explanatory diagram for describing reconstruction processing to be executed by a reconstruction processing part included in the image calculation part according to the embodiment.

[0024] FIG. 7 is a flow chart illustrating an example of a flow of the reconstruction processing according to the embodiment.

[0025] FIG. 8 is an explanatory diagram for describing the reconstruction processing to be executed by a reconstruction processing part included in the image calculation part according to the embodiment.

[0026] FIG. 9 is an explanatory diagram illustrating an example of a reconstructed image obtained by the observation device according to the embodiment.

[0027] FIG. 10 is a block diagram illustrating an example of a hardware configuration of the calculation processing part according to the embodiment.

[0028] FIG. 11 is a flow chart illustrating an example of a flow of an observation method according to the embodiment.

[0029] FIG. 12 is an explanatory diagram for describing an embodiment example.

[0030] FIG. 13 is an explanatory diagram for describing the embodiment example.

[0031] FIG. 14 is an explanatory diagram for describing the embodiment example.

DESCRIPTION OF EMBODIMENTS

[0032] In the following, description is given in detail of a preferred embodiment of the present disclosure with reference to the attached drawings. In this specification and the drawings, components having substantially the same functional configuration are assigned with the same reference numeral, and redundant description thereof is omitted.

[0033] Description is given in the following order.

  1. Embodiment

1.1 Observation Device

1.1.1 Overall Configuration of Observation Device and Hologram Acquisition Part

1.1.2 Calculation Processing Part

1.2 Observation Method

  2. Embodiment Example

Embodiment

[0034] In the following, description is given in detail of an observation device according to an embodiment of the present disclosure with reference to FIG. 1A to FIG. 10.

[Overall Configuration of Observation Device and Hologram Acquisition Part]

[0035] First, description is given in detail of an overall configuration of an observation device according to this embodiment and a hologram acquisition part included in the observation device according to this embodiment with reference to FIG. 1A to FIG. 2B.

[0036] FIG. 1A is an explanatory diagram schematically illustrating an example of a configuration of the observation device according to this embodiment, and FIG. 1B is an explanatory diagram schematically illustrating another example of the configuration of the observation device according to this embodiment. FIG. 2A is an explanatory diagram schematically illustrating an example of a configuration of a light source part included in the observation device according to this embodiment, and FIG. 2B is an explanatory diagram schematically illustrating another example of the configuration of the light source part included in the observation device according to this embodiment.

[0037] Overall Configuration of Observation Device

[0038] An observation device 1 according to this embodiment is a device to be used for observing a predetermined observation target object, and is a device for reconstructing an image of the observation target object by using a hologram (more specifically, inline hologram) image that occurs due to interference between light that has passed through the observation target object and light diffracted by the observation target object.

[0039] Regarding the observation target object focused on by the observation device 1 according to this embodiment, any object can be set as the observation target object as long as the object transmits light used for observation to some extent and enables interference between light that has passed through the observation target object and light diffracted by the observation target object. Such an observation target object may include, for example, a phase object for which light having a predetermined wavelength used for observation can be considered to be transparent to some extent, and such a phase object may include, for example, various kinds of biomaterials such as a cell of a living thing, biological tissue, a sperm cell, an egg cell, a fertilized egg, or a microbe.

[0040] In the following, description is given based on an exemplary case in which a biomaterial such as a cell, which is an example of the observation target object, exists in a predetermined sample holder.

[0041] As illustrated in FIG. 1A and FIG. 1B, the observation device 1 according to this embodiment for observing the above-mentioned observation target object includes a hologram acquisition part 10 for observing the observation target object and acquiring a hologram image of the observation target object, and a calculation processing part 20 for executing a series of calculation processing of reconstructing an image of the focused observation target object based on the obtained hologram image.

[0042] The hologram acquisition part 10 according to this embodiment acquires a hologram image of an observation target object C existing in a sample holder H placed at a predetermined position of an observation stage St under control by the calculation processing part 20 described later. The hologram image of the observation target object C acquired by the hologram acquisition part 10 is output to the calculation processing part 20 described later. A detailed configuration of the hologram acquisition part 10 having such a function is described later again.

[0043] The calculation processing part 20 integrally controls the processing of acquiring a hologram image by the hologram acquisition part 10. Further, the calculation processing part 20 executes a series of processing of reconstructing an image of the focused observation target object C by using the hologram image acquired by the hologram acquisition part 10. The image acquired by such a series of processing is presented to the user of the observation device 1 as an image that has photographed the focused observation target object C. A detailed configuration of the calculation processing part 20 having such a function is described later again.

[0044] In the above, the overall configuration of the observation device 1 according to this embodiment has been briefly described.

[0045] The observation device 1 according to this embodiment can also be realized as an observation system constructed by a hologram acquisition unit including the hologram acquisition part 10 having a configuration as described later in detail and a calculation processing unit including the calculation processing part 20 having a configuration as described later in detail.

[0046] Hologram Acquisition Part

[0047] Next, description is given in detail of the hologram acquisition part 10 in the observation device 1 according to this embodiment with reference to FIG. 1A to FIG. 2B. In the following, for the sake of convenience, a positional relationship among members constructing the hologram acquisition part 10 is described by using a coordinate system illustrated in FIG. 1A to FIG. 2B.

[0048] As illustrated in FIG. 1A, the hologram acquisition part 10 according to this embodiment includes a light source part 11 for applying illumination light to be used for acquiring a hologram image of the observation target object C, and an image sensor 13 for photographing a generated hologram image of the observation target object C. The operations of the light source part 11 and the image sensor 13 are controlled by the calculation processing part 20. Further, the calculation processing part 20 may control a z-direction position of the observation stage St provided in the hologram acquisition part 10.

[0049] Illumination light from the light source part 11 is applied to the observation target object C supported in the sample holder H placed on the observation stage St. As schematically illustrated in FIG. 1A, the sample holder H includes a support surface S1 for supporting the observation target object C. The sample holder H is not particularly limited, and for example, is a prepared slide including a glass slide and a glass cover, which has a light transmission property.

[0050] Further, the observation stage St has a region having a light transmission property of transmitting illumination light of the light source part 11, and the sample holder H is placed on such a region. The region having a light transmission property provided in the observation stage St may be formed by a glass or the like, for example, or may be formed by an opening that passes through the upper surface and bottom surface of the observation stage St along the z-axis direction.

[0051] When the illumination light is applied to the observation target object C, such illumination light is divided into transmitted light H1 passing through the observation target object C and diffracted light H2 diffracted by the observation target object C. Such transmitted light H1 and diffracted light H2 interfere with each other, so that a hologram (inline hologram) image of the observation target object C is generated on a sensor surface S2 of the image sensor 13 installed so as to be opposed to the light source part 11 with respect to the observation target object C. In this description, in the observation device 1 according to this embodiment, Z represents the length of a separation distance between the support surface S1 and the sensor surface S2, and L represents the length of a separation distance between the light source part 11 (more specifically, emission port of illumination light) and the image sensor 13 (sensor surface S2). In this embodiment, the transmitted light H1 functions as reference light for generating a hologram of the observation target object C. The hologram image (hereinafter also referred to as “hologram”) of the observation target object C generated in this manner is output to the calculation processing part 20.

[0052] As illustrated in FIG. 1B, the hologram acquisition part 10 according to this embodiment is preferred to additionally include a bandpass filter 15 on an optical path between the light source part 11 and the observation target object C. Such a bandpass filter 15 is designed to include the wavelength of illumination light applied by the light source part 11 in a transmission wavelength band. It is possible to obtain a hologram with a higher contrast and quality by additionally providing such a bandpass filter 15 and enabling improvement in spatial coherence and temporal coherence of illumination light applied by the light source part 11.

[0053] In this manner, the hologram acquisition part 10 according to this embodiment does not use a space aperture unlike a conventional lensless microscope, and thus can use energy of illumination light applied by the light source part 11 more efficiently.

[0054] In the observation device 1 according to this embodiment, the light source part 11 applies a plurality of illumination lights having different wavelengths. Such a light source part 11 includes a plurality of light emitting diodes (LEDs) having different light emission wavelengths and enabling application of partially coherent light in order to apply illumination lights having different wavelengths. Thus, the above-mentioned bandpass filter 15 functions as a multi-bandpass filter designed to have one or a plurality of transmission wavelength bands so as to handle the light emission wavelength of each LED.

[0055] The light emission wavelength of each LED constructing the light source part 11 is not particularly limited as long as the LEDs have different light emission wavelengths, and it is possible to use light having any light emission peak wavelength belonging to any wavelength band. The light emission wavelength (light emission peak wavelength) of each LED may belong to an ultraviolet light band, a visible light band, or a near-infrared band, for example. Further, each LED constructing the light source part 11 to be used may be any publicly known LED as long as the LED satisfies a condition on two types of lengths described later in detail.

[0056] In the light source part 11 according to this embodiment, the number of LEDs is not particularly limited as long as the number is equal to or larger than two. The size of the light source part 11 becomes larger as the number of LEDs becomes larger, and thus the light source part 11 is preferred to include three LEDs having different light emission wavelengths in consideration of reduction in size of the observation device 1. In the following, description is given based on an exemplary case in which the light source part 11 includes three LEDs having different light emission wavelengths.

[0057] In the light source part 11 according to this embodiment, the length of a light emission point of each LED constructing the light source part 11 is smaller than 100λ (λ: light emission wavelength). Further, each LED constructing the light source part 11 is arranged such that a separation distance between adjacent LEDs is equal to or smaller than 100λ. At this time, as the light emission wavelength λ serving as a reference for the length of a light emission point and the separation distance between LEDs, the shortest peak wavelength among the peak wavelengths of the light emitted by the LEDs included in the light source part 11 is used.

[0058] The LEDs are adjacent to one another such that the length of each light emission point is smaller than 100λ and the separation distance between adjacent LEDs is equal to or smaller than 100λ, which enables the observation device 1 according to this embodiment to obtain a more accurate image by enabling cancellation of distortion between holograms due to deviation in light emission point of the LED through use of simple shift correction described later in detail. A group of LEDs satisfying the above-mentioned two conditions is hereinafter also referred to as “micro LED”. When the length of each light emission point is equal to or larger than 100λ, or the separation distance between adjacent LEDs is larger than 100λ, deviation in light emission point between LEDs becomes significant, and even when shift correction as described later in detail is performed, distortion between holograms cannot be cancelled. The length of each light emission point is preferably smaller than 80λ, and more preferably smaller than 40λ. Further, the separation distance between adjacent LEDs is preferably equal to or smaller than 80λ, and more preferably equal to or smaller than 60λ. Smaller values of the length of each light emission point and the separation distance between adjacent LEDs are desirable, and a lower limit value is not particularly limited.
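As a sketch, the two geometric conditions above can be checked as follows, taking λ as the shortest peak wavelength among the LEDs as described in paragraph [0057]; the function name and the sample dimensions are assumptions for illustration:

```python
def is_micro_led_arrangement(emission_point_len, separation, peak_wavelengths):
    """Check the two conditions: emission point length < 100*lam and
    LED separation <= 100*lam, where lam is the shortest peak wavelength."""
    lam = min(peak_wavelengths)
    return emission_point_len < 100 * lam and separation <= 100 * lam

# With the wavelengths mentioned later (460 nm, 520 nm, 630 nm), the
# reference lam is 460 nm, so the limits are d < 46 um and p <= 46 um.
print(is_micro_led_arrangement(20e-6, 30e-6, [460e-9, 520e-9, 630e-9]))  # True
print(is_micro_led_arrangement(60e-6, 30e-6, [460e-9, 520e-9, 630e-9]))  # False
```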

[0059] Further, the length of the above-mentioned separation distance is more preferably equal to or smaller than five times the length of the above-mentioned light emission point. The length of the light emission point and the length of the separation distance have the above-mentioned relationship, which enables distortion between holograms to be cancelled more reliably and an image with a further higher quality to be obtained. The length of the above-mentioned separation distance is more preferably equal to or smaller than one and a half times the length of the above-mentioned light emission point.

[0060] For example, as illustrated in FIG. 2A, three LEDs 101A, 101B, and 101C (in the following, a plurality of LEDs may be collectively referred to as “light emitting diode 101” or “LED 101”) having three different light emission wavelengths may be arranged in one row in the light source part 11 according to this embodiment. In the example illustrated in FIG. 2A, the three LEDs 101A, 101B, and 101C are arranged in one row along an x-axis direction. Further, in FIG. 2A, the length indicated by d corresponds to the length of the light emission point of the LED 101, and the length of a distance between centers indicated by p corresponds to the length of the separation distance (in other words, pitch between adjacent LEDs 101) between the adjacent LEDs 101.

[0061] Further, for example, as illustrated in FIG. 2B, the three LEDs 101A, 101B, and 101C having different light emission wavelengths may be arranged in a triangle in the light source part 11 according to this embodiment. In the example illustrated in FIG. 2B, a mode in a case where the three LEDs 101A, 101B, and 101C of the light source part 11 are viewed from above along the z-axis is schematically illustrated, and the three LEDs 101A, 101B, and 101C are arranged such that a contour of a set of the three LEDs 101A, 101B, and 101C forms a triangle on an xy-plane. Also in the example illustrated in FIG. 2B, the length indicated by d corresponds to the length of the light emission point of the LED 101, and the length of the distance between centers indicated by p corresponds to the length of the separation distance between the adjacent LEDs 101.

[0062] In the light source part 11 illustrated in FIG. 2A and FIG. 2B, the light emission peak wavelength of each LED can be selected from a combination of 460 nm, 520 nm, and 630 nm, for example. Such a combination of light emission peak wavelengths is only one example, and any combination of light emission peak wavelengths can be adopted.

[0063] Under control by the calculation processing part 20, the light source part 11 having the above-mentioned configuration sequentially turns on each LED 101 and causes a hologram to be generated at each light emission wavelength.
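The sequential-lighting control in paragraph [0063] can be sketched as a simple acquisition loop. Here `set_led(i, on)` and `capture()` are hypothetical callbacks standing in for control of the light source part 11 and the image sensor 13; they are not interfaces described in the patent:

```python
def acquire_holograms(set_led, capture, wavelengths):
    """Sequentially light each LED and record one hologram per wavelength.

    set_led and capture are hypothetical hardware callbacks (assumptions,
    not from the patent): set_led(i, on) switches LED i, capture() returns
    one sensor frame recorded in synchronization with the lighting state.
    """
    frames = {}
    for i, wavelength in enumerate(wavelengths):
        set_led(i, True)                 # turn on only the i-th LED
        frames[wavelength] = capture()   # frame recorded while LED i is lit
        set_led(i, False)                # turn it off before the next LED
    return frames
```

This yields one hologram image per light emission wavelength, matching the statement in paragraph [0065] that the number of hologram image data pieces equals the number of wavelengths.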

[0064] Referring back to FIG. 1A and FIG. 1B, description is given of the image sensor 13 in the hologram acquisition part 10 according to this embodiment.

[0065] The image sensor 13 according to this embodiment records a hologram (inline hologram) of the observation target object C, which is formed on the sensor surface S2 illustrated in FIG. 1A and FIG. 1B, in synchronization with the lighting state of each LED under control by the calculation processing part 20. As a result, the image sensor 13 generates the same number of pieces of image data (namely, hologram image data) on the hologram as the number of light emission wavelengths of the LEDs in the light source part 11. Such an image sensor 13 is not particularly limited as long as it has sensitivity to the wavelength band of the illumination light emitted by the various kinds of LEDs used as the light source part 11, and various kinds of publicly known image sensors can be used as the image sensor 13. Such an image sensor may include, for example, a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. Those image sensors may be a monochrome sensor or a color sensor. Further, the pixel sizes of those image sensors may be selected appropriately depending on, for example, the length of the light emission point of the LED 101 used as the light source part 11, and are not particularly limited. For example, the pixel sizes are preferably about 100 μm.

[0066] The hologram acquisition part 10 according to this embodiment records only the light intensity distribution (square value of amplitude) of a hologram on the sensor surface S2, and does not record the distribution of phases. However, the calculation processing part 20 executes a series of image reconstruction processing as described later in detail to reproduce the distribution of phases of the hologram.

[0067] Further, the bandpass filter 15 according to this embodiment as illustrated in FIG. 1B is installed on the optical path between the light source part 11 and the observation target object C, and transmits only the illumination light applied by the light source part 11 toward the observation target object C. Such a bandpass filter 15 is provided to enable further improvement in spatial coherence and temporal coherence of illumination light, and achieve more efficient partially coherent illumination. Such a bandpass filter 15 is not particularly limited as long as the bandpass filter 15 is designed such that the transmission wavelength band corresponds to the light emission peak wavelength of the LED provided in the light source part 11, and various kinds of publicly known bandpass filters can be used appropriately.

[0068] As described above, the hologram acquisition part 10 according to this embodiment can acquire a more accurate hologram image of the observation target object with an extremely small number of parts by including an image sensor and an LED for which the length and pitch of the light emission point satisfy a specific condition, and further including a bandpass filter as necessary.

[0069] In the above, the configuration of the hologram acquisition part 10 in the observation device 1 according to this embodiment has been described in detail with reference to FIG. 1A to FIG. 2B.

[Calculation Processing Part]

[0070] Next, description is given in detail of the calculation processing part included in the observation device 1 according to this embodiment with reference to FIG. 3 to FIG. 10.

[0071] The calculation processing part 20 according to this embodiment integrally controls the activation state of the hologram acquisition part 10 included in the observation device 1 according to this embodiment. Further, the calculation processing part 20 uses a hologram image of the observation target object C acquired by the hologram acquisition part 10 to execute a series of processing of reconstructing an image of the observation target object C based on such a hologram image.

[0072] Overall Configuration of Calculation Processing Part

[0073] As schematically illustrated in FIG. 3, such a calculation processing part 20 includes a hologram acquisition control part 201, a data acquisition part 203, an image calculation part 205, an output control part 207, a display control part 209, and a storage part 211.

[0074] The hologram acquisition control part 201 is realized by, for example, a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), an input device, and a communication device. The hologram acquisition control part 201 integrally controls the activation state of the hologram acquisition part 10 based on observation condition information on various kinds of observation conditions of the hologram acquisition part 10 input through a user operation. Specifically, the hologram acquisition control part 201 controls the plurality of LEDs 101 provided in the light source part 11 of the hologram acquisition part 10, and controls the lighting state of each LED 101. Further, the hologram acquisition control part 201 controls the activation state of the image sensor 13 to generate a hologram (inline hologram) image of the observation target object C for each light emission wavelength on the sensor surface S2 of the image sensor 13 while at the same time synchronizing the activation state with the lighting state of each LED 101.

[0075] Further, the hologram acquisition control part 201 can also control the position of the observation stage St provided in the hologram acquisition part 10 along the z-axis direction. The hologram acquisition control part 201 may output the observation condition information and various kinds of information on the activation state of the hologram acquisition part 10 to the data acquisition part 203 and the image calculation part 205, and cause the data acquisition part 203 and the image calculation part 205 to use those pieces of information for various kinds of processing.

[0076] The data acquisition part 203 is realized by, for example, a CPU, a ROM, a RAM, and a communication device. The data acquisition part 203 acquires, from the hologram acquisition part 10, image data on the hologram image of the observation target object C for each light emission wavelength, which has been acquired by the hologram acquisition part 10 under control by the hologram acquisition control part 201. When the data acquisition part 203 has acquired image data from the hologram acquisition part 10, the data acquisition part 203 outputs the acquired image data on the hologram image to the image calculation part 205 described later. Further, the data acquisition part 203 may record the acquired image data on the hologram image into the storage part 211 described later as history information in association with time information on, for example, a date and time at which such image data has been acquired.

[0077] The image calculation part 205 is realized by, for example, a CPU, a ROM, and a RAM. The image calculation part 205 uses the image data on the hologram image of the observation target object C for each light emission wavelength, which is output from the data acquisition part 203, to execute a series of image calculation processing of reconstructing an image of the observation target object C. A detailed configuration of such an image calculation part 205 and details of the image calculation processing executed by the image calculation part 205 are described later again.

[0078] The output control part 207 is realized by, for example, a CPU, a ROM, a RAM, an output device, and a communication device. The output control part 207 controls output of image data on the image of the observation target object C calculated by the image calculation part 205. For example, the output control part 207 may cause the output device such as a printer to output the image data on the observation target object C calculated by the image calculation part 205 for provision to the user as a paper medium, or may cause various kinds of recording media to output the image data. Further, the output control part 207 may cause various kinds of information processing devices such as an externally provided computer, server, and process computer to output the image data on the observation target object C calculated by the image calculation part 205 so as to share the image data. Further, the output control part 207 may cause a display device such as various kinds of displays included in the observation device 1 or a display device such as various kinds of displays provided outside of the observation device 1 to output the image data on the observation target object C calculated by the image calculation part 205 in cooperation with the display control part 209 described later.

[0079] The display control part 209 is realized by, for example, a CPU, a ROM, a RAM, an output device, and a communication device. The display control part 209 performs display control when the image of the observation target object C calculated by the image calculation part 205 or various kinds of information associated with the image are displayed on an output device such as a display included in the calculation processing part 20 or an output device provided outside of the calculation processing part 20, for example. In this manner, the user of the observation device 1 can grasp various kinds of information on the focused observation target object on the spot.

[0080] The storage part 211 is realized by, for example, a RAM or a storage device included in the calculation processing part 20. The storage part 211 stores, for example, various kinds of databases or software programs to be used when the hologram acquisition control part 201 or the image calculation part 205 executes various kinds of processing. Further, the storage part 211 appropriately records, for example, various kinds of settings information on, for example, the processing of controlling the hologram acquisition part 10 executed by the hologram acquisition control part 201 or various kinds of image processing executed by the image calculation part 205, or progresses of the processing or various kinds of parameters that are required to be stored when the calculation processing part 20 according to this embodiment executes some processing. The hologram acquisition control part 201, the data acquisition part 203, the image calculation part 205, the output control part 207, the display control part 209, or the like can freely execute processing of reading/writing data from/to the storage part 211.

[0081] In the above, the overall configuration of the calculation processing part 20 included in the observation device 1 according to this embodiment has been described with reference to FIG. 3.

[0082] Configuration of Image Calculation Part

[0083] The image calculation part 205 uses image data on the hologram image of the observation target object C for each light emission wavelength to execute a series of image calculation processing of reconstructing an image of the observation target object C. As schematically illustrated in FIG. 4, such an image calculation part 205 includes a propagation distance calculation part 221, a preprocessing part 223, and a reconstruction processing part 225 including a reconstruction calculation part 225A and an amplitude replacement part 225B. In the following description, for the sake of convenience, it is assumed that z=0 represents the position of the support surface S1 and z=Z represents the position of the sensor surface S2 as the z-axis coordinates illustrated in FIG. 1A and FIG. 1B. Further, it is assumed that the light source part 11 applies illumination lights having light emission peak wavelengths λ1, λ2, and λ3, and the image sensor 13 acquires hologram images g_λ1, g_λ2, and g_λ3 (more specifically, images relating to the amplitude strength of the hologram).

[0084] The propagation distance calculation part 221 is realized by, for example, a CPU, a ROM, and a RAM. The propagation distance calculation part 221 uses a digital focus technology (digital focusing) utilizing the Rayleigh-Sommerfeld diffraction integral to calculate a specific value of the separation distance Z (separation distance between the support surface S1 and the sensor surface S2) illustrated in FIG. 1A and FIG. 1B as a propagation distance Z. Digital focusing herein refers to a technique of determining the focus position of each hologram image g_λ1, g_λ2, g_λ3 by adjusting the propagation distance Z (separation distance Z illustrated in FIG. 1A and FIG. 1B) between the support surface S1 and the sensor surface S2.
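Digital focusing repeatedly propagates the recorded hologram over candidate distances and evaluates a focus metric at each one. As a minimal numerical sketch (not the patent's implementation), the underlying free-space propagation can be written with the angular-spectrum method, a standard numerical equivalent of the Rayleigh-Sommerfeld integral; the assumption here is a square sampling grid of pitch dx:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field by distance z (angular-spectrum method).

    field: 2-D complex array sampled on a grid of pitch dx
    wavelength, dx, z: all in the same length unit (e.g., meters)
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)        # spatial frequencies along x
    fy = np.fft.fftfreq(ny, d=dx)        # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Scanning z over a range and recording a focus metric of the propagated field at each step yields the focus curve discussed in paragraphs [0086] and [0087].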

[0085] In this case, the hologram acquisition control part 201 acquires, in advance, a focus image a(x, y, z) at each light emission wavelength while at the same time controlling the hologram acquisition part 10 to change the z-coordinate position of the observation stage St. Here, a(x, y, 0) corresponds to a hologram image g_λn generated on the sensor surface S2.

[0086] The propagation distance calculation part 221 first uses a plurality of focus images having different z-coordinate positions to calculate a luminance difference value f(z + Δz/2) between focus images, represented by the following expression (101). As can be understood from expression (101), a total sum of luminance differences at the respective points forming the image data is calculated over the entire image. Such a total sum can be used to obtain an output curve representing how the luminance value changes along the z-axis direction (optical-path direction).

[Math. 1]

$$f\!\left(z + \frac{\Delta z}{2}\right) = \sum_{x}\sum_{y}\left\{a(x,\, y,\, z + \Delta z) - a(x,\, y,\, z)\right\} \qquad \text{expression (101)}$$

[0087] Next, the propagation distance calculation part 221 calculates the derivative f′(z) of f(z + Δz/2) obtained from expression (101) with respect to the variable z. The z-position that gives the peak of the obtained derivative f′(z) is then the focus position of the focused hologram image g. Such a focus position is set as the specific value of the separation distance Z illustrated in FIG. 1A and FIG. 1B, namely, the propagation distance.
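The focus search in paragraphs [0086] and [0087] can be sketched as follows. One assumption is made beyond the text: the luminance difference in expression (101) is accumulated as an absolute value, which the expression does not state explicitly but which keeps positive and negative differences from cancelling:

```python
import numpy as np

def focus_index(stack):
    """Locate the focus slice in a stack of focus images a(x, y, z_k).

    stack: array of shape (nz, ny, nx), slices ordered along the z-axis.
    Implements expression (101) followed by the derivative-peak search
    (assumption: absolute luminance differences).
    """
    # f(z + dz/2): summed luminance difference between adjacent slices
    f = np.abs(stack[1:] - stack[:-1]).sum(axis=(1, 2))
    # f'(z): the z-index at its peak marks the focus position
    return int(np.argmax(np.gradient(f)))
```

In practice the stack would be either measured focus images (paragraph [0085]) or holograms numerically propagated to a series of candidate distances.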

[0088] The propagation distance calculation part 221 outputs information on the propagation distance Z obtained in this manner to the preprocessing part 223 and the reconstruction processing part 225 at a subsequent stage.

[0089] In the above, the case of the propagation distance calculation part 221 calculating the separation distance Z by using the digital focus technology utilizing Rayleigh-Sommerfeld diffraction integral has been described. However, the propagation distance calculation part 221 may calculate the propagation distance Z based on the mechanical accuracy (accuracy of positioning observation stage St) of the hologram acquisition part 10.

[0090] The preprocessing part 223 is realized by, for example, a CPU, a ROM, and a RAM. The preprocessing part 223 executes, for the photographed image (namely, the hologram image g_λn) for each light emission wavelength, preprocessing including at least shift correction of the image that depends on the positional relationship among the plurality of light emitting diodes. As illustrated in FIG. 5, this preprocessing part 223 includes a gradation correction part 231, an upsampling part 233, an image shift part 235, an image end processing part 237, and an initial complex amplitude generation part 239.

[0091] The gradation correction part 231 is realized by, for example, a CPU, a ROM, and a RAM. The gradation correction part 231 performs gradation correction (e.g., dark level correction and inverse gamma correction) of the image sensor 13, and executes processing of returning the image signals of the hologram images g_λ1, g_λ2, g_λ3 output from the data acquisition part 203 to a linear state. Specific details of the gradation correction processing to be executed are not particularly limited, and various kinds of publicly known processing can be used appropriately. The gradation correction part 231 outputs the hologram images g_λ1, g_λ2, g_λ3 after gradation correction to the upsampling part 233 at a subsequent stage.
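A minimal sketch of the gradation correction named in paragraph [0091] — dark-level subtraction followed by inverse gamma — might look as follows. The numeric defaults (a 10-bit signal range and gamma 2.2) are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def linearize(raw, dark_level=64.0, gamma=2.2, white_level=1023.0):
    """Return sensor data to a linear scale.

    Dark-level correction followed by inverse gamma correction; the
    default parameter values are illustrative assumptions.
    """
    x = (np.asarray(raw, dtype=float) - dark_level) / (white_level - dark_level)
    x = np.clip(x, 0.0, 1.0)  # keep the normalized signal in [0, 1]
    return x ** gamma         # undo a gamma-encoded transfer curve
```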

[0092] The upsampling part 233 is realized by, for example, a CPU, a ROM, and a RAM. The upsampling part 233 upsamples the image signals of the hologram images g_λ1, g_λ2, g_λ3 after gradation correction. The hologram acquisition part 10 according to this embodiment is constructed as a so-called lensless microscope, and thus the resolution may exceed the Nyquist frequency of the image sensor 13. Thus, in order to exhibit the maximum performance, the hologram images g_λ1, g_λ2, g_λ3 after gradation correction are subjected to upsampling processing. The upsampling processing to be executed is not particularly limited, and various kinds of publicly known upsampling processing can be used appropriately.
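Since the patent leaves the upsampling method open, one publicly known option, shown here purely as a sketch, is band-limited (sinc) interpolation by zero-padding the Fourier spectrum; the assumptions are even image dimensions and an integer factor:

```python
import numpy as np

def upsample_fft(img, factor=2):
    """Band-limited (sinc) upsampling by zero-padding the spectrum.

    Assumes even image dimensions and an integer factor >= 1; this is
    one publicly known choice, not the method specified by the patent.
    """
    ny, nx = img.shape
    spec = np.fft.fftshift(np.fft.fft2(img))
    py, px = ny * (factor - 1) // 2, nx * (factor - 1) // 2
    spec = np.pad(spec, ((py, py), (px, px)))  # add high frequencies as zeros
    return np.fft.ifft2(np.fft.ifftshift(spec)).real * factor**2
```

Fourier-domain upsampling fits naturally here because the subsequent reconstruction steps already operate on the spectrum of the hologram.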

[0093] The image shift part 235 is realized by, for example, a CPU, a ROM, and a RAM. The image shift part 235 executes, for the hologram image (more specifically, hologram image subjected to the above-mentioned gradation correction processing and upsampling processing) for each light emission wavelength, which has been acquired by the hologram acquisition part 10, shift correction of the image that depends on the positional relationship among the plurality of light emitting diodes.

[0094] More specifically, the image shift part 235 executes shift correction so as to cancel a deviation in position of the hologram image due to the position at which each LED 101 is provided. Such shift correction is performed by shifting spatial coordinates (x, y, z) defining the pixel position of the hologram image in a predetermined direction.
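The coordinate shift described in paragraph [0094] can be sketched with the Fourier shift theorem, which allows fractional (subpixel) shift amounts; this is one standard way to realize such a correction, not necessarily the patent's own:

```python
import numpy as np

def shift_image(img, dy, dx):
    """Cyclically shift an image by (dy, dx) pixels via the Fourier
    shift theorem. Fractional shift amounts are allowed, which is
    useful for cancelling per-LED hologram offsets at subpixel precision.
    """
    ny, nx = img.shape
    fy = np.fft.fftfreq(ny)[:, None]   # normalized frequencies along y
    fx = np.fft.fftfreq(nx)[None, :]   # normalized frequencies along x
    phase = np.exp(-2j * np.pi * (fy * dy + fx * dx))
    return np.fft.ifft2(np.fft.fft2(img) * phase).real
```

For integer shifts this reproduces a cyclic `np.roll`; its advantage over rolling is the subpixel case, where the per-LED positional deviation is generally not a whole number of pixels.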

……
……
