1. Introduction

As a beam of light passes through a sample, the accumulated phase delay carries much intrinsic information about the sample, such as its thickness, refractive index, and composition.1,2 However, current detection devices (CCD or CMOS) are incapable of acquiring the phase data, which has motivated the development of phase retrieval techniques. Technically, the lost phase can be reconstructed either with an interferometric system3 or by computational imaging.4,5 Because it requires no reference beam, the computational imaging approach has attracted great research interest and has been successfully applied to super-resolution,6–8 three-dimensional imaging,9,10 and quantitative phase imaging.11,12 At present, this class of techniques falls into two types, namely the transport of intensity equation (TIE)13–15 and the iterative phase retrieval (IPR) method.16,17 TIE is a direct numerical phase solver that does not require phase unwrapping and is therefore computationally efficient. However, TIE retrieves only phase-only or amplitude-only objects, which makes it unsuitable for imaging a complex-valued sample. In contrast, the IPR method can reconstruct complex-valued images of different samples.18–20 As the origin of this family, the Gerchberg–Saxton algorithm16 reconstructed the object's phase by computationally propagating back and forth between real and reciprocal space and imposing the constraints of a pair of amplitude distributions. The hybrid input–output algorithm17 replaced the requirement of a known amplitude distribution in real space with a loose support and introduced feedback to escape stagnation. However, both algorithms are sensitive to the initial guess and need a rough estimate of the object to obtain a better reconstruction. Alternatively, multi-image phase retrieval can achieve high-accuracy image reconstruction by means of measurement diversity. Without prior knowledge, the ptychographic iterative engine (PIE) algorithm21–23 retrieves a complex-valued object from a series of diffraction patterns obtained by overlapped pinhole scanning across the sample. Apart from this lateral scanning strategy, many other schemes have emerged to introduce degrees of freedom into the imaging system, including multidistance,24–27 multiwavelength,28,29 multibeam illumination,30 and spatial light modulation.31 Unlike the PIE method, multidistance phase retrieval (MDPR)25 records diffraction patterns at a series of axial positions and iteratively computes the complex amplitude of the object; its stability and robustness have been demonstrated.26,27 Multiwavelength phase retrieval (MWPR)28 has a similar performance but utilizes multiwavelength illumination. Because these two methods require no lateral shift along the x- and y-directions, they reduce the complexity of the experiment. However, the imaging quality of these methods is severely hampered by speckle noise from the coherent light source. To alleviate the speckle noise effect, partially coherent illumination32,33 has been adopted, which effectively reduces the speckle noise while still satisfying the coherence assumption according to the van Cittert–Zernike theorem. Zheng et al.6 mounted a programmable light-emitting diode (LED) array on a conventional wide-field microscope for multiangle illumination and retrieved the quantitative complex field distribution of the sample. Similarly, Tian et al.,34 Chen et al.,35 and Lee et al.36 utilized patterned LED illumination for bright-field, dark-field, and phase-contrast imaging.
Until now, partially coherent illumination has become a popular and feasible strategy to realize high-resolution imaging with low-cost hardware. In our previous work,37 weighted feedback was proposed to accelerate the convergence of MDPR under the coherent illumination of a fiber laser. However, the imaging contrast was heavily degraded by speckle noise, and the reconstruction of translucent samples was unsatisfactory in that case. In this work, we show that IPR with the weighted feedback acceleration modality can readily realize high-contrast and fast-converging reconstruction of different samples under partially coherent and speckle illumination, which is demonstrated by simulation and experiment in both lensless and lens-based systems. For the lensless system, a programmable LED array is used for partially coherent illumination, and the imaging quality of MDPR is demonstrated to be better than with fiber laser illumination. By embedding weighted feedback, the imaging contrast of a biological specimen is enhanced for both the MDPR and MWPR methods. To further exhibit its performance, we apply MDPR to noninvasive imaging through a scattering layer, in which the convergence speed is significantly improved. For the lens-based system, MDPR is utilized to image the phase of a translucent sample in a conventional microscope, where the weighted feedback also takes effect. The rest of this article is arranged as follows. The theory of IPR and its weighted feedback modality are described in Sec. 2. The corresponding simulation and experimental results are given in Secs. 3 and 4, respectively. Conclusions are presented in Sec. 5.

2. Theory

In MDPR, a set of diffraction patterns recorded downstream of the object plane repeatedly constrains the object estimate until the full complex field of the object is obtained. Here, the amplitude-phase retrieval algorithm25 is adopted to implement MDPR, and its schematic diagram is shown in Fig. 1. As shown in Fig. 1(a), diffraction patterns are measured downstream of the sample. The transverse recording distance of the $k$'th plane is composed of two components: an initial distance $d_0$ and an equally spaced interval $\Delta d$. In Fig. 1(b), the complex amplitude of the sample is initialized with a zero matrix. The algorithmic flowchart of MDPR is as follows: (1) the $n$'th estimation of the object's complex field is propagated forward to the recording planes, generating computed patterns at the different transverse distances $d_k$; (2) the amplitude of each computed pattern is replaced with the modulus of the corresponding recorded diffraction pattern while the computed phase is retained; (3) these synthesized patterns are propagated backward to the object plane, producing a set of object guesses $o_{n,k}$; (4) the next estimation of the object is obtained by averaging the guessed data $o_{n,k}$, where this averaging is executed separately for amplitude and phase; and (5) steps (1) to (4) are run iteratively until the reconstruction accuracy meets the required threshold. The weighted feedback operation is imposed between steps (3) and (4): each object guess $o_{n,k}$ is combined with feedback computed from the two most recent object estimations to yield the modulated object guess $o'_{n,k}$, where the feedback coefficients $a$ and $b$ are parameterized as 0.7 and 0.5 in Ref. 37, respectively. Within this definition, the next estimation is calculated by the average of the modulated object guesses $o'_{n,k}$. If the iteration number $n = 1$ or 2, $o'_{n,k} = o_{n,k}$, which means that the weighted feedback starts to act when $n \geq 3$. Hereafter, MDPR based on weighted feedback is termed the MDPRF algorithm.
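The MDPR loop with weighted feedback can be summarized in code. The following Python sketch is our illustration rather than the authors' implementation: the propagator follows the angular spectrum formula given in Sec. 3.1, and the exact feedback update, here taken as $o'_{n,k} = o_{n,k} + a(o_{n,k} - o^{(n-1)}) + b(o_{n,k} - o^{(n-2)})$ with respect to the two most recent object estimations, is an assumption inferred from the description above and Ref. 37.

```python
import numpy as np

def asm_propagate(field, wavelength, distance, dx):
    """Angular-spectrum propagation of a square complex field over `distance`."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(f, f)
    arg = 1.0 - (wavelength * fx) ** 2 - (wavelength * fy) ** 2
    h = np.exp(1j * 2 * np.pi * distance / wavelength
               * np.sqrt(np.maximum(arg, 0.0)))        # evanescent components clamped
    return np.fft.ifft2(np.fft.fft2(field) * h)

def mdprf(patterns, distances, wavelength, dx, iters=50, a=0.7, b=0.5):
    """MDPR with weighted feedback (MDPRF), as a sketch.

    patterns  : list of recorded moduli (square roots of the intensities)
    distances : list of transverse recording distances d_k
    """
    obj = np.zeros_like(patterns[0], dtype=complex)    # zero-matrix initialization
    prev = obj.copy()                                  # estimation from the previous iteration
    for n in range(1, iters + 1):
        guesses = []
        for modulus, d in zip(patterns, distances):
            u = asm_propagate(obj, wavelength, d, dx)          # (1) forward propagation
            u = modulus * np.exp(1j * np.angle(u))             # (2) amplitude replacement
            g = asm_propagate(u, wavelength, -d, dx)           # (3) backward propagation
            if n >= 3:                                         # assumed feedback form, active for n >= 3
                g = g + a * (g - obj) + b * (g - prev)
            guesses.append(g)
        amp = np.mean([np.abs(g) for g in guesses], axis=0)    # (4) separate averaging of
        pha = np.mean([np.angle(g) for g in guesses], axis=0)  #     amplitude and phase
        prev, obj = obj, amp * np.exp(1j * pha)
    return obj
```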
The method in Ref. 28 is utilized as the MWPR algorithm for testing, and its weighted version is named the MWPRF algorithm for short. Here, the MWPR algorithm uses multiple patterns recorded under different wavelengths for image reconstruction, as outlined in Fig. 2(a). The algorithmic details of the MWPRF algorithm are shown in Fig. 2(b) and proceed as follows: (1) initializing the complex field of the object with a zero matrix; (2) plugging in one of the wavelengths and propagating the $n$'th estimation of the object forward to the recording plane over the distance $d$; (3) replacing the computed amplitude in the recording plane with the modulus of the recorded diffraction pattern and retaining the computed phase; (4) propagating this synthesized complex amplitude backward to the object plane; (5) switching to the next wavelength until all wavelengths have been scanned, whereby the next estimation of the object is obtained; and (6) iteratively running steps (2) to (5) until the reconstruction accuracy meets the given requirement. The corresponding weighted feedback operation is embedded between steps (5) and (6) in the same form as for MDPRF, with the coefficients $a$ and $b$ again parameterized as 0.7 and 0.5, respectively. For MWPRF, the first ($n = 1$) and second ($n = 2$) estimations are similar to MWPR's ones; only when $n \geq 3$ does the weighted feedback operation start to take effect.

3. Simulation

3.1 Partially Coherent Illumination

In this section, numerical simulation is presented to prove the capability of the proposed idea. To quantitatively show the reconstruction accuracy, we utilize the normalized correlation coefficient (NCC) between the reconstructed image $o_r$ and the ground truth $o_g$ as the metric function, defined as

$$\mathrm{NCC} = \frac{\left|\mathrm{Cov}(o_r, o_g)\right|}{\sigma_{o_r}\,\sigma_{o_g}}, \qquad \mathrm{Cov}(o_r, o_g) = \frac{1}{N}\sum\left[o_r - \bar{o}_r\right]\left[o_g - \bar{o}_g\right],$$

where $\mathrm{Cov}(o_r, o_g)$ is the covariance of the reconstructed image and the ground truth, an indicator of how well the two images match each other, $\sigma$ denotes the standard deviation, and $N$ denotes the total number of pixels of the object image. The value of NCC ranges over [0, 1]; as NCC increases, the information of the two images becomes closer.

The kernel of IPR is the propagation computation. In this paper, the diffraction computation is considered in the Fresnel regime, and all propagations are computed with the angular spectrum transfer function

$$H_d(f_x, f_y) = \exp\!\left[i\,\frac{2\pi d}{\lambda}\sqrt{1 - (\lambda f_x)^2 - (\lambda f_y)^2}\right],$$

where $\lambda$ is the illumination wavelength and $(f_x, f_y)$ denotes the coordinates in the frequency domain. In this case, the diffraction pattern at any transverse distance $d$ can be obtained as $u_d = \mathcal{F}^{-1}\{\mathcal{F}\{u_0\}\,H_d\}$, where $\mathcal{F}$ and $\mathcal{F}^{-1}$ represent the Fourier transform and its inverse, respectively. To simulate the partial coherence of the diffraction recording, the object plane is randomly perturbed (vibrated) around its original position to generate 100 object planes. These 100 object planes are propagated over a transverse distance $d$ so that 100 diffraction patterns are produced, and the desired modulus of the partially coherent diffraction pattern at $d$ is calculated by averaging the moduli of the 100 computed patterns. Repeating this procedure, a set of multidistance or multiwavelength intensity images under partially coherent illumination can be generated. This partial coherence calculation is explicitly described in Ref. 38.
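As a reference for the metric and the pattern-averaging step described above, the following NumPy sketch is our own illustration (the absolute value in the NCC and the variable names are our choices, not taken from the original text):

```python
import numpy as np

def ncc(recon, truth):
    """Normalized correlation coefficient between two images, in [0, 1]."""
    a = np.asarray(recon, dtype=float).ravel()
    b = np.asarray(truth, dtype=float).ravel()
    cov = np.mean((a - a.mean()) * (b - b.mean()))   # covariance over all N pixels
    return abs(cov) / (a.std() * b.std())            # closer to 1 means a better match

# Partially coherent pattern: average the moduli of the patterns computed from
# the randomly perturbed object planes (Sec. 3.1, Ref. 38). `computed_fields`
# is assumed to hold the 100 propagated complex fields.
# pc_modulus = np.mean([np.abs(u) for u in computed_fields], axis=0)
```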
Here, the image "cameraman" is chosen as the ground truth image and the simulated parameters are listed as follows: (1) the imaging size is (); (2) ; (3) , ; and (4) the recording number is 3, 8, 11, and 13. One of the 13 recorded diffraction patterns is shown in Fig. 3(a), and the corresponding convergence curves are shown in Fig. 3(b). The improvement of convergence speed by MDPRF is easy to observe. With the recording number , increasing the recording number has no predominant impact on the convergence, so in this case the convergence depends mainly on the number of iterations. To visually compare MDPR and MDPRF, the reconstructed images using 11 diffraction patterns after 10, 30, and 50 iterations are shown in Figs. 3(c)–3(e) for MDPR and Figs. 3(f)–3(h) for MDPRF. The degradation caused by slow convergence appears at the edges of the man at low iteration numbers and wraps "the man" in a vague halo in the MDPR results, but this degradation is visibly ironed out by MDPRF. Moreover, the MDPRF algorithm reaches convergence within 50 iterations, which is superior to MDPR's result. With the same partial-coherence computation, the reconstructed images of the MWPR and MWPRF algorithms are shown in Figs. 4(a)–4(d), and the convergence curves are shown in Fig. 4(e). The simulated parameters are listed as follows: (1) the imaging size is (); (2) , , , and (3) . The convergence performance in Fig. 4 is clearly improved by weighted feedback. The NCC curves in Figs. 3(b) and 4(e) demonstrate that the problems of degradation and stagnation are effectively resolved by weighted feedback under partially coherent illumination.

3.2 Speckle Illumination

Measuring intensity patterns in a volume speckle field can lead to a unique and accurate image reconstruction of the object.39–41 To test the feasibility of our method under speckle illumination, we place the object in the speckle field and retrieve it from multiple intensity patterns. Here, the object in Fig. 5(a) is selected as the ground truth image, and the incident speckle pattern is shown in Fig. 5(b). The speckle pattern illuminating the object results from a phase mask located in front of the object. The simulated parameters are listed as follows: (1) the image size is (); (2) ; (3) the receiving plane is placed behind the object (, ); (4) the recording number is set as 5, 8, 11, and 13; and (5) the phase mask, with a phase range of 0 to , is located 30 mm upstream of the object. The convergence curves are shown in Fig. 5(c), and the reconstructed images with 11 intensity patterns are given in Figs. 5(d)–5(g) for MDPR and Figs. 5(h)–5(k) for MDPRF. Notably, even in the speckle field, weighted feedback still accelerates the convergence effectively. Similar to the case of partially coherent illumination, the benefit of increasing the recording number saturates. As shown in Figs. 5(d)–5(k), MDPRF retrieves the full object in merely 50 iterations, corresponding to roughly a twofold enhancement of the convergence speed.
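The speckle-illumination simulation described above can be sketched as follows. The parameter values (pixel pitch, distances, and the 0-to-2π mask range) are illustrative assumptions rather than the paper's exact settings, and a paraxial Fresnel transfer-function propagator stands in for the angular spectrum formula of Sec. 3.1.

```python
import numpy as np

rng = np.random.default_rng(0)

def fresnel_propagate(field, wl, z, dx):
    """Paraxial (Fresnel) transfer-function propagation over distance z."""
    f = np.fft.fftfreq(field.shape[0], d=dx)
    fx, fy = np.meshgrid(f, f)
    h = np.exp(-1j * np.pi * wl * z * (fx ** 2 + fy ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * h)

# Illustrative parameters (not the paper's exact values).
wl, dx, n = 532e-9, 4e-6, 512
obj = np.ones((n, n), dtype=complex)                 # stand-in for the ground-truth object

mask = np.exp(1j * 2 * np.pi * rng.random((n, n)))   # random phase mask (0 to 2*pi assumed)
illum = fresnel_propagate(mask, wl, 30e-3, dx)       # speckle field 30 mm downstream of the mask
exit_field = illum * obj                             # object placed in the volume speckle field

# Record intensity patterns at several distances behind the object.
distances = [20e-3 + k * 5e-3 for k in range(5)]     # hypothetical d0 and interval
patterns = [np.abs(fresnel_propagate(exit_field, wl, d, dx)) ** 2 for d in distances]
```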
4. Experiment

4.1 Lensless Multidistance Imaging

To prove the capability of our method, we set up experiments in both lensless and lens-based systems. For lensless imaging, a programmable LED matrix (Adafruit 607) is used to realize partially coherent illumination. Unlike the regional illumination schemes in Refs. 34–36, only one LED is switched on in the experiment to prevent aliasing of the diffraction patterns. A spherical wave from the LED is incident on a condenser lens to generate plane-wave illumination. This parallel light, shaped by an aperture, illuminates the sample, and the CCD camera (Point Gray) records a diffraction pattern downstream of the sample. For MDPR, the CCD camera is mounted on a precision linear stage (M-403, Physik Instrumente Inc.). As the stage moves, a set of diffraction patterns is recorded at different transverse diffraction distances (initial distance $d_0$ and equally spaced interval $\Delta d$). Choosing a proper transverse distance, MWPR can also be performed by changing the wavelength of the LED. The details of the experimental setup are shown in Fig. 6. This lensless implementation therefore serves the partially coherent illumination experiments for both MDPR and MWPR. To verify the performance of partially coherent illumination, we perform MDPR on a negative resolution chart (R2L2S1N, Thorlabs) illuminated by the LED and by a fiber laser. The corresponding results are shown in Fig. 7. The experimental parameters are listed as follows: (1) the imaging size is ; (2) , , ; and (3) the wavelength of the fiber laser is 532 nm and that of the LED is 623 nm. After 100 iterations for the fiber laser and 50 iterations for the LED, the reconstructed images are shown in Figs. 7(a)–7(c), which indicate that the partial coherence of the incident light clearly suppresses the effect of speckle noise. To quantify this improvement, the profiles along the blue dashed lines in Figs. 7(a)–7(c) are plotted in Figs. 7(d) and 7(e). Note that the vertical fringe pattern is clearly resolved with high imaging contrast when weighted feedback is used. Similarly, we image an "orchid root" specimen (NSS Ltd.) with 623-nm LED illumination, and its retrieved complex amplitudes are displayed in Fig. 8. MDPR is run for 50 and 500 iterations to generate the reconstructed amplitudes [Figs. 8(a) and 8(b)] and phases [Figs. 8(d) and 8(e)]. By plugging in weighted feedback and running 50 iterations, the amplitude and phase from MDPRF are obtained in Figs. 8(c) and 8(f), respectively. Comparing Figs. 8(a) and 8(c), the vague outline surrounding the specimen is removed by MDPRF, which accords with the simulation analysis. The profiles along the blue, red, and black arrows in Figs. 8(a)–8(c) are plotted in Fig. 8(g). The profile of MDPRF after 50 iterations is close to that of MDPR after 500 iterations, which implies that weighted feedback indeed speeds up the convergence. For the phase reconstruction, it is worth noting that the contrast of the reconstructed phase is enhanced by MDPRF, which makes the biological tissue more distinct against the background noise.

4.2 Lensless Multiwavelength Imaging

The advantage of the LED matrix is that the wavelength can be switched programmatically without any extra mechanical device. The experimental implementation of MWPR is the same as in Fig. 6. Choosing a proper transverse distance $d$, three diffraction patterns are measured by sequentially switching the red, green, and blue channels of the LED (623, 532, and 467 nm). At this recording distance, the reconstructed results of the orchid root (NSS Ltd.) by MWPR and MWPRF are presented in Fig. 9. Technically, the red, green, and blue LED channels are fabricated side by side in the Adafruit 607, which leads to relative shifts among the multiwavelength diffraction patterns. The relative position distribution is derived from a cross-correlation operation.27 The detailed process is as follows: (1) the peak position of the autocorrelation of the diffraction pattern recorded with the blue LED is chosen as the start position and (2) the cross-correlations between the start image and the other patterns are used in turn to calculate the relative shifts. The resulting relative position distribution of the three patterns is shown in Fig. 9(a). To accomplish MWPR, we align these patterns to the start position and cut out the irrelevant parts. After this preparation, the reconstructed images of MWPR and MWPRF are shown in Figs. 9(b)–9(e). The retrieved phases in Figs. 9(b) and 9(d) suffer from phase wrapping.
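These wrapped phases are unwrapped with a DCT-based least-squares solver.42,43 The sketch below is our illustration of the standard formulation (a Poisson equation whose right-hand side is built from the wrapped phase and which is solved with the discrete cosine transform), not the authors' code; the recovered phase is determined only up to an additive constant.

```python
import numpy as np
from scipy.fft import dctn, idctn

def _laplace_eigs(shape):
    """Eigenvalues of the Neumann-boundary discrete Laplacian under DCT-II."""
    m, n = shape
    yy = 2.0 * np.cos(np.pi * np.arange(m) / m) - 2.0
    xx = 2.0 * np.cos(np.pi * np.arange(n) / n) - 2.0
    return yy[:, None] + xx[None, :]

def _laplacian(f, eigs):
    return idctn(dctn(f, norm='ortho') * eigs, norm='ortho')

def _inv_laplacian(g, eigs):
    spec = dctn(g, norm='ortho')
    spec = np.divide(spec, eigs, out=np.zeros_like(spec), where=eigs != 0)
    return idctn(spec, norm='ortho')

def dct_unwrap(psi):
    """Least-squares phase unwrapping via DCT (in the spirit of Refs. 42, 43)."""
    eigs = _laplace_eigs(psi.shape)
    rhs = (np.cos(psi) * _laplacian(np.sin(psi), eigs)
           - np.sin(psi) * _laplacian(np.cos(psi), eigs))
    return _inv_laplacian(rhs, eigs)
```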
Applying the DCT least-squares algorithm42,43 for phase unwrapping, the corresponding unwrapped phases are presented in Figs. 9(c) and 9(e). Note that the contrast of the retrieved phase is strengthened by weighted feedback. However, the imaging quality of the multiwavelength strategy is not as good as that in Fig. 8. This discrepancy is mainly attributed to the uncertainty of the central wavelengths. We propose two solutions to this problem. The direct solution is to introduce narrow-bandwidth filters behind the condenser lens to cut out the uncertain spectral components. However, inserting a filter reduces the radiance reaching the CCD camera, which would heavily impair the quality of the reconstructed image; it would accordingly be necessary to combine the weighted feedback modality with a noise-suppression method. The other solution is to replace the present LED with a high-power one, which easily removes the obstacle of the low signal-to-noise ratio. However, the incoherence of such a light source undermines the effectiveness of distance-based angular spectrum propagation; in that case, it is workable to place the recording plane in the far-field regime for IPR. We believe that this challenge will be overcome in the future.

4.3 Lensless Imaging Through a Scattering Layer

Optical imaging through a scattering medium holds great promise for biomedical engineering, since biological tissue diffuses any incident beam into a speckle pattern, limiting the resolution and penetration.41 IPR, as a useful tool, can recover a target hidden behind the scattering medium. Here, we apply our weighted feedback acceleration in this situation and compare MDPRF with MDPR. The experimental diagram is shown in Fig. 10(a). A fiber laser with a wavelength of 532 nm provides the illumination. A ground glass diffuser (GGD, Thorlabs, 120 grit) is chosen as the scattering medium and placed between the CCD camera and the sample. The sample is the number "5" of a negative 1951 USAF target (R3L3S1N, Thorlabs). The experimental parameters are listed as follows: (1) the imaging size is ; (2) the distance from the sample to the GGD is 40 mm; and (3) , , and . The corresponding retrieved images after 10, 25, 50, and 100 iterations are shown in Figs. 10(b)–10(e) for MDPR and Figs. 10(f)–10(i) for MDPRF. Weighted feedback evidently still works well in the speckle field: in only 25 iterations, MDPRF is capable of retrieving the structure of the target, while MDPR needs 100 iterations or more. This result is consistent with the simulation analysis in Sec. 3.

4.4 Lens-Based Imaging

At present, the resolution of lensless imaging is limited by the finite pixel size of the imaging sensor. To observe the performance of our method on fine structures, we apply weighted feedback in microscopy (magnification , ) and use translucent human cheek cells as the sample. Here, the MDPR algorithm combines a set of defocused intensity images to reconstruct the phase of the in-focus image. The corresponding datasets are from Laura Waller's team.44 The incident light is filtered from white light to a wavelength of 650 nm (10-nm bandwidth). The number of recorded images is 129 (one focused image and 128 defocused images, ). The defocus range is [, ] and the interval is . The focused intensity image is used as the initialization for MDPR.
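As an illustration of how this defocus stack maps onto the MDPR loop sketched in Sec. 2, the following configuration-style snippet initializes the object with the focused intensity image and pairs each defocused image with a signed propagation distance. The numerical values and placeholder arrays are hypothetical; the real dataset (Ref. 44) and settings differ.

```python
import numpy as np

# Hypothetical placeholders, not the dataset's actual values.
wavelength, dx = 650e-9, 0.5e-6            # 650-nm filtered illumination; assumed pixel pitch
step, n_defocus = 1e-6, 128                # assumed defocus interval; 128 defocused images

focused = np.ones((256, 256))              # stand-in for the recorded in-focus intensity
stack = [np.ones((256, 256))] * n_defocus  # stand-ins for the defocused intensities

# Signed defocus distances, symmetric about the focal plane.
distances = [step * k for k in range(-n_defocus // 2, n_defocus // 2)]

# Initialization: measured in-focus amplitude with zero phase
# (the Sec. 2 sketch would start from obj0 instead of a zero matrix here).
obj0 = np.sqrt(focused).astype(complex)

# The moduli fed to the MDPRF loop are the square roots of the intensities.
moduli = [np.sqrt(im) for im in stack]
# obj = mdprf(moduli, distances, wavelength, dx, iters=10)   # using the Sec. 2 sketch
```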
Choosing a set of defocused intensity images in front of and behind the focal plane, the reconstructed phases are then obtained by iterative back-and-forth propagation and amplitude replacement. The corresponding results are shown in Fig. 11. With five recorded images (propagation distances: , 0, 4, and ), the retrieved phases after 10, 100, and 1000 iterations are shown in Figs. 11(a)–11(f). The acceleration of convergence by the weighted feedback operation is easy to discern: at the same number of iterations, the imaging quality of MDPRF is superior to that of MDPR. With all 129 recorded images, the results of the two methods after 10 iterations are given in Figs. 11(g) and 11(h). The cells are successfully reconstructed by both methods, and the imaging contrast of MDPRF is higher than MDPR's. To further exhibit this improvement, the profiles of Figs. 11(g) and 11(h) are plotted in Fig. 11(i), which indicates that the cell edges are rendered sharper. Thus, even with few recorded images and short defocus intervals, weighted feedback ensures high-accuracy reconstruction for MDPR.

5. Conclusion

We extend the application of the weighted feedback operation to partially coherent illumination and speckle illumination. The MDPR and MWPR algorithms are modified into their weighted counterparts, the MDPRF and MWPRF algorithms. Simulations prove that these modified methods speed up convergence in both partially coherent and speckle fields. In experiment, a programmable LED matrix is used to form lensless multidistance and multiwavelength imaging systems. Compared with conventional fiber laser illumination, the partial coherence of the light source indeed improves the imaging quality. Using weighted feedback to retrieve the resolution chart and the orchid root, the imaging contrast and convergence speed are greatly enhanced for both lensless imaging strategies. Furthermore, our method also functions well in optical imaging through a scattering medium. Similarly, the MDPR algorithm and its weighted version, the MDPRF algorithm, are applied in microscopy to image translucent human cheek cells, which demonstrates that weighted feedback not only enhances the convergence speed but also strengthens the phase contrast for the translucent sample. This work provides an effective strategy for high-contrast imaging with the IPR method. Moreover, owing to its fast and accurate convergence, weighted feedback can greatly reduce the number of required measurements, which enables a low-cost and compact experimental setup for label-free biological imaging.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Nos. 61377016, 61575055, and 61575053), the Fundamental Research Funds for the Central Universities (No. HIT.BRETIII.201406), the Program for New Century Excellent Talents in University (No. NCET-12-0148), the China Postdoctoral Science Foundation (Nos. 2013M540278 and 2015T80340), and the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China. The authors thank Mr. Cheng Shen for polishing the English.

References
1. B. Bhaduri et al., "Diffraction phase microscopy: principles and applications in materials and life sciences," Adv. Opt. Photonics 6, 57–119 (2014). http://dx.doi.org/10.1364/AOP.6.000057
2. H. Majeed et al., "Quantitative phase imaging for medical diagnosis," J. Biophotonics 10(2), 177–205 (2017). http://dx.doi.org/10.1002/jbio.201600113
3. W. Osten et al., "Recent advances in digital holography," Appl. Opt. 53(27), G44–G63 (2014). http://dx.doi.org/10.1364/AO.53.000G44
4. Y. Shechtman et al., "Phase retrieval with application to optical imaging: a contemporary overview," IEEE Signal Process. Mag. 32, 87–109 (2015). http://dx.doi.org/10.1109/MSP.2014.2352673
5. E. McLeod and A. Ozcan, "Unconventional methods of imaging: computational microscopy and compact implementations," Rep. Prog. Phys. 79, 076001 (2016). http://dx.doi.org/10.1088/0034-4885/79/7/076001
6. G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013). http://dx.doi.org/10.1038/nphoton.2013.187
7. S. Pacheco, G. Zheng, and R. Liang, "Reflective Fourier ptychography," J. Biomed. Opt. 21(2), 026010 (2016). http://dx.doi.org/10.1117/1.JBO.21.2.026010
8. J. Sun et al., "Resolution-enhanced Fourier ptychographic microscopy based on high-numerical-aperture illuminations," Sci. Rep. 7, 1187–1197 (2017). http://dx.doi.org/10.1038/s41598-017-01346-7
9. R. Horstmeyer et al., "Diffraction tomography with Fourier ptychography," Optica 3(8), 827–835 (2016). http://dx.doi.org/10.1364/OPTICA.3.000827
10. L. Tian and L. Waller, "3D intensity and phase imaging from light field measurements in an LED array microscope," Optica 2(2), 104–111 (2015). http://dx.doi.org/10.1364/OPTICA.2.000104
11. Y. Yao et al., "Ptychographic phase microscope based on high-speed modulation on the illumination beam," J. Biomed. Opt. 22(3), 036010 (2017). http://dx.doi.org/10.1117/1.JBO.22.3.036010
12. A. Anand, V. Chhaniwal, and B. Javidi, "Quantitative cell imaging using single beam phase retrieval method," J. Biomed. Opt. 16(6), 060503 (2011). http://dx.doi.org/10.1117/1.3589090
13. M. R. Teague, "Deterministic phase retrieval: a Green's function solution," J. Opt. Soc. Am. 73, 1434–1441 (1983). http://dx.doi.org/10.1364/JOSA.73.001434
14. L. Waller, L. Tian, and G. Barbastathis, "Transport of intensity phase-amplitude imaging with higher order intensity derivatives," Opt. Express 18(12), 12552–12561 (2010). http://dx.doi.org/10.1364/OE.18.012552
15. C. Zuo et al., "High-resolution transport-of-intensity quantitative phase microscopy with annular illumination," Sci. Rep. 7, 7654 (2017). http://dx.doi.org/10.1038/s41598-017-06837-1
16. R. W. Gerchberg and W. O. Saxton, "A practical algorithm for the determination of phase from image and diffraction plane pictures," Optik 35, 237–246 (1972).
17. J. R. Fienup, "Phase retrieval algorithms: a comparison," Appl. Opt. 21(15), 2758–2769 (1982). http://dx.doi.org/10.1364/AO.21.002758
18. A. Greenbaum et al., "Wide-field computational imaging of pathology slides using lens-free on-chip microscopy," Sci. Transl. Med. 6, 267ra175 (2014). http://dx.doi.org/10.1126/scitranslmed.3009850
19. S. Dong et al., "High-resolution fluorescence imaging via pattern-illuminated Fourier ptychography," Opt. Express 22(17), 20856–20870 (2014). http://dx.doi.org/10.1364/OE.22.020856
20. T. M. Godden et al., "Phase calibration target for quantitative phase imaging with ptychography," Opt. Express 24(7), 7679–7692 (2016). http://dx.doi.org/10.1364/OE.24.007679
21. J. M. Rodenburg and H. M. L. Faulkner, "A phase retrieval algorithm for shifting illumination," Appl. Phys. Lett. 85(20), 4795–4797 (2004). http://dx.doi.org/10.1063/1.1823034
22. A. M. Maiden and J. M. Rodenburg, "An improved ptychographical phase retrieval algorithm for diffractive imaging," Ultramicroscopy 109(10), 1256–1262 (2009). http://dx.doi.org/10.1016/j.ultramic.2009.05.012
23. W. Yu et al., "High-quality image reconstruction method for ptychography with partially coherent illumination," Phys. Rev. B 93(24), 241105 (2016). http://dx.doi.org/10.1103/PhysRevB.93.241105
24. G. Pedrini, W. Osten, and Y. Zhang, "Wave-front reconstruction from a sequence of interferograms recorded at different planes," Opt. Lett. 30(8), 833–835 (2005). http://dx.doi.org/10.1364/OL.30.000833
25. Z. Liu et al., "Iterative phase amplitude retrieval from multiple images in gyrator domains," J. Opt. 17(2), 025701 (2015). http://dx.doi.org/10.1088/2040-8978/17/2/025701
26. C. Shen et al., "Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint," Opt. Express 25(14), 16235–16249 (2017). http://dx.doi.org/10.1364/OE.25.016235
27. C. Guo et al., "Axial multi-image phase retrieval under tilt illumination," Sci. Rep. 7, 7562 (2017). http://dx.doi.org/10.1038/s41598-017-08045-3
28. P. Bao et al., "Phase retrieval using multiple illumination wavelengths," Opt. Lett. 33(4), 309–311 (2008). http://dx.doi.org/10.1364/OL.33.000309
29. D. W. E. Noom, K. S. E. Eikema, and S. Witte, "Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction," Opt. Lett. 39(2), 193–196 (2014). http://dx.doi.org/10.1364/OL.39.000193
30. X. Pan, C. Liu, and J. Zhu, "Single shot ptychographical iterative engine based on multi-beam illumination," Appl. Phys. Lett. 103(17), 171105 (2013). http://dx.doi.org/10.1063/1.4826273
31. J. A. Rodrigo et al., "Wavefield imaging via iterative retrieval based on phase modulation diversity," Opt. Express 19(19), 18621–18635 (2011). http://dx.doi.org/10.1364/OE.19.018621
32. L. W. Whitehead et al., "Diffractive imaging using partially coherent x rays," Phys. Rev. Lett. 103, 243902 (2009). http://dx.doi.org/10.1103/PhysRevLett.103.243902
33. Z. Jingshan et al., "Partially coherent phase imaging with simultaneous source recovery," Biomed. Opt. Express 6(1), 257–265 (2015). http://dx.doi.org/10.1364/BOE.6.000257
34. L. Tian et al., "Multiplexed coded illumination for Fourier ptychography with an LED array microscope," Biomed. Opt. Express 5(7), 2376–2389 (2014). http://dx.doi.org/10.1364/BOE.5.002376
35. M. Chen, L. Tian, and L. Waller, "3D differential phase contrast microscopy," Biomed. Opt. Express 7(10), 3940–3950 (2016). http://dx.doi.org/10.1364/BOE.7.003940
36. D. Lee et al., "Color-coded LED microscopy for multi-contrast and quantitative phase-gradient imaging," Biomed. Opt. Express 6(12), 4912–4922 (2015). http://dx.doi.org/10.1364/BOE.6.004912
37. C. Guo et al., "A fast-converging iterative method via weighted feedback for multi-distance diffractive imaging," Sci. Rep. (2017).
38. D. Voelz, Computational Fourier Optics: A MATLAB Tutorial, SPIE Press, Bellingham, Washington (2011).
39. A. Anand et al., "Wavefront sensing with random amplitude mask and phase retrieval," Opt. Lett. 32(11), 1584–1586 (2007). http://dx.doi.org/10.1364/OL.32.001584
40. A. K. Singh et al., "Scatter-plate microscope for lensless microscopy with diffraction limited resolution," Sci. Rep. 7, 10687 (2017). http://dx.doi.org/10.1038/s41598-017-10767-3
41. O. Katz et al., "Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations," Nat. Photonics 8(10), 784–790 (2014). http://dx.doi.org/10.1038/nphoton.2014.189
42. M. A. Schofield and Y. Zhu, "Fast phase unwrapping algorithm for interferometric applications," Opt. Lett. 28(14), 1194–1196 (2003). http://dx.doi.org/10.1364/OL.28.001194
43. W. Shi, Y. Zhu, and Y. Yao, "Discussion about the DCT/FFT phase-unwrapping algorithm for interferometric applications," Optik 121, 1443–1449 (2010). http://dx.doi.org/10.1016/j.ijleo.2009.02.006
44. Z. Jingshan et al., "Transport of intensity phase imaging by intensity spectrum fitting of exponentially spaced defocus planes," Opt. Express 22(9), 10661–10674 (2014). http://dx.doi.org/10.1364/OE.22.010661
Biography

Cheng Guo is currently a PhD student in the Department of Automatic Test and Control, Harbin Institute of Technology, under the supervision of Professor Zhengjun Liu. His research focuses on the development and application of iterative phase retrieval methods.

Qiang Li is currently a master's student in the Department of Automatic Test and Control, Harbin Institute of Technology, under the supervision of Professor Jian Liu. His research mainly focuses on computational photography and image processing.

Xiaoqing Zhang is currently a PhD student at the School of Biological Science and Technology, Harbin Institute of Technology, under the supervision of Professor Huan Nie. His research focuses on glycomics.

Jiubin Tan is the head of the Precision Instrument Engineering School, Harbin Institute of Technology. He received his PhD from Harbin Institute of Technology in 1991. He is an academician of the Chinese Academy of Engineering. He is also a standing committee member of the International Committee on Measurements and Instrumentation, the chairman of the China Measuring Instrument Specialty Committee, the managing director of the China Instrument and Control Society, and the managing director of the Chinese Society for Measurement.

Shutian Liu is a professor in the Department of Physics, Harbin Institute of Technology. He has published more than 200 peer-reviewed journal articles in the field of optics and 1 book. His current research interests include optical information processing, optical information security, nonlinear optics, and quantum optics. He is a senior member of the Optical Society of America (OSA) and a fellow of the Chinese Physical Society.

Zhengjun Liu is a professor in the Department of Automatic Test and Control, Harbin Institute of Technology, China. He was honored by the Program for New Century Excellent Talents in University in 2012. He has published 97 peer-reviewed journal articles in the field of optics, 2 books, and 1 book chapter. He is a senior member of OSA and a member of IEEE. His current research interests include optical image processing and super-resolution imaging.