# Aberration Estimation for Synthetic Aperture Digital Holographic Microscope Using Deep Neural Network


## Abstract


## 1. Introduction

## 2. Aberration of Synthetic Aperture Digital Holographic Microscope

${A}_{o}$ and ${A}_{r}$ are the amplitudes of the object and reference beams, and ${\theta}_{o}$ is the phase profile of the object beam. Using Equation (2), we can numerically reconstruct the complex wave function ${U}_{O}$ of the object as follows:
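The numerical recovery of ${U}_{O}$ from phase-shifted holograms can be sketched as follows. This is a minimal NumPy illustration assuming the standard 4-step phase-shifting formula with the reference phase stepped by $\pi/2$ between captures; the paper's own Equation (3) may use a different sign or ordering convention.

```python
import numpy as np

def reconstruct_object_wave(I1, I2, I3, I4):
    """Recover the complex object wave from four phase-shifted holograms.

    Assumes the reference phase advances by pi/2 between captures, so the
    object wave is proportional to (I1 - I3) + 1j*(I2 - I4).  This is the
    textbook 4-step phase-shifting formula, used here as a stand-in for the
    paper's Equation (3).
    """
    return (I1 - I3) + 1j * (I2 - I4)

# Synthetic check: build holograms from a known object wave and recover it.
N = 64
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
U = np.exp(1j * (3 * x + 2 * y))          # unit-amplitude object beam (toy)
A_r = 1.0                                  # reference beam amplitude
I = [np.abs(U + A_r * np.exp(1j * n * np.pi / 2))**2 for n in range(4)]
U_rec = reconstruct_object_wave(*I) / (4 * A_r)
print(np.allclose(U_rec, U))               # prints True
```

Dividing by $4{A}_{r}$ removes the constant factor that the phase-shifting sum introduces, so the recovered field matches the object wave exactly in this noise-free toy case.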

## 3. Generation of Training Datasets for DNN Model

${f}_{1}$, and the beam passing through the objective lens formed a frequency distribution in the Fourier domain at a distance ${f}_{1}$ from the objective lens. In the Fourier domain, a mask with a 224-pixel diameter was applied to realize the NA of the objective lens, which corresponds to an NA of 0.0266. The masked signal was Fourier-transformed by the tube lens and imaged onto the CCD plane. As the last step, we calculated the intensity map in the out-of-focus region, a distance $d$ away from the CCD plane, where ${f}_{1}$, ${f}_{2}$, and $d$ are 10 mm, 200 mm, and 10 mm, respectively.
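A minimal numerical sketch of this imaging chain is shown below, with the stated 224-pixel pupil mask (NA = 0.0266) and defocus distance $d$ = 10 mm. The grid size, pixel pitch, wavelength, and plane-wave input are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

# Assumed sampling parameters (illustrative, not from the paper).
N, pitch, wl, d = 256, 5e-6, 532e-9, 10e-3   # samples, pitch [m], wavelength [m], defocus [m]

U_in = np.ones((N, N), dtype=complex)        # placeholder plane-wave field

# Objective lens (f1): the field in its back focal plane is, up to scaling,
# the Fourier transform of the input field.
F = np.fft.fftshift(np.fft.fft2(U_in))

# Circular mask, 224 pixels in diameter, realizing the objective NA.
yy, xx = np.mgrid[-N//2:N//2, -N//2:N//2]
F = F * (xx**2 + yy**2 <= 112**2)

# Tube lens (f2): transform back to the image (CCD) plane.
U_ccd = np.fft.ifft2(np.fft.ifftshift(F))
I_focus = np.abs(U_ccd)**2

# Angular-spectrum propagation over d gives the out-of-focus intensity map.
fx = np.fft.fftfreq(N, pitch)
FX, FY = np.meshgrid(fx, fx)
kz_sq = np.maximum(1.0 / wl**2 - FX**2 - FY**2, 0.0)   # clip evanescent part
H = np.exp(2j * np.pi * d * np.sqrt(kz_sq))
I_defocus = np.abs(np.fft.ifft2(np.fft.fft2(U_ccd) * H))**2
```

Because the propagation kernel `H` has unit magnitude, total energy is conserved between the in-focus and out-of-focus intensity maps, which is a useful sanity check for the simulation.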

${Z}_{1}$ and ${Z}_{2}$ are related to the carrier frequencies, and they are limited to prevent aliasing in the simulation. The coefficients with $n$ greater than 3 are randomly chosen in proportion to the carrier frequencies. Note that each randomly chosen combination of coefficients represents an arbitrary illumination phase. We generated a total of 100,000 samples. After generation, we fed the phase maps into our simulation and computed the resulting intensity maps in the Fourier domain, on the focal plane, and on the out-of-focus plane. The three intensity maps were normalized and combined into three-channel image files.
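The sample-generation step can be sketched as follows: random coefficients weight the Zernike terms of Table 1 to build an illumination phase on the unit pupil, and three normalized intensity maps are stacked as R, G, B channels. The coefficient range and the placeholder intensity maps below are illustrative; in the paper the maps come from the optical simulation described above.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 224
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
rho, phi = np.hypot(x, y), np.arctan2(y, x)

# Zernike terms Z1..Z9 as listed in Table 1.
zernike = [
    rho * np.sin(phi), rho * np.cos(phi),                     # Z1, Z2 (tilts)
    rho**2 * np.sin(2 * phi), 2 * rho**2 - 1,                 # Z3, Z4
    rho**2 * np.cos(2 * phi), rho**3 * np.sin(3 * phi),       # Z5, Z6
    (3 * rho**3 - 2 * rho) * np.sin(phi),                     # Z7
    (3 * rho**3 - 2 * rho) * np.cos(phi),                     # Z8
    rho**3 * np.cos(3 * phi),                                 # Z9
]
coeffs = rng.uniform(-5.0, 5.0, size=len(zernike))            # assumed range
phase = sum(c * z for c, z in zip(coeffs, zernike)) * (rho <= 1)

def normalize(a):
    return (a - a.min()) / (a.max() - a.min() + 1e-12)

# Placeholder stand-ins for the Fourier-domain, focal-plane, and
# out-of-focus intensity maps produced by the full simulation.
pupil = (rho <= 1) * np.exp(1j * phase)
I_fourier = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
I_focus = np.abs(pupil)**2
fx = np.fft.fftfreq(N)
FX, FY = np.meshgrid(fx, fx)
I_defocus = np.abs(np.fft.ifft2(np.fft.fft2(pupil)
                                * np.exp(1j * 50 * (FX**2 + FY**2))))**2

# One three-channel training sample: each map normalized to [0, 1].
sample = np.stack([normalize(m) for m in (I_fourier, I_focus, I_defocus)],
                  axis=-1)
```

Stacking the three maps into one image lets a standard three-channel CNN consume all of them at once without any architectural changes.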

## 4. Aberration Compensation Using Deep Neural Network

${Z}_{0}$, which represents the piston phase, and the final loss value was mainly due to errors in this ${Z}_{0}$ coefficient. The reason ${Z}_{0}$ is difficult to predict accurately is that the piston value represents an overall phase shift that does not significantly affect the intensity maps in either the frequency domain or the image domain. Thus, the DNN model cannot learn any feature related to ${Z}_{0}$ and therefore struggles to predict it.
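This piston insensitivity is easy to verify numerically: a global phase factor cancels in every intensity measurement, so no intensity-based input can carry information about ${Z}_{0}$. A short NumPy demonstration on a random complex field:

```python
import numpy as np

# A piston term multiplies the field by a global phase exp(1j*c), which
# cancels in every |.|^2 measurement, in both image and Fourier domains.
rng = np.random.default_rng(1)
U = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
piston = np.exp(1j * 1.234)                 # arbitrary piston phase

I_image = np.abs(U)**2
I_fourier = np.abs(np.fft.fft2(U))**2
assert np.allclose(np.abs(U * piston)**2, I_image)
assert np.allclose(np.abs(np.fft.fft2(U * piston))**2, I_fourier)
```

Since the training inputs are exactly such intensity maps, the gradient of any loss with respect to a piston-dependent feature is identically zero, which explains the residual loss observed for ${Z}_{0}$.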

${x}_{i}$ and ${y}_{i}$. A total of 153 illumination phase profiles were generated. Samples 1 through 5 represent the bright field, and the others represent the dark field. Samples 6 through 121 were generated with coefficients in the same range as the training data, whereas samples 122 through 153 were generated with coefficients from a larger range, in order to evaluate how accurately the model predicts beyond the training range. Using the generated illumination phase profiles and a circular aperture in SA-DHM simulations, we generated ${I}_{n=1,2,3,4}$ on the CCD plane and calculated ${U}_{O}$ using Equation (3). The calculated ${U}_{O}$ was used to generate intensity maps in the frequency domain, on the CCD plane, and on the out-of-focus plane, and these three intensity maps were combined into one BMP file; 153 BMP files were generated in total. We predicted the Zernike coefficients from these BMP files using the trained model, and Figure 8 shows the spatial frequency of the illumination beam according to the lens index, together with the errors between the actual Zernike coefficients and those predicted by the trained model. For inputs up to sample 153, most errors stayed within 50% of the prediction results, which indicates that the model should be used within its training range. These results suggest that it is reasonable to use the predicted coefficients within that range for SA-DHM.

${Z}_{0}$ were used with the ground truth values. Figure 9a shows the reconstruction results using the perfect illumination phase profile, and Figure 9b shows the results using the ground-truth ${Z}_{1}$ and ${Z}_{2}$ values without aberration compensation during the reconstruction process. In these simulation results, it is difficult to distinguish the high-resolution patterns since the aberrations were not compensated. On the other hand, Figure 9c shows the result of using the Zernike coefficients predicted by the trained model, with aberration compensation. The high-resolution image was reconstructed with a high degree of fidelity, closely matching the ground truth. Figure 9d shows the result of restoring the sample with coefficients from a larger range than the training range, i.e., lenses up to the 151st. Although the coefficients predicted from the illumination beam had errors of up to 10%, a high-resolution restoration of the sample was still achieved.

## 5. Discussion

${Z}_{0}$ cannot be predicted by the trained model because the diffraction pattern is not changed by this coefficient. Therefore, considering that ${Z}_{0}$ is unpredictable, we made the Fourier-domain phase of the first wave function at a specific point in the overlapped regions match the piston phase, as shown in Figure 12a. If we perform SA-DHM without matching the piston phases, as shown in Figure 12b, the reconstructed image is severely distorted and it is difficult to distinguish the original pattern. On the other hand, by compensating for the piston phase, as shown in Figure 12c, we obtain a reconstructed image with clearly distinguishable patterns.
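This piston-matching idea can be sketched in a few lines: pick one point inside the overlap of two sub-aperture spectra and rotate the second spectrum's global phase until both agree there. The point index and the offset below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
F1 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
F2 = F1 * np.exp(1j * 0.8)                   # same field, unknown piston offset

iy, ix = 32, 32                              # a point inside the overlap region
delta = np.angle(F1[iy, ix]) - np.angle(F2[iy, ix])
F2_matched = F2 * np.exp(1j * delta)         # rotate global phase to agree

assert np.allclose(F2_matched, F1)
```

Because a piston offset is a single global unknown per sub-aperture, one shared point in the overlap is, in principle, enough to fix it; in practice an average over the overlap region would be more robust to noise.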

## 6. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest


**Figure 3.** Experimental measurements of intensity phase profiles in the focal plane and in the Fourier domain. The measurements are from illumination beams with (**a**) $\overrightarrow{{k}_{i}}$ = (122.1, −163.6, 11,808.3) and (**b**) $\overrightarrow{{k}_{i}}$ = (187.6, −164.6, 11,807.9).

**Figure 4.** Structure of ResNet50 for Zernike coefficients. The ResNet consists of identity blocks and convolution blocks, and a fully connected layer is added to estimate the ten lowest Zernike coefficients.

**Figure 6.** Examples of the training data. The intensity patterns in (**a**) the Fourier domain, (**b**) the focal plane, and (**c**) the out-of-focus plane. (**d**) These occupy the R, G, and B channels of the training data, respectively. The images in the first row were numerically generated under the bright-field condition, and the others under the dark-field condition.

**Figure 7.** Training history of the loss graph. Red and blue lines represent training loss and validation loss, respectively.

**Figure 9.** SA-DHM reconstruction results: synthesized intensity profiles in the Fourier domain, reconstructed images in the focal plane, and their enlarged images. (**a**) Ground truth and (**b**) reconstruction without aberration compensation. Reconstructions with aberration compensation (**c**) within and (**d**) outside the range of the training data.

**Figure 10.** Resolution analysis of the SA-DHM. (**a**) Resolution target. (**b**) MTF chart with and without aberration compensation using the proposed DNN.

**Figure 11.** SA-DHM reconstruction results with the spoke resolution chart, and their enlarged images. (**a**) Ground truth and (**b**) reconstruction without aberration compensation. (**c**) Reconstruction with aberration compensation.

**Figure 12.** Piston phase matching and numerical reconstruction results. (**a**) Piston phase matching at a specific point in the overlapped region. Numerical reconstruction results (**b**) without and (**c**) with piston phase matching.

**Table 1.** Examples of normalized Zernike coefficients of aberrations of illumination beams with wavevector $\overrightarrow{{k}_{i}}$.

| Zernike Index | Equation | Coefficient for $\overrightarrow{{k}_{i}}$ = (122.1, −163.6, 11,808.3) | Coefficient for $\overrightarrow{{k}_{i}}$ = (187.6, −164.6, 11,807.9) |
|---|---|---|---|
| 0 | $1$ | – | – |
| 1 | $\rho\mathrm{sin}\varphi$ | – | – |
| 2 | $\rho\mathrm{cos}\varphi$ | – | – |
| 3 | $\rho^{2}\mathrm{sin}2\varphi$ | 0.5826 | 10.7284 |
| 4 | $2\rho^{2}-1$ | −4.8872 | −11.1697 |
| 5 | $\rho^{2}\mathrm{cos}2\varphi$ | −0.8507 | −1.1836 |
| 6 | $\rho^{3}\mathrm{sin}3\varphi$ | −0.8823 | 4.3718 |
| 7 | $\left(3\rho^{3}-2\rho\right)\mathrm{sin}\varphi$ | 1.1083 | 10.0527 |
| 8 | $\left(3\rho^{3}-2\rho\right)\mathrm{cos}\varphi$ | −1.2794 | −9.5085 |
| 9 | $\rho^{3}\mathrm{cos}3\varphi$ | 0.7599 | −2.6971 |


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Jeon, H.; Jung, M.; Lee, G.; Hahn, J.
Aberration Estimation for Synthetic Aperture Digital Holographic Microscope Using Deep Neural Network. *Sensors* **2023**, *23*, 9278.
https://doi.org/10.3390/s23229278
