
Computational Portable Microscopes for Point-of-Care-Test and Tele-Diagnosis

School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
The Affiliated Suzhou Hospital of Nanjing Medical University, Suzhou Municipal Hospital, Suzhou 215002, China
School of Optoelectronic Science and Engineering & Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou 215006, China
College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, China
Research Center for Intelligent Sensing, Zhejiang Lab, Hangzhou 311100, China
Authors to whom correspondence should be addressed.
Cells 2022, 11(22), 3670;
Submission received: 4 October 2022 / Revised: 11 November 2022 / Accepted: 16 November 2022 / Published: 18 November 2022
(This article belongs to the Collection Computational Imaging for Biophotonics and Biomedicine)


In bio-medical mobile workstations, e.g., for epidemic virus/bacteria prevention, outdoor field medical treatment and bio-chemical pollution monitoring, conventional bench-top microscopic imaging equipment is of limited use. Comprehensive multi-mode (bright/dark field imaging, fluorescence excitation imaging, polarized light imaging, differential interference microscopy imaging, etc.) biomedical microscopy imaging systems are generally large and expensive, and they require professional operation, which implies high labor, money and time costs. These characteristics prevent them from being applied in bio-medical mobile workstations, which need microscopy systems that are inexpensive and capable of fast, timely and large-scale deployment. The development of lightweight, low-cost and portable microscopic imaging devices can meet these demands. Presently, driven by the increasing needs of point-of-care testing and tele-diagnosis, high-performance computational portable microscopes are being widely developed. Bluetooth modules, WLAN modules and 3G/4G/5G modules now feature very small sizes and low prices, and industrial imaging lenses, microscope objective lenses and CMOS/CCD photoelectric image sensors are likewise available in small sizes and at low prices. Here we review and discuss typical computational, portable and low-cost microscopes with refined specifications and schematics, from the aspects of optics, electronics, algorithm principles and typical bio-medical applications.

1. Introduction

Disease cross-regional transmission (e.g., the COVID-19 outbreak), food safety incidents (e.g., pathogenic Escherichia coli) and environmental pollution (e.g., water eutrophication) might break out suddenly and unexpectedly [1,2,3,4,5,6]. These events threaten human health, life and the environment. Biological and medical detection and analysis are important ways to prevent and control these problems, and for such detection and analysis the microscope is a very important optical instrument, one which has directly promoted the development of biotechnology, medicine and pathology. Nowadays, research-level and desktop microscopes are widely used in medical centers and scientific institutes [7,8,9,10,11,12,13,14,15,16,17,18], including bright-field microscopes, dark-field microscopes, fluorescence microscopes, Zernike-phase-contrast (ZPC) microscopes, differential-interference-contrast (DIC) microscopes, laser confocal microscopes, super-resolution microscopy imaging systems, etc. These traditional microscopy imaging systems are suitable for hospitals and research-level laboratories, but they are expensive and bulky, and their operation and analysis require professional skill. Additionally, medical centers and scientific institutes might be far away from outdoor biological/medical sampling sites. All of these issues affect detection speed and efficiency; especially in some underdeveloped remote areas, the medical conditions and resources cannot guarantee effective detection and analysis. Further, with the development of mobile/online medicine, daily medical inspection demands are arising quickly. Addressing these demands, point-of-care-test (POCT) devices and techniques have attracted many researchers and engineers for observation and tele-diagnosis [19,20,21,22,23,24,25,26].
POCT aims to quickly collect samples, analyze results and provide test reports, so small, simple-to-operate POCT microscopes are particularly desirable [19,20,21]. POCT diagnostic microscopy devices generally need some of the following characteristics: (1) timely results: the test results can be provided quickly on site, without the need to wait several days for a printed report; (2) simple operation: anyone can operate the instrument without special training, and it can be used over a wide geographical range without regional restrictions; (3) qualitative or quantitative accuracy: instant detection requires not only rapid results but also relatively accurate ones; qualitative and quantitative accuracy is the most critical factor for user adoption, and POCT detection should rival the results of professional instrumentation; (4) portability: to achieve rapid detection in any location, a small and portable form factor is essential.
With the development of consumer-level electronic and optical products, modern opto-electronic elements (e.g., laser diodes (LDs) or light-emitting diodes (LEDs), optical fibers and binary optical components, CMOS/CCD photoelectric image sensors) are characterized by smaller sizes and lower prices. These promote portable biomedical testing platforms, mainly focusing on spectroscopy [22], microscopy and so on. Distinct from spectroscopy, POCT microscopy provides direct 2D visual information, and POCT microscopy devices are becoming cheaper and smaller. More importantly, we are in the 5G+ telecommunication and AI era [27,28]. High-performance processing chips are widely used in daily life; for example, the advanced central processing unit (CPU) and graphics processing unit (GPU) give a laptop excellent computing power and image processing capabilities. Bluetooth, WLAN and 3G/4G/5G modules are sufficiently miniature and low-priced. Pinhole lenses, industrial monitor lenses and microscope objective lenses are also characterized by small size and low price. As a typical example, smartphones simultaneously combine good optical imaging performance, advanced image sensors, high-performance processing chips and big-data telecommunication, functionalities which can be easily achieved by the industry. The coming 5G+ telecommunication era will quicken big-data transmission: data interaction and processing can be carried out between the device and remote super-computing server workstations and cloud storage via wireless access. Given these conditions, computational, portable and low-cost microscopes are evolving quickly, serving POCT and tele-diagnosis [19,20,21,27,28]. Here we review these typical cases organized by imaging principle, mainly including lens-free microscopes, smart-phone microscopes, singlet microscopes and portable super-resolution microscopes.

2. Lens-Less Microscopy

2.1. Projection Lens-Less Microscopy

Projection imaging is the simplest and earliest lens-less microscopy method; the entire process does not require unique image reconstruction algorithms [29,30,31,32,33,34,35,36,37,38,39]. Figure 1 is a typical schematic of a lens-less diffraction microscopy setup based on a single frame [39]. In projection lens-less microscopy, the sample is placed directly on the CCD/CMOS image sensor. A light source, e.g., an LED, directly illuminates the sample, and the projected shadow of the sample is directly captured by the image sensor. Assume the distance from the light source to the sample is z1 and the gap from the sample to the image sensor is z2. The light modulated by the sample information propagates along z2 to the CMOS image sensor. When the light source is incoherent, the directly recorded image is a blurred image defocused by the distance z2. When the light source is (partially) coherent, the recorded image may be a diffraction pattern with concentric interference fringes. Both the defocus blur and the concentric interference fringes deteriorate the resolution and contrast of the image. When the image is incoherent, the blur can be reduced by computational deconvolution algorithms, such as the well-known Richardson-Lucy method. When the image is diffracted, the resolution can be improved by hologram reconstruction algorithms. Besides, hardware approaches, such as placing a hole array between the sample and the image sensor, can be considered to improve the resolution: an opaque metal layer is coated on the image sensor, and an array of sub-micron holes is arranged on the metal layer, where each hole corresponds to one pixel of the image sensor. This method effectively reduces the volume of the imaging system while improving the imaging resolution. Put briefly, however, projection lens-less microscopy is mainly limited by its resolution ability.
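As an illustrative sketch (not taken from the reviewed works), the Richardson-Lucy deconvolution mentioned above can be written in a few lines; the FFT-based circular convolution, the regularizing epsilon and the iteration count are our own simplifying choices:

```python
import numpy as np

def fft_filter(img, kernel_ft):
    """Apply a frequency-domain filter (circular convolution via FFT)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * kernel_ft))

def richardson_lucy(blurred, psf, n_iter=30):
    """Richardson-Lucy deconvolution of a blurred shadow image, given the PSF.
    The PSF is assumed centered; ifftshift moves its peak to the origin."""
    otf = np.fft.fft2(np.fft.ifftshift(psf))          # optical transfer function
    estimate = np.full_like(blurred, blurred.mean())  # flat initial estimate
    for _ in range(n_iter):
        conv = fft_filter(estimate, otf)              # forward model
        ratio = blurred / (conv + 1e-12)              # data-fit ratio
        estimate = estimate * fft_filter(ratio, np.conj(otf))  # correlation step
    return estimate
```

A usage pattern would be to measure the PSF once (e.g., from a sub-resolution pinhole), then deconvolve each recorded shadow image.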

2.2. Fluorescence Lens-Less Microscopy

Fluorescent labelling methods are widely used in bio-medical and pathological observation. The CMOS image sensor records the emitted fluorescent light, not the light source's rays. Usually, the wavelength spectrum of the emitted light and that of the light source are different, so in a fluorescent lens-less microscope it is generally necessary to place a filter between the sample and the CMOS image sensor to block the original spectrum of the light source [40,41,42,43,44,45,46]. However, even with a spectral filter, some of the light source's rays may still leak into the CMOS image sensor. Other methods are used to reduce these leaked rays, such as adding a total internal reflection (TIR) prism. The resolution of lens-less fluorescent microscopy is limited by the system's point spread function (PSF). The PSF can be tested and calculated by recording the light from a single point in the sample plane: the light spreads to a spot on the CMOS image sensor, and the size of this spot gives the PSF. It follows that the sample-to-sensor distance affects the PSF. For example, when the defocus gap is ~200 μm, the PSF limits the resolution to ~200 μm, which is far worse than the resolution of most lens-based microscope systems. Therefore, despite the obvious advantages of lens-less fluorescence imaging devices in terms of cost, portability and field-of-view, the space-bandwidth product obtained by this method is not superior to that of conventional bench-top microscopes. There are several methods to improve the resolution of lens-less fluorescent microscopy: (1) place the filter directly on the sensor so that the defocus gap is minimized; with this method, the spatial resolution can be improved to ~10 μm. (2) Relay the fluorescent light using a densely packed fiber array; in principle, when these fibers deliver light to the image sensor, the fiber bundle provides a degree of magnification for lens-less imaging, and the resolution can be improved to ~4 μm. (3) Place a nanostructured mask close to the object; the PSF of the system is then no longer spatially invariant but depends on the nanostructures, and with this approach sub-pixel resolution of 2–4 μm can be achieved. (4) Apply computational deconvolution and compressive decoding algorithms as post-processing.
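The gap-to-PSF relation above can be illustrated with a purely geometric toy estimate: an emitter's light spreads over a cone before reaching the sensor, so the spot grows linearly with the gap. The cone half-angle here is a hypothetical parameter, not a value from the reviewed works:

```python
import math

def geometric_spot_diameter(gap_um, half_angle_deg):
    """Diameter (in micrometers) of the spot a point emitter casts on the sensor
    after spreading over the sample-to-sensor gap (geometric estimate only;
    diffraction and sensor sampling are ignored)."""
    return 2.0 * gap_um * math.tan(math.radians(half_angle_deg))
```

With a hypothetical ~27° half-angle, a 200 μm gap yields a spot of roughly 200 μm, matching the order of magnitude quoted in the text.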

2.3. Digital Holographic Lens-Less Microscopy

When the light source is spatially and temporally coherent, the diffraction pattern recorded by the CMOS image sensor is an on-axis hologram. Different from projection lens-less microscopy and fluorescent lens-less microscopy, digital holographic lens-less microscopy uses the recorded holograms to computationally reconstruct the intensity and phase information of the sample [47,48]. In general, lens-less microscopy based on digital holographic reconstruction is essentially on-axis digital holographic reconstruction. In principle, the acquired field is the superposition of the light U_O(x, y) scattered by the object and the reference light U_R(x, y) transmitted through the transparent substrate:
U(x, y) = U_R(x, y) + U_O(x, y) = A_R + A_O(x, y)·exp[iφ_O(x, y)],
where A_R is the amplitude of the reference light, A_O(x, y) is the amplitude of the object light, and φ_O(x, y) is the phase of the object light. Digital holographic reconstruction seeks to recover the object light from the directly recorded hologram I(x, y). The transfer function of coherent wave propagation is
H_z(f_x, f_y) = exp(ikz·sqrt(1 − λ²f_x² − λ²f_y²)) for f_x² + f_y² ≤ 1/λ², and H_z(f_x, f_y) = 0 otherwise,
where λ is the illumination wavelength, k = 2π/λ is the wave number, and f_x and f_y are the spatial-frequency-domain coordinates.
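The transfer function maps directly onto an angular-spectrum propagation routine. The following minimal sketch (our own implementation; grid parameters are assumptions) zeroes the evanescent components exactly as in the piecewise definition:

```python
import numpy as np

def angular_spectrum(field, z, wavelength, pixel_size):
    """Propagate a complex optical field over distance z using the transfer
    function H_z; frequencies with fx^2 + fy^2 > 1/lambda^2 are set to zero."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg >= 0,
                 np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Propagating forward by z and then back by −z recovers the original field whenever no evanescent components are cut, which is a convenient sanity check for such implementations.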

2.3.1. Phase Recovery Based on Single Frame

In lens-less holographic microscopy, the conjugate (twin) image is unavoidable. The simplest single-frame phase recovery method is an iterative method based on support-domain constraints [49,50,51,52,53]. The key to this approach is the support field/mask of the object plane. For example, one can use a simple threshold or segmentation algorithm to automatically estimate the position of the object, computationally creating an object-dependent support mask; alternatively, a hardware mask can be created with Talbot grating illumination. The collected single-frame image provides the amplitude of the initial image-plane complex amplitude U_2^0, with an arbitrary value as the phase. In iteration k, the image-plane complex amplitude U_2^k is propagated back to the object plane using angular spectrum propagation; there, the field outside the predicted support domain is updated while the phase is kept unchanged, giving a new object-plane complex amplitude U_1^{k+1}, which is then propagated back to the image plane. At the image plane, the amplitude is updated using the directly collected hologram to obtain U_2^{k+1}, and the loop iterates until the final complex amplitude is obtained. This single-frame lens-less microscopic imaging method is generally simple in structure, so the system is compact. Thus, it is widely used in scenarios with low requirements for imaging quality and high requirements for portability, such as water pollution monitoring. In addition, with some extra devices (which may slightly increase the volume of the system), cell movement monitoring can be achieved.
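The iteration described above can be sketched as follows. The propagation helper, the unit-amplitude substrate value imposed outside the support, and all simulation parameters are our own illustrative assumptions, not the exact algorithm of the cited works:

```python
import numpy as np

def propagate(field, z, wavelength, pixel_size):
    """Angular-spectrum propagation (evanescent components dropped)."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg >= 0,
                 np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def support_phase_retrieval(hologram, support, z, wavelength, pixel_size, n_iter=100):
    """Single-frame iterative recovery: back-propagate to the object plane, clamp
    the field outside the boolean support mask to the clear substrate (unit
    amplitude), propagate forward, and re-impose the measured amplitude."""
    amp = np.sqrt(hologram)
    field = amp.astype(complex)                      # arbitrary (zero) initial phase
    for _ in range(n_iter):
        obj = propagate(field, -z, wavelength, pixel_size)
        obj = np.where(support, obj, 1.0 + 0.0j)     # support-domain constraint
        field = propagate(obj, z, wavelength, pixel_size)
        field = amp * np.exp(1j * np.angle(field))   # measured-amplitude constraint
    return propagate(field, -z, wavelength, pixel_size)
```

Compared with a single naive back-propagation of the hologram, the iterations suppress the twin-image artifacts outside and around the object.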

2.3.2. Phase Recovery Based on Multiple Holograms

Multiple-hologram phase recovery methods can be classified into those based on multi-distance recorded holograms, multi-wavelength illumination and multi-angle illumination [39,54,55,56,57,58]. The typical image reconstruction is essentially the Gerchberg-Saxton (G-S) iterative phase recovery method. The typical experimental setup is shown in Figure 2: the light source directly illuminates the sample, the sample is scanned axially along the z-axis, and holograms are directly recorded by the CMOS image sensor at planes of different distances. For computational reconstruction, the holograms collected at each plane should first be registered. Then the image at the first plane (with the phase initialized to a zero 2D matrix) is propagated to the second plane; the phase is kept unchanged, and the amplitude is replaced by the square root of the intensity recorded at the second plane. The new complex amplitude synthesized in this way is then propagated onward computationally, and the same steps are executed up to the n-th plane, completing one iteration loop. After multiple iterations, the reconstructed complex amplitude is computationally propagated to the focal plane, and the high-resolution complex amplitude information (i.e., amplitude and phase) of the object is obtained, free of twin-image noise.
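A minimal sketch of this multi-plane G-S loop follows; the cyclic plane ordering, the loop count and the propagation helper are our own simplified choices:

```python
import numpy as np

def propagate(field, z, wavelength, pixel_size):
    """Angular-spectrum propagation (evanescent components dropped)."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg >= 0,
                 np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_gs(intensities, z_list, wavelength, pixel_size, n_loops=30):
    """G-S style recovery from holograms recorded at several axial planes.
    z_list holds each plane's distance from the focal (object) plane."""
    field = np.sqrt(intensities[0]).astype(complex)   # zero initial phase
    n = len(z_list)
    for _ in range(n_loops):
        for k in range(1, n + 1):
            i, j = k % n, k - 1                       # move from plane j to plane i
            field = propagate(field, z_list[i] - z_list[j], wavelength, pixel_size)
            field = np.sqrt(intensities[i]) * np.exp(1j * np.angle(field))
    return propagate(field, -z_list[0], wavelength, pixel_size)
```

On simulated data from a smooth phase object, the recovered phase correlates well with the ground truth after a few tens of loops.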
The above multi-distance phase recovery method needs to collect intensity holograms at different defocus planes, so a mechanical displacement device is generally required. To avoid this problem, Zuo et al. proposed a method based on multi-wavelength illumination to achieve phase recovery. The reconstruction process is as follows: holograms under different wavelengths, e.g., R/G/B illumination, are collected. Then a TIE solver is used to provide the initial guess, obtaining the complex amplitude of the object under G illumination. According to the diffraction integral equation, the wavelength λ and the propagation distance z always appear in pairs, so a change of wavelength is equivalent to a change of propagation distance. Therefore, for a non-scattering sample, if the illumination wavelength is changed from λ to λ + Δλ, the collected diffraction pattern can be equivalently seen as resulting from a change in propagation distance Δz = z·Δλ/λ. Holograms at different wavelengths can thus be regarded as images collected at different defocus planes under the same illumination wavelength. Finally, the high-resolution complex information is obtained, free of twin-image noise, using the G-S iterative method.
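The wavelength-to-distance equivalence above is a one-line computation:

```python
def equivalent_defocus(z, wavelength, d_wavelength):
    """Propagation-distance change equivalent to a wavelength change
    d_wavelength, for a non-scattering sample: dz = z * d_wavelength / wavelength."""
    return z * d_wavelength / wavelength
```

For example, at z = 1 mm a 50 nm shift around a 500 nm illumination behaves like a 100 μm axial displacement, which is why R/G/B holograms can stand in for a mechanical z-scan.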

2.4. Deep Learning Lens-Less Microscopy

The deep learning lens-less holographic phase recovery approach works without iterative optimization [59,60,61,62,63]. A deep learning algorithm that reconstructs single-frame complex information with a convolutional neural network (CNN) architecture is shown in Figure 3. Once the training set is prepared and the network is trained, the neural network can recover the corresponding complex amplitude of the object from the input diffraction image.

2.5. Colorful Lens-Less Microscopy

High-resolution lens-less microscopy relies on the hologram recorded on the CMOS image sensor. When the illumination source is completely incoherent, the resolution is relatively poor. Therefore, in order to improve the coherence of the illumination, quasi-monochromatic light within a narrow spectral band is generally selected. However, in many bio-medical observation cases, biological samples are chemically stained to reveal valuable information, and bio-medical researchers prefer to use color images for diagnosis, such as with H&E staining. Therefore, researchers have proposed lens-free color imaging [63,64,65,66]. One of the simplest methods is statistics-based color mapping; Greenbaum et al. used this method to achieve color imaging of human breast cancer, Pap smears, etc. [63]. Another relatively simple colorization method is to first illuminate the object with monochromatic red, green and blue light sources, acquiring three images in sequence, and then superpose the reconstruction results at the three illumination wavelengths to obtain the final color reconstructed image. Additionally, deep learning virtual colorization can also achieve colorful lens-less microscopy under a single coherent illumination, as shown in Figure 4 and Figure 5.
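The sequential R/G/B superposition approach amounts to stacking three single-wavelength reconstructions into one color image. In this sketch, per-channel min-max normalization is our own simplifying choice; real pipelines may instead calibrate against a white reference:

```python
import numpy as np

def compose_color(recon_r, recon_g, recon_b):
    """Stack three single-wavelength (possibly complex) reconstructions into an
    RGB image, normalizing each channel's amplitude to [0, 1]."""
    chans = []
    for c in (recon_r, recon_g, recon_b):
        a = np.abs(c).astype(float)
        a = (a - a.min()) / (a.max() - a.min() + 1e-12)
        chans.append(a)
    return np.stack(chans, axis=-1)
```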

2.6. 3D Tomographic Lens-Less Microscopy

In the field of biological research, fast and compact 3D structural imaging and analysis are very attractive. A schematic diagram of 3D tomographic lens-less microscopy using at least two angular illuminations is shown in Reference [67]. In lens-less 3D tomography based on multi-angle illumination, the height of a micro-object is determined by calculating the lateral displacement of its hologram [67,68,69,70,71]. This 3D tomography is based on the Fourier diffraction projection theorem. For example, multi-angle and multi-wavelength illumination provided by an LED array can realize 3D tomographic microscopy [67]: the sample is placed directly on the CMOS image sensor, a partially coherent light source ~70 mm away illuminates the sample from different angles (±50°), and lens-less on-axis holograms are obtained at the different angles. Using such techniques, a spatial resolution of 1 μm × 1 μm × 3 μm in the x, y and z directions is achieved within a large volume of approximately 15 mm³.
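The height-from-displacement idea can be illustrated with simple shadow geometry (a toy model of the parallax between two illumination angles, not the full Fourier diffraction treatment used in the cited works):

```python
import math

def height_from_shift(lateral_shift_um, angle1_deg, angle2_deg):
    """Height of an object above the sensor, estimated from the lateral
    displacement of its hologram between two illumination angles:
    shift = h * (tan(angle1) - tan(angle2))."""
    dtan = math.tan(math.radians(angle1_deg)) - math.tan(math.radians(angle2_deg))
    return lateral_shift_um / dtan
```

Wider angular separation (e.g., ±50° as in the setup above) yields a larger shift per unit height and hence a more robust height estimate.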

2.7. Lens-Less Microscopy Application Examples for POCT and Biomedicine Diagnosis

Projection lens-less microscopy. Zheng et al. developed a projection lens-less microscope to track cell growth and differentiation of embryonic stem cells [29]. This imaging platform enables high-resolution and autonomous imaging of cells plated or grown on low-cost CMOS sensor chips, with the ability to image samples over a FOV of 6 mm × 4 mm at a resolution of 660 nm. The total data acquisition process takes about 20 s, and the reconstruction time of a single digital image is less than 1 s. Their cell culture experiments showed that this technique could be a useful tool for long-term cell observation in vitro. Moreover, their device mainly consists of a smartphone and Lego bricks, demonstrating that the imaging platform can be easily assembled.
Fluorescence lens-less microscopy. Several researchers have successfully imaged fluorescent C. elegans samples on a fluorescence lens-less microscopy platform enhanced by compressive encoding algorithms [42]. The imaging FOV is >8 cm² and the spatial resolution is about 10 µm. Additionally, they tested the effectiveness of this on-chip imaging approach using different types of image sensors, achieving similar resolution independent of the sensor chip. This provides a useful tool for high-throughput screening applications observing fluorescent biomedical samples.
Digital holographic lens-less microscopy. Although there are many reconstruction methods for digital holographic lens-less microscopy, the application examples are broadly similar, as are the resolution abilities, FOV and imaging speed. For example, Cedric Allier et al. provide a method based on holographic lens-less video microscopy to measure important metrics for cell proliferation studies, e.g., cell-cycle duration and cell dry mass [35]. The microscopy imaging FOV is ~29.4 mm², and the resolution of the reconstructed phase image is ~5 µm. By tracking a 2.7-day acquisition of HeLa cells in culture, they obtained a dataset with 2.2 × 106 single-cell morphology measurements and 10,584 cell cycle trajectories, at a speed of 1/10 frame per minute across the FOV. In addition to this study on cell proliferation, the setup could be used for the study of cell migration by tracking a large number of trajectories simultaneously over several hours. Deep learning computational methods, colorization and 3D tomography can further enhance holographic lens-less microscopy. For example, Liu et al. demonstrated a deep-neural-network-based virtual staining technique for label-free cells, named PhaseStain [62]. It is based on the quantitative phase image (QPI) obtained by lens-less microscopy, which is converted into a stained image equivalent to a bright-field microscopy image. Their experiments demonstrated the effectiveness of this virtual staining method on tissue sections such as human skin. This digital staining framework can further enhance the various uses of label-free QPI technology in pathology applications and general biomedical research, eliminating the need for histological staining and reducing the costs and time associated with sample preparation. The implementation of a color opto-fluidic microscope (SROFM) prototype has also been reported. Based on a multi-wavelength illumination lens-less microscope, the system can scan approximately 400 cells per second for monochromatic imaging and 100 cells per second for color imaging, with an optical resolution of up to 0.66 µm. They successfully applied the technique to color-image red blood cells infected with P. falciparum.

3. Smart-Phone Microscopy

The explosive adoption of smartphones not only provides a portable AI and telecommunication device, but also drives the development of compact, high-resolution cameras. As shown in Figure 6, in the coming 5G telecommunication era, modern mobile communication networks make data transmission and remote information sharing fast and convenient. This also provides new development opportunities for computational, portable and low-cost microscopic imaging technology [72,73,74,75,76,77,78], which has great potential for medical services in various remote and complex environments. The biological/medical images collected by smartphones can be transmitted to a central processing station for expert analysis or to a network server for cloud computing, such as image post-processing. Finally, the expert analysis results and post-processed image data are transmitted back to the mobile phone for display over the 5G+ networks and the Internet. Compared to standard microscopic imaging systems, the smartphone's back camera lens can be viewed as the tube lens plus the CMOS image sensor, while an attached lens serves as the microscope objective. According to the attached lens, microscopic imaging systems based on smartphone platforms can be divided into three categories: commercial microscope objectives, customized singlet lenses and inverted pinhole/phone-camera lenses. As shown in Figure 7, there are three types of microscopic imaging optical path structures based on smartphone platforms. The first type uses standard microscope objectives and eyepieces to form an infinity-corrected microscope optical system, with the smartphone used only as an image recording and display device. Compared with traditional camera-sensor acquisition equipment, smartphone platforms, with integrated image acquisition and display, provide strong compatibility and the advantage of instant observation.
The second type of structure uses the smartphone's imaging lens and CMOS image sensor as the optical imaging and image acquisition device. In this structure, a customized singlet lens with optical magnification capability is used as the objective lens for microscopic imaging, such as a ball lens, an aspheric lens or a liquid lens. These customized singlet lenses are not sensitive to the gap between themselves and the smartphone camera lens, so the total volume can be optimized. Distinct from the customized singlet design, since the multi-lens group matches the image sensor of the smartphone platform, the inverted pinhole/phone-camera lens provides a third type of imaging optical path, which can correct image aberrations and make better use of the smartphone's image sensor. As exemplified in Figure 8, based on the smartphone platform's convenient and fast wireless data transmission, deep learning image enhancement and recognition algorithms can broaden the applications of these cost-effective and portable microscopic imaging systems in medical diagnosis, disease diagnosis in remote areas and food safety testing [77,78].

3.1. Smartphone-Based Phase Contrast Microscopy

Image contrast in clinical microscopy is often achieved by chemical staining or labeling. These staining methods can enhance specific features of the sample, but they require extensive sample preparation. Label-free phase contrast imaging techniques do not require staining and have been developed over many years; however, due to the higher cost of the associated optical hardware and the complexity of the systems, their applications have been limited. Now, computational, low-cost and portable microscopes based on smartphone platforms make label-free phase contrast techniques practical in the fields of on-site testing and remote medical diagnosis [79,80,81,82]. To improve the quality of phase contrast images collected by smartphone microscopes, deep learning networks can also transfer the style of directly collected images; such methods can bring low-cost smartphone-based microscopes close to the imaging performance of bench-top microscopes. Bian et al. designed a portable microscope based on a smartphone platform, in which an aspheric singlet lens is used as the imaging objective. The image data are transmitted to a computer through 5G/WIFI communication, and, combined with a deep learning image-style-transfer method, a virtual DPC image is obtained. Figure 9a,b show the structure and workflow of this setup. Traditional Zernike phase contrast microscopes require the insertion of a circular Zernike phase plate at the Fourier plane of the microscope objective, and traditional differential interference contrast microscopes require finely assembled birefringent crystals such as Wollaston prisms. Now, only a self-designed aspheric singlet lens and a deep-learning-based generative adversarial network are needed to obtain a virtual phase contrast image with the same effect. Besides, computational illumination can also achieve phase contrast microscopy, as shown in Figure 9c.
These efforts, using deep learning to transfer image style and improve spatial resolution, will close the performance gap between smartphone microscopy and state-of-the-art bench-top DPC microscopy systems.
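For reference, the basic (non-learned) DPC computation from two complementary half-pupil illuminated images, as used in computational-illumination phase contrast, is a simple normalized difference; the regularizing epsilon is our own choice:

```python
import numpy as np

def dpc(i_a, i_b):
    """Differential phase contrast from two complementary half-pupil images:
    the normalized difference is proportional to the phase gradient along the
    illumination asymmetry axis (weak-object approximation)."""
    return (i_a - i_b) / (i_a + i_b + 1e-12)
```

A flat (phase-free) sample yields zero DPC signal, while asymmetric intensity between the two half-pupil images gives a signed contrast.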

3.2. Smartphone-Based Dark-Field Microscopy

Similar to bench-top dark-field microscopes, as shown in Figure 10, smartphone-based dark-field microscopes can also use ring illumination [83,84,85]. Instead of a dark-field condenser, the outer ring of a programmable LED array can be activated to provide ring illumination. The LED array realizes dark-field imaging by setting the illumination aperture to be larger than the numerical aperture of the objective lens. Smartphone dark-field microscopy can also be based on a total internal reflection (TIR) prism. TIR-based dark-field methods provide better signal-to-noise ratios than traditional oblique-illumination dark-field microscopy, while eliminating the need for annular illumination apertures.
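The dark-field condition (illumination NA larger than objective NA) determines which LEDs of a programmable array to activate; the geometry and parameter names in this sketch are illustrative assumptions:

```python
import numpy as np

def darkfield_leds(led_xy_mm, led_height_mm, objective_na):
    """Boolean mask of LEDs whose illumination NA exceeds the objective NA
    (i.e., the LEDs usable for dark-field illumination). led_xy_mm lists each
    LED's lateral position relative to the optical axis; led_height_mm is the
    array-to-sample distance."""
    led_xy = np.asarray(led_xy_mm, dtype=float)
    r = np.hypot(led_xy[:, 0], led_xy[:, 1])
    illum_na = np.sin(np.arctan2(r, led_height_mm))
    return illum_na > objective_na
```

Activating only the LEDs selected by this mask reproduces the effect of an annular dark-field aperture without extra hardware.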

3.3. Smartphone-Based Quantitative Phase Microscopy

Digital holographic microscopes have attracted much attention due to their quantitative phase imaging capabilities. Taking advantage of cost-effective light sources and compact system structures, various portable digital holographic microscopes have been designed on smartphone platforms for sample observation and measurement [86,87,88]. Lee et al. proposed a low-cost design for molecular diagnostics via digital holography on a smartphone platform [88]. Although LEDs can be used as light sources in portable smartphone digital holographic microscopes, a pinhole is still required to meet the illumination coherence requirement. The high-resolution images are then reconstructed computationally based on diffraction algorithms. While both the quantitative intensity and the phase of the sample can be recovered from the hologram, the process, including back-propagation and phase recovery, is still time-consuming. Based on these optical wave diffraction methods, Meng et al. developed an accurate, high-contrast, cost-effective and portable phase imaging microscope on a smartphone platform [86]. The system can be used to image biological samples such as Pap smears. In this TIE-based smartphone phase imaging microscope, the system resolution is ~1 μm, which is sufficient for 3D morphological studies of red blood cells. Phillips et al. used a hemispherical illuminator composed of an LED array as the illumination source and attached it to an inverted smartphone-based microscope system. Compared to planar LED arrays, hemispherical LED arrays offer significantly better light efficiency, enabling shorter acquisition times and more efficient power usage.
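A minimal FFT-based TIE solver, assuming an approximately uniform in-focus intensity (a common simplification, not necessarily the solver used in the cited works), can be sketched as:

```python
import numpy as np

def tie_phase(i_under, i_over, dz, wavelength, pixel_size, i0=1.0):
    """Recover phase from two defocused intensity images via the transport-of-
    intensity equation. With uniform in-focus intensity i0, the TIE reduces to
    a Poisson equation: laplacian(phi) = -(k / i0) * dI/dz, solved by FFT."""
    k = 2 * np.pi / wavelength
    didz = (i_over - i_under) / (2 * dz)          # central-difference axial derivative
    ny, nx = didz.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    lap = -4 * np.pi ** 2 * (FX ** 2 + FY ** 2)   # FFT symbol of the Laplacian
    lap[0, 0] = 1.0                               # avoid division by zero at DC
    phi_ft = np.fft.fft2(-k / i0 * didz) / lap
    phi_ft[0, 0] = 0.0                            # phase recovered up to a constant
    return np.real(np.fft.ifft2(phi_ft))
```

Unlike the iterative G-S approaches above, this is a direct (non-iterative) solve, which is one reason TIE methods suit resource-limited smartphone pipelines.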

3.4. Smartphone-Based Fluorescent Microscopy

When equipped with appropriate accessories, fluorescence microscopy imaging is possible on a smartphone platform, enabling more diagnostically oriented biomedical observation [89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104]. A typical smartphone-based fluorescence microscope consists of an excitation light source (LED or laser diode, LD), an emission filter, and an objective lens. In Figure 11a,b, smartphone fluorescence microscopes are shown that use an LD or an LED to excite a test-tube sample. After the excitation light passes through the sample, the fluorescence emission is collected perpendicular to the excitation direction. The key accessories of the epi-illumination optical path are the fluorescence excitation modules, which contain excitation filters, dichroic mirrors and emission filters. They separate the excitation path from the fluorescence emission path and improve the signal-to-noise ratio of the microscopy imaging. Wei et al. proposed a smartphone fluorescence microscope using oblique laser-diode illumination [100,103,104]. The sample is illuminated from behind by the excitation beam of a small 450 nm laser diode at an incidence angle larger than the acceptance angle of the objective lens. This results in a high signal-to-noise ratio for imaging nano-scale analytes including DNA molecules, nanoparticles and viruses. Dai et al. used PDMS inkjet lens-printing technology to design a lens with both focusing and filtering functions, and integrated this dual-function printed lens into a smartphone fluorescence microscope system [102]. The system's structure is shown in Figure 11c: LEDs are used for bright-field imaging and LDs for fluorescence imaging. After an LED or LD chip is inserted into the illumination module, the chip is positioned by two micro magnets and connected to the electrodes, which automatically turns it on. The collimated laser beam irradiates the sample at an incidence angle of 45°, which is greater than the acceptance angle of the dual-function printed lens. Therefore, the excitation light is not directly coupled into the image sensor, effectively reducing the background noise.

3.5. Smart-Phone Microscopy Application Examples for POCT and Biomedicine Diagnosis

For sperm counting and monitoring, computer-assisted sperm analysis (CASA) and visual assessment (VA) are the two assessment techniques in use. A method has been proposed for sperm count analysis using smartphone microscopes and computers [33]. The smartphone microscope acquires videos in a manner similar to VA techniques; the samples are recorded by a high-resolution camera at 1920 × 1080 and 30 Hz. The image data are then transmitted wirelessly to a computer server, where computerized sperm count software (CSCS) counts and monitors sperm using a counting chamber. In contrast to a CASA system, a smartphone-based microscope offers a lower-cost design. Compared with CASA- and VA-based sperm count analysis, the proposed smartphone-based sperm concentration analysis is very attractive for its modularity, functionality, accuracy, and cost. Using an LED light source, a dark-field condenser, and a 20× objective with a mobile phone camera, Sun et al. developed a dark-field smartphone microscope that can quantify nanoparticle signals for various research and medical applications [83]. The device captures images of up to 8.0 megapixels, weighs less than 400 g, and costs less than 2000 USD. This method, combining kinetic and biomarker quantification, provides a novel nanoparticle-based diagnostic assay for tuberculosis; the system is simple and robust, enabling nanoparticle-based quantitative analysis in resource-limited regions. Yang et al. demonstrated the integration of a quantitative phase imaging (QPI) method into a smartphone platform [87], used for imaging red blood cells with a resolution of about 1 μm. The computations actually run on a more powerful server, with a computation time of less than 1 s. This device exhibits acceptable capability in erythrocyte imaging and in reconstructing cell thickness from computed phase maps for 3D morphological studies. In another example, a fluorescence microscopy imaging platform was developed on a mobile phone [88]. The authors demonstrated their platform by fluorescence imaging of labeled leukocytes in whole blood samples. Besides, water-borne pathogenic protozoan parasites, such as Giardia cysts, were successfully imaged over a large FOV of 81 mm² at a resolution of 10 µm. This compact and cost-effective fluorescence imaging platform weighs only about 28 g and measures about 3.5 × 5.5 × 2.4 cm. It could potentially be used for various lab-on-a-chip assays developed for global health applications, such as monitoring CD4 cell counts or measuring viral load in HIV patients.

4. Singlet Microscopy

The singlet lens is another attractive way to achieve portable and low-cost microscopy setups. Commercial microscope objectives and other imaging lenses have become inexpensive thanks to mass industrial production; however, because they consist of multiple lens elements, much of their cost lies in mounting, alignment and testing. In contrast, singlet lenses need no precise assembly, alignment or testing. They can reduce time, money and labor costs substantially, enabling a further price and integration revolution in imaging devices [105].

4.1. Singlet Bright-Field Microscopy

In Figure 12, the singlet microscopy setup combines a single aspheric lens with deep learning computational imaging technology [105]. The designed singlet aspheric lens approximates a linear signal system: its MTF curves are almost coincident across all FOVs. The purpose of this design is to allow imaging performance to be further improved by a deep learning algorithm. The total setup weighs only 400 g. Using a USAF-1951 target and pathological tissue slices as samples, the experimental results show that both the resolution and the FOV of the singlet microscope are competitive with a commercial microscope equipped with a 4×/NA 0.1 objective. Figure 13 is the flowchart of the employed deep learning computational imaging method [105]. The algorithm includes two parts: the training stage of the deep neural network (DNN) and the practical working stage. For the R/G/B illumination microscope, the data are recorded separately and the DNN is trained per channel. The deep-learning-improved R/G/B channel images are then combined computationally into a color image.
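The per-channel "enhance, then merge" workflow can be illustrated with a classical Wiener deconvolution standing in for the trained DNN (the actual network of [105] is not reproduced here); the PSFs and noise-to-signal ratio are hypothetical:

```python
import numpy as np

def wiener_deconv(img, psf, nsr=1e-2):
    """Classical Wiener deconvolution; a stand-in for the per-channel DNN."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=img.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * W))

def enhance_rgb(raw_r, raw_g, raw_b, psfs):
    """Each channel is recorded under its own LED and enhanced with its own
    kernel/model, then the three results are merged into a colour image."""
    chans = [wiener_deconv(c, p) for c, p in zip((raw_r, raw_g, raw_b), psfs)]
    return np.stack(chans, axis=-1)
```

Because the singlet's aberrations differ per wavelength, each channel gets its own kernel (or trained model) before the final colour merge.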

4.2. Singlet Achromatic Microscopy

The most attractive feature of the singlet lens is its freedom from precise testing, assembly, and alignment, which is very helpful for portable and low-cost microscopes. But when the singlet is made of only one material, it is difficult to overcome wavelength dispersion and chromatic aberrations. In Figure 14, the singlet lens and deep learning image-style-transfer algorithms are combined to correct chromatic aberrations [106]. The concept has been demonstrated experimentally in clinical pathological slide microscopy. On the hardware side, a singlet aspheric lens with a high cutoff frequency and linear signal properties is designed and fabricated. There is only one monochromatic LED illumination, and the images are recorded by a CMOS image sensor. For the algorithms, an image-style-transfer deep learning network is trained to transfer monochromatically illuminated greyscale microscopy images into virtual chemically stained images. A 'U-Net'-like GAN framework is designed to achieve the image-style transfer. Before the style transfer, a conventional deep learning deconvolution method is interposed to improve the resolution and image contrast. In other computational art applications, such as photo-to-comics, photo-to-painting and day-to-night, textural details are not important. However, for medical and pathological observation, the image texture and high-resolution content features must be kept in the virtually colorized images. The proposed 'U-Net'-like generator network strongly retains the high-resolution content features of the original greyscale images. The loss function also contains two targets: one achieves the style transfer, and the other keeps the high-resolution content features.
As shown in Figure 15, experiments, data analysis and discussion show that the proposed virtual colorization microscopy imaging method is effective for H&E-stained tumor tissue slides in singlet microscopy. It is believed that the computational virtual colorization method for singlet microscopes will promote the development of low-cost and portable singlet microscopy for medical pathological label-staining observation (e.g., H&E staining, Gram staining, fluorescent labeling, and so on).
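The two-target loss described above can be sketched in numpy, with a least-squares adversarial term for style and an L1 term for content preservation. The weighting and the luminance proxy are illustrative assumptions, not values from [106]:

```python
import numpy as np

def generator_loss(d_fake, fake_rgb, gray_in, content_weight=10.0):
    """Two-term generator objective sketched from the text: an adversarial
    term pushes the output toward the stained-image style, while an L1 term
    between the output luminance and the input greyscale keeps the
    high-resolution content. content_weight is a hypothetical choice."""
    adv = np.mean((d_fake - 1.0) ** 2)               # least-squares GAN term
    luminance = fake_rgb.mean(axis=-1)               # crude luminance proxy
    content = np.mean(np.abs(luminance - gray_in))   # content-preservation term
    return adv + content_weight * content
```

The content term is what distinguishes this medical use case from photo-to-painting style transfer: it penalizes any loss of the fine structures a pathologist relies on.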

4.3. Singlet Multi-Spectral Microscopy

To obtain more texture-spectrum information in different narrow wavelength bands, multi-spectral and hyper-spectral imaging methods and setups have been proposed. Portable and cost-effective multispectral microscopes are likewise necessary in some remote biomedical applications. Figure 16 shows such a portable and cost-effective multispectral microscopy setup [107]. In the optical hardware, a customized singlet lens is designed and fabricated to control the rotationally symmetric aberrations and eliminate the asymmetric optical aberrations; the imaging performance is then improved by deep learning enhancement algorithms. This design reduces the extreme difficulty of singlet lens fabrication, since the aberration optimization produces a simple surface while ensuring high resolution. The design connects Zernike polynomial coefficients with the singlet lens parameters through wavefront aberrations. By imaging a gold-standard resolution pattern (Figure 17) and typical bio-samples, the experimental results demonstrate that this portable singlet microscope achieves multi-spectral microscopy well and cost-effectively.
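The link between wavefront aberration and Zernike coefficients can be illustrated by evaluating the rotationally symmetric terms (piston, defocus, primary spherical) over a unit pupil. This is a generic sketch, not the lens prescription of [107]:

```python
import numpy as np

def symmetric_wavefront(coeffs, n=64):
    """Wavefront W(rho) built from rotationally symmetric Zernike terms
    (piston, defocus, primary spherical) over a unit pupil.
    coeffs = (a0, a_defocus, a_spherical), in waves."""
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    rho2 = x ** 2 + y ** 2
    pupil = rho2 <= 1.0
    a0, a4, a11 = coeffs
    w = (a0
         + a4 * np.sqrt(3) * (2 * rho2 - 1)                    # defocus
         + a11 * np.sqrt(5) * (6 * rho2 ** 2 - 6 * rho2 + 1))  # primary spherical
    return np.where(pupil, w, 0.0), pupil
```

Only the symmetric terms appear here because, as the text notes, the singlet design deliberately retains rotationally symmetric aberrations (correctable computationally) while eliminating the asymmetric ones.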

4.4. Singlet Virtual Phase Contrast Microscopy

Phase imaging microscopy is used for observing biological tissues and cells in vitro. Without chemical staining or fluorescent labeling, transparent and weakly scattering biological tissues/cells are imaged via their relative/quantitative phase distribution. Conventional phase contrast microscopes, such as Zernike phase contrast (ZPC) microscopes (Figure 18a), consist of many precise and clean optical elements, which limits their usage. In Figure 18, a singlet virtual Zernike phase contrast microscope is proposed for unstained pathological tumor tissue slides. On the hardware side, the objective is a single lens, and there is no inserted Zernike phase plate, an element which alone can cost more than a whole bright-field microscope setup. The Zernike phase contrast is achieved virtually by a deep learning computational imaging method. In practical virtual ZPC microscopy, the computation takes less than 100 ms, far less than other computational quantitative phase imaging algorithms. In a conceptual demo experimental setup, it is competitive with a research-level conventional ZPC microscope and effective for unstained transparent pathological tumor tissue slides. Figure 18b shows the singlet objective lens used to achieve virtual ZPC microscopy by the deep learning computational imaging method [108]. The illumination part is still Köhler illumination modulated by an annulus stop. However, instead of a conventional microscope objective consisting of multiple lenses, a customized aspheric singlet lens is used. The pathological tissue slide sits at the object plane, while a digital CMOS image sensor sits at the conjugate image plane. With the optical hardware of Figure 18b alone, one cannot obtain a ZPC image of a transparent pathological tissue slice; the directly recorded image must be processed computationally by the deep learning ZPC-transfer method. In Figure 18c, the 'using stage' of the singlet virtual ZPC microscopy is presented. Before the 'using stage', the ZPC-transfer DNN kernel is trained in the 'training stage'. When the digital CMOS image sensor records an image, the image is convolved with the ZPC-transfer DNN kernel, and the visual contrast of the microscopy image is thereby improved.

4.5. Singlet Meta-Lens Microscopy

Due to its ultrathin and flat structure, the meta-lens has shown remarkable capabilities in modulating and controlling light. Compared with thick lenses, its greatest advantage is high integration, which makes it a good substitute for traditional thick lenses and very attractive for portable microscopes in the coming 5G+ telecommunication and AI era. In Figure 19, an ultrathin meta-lens is mounted directly on a CMOS image sensor, yielding a highly integrated microscopy setup [109,110,111]. Unlike conventional microscope objectives, the working distance is sub-millimeter. The meta-lens in Figure 19a is made of GaN and designed with an NA of 0.78. For the wide-field array device, the designed NA of the meta-lens objective is 0.37; the resolution is ~1.74 µm at an illumination wavelength of 630 nm with unit magnification, and ~1.23 µm with 1.5× magnification. Besides, the meta-lens array approach flexibly expands the FOV without sacrificing resolution, achieving a high space-bandwidth product (SBP) for wide-field microscopy applications. Figure 19b shows centimeter-scale-FOV microscopy imaging with very compact integration, although the illumination light must be polarized (LCP or RCP). In this highly integrated setup, the meta-lens objectives are fabricated as a 6 × 6 array, and the total size of the setup is ~3.5 cm × 3 cm × 2.5 cm. Moreover, using the wavelength dispersion of the meta-lens, a highly integrated microscopy setup can perform tomography without mechanical scanning: exploiting the large diffractive dispersion, this portable and cost-effective microscope obtains excellent tomographic imaging with a large DOF, and has successfully imaged a frog egg cell. In conclusion, owing to its ultrathin, flat and highly integrated nature, the singlet meta-lens shows high potential for portable, low-cost and computational microscopes in the 5G+ telecommunication and AI era.

4.6. Singlet Microscopy Application Examples for Biophotonics

Shen et al. proposed a deep learning singlet microscope with imaging performance competitive with research-level commercial microscopes [81,105,106,107]. It has a total size of about 10 cm × 10 cm × 20 cm and weighs only 400 g. The resolution reaches 1.38 μm and a large FOV (5 mm diagonal) is achieved. Since the singlet objective is plastic-molded, it is very cost-effective compared with commercial objectives. H&E-stained pathological tumor tissue slides are successfully imaged multi-spectrally and in color. The portable singlet microscope thus has great application potential in biology, materials science, environmental science and other fields. Xu et al. demonstrated a compact imaging device with an integrated meta-lens for wide-field microscopy [111]. The meta-lens is mounted directly on the image sensor. The device is based on a silicon lens working in the red wavelength range, with an overall size of about 3.5 × 3 × 2.5 cm. In this application, bio-samples of a Pap smear and a dragonfly wing are successfully imaged with a resolution of 1.74 μm and an FOV of 1.2 × 1.2 mm². Chen et al. proposed a metalens-based spectral imaging system that achieved high lateral and vertical resolutions, approximately 775 nm and 6.7 μm respectively, with an aspheric GaN metalens (NA = 0.78) [110]. This computational portable microscopy setup successfully imaged the bio-sample of frog egg cells and showed excellent tomographic images of cell membranes and nuclei with distinct depth-of-focus (DOF) features.

5. Super-Resolution Microscopy

5.1. Super-Resolution Fourier Ptychography Microscopy

With the development of optical microscopy, super-resolution, large FOV, and phase imaging have become hot research topics. In some research-level microscopic setups, such as stochastic optical reconstruction microscopy (STORM), stimulated emission depletion (STED) microscopy and fluorescence emission difference (FED) microscopy, super-resolution and phase imaging capabilities are outstanding advantages over traditional microscopic imaging setups. But these advantages come at the cost of complex hardware structures, complicated operation and reduced performance in other respects. Fourier ptychographic microscopy (FPM) is an attractive and typical method to achieve QPI microscopy and super-resolution simultaneously, without auxiliary precision optomechanical elements. Thus FPM has shown high value and application prospects for low-cost portable microscopes [112,113,114,115,116]. In a typical FPM microscope, an LED array provides illumination from different directions while the CMOS image sensor collects a series of low-resolution images in turn. High spatial resolution, large FOV and quantitative phase imaging are achieved by computational phase recovery algorithms. FPM is, in effect, the lens-based Fourier-domain form of the ptychographic iterative engine (PIE), extending PIE phase recovery technology. FPM uses a programmable LED array to provide flexible wavelengths and illumination angles; beyond super-resolution, FPM setups also computationally achieve quantitative phase imaging and tomography. Unlike a traditional Köhler illumination module, the illumination source of FPM is a programmable LED array board. By sequentially lighting each LED unit, plane waves illuminate the bio-sample slide from different directions while the CMOS image sensor simultaneously captures a series of low-resolution images. Each digitally recorded image corresponds to a sub-region of the bio-sample's Fourier spectrum.
The illumination wave vector determines the center of the circular sub-spectral region, while the illumination wavelength and the NA determine its radius. In the assumed physical and mathematical model of FPM, the light wave from one LED, with wave vector (k_x, k_y), illuminates the thin sample; this corresponds to shifting the center of the sample's Fourier spectrum by (k_x, k_y). The sample information can be represented by the complex amplitude transmittance function t(x, y). When the illumination plane wave passes through the sample, the distribution of the transmitted light can be expressed as:
E_out(x, y) = E_in(x, y) · t(x, y),
where E_in(x, y) is the complex amplitude of the incident light. The FPM phase recovery process consists of the following five steps:
(1) Generate the initial high-resolution complex amplitude in the spatial domain, √I_h · exp(iφ_h).
(2) Filter the Fourier spectrum at the sub-region corresponding to one illumination wave vector, and apply an inverse Fourier transform to generate a low-resolution image, √I_l · exp(iφ_l).
(3) Replace the amplitude √I_l with the square root of the collected low-resolution intensity, √I_lm, and update the corresponding sub-region in the Fourier domain.
(4) Repeat Steps (2) and (3) for the oblique plane-wave illuminations from all N wave vectors.
(5) Repeat Steps (2) to (4) for further rounds of iterative updating; the termination condition for the iteration can be set in advance.
After Steps (1) to (5), an inverse Fourier transform of the synthesized Fourier spectrum yields the high-resolution intensity and quantitative phase information in the spatial domain. Based on this FPM framework, several low-cost computational super-resolution microscopes have been achieved.
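Steps (1) to (5) can be condensed into a minimal numpy sketch. Real FPM implementations add pupil-function recovery, intensity scaling and LED-position calibration, all omitted here:

```python
import numpy as np

def fpm_recover(low_res_imgs, k_shifts, hi_shape, pupil_radius, n_iter=10):
    """Minimal FPM update loop following steps (1)-(5) in the text.
    low_res_imgs: measured intensities, one per LED; k_shifts: pixel offsets
    of each sub-spectrum centre; pupil_radius: NA-limited support in pixels."""
    H, W = hi_shape
    h, w = low_res_imgs[0].shape
    spectrum = np.fft.fftshift(np.fft.fft2(np.ones(hi_shape)))  # step (1): flat start
    fy, fx = np.mgrid[:h, :w]
    pupil = (fx - w // 2) ** 2 + (fy - h // 2) ** 2 <= pupil_radius ** 2
    for _ in range(n_iter):
        for img, (ky, kx) in zip(low_res_imgs, k_shifts):
            y0 = H // 2 + ky - h // 2
            x0 = W // 2 + kx - w // 2
            patch = spectrum[y0:y0 + h, x0:x0 + w] * pupil       # step (2): filter
            field = np.fft.ifft2(np.fft.ifftshift(patch))
            field = np.sqrt(img) * np.exp(1j * np.angle(field))  # step (3): replace modulus
            new_patch = np.fft.fftshift(np.fft.fft2(field))
            spectrum[y0:y0 + h, x0:x0 + w][pupil] = new_patch[pupil]
        # steps (4)-(5): the inner loop covers all LEDs; the outer loop iterates
    return np.fft.ifft2(np.fft.ifftshift(spectrum))
```

Each measured image constrains one circular sub-region of the synthesized spectrum; iterating over all LEDs stitches a spectrum wider than the objective's own passband, which is the source of the resolution gain.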

5.2. Super-Resolution Lens-Less Microscopy Based on Sub-Pixel Displacement

Sub-pixel displacement of the sample plane with respect to the CMOS image sensor is another super-resolution method [117,118,119,120]. This mechanical lateral displacement can be realized by moving the sample/sensor or by moving the light source. When the sample/sensor is displaced, the displacement accuracy must be at the sub-pixel, usually sub-micron, level, which increases the cost significantly. In contrast, laterally displacing the light source to obtain sub-pixel-shifted images greatly reduces the cost. Assume the gap between the light source and the sample is z1, and the gap between the sample and the CMOS image sensor is z2. In a unit-magnification lens-less microscope, z1 is much longer than z2, i.e., z1 >> z2. The ratio between the light source's lateral movement and the corresponding displacement at the CMOS image sensor is z1/z2, so moving the light source relaxes the precision requirement on the system's mechanical displacement. Consequently, improving resolution by mechanically displacing the light source is widely used in lens-less super-resolution phase imaging. Although moving the light source demands less stage accuracy than displacing the sensor/sample, mechanical displacement is still required. To eliminate it, the light can be coupled into different optical fibers. After fiber coupling, these fibers are assembled accurately into a small area, which not only realizes the lateral displacement of the light source but also improves its spatial coherence.
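The z1/z2 lever described above can be checked with simple arithmetic; the distances and pixel pitch below are illustrative values, not taken from a specific setup:

```python
# Hologram shift produced by a lateral source shift in a lens-less geometry:
# delta_sensor = delta_source * z2 / z1, so a coarse source move produces a
# sub-pixel shift at the sensor when z1 >> z2.
z1_mm, z2_mm = 100.0, 1.0            # source-to-sample and sample-to-sensor gaps
pixel_um = 1.12                      # sensor pixel pitch
delta_source_um = 50.0               # easily achievable coarse source move
delta_sensor_um = delta_source_um * z2_mm / z1_mm
shift_in_pixels = delta_sensor_um / pixel_um
```

Here a 50 µm source move, trivial for a cheap stage, yields a 0.5 µm (sub-pixel) hologram shift, which is exactly the demagnification effect the text exploits.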

5.3. Super-Resolution Lens-Less Microscopy Based on Wavelength Scanning

Similar to scanning illumination angles and sub-pixel positions, wavelength scanning can also achieve super-resolution lens-less microscopy. A wavelength-tunable laser generates wavelengths from 498 nm to 510 nm in 3 nm steps, each dominant wavelength having a spectral width of about 2 nm [121]. Furthermore, additional multi-angle and multi-height illumination were added in the experiment. For the algorithm, a resolution-correction matrix for the recorded intensities is incorporated into a synthetic-aperture-based phase recovery method. Reference [119] shows the schematic diagram of the device. Based on 60 holograms (5 illumination angles × 12 wavelengths), a camera with a 1.12 μm pixel size achieves a final resolution of 0.5 μm.

5.4. Super-Resolution Lens-Less Microscopy Based on Multi-Angle Illumination

In super-resolution lens-less microscopes based on multi-angle illumination (Figure 20), the light source is mounted on a rotating arm to achieve oblique illumination about two orthogonal axes. In the super-resolution algorithm, the light source is also translated laterally at every angle, and high-resolution holograms are generated by synthesizing the holograms recorded at each angle [122]. This approach is closely related to the FPM framework: the different illumination angles amount to synthesizing Fourier spectrum information. The reconstruction relies on derived Gerchberg-Saxton (G-S) recovery algorithms iterating backward and forward between the spatial and Fourier domains. After the super-resolution complex amplitude at the sensor plane is recovered, the information is propagated back to the object plane through numerical diffraction calculation. Ultimately, high-resolution object intensity and phase distributions are obtained. In practice, sub-pixel displacement, wavelength scanning and multi-angle illumination can be combined to achieve lens-less super-resolution.
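The numerical diffraction step (propagating the recovered field back to the object plane) is typically done with the angular spectrum method; a generic numpy sketch:

```python
import numpy as np

def angular_spectrum(field, wavelength, pixel, z):
    """Propagate a complex field by distance z with the angular spectrum
    method; z < 0 back-propagates a hologram toward the object plane.
    All lengths share one unit (e.g., microns)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    f2 = fx[None, :] ** 2 + fy[:, None] ** 2
    arg = 1.0 / wavelength ** 2 - f2              # propagating-wave condition
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)           # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the transfer function has unit modulus for propagating waves, forward propagation followed by back-propagation over the same distance restores the field, which is what makes hologram refocusing well posed.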

5.5. Super-Resolution Lens-Less Microscopy Based on Axial Scanning

For super-resolution lens-less microscopy based on axial scanning, the distance between the CMOS image sensor and the sample is scanned axially [123]. The CMOS image sensor collects low-resolution intensities at different defocus planes; after computational reconstruction, pixel super-resolution and phase recovery are achieved simultaneously. This method introduces the pixel transfer function into the traditional iterative phase recovery method, and TIE is used to solve for the phase as the initial value of the iterative reconstruction, which significantly accelerates convergence. The proposed adaptive G-S iterative phase recovery algorithm can be regarded as an incremental gradient descent optimization: it maintains the fast initial convergence of the incremental gradient method while remarkably improving robustness. With a camera pixel size of 1.67 μm, a half-width resolution of 0.77 μm was finally achieved from 10 defocus holograms.

5.6. Portable Super-Resolution Microscope Applications on Biomedical Observation

For observing circulating tumor cells (CTCs) with a portable FPM setup, researchers obtained high-resolution color images of large-FOV micro-filter samples together with quantitative phase data. This portable FPM setup can refocus over a 300 μm depth range of the sample. Image acquisition takes about 3 min and the reconstruction of each color channel about 10 min, so generating a large-FOV color image with quantitative phase data takes 39 min in total. Compared against standard microscopes on the corresponding micro-filter samples, this portable FPM setup demonstrated high image quality, efficiency, and consistency in detecting tumor cells, with high correlation (R = 0.99932). Bishara et al. reported a sub-pixel-shift-based lens-less super-resolution microscope: a compact on-chip microscope weighing approximately 95 g [71]. It is capable of reconstructing holographic images (amplitude and phase) of observed objects with a resolution of less than 1 µm over an FOV of up to 24 mm², and has successfully imaged human malaria parasites. The results show that a compact and lightweight on-chip microscope is important for addressing global health problems such as diagnosing infectious diseases in remote areas. Luo et al. reported a wavelength-scanning-based pixel super-resolution technique [121], which allows analysis of both colorless (e.g., unstained) and stained organisms; red blood cells over a large area serve as the typical bio-sample. Besides, Luo et al. also provided a multi-angle-illumination lens-less super-resolution microscopy technique [122]. To demonstrate the validity of this synthetic-aperture-based holographic on-chip microscope, unstained Pap smears were successfully imaged over a very large FOV of 20 mm². Such compact synthetic-aperture-based on-chip microscopes can be used in a variety of applications in medicine, physical science, and engineering that require high-resolution, wide-field imaging.

6. Discussion

The above computational portable microscopes show attractive potential for point-of-care-test and tele-diagnosis applications. But, as developers and users, we should not be too optimistic. First, a computational portable microscope is usually developed for a certain biomedical application; once designed, its function is almost fixed. That means one setup for one application, and when faced with another application the setup must be at least slightly modified. Second, some popular computational approaches lack universality and risk generating erroneous artifacts, for example the well-known issues with 'over-trained' AI-based algorithms. Producing a nearly ideal AI model for a selected data set is not a big problem, but designing and validating robust, stable, and universal AI algorithms is a different story; as for validation, some proof-of-concept studies do not cover this part much. Third, low cost is another claimed advantage. Research-level desktop microscopes, widely used in medical centers and scientific institutes, are relatively expensive: a bright-field microscope costs about $3000, a dark-field microscope about $3000, a fluorescence microscope about $10,000, a Zernike-phase-contrast (ZPC) microscope about $20,000, a differential-interference-contrast (DIC) microscope about $20,000, a laser confocal microscope about $0.15 million and a super-resolution microscopy imaging system about $1 million. Although computational portable microscopes are not yet widely sold at a standard commercial price, their costs can be estimated as in Table 1.

7. Conclusions

With the increasing needs of POCT and tele-diagnosis, computational portable microscopes are becoming more and more necessary in bio-photonics and bio-medicine applications. Electronic products, home computers, computing server stations and AI algorithms are developing fast, and their costs are approaching an acceptable level thanks to industrial mass-scale effects; thus these cost-effective optical, electronic and computational resources are easy to obtain [124]. In the above-mentioned lens-less microscopy setups, smartphone microscopes, singlet-lens microscopes and super-resolution portable microscopes, the components are easy for a designer to source. Moreover, the widely deployed 5G+ telecommunication and Wi-Fi wireless networks make it very convenient to transfer big data, for example high-resolution images. Available computing server stations and cloud storage provide cheap yet powerful computational resources. These hardware and software environments provide the basic conditions for developing computational, portable and low-cost microscopes. Briefly, this is the best era for POCT and tele-diagnosis [124,125,126,127]. It is expected that more ideas and innovations in computational, portable and low-cost microscopes for POCT and tele-diagnosis will bloom.

Author Contributions

Y.B. wrote the original draft preparation. Y.B. and T.X. together edited the manuscript. Y.B., K.J. and Q.K. processed figures. J.W. and X.Y. reviewed the draft. J.W., X.Y. and C.K. supervised the author group. S.Y., Y.J., R.S., H.S. and C.K. collected related information and papers. All authors have read and agreed to the published version of the manuscript.


Funding

This study is partially supported by the National Natural Science Foundation of China (NSFC) (62005120), the Natural Science Foundation of Jiangsu Province (BK2019045, BK20201305), the Suzhou Science and Technology Development Project (SYSD2020132), and the Biopharmaceutical Industry Innovation (Clinical Trial Capability Improvement) Medicine-Industrial Collaborative Innovation Research Project (SLJ2022019).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.


  1. Casini, L.; Roccetti, M. A Cross-Regional Analysis of the COVID-19 Spread during the 2020 Italian Vacation Period: Results from Three Computational Models Are Compared. Sensors 2020, 20, 7319. [Google Scholar] [CrossRef] [PubMed]
  2. Coulthard, P. Dentistry and coronavirus (COVID-19)—Moral decision-making. Br. Dent. J. 2020, 228, 503–505. [Google Scholar] [CrossRef] [PubMed]
  3. Ferraresso, J.; Apostolakos, I.; Fasolato, L.; Piccirillo, A. Third-Generation Cephalosporin (3gc) Resistance and Its Association with Extra-Intestinal Pathogenic Escherichia Coli (Expec). Focus on Broiler Carcasses. Food Microbiol. 2022, 103, 103936. [Google Scholar] [CrossRef]
  4. Klinkenberg, D.; Hahné, S.J.M.; Woudenberg, T.; Wallinga, J. The Reduction of Measles Transmission During School Vacations. Epidemiology 2018, 29, 562–570. [Google Scholar] [CrossRef]
  5. Prakasan, S.; Lekshmi, M.; Ammini, P.; Balange, A.K.; Nayak, B.B.; Kumar, S.H. Occurrence, pathogroup distribution and virulence genotypes of Escherichia coli from fresh seafood. Food Control 2021, 133, 108669. [Google Scholar] [CrossRef]
  6. Semedo-Aguiar, A.P.; Pereira-Leal, J.B.; Leite, R.B. Microbial Diversity and Toxin Risk in Tropical Freshwater Reservoirs of Cape Verde. Toxins 2018, 10, 186. [Google Scholar] [CrossRef] [Green Version]
  7. Horio, T.; Hotani, H. Visualization of the dynamic instability of individual microtubules by dark-field microscopy. Nature 1986, 321, 605–607. [Google Scholar] [CrossRef]
  8. Lichtman, J.W.; Conchello, J.A. Fluorescence Microscopy. Nat. Methods 2005, 2, 910–919. [Google Scholar] [CrossRef]
  9. Holzner, C.; Feser, M.; Vogt, S.; Hornberger, B.; Baines, S.B.; Jacobsen, C. Zernike phase contrast in scanning microscopy with X-rays. Nat. Phys. 2010, 6, 883–887. [Google Scholar] [CrossRef] [Green Version]
  10. Cui, X.; Lew, M.; Yang, C. Quantitative differential interference contrast microscopy based on structured-aperture interference. Appl. Phys. Lett. 2008, 93, 091113. [Google Scholar] [CrossRef]
  11. Schermelleh, L.; Ferrand, A.; Huser, T.; Eggeling, C.; Sauer, M.; Biehlmaier, O.; Drummen, G.P.C. Super-resolution microscopy demystified. Nat. Cell Biol. 2019, 21, 72–84. [Google Scholar] [CrossRef]
  12. Malamy, J.; Shribak, M. An orientation-independent DIC microscope allows high resolution imaging of epithelial cell migration and wound healing in a cnidarian model. J. Microsc. 2018, 270, 290–301. [Google Scholar] [CrossRef] [PubMed]
  13. Platonova, G.; Štys, D.; Souček, P.; Lonhus, K.; Valenta, J.; Rychtáriková, R. Spectroscopic Approach to Correction and Visualisation of Bright-Field Light Transmission Microscopy Biological Data. Photonics 2021, 8, 333. [Google Scholar] [CrossRef]
  14. Schmidt, P.; Lajoie, J.; Sivasankar, S. Robust scan synchronized force-fluorescence imaging. Ultramicroscopy 2020, 221, 113165. [Google Scholar] [CrossRef] [PubMed]
  15. Takano, H.; Wu, Y.L.; Irwin, J.; Maderych, S.; Leibowitz, M.; Tkachuk, A.; Kumar, A.; Hornberger, B.; Momose, A. Comparison of Image Properties in Full-Field Phase X-Ray Microscopes Based on Grating Interferometry and Zernike’s Phase Contrast Optics. Appl. Phys. Lett. 2018, 113, 063105. [Google Scholar] [CrossRef] [Green Version]
  16. Valli, J.; Garcia-Burgos, A.; Rooney, L.M.; Oliveira, B.V.d.M.e.; Duncan, R.R.; Rickman, C. Seeing beyond the limit: A guide to choosing the right super-resolution microscopy technique. J. Biol. Chem. 2021, 297, 100791. [Google Scholar] [CrossRef]
  17. Fakhrullin, R.; Nigamatzyanova, L.; Fakhrullina, G. Dark-Field/Hyperspectral Microscopy for Detecting Nanoscale Particles in Environmental Nanotoxicology Research. Sci. Total Environ. 2021, 772, 145478. [Google Scholar] [CrossRef]
  18. Reilly, W.M.; Obara, C.J. Advances in Confocal Microscopy and Selected Applications. Methods Mol. Biol. 2021, 2304, 1–35. [Google Scholar]
  19. McNerney, R.; Daley, P. Towards a point-of-care test for active tuberculosis: Obstacles and opportunities. Nat. Rev. Microbiol. 2011, 9, 204–213. [Google Scholar] [CrossRef]
  20. Castro, A.R.; Esfandiari, J.; Kumar, S.; Ashton, M.; Kikkert, S.E.; Park, M.M.; Ballard, R.C. Novel Point-of-Care Test for Simultaneous Detection of Nontreponemal and Treponemal Antibodies in Patients with Syphilis. J. Clin. Microbiol. 2010, 48, 4615–4619. [Google Scholar] [CrossRef] [Green Version]
  21. Chen, Y.; Chen, X.; Li, M.; Fan, P.; Wang, B.; Zhao, S.; Yu, W.; Zhang, S.; Tang, Y.; Gao, T. A new analytical platform for potential point-of-care testing of circulating tumor cells. Biosens. Bioelectron. 2020, 171, 112718. [Google Scholar] [CrossRef] [PubMed]
  22. Hussain, I.; Bowden, A.K. Smartphone-based optical spectroscopic platforms for biomedical applications: A review [Invited]. Biomed. Opt. Express 2021, 12, 1974–1998. [Google Scholar] [CrossRef] [PubMed]
  23. Huang, X.; Li, Y.; Xu, X.; Wang, R.; Yao, J.; Han, W.; Wei, M.; Chen, J.; Xuan, W.; Sun, L. High-Precision Lensless Microscope on a Chip Based on In-Line Holographic Imaging. Sensors 2021, 21, 720. [Google Scholar] [CrossRef] [PubMed]
  24. Fang, Y.; Yu, N.; Jiang, Y.; Dang, C. High-Precision Lens-Less Flow Cytometer on a Chip. Micromachines 2018, 9, 227. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Jennische, E.; Lange, S.; Hultborn, R. Dark-field microscopy enhance visibility of CD31 endothelial staining. Eur. J. Histochem. 2020, 64, 3133. [Google Scholar] [CrossRef]
  26. Varra, T.; Simpson, A.; Roesler, B.; Nilsson, Z.; Ryan, D.; Van Erdewyk, M.; Christus, J.D.S.; Sambur, J.B. A Homemade Smart Phone Microscope for Single-Particle Fluorescence Microscopy. J. Chem. Educ. 2019, 97, 471–478. [Google Scholar] [CrossRef]
  27. Boudi, A.; Bagaa, M.; Poyhonen, P.; Taleb, T.; Flinck, H. AI-Based Resource Management in Beyond 5g Cloud Native Environment. IEEE Netw. 2021, 35, 128–135. [Google Scholar] [CrossRef]
  28. Leiner, T.; Bennink, E.; Mol, C.P.; Kuijf, H.J.; Veldhuis, W.B. Bringing AI to the clinic: Blueprint for a vendor-neutral AI deployment infrastructure. Insights Imaging 2021, 12, 11. [Google Scholar] [CrossRef]
  29. Zheng, G.; Lee, S.A.; Antebi, Y.; Elowitz, M.B.; Yang, C. The Epetri Dish, an on-Chip Cell Imaging Platform Based on Subpixel Perspective Sweeping Microscopy (Spsm). Proc. Natl. Acad. Sci. USA 2011, 108, 16889–16894. [Google Scholar] [CrossRef] [Green Version]
  30. Kesavan, S.V.; Momey, F.; Cioni, O.; David-Watine, B.; Dubrulle, N.; Shorte, S.; Sulpice, E.; Freida, D.; Chalmond, B.; Dinten, J.M.; et al. High-Throughput Monitoring of Major Cell Functions by Means of Lensfree Video Microscopy. Sci. Rep. 2014, 4, 5942. [Google Scholar] [CrossRef] [Green Version]
  31. Su, T.-W.; Seo, S.; Erlinger, A.; Ozcan, A. High-Throughput Lensfree Imaging and Characterization of a Heterogeneous Cell Solution on a Chip. Biotechnol. Bioeng. 2009, 102, 856–868. [Google Scholar] [CrossRef] [PubMed]
  32. Ozcan, A.; Demirci, U. Ultra Wide-Field Lens-Free Monitoring of Cells on-Chip. Lab Chip 2008, 8, 98–106. [Google Scholar] [CrossRef] [PubMed]
  33. Zhang, X.; Khimji, I.; Gurkan, U.A.; Safaee, H.; Catalano, P.N.; Keles, H.O.; Kayaalp, E.; Demirci, U. Lensless Imaging for Simultaneous Microfluidic Sperm Monitoring and Sorting. Lab Chip 2011, 11, 2535–2540. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Jin, G.; Yoo, I.-H.; Pack, S.P.; Yang, J.-W.; Ha, U.-H.; Paek, S.-H.; Seo, S. Lens-free shadow image based high-throughput continuous cell monitoring technique. Biosens. Bioelectron. 2012, 38, 126–131. [Google Scholar] [CrossRef]
  35. Dolega, M.E.; Allier, C.; Kesavan, S.V.; Gerbaud, S.; Kermarrec, F.; Marcoux, P.; Dinten, J.; Gidrol, X.; Picollet-D’Hahan, N. Label-Free Analysis of Prostate Acini-Like 3d Structures by Lensfree Imaging. Biosens. Bioelectron. 2013, 49, 176–183. [Google Scholar] [CrossRef]
  36. Pushkarsky, I.; Liu, Y.; Weaver, W.; Su, T.; Mudanyali, O.; Ozcan, A.; di Carlo, D. Automated Single-Cell Motility Analysis on a Chip Using Lensfree Microscopy. Sci. Rep. 2014, 4, 4717. [Google Scholar] [CrossRef] [Green Version]
  37. Kun, J.; Smieja, M.; Xiong, B.; Soleymani, L.; Fang, Q. The Use of Motion Analysis as Particle Biomarkers in Lensless Optofluidic Projection Imaging for Point of Care Urine Analysis. Sci. Rep. 2019, 9, 17255. [Google Scholar] [CrossRef] [Green Version]
  38. Berdeu, A.; Laperrousaz, B.; Bordy, T.; Mandula, O.; Morales, S.; Gidrol, X.; Picollet-D’hahan, N.; Allier, C. Lens-Free Microscopy for 3d + Time Acquisitions of 3d Cell Culture. Sci. Rep. 2018, 8, 16135. [Google Scholar] [CrossRef] [Green Version]
  39. Rivenson, Y.; Wu, Y.; Wang, H.; Zhang, Y.; Feizi, A.; Ozcan, A. Sparsity-based multi-height phase recovery in holographic microscopy. Sci. Rep. 2016, 6, 37862. [Google Scholar] [CrossRef] [Green Version]
  40. Shanmugam, A.; Salthouse, C.D. Lensless fluorescence imaging with height calculation. J. Biomed. Opt. 2014, 19, 016002. [Google Scholar] [CrossRef] [Green Version]
  41. Coskun, A.F.; Sencan, I.; Su, T.-W.; Ozcan, A. Wide-field lensless fluorescent microscopy using a tapered fiber-optic faceplate on a chip. Analyst 2011, 136, 3512–3518. [Google Scholar] [CrossRef] [PubMed]
  42. Coskun, A.F.; Ikbal, S.; Su, T.W.; Aydogan, O.; Eleftherios, M. Lensfree Fluorescent on-Chip Imaging of Transgenic Caenorhabditis Elegans over an Ultra-Wide Field-of-View. PLoS ONE 2011, 6, e15955. [Google Scholar] [CrossRef]
  43. Coskun, A.F.; Su, T.-W.; Sencan, I.; Ozcan, A. Lensfree Fluorescent On-Chip Imaging Using Compressive Sampling. Opt. Photonics News 2010, 21, 27. [Google Scholar] [CrossRef] [PubMed]
  44. Han, C.; Pang, S.; Bower, D.V.; Yiu, P.; Yang, C. Wide Field-of-View on-Chip Talbot Fluorescence Microscopy for Longitudinal Cell Culture Monitoring from within the Incubator. Anal. Chem. 2013, 85, 2356–2360. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Sasagawa, K.; Kimura, A.; Haruta, M.; Noda, T.; Tokuda, T.; Ohta, J. Highly sensitive lens-free fluorescence imaging device enabled by a complementary combination of interference and absorption filters. Biomed. Opt. Express 2018, 9, 4329–4344. [Google Scholar] [CrossRef] [PubMed]
  46. Sasagawa, K.; Ohta, Y.; Kawahara, M.; Haruta, M.; Tokuda, T.; Ohta, J. Wide field-of-view lensless fluorescence imaging device with hybrid bandpass emission filter. AIP Adv. 2019, 9, 035108. [Google Scholar] [CrossRef] [Green Version]
  47. Bian, Y.; Wang, W.; Hussian, A.; Kuang, C.; Li, H.; Liu, X. Experimental analysis and designing strategies of lens-less microscopy with partially coherent illumination. Opt. Commun. 2018, 434, 136–144. [Google Scholar] [CrossRef]
  48. Schiebelbein, A.; Pedrini, G. Lens Less Phase Imaging Microscopy Using Multiple Intensity Diffraction Patterns Obtained under Coherent and Partially Coherent Illumination. Appl. Opt. 2022, 61, B271–B278. [Google Scholar] [CrossRef]
  49. Fienup, J.R. Reconstruction of a complex-valued object from the modulus of its Fourier transform using a support constraint. J. Opt. Soc. Am. A 1987, 4, 118–123. [Google Scholar] [CrossRef]
  50. Koren, G.; Polack, F.; Joyeux, D. Iterative Algorithms for Twin-Image Elimination in in-Line Holography Using Finite-Support Constraints. J. Opt. Soc. Am. A 1993, 10, 423–433. [Google Scholar] [CrossRef]
  51. Feng, S.; Wang, M.; Wu, J. Enhanced Resolution for Amplitude Object in Lensless Inline Holographic Microscope with Grating Illumination. Opt. Eng. 2017, 56, 093107. [Google Scholar] [CrossRef]
  52. Guo, C.; Liu, X.; Zhang, F.; Du, Y.; Zheng, S.; Wang, Z.; Zhang, X.; Kan, X.; Liu, Z.; Wang, W. Lensfree on-chip microscopy based on single-plane phase retrieval. Opt. Express 2022, 30, 19855–19870. [Google Scholar] [CrossRef] [PubMed]
  53. Ebrahimi, S.; Dashtdar, M. Lens-free digital holographic microscopy for cell imaging and tracking by Fresnel diffraction from a phase discontinuity. Opt. Lett. 2021, 46, 3516–3519. [Google Scholar] [CrossRef]
  54. Bian, Y.; Yao, Y.; Liu, Q.; Xu, Y.; Kuang, C.; Li, H.; Liu, X. Assessment of tissues’ inhomogeneous optical properties based on a portable microscope under partially coherent illumination. Opt. Commun. 2019, 434, 145–151. [Google Scholar] [CrossRef]
  55. Bian, Y.; Zhang, Y.; Yin, P.; Li, H.; Ozcan, A. Optical Refractometry Using Lensless Holography and Autofocusing. Opt. Express 2018, 26, 29614–29628. [Google Scholar] [CrossRef] [PubMed]
  56. Bian, Y.; Liu, Q.; Zhang, Z.; Liu, D.; Hussian, A.; Kuang, C.; Li, H.; Liu, X. Portable Multi-Spectral Lens-Less Microscope with Wavelength-Self-Calibrating Imaging Sensor. Opt. Lasers Eng. 2018, 111, 25–33. [Google Scholar] [CrossRef]
  57. Shen, H.; Gao, J. Deep learning virtual colorful lens-free on-chip microscopy. Chin. Opt. Lett. 2020, 18, 121705. [Google Scholar] [CrossRef]
  58. Zuo, C.; Sun, J.; Zhang, J.; Hu, Y.; Chen, Q. Lensless Phase Microscopy and Diffraction Tomography with Multi-Angle and Multi-Wavelength Illuminations Using a Led Matrix. Opt. Express 2015, 23, 14314–14328. [Google Scholar] [CrossRef]
  59. Rivenson, Y.; Zhang, Y.; Günaydın, H.; Teng, D.; Ozcan, A. Phase Recovery and Holographic Image Reconstruction Using Deep Learning in Neural Networks. Light Sci. Appl. 2018, 7, 17141. [Google Scholar] [CrossRef] [Green Version]
  60. Chen, H.; Huang, L.; Liu, T.; Ozcan, A. Fourier Imager Network (FIN): A deep neural network for hologram reconstruction with superior external generalization. Light. Sci. Appl. 2022, 11, 254. [Google Scholar] [CrossRef]
  61. Midtvedt, B.; Helgadottir, S.; Argun, A.; Pineda, J.; Midtvedt, D.; Volpe, G. Quantitative digital microscopy with deep learning. Appl. Phys. Rev. 2021, 8, 011310. [Google Scholar] [CrossRef]
  62. Huang, L.; Yang, X.; Liu, T.; Ozcan, A. Few-Shot Transfer Learning for Holographic Image Reconstruction Using a Recurrent Neural Network. APL Photonics 2022, 7, 070801. [Google Scholar] [CrossRef]
  63. Liu, T.; de Haan, K.; Bai, B.; Rivenson, Y.; Luo, Y.; Wang, H.; Karalli, D.; Fu, H.; Zhang, Y.; FitzGerald, J.; et al. Deep Learning-Based Holographic Polarization Microscopy. ACS Photonics 2020, 7, 3023–3034. [Google Scholar] [CrossRef] [PubMed]
  64. Greenbaum, A.; Feizi, A.; Akbari, N.; Ozcan, A. Wide-field computational color imaging using pixel super-resolved on-chip microscopy. Opt. Express 2013, 21, 12469–12483. [Google Scholar] [CrossRef] [Green Version]
  65. Bian, Y.; Jiang, Y.; Wang, J.; Yang, S.; Deng, W.; Yang, X.; Shen, R.; Shen, H.; Kuang, C. Deep learning colorful ptychographic iterative engine lens-less diffraction microscopy. Opt. Lasers Eng. 2021, 150, 106843. [Google Scholar] [CrossRef]
  66. Liu, T.; Wei, Z.; Rivenson, Y.; de Haan, K.; Zhang, Y.; Wu, Y.; Ozcan, A. Deep Learning-Based Color Holographic Microscopy. J. Biophotonics 2019, 12, e201900107. [Google Scholar] [CrossRef] [Green Version]
  67. Su, T.-W.; Isikman, S.O.; Bishara, W.; Tseng, D.; Erlinger, A.; Ozcan, A. Multi-angle lensless digital holography for depth resolved imaging on a chip. Opt. Express 2010, 18, 9690–9711. [Google Scholar]
  68. Isikman, S.O.; Bishara, W.; Mudanyali, O.; Sencan, L.; Ozcan, A. Lensfree on-Chip Microscopy and Tomography. IEEE J. Sel. Top. Quantum Electron. 2011, 18, 1059. [Google Scholar]
  69. Isikman, S.O.; Bishara, W.; Ozcan, A. Lensfree On-chip Tomographic Microscopy Employing Multi-angle Illumination and Pixel Super-resolution. J. Vis. Exp. 2012, e4161. [Google Scholar]
  70. Isikman, S.O.; Bishara, W.; Mavandadi, S.; Yu, F.W.; Feng, S.; Lau, R.; Ozcan, A. Lens-free optical tomographic microscope with a large imaging volume on a chip. Proc. Natl. Acad. Sci. USA 2011, 108, 7296–7301. [Google Scholar] [CrossRef] [Green Version]
  71. Hui, M.; Hussain, F. In-Line Recording and Off-Axis Viewing Technique for Holographic Particle Velocimetry. Appl. Opt. 1995, 34, 1827–1840. [Google Scholar]
  72. Cavalcanti, T.C.; Kim, S.; Lee, K.; Lee, S.; Park, M.K.; Hwang, J.Y. Smartphone-based spectral imaging otoscope: System development and preliminary study for evaluation of its potential as a mobile diagnostic tool. J. Biophotonics 2020, 13, e201960213. [Google Scholar] [CrossRef] [PubMed]
  73. Darwish, G.H.; Asselin, J.; Tran, M.V.; Gupta, R.; Kim, H.; Boudreau, D.; Algar, W.R. Fully Self-Assembled Silica Nanoparticle–Semiconductor Quantum Dot Supra-Nanoparticles and Immunoconjugates for Enhanced Cellular Imaging by Microscopy and Smartphone Camera. ACS Appl. Mater. Interfaces 2020, 12, 33530–33540. [Google Scholar] [CrossRef] [PubMed]
  74. Goud, B.K.; Shinde, D.D.; Udupa, D.V.; Krishna, C.M.; Rao, K.D.; Sahoo, N.K. Low Cost Digital Holographic Microscope for 3-D Cell Imaging by Integrating Smartphone and DVD Optical Head. Opt. Lasers Eng. 2019, 114, 1–6. [Google Scholar] [CrossRef]
  75. Lee, K.C.; Lee, K.; Jung, J.; Lee, S.H.; Kim, D.; Lee, S.A. A Smartphone-Based Fourier Ptychographic Microscope Using the Display Screen for Illumination. ACS Photonics 2021, 8, 1307–1315. [Google Scholar] [CrossRef]
  76. Song, C.; Yang, Y.; Tu, X.; Chen, Z.; Gong, J.; Lin, C. A Smartphone-Based Fluorescence Microscope with Hydraulically Driven Optofluidic Lens for Quantification of Glucose. IEEE Sens. J. 2021, 21, 1229–1235. [Google Scholar] [CrossRef]
  77. Rivenson, Y.; Koydemir, H.C.; Wang, H.; Wei, Z.; Ren, Z.; Günaydın, H.; Zhang, Y.; Göröcs, Z.; Liang, K.; Tseng, D.; et al. Deep Learning Enhanced Mobile-Phone Microscopy. ACS Photonics 2018, 5, 2354–2364. [Google Scholar] [CrossRef] [Green Version]
  78. de Haan, K.; Koydemir, H.C.; Rivenson, Y.; Tseng, D.; Van Dyne, E.; Bakic, L.; Karinca, D.; Liang, K.; Ilango, M.; Gumustekin, E.; et al. Automated screening of sickle cells using a smartphone-based microscope and deep learning. NPJ Digit. Med. 2020, 3, 76. [Google Scholar] [CrossRef]
  79. Jung, D.; Choi, J.-H.; Kim, S.; Ryu, S.; Lee, W.; Lee, J.-S.; Joo, C. Smartphone-based multi-contrast microscope using color-multiplexed illumination. Sci. Rep. 2017, 7, 7564. [Google Scholar] [CrossRef] [Green Version]
  80. Phillips, Z.F.; D’Ambrosio, M.V.; Tian, L.; Rulison, J.J.; Patel, H.S.; Sadras, N.; Gande, A.V.; Switz, N.; Fletcher, D.A.; Waller, L. Multi-Contrast Imaging and Digital Refocusing on a Mobile Microscope with a Domed LED Array. PLoS ONE 2015, 10, e0124938. [Google Scholar] [CrossRef] [Green Version]
  81. Bian, Y.; Jiang, Y.; Huang, Y.; Yang, X.; Deng, W.; Shen, H.; Shen, R.; Kuang, C. Smart-phone phase contrast microscope with a singlet lens and deep learning. Opt. Laser Technol. 2021, 139, 106900. [Google Scholar] [CrossRef]
  82. Ogasawara, Y.; Sugimoto, R.; Maruyama, R.; Arimoto, H.; Tamada, Y.; Watanabe, W. Mobile-phone-based Rheinberg microscope with a light-emitting diode array. J. Biomed. Opt. 2018, 24, 031007. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  83. Sun, D.; Hu, T.Y. A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies. Biosens. Bioelectron. 2018, 99, 513–518. [Google Scholar] [CrossRef] [PubMed]
  84. Rabha, D.; Biswas, S.; Chamuah, N.; Mandal, M.; Nath, P. Wide-Field Multi-Modal Microscopic Imaging Using Smartphone. Opt. Lasers Eng. 2021, 137, 106343. [Google Scholar] [CrossRef]
  85. Dönmez, S.L.; Needs, S.H.; Osborn, H.; Edwards, A.D. Label-Free Smartphone Quantitation of Bacteria by Darkfield Imaging of Light Scattering in Fluoropolymer Micro Capillary Film Allows Portable Detection of Bacteriophage Lysis. Sens. Actuators B Chem. 2020, 323, 128645. [Google Scholar] [CrossRef]
  86. Meng, X.; Huang, H.; Yan, K.; Tian, X.; Yu, W.; Cui, H.; Kong, Y.; Xue, L.; Liu, C.; Wang, S. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method. Lab Chip 2016, 17, 104–109. [Google Scholar] [CrossRef]
  87. Yang, Z.; Zhan, Q. Single-Shot Smartphone-Based Quantitative Phase Imaging Using a Distorted Grating. PLoS ONE 2016, 11, e0159596. [Google Scholar] [CrossRef]
  88. Lee, S.A.; Yang, C. A smartphone-based chip-scale microscope using ambient illumination. Lab Chip 2014, 14, 3056–3063. [Google Scholar] [CrossRef]
  89. Coskun, A.F.; Nagi, R.; Sadeghi, K.; Phillips, S.; Ozcan, A. Albumin testing in urine using a smart-phone. Lab Chip 2013, 13, 4231–4238. [Google Scholar] [CrossRef]
  90. Cai, F.; Wang, T.; Lu, W.; Zhang, X. High-resolution mobile bio-microscope with smartphone telephoto camera lens. Optik 2020, 207, 164449. [Google Scholar] [CrossRef]
  91. Ardalan, S.; Hosseinifard, M.; Vosough, M.; Golmohammadi, H. Towards Smart Personalized Perspiration Analysis: An Iot-Integrated Cellulose-Based Microfluidic Wearable Patch for Smartphone Fluorimetric Multi-Sensing of Sweat Biomarkers. Biosens. Bioelectron. 2020, 168, 112450. [Google Scholar] [CrossRef] [PubMed]
  92. Chung, S.; Breshears, L.E.; Gonzales, A.; Jennings, C.M.; Morrison, C.M.; Betancourt, W.Q.; Reynolds, K.A.; Yoon, J.-Y. Norovirus detection in water samples at the level of single virus copies per microliter using a smartphone-based fluorescence microscope. Nat. Protoc. 2021, 16, 1452–1475. [Google Scholar] [CrossRef] [PubMed]
  93. Kim, J.-H.; Joo, H.-G.; Kim, T.-H.; Ju, Y.-G. A smartphone-based fluorescence microscope utilizing an external phone camera lens module. BioChip J. 2015, 9, 285–292. [Google Scholar] [CrossRef]
  94. Koydemir, H.C.; Feng, S.; Liang, K.; Nadkarni, R.; Benien, P.; Ozcan, A. Comparison of supervised machine learning algorithms for waterborne pathogen detection using mobile phone fluorescence microscopy. Nanophotonics 2017, 6, 731–741. [Google Scholar] [CrossRef]
  95. Moehling, T.J.; Lee, D.H.; Henderson, M.E.; McDonald, M.K.; Tsang, P.H.; Kaakeh, S.; Kim, E.S.; Wereley, S.T.; Kinzer-Ursem, T.L.; Clayton, K.N.; et al. A smartphone-based particle diffusometry platform for sub-attomolar detection of Vibrio cholerae in environmental water. Biosens. Bioelectron. 2020, 167, 112497. [Google Scholar] [CrossRef]
  96. Paterson, A.S.; Raja, B.; Mandadi, V.; Townsend, B.; Lee, M.; Buell, A.; Vu, B.; Brgoch, J.; Willson, R.C. A Low-Cost Smartphone-Based Platform for Highly Sensitive Point-of-Care Testing with Persistent Luminescent Phosphors. Lab Chip 2017, 17, 1051–1059. [Google Scholar] [CrossRef] [Green Version]
  97. Rojas-Barboza, D.; Park, E.W.; Sassenfeld, R.; Winder, J.; Smith, G.B.; Valles-Rosalles, D.; Delgado, E.; Park, Y.H. Rapid, Simple, Low-Cost Smartphone-Based Fluorescence Detection of Escherichia Coli. Int. J. Agric. Biol. Eng. 2021, 14, 189–193. [Google Scholar] [CrossRef]
  98. Shrivastava, S.; Lee, W.-I.; Lee, N.-E. Culture-free, highly sensitive, quantitative detection of bacteria from minimally processed samples using fluorescence imaging by smartphone. Biosens. Bioelectron. 2018, 109, 90–97. [Google Scholar] [CrossRef]
  99. Wargocki, P.; Deng, W.; Anwer, A.G.; Goldys, E.M. Medically Relevant Assays with a Simple Smartphone and Tablet Based Fluorescence Detection System. Sensors 2015, 15, 11653–11664. [Google Scholar] [CrossRef] [Green Version]
  100. Wei, Q.; Acuna, G.; Kim, S.; Vietz, C.; Tseng, D.; Chae, J.; Shir, D.; Luo, W.; Tinnefeld, P.; Ozcan, A. Plasmonics Enhanced Smartphone Fluorescence Microscopy. Sci. Rep. 2017, 7, 2124. [Google Scholar] [CrossRef] [Green Version]
  101. Zhao, L.; Dong, B.; Li, W.; Zhang, H.; Zheng, Y.; Tang, C.; Hu, B.; Yuan, S. Smartphone-Based Quantitative Fluorescence Detection of Flowing Droplets Using Embedded Ambient Light Sensor. IEEE Sens. J. 2020, 21, 4451–4461. [Google Scholar] [CrossRef]
  102. Dai, B.; Jiao, Z.; Zheng, L.; Bachman, H.; Fu, Y.; Wan, X.; Zhang, Y.; Huang, Y.; Han, X.; Zhao, C.; et al. Colour Compound Lenses for a Portable Fluorescence Microscope. Light Sci. Appl. 2019, 8, 75. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  103. Wei, Q.; Luo, W.; Chiang, S.; Kappel, T.; Mejia, C.; Tseng, D.; Chan, R.Y.L.; Yan, E.; Qi, H.; Shabbir, F.; et al. Imaging and Sizing of Single DNA Molecules on a Mobile Phone. ACS Nano 2014, 8, 12725–12733. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  104. Wei, Q.; Qi, H.; Luo, W.; Tseng, D.; Ki, S.J.; Wan, Z.; Göröcs, Z.; Bentolila, L.A.; Wu, T.-T.; Sun, R.; et al. Fluorescent Imaging of Single Nanoparticles and Viruses on a Smart Phone. ACS Nano 2013, 7, 9147–9155. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  105. Shen, H.; Gao, J. Portable deep learning singlet microscope. J. Biophotonics 2020, 13, e202000013. [Google Scholar] [CrossRef] [PubMed]
  106. Bian, Y.; Jiang, Y.; Huang, Y.; Yang, X.; Deng, W.; Shen, H.; Shen, R.; Kuang, C. Deep Learning Virtual Colorization Overcoming Chromatic Aberrations in Singlet Lens Microscopy. APL Photonics 2021, 6, 031301. [Google Scholar] [CrossRef]
  107. Gao, J.; Shen, H.; Cui, X.; Zhu, R. Portable deep learning singlet multi-spectral microscope. Opt. Lasers Eng. 2020, 137, 106378. [Google Scholar] [CrossRef]
  108. Bian, Y.; Jiang, Y.; Deng, W.; Shen, R.; Shen, H.; Kuang, C. Deep learning virtual Zernike phase contrast imaging for singlet microscopy. AIP Adv. 2021, 11, 065311. [Google Scholar] [CrossRef]
  109. Luo, Y.; Tseng, M.L.; Vyas, S.; Hsieh, T.-Y.; Wu, J.-C.; Chen, S.-Y.; Peng, H.-F.; Su, V.-C.; Huang, T.-T.; Kuo, H.Y.; et al. Meta-lens light-sheet fluorescence microscopy for in vivo imaging. Nanophotonics 2022, 11, 1949–1959. [Google Scholar] [CrossRef]
  110. Chen, C.; Song, W.; Chen, J.-W.; Wang, J.-H.; Chen, Y.H.; Xu, B.; Chen, M.K.; Li, H.; Fang, B.; Chen, J.; et al. Spectral tomographic imaging with aplanatic metalens. Light. Sci. Appl. 2019, 8, 99. [Google Scholar] [CrossRef] [Green Version]
  111. Xu, B.; Li, H.; Gao, S.; Hua, X.; Yang, C.; Chen, C.; Yan, F.; Zhu, S.N.; Li, T. Metalens-integrated compact imaging devices for wide-field microscopy. Adv. Photon. 2020, 2, 066004. [Google Scholar] [CrossRef]
  112. Ou, X.; Horstmeyer, R.; Yang, C.; Zheng, G. Quantitative phase imaging via Fourier ptychographic microscopy. Opt. Lett. 2013, 38, 4845–4848. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  113. Sun, J.; Zuo, C.; Zhang, L.; Chen, Q. Resolution-enhanced Fourier ptychographic microscopy based on high-numerical-aperture illuminations. Sci. Rep. 2017, 7, 1187. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  114. Wang, A.; Zhang, Z.; Wang, S.; Pan, A.; Ma, C.; Yao, B. Fourier Ptychographic Microscopy Via Alternating Direction Method of Multipliers. Cells 2022, 11, 1512. [Google Scholar] [CrossRef] [PubMed]
  115. Gao, Y.; Chen, J.; Wang, A.; Pan, A.; Ma, C.; Yao, B. High-Throughput Fast Full-Color Digital Pathology Based on Fourier Ptychographic Microscopy Via Color Transfer. Sci. China Phys. Mech. Astron. 2021, 64, 114211. [Google Scholar] [CrossRef]
  116. Pan, A.; Zuo, C.; Yao, B. High-resolution and large field-of-view Fourier ptychographic microscopy and its applications in biomedicine. Rep. Prog. Phys. 2020, 83, 096101. [Google Scholar] [CrossRef] [PubMed]
  117. Sobieranski, A.C.; Inci, F.; Tekin, H.C.; Yuksekkaya, M.; Comunello, E.; Cobra, D.; von Wangenheim, A.; Demirci, U. Portable Lensless Wide-Field Microscopy Imaging Platform Based on Digital Inline Holography and Multi-Frame Pixel Super-Resolution. Light Sci. Appl. 2015, 4, e346. [Google Scholar] [CrossRef] [Green Version]
  118. Greenbaum, A.; Ozcan, A. Maskless imaging of dense samples using pixel super-resolution based multi-height lensfree on-chip microscopy. Opt. Express 2012, 20, 3129–3143. [Google Scholar] [CrossRef]
  119. Zhang, J.; Chen, Q.; Li, J.; Sun, J.; Zuo, C. Lensfree dynamic super-resolved phase imaging based on active micro-scanning. Opt. Lett. 2018, 43, 3714–3717. [Google Scholar] [CrossRef]
  120. Gao, Y.; Yang, F.; Cao, L. Pixel Super-Resolution Phase Retrieval for Lensless On-Chip Microscopy via Accelerated Wirtinger Flow. Cells 2022, 11, 1999. [Google Scholar] [CrossRef]
  121. Luo, W.; Zhang, Y.; Feizi, A.; Gorocs, Z.; Ozcan, A. Pixel Super-Resolution Using Wavelength Scanning. Light Sci. Appl. 2016, 5, e16060. [Google Scholar] [CrossRef] [Green Version]
  122. Luo, W.; Greenbaum, A.; Zhang, Y.; Ozcan, A. Synthetic aperture-based on-chip microscopy. Light. Sci. Appl. 2015, 4, e261. [Google Scholar] [CrossRef] [Green Version]
  123. Zhang, J.; Sun, J.; Chen, Q.; Li, J.; Zuo, C. Adaptive Pixel-Super-Resolved Lensfree in-Line Digital Holography for Wide-Field on-Chip Microscopy. Sci. Rep. 2017, 7, 11777. [Google Scholar] [CrossRef] [PubMed]
  124. Vilà, A.; Moreno, S.; Canals, J.; Diéguez, A. A Compact Raster Lensless Microscope Based on a Microdisplay. Sensors 2021, 21, 5941. [Google Scholar] [CrossRef] [PubMed]
  125. Aileni, M.; Rohela, G.K.; Jogam, P.; Soujanya, S.; Zhang, B. Biotechnological Perspectives to Combat the COVID-19 Pandemic: Precise Diagnostics and Inevitable Vaccine Paradigms. Cells 2022, 11, 1182. [Google Scholar] [CrossRef]
  126. García-Villena, J.; Torres, J.E.; Aguilar, C.; Lin, L.; Bermejo-Peláez, D.; Dacal, E.; Mousa, A.; Ortega, M.D.P.; Martínez, A.; Vladimirov, A.; et al. 3D-Printed Portable Robotic Mobile Microscope for Remote Diagnosis of Global Health Diseases. Electronics 2021, 10, 2408. [Google Scholar] [CrossRef]
  127. Chen, D.; Wang, L.; Luo, X.; Xie, H.; Chen, X. Resolution and Contrast Enhancement for Lensless Digital Holographic Microscopy and Its Application in Biomedicine. Photonics 2022, 9, 358. [Google Scholar] [CrossRef]
Figure 1. Typical schematic of lens-less diffraction microscopy setups based on a single frame [39].
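As background to the single-frame reconstruction scheme of Figure 1, the standard numerical step is angular-spectrum back-propagation of the recorded hologram to the sample plane. The sketch below is a generic NumPy implementation of the angular spectrum method, not code from [39]; wavelength and pixel size are illustrative values:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field by distance z (meters) with the angular
    spectrum method; a negative z back-propagates a recorded hologram."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Transfer function of free space; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2j * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(kz * z), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy usage: back-propagate the square root of a measured hologram intensity
hologram = np.ones((256, 256))  # stand-in for a recorded intensity frame
field0 = angular_spectrum_propagate(np.sqrt(hologram).astype(complex),
                                    wavelength=532e-9, pixel_size=1.12e-6,
                                    z=-1.0e-3)
```

Single-frame pipelines then refine this back-propagated estimate, e.g. with support constraints or a trained network, to suppress the twin image.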
Figure 2. Typical schematic of lens-less diffraction microscopy setups based on multiple holograms: (a) the principal schematic; (b) a 3D mechanical design of a lens-less diffraction microscope; (c) typical H&E stained pathological slide [57].
Figure 3. CNN architecture to recover complex-valued information of the object [59].
Figure 4. (a,b) Deep learning virtual colorization to achieve colorful lens-less microscopy [57].
Figure 5. Deep learning virtual colorization based on a GAN architecture [57].
Figure 6. Interaction pipeline maps of cell-phone microscopes, 5G telecommunication stations and computer server stations.
Figure 7. Smartphone platforms based on: (a) commercial microscopic objective lenses, (b) customized single lenses, (c) inverted pinhole lenses.
Figure 8. Smartphone-based bright-field microscope with deep learning. (a) Deep learning enhanced mobile-phone microscopy [77], and (b) a photograph of the smartphone-based bright-field microscope [78].
Figure 9. Smartphone-based phase contrast microscopy setups: (a) a 2D principle schematic based on a singlet objective and deep learning architecture for image enhancing [81]; (b) a 3D mechanical design [81]; (c) smartphone-based phase contrast microscope based on computational LED illumination [79].
Figure 10. Smartphone-based dark-field microscopy based on (a) external angular illumination [84], and (b) a dark-field condenser [83].
Figure 11. Smartphone-based fluorescent microscopy setups: (a) schematic illustration of the smartphone microscope [90]; (b) photograph and schematic of cell-phone-based fluorescence microscopes [104]; and (c) schematic diagram of portable fluorescence microscopes [102].
Figure 12. Singlet bright-field microscopy setup [105]: (a) a 2D principle schematic and (b) a 3D mechanical design.
Figure 13. Deep learning architecture to enhance resolution and image contrast for the singlet bright-field microscope in Figure 12 [105].
Figure 14. Singlet achromatic microscopy based on deep learning virtual colorization [106]: (a) a 2D schematic; (b) a deep learning virtual colorization flowchart; and (c) an application workflow.
Figure 15. Experimental singlet achromatic microscopy results [106].
Figure 16. Singlet multi-spectral microscopy [107]: (a) photograph, and (b) a 3D mechanical design.
Figure 17. Experimental singlet multi-spectral microscopy on a USAF resolution target [107]. (a) Sub-pictures 1–7 were directly imaged by the singlet MSM at 7 wavelengths, and sub-picture 8 was imaged by a commercial microscope with a 4X objective. (b) Sub-pictures 1–7 were recovered by the deep learning algorithm at the same 7 wavelengths, respectively, and sub-picture 8 was imaged by a commercial microscope with a 10X objective.
Figure 18. Singlet virtual phase-contrast microscopy [108]: (a) a 2D schematic of a traditional Zernike phase contrast microscopy setup, (b) a 2D schematic of a singlet phase contrast microscopy setup, and (c) the Deep ZPC-transfer network architecture.
Figure 19. Portable singlet meta-lens microscopy setup: (a) macro-scale and nano-scale structure of a singlet meta-lens [110] and (b) a photograph of the singlet meta-lens microscopy setup with the resolution test result [111].
Figure 20. Super-resolution lens-less microscopy based on multi-angle scanning [122].
Table 1. Estimated prices of key optical and electronic elements in computational portable microscopes.

Lens-less microscopy: light source, under $1; sensor, $50 to $200; structure, under $10.
Smartphone microscopy: light source, under $1; smartphone, under $800; objective lens, $10 to $100; customized singlet lens, under $100; reversed smartphone camera lens, under $50; structure, under $10.
Singlet microscopy: light source, under $1; sensor, $50 to $200; singlet lens, under $100; structure, under $10.
Super-resolution microscopy: light source, ~$500; sensor, $50 to $200; lens, $10 to $100; structure, under $100.
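The per-part bounds in Table 1 can be tallied into a bill-of-materials range for any one configuration. The short sketch below does this for the lens-less microscope row; the part names and bounds are taken directly from the table (entries listed as "under $X" are treated as the range $0 to $X).

```python
# Table 1 bounds for a lens-less microscope, as (low, high) dollar ranges.
PARTS = {
    "light source": (0, 1),    # under $1
    "sensor": (50, 200),       # $50 to $200
    "structure": (0, 10),      # under $10
}

low = sum(lo for lo, hi in PARTS.values())
high = sum(hi for lo, hi in PARTS.values())
# Total build cost falls between $50 and $211, dominated by the sensor.
```

The same tally applied to the other rows shows why the smartphone and super-resolution configurations are the costly ones: the phone (under $800) and the light source (~$500), respectively, dominate their totals.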
Bian, Y.; Xing, T.; Jiao, K.; Kong, Q.; Wang, J.; Yang, X.; Yang, S.; Jiang, Y.; Shen, R.; Shen, H.; et al. Computational Portable Microscopes for Point-of-Care-Test and Tele-Diagnosis. Cells 2022, 11, 3670.


