Article

A Deep Learning Technique for Optical Inspection of Color Contact Lenses

Tae-yun Kim, Dabin Park, Heewon Moon and Suk-seung Hwang
1 Institute of AI Convergence, Chosun University, Gwangju 61452, Republic of Korea
2 Jckmedical Co., Ltd., Gwangju 61008, Republic of Korea
3 Interdisciplinary Program in IT-Bio Convergence System, School of Electronic Engineering, Chosun University, Gwangju 61452, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(10), 5966; https://doi.org/10.3390/app13105966
Submission received: 14 April 2023 / Revised: 5 May 2023 / Accepted: 11 May 2023 / Published: 12 May 2023
(This article belongs to the Special Issue Advanced Manufacturing Technologies and Their Applications, Volume II)

Abstract

Colored contact lenses have gained popularity in recent years. However, their production process suffers from low efficiency, which is attributed to the complex nature of the lens color patterns. The manufacturing process involves multiple steps, each of which can introduce defects or inconsistencies into the lenses. Moreover, most colored contact lenses are still inspected manually, and inspecting large production volumes by hand is inefficient in terms of consistency and quality. Automatic optical inspection (AOI) systems have been developed to perform quality-control checks on colored contact lenses; however, their accuracy is limited by the increasing complexity of the lens color patterns. To address these issues, convolutional neural networks have been used to detect and classify defects in colored contact lenses. This study aims to provide a comprehensive guide for AOI systems using artificial intelligence in the colored contact lens manufacturing process, including the benefits and challenges of using these systems. Further, future research directions toward a classification accuracy exceeding 95%, the human recognition rate, are explored.

1. Introduction

Soft contact lenses have traditionally been worn for vision correction; however, their application has recently expanded to healthcare and beauty products, including drug delivery and cosmetic colored contact lenses [1,2,3,4,5]. Colored contact lenses, such as those shown in Figure 1, have gained popularity among people wishing to change or enhance the color and beauty of their eyes. Consequently, the market for colored contact lenses has been growing rapidly. The colored contact lens manufacturing process involves several steps: molding, coloring, curing, and coating. Each of these steps can introduce defects or inconsistencies that may not be visible until the final inspection [6,7]. Most colored contact lenses are inspected manually by human inspectors. However, manual inspection is limited by human error and can be time- and resource-consuming. In addition, accurate inspection is challenging because colored contact lenses have complex designs and various colors, including multiple color and gradation layers, which makes it difficult to detect defects or inconsistencies introduced during manufacturing. Moreover, subtle imperfections that are difficult to observe with the naked eye, such as bubbles, scratches, and uneven pigmentation, may affect the wearer’s vision or cause discomfort. These factors make it difficult to produce and distribute high-quality colored contact lenses to end users.
Overall, the complexity of the color patterns and the potential for subtle defects render the inspection of colored contact lenses challenging. Consequently, automatic optical inspection (AOI) systems have been developed to perform quality-control checks on colored contact lenses. In a previous study [8], a video camera was installed to automatically measure the back surface characteristics of the produced contact lenses. The images acquired via the camera were used to measure the surface finish, toricity, and eccentricity of the back surface of the lenses. In another study [9], an AOI system for lens defect detection was constructed, comprising a light source and a camera. It employed an image-processing algorithm for defect detection and was designed to detect five types of defects from the images captured by the camera. The systems in these studies [8,9] produced relatively accurate inspection results compared with existing manual inspection. However, their accuracy can be limited by the complexity of the lens color patterns. AOI systems that incorporate deep learning algorithms trained on contact lens defects can learn to identify defects in colored contact lenses with high accuracy, improving on manual defect detection. This can provide a more accurate and efficient inspection process, reduce inspection time, and improve feasibility. However, certain challenges are involved in implementing these systems, such as the requirement for extensive training data and the potential for overfitting.
This study aims to provide a comprehensive guide for AOI systems using a convolutional neural network (CNN) in colored contact lens manufacturing, including the benefits and challenges of using these systems and future research directions in this field.

2. Materials and Methods

This section presents the main materials and manufacturing processes for colored contact lenses and explains the main characteristics used to distinguish good-quality lenses.

2.1. Materials

Hydroxyethyl methacrylate (HEMA), ethylene glycol dimethacrylate (EGDMA), and reactive black 5 were obtained from Sigma-Aldrich (St Louis, MO, USA). Further, azobisisobutyronitrile (AIBN) was purchased from Junsei (Tokyo, Japan). Contact lens molds comprising upper (polybutylene terephthalate) and lower (polypropylene) molds with a diameter and thickness of 11.4 and 0.1 mm, respectively, were supplied by Youngwon Tech (Daegu, Korea).

2.2. Manufacturing Process for Colored Contact Lenses

The colored contact lens manufacturing process used is as follows. The contact lens monomers were vacuum-distilled before polymerization. Subsequently, the coating solution was prepared by mixing poly(ethylene glycol) methyl ether methacrylate (5 g), HEMA (4.92 g), EGDMA (0.04 g), and AIBN (0.04 g). The colorant coating solution was prepared by adding a reactive dye (reactive black 5) to 10 g of coating solution. The contact lens solution was prepared by mixing HEMA (9.92 g), EGDMA (0.04 g), and AIBN (0.04 g). The coating solution was then printed on top of the upper mold using a silicon pad and heated at 90 °C for 20 min. Further, the colorant coating solution was printed on the prepared upper part using a silicon pad patterned via laser treatment and heated at 90 °C for 20 min. Following injection of the contact lens solution into the lower mold, it was combined with the doubly printed upper mold and heated at 120 °C for 30 min to facilitate polymerization. Following cooling to 25 °C, the samples were removed from the molds and placed in 500 mL of deionized water. To completely remove the unreacted monomers and initiators, a washing step was conducted for 3 days by changing the water 3 times per day. Finally, the colored contact lenses were soaked in a saline solution and sterilized at 120 °C for 30 min. The mold used to prepare the contact lenses had a 14.6 mm outer diameter with an 8.6 mm base curve. Colored contact lens images were obtained using a high-resolution camera, MARS 2000-150UC (ICENTRAL) (Hangzhou Vision Datum Technology Co., Ltd., Hangzhou, China). The hydrated contact lenses were placed on a highly transparent glass plate. Images of the lenses were then captured under illumination from white LEDs located under the glass plate. The images were obtained in a dark room to prevent distortion by external light. Further, the diameters of the colored contact lenses were measured using a Chiltern Lens Analyzer (Optimece JCF, Optimece Ltd., Malvern, UK).

2.3. Optical Transmittance Measurements for Colored Contact Lenses

The transmittance of the colored contact lenses was measured in the wavelength range of 250–700 nm using a Shimadzu UV-1650PC spectrophotometer (Shimadzu, Tokyo, Japan). The measurements for each sample were repeated four times, and the results were averaged.

2.4. Water Content Measurements for Colored Contact Lenses

The water content (WC) of the contact lenses was measured gravimetrically at room temperature. Initially, all the contact lenses were fully dried to a constant weight prior to equilibrium swelling. Subsequently, the fully dehydrated hydrogel contact lenses were allowed to swell to equilibrium through immersion in a saline solution and maintained at room temperature for 2 days. Thereafter, the samples were removed from the saline solution, excess surface water was blotted off with tissue paper, and the swollen hydrogel contact lens weights were recorded. The WC is calculated as follows:
WC (%) = [(Ws − Wd)/Wd] × 100,
where Ws and Wd denote the hydrogel contact lens weights in the equilibrium swelling and dry states, respectively. The average values of five measurements were calculated for each sample.
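For illustration, the formula can be evaluated as in the short Python sketch below; the sample weights shown are hypothetical and chosen only to yield a value near the water content reported in Section 4:

```python
# Gravimetric water content, WC (%) = (Ws - Wd) / Wd * 100,
# where ws is the equilibrium-swollen weight and wd the dry weight.
def water_content(ws: float, wd: float) -> float:
    return (ws - wd) / wd * 100.0

# Hypothetical sample weights in grams (illustrative only).
print(f"WC = {water_content(ws=0.0277, wd=0.0200):.1f}%")  # WC = 38.5%
```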

2.5. Contact Lens Defect Types

The types of contact lens defects [10] that may occur during the manufacturing process are shown in Figure 2. They can be divided into six categories: unformed, crack, foreign, bubble, dust, and print defects.
  • Unformed: A defect wherein a part of the lens is not molded;
  • Crack: A defect wherein the inside or edge of the lens is broken;
  • Foreign: A defect wherein foreign matter enters the lens;
  • Bubble: A defect caused by bubbles forming in the lens during molding;
  • Dust: A defect caused by dust during molding;
  • Print: A defect wherein a part of the iris image is not printed.

2.6. CNN-Based Contact Lens Discrimination

Prior to discussing the CNN-based AOI system, this subsection reviews previous studies on CNN-based (deep learning) contact lens recognition as follows.
  • ContlensNet [11]: ContlensNet is a CNN trained on a large number of iris images; it classifies iris images into three classes: normal (no lenses worn), textured (or colored) contact lenses worn, and soft (or clear) contact lenses worn.
  • DCLNet [12]: DCLNet is a CNN designed via the customization of a support vector machine (SVM) in Densenet121 and detects contact lenses from iris images captured from heterogeneous sensors.
  • DensePAD [13]: DensePAD is designed based on DenseNet with a depth of 22 to classify input iris images as real or iris spoofing with textured contact lenses.
  • GHCLNet [14]: GHCLNet is a CNN based on ResNet-50 that classifies iris images into three categories, no lens, textured lens, and soft lens, without preprocessing of the iris image.
  • In [15,16,17,18,19], a CNN was used to detect contact lenses in real time from iris images. Further, in [20], various CNNs were applied to detect defective colored contact lenses and the results were analyzed.
Among these previous studies, [11,12,13,14,15,16,17,18,19] focused on determining whether contact lenses were worn based on iris images, whereas [20] focused on determining whether colored contact lenses were defective. However, most contact-lens-related studies using CNNs are related to contact lens discrimination for iris recognition systems, and few studies on CNN-based lens defect detection have been conducted. Therefore, this study proposes a contact lens defect detection process using a CNN to determine defects occurring during the lens production process.

3. Contact Lens Defect Detection with a CNN Model

Next, we describe the application of a CNN model to an AOI system and the overall training and testing process of the CNN model for contact lens defect detection. For real-time detection of outliers (defective points) in images using a CNN, the model should not require a large amount of computation and should have an appropriate depth for nonlinear feature extraction. CNNs satisfying these conditions include ResNet [21], AlexNet [22], DenseNet [23], GoogLeNet [24], and MobileNet [25]. GoogLeNet V4 has exhibited the highest classification accuracy for colored contact lens defect detection [20]. Therefore, this study used GoogLeNet as the CNN for contact lens defect detection.

3.1. GoogLeNet Structure

Figure 3 shows a simplified block diagram of the GoogLeNet structure. From input to output, the network is divided into three parts: the stem network, the inception modules, and the classifier output. The first six layers after the input constitute the stem network, a stack of convolution and max-pooling layers; its structure is shown in Figure 4a. The inception module block, the core element of GoogLeNet, is a stack of inception modules whose structure is shown in Figure 4b. It extracts various features from the training data using convolution layers of different sizes together with pooling, and it allows a deep network structure while reducing the number of parameters through 1 × 1 convolution layers. Finally, the classifier output comprises the layers shown in Figure 4c, where the final classification is performed after global average pooling (GAP).
The existing GoogLeNet, pretrained on ImageNet, classifies input images into 1000 object categories, such as keyboards, mice, and various animals. In this study, only seven categories (one good and six defect classes) were needed for contact lens defect classification. Consequently, the last three layers of the classifier output (i.e., the fully connected, softmax, and classification output layers) were modified accordingly.
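The following sketch illustrates this modification. It is written in PyTorch/torchvision as an assumed equivalent of the authors' MATLAB workflow (the study itself used MATLAB 2022B, see Table 1), not the original implementation:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 7  # good + unformed, crack, foreign, bubble, dust, print

# Load an ImageNet-pretrained GoogLeNet and swap its 1000-way classifier
# for a 7-way fully connected layer.
model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Sanity check: a 224 x 224 x 3 input yields a 7-way score vector.
model.eval()
with torch.no_grad():
    scores = model(torch.randn(1, 3, 224, 224))
print(scores.shape)  # torch.Size([1, 7])
```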

3.2. GoogLeNet Contact Lens Defect Detection Process

Figure 5 shows a flowchart of GoogLeNet-based contact lens defect detection. The process comprises a preprocessing step (including resizing of the original images), a step for splitting the data into training and validation sets, a GoogLeNet training and validation step, and a test step for the trained GoogLeNet.

3.2.1. Preprocessing

1. Image resizing:
The original images used to determine contact lens defects on the production line are shown in Figure 6. They measure 3648 × 5472 pixels (uint8), which cannot be used directly as input because this does not match the input layer dimensions of GoogLeNet (224 × 224 × 3). Therefore, the images must be cropped and resized to these dimensions. Figure 7 shows the original images reduced and converted via image cropping.
2. Data augmentation:
In Section 2, the defects occurring during the manufacturing process were classified into six categories. However, because the amount of image data for defective products deviates considerably from that for good products, data augmentation is essential for training the CNN. Augmentation was performed by specifying image preprocessing options for the original data: resizing in the range of 1–1.5 times the original image, rotation in the range of 0–360°, translation along the X and Y axes within −50 to 50 pixels, and reflection. These transformations were applied randomly to the training images at every epoch (see the sketch below). Figure 8 shows an example of data augmentation for bubble data among the defect types.
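A minimal preprocessing and augmentation sketch is given below, again expressed with PyTorch/torchvision transforms rather than the MATLAB tooling used in the study; the exact cropping strategy is an assumption:

```python
from torchvision import transforms

# Training pipeline: shrink/crop the 3648 x 5472 camera image to the
# 224 x 224 x 3 GoogLeNet input size, then apply the random augmentations
# described above (re-sampled on every pass over the training data).
train_transforms = transforms.Compose([
    transforms.Resize(256),                  # shrink the raw camera image
    transforms.CenterCrop(224),              # crop to the GoogLeNet input size
    transforms.RandomHorizontalFlip(),       # reflection
    transforms.RandomAffine(
        degrees=360,                         # rotation over the full 0-360 deg range
        translate=(50 / 224, 50 / 224),      # shift of up to +/-50 px in X and Y
        scale=(1.0, 1.5),                    # resize by 1-1.5 times
    ),
    transforms.ToTensor(),
])

# Validation/test pipeline: same resizing, no random augmentation.
val_transforms = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
```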

3.2.2. Separation of the Training/Verification Data

Owing to the limited number of sample images obtained from the contact lens manufacturing process (bubble: 120, crack: 60, dust: 120, foreign: 120, good: 120, print: 90, unformed: 120), 90% of the data in each category were used as the training dataset and the remaining 10% as the validation dataset.
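The split can be sketched as follows, assuming a hypothetical per-category folder layout ("lens_images") and reusing the transforms from the preprocessing sketch above:

```python
import random
from collections import defaultdict

from torch.utils.data import Subset
from torchvision import datasets

# Hypothetical folder layout: one sub-folder per category
# (bubble, crack, dust, foreign, good, print, unformed) under "lens_images".
# train_transforms comes from the augmentation sketch above; in practice the
# validation subset would use val_transforms instead.
full_set = datasets.ImageFolder("lens_images", transform=train_transforms)

# Collect the sample indices of each category.
per_class = defaultdict(list)
for idx, (_, label) in enumerate(full_set.samples):
    per_class[label].append(idx)

# 90% training / 10% validation split within every category.
train_idx, val_idx = [], []
for indices in per_class.values():
    random.shuffle(indices)
    cut = int(0.9 * len(indices))
    train_idx.extend(indices[:cut])
    val_idx.extend(indices[cut:])

train_set = Subset(full_set, train_idx)
val_set = Subset(full_set, val_idx)
print(len(train_set), len(val_set))
```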

3.2.3. GoogLeNet Training and Verification

Table 1 lists the computer specifications for training GoogLeNet for contact lens defect detection.
Owing to the limited computational resources and data, the ImageNet-pretrained GoogLeNet was first fine-tuned on the lens data through transfer learning, and the resulting network was then retrained in a second training stage. To prevent overfitting to the training data, the learning rate was reduced step by step over the epochs, and stochastic gradient descent with momentum (SGDM) was used as the training solver. The learning parameters applied in each training stage are listed in Table 2 and Table 3.
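A training loop corresponding to the Table 2 stage could look roughly like the sketch below (the Table 3 stage differs only in the number of epochs and the learning-rate drop settings). It reuses the model and dataset objects from the earlier sketches, and the momentum value of 0.9 is an assumption, as it is not stated in the paper:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)  # the 7-class GoogLeNet from the Section 3.1 sketch

# Batch size 64, shuffled every epoch (Table 2).
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

criterion = nn.CrossEntropyLoss()
# SGDM with an initial learning rate of 0.001 (momentum 0.9 assumed).
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
# Piecewise schedule: multiply the learning rate by 0.2 every 5 epochs (Table 2).
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.2)

model.train()
for epoch in range(20):  # 20 epochs in the transfer-learning stage
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        outputs = model(images)
        # torchvision's GoogLeNet may return auxiliary logits in training mode.
        logits = outputs.logits if hasattr(outputs, "logits") else outputs
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
    scheduler.step()  # drop the learning rate every 5 epochs
```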

3.2.4. Defect Detection Test Based on the Trained GoogLeNet

The GoogLeNet trained to identify lens defects was tested using six test images per category; this number was unified to match the crack category, which had the fewest available images. None of the test images had been used for training or validation of GoogLeNet. Figure 9 shows the confusion matrix of the test results; the overall accuracy of the trained GoogLeNet for contact lens defect discrimination was 59.5%.
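The test evaluation and confusion matrix can be sketched as follows, assuming a hypothetical held-out test folder and reusing the trained model and transforms from the earlier sketches:

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets

NUM_CLASSES = 7

# Hypothetical held-out test folder with six images per category,
# preprocessed with the non-augmenting transforms from the earlier sketch.
test_set = datasets.ImageFolder("lens_test_images", transform=val_transforms)
test_loader = DataLoader(test_set, batch_size=16)

confusion = torch.zeros(NUM_CLASSES, NUM_CLASSES, dtype=torch.long)
model.eval()
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        for t, p in zip(labels, preds):
            confusion[t, p] += 1  # rows: true class, columns: predicted class

overall_accuracy = confusion.diag().sum().item() / confusion.sum().item()
per_class_accuracy = confusion.diag().float() / confusion.sum(dim=1).float()
print(f"overall accuracy: {overall_accuracy:.1%}")
print(per_class_accuracy)  # one value per category, as reported in Section 4
```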

4. Results and Discussion

The manufacturing process for colored contact lenses requires high precision and attention to ensure that the lenses are safe, comfortable, and provide the desired color effect for the wearer. A colored contact lens has a transparent pupil region, while a colorant is applied in a pattern over the surrounding iris section, covering the wearer’s iris. The contact lens manufacturing process via cast molding was performed according to the following steps. (i) A plastic manufacturing mold was provided. (ii) At least one imprint of lens material was deposited on the surface of the mold via silicon pad printing and dried using a blank annular plate. (iii) A colorant was deposited on the dried imprint via pad printing. (iv) The mold was filled with the lens material. (v) The lens material was polymerized. (vi) The polymerized lens was separated from the molds. (vii) The lens was swollen to its hydrogel state. Finally, (viii) images were collected for the optical inspection of the colored contact lenses (Figure 10 and Figure 11) [26,27]. In step (iii), a pad composed of silicon rubber, impregnated with silicon oil for easy release, was pressed against the laser-engraved patterns to pick up the colorant from the prepared colorant-containing solution. The colorant on the pad was then transferred to the front surface of the coated upper mold, depositing the colorant in the desired pattern over the iris section. The printing step may be repeated one or more times using different patterns in different colors. Before the lenses were packaged, they were subjected to an inspection process to ensure that they satisfied the quality standards. This inspection step included checking for defects such as bubbles, scratches, or uneven coloring. Notably, depending on the patterns and colors of the contact lenses, the recognition rate may vary among individual inspectors.
Contact lenses comprise p(HEMA) networks that are chemically crosslinked, as shown in Figure 12. Initially, p(HEMA)-based contact lenses were synthesized via free radical polymerization with hydrophilic HEMA monomers using EGDMA as a crosslinking agent and AIBN as the initiator. The physical properties, such as elasticity and tensile strength of the contact lens, may vary depending on the ratio and species of the crosslinking agent and the monomer used. In this study, contact lenses were manufactured using the simplest and most common materials and ratios. Prior to the polymerization to obtain the contact lenses, the coating solution was prepared and immobilized on the surface of the upper mold. The coating solution prevented the colorant from eluting out of the lens. Poly(ethylene glycol) methyl ether methacrylate with a high molecular weight was added to the coating and colorant-containing solutions to increase the viscosity of the solutions, thus preventing their flow down from the mold.
The most important features of commercially available contact lenses are optical transparency, diameter, and WC. The optical transmittance of the fabricated hydrogels was measured at 300–700 nm and expressed as transmittance (%). The fabricated contact lenses exhibited high transmittance values (>94%), exceeding the optical transmittance required for wearable contact lenses (92%), as shown in Figure 13a. No opaque part could be observed on the lens surfaces with the naked eye, which is consistent with the spectroscopic results. These results imply that the networked p(HEMA) chains are uniformly distributed in the lens, causing no visible light scattering.
The diameters of the prepared contact lenses were measured to be approximately 14.6 mm, corresponding to an overall diameter of 14.0–14.8 mm for most traditional soft contact lenses. The WC of the prepared contact lenses was measured to be approximately 38.4% using a drying oven. A WC level of approximately 38% feels comfortable and allows the eyes to breathe without becoming dried out, which renders it suitable for commercialized products. Following the completion of the above process, the contact lens defect was assessed using the CNN.
Figure 14 shows the results of the trained GoogLeNet for contact lens defect detection on the six test images of each category. The discrimination accuracies were 83.3% for good products and 83.3, 16.7, 50, 83.3, 33.3, and 66.7% for bubble, crack, dust, foreign, print, and unformed defects, respectively. The low recognition rate can be attributed to the exceedingly small amount of data used, and it is expected to improve as more data are accumulated in the future. In contrast to the defective data, the good data did not exhibit any singularities, and the recognition rate was greater than 83% for all backlight, ring-light, and spotlight images. The misrecognition rate of approximately 17% corresponded to classifying an unformed image that could not be identified by the naked eye as a good product. For bubbles, the recognition rate tended to decrease for the backlight images. As shown in the first image in Figure 14b, bubbles generated inside the area where the iris image was printed were determined to be bubbles; however, as shown in the third image in Figure 14b, bubbles inside the iris image or at the edge of the lens were classified into another category. In contrast, the ring-light and spotlight images exhibited an almost accurate recognition rate, as the characteristics of the bubbles appeared clearly. For crack defects, which had the smallest amount of data, the absolute amount of training data was insufficient; consequently, the recognition rate was poor because cracks exhibited characteristics similar to those of unformed and foreign defects. Dust exhibited features similar to those of foreign matter in the backlight images, whereas clear dust features appeared in the ring-light and spotlight images. In the case of foreign defects, most of the images were assessed as foreign; however, certain images were recognized as unformed patterns. Further, print defects were difficult to distinguish owing to the various patterns of the iris. As shown in Figure 14f, they could be roughly distinguished in the backlight image; however, because the iris image was not visible in the ring-light and spotlight images, extracting feature points was difficult, which appeared to lower the recognition rate. In the case of unformed defects, the feature was clearly visible at the edge of the lens; however, in certain images, a tendency to confuse it with foreign defects was observed, as shown in Figure 14g.

5. Conclusions

This study presents a comprehensive guide for CNN-based AOI systems. A sequential process was introduced for colored contact lens manufacturing; optical transparency, diameter, and WC determination; and CNN-based contact lens defect detection. In general, defect identification for contact lenses produced in large quantities through the casting process has been conducted manually: backlight, ring-light, and spotlight images are visually checked using optical equipment, and a lens is judged defective when a defect type is detected. However, manual detection of lens defects requires a relatively long time compared with the production volume. The CNN (GoogLeNet) employed in this study to detect defective contact lenses in the AOI system yielded a good-product recognition rate of 83.3% despite the limitations of the computational resources and data. This is somewhat lower than the approximately 95% recognition rate of human inspectors; however, the recognition rate is expected to increase further with the additional data and computational resources expected to become available in the future. These results highlight the challenges to be addressed, along with the advantages of a CNN-based AOI system that can revolutionize quality control in contact lens manufacturing. Ultimately, it can be concluded that the use of an AOI system incorporating CNN-based artificial intelligence can ensure high-quality and safe contact lens production, benefiting both manufacturers and end users.

Author Contributions

Conceptualization, T.-y.K., D.P., H.M., and S.-s.H.; methodology, T.-y.K. and S.-s.H.; software, T.-y.K.; validation, T.-y.K., D.P., H.M.; investigation, T.-y.K., D.P., H.M., and S.-s.H.; writing—original draft preparation, T.-y.K.; writing—review and editing, S.-s.H.; supervision, S.-s.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), which received funding from the Ministry of Education (No. 2022R1A6A3A01087346). This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), which received funding from the Ministry of Education, Science, and Technology (No. 2018R1D1A1B07041644).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Moreddu, R.; Vigolo, D.; Yetisen, A.K. Contact lens technology: From fundamentals to applications. Adv. Healthc. Mater. 2019, 8, 1900368. [Google Scholar] [CrossRef] [PubMed]
  2. Musgrave, C.S.A.; Fang, F. Contact lens materials: A materials science perspective. Materials 2019, 12, 261. [Google Scholar] [CrossRef] [PubMed]
  3. Xu, J.; Xue, Y.; Hu, G.; Lin, T.; Gou, J.; Yin, T.; He, H.; Zhang, Y.; Tang, X. A comprehensive review on contact lens for ophthalmic drug delivery. J. Control Release 2018, 281, 97–118. [Google Scholar] [CrossRef] [PubMed]
  4. Sartini, F.; Menchini, M.; Posarelli, C.; Casini, G.; Figus, M. In vivo efficacy of contact lens drug-delivery systems in glaucoma management. A systematic review. Appl. Sci. 2021, 11, 724. [Google Scholar] [CrossRef]
  5. Kim, J.; Cha, E.K.; Park, J.U. Recent advances in smart contact lenses. Adv. Mater. Technol. 2020, 5, 1900728. [Google Scholar] [CrossRef]
  6. Herrera, J.A.; Vilaseca, M.; Düll, J.; Arjona, M.; Torrecilla, E.; Pujol, J. Iris color and texture: A comparative analysis of real irises, ocular prostheses, and colored contact lenses. Color Res. Appl. 2011, 36, 373–382. [Google Scholar] [CrossRef]
  7. Hsu, M.Y.; Hong, P.Y.; Liou, J.C.; Wang, Y.P.; Chen, C. Assessment of ocular surface response to tinted soft contact lenses with different characteristics and pigment location. Int. J. Optomechatronics 2020, 14, 119–130. [Google Scholar] [CrossRef]
  8. Elliott, C.J. Automatic optical measurement of contact lenses. Proc. SPIE Auto. Opt. Inspec. 1986, 654, 125–129. [Google Scholar]
  9. Chang, C.L.; Wu, W.H.; Hwang, C.C. Automatic optical inspection method for soft contact lenses. Proc. SPIE Int. Conf. Opt. Photonic Eng. 2015, 9524, 17–22. [Google Scholar]
  10. Efron, N. Contact Lens Practice, 2nd ed.; Elsevier Health Sciences: Amsterdam, The Netherlands, 2010; pp. 100–108. [Google Scholar]
  11. Raghavendra, R.; Raja, K.B.; Busch, C. Contlensnet: Robust iris contact lens detection using deep convolutional neural networks. In Proceedings of the 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), Santa Rosa, CA, USA, 24–31 March 2017. [Google Scholar]
  12. Choudhary, M.; Tiwari, V.; Venkanna, U. An approach for iris contact lens detection and classification using ensemble of customized DenseNet and SVM. Future Gener. Comput. Syst. 2019, 101, 1259–1270. [Google Scholar] [CrossRef]
  13. Yadav, D.; Kohli, N.; Vatsa, M.; Singh, R.; Noore, A. Detecting textured contact lens in uncontrolled environment using DensePAD. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Long Beach, CA, USA, 16–20 June 2019. [Google Scholar]
  14. Singh, A.; Mistry, V.; Yadav, D.; Nigam, A. Ghclnet: A generalized hierarchically tuned contact lens detection network. In Proceedings of the 2018 IEEE 4th International Conference on Identity, Security, and Behavior Analysis (ISBA), Singapore, 11–12 January 2018. [Google Scholar]
  15. Kimura, G.Y.; Lucio, D.R.; Britto, A.S., Jr.; Menotti, D. CNN hyperparameter tuning applied to iris liveness detection. arXiv 2020, arXiv:2003.00833. [Google Scholar]
  16. Doyle, J.S.; Flynn, P.J.; Bowyer, K.W. Automated classification of contact lens type in iris images. In Proceedings of the 2013 International Conference on Biometrics (ICB), Madrid, Spain, 4–7 June 2013. [Google Scholar]
  17. Gautam, G.; Mukhopadhyay, S. Contact lens detection using transfer learning with deep representations. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018. [Google Scholar]
  18. Poster, D.; Nasrabadi, N.; Riggan, B. Deep sparse feature selection and fusion for textured contact lens detection. In Proceedings of the 2018 International Conference of the Biometrics Special Interest Group (BIOSIG), Darmstadt, Germany, 26–28 September 2018. [Google Scholar]
  19. Hoffman, S.; Sharma, R.; Ross, A. Iris+ ocular: Generalized iris presentation attack detection using multiple convolutional neural networks. In Proceedings of the 2019 International Conference on Biometrics (ICB), Crete, Greece, 4–7 June 2019. [Google Scholar]
  20. Kim, G.-N.; Kim, S.-H.; Joo, I.; Yoo, K.-H. Detection of Color Contact Lens Defects using Various CNN Models. J. Korea Contents Assoc. 2022, 22, 160–170. [Google Scholar] [CrossRef]
  21. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016. [Google Scholar]
  22. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
  23. Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 22–25 July 2017. [Google Scholar]
  24. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015. [Google Scholar]
  25. Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
  26. Choi, J.H.; Li, Y.; Jin, R.; Shrestha, T.; Choi, J.S.; Lee, W.J.; Moon, M.J.; Ju, H.T.; Choi, W.; Yoon, K.C. The efficiency of cyclosporine A-eluting contact lenses for the treatment of dry eye. Curr. Eye Res. 2019, 44, 486–496. [Google Scholar] [CrossRef] [PubMed]
  27. Kim, J.H.; Mondal, H.; Jin, R.; Yoon, H.J.; Kim, H.J.; Jee, J.P.; Yoon, K.C. Cellulose Acetate Phthalate-Based pH-responsive cyclosporine A-loaded contact lens for the treatment of dry eye. Int. J. Mol. Sci. 2023, 24, 2361. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Examples of colored contact lenses (provided by Jckmedical Co., Ltd., Gwangju, Republic of Korea).
Figure 2. Types of contact lens defects: (a) unformed, (b) crack, (c) foreign, (d) bubble, (e) dust, and (f) print defects.
Figure 3. Block diagram of GoogLeNet.
Figure 4. Layer organization for each GoogLeNet block: (a) stem network; (b) inception module; (c) classifier output.
Figure 5. GoogLeNet-based contact lens defect detection process.
Figure 6. Original contact lens images: (a) backlight, (b) ring-light, and (c) spotlight images.
Figure 7. Handling original contact lens images through cropping: (a) backlight, (b) ring-light, and (c) spotlight images.
Figure 8. Example of data augmentation (defect type: bubble).
Figure 9. Confusion matrix of test result.
Figure 10. Photographs of the contact lens manufacturing processes (provided by Jckmedical Co., Ltd., Gwangju, Republic of Korea).
Figure 11. Images of molds and colored contact lenses during the printing and curing processes: (1) coated upper mold, (2) color-printed upper mold, (3) lens-material-filled lower mold, (4) combination of color-printed upper mold and lens-material-filled lower mold, (5) obtained colored contact lens after removing the molds, and (6) colored contact lenses under swelling.
Figure 12. Synthesis of hydrogel contact lenses via copolymerization of HEMA and EGDMA.
Figure 13. (a) Transmittance spectra and (b) diameter observation of the prepared colored contact lenses.
Figure 14. Test results by category: (a) good products, (b) bubble defect, (c) crack defect, (d) dust defect, (e) foreign defect, (f) print defect, (g) unformed defect.
Table 1. Computer specifications for training GoogLeNet.
Index | Specifications
OS | Windows 11
CPU | Intel® Core™ i5-9400
GPU (memory) | GeForce GT 1030 (2 GB)
MATLAB version | 2022B
Table 2. Transfer learning parameters.
Field | Specification
Training Solver | SGDM
Batch Size | 64
Epochs | 20
Initial Learning Rate | 0.001
Learn Rate Schedule | Piecewise
Learn Rate Drop Factor | 0.2
Learn Rate Drop Period | 5 epochs
Shuffle | Every epoch
Validation Frequency | 7
Table 3. Secondary training parameters.
Field | Specification
Training Solver | SGDM
Batch Size | 64
Epochs | 100
Initial Learning Rate | 0.001
Learn Rate Schedule | Piecewise
Learn Rate Drop Factor | 0.15
Learn Rate Drop Period | 20 epochs
Shuffle | Every epoch
Validation Frequency | 10
