Article

Augmented Reality Implementations in Stomatology

by Aleš Procházka 1,2,*, Tatjana Dostálová 3, Magdaléna Kašparová 3, Oldřich Vyšata 1,4, Hana Charvátová 5, Saeid Sanei 6 and Vladimír Mařík 2
1 Department of Computing and Control Engineering, University of Chemistry and Technology in Prague, 166 28 Prague 6, Czech Republic
2 Czech Institute of Informatics, Robotics and Cybernetics, Czech Technical University in Prague, 160 00 Prague 6, Czech Republic
3 Department of Stomatology, 2nd Medical Faculty, Charles University in Prague and Motol University Hospital, 150 06 Prague, Czech Republic
4 Department of Neurology, Faculty of Medicine in Hradec Králové, Charles University in Prague and University Hospital Hradec Králové, 500 05 Hradec Králové, Czech Republic
5 Faculty of Applied Informatics, Tomas Bata University in Zlín, 760 01 Zlín, Czech Republic
6 School of Science and Technology, Nottingham Trent University, Nottingham NG11 8NS, UK
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(14), 2929; https://doi.org/10.3390/app9142929
Submission received: 3 July 2019 / Revised: 16 July 2019 / Accepted: 18 July 2019 / Published: 22 July 2019
(This article belongs to the Special Issue Augmented Reality: Current Trends, Challenges and Prospects)

Abstract

Augmented reality has a wide range of applications in many areas that can extend the study of real objects into the digital world, including stomatology. Real dental objects that were previously examined using their plaster casts are often replaced by their digital models or three-dimensional (3D) prints in the cyber-physical world. This paper reviews a selection of digital methods that have been applied in dentistry, including the use of intra-oral scanning technology for data acquisition and evaluation of fundamental features of dental arches. The methodology includes the use of digital filters and morphological operations for spatial objects analysis, their registration, and evaluation of changes during the treatment of specific disorders. The results include 3D models of selected dental arch objects, which allow a comparison of their shape and position during repeated observations. The proposed methods present digital alternatives to the use of plaster casts for semiautomatic evaluation of dental arch measures. This paper describes some of the advantages of 3D digital technology replacing real world elements and plaster cast dental models in many areas of classical stomatology.


1. Introduction

Augmented reality (AR) covers many research areas [1,2,3,4,5], including biomedicine, neurology and engineering. The implementation of these methods in stomatology [6,7,8,9,10,11] extends the medical study of real dental objects into the digital world. This paper describes how real dental objects can be replaced by their digital models in the present cyber-physical environment, which combines augmented and virtual reality [7]. The benefits of AR in stomatology are evident in surgical navigation [12], manipulation in implantology [13,14], and applications in maxillofacial surgery [15].
The initial stage of digitization and the introduction of computational methods in stomatology include the replacement of physical plaster cast models by digital models [16,17] recorded by digital cameras, as presented in Figure 1. This approach overcomes several limitations of traditional plaster models: they can be easily damaged, take time to fabricate, and require storage space [18]. Digital technologies allow orthodontists to analyze dental problems more accurately, predict their treatment, specify refabrication tasks more precisely, and simplify collaboration with other specialists, remote research laboratories, and educational institutions.
The use of advanced 3D scanners, as presented in Figure 2a, and wireless communication links allows the construction of more accurate digital models (see Figure 2b) based on stereolithography (STL) data. The use of AR for user-friendly intra-oral scanning [19] allows detailed analysis of dental objects, restoration of tooth cavities, and tooth alignment in the computational environment, and then enables a return from the digital world to the real one. This approach is often used for surgical navigation and for the study of dental arch models during orthodontic treatment.
In all these cases, digital technologies are extensively combined with computational intelligence to construct and analyze digital models with sufficient accuracy to reduce the need for examining real objects in clinical practice. Specific mathematical methods are used in this stage for digital rotation, shifting and scaling of digital objects, and for their registration, feature learning and classification.
Three-dimensional (3D) modelling is a rapidly developing area that uses digital signal processing methods to analyse 3D bodies and their time evolution. Such 3D data processing forms the basis of modern robotic and information systems, including possible implementations of virtual reality methods. The outputs of these technologies include the printing of spatial bodies such as dental objects.
The whole methodology of AR in stomatology is based upon the use of specific sensors, cameras, contactless scanners [18], and (wireless) communication links, followed by 3D object reconstruction. Applications of these methods assume knowledge of optics [20], engineering [21], biomechanics, information technologies [22,23], and material sciences.
Mathematical and computational methods used for data processing in this area include procedures of signal and image analysis, data denoising [24,25], morphological operations, image registration [26], stereo vision methods, and statistical evaluation of the results. Functional transforms, including wavelet analysis and specific optimization methods, are important for feature extraction and classification, while 3D data processing also assumes the application of geometrical modelling [27]. Point cloud data processing and the 3D modelling of spatial objects form further specific research areas, which are based upon the use of digital signal and image processing methods.
This paper surveys selected methods related to 3D modelling in stomatology that support the implementation of AR and the examination and treatment of stomatological disorders. It presents methods of data acquisition by camera systems [16,28] and digital evaluation of plaster casts by simple and more advanced mathematical methods.
Special attention is devoted to processing the data recorded by an intra-oral contactless scanner [20,29,30,31,32,33,34], as presented in Figure 2a, which allows the construction of 3D models as a basic step in the whole digitization process. The resulting data are then de-noised and analyzed to select the dental measures [22] that are important for the treatment of specific disorders. Image registration methods are then used to compare images acquired during treatment and to analyze their evolution over time.

2. Methods

Augmented reality in stomatology uses different methods to process data obtained by panoramic radiography, to analyse images of dental casts, or to process data acquired by intra-oral scanning devices. No ethical approval was required for this study.

2.1. Panoramic Radiography

Dental panoramic radiographs are a form of focal plane tomography and they are often used for clinical examination in the diagnosis of dental diseases. These methods are routinely applied in dental practice, despite the limitations of two-dimensional (2D) analysis. Figure 3 presents an example of the resulting image, which can be further analysed by specific image processing [35] and edge detection methods.
The computational methods used in this area include common edge detectors (the Roberts, Sobel, Prewitt and Canny methods), which are based upon small convolution kernels (typically 3 × 3) applied to the image to estimate the first derivatives in the horizontal (G_x) and vertical (G_y) directions, followed by evaluation of the edge gradient magnitude

G = \sqrt{G_x^2 + G_y^2}.    (1)
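As an illustration, the following MATLAB sketch computes the Sobel gradient magnitude of Equation (1) and a simple thresholded edge map; it is not the authors' implementation, and the file name and threshold level are assumptions.

% Sobel edge gradient on a radiograph image (Image Processing Toolbox assumed)
I = imread('panoramic.jpg');                 % hypothetical exported radiograph
if ndims(I) == 3, I = rgb2gray(I); end       % convert to grayscale if needed
I  = im2double(I);
hx = [-1 0 1; -2 0 2; -1 0 1];               % 3x3 Sobel kernel, horizontal derivative
hy = hx';                                    % 3x3 Sobel kernel, vertical derivative
Gx = conv2(I, hx, 'same');                   % estimate of the horizontal derivative
Gy = conv2(I, hy, 'same');                   % estimate of the vertical derivative
G  = sqrt(Gx.^2 + Gy.^2);                    % edge gradient magnitude, Equation (1)
E  = G > 0.2*max(G(:));                      % assumed threshold for a binary edge map
imshowpair(I, E, 'montage')                  % original image and detected edges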
Recently, several new computer-assisted approaches [36,37] have been proposed that have better performance efficiency and allow the detection of edges in complex medical images with different artefacts.

2.2. Camera Systems in Orthodontic Measures Evaluation

Dental casts still play an important role in the diagnosis and planning of the treatment of dental disorders. They are the gold standard for evaluating dental arch shape, measuring dental arch distances, and assessing tooth positions. Associated digital models simplify their use because they provide access to patients' records through the computer network, make it easier to share models with other specialists during therapy, and enable accurate measurements for diagnostic setups using selected image processing methods.
While manual measurements of plaster casts can be used, it is also possible to create digital models of them [22] and collect these measurements using general image processing methods applied to images from a single camera, as presented in Figure 4. Manually selected rectangular regions A and B can then be used for the automatic detection of extreme values in the thresholded image. When the camera resolution and the physical distance between image pixels are known, the distance between selected teeth can be evaluated during treatment.
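A minimal MATLAB sketch of this single-camera measurement follows; the file name, threshold, regions, and millimetre-per-pixel calibration are all assumptions, not values from the study.

% Distance between two selected teeth from one frontal cast image
I = imread('cast_front.png');                      % hypothetical cast photograph
if ndims(I) == 3, I = rgb2gray(I); end
BW = im2double(I) > 0.6;                           % thresholded image (assumed level)
A  = [120 200 80 60];  B = [120 420 80 60];        % regions [row col height width] (assumed)
mmPerPixel = 0.12;                                 % from camera calibration (assumed)
pA = regionExtreme(BW, A);  pB = regionExtreme(BW, B);   % extreme points inside A and B
D  = mmPerPixel * norm(pA - pB);                   % inter-tooth distance in millimetres
fprintf('Distance between selected teeth: %.2f mm\n', D);

function p = regionExtreme(BW, R)
    % Topmost foreground pixel inside the rectangle R = [row col height width]
    sub = BW(R(1):R(1)+R(3)-1, R(2):R(2)+R(4)-1);
    [rows, cols] = find(sub);
    [r, k] = min(rows);
    p = [R(1)+r-1, R(2)+cols(k)-1];                % full-image coordinates of the extreme
end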
The multicamera spatial modelling approach can be used to reduce the errors caused by the use of a single camera. Figure 5 presents a system of two cameras located at a selected distance c from each other to follow a specific point C on a plaster cast in 3D space. Cameras R_1 and R_2 together with the object C form a triangle that can be used to determine its spatial coordinates. The angular and spatial resolution of both cameras in the selected coordinate system were evaluated during the calibration stage [22]. Corresponding objects C_1 and C_2 detected by both cameras can then be specified by their coordinates [x_{C_1}, y_{C_1}, z_{C_1}] and [x_{C_2}, y_{C_2}, z_{C_2}], and their distance can be evaluated using the relation

D = \sqrt{(x_{C_2} - x_{C_1})^2 + (y_{C_2} - y_{C_1})^2 + (z_{C_2} - z_{C_1})^2}.    (2)
Given that dental treatment also involves evolution over time, it is often necessary to compare changes in the dental arch parameters. The dental plane evaluated from the spatial coordinates of the tooth tops in separate observations can then be used [22] to define its parameters

a_1 x + a_2 y + a_3 z + a_4 = 0    (3)

using the least squares method in the selected Cartesian coordinate system. The coefficients \{a_i\}_{i=1}^{4} were then used [22] to rotate all of the objects about the individual coordinate axes into the horizontal plane, which allows digital comparison of individual tooth positions.
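The following MATLAB sketch fits the plane of Equation (3) to synthetic tooth-top coordinates by least squares and rotates the points so that the fitted plane becomes horizontal; the data and the single Rodrigues-style rotation are illustrative assumptions rather than the published procedure.

% Least-squares dental plane and rotation into the horizontal plane
x = linspace(-20, 20, 14)';                        % synthetic tooth-top x coordinates (mm)
y = 0.05*x.^2;                                     % synthetic arch-like y coordinates (mm)
z = 0.10*x + 0.05*y + 8 + 0.3*randn(14,1);         % heights roughly on a tilted plane (mm)
P = [x y z];                                       % tooth-top coordinates, one row per tooth

p = [x y ones(numel(x),1)] \ z;                    % fit z = p(1)*x + p(2)*y + p(3)
n = [p(1); p(2); -1];  n = n / norm(n);            % unit normal of the fitted dental plane
k = [0; 0; -1];                                    % target direction (vertical axis)
v = cross(n, k);  s = norm(v);  c = dot(n, k);
V = [0 -v(3) v(2); v(3) 0 -v(1); -v(2) v(1) 0];    % skew-symmetric cross-product matrix
R = eye(3) + V + V^2*(1 - c)/s^2;                  % rotation aligning n with the vertical axis
Ph = (R * P')';                                    % tooth tops rotated into the horizontal plane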
Figure 6a,b present the locations of the centers of individual teeth in 3D space as evaluated by the proposed method [22], together with the plane of their positions in the selected Cartesian coordinate system. The teeth centers specified in the dental planes and rotated into the horizontal position are presented in Figure 6c, together with the dental arch curve. This enables the locations of the teeth to be evaluated during treatment from the spatial coordinates that have been obtained. This transform also allows changes in the dental arch to be detected with a stereo matching algorithm [38,39]. Corresponding pairs of images obtained before and after the dental treatment enable 2D registration to be used to evaluate the results of the operations [40].
The use of camera systems is closely related to appropriate illumination [23] and to the separation of overlapping regions. Figure 7 presents the results of segmentation applied to the image of a dental arch acquired with combined light sources to enhance image contours, processed by moving average and median filtering to reject the high-frequency and impulsive image noise components. Figure 7a presents the results of the circular Hough transform, which allows individual objects to be detected, and Figure 7b shows the results of the segmentation process using the region growing and convex hull methods.
The detailed segmentation of overlapping regions presented in Figure 8 separates individual objects using specific methods based upon the definition of convex regions in the observed image [23]. The subsequent detailed segmentation can use morphological methods, the watershed transform, and region growing from multiple seed points to substantially improve the result of this segmentation process.
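A possible MATLAB sketch of this pre-filtering and detection stage is shown below (Image Processing Toolbox assumed); the file name, filter sizes, radius range, and sensitivity are assumptions, and the circular Hough transform, binarization, and convex hull only approximate the more elaborate pipeline of [23].

% Pre-filtering, circular Hough detection and convex-hull masks for a dental arch image
I = imread('arch.png');                            % hypothetical combined-illumination image
if ndims(I) == 3, I = rgb2gray(I); end
I  = im2double(I);
If = medfilt2(filter2(ones(3)/9, I), [5 5]);       % moving-average then median filtering
[centres, radii] = imfindcircles(If, [15 40], ...
    'ObjectPolarity', 'bright', 'Sensitivity', 0.9);   % circular Hough transform
BW = imbinarize(If);                               % rough foreground mask
CH = bwconvhull(BW, 'objects');                    % convex hull of each connected object
imshow(I); hold on
viscircles(centres, radii, 'Color', 'g');          % detected tooth candidates
visboundaries(CH, 'Color', 'y');                   % convex-hull outlines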

2.3. Scanning Technologies in Spatial Model Construction

The scanning technologies that are used in dentistry [41,42] include triangulation using laser light, parallel confocal imaging, or fringe interferometry based upon projection of light patterns.
Data acquisition systems mostly include sensors located in a handheld camera on a wand that collects surface data inside the mouth cavity, as presented in Figure 2a, and a communication system that transmits these data into a computer for further processing. Either laser or white light is used to illuminate the scanned area and its reflection is acquired by specific sensors. Signal and image processing methods are then used to analyse and visualise the resulting 3D model.
The widely used Trios intra-oral scanning device uses advanced parallel confocal imaging. In principle, this device [43,44,45,46] captures sets of 2D images of the oral cavity and then combines them into a spatial model using optical sectioning technology [41] and geometrical modelling. STL data exported from the Trios intra-oral scanner into the MATLAB and COMSOL computational environments were used to construct and analyze digital models of the lower and upper dental arches. The resolution of the Trios scanner [47] was 41.2 points/mm² and each model was constructed from about 300,000 vertices.
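A minimal MATLAB sketch of the STL import step follows; the file name is hypothetical and the rendering choices are arbitrary.

% Import a Trios STL export and render the triangulated dental-arch surface
TR = stlread('lower_arch.stl');                    % triangulation object (MATLAB R2018b or later)
trisurf(TR.ConnectivityList, TR.Points(:,1), TR.Points(:,2), TR.Points(:,3), ...
        'FaceColor', [0.9 0.9 0.85], 'EdgeColor', 'none');
camlight('headlight'); lighting gouraud; axis equal
fprintf('Model with %d vertices and %d faces\n', ...
        size(TR.Points, 1), size(TR.ConnectivityList, 1));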
Algorithms have also been developed to process the point cloud datasets acquired by the intra-oral scanner, including the selected signal processing methods, imaging technologies [32,41], and mathematical methods to construct the surface areas. Delaunay triangulation and Voronoi diagrams form the basis of the computational geometry used in this area.
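As a simple illustration of surface construction from scattered point cloud data, the sketch below triangulates a synthetic height field over the [x, z] plane with a 2D Delaunay triangulation; the data are artificial and the approach is only a rough stand-in for the scanner's own meshing.

% Surface reconstruction of a synthetic point cloud by Delaunay triangulation
n = 2000;
x = 50*rand(n,1) - 25;                             % scattered x coordinates (mm)
z = 40*rand(n,1);                                  % scattered z coordinates (mm)
y = 4*exp(-((x+18).^2 + (z-10).^2)/6) + ...        % synthetic tooth-like height field
    4*exp(-((x-18).^2 + (z-10).^2)/6);
DT = delaunay(x, z);                               % Delaunay triangulation of the [x,z] plane
trisurf(DT, x, z, y, 'EdgeColor', 'none');         % reconstructed surface
axis equal; camlight; lighting gouraud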
Specific methods of data processing presented in [34] included the transformation of the recorded datasets into the selected 3D coordinate system, forming a matrix A over a chosen grid of the [x, z] plane, as presented in Figure 9. The evaluated data were then de-noised by a median filter, a nonlinear digital filtering technique using a moving window with a mask of 9 by 9 elements, resulting in a new matrix B. A selected contour level density was then applied to detect the specific areas of the individual teeth. These regions of interest were then used to estimate the dental arch parameters.
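The sketch below illustrates this gridding and de-noising step in MATLAB under simplifying assumptions: the file name is hypothetical, the y axis is taken as the occlusal (height) direction, the occlusal surface is treated as a single-valued height map, and the 0.2 mm grid step is arbitrary.

% Grid the scanner vertices over the [x,z] plane, median-filter, and draw contour levels
TR = stlread('lower_arch.stl');                    % assumed STL export
P  = TR.Points;                                    % vertices [x y z]
F  = scatteredInterpolant(P(:,1), P(:,3), P(:,2), 'linear', 'nearest');
[xg, zg] = meshgrid(min(P(:,1)):0.2:max(P(:,1)), ...
                    min(P(:,3)):0.2:max(P(:,3)));  % chosen grid of the [x,z] plane
A = F(xg, zg);                                     % height map on the grid (matrix A)
B = medfilt2(A, [9 9]);                            % de-noised map, 9-by-9 median filter (matrix B)
contourf(xg, zg, B, 12); axis equal                % contour levels outlining individual teeth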
The coordinates of the evaluated tops of each tooth were used [34] for a parabolic approximation of the dental arch by the function

f(x) = c_1 x^2 + c_2 x + c_3    (4)

for x ranging between the minimum and maximum values on the horizontal axis. The coefficients of this function form further measures associated with each dental arch.
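A short MATLAB sketch of this least-squares parabolic fit follows; the tooth-top coordinates are synthetic examples, not measured values.

% Parabolic approximation of the dental arch, Equation (4)
xt = [-22 -18 -13 -8 -3 3 8 13 18 22];                  % assumed tooth-top x coordinates (mm)
zt = 0.035*xt.^2 + 0.4*xt + 2 + 0.3*randn(size(xt));    % assumed arch depths (mm)
c  = polyfit(xt, zt, 2);                                % coefficients [c1 c2 c3], arch measures
xf = linspace(min(xt), max(xt), 200);
plot(xt, zt, 'ko', xf, polyval(c, xf), 'r-')            % tooth tops and fitted arch curve
legend('tooth tops', 'parabolic arch fit')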
Rigid image registration methods applied to the acquired 3D dental arch data can then be used to study the reproducibility of the proposed method in the analysis of dental arches and to follow the time evolution of the observed dental measures during treatment. The results presented in [34] used datasets acquired by an intra-oral scanner and recorded in the STL format, which were analyzed and processed in the MATLAB (ver. R2019a, The MathWorks, Inc., Natick, MA, USA) and COMSOL Multiphysics (ver. 5.3, COMSOL, Inc., Stockholm, Sweden) computational, simulation, and visualization environments.

3. Results

Figure 2b presents a selected STL model of a lower dental arch obtained by an intra-oral scanner; further details are presented in [34]. The contour plot of the evaluated digital model in 3D space is presented in Figure 9a. Figure 9b presents a 3D plot of its selected body (a molar tooth) with the associated contour plot in Figure 9c, which is used for detail visualisation of its shape and to specify the regions of interest.
Figure 10 presents digital models of the left and the right canine teeth. The details of their contours presented in Figure 10c,d were used to precisely locate the tooth-top coordinates x_k, z_k associated with the value

y_k = \max_{(i,j) \in U_k} y(i,j),    (5)

where U_k specifies the rectangular region around the k-th tooth. This semiautomatic specification of the tooth-top coordinates allows the distance between the left and the right canine teeth (for k = L3 and k = R3, respectively) to be estimated by relation (2), which defines a valuable coefficient used to specify the dental arch and to evaluate the treatment. An alternative approach based upon object rotation into the horizontal plane is presented in [22,34].
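The sketch below illustrates Equation (5) and the inter-canine distance of relation (2) on a synthetic gridded height map; the two Gaussian bumps, the region limits, and the grid step are all assumptions used only to make the example self-contained.

% Tooth-top detection inside rectangular regions U_k and the inter-canine distance
[xg, zg] = meshgrid(-25:0.2:25, 0:0.2:40);           % grid of the [x,z] plane (mm)
B = 4*exp(-((xg+18).^2 + (zg-10).^2)/6) + ...        % synthetic left-canine bump
    4*exp(-((xg-18).^2 + (zg-10).^2)/6);             % synthetic right-canine bump

pL = regionTop(B, xg, zg, [-22 -14 6 14]);           % [x y z] of the left canine top
pR = regionTop(B, xg, zg, [ 14  22 6 14]);           % [x y z] of the right canine top
D  = norm(pR - pL);                                  % inter-canine distance, relation (2)
fprintf('Inter-canine distance: %.2f mm\n', D);

function p = regionTop(B, xg, zg, lim)
    % Highest point of B inside the rectangle lim = [xmin xmax zmin zmax], Equation (5)
    mask = xg >= lim(1) & xg <= lim(2) & zg >= lim(3) & zg <= lim(4);
    Bm = B;  Bm(~mask) = -Inf;                       % ignore values outside U_k
    [y, idx] = max(Bm(:));
    p = [xg(idx), y, zg(idx)];
end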
The evaluation of distances between canine teeth was carried out for 20 observations of the same individual in [34]. The difference in the standard deviations obtained for the lower and upper dental arches was explained by the less convenient scanning of the upper dental arch, which was controlled on the monitor of the scanner itself.
Figure 11 presents the process of the rigid registration [48] of the input and base images (Figure 11a,b, respectively) of the same individual. The fixed points are selected from the 3D model and the tops of the individual teeth. The image registration includes scaling, rotation and translation of all data points {x(i,j), z(i,j)} into their new locations {X(i,j), Z(i,j)}, which can be performed by the matrix multiplication

[X(i,j), Z(i,j)] = [x(i,j), z(i,j), 1] \begin{bmatrix} a(3) & b(3) \\ a(2) & b(2) \\ a(1) & b(1) \end{bmatrix}    (6)

with optimized values of the transform matrix.
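The sketch below estimates the 3-by-2 transform matrix of Equation (6) from pairs of corresponding fixed points by least squares and applies it to the input points; the point coordinates and the simulated misalignment are assumptions, and the authors' optimization of the matrix may differ.

% Least-squares estimate of the registration matrix of Equation (6)
fixedBase  = [-18 10; 0 2; 18 10; 10 5];             % assumed tooth-top coordinates [x z] (base)
theta = 5*pi/180;
R = [cos(theta) -sin(theta); sin(theta) cos(theta)]; % simulated rotation of the input image
fixedInput = fixedBase*R'*1.02 + [1.5 -0.8];         % same points scaled, rotated, and shifted

M = [fixedInput ones(4,1)] \ fixedBase;              % 3-by-2 transform matrix, least squares
registered = [fixedInput ones(4,1)] * M;             % registered input points, Equation (6)
disp(max(abs(registered - fixedBase), [], 'all'))    % residual registration error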
The differences between symmetrical values evaluated for the base and registered images by Equation (4), as described in [34], are below 4% in this case. The results of the registration presented in Figure 11c can be further used to evaluate the treatment and to compare dental arch changes over time.
Intraoral scanners provide stereolithographic (STL) files that can be used both for the digital analysis of real objects and for their 3D printing [49,50,51,52]. The accuracy depends on the scanner and printer technology used, as presented in Figure 12.
AR in stomatology is closely related to real data acquired as a plaster cast or as a digital model. In the computational environment, these data can be analysed by numerical methods and computational intelligence tools can be used to evaluate the information that is necessary for diagnostics and the treatment of dental disorders. In this way, the real world can be combined with virtual reality, while the computational environment extends direct observations and allows for a more detailed understanding of real situations and processes.
Digitization has become a regular part of dentistry, orthodontics and maxillofacial surgery. The 3D reconstructive imagery based on cone beam computed tomography (CBCT), intraoral and facial scans, and computer-aided design techniques is more precise and can replace 2D X-rays, photographs, impression taking, models, splints, and the fabrication of prostheses. In addition, the education and training of new practitioners is simpler thanks to virtual patient programs, dental software, testing devices, audiovisual aids and so on. Digitization permits the use of many new and more effective methods in the development of stomatology.

4. Discussion

This paper summarizes selected aspects of the use of AR in dentistry, with a wide range of applications in the analysis of dental bodies, surgical navigation, restorative dentistry, and orthodontics. It also presents the use of selected multidimensional signal processing methods and different computational intelligence tools in this area. Special attention is paid to a brief description of scanning technologies and of an intra-oral 3D scanner used for dental data acquisition and 3D modelling in the selected coordinate system.
The results present the possibility of using a spatial model for semiautomatic detection of specific parts of dental objects and for the evaluation of dental arch parameters during the treatment of dental disorders. They also demonstrate how the specification of regions of interest for individual teeth on the 3D surface plot can be improved by contour lines of the surface with a selected density. The results also show that contour plots allow a more precise analysis of the shape of the individual teeth.
The separation of individual objects of dental arches points to the possibility of dental object description, which will allow their mathematical analysis and evaluation of dental arch parameters. Selected digital models of dental objects can be used for their 3D printing and physical modelling.
Using AR provides dental patients with more freedom and comfort in navigating their treatment plans and choices. It also enables better interaction between patients and dentists when selecting and implementing the optimum treatment.
Future studies related to digitization in stomatology will probably be devoted to the use of digital models for more detailed mathematical evaluation of dental parameters and to the classification of dental disorders using a set of selected dental features. Further studies might be devoted to the 3D printing of individual teeth and to the analysis of appropriate printing materials. At present, it seems that such 3D physical prints could become an alternative to classical dental replacements in the near future.

5. Conclusions

Aesthetic dentistry and implants are very promising AR-based research areas. Using AR, it is possible to move the camera around and see how an implant would look in different positions and locations in the patient's mouth. In this way, AR can be seamlessly assimilated into existing technology today. AR allows information to be overlaid on the real world. When learning dental restoration, a practitioner might point a device at the mouth and overlay it with video or other information. AR would then provide additional information for interacting with the machine in the real world.
Applications of AR in stomatology cover both the digital modelling of spatial objects and the analysis of signals acquired by different sensors, including data obtained by diffuse reflectance spectroscopy. Machine learning and selected classification and probabilistic methods can then be applied for the detailed processing of specific object features.
It seems that the future development of the use of AR in stomatology will be closely related to the development of associated visualization methods and parallel investigation of objects in the real world and computational environment. Sophisticated registration methods will also be applied in the comparative mode to adapt mathematical models to changing external conditions.

Author Contributions

Investigation: A.P.; Conceptualization: T.D.; Data acquisition: M.K.; Methodology: O.V.; Evaluation: H.C.; Supervision: S.S.; Conceptualization: V.M.

Funding

This research was funded (a) by grant projects of the Ministry of Health of the Czech Republic (FN HK 00179906) and of the Charles University in Prague, Czech Republic (PROGRES Q40), (b) by the project PERSONMED, Reg. No. CZ.02.1.010.00.017_0480007441, co-financed by European Regional Development Fund (ERDF) and the governmental budget of the Czech Republic, and (c) by grant VI 20152020040.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Interrante, V.; Hollerer, T.; Lecuyer, A. Virtual and augmented reality. IEEE Comput. Graph. Appl. 2018, 38, 28–30. [Google Scholar] [CrossRef]
  2. Nayyar, A.; Nguyen, G.N. Augmenting Dental Care: A Current Perspective. In Emerging Technologies for Health and Medicine: Virtual Reality, Augmented Reality, Artificial Intelligence, Internet of Things, Robotics, Industry 4.0; Le, D., Le, C., Tromp, J.G., Nguyen, G., Eds.; Wiley-Scrivener Publishing LLC: Beverly, MA, USA, 2018. [Google Scholar]
  3. Hettig, J.; Engelhardt, S.; Hansen, S.; Mistelbauer, G. AR in VR: Assessing surgical augmented reality visualizations in a steerable virtual reality environment. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 1717–1725. [Google Scholar] [CrossRef] [PubMed]
  4. Kavalcová, L.; Škába, R.; Kyncl, M.; Rousková, B.; Procházka, A. The diagnostic value of MRI fistulogram and MRI distal colostogram in patients with anorectal malformations. J. Pediatr. Surg. 2013, 48, 1806–1809. [Google Scholar] [CrossRef] [PubMed]
  5. Procházka, A.; Schatz, M.; Centonze, F.; Kuchyňka, J.; Vyšata, O.; Vališ, M. Extraction of breathing features using MS Kinect for sleep stage detection. Signal Image Video Process. 2016, 10, 1279–1286. [Google Scholar] [CrossRef]
  6. Kwon, H.; Park, Y.; Han, J. Augmented reality in dentistry: A current perspective. Acta Odontol. Scand. 2018, 76, 497–503. [Google Scholar] [CrossRef] [PubMed]
  7. Huang, T.K.; Yang, C.H.; Hsieh, Y.H.; Wang, J.C.; Hung, C.C. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J. Med. Sci. 2018, 34, 243–248. [Google Scholar] [CrossRef] [PubMed]
  8. Llena, C.; Folguera, S.; Forner, L.; Rodriguez-Lozano, F. Implementation of augmented reality in operative dentistry learning. Eur. J. Dent. Educ. 2018, 22, e122–e130. [Google Scholar] [CrossRef]
  9. Wang, J.; Suenaga, H.; Hoshi, K.; Yang, L.; Kobayashi, E.; Sakuma, I.; Liao, H. Augmented reality navigation with automatic marker-free image registration using 3D image overlay for dental surgery. IEEE Trans. Biomed. Eng. 2014, 61, 1295–1304. [Google Scholar] [CrossRef]
  10. Touati, R.; Richert, R.; Millet, C.; Farges, J.; Sailer, I.; Ducret, M. Comparison of two innovative strategies using augmented reality for communication in aesthetic dentistry: A pilot study. J. Healthc. Eng. 2019, 2019, 7019046. [Google Scholar] [CrossRef]
  11. Joda, T.; Gallucci, G.; Wismeijer, D.; Zitzmann, N. Augmented and virtual reality in dental medicine: A systematic review. Comput. Biol. Med. 2019, 108, 93–100. [Google Scholar] [CrossRef]
  12. Sun, T.; Lan, T.; Pan, C.; Lee, H.E. Dental implant navigation system guide the surgery future. Kaohsiung J. Med. Sci. 2018, 34, 56–64. [Google Scholar] [CrossRef] [PubMed]
  13. Ma, L.; Jiang, W.; Zhang, B.; Qu, X.; Ning, G.; Zhang, X.; Liao, H. Augmented reality surgical navigation with accurate CBCT-patient registration for dental implant placement. Med. Biol. Eng. Comput. 2019, 57, 47–57. [Google Scholar] [CrossRef] [PubMed]
  14. Jiang, W.; Ma, L.; Zhang, B.; Fan, Y.; Qu, X.; Zhang, X.; Liao, H. Evaluation of the 3D augmented reality-guided intraoperative positioning of dental implants in edentulous mandibular models. Int. J. Oral Maxillofac. Implant. 2018, 33, 1219–1228. [Google Scholar] [CrossRef] [PubMed]
  15. Zhu, M.; Liu, F.; Chai, G.; Pan, J.; Jiang, T.; Lin, L.; Xin, Y.; Zhang, Y.; Li, Q. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery. Sci. Rep. 2017, 7, 42365. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Kašparová, M.; Gráfová, L.; Dvořák, P.; Dostálová, T.; Procházka, A.; Eliášová, H.; Pruša, J.; Kakawand, S. Possibility of reconstruction of dental plaster cast from 3D digital study models. Biomed. Eng. Online 2013, 12, 49. [Google Scholar] [CrossRef] [PubMed]
  17. Bukhari, S.; Reddy, K.; Reddy, M.; Shah, S. Evaluation of virtual models (3Shape OrthoSystem) in assessing accuracy and duration of model analyses based on the severity of crowding. Saudi J. Dent. Res. 2017, 8, 11–18. [Google Scholar] [CrossRef]
  18. Jacob, H.; Wyatt, G.; Buschang, P. Reliability and validity of intraoral and extraoral scanners. Prog. Orthod. 2015, 16, 38. [Google Scholar] [CrossRef] [PubMed]
  19. Thoma, J.; Havlena, M.; Stalder, S.; Gool, L. Augmented reality for user-friendly intra-oral scanning. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; pp. 97–102. [Google Scholar]
  20. Mangano, F.; Gandolfi, A.; Luongo, G.; Logozzo, S. Intraoral scanners in dentistry: A review of the current literature. BMC Oral Health 2017, 17, 149. [Google Scholar] [CrossRef] [PubMed]
  21. Yadollahi, M.; Procházka, A.; Kašparová, M.; Vyšata, O.; Mařík, V. Separation of overlapping dental arch objects using digital records of illuminated plaster casts. Biomed. Eng. Online 2015, 14, 67. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Procházka, A.; Kašparová, M.; Yadollahi, M.; Vyšata, O.; Grajciarová, L. Multi-camera systems use for dental arch shape measurement. Vis. Comput. 2015, 31, 1501–1509. [Google Scholar] [CrossRef]
  23. Yadollahi, M.; Procházka, A.; Kašparová, M.; Vyšata, O. The use of combined illumination in segmentation of orthodontic bodies. Signal Image Video Process. 2015, 9, 243–250. [Google Scholar] [CrossRef]
  24. Hošťálková, E.; Vyšata, O.; Procházka, A. Multi-dimensional biomedical image de-noising using Haar transform. In Proceedings of the 15th International Conference on Digital Signal Processing, Wales, UK, 1–4 July 2007; pp. 175–179. [Google Scholar]
  25. Jerhotová, E.; Švihlík, J.; Procházka, A. Biomedical image volumes denoising via the wavelet transform. In Applied Biomedical Engineering; INTECH: Rijeka, Croatia, 2011; pp. 435–458. [Google Scholar]
  26. Suenaga, H.; Tran, H.; Liao, H.; Masamune, K.; Dohi, T.; Hoshi, K.; Takato, T. Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: A pilot study. BMC Med. Imaging 2015, 15, 51. [Google Scholar] [CrossRef] [PubMed]
  27. Koopaie, M.; Kolahdouz, S. Three-dimensional simulation of human teeth and its application in dental education and research. Med. J. Islam. Repub. Iran 2016, 30, 461. [Google Scholar] [PubMed]
  28. Kašparová, M.; Procházka, A.; Gráfová, L.; Yadollahi, M.; Vyšata, O.; Dostálová, T. Evaluation of dental morphometrics during the orthodontic treatment. Biomed. Eng. Online 2014, 13, 68. [Google Scholar] [CrossRef] [PubMed]
  29. Mangano, F.; Veronesi, G.; Hauschild, U.; Mijiritsky, E.; Mangano, C. Trueness and Precision of Four Intraoral Scanners in Oral Implantology: A Comparative in Vitro Study. PLoS ONE 2016, 11, e0163107. [Google Scholar] [CrossRef] [PubMed]
  30. Lee, K. Comparison of two intraoral scanners based on three-dimensional surface analysis. Prog. Orthod. 2018, 19, 6. [Google Scholar] [CrossRef] [PubMed]
  31. Richert, R.; Goujat, A.; Venet, L.; Viguie, G.; Viennot, S.; Robinson, P.; Farges, J.; Fages, M.; Ducret, M. Intraoral scanner technologies: A review to make a successful impression. J. Healthc. Eng. 2017, 2017, 8427595. [Google Scholar] [CrossRef] [PubMed]
  32. Barone, S.; Paoli, A.; Razionale, A. Creation of 3D multi-body orthodontic models by using independent imaging sensors. Sensors 2013, 13, 2033–2050. [Google Scholar] [CrossRef]
  33. Logozzo, S.; Zanetti, E.; Franceschini, G.; Kilpela, A.; Makinen, A. Recent advances in dental optics—Part I: 3D intraoral scanners for restorative dentistry. Opt. Lasers Eng. 2014, 54, 203–221. [Google Scholar] [CrossRef]
  34. Kašparová, M.; Halamová, S.; Dostálová, T.; Procházka, A. Intra-oral 3D scanning for the digital evaluation of dental arch parameters. Appl. Sci. 2018, 8, 1838. [Google Scholar] [CrossRef]
  35. Song, T.; Yang, C.; Dianat, O.; Azimi, E. Endodontic guided treatment using augmented reality on a head-mounted display system. Healthc. Technol. Lett. 2018, 5, 201–207. [Google Scholar] [CrossRef]
  36. Gráfová, L.; Kašparová, M.; Kakawand, S.; Procházka, A.; Dostálová, T. Study of edge detection task in dental panoramic X-ray images. Dentomaxillofac. Radiol. 2013, 42, 20120391. [Google Scholar] [CrossRef] [PubMed]
  37. Procházka, A.; Vyšata, O.; Kašparová, M.; Dostálová, T. Wavelet transform in biomedical image segmentation and classification. In Proceedings of the 2011 7th International Symposium on Image and Signal Processing and Analysis (ISPA), Dubrovnik, Croatia, 4–6 September 2011; pp. 1–4. [Google Scholar]
  38. Chang, J.S.; Shih, A.C.C.; Liao, H.Y.M.; Fang, W.H. Using Normal Vectors for Stereo Correspondence Construction. In Knowledge-Based Intelligent Information and Engineering Systems; Khosla, R., Howlett, R.J., Jain, L.C., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3681, pp. 582–588. [Google Scholar]
  39. Gao, K.; Chen, H.; Zhao, Y.; Geng, Y.; Wang, G. Stereo matching algorithm based on illumination normal similarity and adaptive support weight. Opt. Eng. 2013, 52, 027201. [Google Scholar] [CrossRef]
  40. Ramalingam, S.; Taguchi, Y. A theory of minimal 3D point to 3D plane registration and its generalization. Int. J. Comput. Vis. 2013, 102, 73–90. [Google Scholar] [CrossRef]
  41. Kravitz, N.; Groth, C.; Jones, P.; Graham, J.; Redmond, W. Intraoral digital scanners. J. Clin. Orthod. 2014, 48, 337–347. [Google Scholar]
  42. Martin, C.; Chalmers, E.; McIntyre, G.; Cochrane, H.; Mossey, P. Orthodontic scanners: What’s available? J. Orthod. 2015, 42, 136–143. [Google Scholar] [CrossRef] [PubMed]
  43. Hong-Seok, P.; Chintal, S. Development of high speed and high accuracy 3D dental intra oral scanners. Procedia Eng. 2015, 100, 1174–1181. [Google Scholar] [CrossRef]
  44. Ahn, J.; Park, A.; Kim, J.; Lee, B.; Eom, J. Development of three-dimensional dental scanning apparatus using structured illumination. Sensors 2017, 17, 1634. [Google Scholar] [CrossRef]
  45. Logozzo, S.; Franceschini, A.; Kilpela, A.; Caponi, M.; Governi, L.; Blois, L. A comparative analysis of intraoral 3d digital scanners for restorative dentistry. Int. J. Med. Technol. 2011, 5, 1–18. [Google Scholar]
  46. Nedelcua, R.; Olssonb, P.; Nyströmb, I.; Rydénc, J.; Thora, A. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method. J. Dent. 2018, 69, 110–118. [Google Scholar] [CrossRef]
  47. Medina-Sotomayor, P.; Pascual-Moscardo, A.; Camps, I. Relationship between resolution and accuracy of four intraoral scanners in complete-arch impressions. J. Clin. Exp. Dent. 2018, 10, e361–e366. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Goshtasby, A. Image Registration: Principles, Tools and Methods; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
  49. Dostálová, T.; Kašparová, M.; Kříž, P.; Halamová, V.; Jelínek, M.; Bradna, P.; Mendřický, J. Intraoral scanner and stereographic 3D print in dentistry-quality and accuracy of model-new laser application in clinical practice. Laser Phys. 2018, 28, 125602. [Google Scholar] [CrossRef]
  50. Prasad, S.; Kader, N.; Sujatha, G.; Raj, T.; Patil, S. 3D printing in dentistry. J. 3D Print. Med. 2018, 2, 89–91. [Google Scholar] [CrossRef]
  51. Zaharia, C.; Gabor, A.; Gavrilovici, A.; Stan, A.; Idorasi, A.; Sinescu, C.; Negrutiu, M. Digital dentistry: 3D printing applications. J. Interdiscip. Med. 2017, 2, 50–53. [Google Scholar] [CrossRef]
  52. Wesemann, C.; Muallah, J.; Mah, J.; Bumann, A. Accuracy and efficiency of full-arch digitalization and 3D printing: A comparison between desktop model scanners, an intraoral scanner, a CBCT model scan, and stereolithographic 3D printing. Quintessence Int. 2017, 48, 41–50. [Google Scholar] [PubMed]
Figure 1. Plaster casts models and the orthodontic measurement of the distance between specific bodies using (a) manual methods and (b) digital images and their analysis.
Figure 2. Digital spatial orthodontic models construction presenting (a) the intra-oral scanning by a handheld camera (Trios/3shape) for data acquisition and (b) resulting stereolithographic digital model.
Figure 3. A dental orthopantomogram and a selected result of edge detection of desired structures inside the selected region of interest.
Figure 4. Evaluation of distances between selected teeth presenting (a) the front image, (b) the thresholded gradient image and selected regions A and B for their extrema detection, and (c) the image contour plot and evaluation of the distance between selected front teeth.
Figure 5. The determination of the distance between specified objects in space: (a) its principle using two spatially distributed cameras A and B, and (b,c) its use for plaster cast images.
Figure 6. The spatial location of teeth centers detected by the two-camera system and used for evaluation of the corresponding teeth distances, together with their dental planes before and after rotation into the horizontal position for examination (a) before the treatment, (b) after the treatment, and (c) for the location of teeth centers rotated to the horizontal plane and the dental arch evaluated by the mean square method.
Figure 7. The segmentation process presenting (a) the original image and the circular Hough transform and (b) the segmentation using the region growing method and the convex hull.
Figure 8. Principle of the separation of overlapping regions presenting (a) the original image acquired with a combined illumination, (b) area selection, (c) region growing segmentation, (d) the convex region estimation, and (e) the separation curve construction.
Figure 9. A 3D analysis of the lower dental arch presenting (a) the contour plot of the selected STL model, (b) the 3D plot of the molar tooth, and (c) the corresponding contour plot, which allows detection of specific surface areas.
Figure 10. Digital model of (a,c) left, and (b,d) the right canine teeth used to evaluate the dental arch parameters.
Figure 11. Processing stages of the rigid registration of the lower dental arches presenting: (a) the input plot image; (b) the base contour plot image of the same individual recorded at a different instant with a selection of corresponding fixed points in both images specified by the location of contour levels; and (c) the registered input image, which allows a comparison with the base image.
Figure 12. A 3D model of a dental arch constructed by a 3D printer using different materials and printing accuracies.
