Review

Recent Advances in Tracking Devices for Biomedical Ultrasound Imaging Applications

1 School of Biomedical Engineering, ShanghaiTech University, Shanghai 201210, China
2 Department of Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA
* Authors to whom correspondence should be addressed.
Micromachines 2022, 13(11), 1855; https://doi.org/10.3390/mi13111855
Submission received: 8 October 2022 / Revised: 26 October 2022 / Accepted: 27 October 2022 / Published: 29 October 2022

Abstract

With the rapid advancement of tracking technologies, the applications of tracking systems in ultrasound imaging have expanded across a wide range of fields. In this review article, we discuss the basic tracking principles, system components, performance analyses, as well as the main sources of error for popular tracking technologies that are utilized in ultrasound imaging. In light of the growing demand for object tracking, this article explores both the potential and challenges associated with different tracking technologies applied to various ultrasound imaging applications, including freehand 3D ultrasound imaging, ultrasound image fusion, and ultrasound-guided intervention and treatment. Recent developments in tracking technology have increased the accuracy and intuitiveness of ultrasound imaging and navigation with less reliance on operator skills, thereby benefiting medical diagnosis and treatment. Although commercially available tracking systems are capable of achieving sub-millimeter resolution for positional tracking and sub-degree resolution for orientational tracking, such systems are subject to a number of disadvantages, including high costs and time-consuming calibration procedures. While some emerging tracking technologies are still in the research stage, their potential has been demonstrated in terms of compactness, light weight, and easy integration with existing standard or portable ultrasound machines.

1. Introduction

An object tracking system locates a moving object (or multiple objects) through time and space [1]. The main aim of a tracking system is to determine an object's position and orientation in space over time, characterized by precision, accuracy, working range, and degrees of freedom (DOF), depending on the system and application [2]. With the rapid development of computational and sensing technologies, tracking systems are now widely utilized in various fields, including robotics [3,4], military [5,6], medicine [7,8] and sports [9,10]. In the medical field, tracking the rotation and translation of medical instruments or patients plays a substantial role in many important applications, such as diagnostic imaging [11], image-guided navigation systems for intervention and therapy [12,13], as well as rehabilitation medicine [14].
Ultrasound (US) imaging is a well-established imaging modality that has been widely utilized in clinical practice for diagnosing diseases or guiding decision-making in therapy [15]. Compared with other medical imaging modalities, such as computed tomography (CT) and magnetic resonance imaging (MRI), US offers the major advantages of real-time imaging, no radiation exposure, low cost, and ease of use [16]. Despite these advantages, ultrasonography is considered to be highly operator-dependent [17]. Manually guiding the US probe to obtain reproducible image acquisition is challenging. Moreover, sonographers require rich clinical experience to correctly interpret the information acquired during scanning. Besides the operator dependency, which brings a high risk of interpretive error influencing diagnosis and therapy results, the restricted field of view (FOV) of the US probe poses challenges for image visualization and feature localization, thus limiting diagnostic or therapeutic accuracy. The integration of object tracking systems with US imaging can resolve the above-mentioned limitations. By integrating tracking devices with US probes, an extended FOV can be obtained, resulting in a less operator-dependent scanning procedure and more accurate results. Over the past decade, there has been significant growth in studies on the integration of various tracking systems with US imaging systems for biomedical and healthcare applications. Applying emerging tracking systems to biomedical US imaging has improved the accuracy and intuitiveness of US imaging and navigation with less reliance on operator skills, thereby benefiting medical diagnosis and therapy.
The purpose of this article is to provide a literature review of the various tracking systems for biomedical US imaging applications, as illustrated in Figure 1. The rest of the article is organized as follows: in Section 2, the principles of different tracking techniques, including optical tracking, electromagnetic tracking, mechanical tracking, acoustic tracking, and inertial tracking, are summarized. Typical tracking systems and their technical performances, such as accuracy and latency, are provided in Section 3. Section 4 details the advancement of different tracking systems for US imaging applications, including freehand 3D US imaging, US image fusion, US-guided diagnosis, and US-guided therapy. Finally, a summary and concluding remarks are presented in Section 5.

2. Physical Principles of Tracking Technologies

The latest advancements in tracking technologies have enabled conventional medical devices to be equipped with more advanced functions. In biomedical US imaging, object tracking technologies are key to locating US probes and other medical tools for precise operation and intuitive visualization. The underlying physical principles behind the most common tracking technologies are reviewed in this section.

2.1. Optical Tracking

An optical tracking system is among the most precise 6 DOF tracking technologies, achieving sub-millimeter accuracy. Multiple spatially synchronized cameras track markers attached to the target within a designed space. There are two types of markers: active and passive [18]. Active markers use infrared light emitting diodes (LEDs) to emit invisible light that can be detected by the cameras. A passive marker is covered with a retro-reflective surface that reflects incoming infrared light back to the camera. Usually, three or more markers are arranged asymmetrically on a target object. The 6 DOF position and orientation of the object are determined by triangulation [19].
Similar to the human vision system, optical tracking requires at least two cameras fixed at a known distance from each other; adding cameras improves tracking accuracy and robustness. In Figure 2, a trinocular vision system is illustrated as an example of triangulation [20]. A point $P$ is observed simultaneously by three lenses, generating three projection points on the focal plane. A right-handed, Y-up coordinate system is assigned to each lens, with coordinate origins at $O_1$, $O_2$, and $O_3$. The reference coordinate $X_c Y_c Z_c$ coincides with the coordinate $X_2 Y_2 Z_2$ of the middle lens. A baseline $l_{ij}$ is defined as the distance between any two lenses. The principal axes of the three fixed lenses are parallel to each other and perpendicular to the baselines.
Depth is perceived with two lenses $i$ and $j$, where $i, j \in \{1, 2, 3\}$ and $i \neq j$. With the known focal length $f$ and the disparity $x_{p_i} - x_{p_j}$ representing the offset between the two projections in the $XOZ$ plane, the depth $Z$ can be derived as

$$Z = \frac{f \, l_{ij}}{x_{p_i} - x_{p_j}}$$

Furthermore, the other two coordinates of $P(X, Y, Z)$ can thus be calculated as

$$X = \frac{x_{p_i} Z}{f}, \quad Y = \frac{y_{p_j} Z}{f}$$
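To make the triangulation concrete, the following Python sketch recovers a point's 3D coordinates from a rectified lens pair using the two equations above; the function name and arguments are illustrative, and a practical system would also correct for lens distortion and fuse estimates from multiple lens pairs.

```python
import numpy as np

def triangulate_point(x_pi, x_pj, y_pj, f, l_ij):
    """Recover (X, Y, Z) from two projections of the same point.

    x_pi, x_pj : horizontal projection coordinates on lenses i and j
    y_pj       : vertical projection coordinate on lens j
    f          : focal length shared by the rectified lenses
    l_ij       : baseline distance between lenses i and j
    """
    disparity = x_pi - x_pj            # offset between the two projections
    Z = f * l_ij / disparity           # depth from disparity
    X = x_pi * Z / f                   # back-projected horizontal coordinate
    Y = y_pj * Z / f                   # back-projected vertical coordinate
    return np.array([X, Y, Z])
```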
With the known positions of all markers, the orientation of the target can also be determined. The 6 DOF pose information is delivered in the form of a transformation matrix $T_M^C$, with the subscript and superscript denoting the marker set coordinate ($M$) and the camera coordinate ($C$), respectively.
$$T_M^C = \begin{bmatrix} R & p \\ 0 & 1 \end{bmatrix}$$

where $p = (X_M, Y_M, Z_M)^T$ is the offset between the origins of coordinates $M$ and $C$, and $R$ is a $3 \times 3$ rotation matrix of the form

$$R = \begin{bmatrix} \cos\varphi\cos\theta & \cos\varphi\sin\theta\sin\psi - \sin\varphi\cos\psi & \cos\varphi\sin\theta\cos\psi + \sin\varphi\sin\psi \\ \sin\varphi\cos\theta & \sin\varphi\sin\theta\sin\psi + \cos\varphi\cos\psi & \sin\varphi\sin\theta\cos\psi - \cos\varphi\sin\psi \\ -\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi \end{bmatrix}$$
From the above matrix, the orientations $\varphi$ (yaw), $\theta$ (pitch), and $\psi$ (roll) can thus be solved as follows:

$$\varphi = \arctan\frac{R_{21}}{R_{11}}, \quad \theta = -\arcsin R_{31}, \quad \psi = \arctan\frac{R_{32}}{R_{33}}$$
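As a minimal sketch of this last step, the Euler angles can be extracted from a rotation matrix as follows; `arctan2` is used instead of a plain arctangent so each angle lands in the correct quadrant.

```python
import numpy as np

def euler_from_rotation(R):
    """Extract yaw, pitch, and roll from a 3x3 rotation matrix in the
    Z-Y-X (yaw-pitch-roll) convention used above."""
    yaw = np.arctan2(R[1, 0], R[0, 0])     # phi   = arctan(R21 / R11)
    pitch = -np.arcsin(R[2, 0])            # theta = -arcsin(R31)
    roll = np.arctan2(R[2, 1], R[2, 2])    # psi   = arctan(R32 / R33)
    return yaw, pitch, roll
```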

2.2. Electromagnetic Tracking

Tracking systems using electromagnetic signals can also provide sub-millimeter accuracy in dynamic, real-time 6 DOF tracking. Their advantages include being lightweight and free of line-of-sight constraints. In biomedical engineering, they are commonly used for the navigation of medical tools.
Electromagnetic tracking systems consist of four modules: transmission circuits, receiving circuits, digital signal processing units, and microcontrollers. Based on Faraday's law, electromagnetic tracking systems estimate the position and orientation of an object from the voltages induced in a receiver sensor coupled to an alternating magnetic field [21,22].
Figure 3 illustrates the involved coordinate systems. The reference coordinate system is denoted as $X_0 Y_0 Z_0$, which is fixed at the emission coil. $X_s Y_s Z_s$ is the coordinate system fixed at the receiver sensor. The location of its origin $O_1(X, Y, Z)$ with respect to $X_0 Y_0 Z_0$ can also be denoted as $O_1(R, \alpha, \beta)$ in spherical coordinates. Additionally, the orientation is represented by the Euler angles $\varphi$, $\theta$, and $\psi$.
Assuming the excitation current $i(t) = I_i \sin(\omega t + \phi)$ and the transmitter parameter along each direction defined as $C_i = \frac{\mu N_i I_i S_i}{4\pi}$, where $i = x, y, z$, $N_i$ is the number of turns in the coil, and $S_i$ is the area of the coil, the excitation signal can be defined as

$$f_0 = C = \begin{bmatrix} C_x & 0 & 0 \\ 0 & C_y & 0 \\ 0 & 0 & C_z \end{bmatrix}$$

Accordingly, the receiver parameter can be written as

$$K = \begin{bmatrix} K_x & 0 & 0 \\ 0 & K_y & 0 \\ 0 & 0 & K_z \end{bmatrix}$$

where $K_i = \omega n_i g_i s_i$, with $\omega$ representing the radian frequency of the source excitation signal, $n_i$ and $s_i$ denoting the number of turns and the area of the receiver coil, and $g_i$ indicating the system gain. According to Faraday's law of induction, the amplitude of the voltage from the receiver coil is expressed as

$$S_{ij} = \omega K_j C_i h(R, \alpha, \beta, \varphi, \theta, \psi) = \omega K_j B_{ij}$$

With the position and orientation of the receiver fixed, the value of $h(R, \alpha, \beta, \varphi, \theta, \psi)$ is determined, and $B_{ij}$ is the amplitude of the magnetic field produced at that location.
From Equation (8), by defining the final sensor output as

$$f_s = \begin{bmatrix} s_{xx} & s_{yx} & s_{zx} \\ s_{xy} & s_{yy} & s_{zy} \\ s_{xz} & s_{yz} & s_{zz} \end{bmatrix}$$

the magnetic field expressed in $X_s Y_s Z_s$ is

$$B = K^{-1} f_s = \begin{bmatrix} s_{xx}/K_x & s_{yx}/K_x & s_{zx}/K_x \\ s_{xy}/K_y & s_{yy}/K_y & s_{zy}/K_y \\ s_{xz}/K_z & s_{yz}/K_z & s_{zz}/K_z \end{bmatrix}$$

When the equivalent transmitter coil along each direction is excited, the squared amplitude of the magnetic field $P$ can be expressed as

$$P = \begin{bmatrix} (s_{xx}/K_x)^2 + (s_{xy}/K_y)^2 + (s_{xz}/K_z)^2 \\ (s_{yx}/K_x)^2 + (s_{yy}/K_y)^2 + (s_{yz}/K_z)^2 \\ (s_{zx}/K_x)^2 + (s_{zy}/K_y)^2 + (s_{zz}/K_z)^2 \end{bmatrix} = \frac{4}{r^6} \begin{bmatrix} C_x^2 \left(x^2 + \frac{1}{4}y^2 + \frac{1}{4}z^2\right) \\ C_y^2 \left(\frac{1}{4}x^2 + y^2 + \frac{1}{4}z^2\right) \\ C_z^2 \left(\frac{1}{4}x^2 + \frac{1}{4}y^2 + z^2\right) \end{bmatrix}$$
Summing all three entries cancels the dependence on the unknown position $(x, y, z)$, since $x^2 + y^2 + z^2 = r^2$; the only remaining unknown, $r$, can then be deduced.
With two rotation matrices $T(\alpha)$ and $T(\beta)$ defined as

$$T(\alpha) = \begin{bmatrix} \cos\alpha & \sin\alpha & 0 \\ -\sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad T(\beta) = \begin{bmatrix} \cos\beta & 0 & -\sin\beta \\ 0 & 1 & 0 \\ \sin\beta & 0 & \cos\beta \end{bmatrix}$$

a key matrix $F$ can be defined as

$$F = r^6 \left(f_0^T\right)^{-1} f_s^T K^{-1} K^{-1} f_s f_0^{-1} = \begin{bmatrix} 1 + 3\cos^2\alpha\cos^2\beta & 3\sin\alpha\cos\alpha\cos^2\beta & 3\cos\alpha\sin\beta\cos\beta \\ 3\sin\alpha\cos\alpha\cos^2\beta & 1 + 3\sin^2\alpha\cos^2\beta & 3\sin\alpha\sin\beta\cos\beta \\ 3\cos\alpha\sin\beta\cos\beta & 3\sin\alpha\sin\beta\cos\beta & 1 + 3\sin^2\beta \end{bmatrix}$$

With $s_{ij}$, $\omega$, $C_i$, and $K_j$ as known parameters, the unknowns $\alpha$ and $\beta$ can be solved as

$$\alpha = \arctan\frac{F_{23}}{F_{13}}, \quad \beta = \arcsin\sqrt{\frac{F_{33} - 1}{3}}$$
From the spherical coordinates, the position of the target $P$ can be written as

$$x = r\cos\beta\cos\alpha, \quad y = r\cos\beta\sin\alpha, \quad z = r\sin\beta$$
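The position solve can be summarized in a short Python sketch; the helper name is hypothetical, and it assumes equal transmitter parameters ($C_x = C_y = C_z = C$) so that summing the entries of $P$ gives $6C^2/r^4$, as well as an elevation $\beta$ in the upper hemisphere (the arcsine leaves a sign ambiguity).

```python
import numpy as np

def em_position(P, F, C):
    """Estimate the receiver position from the EM tracking equations.

    P : length-3 vector of squared field amplitudes (Equation (11))
    F : 3x3 matrix of Equation (13)
    C : common transmitter parameter, assuming C_x = C_y = C_z
    """
    # sum(P) = 6 C^2 / r^4  =>  r = (6 C^2 / sum(P))^(1/4)
    r = (6.0 * C**2 / np.sum(P)) ** 0.25
    alpha = np.arctan2(F[1, 2], F[0, 2])               # azimuth from F23 / F13
    beta = np.arcsin(np.sqrt((F[2, 2] - 1.0) / 3.0))   # elevation from F33
    return np.array([r * np.cos(beta) * np.cos(alpha),
                     r * np.cos(beta) * np.sin(alpha),
                     r * np.sin(beta)])
```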
Substituting

$$T(\varphi) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\varphi & \sin\varphi \\ 0 & -\sin\varphi & \cos\varphi \end{bmatrix}, \quad T(\theta) = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}, \quad T(\psi) = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

into

$$f_s = K B = \frac{2}{r^3} K\, T(\varphi) T(\theta) T(\psi) T(\alpha) T(\beta)\, S\, T(\beta) T(\alpha)\, f_0$$

the matrix $T$ is defined as

$$T = T(\varphi) T(\theta) T(\psi) = \begin{bmatrix} \cos\varphi\cos\psi & \cos\varphi\sin\psi & -\sin\varphi \\ \sin\theta\sin\varphi\cos\psi - \cos\theta\sin\psi & \sin\theta\sin\varphi\sin\psi + \cos\theta\cos\psi & \sin\theta\cos\varphi \\ \cos\theta\sin\varphi\cos\psi + \sin\theta\sin\psi & \cos\theta\sin\varphi\sin\psi - \sin\theta\cos\psi & \cos\theta\cos\varphi \end{bmatrix}$$

Therefore, the target's orientation can be solved as

$$\varphi = \arctan\frac{-T_{13}}{\sqrt{T_{23}^2 + T_{33}^2}}, \quad \theta = \arctan\frac{T_{23}}{T_{33}}, \quad \psi = \arctan\frac{T_{12}}{T_{11}}$$

2.3. Mechanical Tracking

Robotic tracking systems use articulated robotic arms to manipulate the target attached to the end effector. Typically, industrial robots are composed of a number of joints and links. Joint movement is continuously detected by potentiometers and encoders installed on each joint. The real-time position and orientation of the end effector can be determined by calculating homogeneous transformations from the collected joint readings. In clinical practice, the operator can either control the movement of the robot to a certain location with the desired orientation, or specify the destination and let the robot solve the path using inverse kinematics based on the spatial information of the destination and the architecture of the robot. Following the Denavit-Hartenberg notation, forward kinematics is applied below to illustrate how the 6 DOF pose information is transformed between adjacent joints and links, as illustrated in Figure 4 [23].
The $4 \times 4$ homogeneous transformation matrices for each step are shown below:

$$A_1 = \begin{bmatrix} \cos\theta_i & -\sin\theta_i & 0 & 0 \\ \sin\theta_i & \cos\theta_i & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad A_2 = \begin{bmatrix} 1 & 0 & 0 & a_i \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad A_3 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad A_4 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\alpha_i & -\sin\alpha_i & 0 \\ 0 & \sin\alpha_i & \cos\alpha_i & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

$$\begin{bmatrix} x_{i-1} \\ y_{i-1} \\ z_{i-1} \\ 1 \end{bmatrix} = A_1 A_2 A_3 A_4 \begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix}$$
The transformation matrix $T$ can also be represented in terms of the position $(p_x, p_y, p_z)$ in the reference coordinate system and the orientation $(\varphi, \theta, \psi)$ in the yaw-pitch-roll representation:

$$T = \begin{bmatrix} \cos\varphi\cos\theta & \cos\varphi\sin\theta\sin\psi - \sin\varphi\cos\psi & \cos\varphi\sin\theta\cos\psi + \sin\varphi\sin\psi & p_x \\ \sin\varphi\cos\theta & \sin\varphi\sin\theta\sin\psi + \cos\varphi\cos\psi & \sin\varphi\sin\theta\cos\psi - \cos\varphi\sin\psi & p_y \\ -\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
Based on Equations (21) and (22), the 6 DOF pose information can be solved as

$$p_x = a_i\cos\theta_i, \quad p_y = a_i\sin\theta_i, \quad p_z = d_i$$

$$\theta = \arcsin(-T_{31}), \quad \varphi = \arccos\frac{T_{11}}{\cos\theta}, \quad \psi = \arccos\frac{T_{33}}{\cos\theta}$$
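The forward kinematics chain described above maps directly to a few lines of code. The following Python sketch composes the per-joint Denavit-Hartenberg transforms into an end-effector pose; parameter names follow Equation (21), and the function names are illustrative.

```python
import numpy as np

def dh_transform(theta, a, d, alpha):
    """Single-link homogeneous transform, the product A1*A2*A3*A4."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Chain the link transforms from base to end effector.

    dh_params : list of (theta, a, d, alpha) tuples, one per joint,
                with theta read from the joint encoders.
    """
    T = np.eye(4)
    for theta, a, d, alpha in dh_params:
        T = T @ dh_transform(theta, a, d, alpha)
    return T  # 4x4 pose of the end effector in the base frame
```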

2.4. Acoustic Tracking

An acoustic tracking system provides 3 DOF positional tracking. To determine the spatial location of the target object, an ultrasonic transmitter transmits a carrier signal that is received by multiple receivers operating at the same frequency. Specifically, by estimating the actual travel/arrival times (TOF/TOA) or the time differences of travel/arrival (TDOF/TDOA), the 3D coordinates of the object $(x, y, z)$ can be determined to centimeter accuracy with receivers fixed at known locations, as shown in Figure 5. A TDOF/TDOA algorithm is more practical and accurate than a TOF/TOA algorithm since it circumvents the synchronization issue between the transmitter and receiver. A limitation of acoustic tracking is that its accuracy is affected by temperature and air turbulence in the environment [24]. This problem can be addressed by including the speed of sound $c$ as an unknown parameter in the calculation [25].
The predetermined geometry of the receivers is denoted as $(x_i, y_i, z_i)$, where $i$ represents the $i$th receiver, $i \in \{1, 2, 3, 4, 5, 6\}$. The reference distance between the transmitter and receiver $R_1$ is denoted as $d$, and $\Delta T_{1j}$ indicates the TDOF between receivers $R_1$ and $R_j$, where $j \in \{2, 3, 4, 5, 6\}$. The unknowns then satisfy the linear system

$$\begin{bmatrix} 2x_1 - 2x_2 & 2y_1 - 2y_2 & 2z_1 - 2z_2 & -2\Delta T_{12} & -\Delta T_{12}^2 \\ 2x_1 - 2x_3 & 2y_1 - 2y_3 & 2z_1 - 2z_3 & -2\Delta T_{13} & -\Delta T_{13}^2 \\ 2x_1 - 2x_4 & 2y_1 - 2y_4 & 2z_1 - 2z_4 & -2\Delta T_{14} & -\Delta T_{14}^2 \\ 2x_1 - 2x_5 & 2y_1 - 2y_5 & 2z_1 - 2z_5 & -2\Delta T_{15} & -\Delta T_{15}^2 \\ 2x_1 - 2x_6 & 2y_1 - 2y_6 & 2z_1 - 2z_6 & -2\Delta T_{16} & -\Delta T_{16}^2 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ cd \\ c^2 \end{bmatrix} = \begin{bmatrix} x_1^2 + y_1^2 + z_1^2 - x_2^2 - y_2^2 - z_2^2 \\ x_1^2 + y_1^2 + z_1^2 - x_3^2 - y_3^2 - z_3^2 \\ x_1^2 + y_1^2 + z_1^2 - x_4^2 - y_4^2 - z_4^2 \\ x_1^2 + y_1^2 + z_1^2 - x_5^2 - y_5^2 - z_5^2 \\ x_1^2 + y_1^2 + z_1^2 - x_6^2 - y_6^2 - z_6^2 \end{bmatrix}$$
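Since the system above is linear in the unknowns $(x, y, z, cd, c^2)$, it can be solved directly by least squares. A minimal Python sketch, assuming six receivers and TDOF measurements referenced to $R_1$, is given below; the function name is illustrative.

```python
import numpy as np

def tdoa_position(receivers, dt):
    """Solve the TDOF linear system for the transmitter position.

    receivers : (6, 3) array of known receiver positions
    dt        : length-5 array of TDOFs [dT12, dT13, dT14, dT15, dT16]
    """
    r1 = receivers[0]
    A, b = [], []
    for rj, dT in zip(receivers[1:], dt):
        # unknown vector: [x, y, z, c*d, c^2]
        A.append([2 * (r1[0] - rj[0]), 2 * (r1[1] - rj[1]),
                  2 * (r1[2] - rj[2]), -2 * dT, -dT**2])
        b.append(np.dot(r1, r1) - np.dot(rj, rj))
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol[:3]  # estimated (x, y, z); sol[3:] recover c and d
```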
Occlusion can also affect the accuracy of an acoustic tracking system, so the receiver configuration should be taken into serious consideration when implementing such a system for biomedical US imaging applications [26]. In addition to reducing occlusion, an optimal configuration also contributes to improved tracking performance. As an acoustic tracking system is not able to determine a target's orientation, other tracking systems, such as inertial tracking, are frequently required to complement it [27].

2.5. Inertial Tracking

Inertial tracking systems are based on an inertial measurement unit (IMU), which is a small, lightweight, cost-effective sensor enabled by microelectromechanical systems (MEMS) (Figure 6) [28]. An IMU sensor with 9 axes that integrates accelerometers, gyroscopes, and magnetometers is commonly used for 6 DOF object tracking. Accelerometers measure the target’s acceleration. The angular velocity of a target is measured by a gyroscope. Additionally, a magnetometer detects the magnetic field strength at the target’s location. With sensor fusion of the raw measurements, an IMU sensor is able to obtain more accurate readings [29]. As a result, after calibration and compensation for drifts and errors, the position and orientation of the target can be determined [27].
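As a simple illustration of such sensor fusion, the complementary filter below blends the drift-prone gyroscope integral with the noisy but drift-free tilt angle inferred from the accelerometer's gravity reading; this is a generic textbook scheme, not necessarily the fusion algorithm of [29].

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, k=0.98):
    """Fuse gyroscope and accelerometer estimates of one tilt angle.

    angle_prev  : previous fused angle estimate (rad)
    gyro_rate   : gyroscope angular rate about the same axis (rad/s)
    accel_angle : tilt angle inferred from the gravity direction (rad)
    dt          : sampling interval (s)
    k           : blending weight; near 1 trusts the gyro short-term
    """
    return k * (angle_prev + gyro_rate * dt) + (1.0 - k) * accel_angle
```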
The tri-axial measurements of the accelerometer are the accelerations along each axis, $a = [a_x, a_y, a_z]^T = \left[\frac{d^2x}{dt^2}, \frac{d^2y}{dt^2}, \frac{d^2z}{dt^2}\right]^T$. Readings from the gyroscope indicate the angular rates of the sensor when rotated, $\omega = [\omega_x, \omega_y, \omega_z]^T = \left[\frac{d\varphi}{dt}, \frac{d\theta}{dt}, \frac{d\psi}{dt}\right]^T$. Additionally, the orientation $[\mathrm{yaw}, \mathrm{pitch}, \mathrm{roll}]^T$ is denoted as $\Phi = [\varphi, \theta, \psi]^T$.
By integrating the angular velocity from time $t_{k-1}$ to $t_k$,

$$\Delta\Phi = \int_{t_{k-1}}^{t_k} \omega(\tau)\, d\tau$$

where $\tau$ denotes time. Under the assumption that the angular velocity varies linearly within the sampling interval (the trapezoidal rule), the orientation update can be written as

$$\Phi_k = \Phi_{k-1} + \frac{\omega_{k-1} + \omega_k}{2}\,(t_k - t_{k-1})$$
For simplicity, three rotation matrices are defined as follows:

$$R_{pitch} = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}, \quad R_{roll} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\psi & \sin\psi \\ 0 & -\sin\psi & \cos\psi \end{bmatrix}, \quad R_{yaw} = \begin{bmatrix} \cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

The rotation matrix $R$ is then expressed as

$$R = R_{pitch} R_{roll} R_{yaw}$$
Due to the effect of the Earth's gravity,

$$\dot{\upsilon} = R a - g$$

where $\upsilon$ is the velocity of the object and $g = [0, 0, 9.8]^T$.
According to the midpoint method, the velocity is

$$\upsilon_k = \upsilon_{k-1} + \left(\frac{R_{k-1} a_{k-1} + R_k a_k}{2} - g\right)(t_k - t_{k-1})$$

Thus, the position $p = [x, y, z]^T$ at time $k$ is

$$p_k = p_{k-1} + \upsilon_{k-1}(t_k - t_{k-1}) + \frac{1}{2}\left(\frac{R_{k-1} a_{k-1} + R_k a_k}{2} - g\right)(t_k - t_{k-1})^2$$
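One dead-reckoning step of Equations (31) and (32) can be sketched as follows; in practice, as noted above, drift and bias compensation are indispensable, so this sketch is only the integration core.

```python
import numpy as np

G = np.array([0.0, 0.0, 9.8])  # gravity in the world frame

def dead_reckon_step(p, v, R_prev, R_curr, a_prev, a_curr, dt):
    """Update position and velocity from consecutive IMU samples.

    p, v           : previous position and velocity (world frame)
    R_prev, R_curr : rotation matrices at times k-1 and k
    a_prev, a_curr : body-frame accelerometer readings at k-1 and k
    dt             : sampling interval t_k - t_{k-1}
    """
    # rotate to the world frame, average by the midpoint rule, remove gravity
    a_world = 0.5 * (R_prev @ a_prev + R_curr @ a_curr) - G
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt**2
    return p_new, v_new
```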

3. Tracking Systems

Different types of tracking systems have been developed and marketed over the past decade. In this section, the main tracking systems on the market are reviewed in terms of their technical specifications.

3.1. Optical Tracking Systems

A large number of manufacturers have developed various kinds of optical tracking systems for biomedical applications, as summarized in Table 1. The main technical specifications related to tracking performance are measurement volume (or FOV), resolution, volumetric accuracy, average latency, and measurement rate. Due to the angle of view, the measurement volume is usually pyramid-shaped and can be represented by radius × width × height. Some manufacturers prefer to use the term "field of view", i.e., horizontal degrees × vertical degrees, to describe the dimensions of the working volume. In addition, advances in cameras have increased image resolution and, in turn, volumetric accuracy. To date, some advanced optical tracking systems can achieve a 26-megapixel (MP) resolution with 0.03 mm volumetric accuracy [30]. However, the high resolution of the captured images burdens the processor during data analysis, increasing the average latency and reducing the measurement rate.

3.2. Electromagnetic Tracking Systems

Compared to optical tracking systems, electromagnetic tracking systems can cover a larger measurement volume but normally have lower positional accuracy. Table 2 summarizes the specifications of some representative, commercially available electromagnetic tracking systems. Since they do not require the transmission of light, electromagnetic tracking systems are promising for intracorporeal biomedical applications. For example, Polhemus Inc. (Colchester, VT, USA) developed a miniaturized electromagnetic motion tracking sensor with an outer diameter of 1.8 mm. It can be inserted into a human vessel with a catheter for both position and orientation tracking [69].

3.3. Mechanical Tracking Systems

Unlike other tracking systems, the development of mechanical tracking systems, especially for biomedical applications, is limited. This might be due to the fact that mechanical tracking systems are usually bulky and heavy. Meta Motion, Inc. presented a mechanical tracking system, named Gypsy 7, decades ago, which had an angular accuracy of 0.125° [79]. However, this kind of exoskeleton system consists of 14 joint sensors and weighs 4 kg in total.

3.4. Acoustic Tracking Systems

Most commercially available acoustic tracking systems are designed for marine positioning; their use for indoor positioning is still in its infancy. Sonitor Technologies, Inc. developed the Forkbeard system, which applies 40 kHz US for echolocation [80]. Although it can cover an entire floor, its volumetric accuracy is 1–2 feet and its latency is 1–2 s. The low accuracy and high latency of acoustic tracking hamper its applications in the biomedical field. However, considering that US has the benefit of non-ionizing radiation, it might be promising for some specific biomedical applications.

3.5. Inertial Tracking Systems

Inertial tracking systems have also been commercially available for many years. Some products on the market are summarized in Table 3. Because inertial tracking does not require separate transmitters and receivers, as optical, electromagnetic, and acoustic tracking do, an inertial tracking device can be very compact, such as a small dot [81]. This compactness allows comfortable attachment that hardly affects the normal motion of the tracked object. However, an inertial tracking system can only provide relative position information, and it usually needs assistance from other kinds of tracking systems.

4. Biomedical Ultrasound Imaging Applications

Over the past decade, a range of commercial and research tracking systems have been developed for biomedical US imaging-related applications. The following sections categorically review the applications of the different tracking systems reported to date.

4.1. Freehand 3D Ultrasound Imaging

Over the past few decades, US imaging has become a valuable tool in clinical diagnostic and therapeutic procedures across a broad range of fields, from routine screening, early cancer detection, and diagnosis of cardiovascular disease to real-time monitoring [105]. Compared with CT, MRI, and PET, US demonstrates the advantages of safety for patients (no risk of ionizing radiation or high magnetic fields), real-time imaging, portability, and low cost [11,106,107]. In clinical practice, a handheld US probe, typically composed of a 1D linear US transducer array, is routinely used to generate 2D US images in real time, displaying cross-sectional images of the human anatomy. While 2D US imaging offers several advantages for medical applications, it can only acquire selectively sampled, cross-sectional slice images of a 3D anatomic structure, and the orientation of each image plane depends on how the operator positions the handheld probe (i.e., operator dependency) [108]. If clinicians need to view 3D anatomic structures, they must mentally reconstruct the 3D volume from the planar 2D images, thus limiting diagnostic accuracy.
In order to overcome the limitations of 2D US, volumetric 3D US imaging has been developed, allowing direct visualization of an arbitrary plane of the 3D volume and helping obtain a more accurate view of the shape, size, and location of the organ and lesion [16]. To date, three different types of methods have been utilized for the construction of 3D US volumes: employing a 2D phased array transducer, mechanical 3D US scanning, and freehand 3D US scanning [109,110]. Instead of the 1D array transducer used in conventional 2D US systems, a 3D US volume can be generated using a 2D phased array ultrasonic transducer with its elements spread over a 2D aperture, which can deflect and focus the ultrasonic beam in a volumetric space [111,112]. Since the US beams are steered and focused on the region of interest by electronic scanning, the 2D array remains stationary during the procedure. Although this approach can acquire a 3D volume straightforwardly and in real time, the manufacturing process of a 2D phased array transducer is complex and its cost is high due to the large number of array elements and the electrical connection of each element [113]. Another approach to obtaining a 3D US volume is mechanical 3D scanning using a conventional linear array transducer. In this method, a mechanical motor controls the transducer rotation, tilt, or translation along a designed scanning trajectory [109]. The 3D US volumes can then be reconstructed from the acquired 2D US images and their predefined positions and orientations. While 3D US imaging systems based on this method can be operated conveniently by controlling the mechanical motor, the integrated motor makes the whole system bulky, and the constrained movement limits the system's flexibility.
In addition to the above-mentioned approaches for 3D US imaging, freehand 3D US has become the most rapidly advancing technique over the years due to its scanning flexibility, convenience of operation, and low cost. Freehand 3D US images are acquired by rigidly attaching a 6 DOF position sensor to a handheld US probe that generates a sequence of B-mode US images [114]. The position sensor records the positions and orientations of the probe during the scanning procedure, and the 3D volume is then constructed by combining the sequence of 2D US images with the corresponding position information (Figure 7a), as sketched in the example below. It is noted that for reconstructing a 3D US volume, the position and orientation data of each 2D US image are required. Various techniques have been reported for obtaining the position and orientation data of the US probe during freehand US scanning. The most commonly used position sensors for freehand 3D US imaging are optical tracking sensors and electromagnetic tracking sensors.
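A pixel-based reconstruction loop can be sketched as follows; the nearest-voxel scheme and all names are illustrative rather than any specific system's method, and production systems add interpolation and hole filling.

```python
import numpy as np

def insert_frame(volume, image, T_pose, pixel_size, voxel_size):
    """Place one tracked B-mode frame into a 3D voxel volume.

    volume     : 3D numpy array of voxels (the reconstruction target)
    image      : 2D B-mode frame
    T_pose     : 4x4 transform from the image plane to the volume frame,
                 combining the tracked probe pose and probe calibration
    pixel_size : (row, column) physical size of an image pixel
    voxel_size : physical edge length of a cubic voxel
    """
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            # homogeneous pixel position in the image plane (z = 0)
            pt = T_pose @ np.array([c * pixel_size[1], r * pixel_size[0], 0.0, 1.0])
            idx = np.round(pt[:3] / voxel_size).astype(int)
            if np.all(idx >= 0) and np.all(idx < volume.shape):
                volume[tuple(idx)] = image[r, c]   # nearest-voxel placement
    return volume
```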
In a typical optical tracking system, either light-reflective markers (passive markers) or light-emitting markers (active markers) are attached to the US probe, and the markers are monitored by two or more cameras fixed in position (Figure 7b). Passive markers are usually matte spheres coated with retroreflective material that reflect light back to the cameras. Three or more markers are usually arranged asymmetrically, allowing the cameras to infer the orientation in space. In contrast to passive markers, which reflect light generated by external sources back to the cameras, active markers are infrared LEDs powered to emit infrared light themselves. In a typical electromagnetic tracking system, a time-varying 3D magnetic field is transmitted through the volume in which the US scanning is to be conducted. Three sensor coils are attached to the US probe and utilized to measure the field in 3D Cartesian coordinates (x, y, z) (Figure 7c). This information enables the position and orientation of the sensor coils to be acquired [109].
Figure 7. (a) The typical configuration of a freehand 3D US imaging system. Reprinted from [115] with permission. (b) An optical tracker based freehand 3D US imaging system. Reprinted from [109] with permission. (c) An electromagnetic sensor based freehand 3D US imaging system. Reprinted from [109] with permission.
Due to the advantages of flexible operation and simultaneous visualization, freehand 3D US imaging is increasingly gaining popularity in medical applications. For example, Chung et al. [116] reported an imaging system based on optical motion tracking with the objective of developing a carotid artery contour detection procedure for carotid atherosclerosis diagnosis (Figure 8a). The 3D motion tracking system consisted of 8 Eagle digital CCD cameras for motion detection in 3D space and 4 passive fluorescent markers attached to an US probe, showing spatial and temporal resolutions of 10 μm and 0.01 s, respectively. Daoud et al. [117] developed a freehand 3D US imaging system using a 3D electromagnetic position tracking system (trakSTAR, NDI, ON, Canada). The position and orientation of the US probe in 3D space were tracked by one of the electromagnetic sensors attached to the probe (Figure 8b). Herickhoff et al. [17] built a volumetric 3D US imaging system at a very low cost (under USD 250) by using a single IMU sensor for orientation acquisition and a lightweight fixture customized to the US probe (Figure 8c). The preliminary results demonstrated the capability of this low-cost method for reconstructing a 3D US image volume, providing a solution to the problem of operator dependence. In another study, Chen and Huang [118] reported a freehand 3D US imaging system that could perform volume reconstruction and visualization during data acquisition in real time. The real-time freehand 3D US system mainly consisted of a linear probe, an electromagnetic sensing system, and a computer with a GPU for image data reconstruction and visualization of the 3D volume image. A summary of the various freehand 3D US imaging systems reported during the past decade is provided in Table 4.
Table 4. A summary of freehand 3D US imaging studies.

Reference | Tracking Principle | Tracking System | Accuracy | Application
Chung et al. [116] | Optical tracking | Motion Analysis, Santa Rosa, CA, USA | Spatial: 10 μm; Temporal: 0.01 s | Carotid atherosclerotic stenosis detection
Pelz et al. [119] | Electromagnetic tracking | Curefab CS system (Curefab Technologies GmbH, Munich, Germany) | None | Internal carotid artery stenosis diagnosis
Miller et al. [120] | Optical tracking | VectorVision2 navigation system (BrainLAB, Munich, Germany) | None | Image-guided surgery
Mercier et al. [121] | Optical tracking | Polaris (Northern Digital, Waterloo, ON, Canada) | Spatial: 0.49–0.74 mm; Temporal: 82 ms | Neuronavigation
Chen et al. [122] | Electromagnetic tracking | Aurora (NDI, ON, Canada) | None | Image-guided surgery
Wen et al. [115] | Optical tracking | Polaris (Northern Digital, Waterloo, ON, Canada) | None | Image-guided intervention
Sun et al. [123] | Optical tracking | OptiTrack V120:Trio (NaturalPoint Inc., Corvallis, OR, USA) | Spatial: <1 mm | Image-guided intervention
Worobey et al. [124] | Optical tracking | Vicon Motion Systems, Centennial, CO, USA | None | Scapular position
Passmore et al. [125] | Optical tracking | Vicon Motion Systems, Oxford, UK | None | Femoral torsion measurement
Daoud et al. [117] | Electromagnetic tracking | trakSTAR (NDI, ON, Canada) | None | 3D US imaging
Chen and Huang [118] | Electromagnetic tracking | MiniBird (Ascension Technology Corp., Burlington, VT, USA) | None | Real-time 3D imaging
Cai et al. [20] | Optical tracking | OptiTrack V120:Trio (NaturalPoint Inc., Corvallis, OR, USA) | Positional: 0.08–0.69 mm; Rotational: 0.33–0.62° | 3D US imaging
Herickhoff et al. [17] | Inertial tracking | IMU sensor (iNEMO-M1; STMicroelectronics, Geneva, Switzerland) | None | Low-cost 3D imaging platform
Kim et al. [126] | Inertial tracking | Ultrasonic sensor + IMU sensor (HC-SR04, Shenzhen AV, Shenzhen, China) | Spatial: 0.79–1.25 mm | Low-cost 3D imaging platform
Lai et al. [127] | Optical tracking | T265 (Intel, Santa Clara, CA, USA) | Spatial: 2.9 ± 1.8° | Scoliosis assessment
Jiang et al. [128] | Electromagnetic tracking | Ascension Technology, Burlington, VT, USA | None | Scoliosis assessment
Figure 8. (a) A freehand 3D US imaging system with optical motion tracking system settings. Reprinted from [116] with permission. (b) The setup of a freehand 3D US imaging system with electromagnetic tracking. Reprinted from [122] with permission. (c) A low-cost 3D US image acquisition method. Reprinted from [126] with permission.

4.2. Ultrasound Image Fusion in Multimodality Imaging

Medical image fusion refers to the co-display of registered images from the same or different imaging modalities, such as US, CT, MRI, and PET [129]. Since the fused image contains all the important features from each input image, it can offer a more comprehensive and reliable description of lesions, assisting preclinical research, clinical diagnosis, and therapy, such as routine staging, surgical navigation, and radiotherapy planning [130]. Percutaneous interventional procedures, particularly percutaneous biopsy and percutaneous tumor ablation, play an important role in caring for patients with cancer. To guide percutaneous interventional procedures, US imaging is the most widely used imaging modality owing to its real-time capability, lack of radiation exposure, and easy accessibility [131]. However, compared with CT and MRI, US imaging has a narrower field of view and lower contrast resolution. In addition, its imaging performance is reduced by the presence of gas and fat in the human body [132]. To localize and characterize lesions more precisely, US fusion imaging allows the strengths of different imaging modalities to be exploited simultaneously, eliminating or minimizing the weaknesses of each single modality [133]. The procedures for fusing CT/MRI images with US images are detailed in reference [134]. After the image fusion procedure, the CT/MRI images are displayed on the monitor side-by-side with the real-time US images in a synchronous manner and updated simultaneously according to changes in the position and imaging plane of the US probe. A process of US and MRI fusion is illustrated in Figure 9.
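At the heart of such synchronous display is a chain of rigid transforms. The sketch below composes them to map US image-plane coordinates into the registered CT/MRI volume so the matching slice can be resampled; the names are hypothetical, and real systems obtain the registration and calibration transforms from dedicated registration and probe-calibration steps.

```python
import numpy as np

def us_to_mri_transform(T_probe, T_calib, T_reg):
    """Compose the transform chain for US/MRI fusion display.

    T_probe : 4x4 tracker-reported pose of the US probe
    T_calib : 4x4 fixed US-image-to-probe calibration transform
    T_reg   : 4x4 registration from the tracker frame to the MRI volume
    """
    # point_mri = T_reg @ T_probe @ T_calib @ point_image
    return T_reg @ T_probe @ T_calib
```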
As discussed in Section 3, there are five tracking techniques available for tracking an US probe in 3D space. However, for US image fusion applications in percutaneous interventional procedures, the electromagnetic tracking system is the one most often implemented [129], as shown in Figure 10. For instance, Krucker et al. [136] employed an Aurora (Northern Digital Inc., Waterloo, ON, Canada) electromagnetic tracking system to fuse real-time US with CT, providing real-time visualization of tracked interventional needles within preprocedural CT scans. Appelbaum et al. [137] compared conventional CT-guided biopsy to biopsy employing a U.S. Food and Drug Administration-approved electromagnetic biopsy navigation system (Veran IG4, Veran Medical Technologies). Phantom model results showed that, by using the electromagnetic tracking system, needle placement accuracy was improved and radiation exposure was reduced compared with conventional CT techniques. Venkatesan et al. [138] fused US images with CT and 18F-FDG-PET/CT using an electromagnetic tracking system (Northern Digital Inc., Waterloo, ON, Canada) for biopsy of technically challenging FDG-avid targets. Using conventional US imaging, a total of 36 lesions could not be seen well or were completely inapparent during the biopsy procedures. However, using the combined electromagnetic tracking and US/CT/18F-FDG-PET fusion, 31 out of 36 biopsies were diagnostic.
In the recent decade, US image fusion has developed significantly and can now play crucial roles in diagnosis and clinical management across various anatomical regions [129,139,140,141,142,143]. One of the most widely applied examples in clinics is US fused with MRI images for percutaneous image-guided prostate biopsy [144,145]. Although US is the most common modality for real-time guidance during biopsy, it is limited in its ability to visualize deep targets. In addition, the biopsy procedure targets only predetermined anatomic locations of the prostate, so the underdetection rate of transrectal US-guided biopsy is high. Fusing US images with MRI images allows the information from MRI to be used to direct biopsy needles under US guidance. It combines the superior diagnostic accuracy of MRI for detecting suspicious lesions in the prostate with the practicality and familiarity of US [145,146]. Several U.S. FDA approved systems for fusion imaging of real-time US with MRI are commercially available (summarized in Table 5), as shown in Figure 11.
In addition to US/MRI fusion-guided prostate biopsy, US image fusion has been investigated for clinical applications in various anatomical regions including liver, kidney, pancreas, and musculoskeletal system. A summary of US image fusion for applications in different anatomical regions is illustrated in Table 6.

4.3. Ultrasound-Guided Diagnosis

Percutaneous needle biopsy plays an important role in the diagnosis, staging, and treatment planning for various tumors [155,156]. The success of needle insertion procedures mainly depends on accurate needle placement to minimize complications and to avoid damage to neighboring tissues [156]. In many applications, US guidance has been shown to increase the safety and success rate of the procedure due to its real-time imaging capability, easy operation, portability, etc. [157,158,159]. During the procedure, the physician manually manipulates the needle and the US probe simultaneously while mentally relating US images acquired to locations inside a patient’s body [160]. Practically, it is very challenging for the physician to visualize the needle trajectory inside the patient tissue just by checking the US image [156]. In order to let the needle tip follow the desired trajectory and hit the target location in the image plane, it is beneficial and necessary to track the pose of the needle with respect to the coordinate system of the US image.
Three different types of tracking systems have been applied for US-guided needle insertion: electromagnetic, optical, and mechanical trackers. For electromagnetic trackers, Franz et al. [161] assessed the precision and accuracy of a compact electromagnetic field generator (Aurora, Northern Digital Inc., Waterloo, ON, Canada) attached to 6 different US probes with various operating frequencies. Based on the assessment results, the error of the field generator was <0.2 mm; the positional accuracy was <1.0 mm. Xu et al. [162] evaluated the effectiveness of magnetic navigation in US-guided interventional procedures (Figure 12). A commercially available magnetic navigation system (GE Healthcare, Milwaukee, WI, USA) was applied. They found that, compared with conventional US guidance, magnetic navigation in US-guided interventional procedures was especially useful for some complicated clinical situations, such as liver tumor ablation. In addition, Hakime et al. [163] evaluated the accuracy and safety of electromagnetic needle tracking for US-guided liver biopsy. An electromagnetic transmitter was placed near the scanning area, and a pair of electromagnetic receiving sensors were attached to the US probe. The clinical results demonstrated that the overall diagnostic success rate for liver lesions was 91%. März et al. [164] proposed an interventional imaging system based on a mobile electromagnetic field generator (Aurora, Northern Digital Inc., Waterloo, ON, Canada) attached to an US probe. The tracking and calibration accuracy of the system was assessed in a clinical setting. The tracking accuracy was tested to be <1 mm and the calibration error was 1–2 mm.
For optical trackers, Wang et al. [165] utilized a low-cost Kinect sensor (a stereo camera) for interventional needle tracking. The measured accuracy of needle tracking ranged from 2.6 ± 1.7 to 6.9 ± 5.1 mm. Stolka et al. [166] developed a camera-based tracking system for US-guided interventions, consisting of an optical sensing head mounted on an US probe. The head could be mounted to support both in-plane and out-of-plane interventions. Phantom tests showed that the mean accuracy of the system was 3.27 ± 2.28 mm. Najafi et al. [167] proposed a single-camera tracking system for US-guided needle insertion (Figure 13). The camera was directly mounted on the US probe, and the needle location was tracked using needle markers. A needle tracking accuracy of 0.94 ± 0.46 mm was achieved, which was higher than that of existing solutions. Daoud et al. [168] also reported a camera-based tracking system for US-guided needle interventions. A USB web camera (IceCam2, Macally Peripherals, Ontario, CA, USA) was attached to a 3D curvilinear US probe using a plastic housing. Dynamic needle tracking in a sequence of 3D US volumes was achieved. Based on ex vivo animal experiments, a maximum needle-tip error of 1.2 mm was measured in individual US volumes.
In addition to magnetic and optical tracking devices, Ho et al. [169] invented an US-guided robotic system for transperineal prostate intervention, consisting of a gantry, a gun holder, and an US probe holder (Figure 14). The system was constructed based on the dual-cone concept, ensuring that any part of the prostate can be accessed with minimal skin puncture. Egg phantom experiments showed that the system accuracy was <1 mm. Orhan et al. [170] reported the design and modeling of a 5-DOF parallel robot for autonomous US-guided biopsy. The robot was composed of three main stages: a front stage, a back stage, and a syringe mechanism. The biopsy needle connected to the syringe mechanism passed through a gimbal in the front stage. Poquet et al. [171] designed a 6-DOF serial robotic co-manipulator system for assisting endorectal prostate biopsy. The robotic system consisted of three brakes and three motors. It gave the urologist the freedom to position the probe with respect to the prostate in the free mode, while leaving him or her to focus only on insertion during the locked mode.

4.4. Ultrasound-Guided Therapy

US-guided surgery is an area of minimally invasive surgery where surgical procedures are performed with the aid of US imaging throughout the operation. In contrast to traditional surgical access, US-guided surgery uses computer-based systems to provide real-time US images to help the physician precisely visualize and target the surgical site by updating the intraoperative information [12]. While other imaging modalities, such as CT and MRI, have also been applied for surgical navigation, US-guided surgery offers several advantages, including real-time imaging, equipment portability, low cost, and reduced hospital stays [13]. Figure 15 shows the basic process of 3D US-guided surgical navigation [172]. In order to utilize US to guide surgical procedures, the US probe must be tracked. Although several tracking technologies are commercially available today, as reviewed in the previous section, the most widely used solutions are optical and electromagnetic systems.
For instance, Stoll et al. [173] presented a novel approach for tracking surgical instruments in 3D US imaging by using a series of passive echogenic markers. The markers were attached near the distal end of the surgical instrument, and the marker position and orientation could be simply determined in a 3D US volume using image processing. Since the markers were completely passive, they can be easily implemented without prior integration with the imaging system. Moreover, the error of registering the tracking coordinate frame to the image frame can be eliminated. In another study, Li et al. [174] systematically compared the real-time US-guided percutaneous nephrolithotomy (PCNL) using SonixGPS navigation system with conventional US-guided PCNL using an US machine for the treatment of complex kidney stones. Based on their clinical results, the SonixGPS system was superior to the conventional method in terms of stone clearance rate and puncture accuracy. Hamamoto et al. [175] investigated the efficacy of applying real-time virtual sonography (RVS) guidance for renal puncture for endoscopic combined intrarenal surgery (ECIRS) treatment of large renal calculi (Figure 16). The RVS system synchronized real-time US images with CT images via a magnetic navigation system to provide volume and position data side by side. Compared with US-guided puncture, RVS-guided renal puncture illustrated lower incidence of bleeding-related complications. In addition, Gomes-Fonseca et al. [176] assessed the performance of electromagnetic tracking system guidance for percutaneous renal access in the operating room environment. Their experimental results demonstrated that ureterorenoscopes and 2D US probe did not affect the precision and accuracy of the electromagnetic tracking systems, suggesting that these instruments may be used for a safe percutaneous renal access.
Bharat et al. [177] measured the accuracy of an electromagnetic tracking system for identifying the position and shape of treatment catheters in high-dose-rate (HDR) prostate brachytherapy (Figure 17). The tracking experiments were performed in both a controlled laboratory environment and a typical brachytherapy operating room. Robotic validation found that the mean accuracy of the system was <0.5 mm, illustrating the potential value of electromagnetic tracking for catheter mapping in HDR brachytherapy. Schwaab et al. [178] developed an US-based motion tracking method for real-time motion correction in ion beam therapy. It was found that US tracking could yield near real-time position information of moving targets at a high frame rate. Yu et al. [179] also evaluated the accuracy and precision of a transperineal US image-guided system (Clarity Autoscan US system, Elekta, Stockholm, Sweden) for prostate radiotherapy. Based on male pelvic phantom experiments, the accuracy of US tracking in the lateral direction was better than that in the axial direction, whereas the precision of US tracking in the axial (superior-inferior) direction was better than that in the lateral (left-right) direction.
In addition to US-guided surgical navigation and radiotherapy, US-guided catheterization has also attracted the attention of many researchers. Jakola et al. [180] reported a method to guide the placement of ventricular catheters using a 3D US navigation system. The US-based navigation system (Sonowand Invite, Sonowand AS, Trondheim, Norway) consisted of an US probe integrated with an optical tracking system. Based on patient studies, this 3D US navigation system was promising for accurate placement of catheters. Brattain et al. [181] designed a probe-mounted US guidance system for US-guided procedures. The system consisted of a lockable, articulating needle guide attached to an US probe and a user interface that provided real-time visualization of the predicted needle trajectory overlaid on the US image. The system illustrated the potential to increase the efficiency, safety, and quality of US-guided procedures while reducing costs. Kobayashi et al. [182] invented an US-guided needle insertion manipulator for central venous catheterization (Figure 18). The performance of the manipulator was evaluated in vivo in a porcine model. The animal study found that a venous placement rate of 80% could be obtained with opened skin, and the system was especially effective for jugular venous puncture through opened skin.

5. Conclusions

In this paper, we categorized and reviewed different types of tracking devices for biomedical US imaging applications based on the different tracking principles. The applications of various tracking systems reported in the literature in the past decade were categorized into four types: freehand 3D US imaging, US image fusion, US-guided diagnosis as well as US-guided therapy. In this review article, the working principles of different tracking technologies were analyzed in terms of their advantages and disadvantages for biomedical applications. A comprehensive overview of the state-of-the-art tracking devices on the market is provided in terms of their technical specifications, including accuracy, update rate and latency. With the rapid advancement of various tracking devices over the past decade, the usefulness of different tracking systems has been illustrated by a diverse range of biomedical applications, as reviewed in this paper.

6. Future Perspectives

Although tracking devices are becoming increasingly essential for providing better information and navigation in biomedical applications, there is still much room for improvement. Many different types of commercial tracking devices have been introduced, with no significant specification differences among them. For biomedical applications such as image-guided surgery, the existing tracking technologies may not fully meet the requirements, and the best choice of tracking device is highly application dependent. Future research on tracking systems may focus on further improving accuracy and reducing the registration error of these technologies for medical applications. While freehand 3D US has already demonstrated its benefits for obstetrics, cardiology, and image-guided intervention applications, more preclinical studies are required to allow physicians to integrate 3D US imaging effectively and safely into US-guided interventional procedures. In addition, while real-time US image fusion has demonstrated its usefulness in different anatomical regions, such as the prostate, liver, and kidney, future studies need to explore its effectiveness in imaging other anatomical regions or during surgery. Although the advancement of different tracking devices has accelerated the development of US image-guided systems, most of these systems are still in the prototype stage, and so far, only limited clinical trials have been carried out. As surgery continues to move toward minimally invasive interventions, US image-guided systems will increasingly be used to improve the precision and quality of medical procedures. More studies from the fields of biomedical engineering, medical physics, and clinical research are necessary to move this technology from the laboratory to the hospital to improve patient care.

Author Contributions

Conceptualization, X.J.; writing—original draft preparation, C.P., Q.C. and M.C.; writing—review and editing, C.P., Q.C., M.C. and X.J.; supervision, project administration, and funding acquisition, X.J. All authors have read and agreed to the published version of the manuscript.

Funding

We would like to acknowledge the financial support from the Bill and Melinda Gates Foundation under Award #OPP1191684.

Acknowledgments

We appreciate the anonymous reviewers for their careful reading of our manuscript and their many insightful comments and suggestions to help improve and clarify this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
2D: Two dimensions
3D: Three dimensions
CCD: Charge-coupled device
Corp.: Corporation
CT: Computed tomography
DOF: Degree-of-freedom
ECIRS: Endoscopic combined intrarenal surgery
FDA: Food and Drug Administration
FOV: Field of view
FPS: Frames per second
GPU: Graphics processing unit
HCC: Hepatocellular carcinoma
HDR: High-dose-rate
IMU: Inertial measurement unit
Inc.: Incorporated
LED: Light emitting diode
MEMS: Microelectromechanical system
MP: Megapixel
MRI: Magnetic resonance imaging
NDI: Northern Digital Inc.
PCNL: Percutaneous nephrolithotomy
PET: Positron emission tomography
RMS: Root mean square
RVS: Real-time virtual sonography
TDOA: Time difference of arrival
TDOF: Time difference of flight
TOA: Time of arrival
TOF: Time of flight
TRUS: Transrectal ultrasound
US: Ultrasound
USB: Universal serial bus

References

1. Kothiya, S.V.; Mistree, K.B. A Review on Real Time Object Tracking in Video Sequences. In Proceedings of the 2015 International Conference on Electrical, Electronics, Signals, Communication and Optimization (EESCO), Visakhapatnam, India, 24–25 January 2015; pp. 1–4.
2. Octorina Dewi, D.E.; Supriyanto, E.; Lai, K.W. Position Tracking Systems for Ultrasound Imaging: A Survey. In Medical Imaging Technology; Springer: Berlin/Heidelberg, Germany, 2015; pp. 57–89.
3. Karayiannidis, Y.; Rovithakis, G.; Doulgeri, Z. Force/Position Tracking for a Robotic Manipulator in Compliant Contact with a Surface Using Neuro-Adaptive Control. Automatica 2007, 43, 1281–1288.
4. Chang, Y.-C.; Yen, H.-M. Design of a Robust Position Feedback Tracking Controller for Flexible-Joint Robots. IET Control Theory Appl. 2011, 5, 351–363.
5. Liu, M.; Yu, J.; Yang, L.; Yao, L.; Zhang, Y. Consecutive Tracking for Ballistic Missile Based on Bearings-Only during Boost Phase. J. Syst. Eng. Electron. 2012, 23, 700–707.
6. Kendoul, F. Survey of Advances in Guidance, Navigation, and Control of Unmanned Rotorcraft Systems. J. Field Robot. 2012, 29, 315–378.
7. Ren, H.; Rank, D.; Merdes, M.; Stallkamp, J.; Kazanzides, P. Multisensor Data Fusion in an Integrated Tracking System for Endoscopic Surgery. IEEE Trans. Inf. Technol. Biomed. 2011, 16, 106–111.
8. Huang, Q.-H.; Yang, Z.; Hu, W.; Jin, L.-W.; Wei, G.; Li, X. Linear Tracking for 3-D Medical Ultrasound Imaging. IEEE Trans. Cybern. 2013, 43, 1747–1754.
9. Leser, R.; Baca, A.; Ogris, G. Local Positioning Systems in (Game) Sports. Sensors 2011, 11, 9778–9797.
10. Hedley, M.; Zhang, J. Accurate Wireless Localization in Sports. Computer 2012, 45, 64–70.
11. Mozaffari, M.H.; Lee, W.-S. Freehand 3-D Ultrasound Imaging: A Systematic Review. Ultrasound Med. Biol. 2017, 43, 2099–2124.
12. Cleary, K.; Peters, T.M. Image-Guided Interventions: Technology Review and Clinical Applications. Annu. Rev. Biomed. Eng. 2010, 12, 119–142.
13. Lindseth, F.; Langø, T.; Selbekk, T.; Hansen, R.; Reinertsen, I.; Askeland, C.; Solheim, O.; Unsgård, G.; Mårvik, R.; Hernes, T.A.N. Ultrasound-Based Guidance and Therapy. In Advancements and Breakthroughs in Ultrasound Imaging; IntechOpen: London, UK, 2013.
14. Zhou, H.; Hu, H. Human Motion Tracking for Rehabilitation—A Survey. Biomed. Signal Process. Control 2008, 3, 1–18.
15. Moran, C.M.; Thomson, A.J.W. Preclinical Ultrasound Imaging—A Review of Techniques and Imaging Applications. Front. Phys. 2020, 8, 124.
16. Fenster, A.; Downey, D.B.; Cardinal, H.N. Three-Dimensional Ultrasound Imaging. Phys. Med. Biol. 2001, 46, R67.
17. Herickhoff, C.D.; Morgan, M.R.; Broder, J.S.; Dahl, J.J. Low-Cost Volumetric Ultrasound by Augmentation of 2D Systems: Design and Prototype. Ultrason. Imaging 2018, 40, 35–48.
18. Schlegel, M. Predicting the Accuracy of Optical Tracking Systems; Technical University of Munich: München, Germany, 2006.
19. Abdelhamid, M. Extracting Depth Information from Stereo Vision System: Using a Correlation and a Feature Based Methods. Master's Thesis, Clemson University, Clemson, SC, USA, 2011.
20. Cai, Q.; Peng, C.; Lu, J.; Prieto, J.C.; Rosenbaum, A.J.; Stringer, J.S.A.; Jiang, X. Performance Enhanced Ultrasound Probe Tracking with a Hemispherical Marker Rigid Body. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2021, 68, 2155–2163.
21. Zhigang, Y.; Kui, Y. An Improved 6DOF Electromagnetic Tracking Algorithm with Anisotropic System Parameters. In International Conference on Technologies for E-Learning and Digital Entertainment; Springer: Berlin/Heidelberg, Germany, 2006; pp. 1141–1150.
22. Zhang, Z.; Liu, G. The Design and Analysis of Electromagnetic Tracking System. J. Electromagn. Anal. Appl. 2013, 5, 85–89.
23. Craig, J.J. Introduction to Robotics: Mechanics and Control, 4th ed.; Pearson Education: New York, NY, USA, 2018.
24. Gueuning, F.; Varlan, M.; Eugene, C.; Dupuis, P. Accurate Distance Measurement by an Autonomous Ultrasonic System Combining Time-of-Flight and Phase-Shift Methods. In Proceedings of the Quality Measurement: The Indispensable Bridge between Theory and Reality, Brussels, Belgium, 4–6 June 1996; Volume 1, pp. 399–404.
25. Mahajan, A.; Walworth, M. 3D Position Sensing Using the Differences in the Time-of-Flights from a Wave Source to Various Receivers. IEEE Trans. Robot. Autom. 2001, 17, 91–94.
26. Ray, P.K.; Mahajan, A. A Genetic Algorithm-Based Approach to Calculate the Optimal Configuration of Ultrasonic Sensors in a 3D Position Estimation System. Rob. Auton. Syst. 2002, 41, 165–177.
27. Cai, Q.; Hu, J.; Chen, M.; Prieto, J.; Rosenbaum, A.J.; Stringer, J.S.A.; Jiang, X. Inertial Measurement Unit Assisted Ultrasonic Tracking System for Ultrasound Probe Localization. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2022.
28. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion. Sensors 2017, 17, 1257.
  28. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion. Sensors 2017, 17, 1257. [Google Scholar] [CrossRef]
  29. Patonis, P.; Patias, P.; Tziavos, I.N.; Rossikopoulos, D.; Margaritis, K.G. A Fusion Method for Combining Low-Cost IMU/Magnetometer Outputs for Use in Applications on Mobile Devices. Sensors 2018, 18, 2616. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Arqus. Available online: https://www.qualisys.com/cameras/arqus/#!%23tech-specs (accessed on 27 September 2022).
  31. Polaris Vega® ST. Available online: https://www.ndigital.com/optical-measurement-technology/polaris-vega/polaris-vega-st/ (accessed on 27 September 2022).
  32. Polaris Vega® VT. Available online: https://www.ndigital.com/optical-measurement-technology/polaris-vega/polaris-vega-vt/ (accessed on 27 September 2022).
  33. Polaris Vega® XT. Available online: https://www.ndigital.com/optical-measurement-technology/polaris-vega/polaris-vega-xt/ (accessed on 27 September 2022).
  34. Polaris Vicra®. Available online: https://www.ndigital.com/optical-measurement-technology/polaris-vicra/ (accessed on 27 September 2022).
  35. ClaroNav MicronTracker Specification. Available online: https://www.claronav.com/microntracker/microntracker-specifications/ (accessed on 27 September 2022).
  36. Smart DX. Available online: https://www.btsbioengineering.com/products/smart-dx-motion-capture/ (accessed on 27 September 2022).
  37. PrimeX 41. Available online: https://optitrack.com/cameras/primex-41/specs.html (accessed on 27 September 2022).
  38. PrimeX 22. Available online: https://optitrack.com/cameras/primex-22/specs.html (accessed on 27 September 2022).
  39. PrimeX 13. Available online: https://optitrack.com/cameras/primex-13/specs.html (accessed on 27 September 2022).
  40. PrimeX 13W. Available online: https://optitrack.com/cameras/primex-13w/specs.html (accessed on 27 September 2022).
  41. SlimX 13. Available online: https://optitrack.com/cameras/slimx-13/specs.html (accessed on 27 September 2022).
  42. V120:Trio. Available online: https://optitrack.com/cameras/v120-trio/specs.html (accessed on 27 September 2022).
  43. V120:Duo. Available online: https://optitrack.com/cameras/v120-duo/specs.html (accessed on 27 September 2022).
  44. Flex 13. Available online: https://optitrack.com/cameras/flex-13/specs.html (accessed on 27 September 2022).
  45. Flex 3. Available online: https://optitrack.com/cameras/flex-3/specs.html (accessed on 27 September 2022).
  46. Slim 3U. Available online: https://optitrack.com/cameras/slim-3u/specs.html (accessed on 27 September 2022).
  47. TrackIR 4. vs. TrackIR 5. Available online: https://www.trackir.com/trackir5/ (accessed on 27 September 2022).
  48. Miqus. Available online: https://www.qualisys.com/cameras/miqus/#tech-specs (accessed on 27 September 2022).
  49. Miqus Hybrid. Available online: https://www.qualisys.com/cameras/miqus-hybrid/#tech-specs (accessed on 27 September 2022).
  50. 5+, 6+ and 7+ Series. Available online: https://www.qualisys.com/cameras/5-6-7/#tech-specs (accessed on 27 September 2022).
  51. Valkyrie. Available online: https://www.vicon.com/hardware/cameras/valkyrie/ (accessed on 27 September 2022).
  52. Vantage+. Available online: https://www.vicon.com/hardware/cameras/vantage/ (accessed on 27 September 2022).
  53. Vero. Available online: https://www.vicon.com/hardware/cameras/vero/ (accessed on 27 September 2022).
  54. Vue. Available online: https://www.vicon.com/hardware/cameras/vue/ (accessed on 27 September 2022).
  55. Viper. Available online: https://www.vicon.com/hardware/cameras/viper/ (accessed on 27 September 2022).
  56. ViperX. Available online: https://www.vicon.com/hardware/cameras/viper-x/ (accessed on 27 September 2022).
  57. FusionTrack 500. Available online: https://www.atracsys-measurement.com/fusiontrack-500/ (accessed on 27 September 2022).
  58. FusionTrack 250. Available online: https://www.atracsys-measurement.com/fusiontrack-250/ (accessed on 27 September 2022).
  59. SpryTrack 180. Available online: https://www.atracsys-measurement.com/sprytrack-180/ (accessed on 27 September 2022).
  60. SpryTrack 300. Available online: https://www.atracsys-measurement.com/sprytrack-300/ (accessed on 27 September 2022).
  61. Kestrel 4200. Available online: https://motionanalysis.com/blog/cameras/kestrel-4200/ (accessed on 27 September 2022).
  62. Kestrel 2200. Available online: https://motionanalysis.com/blog/cameras/kestrel-2200/ (accessed on 27 September 2022).
  63. Kestrel 1300. Available online: https://motionanalysis.com/blog/cameras/kestrel-130/ (accessed on 27 September 2022).
  64. Kestrel 300. Available online: https://motionanalysis.com/blog/cameras/kestrel-300/ (accessed on 27 September 2022).
  65. EDDO Biomechanic. Available online: https://www.stt-systems.com/motion-analysis/3d-optical-motion-capture/eddo/ (accessed on 27 September 2022).
  66. ARTTRACK6/M. Available online: https://ar-tracking.com/en/product-program/arttrack6m (accessed on 27 September 2022).
  67. ARTTRACK5. Available online: https://ar-tracking.com/en/product-program/arttrack5 (accessed on 27 September 2022).
  68. SMARTTRACK3 & SMARTTRACK3/M. Available online: https://ar-tracking.com/en/product-program/smarttrack3 (accessed on 27 September 2022).
  69. Micro Sensor 1.8. Available online: https://polhemus.com/micro-sensors/ (accessed on 27 September 2022).
  70. Aurora. Available online: https://www.ndigital.com/electromagnetic-tracking-technology/aurora/ (accessed on 27 September 2022).
  71. 3D Guidance. Available online: https://www.ndigital.com/electromagnetic-tracking-technology/3d-guidance/ (accessed on 27 September 2022).
  72. Viper. Available online: https://polhemus.com/viper (accessed on 27 September 2022).
  73. Fastrak. Available online: https://polhemus.com/motion-tracking/all-trackers/fastrak (accessed on 27 September 2022).
  74. Patriot. Available online: https://polhemus.com/motion-tracking/all-trackers/patriot (accessed on 27 September 2022).
  75. Patriot Wireless. Available online: https://polhemus.com/motion-tracking/all-trackers/patriot-wireless (accessed on 27 September 2022).
  76. Liberty. Available online: https://polhemus.com/motion-tracking/all-trackers/liberty (accessed on 27 September 2022).
  77. Liberty Latus. Available online: https://polhemus.com/motion-tracking/all-trackers/liberty-latus (accessed on 27 September 2022).
  78. G4. Available online: https://polhemus.com/motion-tracking/all-trackers/g4 (accessed on 27 September 2022).
  79. Gypsy 7. Available online: https://metamotion.com/gypsy/gypsy-motion-capture-system.htm (accessed on 27 September 2022).
  80. Forkbeard. Available online: https://www.sonitor.com/forkbeard (accessed on 27 September 2022).
  81. Xsens DOT. Available online: https://www.xsens.com/xsens-dot (accessed on 27 September 2022).
  82. MTw Awinda. Available online: https://www.xsens.com/products/mtw-awinda (accessed on 27 September 2022).
  83. MTi 1-Series. Available online: https://mtidocs.xsens.com/sensor-specifications$mti-1-series-performance-specifications (accessed on 27 September 2022).
  84. MTi 10/100-Series. Available online: https://mtidocs.xsens.com/output-specifications$orientation-performance-specification (accessed on 27 September 2022).
  85. MTi 600-Series. Available online: https://mtidocs.xsens.com/sensor-specifications-2$mti-600-series-performance-specifications-nbsp (accessed on 27 September 2022).
  86. Inertial Motion Capture. Available online: https://www.stt-systems.com/motion-analysis/inertial-motion-capture/ (accessed on 27 September 2022).
  87. VN-100. Available online: https://www.vectornav.com/products/detail/vn-100 (accessed on 27 September 2022).
  88. VN-110. Available online: https://www.vectornav.com/products/detail/vn-110 (accessed on 27 September 2022).
  89. VN-200. Available online: https://www.vectornav.com/products/detail/vn-200 (accessed on 27 September 2022).
  90. VN-210. Available online: https://www.vectornav.com/products/detail/vn-210 (accessed on 27 September 2022).
  91. VN-300. Available online: https://www.vectornav.com/products/detail/vn-300 (accessed on 27 September 2022).
  92. VN-310. Available online: https://www.vectornav.com/products/detail/vn-310 (accessed on 27 September 2022).
  93. Motus. Available online: https://www.advancednavigation.com/imu-ahrs/mems-imu/motus/ (accessed on 27 September 2022).
  94. Orientus. Available online: https://www.advancednavigation.com/imu-ahrs/mems-imu/orientus/ (accessed on 27 September 2022).
  95. BOREAS D90. Available online: https://www.advancednavigation.com/inertial-navigation-systems/fog-gnss-ins/boreas/ (accessed on 27 September 2022).
  96. Spatial FOG Dual. Available online: https://www.advancednavigation.com/inertial-navigation-systems/fog-gnss-ins/spatial-fog-dual/ (accessed on 27 September 2022).
  97. Certus Evo. Available online: https://www.advancednavigation.com/inertial-navigation-systems/mems-gnss-ins/certus-evo/ (accessed on 27 September 2022).
  98. Certus. Available online: https://www.advancednavigation.com/inertial-navigation-systems/mems-gnss-ins/certus/ (accessed on 27 September 2022).
  99. Spatial. Available online: https://www.advancednavigation.com/inertial-navigation-systems/mems-gnss-ins/spatial/ (accessed on 27 September 2022).
  100. GNSS Compass. Available online: https://www.advancednavigation.com/inertial-navigation-systems/satellite-compass/gnss-compass/ (accessed on 27 September 2022).
  101. Kernel-100. Available online: https://inertiallabs.com/wp-content/uploads/2021/12/IMU-Kernel_Datasheet.rev_.2.9_December_2021.pdf (accessed on 27 September 2022).
  102. Kernel-110, 120. Available online: https://inertiallabs.com/wp-content/uploads/2022/09/IMU-Kernel-110-120_Datasheet.rev1_.7_September20_2022.pdf (accessed on 27 September 2022).
  103. Kernel-210, 220. Available online: https://inertiallabs.com/wp-content/uploads/2022/09/IMU-Kernel-210-220_Datasheet.rev1_.6_Sept20_2022.pdf (accessed on 27 September 2022).
  104. IMU-P. Available online: https://inertiallabs.com/wp-content/uploads/2022/09/IMU-P_Datasheet.rev4_.1_Sept20_2022.pdf (accessed on 27 September 2022).
  105. Peng, C.; Chen, M.; Spicer, J.B.; Jiang, X. Acoustics at the Nanoscale (Nanoacoustics): A Comprehensive Literature Review. Part II: Nanoacoustics for Biomedical Imaging and Therapy. Sens. Actuators A Phys. 2021, 332, 112925. [Google Scholar] [CrossRef] [PubMed]
  106. Rajaraman, P.; Simpson, J.; Neta, G.; de Gonzalez, A.B.; Ansell, P.; Linet, M.S.; Ron, E.; Roman, E. Early Life Exposure to Diagnostic Radiation and Ultrasound Scans and Risk of Childhood Cancer: Case-Control Study. BMJ 2011, 342, d472. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  107. Huang, Q.; Zeng, Z. A Review on Real-Time 3D Ultrasound Imaging Technology. Biomed. Res. Int. 2017, 2017, 6027029. [Google Scholar] [CrossRef] [Green Version]
  108. Morgan, M.R.; Broder, J.S.; Dahl, J.J.; Herickhoff, C.D. Versatile Low-Cost Volumetric 3-D Ultrasound Platform for Existing Clinical 2-D Systems. IEEE Trans. Med. Imaging 2018, 37, 2248–2256. [Google Scholar] [CrossRef]
  109. Prager, R.W.; Ijaz, U.Z.; Gee, A.H.; Treece, G.M. Three-Dimensional Ultrasound Imaging. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2010, 224, 193–223. [Google Scholar] [CrossRef]
  110. Fenster, A.; Parraga, G.; Bax, J. Three-Dimensional Ultrasound Scanning. Interface Focus 2011, 1, 503–519. [Google Scholar] [CrossRef] [Green Version]
  111. Yen, J.T.; Steinberg, J.P.; Smith, S.W. Sparse 2-D Array Design for Real Time Rectilinear Volumetric Imaging. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2000, 47, 93–110. [Google Scholar] [CrossRef] [Green Version]
  112. Yen, J.T.; Smith, S.W. Real-Time Rectilinear 3-D Ultrasound Using Receive Mode Multiplexing. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2004, 51, 216–226. [Google Scholar] [CrossRef]
  113. Turnbull, D.H.; Foster, F.S. Fabrication and Characterization of Transducer Elements in Two-Dimensional Arrays for Medical Ultrasound Imaging. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 1992, 39, 464–475. [Google Scholar] [CrossRef]
  114. Gee, A.; Prager, R.; Treece, G.; Berman, L. Engineering a Freehand 3D Ultrasound System. Pattern Recognit. Lett. 2003, 24, 757–777. [Google Scholar] [CrossRef]
  115. Wen, T.; Yang, F.; Gu, J.; Wang, L. A Novel Bayesian-Based Nonlocal Reconstruction Method for Freehand 3D Ultrasound Imaging. Neurocomputing 2015, 168, 104–118. [Google Scholar] [CrossRef]
  116. Chung, S.-W.; Shih, C.-C.; Huang, C.-C. Freehand Three-Dimensional Ultrasound Imaging of Carotid Artery Using Motion Tracking Technology. Ultrasonics 2017, 74, 11–20. [Google Scholar] [CrossRef] [PubMed]
  117. Daoud, M.I.; Alshalalfah, A.-L.; Awwad, F.; Al-Najar, M. Freehand 3D Ultrasound Imaging System Using Electromagnetic Tracking. In Proceedings of the 2015 International Conference on Open Source Software Computing (OSSCOM), Amman, Jordan, 10–13 September 2015; pp. 1–5. [Google Scholar]
  118. Chen, Z.; Huang, Q. Real-Time Freehand 3D Ultrasound Imaging. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2018, 6, 74–83. [Google Scholar] [CrossRef]
  119. Pelz, J.O.; Weinreich, A.; Karlas, T.; Saur, D. Evaluation of Freehand B-Mode and Power-Mode 3D Ultrasound for Visualisation and Grading of Internal Carotid Artery Stenosis. PLoS ONE 2017, 12, e0167500. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  120. Miller, D.; Lippert, C.; Vollmer, F.; Bozinov, O.; Benes, L.; Schulte, D.M.; Sure, U. Comparison of Different Reconstruction Algorithms for Three-dimensional Ultrasound Imaging in a Neurosurgical Setting. Int. J. Med. Robot. Comput. Assist. Surg. 2012, 8, 348–359. [Google Scholar] [CrossRef] [Green Version]
  121. Mercier, L.; Del Maestro, R.F.; Petrecca, K.; Kochanowska, A.; Drouin, S.; Yan, C.X.B.; Janke, A.L.; Chen, S.J.-S.; Collins, D.L. New Prototype Neuronavigation System Based on Preoperative Imaging and Intraoperative Freehand Ultrasound: System Description and Validation. Int. J. Comput. Assist. Radiol. Surg. 2011, 6, 507–522. [Google Scholar] [CrossRef]
  122. Chen, X.; Wen, T.; Li, X.; Qin, W.; Lan, D.; Pan, W.; Gu, J. Reconstruction of Freehand 3D Ultrasound Based on Kernel Regression. Biomed. Eng. Online 2014, 13, 1–15. [Google Scholar] [CrossRef] [Green Version]
  123. Sun, S.-Y.; Gilbertson, M.; Anthony, B.W. Probe Localization for Freehand 3D Ultrasound by Tracking Skin Features. In Medical Image Computing and Computer-Assisted Intervention—MICCAI 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 365–372. [Google Scholar]
  124. Worobey, L.A.; Udofa, I.A.; Lin, Y.-S.; Koontz, A.M.; Farrokhi, S.S.; Boninger, M.L. Reliability of Freehand Three-Dimensional Ultrasound to Measure Scapular Rotations. J. Rehabil. Res. Dev. 2014, 51, 985–994. [Google Scholar] [CrossRef]
  125. Passmore, E.; Pandy, M.G.; Graham, H.K.; Sangeux, M. Measuring Femoral Torsion in Vivo Using Freehand 3-D Ultrasound Imaging. Ultrasound Med. Biol. 2016, 42, 619–623. [Google Scholar] [CrossRef]
  126. Kim, T.; Kang, D.-H.; Shim, S.; Im, M.; Seo, B.K.; Kim, H.; Lee, B.C. Versatile Low-Cost Volumetric 3D Ultrasound Imaging Using Gimbal-Assisted Distance Sensors and an Inertial Measurement Unit. Sensors 2020, 20, 6613. [Google Scholar] [CrossRef] [PubMed]
  127. Lai, K.K.-L.; Lee, T.T.-Y.; Lee, M.K.-S.; Hui, J.C.-H.; Zheng, Y.-P. Validation of Scolioscan Air-Portable Radiation-Free Three-Dimensional Ultrasound Imaging Assessment System for Scoliosis. Sensors 2021, 21, 2858. [Google Scholar] [CrossRef] [PubMed]
  128. Jiang, W.; Chen, X.; Yu, C. A Real-time Freehand 3D Ultrasound Imaging Method for Scoliosis Assessment. J. Appl. Clin. Med. Phys. 2022, 23, e13709. [Google Scholar] [CrossRef] [PubMed]
  129. Ewertsen, C.; Săftoiu, A.; Gruionu, L.G.; Karstrup, S.; Nielsen, M.B. Real-Time Image Fusion Involving Diagnostic Ultrasound. Am. J. Roentgenol. 2013, 200, W249–W255. [Google Scholar] [CrossRef] [PubMed]
  130. Li, X.; Zhou, F.; Tan, H.; Zhang, W.; Zhao, C. Multimodal Medical Image Fusion Based on Joint Bilateral Filter and Local Gradient Energy. Inf. Sci. 2021, 569, 302–325. [Google Scholar] [CrossRef]
  131. Klibanov, A.L.; Hossack, J.A. Ultrasound in Radiology: From Anatomic, Functional, Molecular Imaging to Drug Delivery and Image-Guided Therapy. Investig. Radiol. 2015, 50, 657. [Google Scholar] [CrossRef] [Green Version]
  132. Baad, M.; Lu, Z.F.; Reiser, I.; Paushter, D. Clinical Significance of US Artifacts. Radiographics 2017, 37, 1408–1423. [Google Scholar] [CrossRef] [Green Version]
  133. D’Onofrio, M.; Beleù, A.; Gaitini, D.; Corréas, J.-M.; Brady, A.; Clevert, D., on behalf of the European Society of Radiology (ESR). Abdominal Applications of Ultrasound Fusion Imaging Technique: Liver, Kidney, and Pancreas. Insights Imaging 2019, 10, 6. [Google Scholar] [CrossRef] [Green Version]
  134. Chien, C.P.Y.; Lee, K.H.; Lau, V. Real-Time Ultrasound Fusion Imaging–Guided Interventions: A Review. Hong Kong J. Radiol. 2021, 24, 116. [Google Scholar] [CrossRef]
  135. Natarajan, S.; Marks, L.S.; Margolis, D.J.A.; Huang, J.; Macairan, M.L.; Lieu, P.; Fenster, A. Clinical Application of a 3D Ultrasound-Guided Prostate Biopsy System. In Urologic Oncology: Seminars and Original Investigations; Elsevier: Amsterdam, The Netherlands, 2011; Volume 29, pp. 334–342. [Google Scholar]
  136. Krücker, J.; Xu, S.; Venkatesan, A.; Locklin, J.K.; Amalou, H.; Glossop, N.; Wood, B.J. Clinical Utility of Real-Time Fusion Guidance for Biopsy and Ablation. J. Vasc. Interv. Radiol. 2011, 22, 515–524. [Google Scholar] [CrossRef]
  137. Appelbaum, L.; Sosna, J.; Nissenbaum, Y.; Benshtein, A.; Goldberg, S.N. Electromagnetic Navigation System for CT-Guided Biopsy of Small Lesions. Am. J. Roentgenol. 2011, 196, 1194–1200. [Google Scholar] [CrossRef] [PubMed]
  138. Venkatesan, A.M.; Kadoury, S.; Abi-Jaoudeh, N.; Levy, E.B.; Maass-Moreno, R.; Krücker, J.; Dalal, S.; Xu, S.; Glossop, N.; Wood, B.J. Real-Time FDG PET Guidance during Biopsies and Radiofrequency Ablation Using Multimodality Fusion with Electromagnetic Navigation. Radiology 2011, 260, 848–856. [Google Scholar] [CrossRef] [PubMed]
  139. Lee, M.W. Fusion Imaging of Real-Time Ultrasonography with CT or MRI for Hepatic Intervention. Ultrasonography 2014, 33, 227. [Google Scholar] [CrossRef] [PubMed]
  140. Sumi, H.; Itoh, A.; Kawashima, H.; Ohno, E.; Itoh, Y.; Nakamura, Y.; Hiramatsu, T.; Sugimoto, H.; Hayashi, D.; Kuwahara, T. Preliminary Study on Evaluation of the Pancreatic Tail Observable Limit of Transabdominal Ultrasonography Using a Position Sensor and CT-Fusion Image. Eur. J. Radiol. 2014, 83, 1324–1331. [Google Scholar] [CrossRef] [PubMed]
  141. Lee, K.-H.; Lau, V.; Gao, Y.; Li, Y.-L.; Fang, B.X.; Lee, R.; Lam, W.W.-M. Ultrasound-MRI Fusion for Targeted Biopsy of Myopathies. AJR Am. J. Roentgenol. 2019, 212, 1126–1128. [Google Scholar] [CrossRef] [PubMed]
  142. Burke, C.J.; Bencardino, J.; Adler, R. The Potential Use of Ultrasound-Magnetic Resonance Imaging Fusion Applications in Musculoskeletal Intervention. J. Ultrasound Med. 2017, 36, 217–224. [Google Scholar] [CrossRef] [Green Version]
  143. Klauser, A.S.; De Zordo, T.; Feuchtner, G.M.; Djedovic, G.; Weiler, R.B.; Faschingbauer, R.; Schirmer, M.; Moriggl, B. Fusion of Real-Time US with CT Images to Guide Sacroiliac Joint Injection in Vitro and in Vivo. Radiology 2010, 256, 547–553. [Google Scholar] [CrossRef] [Green Version]
  144. Sonn, G.A.; Margolis, D.J.; Marks, L.S. Target Detection: Magnetic Resonance Imaging-Ultrasound Fusion–Guided Prostate Biopsy. In Urologic Oncology: Seminars and Original Investigations; Elsevier: Amsterdam, The Netherlands, 2014; Volume 32, pp. 903–911. [Google Scholar]
  145. Costa, D.N.; Pedrosa, I.; Donato, F., Jr.; Roehrborn, C.G.; Rofsky, N.M. MR Imaging–Transrectal US Fusion for Targeted Prostate Biopsies: Implications for Diagnosis and Clinical Management. Radiographics 2015, 35, 696–708. [Google Scholar] [CrossRef]
  146. Appelbaum, L.; Mahgerefteh, S.Y.; Sosna, J.; Goldberg, S.N. Image-Guided Fusion and Navigation: Applications in Tumor Ablation. Tech. Vasc. Interv. Radiol. 2013, 16, 287–295. [Google Scholar] [CrossRef]
  147. Marks, L.; Young, S.; Natarajan, S. MRI–Ultrasound Fusion for Guidance of Targeted Prostate Biopsy. Curr. Opin. Urol. 2013, 23, 43. [Google Scholar] [CrossRef]
  148. Park, H.J.; Lee, M.W.; Lee, M.H.; Hwang, J.; Kang, T.W.; Lim, S.; Rhim, H.; Lim, H.K. Fusion Imaging–Guided Percutaneous Biopsy of Focal Hepatic Lesions with Poor Conspicuity on Conventional Sonography. J. Ultrasound Med. 2013, 32, 1557–1564. [Google Scholar] [CrossRef] [PubMed]
  149. Lee, M.W.; Rhim, H.; Cha, D.I.; Kim, Y.J.; Lim, H.K. Planning US for Percutaneous Radiofrequency Ablation of Small Hepatocellular Carcinomas (1–3 Cm): Value of Fusion Imaging with Conventional US and CT/MR Images. J. Vasc. Interv. Radiol. 2013, 24, 958–965. [Google Scholar] [CrossRef] [PubMed]
  150. Song, K.D.; Lee, M.W.; Rhim, H.; Cha, D.I.; Chong, Y.; Lim, H.K. Fusion Imaging–Guided Radiofrequency Ablation for Hepatocellular Carcinomas Not Visible on Conventional Ultrasound. Am. J. Roentgenol. 2013, 201, 1141–1147. [Google Scholar] [CrossRef]
  151. Helck, A.; D’Anastasi, M.; Notohamiprodjo, M.; Thieme, S.; Sommer, W.; Reiser, M.; Clevert, D.A. Multimodality Imaging Using Ultrasound Image Fusion in Renal Lesions. Clin. Hemorheol. Microcirc. 2012, 50, 79–89. [Google Scholar] [CrossRef] [PubMed]
  152. Andersson, M.; Hashimi, F.; Lyrdal, D.; Lundstam, S.; Hellström, M. Improved Outcome with Combined US/CT Guidance as Compared to US Guidance in Percutaneous Radiofrequency Ablation of Small Renal Masses. Acta Radiol. 2015, 56, 1519–1526. [Google Scholar] [CrossRef] [PubMed]
  153. Zhang, H.; Chen, G.; Xiao, L.; Ma, X.; Shi, L.; Wang, T.; Yan, H.; Zou, H.; Chen, Q.; Tang, L. Ultrasonic/CT Image Fusion Guidance Facilitating Percutaneous Catheter Drainage in Treatment of Acute Pancreatitis Complicated with Infected Walled-off Necrosis. Pancreatology 2018, 18, 635–641. [Google Scholar] [CrossRef]
  154. Rübenthaler, J.; Paprottka, K.J.; Marcon, J.; Reiser, M.; Clevert, D.A. MRI and Contrast Enhanced Ultrasound (CEUS) Image Fusion of Renal Lesions. Clin. Hemorheol. Microcirc. 2016, 64, 457–466. [Google Scholar] [CrossRef]
  155. Guo, Z.; Shi, H.; Li, W.; Lin, D.; Wang, C.; Liu, C.; Yuan, M.; Wu, X.; Xiong, B.; He, X. Chinese Multidisciplinary Expert Consensus: Guidelines on Percutaneous Transthoracic Needle Biopsy. Thorac. Cancer 2018, 9, 1530–1543. [Google Scholar] [CrossRef]
  156. Beigi, P.; Salcudean, S.E.; Ng, G.C.; Rohling, R. Enhancement of Needle Visualization and Localization in Ultrasound. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 169–178. [Google Scholar] [CrossRef]
  157. Holm, H.H.; Skjoldbye, B. Interventional Ultrasound. Ultrasound Med. Biol. 1996, 22, 773–789. [Google Scholar]
  158. Stone, J.; Beigi, P.; Rohling, R.; Lessoway, V.; Dube, A.; Gunka, V. Novel 3D Ultrasound System for Midline Single-Operator Epidurals: A Feasibility Study on a Porcine Model. Int. J. Obstet. Anesth. 2017, 31, 51–56. [Google Scholar] [CrossRef] [PubMed]
  159. Scholten, H.J.; Pourtaherian, A.; Mihajlovic, N.; Korsten, H.H.M.; Bouwman, R.A. Improving Needle Tip Identification during Ultrasound-guided Procedures in Anaesthetic Practice. Anaesthesia 2017, 72, 889–904. [Google Scholar] [CrossRef] [PubMed]
  160. Boctor, E.M.; Choti, M.A.; Burdette, E.C.; Webster Iii, R.J. Three-dimensional Ultrasound-guided Robotic Needle Placement: An Experimental Evaluation. Int. J. Med. Robot. Comput. Assist. Surg. 2008, 4, 180–191. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  161. Franz, A.M.; März, K.; Hummel, J.; Birkfellner, W.; Bendl, R.; Delorme, S.; Schlemmer, H.-P.; Meinzer, H.-P.; Maier-Hein, L. Electromagnetic Tracking for US-Guided Interventions: Standardized Assessment of a New Compact Field Generator. Int. J. Comput. Assist. Radiol. Surg. 2012, 7, 813–818. [Google Scholar] [CrossRef] [PubMed]
  162. Xu, H.-X.; Lu, M.-D.; Liu, L.-N.; Guo, L.-H. Magnetic Navigation in Ultrasound-Guided Interventional Radiology Procedures. Clin. Radiol. 2012, 67, 447–454. [Google Scholar] [CrossRef]
  163. Hakime, A.; Deschamps, F.; De Carvalho, E.G.M.; Barah, A.; Auperin, A.; De Baere, T. Electromagnetic-Tracked Biopsy under Ultrasound Guidance: Preliminary Results. Cardiovasc. Intervent. Radiol. 2012, 35, 898–905. [Google Scholar] [CrossRef]
  164. März, K.; Franz, A.M.; Seitel, A.; Winterstein, A.; Hafezi, M.; Saffari, A.; Bendl, R.; Stieltjes, B.; Meinzer, H.-P.; Mehrabi, A. Interventional Real-Time Ultrasound Imaging with an Integrated Electromagnetic Field Generator. Int. J. Comput. Assist. Radiol. Surg. 2014, 9, 759–768. [Google Scholar] [CrossRef]
  165. Wang, X.L.; Stolka, P.J.; Boctor, E.; Hager, G.; Choti, M. The Kinect as an Interventional Tracking System. In Medical Imaging 2012: Image-Guided Procedures, Robotic Interventions, and Modeling; SPIE: Washington, DC, USA, 2012; Volume 8316, pp. 276–281. [Google Scholar]
  166. Stolka, P.J.; Foroughi, P.; Rendina, M.; Weiss, C.R.; Hager, G.D.; Boctor, E.M. Needle Guidance Using Handheld Stereo Vision and Projection for Ultrasound-Based Interventions. In Medical Image Computing and Computer-Assisted Intervention—MICCAI 2014; Springer: Cham, Switzerland, 2014; pp. 684–691. [Google Scholar]
  167. Najafi, M.; Abolmaesumi, P.; Rohling, R. Single-Camera Closed-Form Real-Time Needle Tracking for Ultrasound-Guided Needle Insertion. Ultrasound Med. Biol. 2015, 41, 2663–2676. [Google Scholar] [CrossRef]
  168. Daoud, M.I.; Alshalalfah, A.-L.; Mohamed, O.A.; Alazrai, R. A Hybrid Camera-and Ultrasound-Based Approach for Needle Localization and Tracking Using a 3D Motorized Curvilinear Ultrasound Probe. Med. Image Anal. 2018, 50, 145–166. [Google Scholar] [CrossRef]
  169. Ho, H.S.S.; Mohan, P.; Lim, E.D.; Li, D.L.; Yuen, J.S.P.; Ng, W.S.; Lau, W.K.O.; Cheng, C.W.S. Robotic Ultrasound-guided Prostate Intervention Device: System Description and Results from Phantom Studies. Int. J. Med. Robot. Comput. Assist. Surg. 2009, 5, 51–58. [Google Scholar] [CrossRef]
  170. Orhan, S.O.; Yildirim, M.C.; Bebek, O. Design and Modeling of a Parallel Robot for Ultrasound Guided Percutaneous Needle Interventions. In Proceedings of the IECON 2015—41st Annual Conference of the IEEE Industrial Electronics Society, Yokohama, Japan, 9–12 November 2015; pp. 5002–5007. [Google Scholar]
  171. Poquet, C.; Mozer, P.; Vitrani, M.-A.; Morel, G. An Endorectal Ultrasound Probe Comanipulator with Hybrid Actuation Combining Brakes and Motors. IEEE/ASME Trans. Mechatron. 2014, 20, 186–196. [Google Scholar] [CrossRef]
  172. Chen, X.; Bao, N.; Li, J.; Kang, Y. A Review of Surgery Navigation System Based on Ultrasound Guidance. In Proceedings of the 2012 IEEE International Conference on Information and Automation, Shenyang, China, 6–8 June 2012; pp. 882–886. [Google Scholar]
  173. Stoll, J.; Ren, H.; Dupont, P.E. Passive Markers for Tracking Surgical Instruments in Real-Time 3-D Ultrasound Imaging. IEEE Trans. Med. Imaging 2011, 31, 563–575. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  174. Li, X.; Long, Q.; Chen, X.; He, D.; He, H. Assessment of the SonixGPS System for Its Application in Real-Time Ultrasonography Navigation-Guided Percutaneous Nephrolithotomy for the Treatment of Complex Kidney Stones. Urolithiasis 2017, 45, 221–227. [Google Scholar] [CrossRef] [PubMed]
  175. Hamamoto, S.; Unno, R.; Taguchi, K.; Ando, R.; Hamakawa, T.; Naiki, T.; Okada, S.; Inoue, T.; Okada, A.; Kohri, K. A New Navigation System of Renal Puncture for Endoscopic Combined Intrarenal Surgery: Real-Time Virtual Sonography-Guided Renal Access. Urology 2017, 109, 44–50. [Google Scholar] [CrossRef]
  176. Gomes-Fonseca, J.; Veloso, F.; Queirós, S.; Morais, P.; Pinho, A.C.M.; Fonseca, J.C.; Correia-Pinto, J.; Lima, E.; Vilaça, J.L. Assessment of Electromagnetic Tracking Systems in a Surgical Environment Using Ultrasonography and Ureteroscopy Instruments for Percutaneous Renal Access. Med. Phys. 2020, 47, 19–26. [Google Scholar] [CrossRef] [PubMed]
  177. Bharat, S.; Kung, C.; Dehghan, E.; Ravi, A.; Venugopal, N.; Bonillas, A.; Stanton, D.; Kruecker, J. Electromagnetic Tracking for Catheter Reconstruction in Ultrasound-Guided High-Dose-Rate Brachytherapy of the Prostate. Brachytherapy 2014, 13, 640–650. [Google Scholar] [CrossRef] [PubMed]
  178. Schwaab, J.; Prall, M.; Sarti, C.; Kaderka, R.; Bert, C.; Kurz, C.; Parodi, K.; Günther, M.; Jenne, J. Ultrasound Tracking for Intra-Fractional Motion Compensation in Radiation Therapy. Phys. Med. 2014, 30, 578–582. [Google Scholar] [CrossRef]
  179. Yu, A.S.; Najafi, M.; Hristov, D.H.; Phillips, T. Intrafractional Tracking Accuracy of a Transperineal Ultrasound Image Guidance System for Prostate Radiotherapy. Technol. Cancer Res. Treat. 2017, 16, 1067–1078. [Google Scholar] [CrossRef] [Green Version]
  180. Jakola, A.S.; Reinertsen, I.; Selbekk, T.; Solheim, O.; Lindseth, F.; Gulati, S.; Unsgård, G. Three-Dimensional Ultrasound–Guided Placement of Ventricular Catheters. World Neurosurg. 2014, 82, 536.e5–536.e9. [Google Scholar] [CrossRef]
  181. Brattain, L.J.; Floryan, C.; Hauser, O.P.; Nguyen, M.; Yong, R.J.; Kesner, S.B.; Corn, S.B.; Walsh, C.J. Simple and Effective Ultrasound Needle Guidance System. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 8090–8093. [Google Scholar]
  182. Kobayashi, Y.; Hamano, R.; Watanabe, H.; Koike, T.; Hong, J.; Toyoda, K.; Uemura, M.; Ieiri, S.; Tomikawa, M.; Ohdaira, T. Preliminary in Vivo Evaluation of a Needle Insertion Manipulator for Central Venous Catheterization. ROBOMECH J. 2014, 1, 1–7. [Google Scholar] [CrossRef]
Figure 1. Various tracking systems for biomedical US imaging applications.
Figure 2. Principle of the camera-based optical tracking technique. (a) The three coordinates represent the three lenses integrated in the camera bar. The camera bar is looking at point P(x, y, z), where P1, P2, and P3 are the intersections on the image plane. (b) The top view demonstrates the similar triangles used to calculate the position information. The depth d of point P can be determined via triangulation. Reprinted from [20] with permission.
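The similar-triangles construction in Figure 2 can be condensed into a few lines of code. The sketch below is a minimal two-view illustration, assuming a rectified pinhole model with a known focal length (in pixels) and a known baseline; the function name and the example numbers are hypothetical and not taken from [20].

```python
def triangulate_depth(u_left, u_right, focal_px, baseline_mm):
    """Depth of point P from its horizontal image coordinates in two
    rectified views; similar triangles give d = f * b / disparity."""
    disparity = u_left - u_right  # pixel offset between the two projections
    if disparity <= 0:
        raise ValueError("P must project with positive disparity")
    return focal_px * baseline_mm / disparity

# A point seen at u = 660 px (left) and u = 610 px (right) by cameras with a
# 1000 px focal length and a 100 mm baseline lies about 2 m from the bar.
print(triangulate_depth(660.0, 610.0, 1000.0, 100.0))  # 2000.0 (mm)
```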
Figure 3. Coordinate systems involved in 3D electromagnetic tracking (adapted from [21]).
Figure 4. Two adjacent links connected by a revolute joint, with their relative position denoted using the Denavit-Hartenberg convention (adapted from [23]).
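To make the convention in Figure 4 concrete, the sketch below assembles the 4 × 4 homogeneous transform between adjacent links from the four Denavit-Hartenberg parameters. It uses the classic (distal) form; Craig [23] teaches a modified convention with the elementary transforms reordered, so treat this as an illustrative variant rather than the book's exact matrix.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform from link i-1 to link i given the classic
    Denavit-Hartenberg parameters: link length a, link twist alpha,
    link offset d, and joint angle theta (the revolute variable)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Chaining per-joint transforms yields the pose of the distal link in the
# base frame, which is how a mechanical tracking arm localizes a US probe.
T_02 = dh_transform(0.30, 0.0, 0.0, np.pi / 4) @ dh_transform(0.25, 0.0, 0.0, -np.pi / 6)
```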
Figure 5. The transmitter sends out carrier signals to the receiver set. The unknown transmitter position and the speed of sound can be solved based on the time differences of arrival.
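A minimal numerical version of this TDOA solve is sketched below: with five fixed receivers, the four arrival-time differences determine the transmitter position and the speed of sound (four unknowns) by nonlinear least squares. The receiver layout, the units (mm and ms, so c is about 343 mm/ms), and the initial guess are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import least_squares

# Fixed receiver positions (mm); five receivers give four TDOAs, enough to
# solve for the transmitter position (x, y, z) and the speed of sound c.
receivers = np.array([[0, 0, 0], [300, 0, 0], [0, 300, 0],
                      [0, 0, 300], [300, 300, 0]], dtype=float)

def residuals(params, receivers, tdoa_ms):
    pos, c = params[:3], params[3]  # transmitter position (mm), c (mm/ms)
    ranges = np.linalg.norm(receivers - pos, axis=1)
    return (ranges[1:] - ranges[0]) / c - tdoa_ms

# Simulate measurements from a known pose, then solve from a rough guess.
truth, c_true = np.array([120.0, 80.0, 150.0]), 343.0
r = np.linalg.norm(receivers - truth, axis=1)
tdoa_ms = (r[1:] - r[0]) / c_true

fit = least_squares(residuals, x0=[50, 50, 100, 340], args=(receivers, tdoa_ms))
print(fit.x)  # ~[120, 80, 150, 343]
```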
Figure 6. MPU 9250 and its coordinate system with Euler angles. Reprinted from [27] with permission.
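For orientation sensing of the kind shown in Figure 6, a static estimate of the Euler angles can be read directly from the accelerometer and magnetometer. The sketch below uses one common aerospace-sequence formulation and assumes a stationary, magnetically calibrated sensor in an x-forward, y-right, z-down frame; axis conventions vary between boards, so the signs may need adjusting. It is not the fusion algorithm of [27].

```python
import numpy as np

def euler_from_accel_mag(accel, mag):
    """Static (roll, pitch, yaw) in radians from accelerometer and
    magnetometer triplets, assuming an x-forward, y-right, z-down frame."""
    ax, ay, az = accel / np.linalg.norm(accel)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    # Rotate the magnetic field into the horizontal plane, then take heading.
    mx, my, mz = mag / np.linalg.norm(mag)
    mxh = (mx * np.cos(pitch) + my * np.sin(roll) * np.sin(pitch)
           + mz * np.cos(roll) * np.sin(pitch))
    myh = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-myh, mxh)
    return roll, pitch, yaw

# Level sensor with the magnetic field in the x-z plane: all angles are 0.
print(euler_from_accel_mag(np.array([0.0, 0.0, 1.0]),
                           np.array([0.3, 0.0, 0.5])))
```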
Figure 9. Schematic illustration of the process of US/MRI fusion. MR and transrectal US (TRUS) images were (1) segmented and then (2) rigidly aligned. Fusion then proceeded, involving (3) a surface registration, and (4) elastic (non-rigid) interpolation. Reprinted from [135] with permission.
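Step (2) of this pipeline, rigid alignment of the segmented anatomy, has a closed-form least-squares solution once point correspondences are fixed. The sketch below is a standard Kabsch/Procrustes implementation offered for illustration; [135] does not specify its exact solver.

```python
import numpy as np

def rigid_align(source, target):
    """Closed-form least-squares rotation R and translation t mapping paired
    source points onto target points (Kabsch/Procrustes). Both inputs are
    N x 3 arrays with rows in correspondence."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Apply as: aligned = source @ R.T + t, before elastic (non-rigid) refinement.
```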
Figure 10. Equipment setup for electromagnetic tracking during interventional procedures. Reprinted from [138] with permission.
Figure 11. (A,B) Photographs of the UroNav US/MRI fusion system. (A) An electromagnetic field generator enables tracking of the transrectal US probe; (B) US/MRI fusion device. (C) Artemis US/MRI fusion system. Reprinted from [144] with permission.
Figure 12. A magnetic navigation system for liver cancer ablation procedures. (a) The setup of the magnetic navigation system; (b) Magnetic field generator and magnetic receivers attached to a US probe. Reprinted from [162] with permission.
Figure 13. (a) Out-of-plane needle trajectory planning. (b) Schematic of the coordinate systems and needle movement. Reprinted from [167] with permission.
Figure 14. A robotic US-guided prostate intervention system. (a) Gantry and US probe holder. (b) Gun-holder and biopsy gun. Reprinted from [169] with permission.
Figure 15. A basic process of 3D US image-guided surgery navigation. Reprinted from [172] with permission.
Figure 16. A navigation system for percutaneous renal puncture. (A) The main components of the system; (B) The magnetic sensor attached to a US probe; (C) The magnetic field generator; (D) US image and CT volume data displayed side by side on the same monitor. Reprinted from [175] with permission.
Figure 17. The experimental setup for catheter tracking in a controlled laboratory environment (a–c) and in a brachytherapy operating room (d,e). (a) The experimental phantom setup; (b) The flexible electromagnetic-tracked guidewire and catheter; (c) Catheters inserted into the prostate model through the grid; (d) The experimental setup positioned on the treatment table in the operating room; (e) Mimicking a typical brachytherapy setup. Reprinted from [177] with permission.
Figure 18. (a) Needle insertion manipulator for central venous catheterization. (b) Overview of venous puncture experiment in a porcine model. Reprinted from [182] with permission.

Table 1. Summary of commercially available optical tracking systems.

| Manufacturer | Model | Measurement Volume (Radius × Width × Height) or FOV | Resolution | Volumetric Accuracy (RMS) | Average Latency | Measurement Rate |
|---|---|---|---|---|---|---|
| Northern Digital Inc., Waterloo, ON, Canada | Polaris Vega ST [31] | 2400 × 1566 × 1312 mm³ (Pyramid Volume); 3000 × 1856 × 1470 mm³ (Extended Pyramid) | N/A | 0.12 mm (Pyramid Volume); 0.15 mm (Extended Pyramid) | <16 ms | 60 Hz |
| | Polaris Vega VT [32] | 2400 × 1566 × 1312 mm³ (Pyramid Volume); 3000 × 1856 × 1470 mm³ (Extended Pyramid) | N/A | 0.12 mm (Pyramid Volume); 0.15 mm (Extended Pyramid) | <16 ms | 60 Hz |
| | Polaris Vega XT [33] | 2400 × 1566 × 1312 mm³ (Pyramid Volume); 3000 × 1856 × 1470 mm³ (Extended Pyramid) | N/A | 0.12 mm (Pyramid Volume); 0.15 mm (Extended Pyramid) | <3 ms | 400 Hz |
| | Polaris Vicra [34] | 1336 × 938 × 887 mm³ | N/A | 0.25 mm | N/A | 20 Hz |
| ClaroNav Inc., Toronto, ON, Canada | H3-60 [35] | 2400 × 2000 × 1600 mm³ | 1280 × 960 | 0.20 mm | ~60 ms | 16 Hz |
| | SX60 [35] | 1150 × 700 × 550 mm³ | 640 × 480 | 0.25 mm | ~20 ms | 48 Hz |
| | HX40 [35] | 1200 × 1200 × 900 mm³ | 1024 × 768 | 0.20 mm | ~50 ms | 20 Hz |
| | HX60 [35] | 2000 × 1300 × 1000 mm³ | 1024 × 768 | 0.35 mm | ~50 ms | 20 Hz |
| BTS Bioengineering Corp., Quincy, MA, USA | SMART DX 100 [36] | 2000 × 2000 × 2000 mm³ | 0.3 MP | <0.20 mm | N/A | 280 FPS |
| | SMART DX 400 [36] | 4000 × 3000 × 3000 mm³ | 1.0 MP | <0.30 mm | N/A | 300 FPS |
| | SMART DX 700 [36] | 4000 × 3000 × 3000 mm³ | 1.5 MP | <0.10 mm | N/A | 1000 FPS |
| | SMART DX 6000 [36] | 4000 × 3000 × 3000 mm³ | 2.2 MP | <0.10 mm | N/A | 2000 FPS |
| | SMART DX 7000 [36] | 6000 × 3000 × 3000 mm³ | 4.0 MP | <0.10 mm | N/A | 2000 FPS |
| NaturalPoint, Inc., Corvallis, OR, USA | OptiTrack PrimeX 41 [37] | FOV 51° × 51° | 4.1 MP | 0.10 mm | 5.5 ms | 250+ FPS |
| | OptiTrack PrimeX 22 [38] | FOV 79° × 47° | 2.2 MP | 0.15 mm | 2.8 ms | 500+ FPS |
| | OptiTrack PrimeX 13 [39] | FOV 56° × 46° | 1.3 MP | 0.20 mm | 4.2 ms | 1000 FPS |
| | OptiTrack PrimeX 13W [40] | FOV 82° × 70° | 1.3 MP | 0.30 mm | 4.2 ms | 1000 FPS |
| | OptiTrack SlimX 13 [41] | FOV 82° × 70° | 1.3 MP | 0.30 mm | 4.2 ms | 1000 FPS |
| | OptiTrack V120:Trio [42] | FOV 47° × 43° | 640 × 480 | N/A | 8.33 ms | 120 FPS |
| | OptiTrack V120:Duo [43] | FOV 47° × 43° | 640 × 480 | N/A | 8.33 ms | 120 FPS |
| | OptiTrack Flex 13 [44] | FOV 56° × 46° | 1.3 MP | N/A | 8.3 ms | 120 FPS |
| | OptiTrack Flex 3 [45] | FOV 58° × 45° | 640 × 480 | N/A | 10 ms | 100 FPS |
| | OptiTrack Slim 3U [46] | FOV 58° × 45° | 640 × 480 | N/A | 8.33 ms | 120 FPS |
| | TrackIR 4 [47] | 46° (Horizontal) | 355 × 288 | N/A | N/A | 120 FPS |
| | TrackIR 5 [47] | 51.7° (Horizontal) | 640 × 480 | N/A | N/A | 120 FPS |
| Qualisys Inc., Gothenburg, Sweden | Arqus A5 [30] | FOV 77° × 62° | 5 MP (normal); 1 MP (high-speed) | 0.06 mm | N/A | 700 FPS (normal); 1400 FPS (high-speed) |
| | Arqus A9 [30] | FOV 82° × 48° | 9 MP (normal); 2.5 MP (high-speed) | 0.05 mm | N/A | 300 FPS (normal); 590 FPS (high-speed) |
| | Arqus A12 [30] | FOV 70° × 56° | 12 MP (normal); 3 MP (high-speed) | 0.04 mm | N/A | 300 FPS (normal); 1040 FPS (high-speed) |
| | Arqus A26 [30] | FOV 77° × 77° | 26 MP (normal); 6.5 MP (high-speed) | 0.03 mm | N/A | 150 FPS (normal); 290 FPS (high-speed) |
| | Miqus M1 [48] | FOV 58° × 40° | 1 MP | 0.14 mm | N/A | 250 FPS |
| | Miqus M3 [48] | FOV 80° × 53° | 2 MP (normal); 0.5 MP (high-speed) | 0.11 mm | N/A | 340 FPS (normal); 650 FPS (high-speed) |
| | Miqus M5 [48] | FOV 49° × 49° | 4 MP (normal); 1 MP (high-speed) | 0.07 mm | N/A | 180 FPS (normal); 360 FPS (high-speed) |
| | Miqus Hybrid [49] | FOV 62° × 37° | 2 MP | N/A | N/A | 340 FPS |
| | 3+ [50] | N/A | 1.3 MP (normal); 0.3 MP (high-speed) | N/A | N/A | 500 FPS (normal); 1750 FPS (high-speed) |
| | 5+ [50] | 49° (Horizontal) | 4 MP (normal); 1 MP (high-speed) | N/A | N/A | 180 FPS (normal); 360 FPS (high-speed) |
| | 6+ [50] | 56° (Horizontal) | 6 MP (normal); 1.5 MP (high-speed) | N/A | N/A | 450 FPS (normal); 1660 FPS (high-speed) |
| | 7+ [50] | 54° (Horizontal) | 12 MP (normal); 3 MP (high-speed) | N/A | N/A | 300 FPS (normal); 1100 FPS (high-speed) |
| Vicon Industries Inc., Hauppauge, NY, USA | Valkyrie VK26 [51] | FOV 72° × 72° | 26.2 MP | N/A | N/A | 150 FPS |
| | Valkyrie VK16 [51] | FOV 72° × 56° | 16.1 MP | N/A | N/A | 300 FPS |
| | Valkyrie VK8 [51] | FOV 72° × 42° | 8.0 MP | N/A | N/A | 500 FPS |
| | Valkyrie VKX [51] | FOV 66° × 66° | 7.2 MP | N/A | N/A | 380 FPS |
| | Vantage+ V16 [52] | FOV 76.4° × 76.4° | 16 MP (normal); 4.2 MP (high-speed) | N/A | 8.3 ms | 120 FPS (normal); 500 FPS (high-speed) |
| | Vantage+ V8 [52] | FOV 61.7° × 47° | 8 MP (normal); 2.2 MP (high-speed) | N/A | 5.5 ms | 260 FPS (normal); 910 FPS (high-speed) |
| | Vantage+ V5 [52] | FOV 63.5° × 55.1° | 5 MP (normal); 1.8 MP (high-speed) | N/A | 4.7 ms | 420 FPS (normal); 1070 FPS (high-speed) |
| | Vero v2.2 [53] | FOV 98.1° × 50.1° | 2.2 MP | N/A | 3.6 ms | 330 FPS |
| | Vero v1.3 [53] | FOV 55.2° × 43.9° | 1.3 MP | N/A | 3.4 ms | 250 FPS |
| | Vero v1.3 X [53] | FOV 79.0° × 67.6° | 1.3 MP | N/A | 3.4 ms | 250 FPS |
| | Vero Vertex [53] | FOV 100.6° × 81.1° | 1.3 MP | N/A | 3.4 ms | 120 FPS |
| | Vue [54] | FOV 82.7° × 52.7° | 2.1 MP | N/A | N/A | 60 FPS |
| | Viper [55] | FOV 81.8° × 49.4° | 2.2 MP | N/A | 3.2 ms | 240 FPS |
| | ViperX [56] | FOV 50.2° × 50.2° | 6.3 MP | N/A | 3.2 ms | 240 FPS |
| Atracsys LLC., Puidoux, Switzerland | fusionTrack 500 [57] | 2000 × 1327 × 976 mm³ | 2.2 MP | 0.09 mm | ~4 ms | 335 Hz |
| | fusionTrack 250 [58] | 1400 × 1152 × 900 mm³ | 2.2 MP | 0.09 mm | ~4 ms | 120 Hz |
| | spryTrack 180 [59] | 1400 × 1189 × 1080 mm³ | 1.2 MP | 0.19 mm | <25 ms | 54 Hz |
| | spryTrack 300 [60] | 1400 × 805 × 671 mm³ | 1.2 MP | 0.14 mm | <25 ms | 54 Hz |
| Motion Analysis Corp., Rohnert Park, CA, USA | Kestrel 4200 [61] | N/A | 4.2 MP | N/A | N/A | 200 FPS |
| | Kestrel 2200 [62] | N/A | 2.2 MP | N/A | N/A | 332 FPS |
| | Kestrel 1300 [63] | N/A | 1.3 MP | N/A | N/A | 204 FPS |
| | Kestrel 300 [64] | N/A | 0.3 MP | N/A | N/A | 810 FPS |
| STT Systems, Donostia-San Sebastian, Spain | EDDO Biomechanics [65] | N/A | N/A | 1 mm | N/A | 120 FPS |
| Advanced Realtime Tracking GmbH & Co. KG, Oberbayern, Germany | ARTTRACK6/M [66] | FOV 135° × 102° | 1280 × 1024 | N/A | N/A | 180 Hz |
| | ARTTRACK5 [67] | FOV 98° × 77° | 1280 × 1024 | N/A | 10 ms | 150 Hz |
| | SMARTTRACK3 [68] | FOV 135° × 102° | 1280 × 1024 | N/A | 9 ms | 150 Hz |

Table 2. Summary of commercially available electromagnetic tracking systems.

| Manufacturer | Model | Tracking Distance | Position Accuracy (RMS) | Orientation Accuracy (RMS) | Average Latency | Measurement Rate |
|---|---|---|---|---|---|---|
| Northern Digital Inc., Waterloo, ON, Canada | Aurora-Cube Volume-5DOF [70] | N/A | 0.70 mm | 0.2° | N/A | 40 Hz |
| | Aurora-Cube Volume-6DOF [70] | N/A | 0.48 mm | 0.3° | N/A | 40 Hz |
| | Aurora-Dome Volume-5DOF [70] | 660 mm | 1.10 mm | 0.2° | N/A | 40 Hz |
| | Aurora-Dome Volume-6DOF [70] | 660 mm | 0.70 mm | 0.3° | N/A | 40 Hz |
| | 3D Guidance trakSTAR-6DOF [71] | 660 mm | 1.40 mm | 0.5° | N/A | 80 Hz |
| | 3D Guidance driveBAY-6DOF [71] | 660 mm | 1.40 mm | 0.5° | N/A | 80 Hz |
| Polhemus Inc., Colchester, VT, USA | Viper [72] | N/A | 0.38 mm | 0.10° | 1 ms | 960 Hz |
| | Fastrak [73] | N/A | 0.76 mm | 0.15° | 4 ms | 120 Hz |
| | Patriot [74] | N/A | 1.52 mm | 0.40° | 18.5 ms | 60 Hz |
| | Patriot Wireless [75] | N/A | 7.62 mm | 1.00° | 20 ms | 50 Hz |
| | Liberty [76] | N/A | 0.76 mm | 0.15° | 3.5 ms | 240 Hz |
| | Liberty Latus [77] | N/A | 2.54 mm | 0.50° | 5 ms | 188 Hz |
| | G4 [78] | N/A | 2.00 mm | 0.50° | <10 ms | 120 Hz |

Table 3. Summary of commercially available inertial tracking systems.

| Manufacturer | Model | Static Accuracy (Roll/Pitch) | Static Accuracy (Heading) | Dynamic Accuracy (Roll/Pitch) | Dynamic Accuracy (Heading) | Average Latency | Update Rate |
|---|---|---|---|---|---|---|---|
| Xsens Technologies B.V., Enschede, The Netherlands | MTw Awinda [82] | 0.5° | 1.0° | 0.75° | 1.5° | 30 ms | 120 Hz |
| | Xsens DOT [81] | 0.5° | 1.0° | 1.0° | 2.0° | 30 ms | 60 Hz |
| | MTi-1 [83] | 0.5° | N/A | N/A | N/A | N/A | 100 Hz |
| | MTi-2 [83] | 0.5° | N/A | 0.8° | N/A | N/A | 100 Hz |
| | MTi-3 [83] | 0.5° | N/A | 0.8° | 2.0° | N/A | 100 Hz |
| | MTi-7 [83] | 0.5° | N/A | 0.5° | 1.5° | N/A | 100 Hz |
| | MTi-8 [83] | 0.5° | N/A | 0.5° | 1.0° | N/A | 100 Hz |
| | MTi-20 [84] | 0.2° | N/A | 0.5° | N/A | N/A | N/A |
| | MTi-30 [84] | 0.2° | N/A | 0.5° | 1.0° | N/A | N/A |
| | MTi-200 [84] | 0.2° | N/A | 0.3° | N/A | <10 ms | N/A |
| | MTi-300 [84] | 0.2° | N/A | 0.3° | 1.0° | <10 ms | N/A |
| | MTi-710 [84] | 0.2° | N/A | 0.3° | 0.8° | <10 ms | 400 Hz |
| | MTi-610 [85] | N/A | N/A | N/A | N/A | N/A | 400 Hz |
| | MTi-620 [85] | 0.2° | N/A | 0.25° | N/A | N/A | 400 Hz |
| | MTi-630 [85] | 0.2° | N/A | 0.25° | 1.0° | N/A | 400 Hz |
| | MTi-670 [85] | 0.2° | N/A | 0.25° | 0.8° | N/A | 400 Hz |
| | MTi-680 [85] | 0.2° | N/A | 0.25° | 0.5° | N/A | 400 Hz |
| STT Systems, Donostia-San Sebastian, Spain | iSen system [86] | N/A | N/A | <0.5° | <2.0° | N/A | 400 Hz |
| VectorNav Technologies, Dallas, TX, USA | VN-100 [87] | 0.5° | N/A | 1.0° | 2.0° | N/A | 800 Hz |
| | VN-110 [88] | 0.05° | N/A | N/A | 2.0° | N/A | 800 Hz |
| | VN-200 [89] | 0.5° | 2.0° | 0.2° (1σ) | 0.03° (1σ) | N/A | 800 Hz |
| | VN-210 [90] | 0.05° | 2.0° | 0.015° (1σ) | 0.05–0.1° (1σ) | N/A | 800 Hz |
| | VN-300 [91] | 0.5° | 2.0° | 0.03° (1σ) | 0.2° (1σ) | N/A | 400 Hz |
| | VN-310 [92] | 0.05° | 2.0° | 0.015° (1σ) | 0.05–0.1° (1σ) | N/A | 800 Hz |
| Advanced Navigation, Sydney, Australia | Motus [93] | 0.05° | 0.8° | N/A | N/A | N/A | 1000 Hz |
| | Orientus [94] | 0.2° | 0.8° | 0.6° | 1.0° | 0.3 ms | 1000 Hz |
| | Boreas D90 [95] | 0.005° * | 0.006° * | N/A | N/A | N/A | 1000 Hz |
| | Spatial FOG Dual [96] | 0.005° * | 0.007° | N/A | N/A | N/A | 1000 Hz |
| | Certus Evo [97] | 0.01° * | 0.01° | N/A | N/A | N/A | 1000 Hz |
| | Certus [98] | 0.03° * | 0.06° | N/A | N/A | N/A | 1000 Hz |
| | Spatial [99] | 0.04° * | 0.08° | N/A | N/A | 0.4 ms | 1000 Hz |
| | GNSS Compass [100] | 0.4° | 0.4° | N/A | N/A | N/A | 200 Hz |
| Inertial Labs, Paeonian Springs, VA, USA | KERNEL-100 [101] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |
| | KERNEL-110 [102] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |
| | KERNEL-120 [102] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |
| | KERNEL-210 [103] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |
| | KERNEL-220 [103] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |
| | IMU-P [104] | 0.05° | 0.08° | N/A | N/A | <1 ms | 2000 Hz |

Table 5. U.S. Food and Drug Administration-approved US/MRI fusion systems [147].

| System Type | Manufacturer | Year of FDA Approval | US Image Acquisition | Tracking Principle | Biopsy Route |
|---|---|---|---|---|---|
| UroNav | Philips | 2005 | Manual sweep | Electromagnetic tracking | Transrectal |
| Artemis | Eigen | 2008 | Manual rotation | Mechanical arm | Transrectal |
| Urostation | Koelis | 2010 | Automatic US probe rotation | Real-time registration | Transrectal |
| HI-RVS | Hitachi | 2010 | Real-time biplanar transrectal US | Electromagnetic tracking | Transrectal or transperineal |
| GeoScan | BioJet | 2012 | Manual sweep | Mechanical arm | Transrectal or transperineal |

Table 6. A summary of US image fusion applications.

| Reference | Modality for Fusion | Tracking Principle | Application |
|---|---|---|---|
| Park et al. [148] | Liver CT or MRI | Electromagnetic tracking | Biopsy of focal hepatic lesions with poor conspicuity on conventional B-mode US images |
| Lee et al. [149] | Liver CT or MRI | Electromagnetic tracking | Lesion detection of small hepatocellular carcinomas (HCCs) |
| Song et al. [150] | Liver CT or MRI | Plane registration and point registration | Improve sonographic conspicuity of HCC and feasibility of percutaneous radiofrequency ablation for HCCs not visible on conventional US images |
| Helck et al. [151] | Renal CT or MRI | Electromagnetic tracking | Identifiability and assessment of the dignity of renal lesions |
| Andersson et al. [152] | Renal CT | Electromagnetic tracking | Image-guided percutaneous radiofrequency ablation of small renal masses |
| Zhang et al. [153] | Pancreatic CT | Real-time registration | Image-guided percutaneous catheter drainage in treatment of acute pancreatitis |
| Klauser et al. [143] | Musculoskeletal CT | Internal landmarks | Image-guided sacroiliac joint injection |
| Lee et al. [141] | Thigh MRI | Real-time registration | Selecting the appropriate biopsy site in patients with suspected myopathies |
| Rübenthaler et al. [154] | Renal MRI/contrast-enhanced US | Electromagnetic tracking | Classification of unclear and difficult renal lesions |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
