Article

Research on Monocular-Vision-Based Finger-Joint-Angle-Measurement System

Faculty of Mechanical Engineering & Mechanics, Ningbo University, Ningbo 315211, China
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(19), 7276; https://doi.org/10.3390/s22197276
Submission received: 21 August 2022 / Revised: 19 September 2022 / Accepted: 21 September 2022 / Published: 26 September 2022
(This article belongs to the Special Issue Advanced Intelligent Control in Robots)

Abstract

The quantitative measurement of finger-joint range of motion plays an important role in assessing the level of hand disability and in guiding the treatment of patients. An industrial monocular-vision-based knuckle-joint-activity-measurement system is proposed that offers a short measurement time and the simultaneous measurement of multiple joints. In terms of hardware, the system adjusts the light-irradiation angle and intensity on the markers by actively adjusting the height of the light source, which enhances the difference between the markers and the background and reduces the difficulty of segmenting the target markers from the background. In terms of algorithms, several vision algorithms are combined and compared, and image-threshold segmentation with Hough outer- and inner-edge line detection is selected as the system's knuckle-activity-range detection method. To verify the accuracy of the visual-detection method, nine healthy volunteers were recruited for experimental validation. The results showed that the average angular deviation in knuckle flexion/extension ranged from 0.43° to 0.59°, and the average angular deviation in knuckle adduction/abduction ranged from 0.30° to 0.81°, all less than 1°. In the multi-angle speed-measurement experiment, the time taken by the system was much less than that taken by the conventional method.

1. Introduction

The quantitative measurement of hand-joint range of motion (ROM) is important for clinicians to assess a patient's level of hand disability and the effectiveness of intervention therapy. In the clinical setting, knuckle goniometers are often used to measure ROM because of their ease of use, portability, and affordability. However, these devices are time-consuming for single-joint angle measurements and do not allow simultaneous multi-joint angle measurements. Many researchers have studied knuckle-angle measurement in depth, including wearable-sensor-based and vision-based methods. Okuyama et al. developed a finger-joint-angle-measurement system based on flexible polymer sensors [1]. The system measures the flexion/extension movement of fingers by installing flexible polymer sensors on the finger surfaces, which enables the detection of joint-angle changes during daily grasping movements. A three-dimensional (3-D) finger-motion-measurement system based on a soft sensor was proposed by Park et al. [2]. Changcheng et al. designed an integrated mechanical-sensor detection system, consisting of an angle-measurement device and a measurement circuit, to measure finger-joint angles [3]. The effectiveness of the system was verified by joint-angle-measurement, motion-law-evaluation, and object-grasping experiments, and the results showed that the root mean square (RMS) errors of the DIP, PIP, and MCP angle measurements were 0.36, 0.59, and 0.32 degrees, respectively [3]. These wearable-sensor-based methods measure finger-joint angles with high accuracy, but the difficulty of donning the sensors has not been effectively solved in clinical applications for patients with hand motor dysfunction [4,5,6,7,8,9,10].
Vision-based knuckle-angle-measurement systems can measure multi-joint angles dynamically without direct physical contact between the doctor and the patient's hand. They work by first capturing an image of the entire hand and then using computer-vision techniques to estimate the hand posture [11,12,13,14,15]. Commercial devices (such as Leap Motion) are currently used for hand-angle measurement [16,17] and, recently, virtual-reality headsets (such as Facebook's Oculus Quest and Microsoft's HoloLens 2) have been equipped with hand tracking for human–computer interaction. The two main problems faced by current vision-based hand-posture-estimation systems are the low accuracy of the knuckle-angle measurement and the strong restriction on the camera view [18]. Lee et al. proposed a method of measuring finger-joint angles and finger forces during maximum cylindrical grip using a multi-camera photogrammetric method with markers and a pressure-sensitive film, respectively [19]. The experimental results showed that this method can be used to judge the extension/flexion direction of the knuckle.
In this paper, an industrial monocular-vision-based knuckle-angle-measurement system is proposed, building on an existing computer-vision detection system [20]. The system consists of a hardware system, a vision system, and a control system. The hand visual markers in the hardware system simplify knuckle identification, and the use of a high-resolution camera greatly improves the accuracy of the knuckle-angle detection. The active multi-angle light-detection system, composed of the control system, the hardware system, and the specified light source, can adjust the light-irradiation angle and intensity on the markers by adjusting the height of the light source, thus enhancing the difference between the markers and the background, making the markers easy to segment from the background and simplifying the marker-segmentation process.

2. Biological Structure of Human Fingers and Their Movement Characteristics

2.1. Structural Composition of the Human Hand

The human hand consists of the index finger (IF), middle finger (MF), ring finger (RF), little finger (LF), and thumb (TUM). The IF, MF, RF, and LF each consist of a one-degree-of-freedom (DOF) distal interphalangeal (DIP) joint, a one-DOF proximal interphalangeal (PIP) joint, a two-DOF metacarpophalangeal (MCP) joint, and a two-DOF carpometacarpal (CMC) joint. The thumb consists of a one-DOF interphalangeal (IP) joint, a two-DOF metacarpophalangeal (MCP) joint, and a two-DOF carpometacarpal (TM) joint [21], as shown in Figure 1.

2.2. Finger-Movement Characteristics

The movement of the hand joints is mainly manifested in the abduction/adduction and flexion/extension movements of the four fingers and the thumb. Human finger movement has the following characteristics: (1) the DIP and PIP joints of the four fingers other than the thumb are coupled to each other and flex together; (2) when the MCP joint of one of the four fingers other than the thumb is flexed, the adjacent MCP joints also flex. According to the Evaluation of Rehabilitation Therapy, the ROM of the human finger joints and the traditional measurement methods can be determined, as shown in Figure 2.

3. Experimental-Platform Construction

Machine-vision technology, including both hardware and software, is well developed, but in a computer-vision measurement system the design and layout of the lighting remains a pivotal link that can significantly affect measurement performance. A good illumination system greatly enhances the difference between the measurement target and the background, improves imaging, and makes the target easier to identify and segment, thereby reducing the computation time and hardware cost. The light-source arrangements used in the field of defect detection are often divided into passive multi-angle illumination-detection methods and active multi-angle illumination-detection methods. Considering the characteristics of the two lighting methods, the active multi-angle lighting-detection method was selected as the light-source arrangement for the experimental platform.

3.1. Design of Experimental Platform

The core of the active multi-angle light-source detection method is the machine-vision-detection part; therefore, the quality of the acquired images and the speed of the image processing strongly affect the visual-detection performance. The quality of the camera hardware determines the quality of the image acquisition: a high-performance, high-resolution camera can produce image data with clear features under a highly stable light source, and a clear image is the basis for the stable operation of the image-processing algorithm and the detection performance of the system. The selection and design of the detection hardware are therefore particularly important. Based on the finger-joint-angle-measurement-system scheme, the actual system built in this study is shown in Figure 3, where Figure 3a shows the angle detection in the finger flexion/extension state and Figure 3b shows the angle detection in the finger abduction/adduction state. With this platform, high-quality images under multi-angle light-source irradiation can be acquired; these are then processed by the PC image-processing algorithm to segment the finger-joint markers in the image for the subsequent calculation of the finger-joint angles and lengths.

3.2. Light-Source Selection and Solution of the Single-Reflection Matrix

Industrial cameras are at the core of the vision-inspection system; their main role is to convert the optical signal into an electrical signal and transmit it to the processing unit. The light-sensitive element, the most important part of an industrial camera, comes in two main types: CCD (charge-coupled device) and CMOS (complementary metal oxide semiconductor), of which CCD technology is the more widely used. Industrial cameras have many important parameters, such as resolution, shutter time, external trigger, and frame rate; the vision-inspection system should therefore select the camera best suited to the inspection task. According to the interface type, cameras can be divided into USB, GigE, and Camera Link. Considering data-transmission speed, ease of use, and data-transmission distance, a GigE-interface camera from the Basler ace series was selected.
In the inspection system, the choice of industrial lens directly affects the quality of the captured image. The industrial-lens parameters, such as the interface type and the sensor (CCD) size, should match the industrial camera. In addition, the aperture of the lens controls the light intake of the industrial camera, which directly affects the brightness of the image, and the focal length, i.e., the distance from the optical center of the lens to the imaging plane, directly affects the size of the field of view. Considering these lens characteristics, the lens selected in this study was the TEC-V7X.
The light source is another important component of the visual-inspection system and is key to clear and stable imaging. The chosen light source should highlight the object to be detected. According to the light-emitting device, light sources can be divided into fluorescent lamps, LED lamps, halogen lamps, etc., of which LED lamps are the most common. The light source selected for this paper was the ring light source of model R50-26-13, developed by Huakang Technology Company.
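As an illustration of how such a camera could be driven in software (the paper does not describe its acquisition code), the following minimal Python sketch grabs one frame from a Basler GigE camera; the use of the pypylon library, the timeout value, and the output file name are assumptions, not details from the paper.

```python
from pypylon import pylon
import cv2

# Minimal single-frame acquisition sketch for a Basler GigE camera.
# The pypylon library, timeout value, and output file name are assumptions;
# the paper does not describe its acquisition software.
camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()
camera.StartGrabbingMax(1)                      # grab exactly one frame
grab = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
if grab.GrabSucceeded():
    frame = grab.Array                          # image as a NumPy array
    cv2.imwrite("hand_frame.png", frame)
grab.Release()
camera.Close()
```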
The transformation of the camera coordinate system, x-y-z, into the two-dimensional image coordinate system, u-v, is shown in Equation (1).
$$ \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = s \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_1 & r_2 & t \end{bmatrix} \begin{bmatrix} x_W \\ y_W \\ 1 \end{bmatrix} \qquad (1) $$
where $\begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$ is the internal reference (intrinsic) matrix of the camera and $\begin{bmatrix} r_1 & r_2 & t \end{bmatrix}$ is the external reference (extrinsic) matrix of the camera. This leads to the formula for calculating the single-response (homography) matrix of the camera and the conversion formula from image pixel coordinates to world coordinates:
$$ H = s \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_1 & r_2 & t \end{bmatrix} = s M \begin{bmatrix} r_1 & r_2 & t \end{bmatrix}, \qquad sX = H^{-1} x \qquad (2) $$
where H is the single-response matrix, x is the pixel coordinate in the image, and X is the world coordinate.
The coordinate-system-conversion Equation (2) is used to obtain the single-response matrix H from the pixel-coordinate system to one of the plane-coordinate systems (W) in space. Using H, two points in the pixel-coordinate system can be converted into W. The distance s1 between the two points in W is calculated, after which a ruler is used to measure the actual distance s2 between the corresponding two points in W directly. The error between s1 and s2 was 0.073 mm. However, when the relative distance between W and the camera changes, the error between s1 and s2 becomes dramatically larger. Therefore, during the finger-joint-angle measurement, the position of the detection plane relative to the camera should always remain constant, and H should be recomputed whenever the distance of the camera relative to the detection plane changes.
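To make this planar-measurement step concrete, the following minimal Python/OpenCV sketch estimates the mapping from image pixels to the detection plane W from four point correspondences (i.e., the inverse mapping of Equation (2)) and uses it to measure a distance on W. The point correspondences and query pixels are hypothetical placeholders, not values from the paper.

```python
import cv2
import numpy as np

# Sketch of the planar measurement in Equations (1)-(2): estimate the mapping from
# image pixels to the detection plane W and use it to measure a distance on W.
# The point correspondences and query pixels below are hypothetical placeholders.
pixel_pts = np.array([[612, 418], [1604, 402], [1622, 1190], [598, 1204]], dtype=np.float32)
world_pts = np.array([[0, 0], [100, 0], [100, 80], [0, 80]], dtype=np.float32)  # mm on plane W

# findHomography(src, dst) returns the matrix mapping pixel points onto plane W,
# i.e., the inverse mapping used in Equation (2).
H_inv, _ = cv2.findHomography(pixel_pts, world_pts)

def pixel_to_world(pt, M):
    """Apply a homography to one pixel coordinate and de-homogenize the result."""
    p = M @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

p1 = pixel_to_world((700.0, 500.0), H_inv)
p2 = pixel_to_world((900.0, 500.0), H_inv)
print("distance on plane W: %.3f mm" % np.linalg.norm(p1 - p2))
```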

4. Vision-Based Finger-Joint-Angle-and-Length-Detection Method

The finger-joint-angle-and-length-detection method proposed in this paper is a joint-angle-detection method based on segmenting and reprocessing visual markers. The method consists of finger-joint-marker pasting and image acquisition, visual-marker segmentation, the edge detection of the visual markers, and joint-angle calculation based on the different joint markers of the finger. For visual-marker segmentation, the HSV color-space-conversion method and the image-threshold-segmentation method were adopted to segment the finger-joint markers in the image. For finger-joint-angle calculation, the inner- and outer-edge Hough straight-line-detection method and the least-squares straight-line-fitting method were used. Therefore, each finger-joint image yields 2 × 2 = four sets of joint angles and lengths, and the combination closest to the real joint angle was selected as the finger-angle detection method for this paper by comparing the four joint angles with the real joint angle.

4.1. Finger-Joint-Marker Pasting and Image Acquisition

When detecting the angle of each finger joint, the position of each finger phalanx in the image is identified first, and then the position and angle of each finger joint are determined from the intersection points and angles between the phalanges. Finger-joint markers that are easy to segment from the image were used to identify the finger phalanges. Finger-joint markers of different scales are shown in Figure 4a, and the most suitable marker was selected by comparing the angle-detection accuracy of the markers at different scales. Figure 4b shows the method of attaching the finger-joint markers.
Since the light-source intensity and the light-irradiation angle have a significant impact on the segmentation and extraction of the finger-joint markers, the height of the light source can be adjusted to alter the light-irradiation angle and intensity on the markers, enhancing the difference between the markers and the background, making the markers easy to segment from the background and simplifying the marker-segmentation process. The image-acquisition method based on the active multi-angle light-source detection method is shown in Figure 5: (a) represents high-angle lighting; (b) represents medium-angle lighting; (c) represents low-angle lighting.

4.2. Visual Marker Segmentation Methods

To obtain a better finger-joint-angle-detection algorithm, this paper uses the HSV color-space-conversion method and the image-threshold-segmentation method to extract the target finger-joint markers in the image, applies different edge-detection algorithms to obtain the marker edge coordinates, and then calculates each finger-joint angle using two different finger-joint-angle-detection algorithms.
(1)
HSV color-space-marker-segmentation extraction with Canny edge detection
In the HSV color space, H denotes hue, S denotes saturation (when S = 0, the image contains only grayscale information), and V denotes value, indicating the brightness of the color [22,23]. The conical model of the HSV color space can be formed by erecting and flattening the central axis of the RGB-color-space 3D coordinates. The RGB–HSV color-space-conversion equations are shown in Equations (3)–(5).
$$ V = \max(R, G, B) \qquad (3) $$
$$ S = \begin{cases} \dfrac{V - \min(R, G, B)}{V}, & V \neq 0 \\ 0, & \text{otherwise} \end{cases} \qquad (4) $$
$$ H = \begin{cases} 60(G - B)/(V - \min(R, G, B)), & V = R \\ 120 + 60(B - R)/(V - \min(R, G, B)), & V = G \\ 240 + 60(R - G)/(V - \min(R, G, B)), & V = B \end{cases} \qquad (5) $$
In Equations (3)–(5), R, G, and B denote the three components of the three-dimensional coordinate axes in the RGB color space. The ranges set for the three HSV components are H: 100~130, S: 150~255, and V: 130~255. The results of the specified color-region extraction are shown in Figure 7b. Canny edge detection, proposed by John Canny in 1986 [23], is currently a commonly used edge-detection algorithm. It is a multi-stage algorithm consisting of image-noise reduction, the computation of the image gradient, non-maximum suppression, and threshold screening. Its formula for the image-gradient calculation used in edge detection is shown in Equation (6).
$$ G = \sqrt{G_x^2 + G_y^2}, \qquad \theta = \operatorname{atan2}(G_y, G_x) \qquad (6) $$
The θ in Equation (6) represents the gradient direction, in the range −π to π, which is quantized to four directions (0°, 45°, 90°, and 135°), corresponding to the horizontal, diagonal, vertical, and anti-diagonal directions, respectively. The Canny-operator edge-extraction results are shown in Figure 7c.
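A minimal Python/OpenCV sketch of this HSV-range segmentation and Canny edge extraction is given below, using the H/S/V ranges stated above; the input file name and the Canny hysteresis thresholds are illustrative assumptions, not values from the paper.

```python
import cv2

# Sketch of the HSV-range segmentation and Canny edge extraction described above.
# The input file name and the Canny hysteresis thresholds are illustrative
# assumptions; the H/S/V ranges are the ones stated in the text.
img = cv2.imread("finger_markers.png")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)                   # OpenCV hue range is 0-179
mask = cv2.inRange(hsv, (100, 150, 130), (130, 255, 255))    # keep marker-colored pixels only
edges = cv2.Canny(mask, 50, 150)                             # gradient + non-max suppression + hysteresis
cv2.imwrite("marker_mask.png", mask)
cv2.imwrite("marker_edges.png", edges)
```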
(2)
Image thresholding method with edge-contour extraction
Using image segmentation to separate the target region from the background region avoids a blind search over the image and greatly improves the processing efficiency [24,25]. Threshold segmentation based on the grayscale histogram is simple to compute and is suitable for grayscale images in which the target and background occupy different grayscale ranges; the histogram of the original image is shown in Figure 6.
The image-segmentation formula based on different thresholds is shown in Equation (7), where T is the gray threshold, f(xi, yi) is the gray level of the detected image point, and A and Ā are the gray levels assigned to the corresponding image positions. In this study, the gray level of the target regions was set to 0 and the gray level of all other regions was set to 255. The operation was performed by raster-scanning the image line by line from two directions, which prevents image information from being missed; the image after threshold segmentation is shown in Figure 8a. Next, the image contours were detected with the findContours function in OpenCV and, finally, the contours of the target markers were filtered out automatically based on the similarity of the contour-enclosed areas. The results of the target-marker contour detection are shown in Figure 8b.
$$ g(x_i, y_i) = \begin{cases} \bar{A}, & \text{if } f(x_i, y_i) > T \\ A, & \text{if } f(x_i, y_i) \leq T \end{cases} \qquad (7) $$
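The following minimal Python/OpenCV sketch illustrates the thresholding of Equation (7) followed by contour extraction and area-based filtering of the marker contours; the threshold value, input file name, and accepted contour-area range are illustrative assumptions, not values from the paper.

```python
import cv2

# Sketch of the threshold segmentation of Equation (7) followed by contour detection
# and area-based filtering of the marker contours. The threshold T, the file name,
# and the accepted contour-area range are illustrative assumptions.
gray = cv2.imread("finger_markers.png", cv2.IMREAD_GRAYSCALE)
T = 90
_, seg = cv2.threshold(gray, T, 255, cv2.THRESH_BINARY)   # f > T -> 255, f <= T -> 0 (target)
inv = cv2.bitwise_not(seg)                                 # findContours expects white objects
contours, _ = cv2.findContours(inv, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep only contours whose enclosed area matches the expected marker size.
markers = [c for c in contours if 500 < cv2.contourArea(c) < 5000]
print("marker contours found:", len(markers))
```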

4.3. Joint-Angle-Calculation Method Based on Different Joint Identifiers of the Finger

(1)
Hough straight-line detection method for inner and outer edges
The Hough transform was improved by Richard Duda in 1972. The method transforms a point in the data space into a curve in the ρ–θ parameter space, so that points with the same parameter characteristics intersect in the parameter space after transformation. The detection of a characteristic straight line is then completed by judging the degree of accumulation at the intersection point. The expression for a straight line in the data space is shown in Equation (8), where k denotes the slope and b denotes the intercept.
$$ y = kx + b \qquad (8) $$
The standard straight-line Hough transform uses the following parametric straight-line formula, as shown in Equation (9), where ρ is the perpendicular distance from the origin to the line and θ is the angle between ρ and the x-axis.
$$ x \cos\theta + y \sin\theta = \rho \qquad (9) $$
Different points on a straight line in the data space are transformed into a family of sinusoidal curves that intersect at a point p in the parameter space, so the detection of a straight line in the data space can be achieved by detecting such local-maximum points in the parameter space. The results of the inner and outer Hough straight-line detection on the target markers are shown in Figure 9: Figure 9a shows the Hough lines detected on the outer edges of the HSV segmentation; Figure 9b shows the Hough lines on the inner edges of the HSV segmentation; Figure 9c shows the Hough lines on the outer edges of the threshold segmentation; and Figure 9d shows the Hough lines on the inner edges of the threshold segmentation. The inner- and outer-edge Hough straight-line-detection method detects four straight lines on the inner edges and four straight lines on the outer edges of the markers, after which the angle of each knuckle on the inner side and the angle of each knuckle on the outer side are calculated using the finger-joint-angle-calculation method. Finally, the angle of each knuckle is found as $\theta_i = \frac{\theta_i^{\mathrm{out}} + \theta_i^{\mathrm{in}}}{2}$ (i = 1, 2, 3).
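A minimal Python/OpenCV sketch of this step is given below: it runs a probabilistic Hough transform on a marker edge image and averages the orientations of one outer-edge and one inner-edge line, as in the averaging formula above. The input file, the Hough parameters, and the way the inner/outer pair is chosen are illustrative assumptions; the paper pairs the detected lines per joint marker.

```python
import cv2
import numpy as np

# Sketch of probabilistic Hough line detection on a marker edge image and of
# averaging the orientations of one outer-edge and one inner-edge line.
# The edge image, Hough parameters, and the inner/outer pairing are assumptions.
edges = cv2.imread("marker_edges.png", cv2.IMREAD_GRAYSCALE)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                        minLineLength=30, maxLineGap=5)

def line_angle_deg(segment):
    """Orientation of a line segment (x1, y1, x2, y2) in degrees."""
    x1, y1, x2, y2 = segment
    return np.degrees(np.arctan2(y2 - y1, x2 - x1))

if lines is not None and len(lines) >= 2:
    theta_out = line_angle_deg(lines[0][0])   # stand-in for an outer-edge line
    theta_in = line_angle_deg(lines[1][0])    # stand-in for the matching inner-edge line
    theta = (theta_out + theta_in) / 2.0
    print("averaged marker orientation: %.2f deg" % theta)
```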
(2)
Least-squares fitting of the target identifier profile
The least-squares method was introduced by Legendre in the early 19th century and takes the form shown in Equation (10). In Equation (10), $y_i$ is the observed value, i.e., one of multiple samples, and $y$ is the theoretical value, i.e., the value of the assumed fit function. $S_\epsilon^2$ is the objective (loss) function, and the goal of the least-squares method is to find the fit function that minimizes this objective function.
$$ S_\epsilon^2 = \sum_i (y - y_i)^2 \qquad (10) $$
To fit the four joint markers in the image as four straight lines, this paper assumes that the number of contour coordinates of each joint marker is N. Assume that the equation of the straight line is y = ax + b, where a is the slope of the line and b is the intercept. The least-squares method is used to solve for a and b, as shown in Equation (11). The results of the least-squares fitting of straight lines to the pixel points of the target markers are shown in Figure 10: Figure 10a shows the line-fitting result of the HSV-segmentation least-squares method, and Figure 10b shows the line-fitting result of the threshold-segmentation least-squares method.
$$ b = \frac{\sum_{i=1}^{N} x_i^2 \sum_{i=1}^{N} y_i - \sum_{i=1}^{N} x_i \sum_{i=1}^{N} x_i y_i}{N\sum_{i=1}^{N} x_i^2 - \left(\sum_{i=1}^{N} x_i\right)^2}, \qquad a = \frac{N\sum_{i=1}^{N} x_i y_i - \sum_{i=1}^{N} x_i \sum_{i=1}^{N} y_i}{N\sum_{i=1}^{N} x_i^2 - \left(\sum_{i=1}^{N} x_i\right)^2} \qquad (11) $$
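A minimal Python sketch of the least-squares fit of Equation (11) is given below, applied to the contour pixel coordinates of one joint marker; the coordinate arrays are hypothetical placeholders standing in for a marker's extracted contour points.

```python
import numpy as np

# Sketch of the least-squares line fit of Equation (11) applied to the contour
# pixel coordinates of one joint marker. The coordinate arrays are placeholders.
x = np.array([10.0, 12.0, 15.0, 18.0, 22.0, 25.0])
y = np.array([40.2, 41.1, 42.4, 43.6, 45.3, 46.5])

N = len(x)
denom = N * np.sum(x**2) - np.sum(x)**2
a = (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / denom             # slope
b = (np.sum(x**2) * np.sum(y) - np.sum(x) * np.sum(x * y)) / denom  # intercept
print("fitted line: y = %.4f x + %.4f" % (a, b))
# np.polyfit(x, y, 1) gives the same slope and intercept and can serve as a cross-check.
```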
(3)
Finger-joint-angle-calculation method
The relevant lines of the finger-joint markers can be obtained by the straight-line-detection methods described above. From these lines, the endpoint coordinates of the four relevant lines of the four joint markers can be obtained, after which the angle between the finger joints can be calculated with the formula for the angle between two-dimensional vectors, as shown in Equation (12).
$$ \theta_i = \arccos \frac{\vec{a}_i \cdot \vec{b}_j}{\left|\vec{a}_i\right| \left|\vec{b}_j\right|} \qquad (12) $$
In Equation (12), $\vec{a}_i$ and $\vec{b}_j$ are the direction vectors of two adjacent phalangeal markers and $\theta_i$ is the knuckle-joint angle. The finger-joint-angle measurements obtained with the different methods are shown in Table 1. The experiments showed better results with high-angle illumination, and the results obtained for the human-hand model under high-angle illumination are given in Table 1. HSV–HOISLM represents the HSV + Hough outer- and inner-straight-line method; HSV–LSFLKADM represents the HSV + least-squares-fitting linear-knuckle-angle-detection method; TS–HOMLDM represents the threshold-segmentation + Hough outer- and medial-linear-detection method; TS–LSFLM represents the threshold-segmentation + least-squares-fitting-line method; and TKAM represents the traditional knuckle-angle measurement, as shown in Figure 2.
As can be seen from Table 1, the accuracy and reliability of the vision-based finger-joint-angle-measurement methods were demonstrated by comparing their results with those of the conventional finger-joint-angle-measurement method: the angular deviations between the vision-based results and the conventional results were in the range of 0° to 2°. The maximum deviation from the conventional knuckle-angle measurement was 2°, it occurred at the DIP joint, and the visual method that produced it was the HSV–LSFLKADM. The visual method with the smallest mean deviation from the traditional measurement was the TS–HOMLDM; therefore, this method was selected as the finger-joint-detection method for this paper.
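A minimal Python sketch of the knuckle-angle computation of Equation (12) is given below; the phalanx direction vectors are placeholders rather than measured data.

```python
import numpy as np

# Sketch of the knuckle-angle computation of Equation (12) from two adjacent
# phalanx direction vectors. The example vectors are placeholders, not measured data.
def knuckle_angle_deg(a, b):
    """Angle between two 2-D phalanx direction vectors, in degrees."""
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

proximal = np.array([1.0, 0.2])   # direction of the proximal-phalanx marker line
middle = np.array([0.4, 1.0])     # direction of the middle-phalanx marker line
print("joint angle: %.2f deg" % knuckle_angle_deg(proximal, middle))
```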

5. Experimental Verification

In this study, nine healthy male volunteers aged between 20 and 25 were recruited. Three different finger-joint angles were detected using the TS–HOMLDM with visual markers of 1.5 mm, 2 mm, and 2.5 mm widths, in order to verify the accuracy of the monocular-vision-based finger-joint-angle-measurement system (MVBFJAMS) proposed in this paper against the traditional inspection method and to determine the most appropriate visual-marker width. To ensure the reliability of the experiment, professional physicians were first invited to measure the volunteers' knuckle angles using the traditional method, after which our group members measured the volunteers' knuckle angles using the MVBFJAMS. To verify the accuracy of the MVBFJAMS for finger-joint-angle measurement during finger extension/contraction, a control experiment was conducted comparing the conventional measurement method and the visual measurement method. This paper also verifies the speed of the visual knuckle-detection method by comparing the time needed to detect and record 30 joint-angle data points with the traditional method and with the visual-inspection method. Table 2 shows the knuckle-joint-retention angles for the different volunteers under the different markers used to verify the accuracy of the visual-detection method. The finger-bone-length data are not given because the actual joint position of the finger was uncertain.
The detection method in Figure 3a was adopted for the volunteers, and the detection results for the knuckle accuracy of the different volunteers at different scales of visual markers were obtained, as shown in Table 3.
From Table 2 and Table 3, the deviations from the mean knuckle angle at different scale markers, shown in Figure 11, can be calculated.
As shown in Figure 11, the minimum mean knuckle-angle deviation was 0.27° and the maximum was 1.38° for the nine volunteers using visual markers at different scales. The mean knuckle-angle deviations for the nine volunteers using the 1.5 mm visual markers were 0.43°, 0.47°, 0.58°, 0.27°, 0.45°, 0.5°, 0.5°, 0.59°, and 0.51°, which were much smaller than the mean deviations obtained with the other marker scales. Therefore, the 1.5 mm visual marker was chosen as the test condition for the subsequent experiments. To verify the accuracy of the finger-abduction angle, three different finger-abduction-joint angles were measured on the nine volunteers using the visual measurement method, and the accuracy of the angles was verified using the conventional method. The results of the measurement of the three different abduction-joint angles are shown in Table 4.
In Table 4, Vmm represents the visual measurement method and Tmm represents the traditional measurement method. As shown in Table 4, the maximum and minimum knuckle-angle deviations of the nine volunteers were 0.81° and 0.30°, respectively, and the mean deviations of the knuckles were 0.63°, 0.68°, 0.77°, 0.49°, 0.33°, 0.61°, 0.30°, 0.81°, and 0.83°. Table 5 shows the average time taken to measure and record the angle data of 30 joints for the 9 volunteers using the traditional method and the visual-detection method (including the time needed to paste the visual markers).
From Table 5, it can be seen that the time taken by the vision-based knuckle-angle-detection method is much less than that of the conventional knuckle-angle-detection method. This result was produced because the vision-based knuckle-angle-detection method not only enables the simultaneous measurement of multiple knuckles compared to the conventional knuckle-angle-detection method, but also increases the speed of the knuckle measurement and the speed at which the knuckle-angle data are recorded.

6. Conclusions

To address the problems that the joint-angle measuring instruments used in clinical medicine take considerable time to measure a single joint and cannot measure multiple joint angles at the same time, a vision-based finger-joint-angle-measuring system was designed on the basis of an existing visual-inspection system. The system consists of a hardware system, a control system, and a vision system. The active multi-angle-light-source-detection system, composed of the control system and the hardware system, simplifies the recognition of the visual markers by adjusting the height of the light source. The vision system is composed of an industrial camera and the knuckle-angle-detection method proposed in this paper, which comprises finger-joint-marker pasting, image acquisition, visual-marker segmentation, visual-marker edge detection, and joint-angle calculation based on the different finger-joint markers. Each component of the method was analyzed and verified by experiments. These experiments showed that, under high-angle illumination, the TS–HOMLDM should be adopted, and the 1.5 mm visual marker was selected because it gave the highest measurement accuracy. The shortcomings of the proposed MVBFJAMS are also obvious. Firstly, the system requires a Basler ace camera, a TEC-V7X industrial lens, an R50-26-13 light source, and a computer, which makes it much more expensive than traditional knuckle-measurement methods and sensor-based methods; furthermore, the system can currently only achieve two-dimensional inspection.
The system is still in the experimental stage and places high requirements on the lighting environment. Considering the complexity of the clinical environment, we intend to add an opaque housing to the exterior of the device in the future to maintain the stability of the testing environment and improve the anti-interference capability of the system. In the next phase, we intend to add a depth camera to the system and fuse the texture information from the normal camera with the depth information to build a model of the subject's hand; with this approach, three-dimensional detection of the angle of each finger joint can be achieved. In the meantime, we will further validate the accuracy of the system through clinical trials, as well as the accuracy of the assessment of the level of hand disability and the effectiveness of the intervention treatment.

Author Contributions

Conceptualization, Y.F. and M.Z.; methodology, Y.F.; software, Y.F.; validation, Y.F. and F.D.; the main content of this manuscript was created and written by Y.F. and reviewed by all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Science Foundation of Zhejiang Province, grant number LQ21E050008; Educational Commission of Zhejiang Province, grant number Y201941335; Science and Technique Plans of Ningbo City, grant number 202002N3133; The Major Scientific and Technological Projects in Ningbo City, grant number 2020Z082; Research Fund Project of Ningbo University, grant number XYL19029; and the K. C. Wong Magna Fund of Ningbo University.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Ethics Committee of the Faculty of Mechanical Engineering & Mechanics, Ningbo University (protocol code [2022]LLSP(0315); date of approval: 15 March 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Okuyama, T.; Kobayashi, K.; Otsuki, M.; Tanaka, M. Measurement of finger joint angle using a flexible polymer sensor. Int. J. Appl. Electromagn. Mech. 2016, 52, 951–957. [Google Scholar] [CrossRef]
  2. Park, W.; Ro, K.; Kim, S.; Bae, J. A soft sensor-based three-dimensional (3-D) finger motion measurement system. Sensors 2017, 17, 420. [Google Scholar] [CrossRef] [PubMed]
  3. Kawaguchi, J.; Yoshimoto, S.; Kuroda, Y.; Oshiro, O. Estimation of finger joint angles based on electromechanical sensing of wrist shape. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 1409–1418. [Google Scholar] [CrossRef] [PubMed]
  4. Kitano, K.; Ito, A.; Tsujiuchi, N.; Wakida, S. Estimation of joint center and measurement of finger motion by inertial sensors. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 5668–5671. [Google Scholar]
  5. Zheng, Y.; Peng, Y.; Wang, G.; Liu, X.; Dong, X.; Wang, J. Development and evaluation of a sensor glove for hand function assessment and preliminary attempts at assessing hand coordination. Measurement 2016, 93, 1–12. [Google Scholar] [CrossRef]
  6. Veber, M.; Bajd, T.; Munih, M. Assessing joint angles in human hand via optical tracking device and calibrating instrumented glove. Meccanica 2007, 42, 451–463. [Google Scholar] [CrossRef]
  7. Lu, S.; Chen, D.; Liu, C.; Jiang, Y.; Wang, M. A 3-D finger motion measurement system via soft strain sensors for hand rehabilitation. Sens. Actuators A Phys. 2019, 285, 700–711. [Google Scholar] [CrossRef]
  8. Park, Y.; Lee, J.; Bae, J. Development of a wearable sensing glove for measuring the motion of fingers using linear potentiometers and flexible wires. IEEE Trans. Ind. Inform. 2014, 11, 198–206. [Google Scholar] [CrossRef]
  9. Park, Y.; Bae, J. A three-dimensional finger motion measurement system of a thumb and an index finger without a calibration process. Sensors 2020, 20, 756. [Google Scholar] [CrossRef] [PubMed]
  10. Jang, M.; Kim, J.S.; Kang, K.; Kim, J.; Yang, S. Towards Finger Motion Capture System Using FBG Sensors. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 3734–3737. [Google Scholar]
  11. Metcalf, C.D.; Robinson, R.; Malpass, A.J.; Bogle, T.P.; Dell, T.A.; Harris, C.; Demain, S.H. Markerless motion capture and measurement of hand kinematics: Validation and application to home-based upper limb rehabilitation. IEEE Trans. Biomed. Eng. 2013, 60, 2184–2192. [Google Scholar] [CrossRef] [PubMed]
  12. Pham, T.; Pathirana, P.N.; Trinh, H.; Fay, P. A non-contact measurement system for the range of motion of the hand. Sensors 2015, 15, 18315–18333. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Simon, T.; Joo, H.; Matthews, I.; Sheikh, Y. Hand keypoint detection in single images using multiview bootstrapping. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4645–4653. [Google Scholar]
  14. Mathis, A.; Mamidanna, P.; Cury, K.M.; Abe, T.; Murthy, V.N.; Mathis, M.W.; Bethge, M. Deeplabcut: Markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 2018, 21, 1281–1289. [Google Scholar] [CrossRef] [PubMed]
  15. Oikonomidis, P.I.; Argyros, A. Using a single rgb frame for real time 3d hand pose estimation in the wild. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Tahoe, NV, USA, 12–15 March 2018; pp. 436–445. [Google Scholar]
  16. Tung, J.Y.; Lulic; Gonzalez, D.A.; Tran, J.; Dickerson, C.R.; Roy, A.E.A. Evaluation of a portable markerless finger position capture device: Accuracy of the leap motion controller in healthy adults. Physiol. Meas. 2015, 36, 1025–1035. [Google Scholar] [CrossRef] [PubMed]
  17. Nizamis, K.; Rijken, N.H.M.; Mendes, A.; Janssen, M.M.H.P.; Bergsma, A.; Koopman, B.F.J.M. A novel setup and protocol to measure the range of motion of the wrist and the hand. Sensors 2018, 18, 3230. [Google Scholar] [CrossRef] [PubMed]
  18. Lim, G.M.; Jatesiktat, P.; Kuah, C.W.K.; Ang, W.T. Camera-based hand tracking using a mirror-based multi-view setup. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 5789–5793. [Google Scholar]
  19. Lee, J.W.; Rim, K. Measurement of finger joint angles and maximum finger forces during cylinder grip activity. J. Biomed. Eng. 1991, 13, 152–162. [Google Scholar] [CrossRef]
  20. Zhu, J.J.; Ji, W.; Hua, Q. An automatic vision inspection system for detecting surface cracks of welding joint. J. Comput. Methods Sci. Eng. 2019, 19, 635–646. [Google Scholar] [CrossRef]
  21. Zhang, H. Mechanical and Control System Design of Finger Training Rehabilitation Apparatus; Yanshan University: Qinhuangdao, China, 2016. [Google Scholar]
  22. Zotin, A. Fast algorithm of image enhancement based on multi-scale retinex. Procedia Comput. Sci. 2018, 131, 6–14. [Google Scholar] [CrossRef]
  23. Kruse, A.W.; Alenin, A.S.; Vaughn, I.J.; Tyo, J.S. Perceptually uniform color space for visualizing trivariate linear polarization imaging data. Opt. Lett. 2018, 43, 2426–2429. [Google Scholar] [CrossRef] [PubMed]
  24. Abbas, A.K.; Bassam, R. Phonocardiography signal processing. Morgan Claypool 2009, 4, 218. [Google Scholar]
  25. Papadaniil, C.D.; Hadjileontiadis, L.J. Efficient heart sound segmentation and extraction using ensemble empirical mode decomposition and kurtosis features. IEEE J. Biomed. Health Inform. 2014, 18, 1138–1152. [Google Scholar] [CrossRef]
Figure 1. Structural components of the human hand.
Figure 2. Human finger-joint range of motion and measurement methods.
Figure 3. Finger-joint-angle-detection platform.
Figure 4. Finger-joint markers and their method of attachment.
Figure 5. Image-acquisition method based on the active multi-angle light-source-detection method.
Figure 6. Grayscale conversion of the original image with the histogram.
Figure 7. HSV marker segmentation and edge detection.
Figure 8. Image-thresholding segmentation and contour-detection results.
Figure 9. Inner and outer Hough straight-line detection results.
Figure 10. Least-squares linear-fit results.
Figure 11. Deviation from the mean value of knuckle angle at different scales.
Table 1. Finger-joint angles measured by different methods.

| | HSV + Hough Outer- and Inner-Straight-Line Method | HSV + Least-Squares-Fitting Linear-Knuckle-Angle-Detection Method | Threshold Segmentation + Hough Outer Medial Linear-Detection Method | Threshold Segmentation + Least-Squares-Fitting-Line Method | Traditional Knuckle-Angle Measurement |
|---|---|---|---|---|---|
| MCP | 145.02° | 144.76° | 144.95° | 144.59° | 145° |
| PIP | 111.03° | 109.38° | 110.48° | 111.26° | 110° |
| DIP | 111.83° | 114.07° | 112.09° | 112.34° | 112° |
| Length of proximal phalanx | 26.94 mm | 28.24 mm | 27.37 mm | 27.32 mm | 27 mm |
| Length of middle phalanx | 25.53 mm | 25.53 mm | 25.64 mm | 25.26 mm | 26 mm |
| Mean angle deviation | 0.407° | 0.967° | 0.207° | 0.670° | |
Table 2. Knuckle-retention angles under different markers in different volunteers.

| | MCP (°) | PIP (°) | DIP (°) |
|---|---|---|---|
| Knuckle-hold angle under each marker | 145 | 110 | 115 |
| | 160 | 130 | 110 |
| | 150 | 165 | 130 |
Table 3. Results of different volunteers' visual-detection angles.

| Volunteer | Marker Scale | MCP (°) | PIP (°) | DIP (°) | Length of Proximal Phalanx (mm) | Length of Middle Phalanx (mm) |
|---|---|---|---|---|---|---|
| volunteer 1 | 1.5 mm | 144.72 | 109.31 | 115.42 | 45.52 | 30.23 |
| | | 160.10 | 130.12 | 109.21 | 44.07 | 31.45 |
| | | 149.48 | 165.72 | 130.31 | 45.31 | 30.21 |
| | 2 mm | 145.21 | 109.10 | 114.42 | 46.21 | 31.03 |
| | | 161.71 | 132.22 | 109.71 | 44.71 | 30.15 |
| | | 151.31 | 167.28 | 130.02 | 45.49 | 29.24 |
| | 2.5 mm | 144.72 | 108.91 | 115.92 | 46.71 | 29.02 |
| | | 160.40 | 128.93 | 109.27 | 43.93 | 30.51 |
| | | 147.32 | 164.89 | 133.22 | 46.44 | 29.91 |
| volunteer 2 | 1.5 mm | 145.31 | 110.21 | 114.71 | 47.22 | 33.47 |
| | | 159.27 | 130.31 | 111.31 | 47.31 | 32.17 |
| | | 150.32 | 164.44 | 139.74 | 46.28 | 31.95 |
| | 2 mm | 143.31 | 110.72 | 116.71 | 46.93 | 33.36 |
| | | 160.44 | 129.10 | 110.23 | 47.32 | 32.78 |
| | | 150.77 | 165.69 | 131.21 | 48.91 | 34.19 |
| | 2.5 mm | 146.21 | 110.79 | 114.49 | 48.31 | 35.66 |
| | | 162.99 | 131.44 | 111.22 | 47.76 | 34.54 |
| | | 150.55 | 167.21 | 131.59 | 47.77 | 31.22 |
| volunteer 3 | 1.5 mm | 144.81 | 110.47 | 115.69 | 43.17 | 27.49 |
| | | 160.77 | 130.21 | 110.48 | 44.21 | 26.36 |
| | | 150.06 | 165.56 | 131.81 | 42.89 | 28.91 |
| | 2 mm | 144.31 | 111.81 | 114.01 | 44.33 | 29.36 |
| | | 161.17 | 130.79 | 112.58 | 46.96 | 27.22 |
| | | 150.97 | 163.84 | 130.91 | 45.89 | 26.54 |
| | 2.5 mm | 146.79 | 110.11 | 115.98 | 43.22 | 27.77 |
| | | 160.89 | 129.33 | 111.39 | 45.10 | 29.99 |
| | | 150.34 | 166.79 | 130.44 | 45.78 | 26.53 |
| volunteer 4 | 1.5 mm | 145.32 | 110.17 | 114.87 | 45.17 | 30.24 |
| | | 160.17 | 130.22 | 109.97 | 44.54 | 29.77 |
| | | 150.27 | 164.90 | 131.07 | 45.98 | 29.31 |
| | 2 mm | 146.32 | 110.54 | 115.94 | 43.33 | 27.45 |
| | | 159.12 | 130.84 | 110.95 | 44.54 | 26.79 |
| | | 149.71 | 165.55 | 128.77 | 42.59 | 26.34 |
| | 2.5 mm | 146.71 | 110.21 | 116.19 | 43.24 | 28.79 |
| | | 160.77 | 131.44 | 108.22 | 45.77 | 27.32 |
| | | 151.45 | 165.99 | 130.97 | 43.35 | 28.23 |
| volunteer 5 | 1.5 mm | 145.21 | 109.55 | 115.94 | 40.22 | 23.33 |
| | | 160.56 | 129.53 | 109.84 | 41.57 | 24.35 |
| | | 150.41 | 165.77 | 129.92 | 43.98 | 23.47 |
| | 2 mm | 145.99 | 109.21 | 116.31 | 42.22 | 25.22 |
| | | 160.77 | 131.74 | 109.55 | 41.31 | 24.51 |
| | | 150.22 | 166.33 | 130.44 | 39.45 | 23.91 |
| | 2.5 mm | 145.97 | 109.22 | 115.33 | 40.58 | 25.33 |
| | | 161.44 | 130.55 | 110.89 | 41.32 | 24.56 |
| | | 148.97 | 165.33 | 131.75 | 43.77 | 22.22 |
| volunteer 6 | 1.5 mm | 145.31 | 109.12 | 114.33 | 40.12 | 30.21 |
| | | 160.33 | 130.22 | 109.22 | 44.45 | 29.22 |
| | | 150.22 | 165.72 | 130.33 | 43.43 | 27.34 |
| | 2 mm | 145.33 | 110.47 | 115.33 | 39.65 | 28.79 |
| | | 161.43 | 130.99 | 109.44 | 41.76 | 30.33 |
| | | 150.67 | 165.33 | 131.65 | 42.22 | 30.67 |
| | 2.5 mm | 146.12 | 110.22 | 115.48 | 45.97 | 30.15 |
| | | 158.91 | 130.21 | 110.77 | 42.71 | 31.33 |
| | | 149.23 | 163.47 | 131.22 | 42.45 | 29.78 |
| volunteer 7 | 1.5 mm | 145.33 | 110.32 | 116.12 | 36.45 | 27.13 |
| | | 159.31 | 130.07 | 110.77 | 36.84 | 26.56 |
| | | 150.21 | 165.22 | 129.22 | 37.32 | 26.32 |
| | 2 mm | 146.71 | 110.42 | 113.41 | 34.78 | 25.72 |
| | | 160.12 | 130.65 | 110.89 | 37.77 | 28.23 |
| | | 151.14 | 164.31 | 130.22 | 37.96 | 27.45 |
| | 2.5 mm | 145.42 | 110.31 | 114.21 | 39.03 | 29.81 |
| | | 161.31 | 128.64 | 109.01 | 39.76 | 25.33 |
| | | 152.12 | 166.21 | 129.13 | 38.78 | 25.91 |
| volunteer 8 | 1.5 mm | 145.32 | 110.77 | 114.57 | 43.15 | 27.49 |
| | | 160.74 | 129.22 | 110.10 | 41.33 | 28.27 |
| | | 151.12 | 165.33 | 129.38 | 44.54 | 27.39 |
| | 2 mm | 145.72 | 110.31 | 116.66 | 42.56 | 29.72 |
| | | 159.21 | 131.72 | 110.07 | 42.33 | 27.59 |
| | | 150.56 | 166.77 | 130.33 | 41.12 | 28.23 |
| | 2.5 mm | 145.32 | 110.07 | 115.21 | 44.45 | 30.02 |
| | | 157.42 | 131.72 | 110.99 | 41.75 | 29.67 |
| | | 150.22 | 166.23 | 129.25 | 45.39 | 28.37 |
| volunteer 9 | 1.5 mm | 145.31 | 110.23 | 115.76 | 35.46 | 23.57 |
| | | 160.22 | 130.74 | 110.55 | 34.90 | 24.88 |
| | | 159.31 | 165.21 | 130.90 | 37.04 | 24.42 |
| | 2 mm | 146.13 | 111.31 | 114.31 | 36.24 | 25.56 |
| | | 158.91 | 129.10 | 110.12 | 39.35 | 23.78 |
| | | 149.01 | 165.12 | 131.14 | 37.67 | 24.33 |
| | 2.5 mm | 143.21 | 109.22 | 115.33 | 38.91 | 24.89 |
| | | 160.33 | 131.55 | 107.32 | 37.33 | 26.33 |
| | | 151.33 | 165.77 | 129.22 | 37.57 | 23.91 |
Table 4. Measurement results of abduction/adduction knuckle angle.

| Volunteer | Knuckle-Angle Measurement | Measuring Angle (°) | Mean Knuckle-Angle Deviation (°) |
|---|---|---|---|
| volunteer 1 | Vmm | 25.73, 39.21, 40.39 | 0.63 |
| | Tmm | 25, 40, 40 | |
| volunteer 2 | Vmm | 24.32, 39.03, 40.41 | 0.68 |
| | Tmm | 25, 40, 40 | |
| volunteer 3 | Vmm | 23.91, 40.71, 40.51 | 0.77 |
| | Tmm | 25, 40, 40 | |
| volunteer 4 | Vmm | 24.41, 39.93, 40.33 | 0.33 |
| | Tmm | 25, 40, 40 | |
| volunteer 5 | Vmm | 25.22, 39.35, 40.61 | 0.49 |
| | Tmm | 25, 40, 40 | |
| volunteer 6 | Vmm | 24.12, 38.77, 41.12 | 0.61 |
| | Tmm | 25, 40, 40 | |
| volunteer 7 | Vmm | 24.52, 39.79, 40.22 | 0.30 |
| | Tmm | 25, 40, 40 | |
| volunteer 8 | Vmm | 24.91, 38.54, 40.89 | 0.81 |
| | Tmm | 25, 40, 40 | |
| volunteer 9 | Vmm | 24.33, 40.95, 40.87 | 0.83 |
| | Tmm | 25, 40, 40 | |
Table 5. Time taken to measure and record data for 30 joint angles under different methods.

| Method of Knuckle-Angle Detection | Time Taken to Measure and Record 30 Knuckle Angles (s) |
|---|---|
| TMM | 51.75 |
| VMM | 421.21 |