Article

Wireless Body Area Network (WBAN)-Based Telemedicine for Emergency Care

School of Electronics Engineering, Vellore Institute of Technology, Chennai 600127, India
*
Author to whom correspondence should be addressed.
Sensors 2020, 20(7), 2153; https://doi.org/10.3390/s20072153
Submission received: 25 March 2020 / Revised: 7 April 2020 / Accepted: 8 April 2020 / Published: 10 April 2020
(This article belongs to the Special Issue Wireless Body Area Networks for Health Applications)

Abstract

This paper presents a collection of telemedicine techniques used by wireless body area networks (WBANs) for emergency conditions. Furthermore, Bayes' theorem is proposed for predicting emergency conditions: with prior knowledge, the posterior probability can be found from the observed evidence. The probability of sending emergency messages can be determined using Bayes' theorem with the likelihood evidence. This can be viewed as medical decision-making, since diagnosis conditions such as emergency monitoring, delay-sensitive monitoring, and general monitoring are analyzed with their network characteristics, including data rate, cost, packet loss rate, latency, and jitter. This paper explains the network model with 16 variables: one describing immediate consultation; three describing emergency monitoring, delay-sensitive monitoring, and general monitoring; and the remaining 12 being observations related to latency, cost, packet loss rate, data rate, and jitter.

1. Introduction

Similar to first come first serve (FCFS) scheduling, traditional transmission in healthcare has local data processing units which prioritize, according to their time slots, the treatment of patients with emergency conditions. The prioritization reflects the importance of a particular node in terms of energy dissipation and the time taken after transmission [1]. An algorithm is designed based on fitness priority, in which the node with higher fitness is chosen so that its data packets have a lower waiting time. Packet scheduling is modeled according to priority, and many multi-level schedulers are used for transmission [2]. Wireless body area networks (WBANs) help in monitoring health activities regardless of patient location, requiring only teleconsultation and thus reducing the visiting hours of doctors. WBANs monitor humans' e-health as a key element [3], but WBAN faces many challenges with respect to data, network, interoperability, security, and scalability [4]. Wearable sensors help in collecting information about the physiological parameters of the patient, which is sent to the doctor at the medical center for analysis [5]. The WBAN must transfer patient information securely, quickly, and efficiently, without any change in data, and the energy consumption must also be low [6]. The IEEE 802.15.6 WBAN standard specifies channel models, including the mobility of nodes and the effects caused by human interaction [6]. Medical WBAN ensures the early detection of diseases, thus resulting in the prevention of illness and a decrease in healthcare cost [6,7]. WBAN is capable of providing ubiquitous healthcare services and maintains these records in servers [7]. Because of WBAN, there are many inventions which are easily usable technologies for monitoring the health of patients [8].
This pervasive application of WBAN is the most easily available technology for patients [9], since the WBAN nodes help in collecting and storing the patient’s data, offering healthcare services [10]. WBAN gives the real-time status of the patient, which provides the best monitoring without affecting daily activities [11], and it also sends health-related information to the doctor [12]. The IEEE 802.15.6 standard is suitable mainly for WBAN requirements since WBAN is widely used in telemedicine, as well as in remote monitoring of patients [13]. Through WBAN applications, the status of the patient, events of the patient, and their health are noted down [14]. This technology can revolutionize patient health monitoring in the future, allowing easy access to the patient’s condition from any place.
Physiological data are collected through WBAN nodes from the patient and transmitted to the sink [15]; thus, WBAN helps in the continuous monitoring of the patient and updates the status of the patient to a remote doctor [16]. Wearable equipment on the market now allows providers to offer users improved healthcare services [17]. In addition, WBAN provides the miniaturization of sensor nodes, which helps in the constant monitoring of remote patients [18]. The placement of relay nodes and the quality of service (QoS) requirements of medical signals result in a reliable and connected WBAN [19,20]. WBAN in healthcare must be secured through a robust authentication scheme [21], and sensor readings may be inaccurate due to faults, motivating anomaly detection schemes [22]. WBAN network topology and cross-layer optimization were discussed in References [23,24], which optimized cost effectiveness through a multi-objective cost function. In Reference [25], a novel convergecast strategy, especially for WBAN, was designed using a delay-tolerant network and wireless sensor network in the OMNeT++ framework. In Reference [26], an authentication scheme for a medical body area network was designed, and its reliability was demonstrated. In Reference [27], authentication schemes for BAN were designed. In References [28,29], wearables that non-invasively detect drugs were surveyed, along with technologies for precision health and monitoring. In Reference [30], a telemedicine application using IEEE 802.15.6 MAC and IEEE 802.15.4 PHY was demonstrated, and, in Reference [31], the WBAN node energy was constrained.
A Bayesian network is a directed acyclic graph (DAG) whose nodes are random variables, each associated with a conditional probability table (CPT). Bayesian networks are probabilistic graphical models based on expert knowledge or derived from datasets. They are suitable for decision-making under uncertainty and are also called decision networks, causal networks, or belief networks, which depend on multiple events. Real-time relationships are probabilistic, and these networks represent probabilistic relations among multiple events. A node represents a hypothesis having at least two possible values with associated probabilities. The arrows encode the conditional independence relationships between the state of a node, its probability distribution, and the nodes to which it is connected. Decision theory is the combination of utility theory and probability theory, in which the nodes exhibit causal relationships and conditional independence. Bayes nets are directed graphical models representing hierarchical Bayesian models, and they find the probability of an uncertain cause with the help of observed evidence. Because a Bayesian network encodes causal probabilistic relationships, the probability of the evidence can be found easily. The link between nodes gives the probabilistic relationship between them; directed cycles are not permitted in the graph. These networks combine probability with measured values, and they help in computing probabilities given evidence. Bayes nets, also called belief nets or decision nets, help in optimizing decisions by allowing probabilities to be updated. By using Bayes' rule, exact probabilities are not needed; causal conditional probabilities are estimated easily and are applied in diagnosis, prediction, modeling, monitoring, and alerting.
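The mechanics described above can be made concrete with a minimal sketch. The two-node example below is a hypothetical Rain/WetGrass network (an illustration, not taken from the paper): a prior and a CPT are stored as plain Python values, the evidence probability is obtained by marginalizing over the parent, and Bayes' rule inverts the causal link to diagnose the cause from the observed effect.

```python
# Minimal two-node Bayesian network: Rain -> WetGrass (hypothetical example).
# Each node carries a CPT; Bayes' rule runs inference against the causal arrow.

# Prior for the cause node.
p_rain = 0.2

# CPT of the effect node, indexed by the parent's state:
# P(WetGrass | Rain) and P(WetGrass | ~Rain).
p_wet_given = {True: 0.9, False: 0.1}

# Evidence probability: P(WetGrass), summing over the parent's states.
p_wet = p_wet_given[True] * p_rain + p_wet_given[False] * (1 - p_rain)

# Diagnosis: P(Rain | WetGrass) via Bayes' rule.
p_rain_given_wet = p_wet_given[True] * p_rain / p_wet

print(round(p_wet, 4))             # 0.26
print(round(p_rain_given_wet, 4))  # 0.6923
```

The same pattern (prior, likelihood, evidence, posterior) underlies every node-level calculation in Section 4.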

2. Related Work

Agrahari analyzed a dynamic Bayesian network dealing with uncertainty and stochastic behavior, which helps in treating patients through telemedicine [31]. In telemedicine, the data are provided by the patient through sensors, and the diagnosis is confirmed or denied by the physician. The smart agent helps in predicting and correcting diagnosis. Smart agents improve the physician’s diagnosis. Bellot dealt with a Bayesian network, based on multi-agent systems which provide diagnosis decisions for a telemedicine framework [32]. This model was based on analyzing the probabilities of physiological indices, which included both a Bayesian and a multi-agent system. The framework for remote diagnosis was designed with the Bayesian inference for prediction and, therefore, the network model was estimated, tested, and verified through Bayesian statistics [33]. Here, the causal paths, uncertainty information, and prior knowledge were evaluated for hypotheses, and those approaches related to constraint-based and score-based methods were found for independent and dependent relationships in the graph [34]. Hill climbing, genetic algorithms, and tabu search are some of the heuristic search techniques used for DAGs. A Bayesian network was trained with Eigen genes, and the arc link conditional dependencies [35] were evaluated accurately for conditional probability density functions. These Bayesian networks are known for their predictions, as well as their interpretability [36], and they are helpful in making decisions, with cause-and-effect relationships, for statistical analysis. The connections in the medical data are interpreted, representing causality and uncertainty, and these Bayesian medical network models are quantified through mutually exclusive states. Their dependencies are noted in a CPT [37]. Adding evidence changes the probability of nodes and allows a structured elicitation, and these causal Bayesian networks include the structure judgment of experts. 
Bayesian networks are also used in classifying students according to their performance, helping them to perform better. Treewidth is a property of the DAG used for building the network iteratively [38]. Identification of the parent set and an optimized structure allow easy learning in Bayesian networks. A dynamic Bayesian network deals with uncertainties by adding time-related information [39]. The probabilistic prediction model requires learning of the structure and parameters, as well as probabilistic reasoning. Causal relationships deal better with uncertainty, and they also act as a decision-making tool for determining the best strategies [40]. This is achieved using both qualitative and quantitative approaches, whereby the former involves structured learning for creating the DAG, and the latter uses the CPT of each node for calculation, as well as for monitoring, diagnosing, and predicting justifiable and quantified decisions. Bayes' theorem is best used as a predictor; therefore, bug prediction is also possible through a naïve Bayes classifier [41]. It also supports clinical decision-making, connecting theory to evidence in clinical practice [42]. An alternative way of finding conditional probability is through Bayes' theorem, which does not require the full joint probability [43]. It helps in mapping the problem onto an equation in terms of posterior probability, prior probability, likelihood, and evidence. Bayes' theorem is explained very well in the analysis of cancer diagnostics.
Using the values from binary classification, namely the true positive rate (TPR), also known as sensitivity; the true negative rate (TNR), also known as specificity; and the base rates for condition and prediction (the probability of the positive class (PC), the probability of the negative class (NC), and the probability of a positive prediction (PP)), the posterior probability known as precision can be found as the positive predictive value from the confusion matrix once the beliefs about the events are known. This theorem also gives the relationship between the data and the model, which is calculated from the observed probability of the data given the hypothesis. The observed data reflect the probability of the hypothesis based on the prior probability. In addition, if the observed probability P(D) increases, the probability of the hypothesis given the observed data P(h|D) decreases. On the other hand, if the probability of the hypothesis P(h) and the probability of the observed data given the hypothesis P(D|h) increase, the probability of the hypothesis given the data P(h|D) also increases. Classification is framed by calculating the conditional probability of a class given the data.
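The precision (positive predictive value) calculation described above can be sketched in a few lines; the function name and the example numbers below are illustrative, not taken from the paper.

```python
# Precision (PPV) from sensitivity, specificity, and base rate via Bayes' theorem.
def precision(sensitivity: float, specificity: float, base_rate: float) -> float:
    """P(condition | positive prediction)."""
    tp = sensitivity * base_rate              # P(positive prediction, PC)
    fp = (1 - specificity) * (1 - base_rate)  # P(positive prediction, NC)
    return tp / (tp + fp)                     # P(PC | positive prediction)

# Example: a 95%-sensitive, 90%-specific test at a 10% base rate.
print(round(precision(0.95, 0.90, 0.10), 4))  # 0.5135
```

Note how the posterior drops well below the sensitivity when the base rate is low, which is exactly the base-rate effect the paragraph above describes.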
This paper models a Bayesian belief network, a probabilistic model that defines the relationships between variables and is thus used for calculating probabilities [44]. Bayesian networks are represented by a probabilistic graphical model whose directed edges capture conditional dependence. Through this Bayesian network, the relationships between nodes and edges are found given the evidence. Here, Bayesian probability represents beliefs in an outcome by finding the joint probabilities of events, and it is used for making decisions. The paper considers a Bayesian network with 16 random variables, modeling conditional dependence along the edges while exploiting conditional independence elsewhere.

3. Bayes Network Prediction for Telemedicine System

A patient wearing non-invasive sensors sends health data to remote healthcare centers for teleconsultation. The doctors examine the data at the healthcare centers and check whether the patient needs emergency care or teleconsultation through a Bayes network model for telemedicine. According to the output, teleconsultation or point of emergency care is given to the patient. Figure 1 shows the schematic representation of the telemedicine set-up.
A Bayesian belief network model is formed from the conditions observed through a DAG, which represents the probabilistic relationships between events. The events are represented as immediate consultation, emergency monitoring, delay-sensitive monitoring, general monitoring, very low latency, low latency, medium latency, high cost, low cost, low jitter, medium jitter, very low packet loss rate, low packet loss rate, high packet loss rate, high data rate, and moderate data rate. The directed arrows represent the conditional probabilities between parent and child nodes, as shown in Figure 2. Firstly, the emergency condition event, which is the uncertain situation, is possible when the conditions are very low latency, high cost, very low packet loss rate, high data rate, or low jitter. Similarly, delay-sensitive monitoring is the uncertain situation when the conditions are low latency, moderate cost, low packet loss rate, or low jitter. Likewise, general monitoring is the uncertain situation when the conditions are high latency, low cost, high packet loss rate, or medium jitter. The purpose of the proposed system is to predict the health condition of the patient with the help of the network parameters. The system is modeled for immediate teleconsultation, emergency monitoring, delay-sensitive monitoring, or general monitoring with the help of network parameters such as latency, jitter, data rate, cost, and packet loss rate. These nodes are connected according to the conditional dependence, as well as the causation, between the nodes. The total number of nodes is 16: four variables describe the monitoring conditions (immediate consultation, emergency monitoring, delay-sensitive monitoring, and general monitoring), and the remaining 12 are observations of the network parameters.
The probability values are assumed such that the emergency condition is handled with the first priority. Second priority is given to delay-sensitive monitoring, and the least priority is given to general monitoring. Bayes' theorem calculates the probability of a future occurrence by incorporating prior knowledge. This method is considered the best because it learns from experience, and the predictions are stored in electronic records which can be used by a doctor when needed. The preferred explanation makes the evidence or observation more likely and, thus, the data fit the Bayes network model.
The proposed Bayes network model for telemedicine has 16 nodes, as shown in Figure 2. Each node has a conditional probability table (CPT) defined with probabilities, from which inferences can be drawn over the directed acyclic graph (DAG).
The Bayesian network is formed with nodes such as immediate teleconsultation (IT), emergency monitoring (EM), delay-sensitive monitoring (DS), and general monitoring (GM). IT occurs when there is a need for EM. The EM condition is possible whenever there is very low latency (VLL), low packet loss rate (LPLR), low jitter (LJ), high cost (HC), and high data rate (HDR). The DS condition occurs when there is low jitter (LJ), high data rate (HDR), very low packet loss rate (VLPLR), low latency (LL), and low cost (LC). The GM condition is possible when there is low cost (LC), moderate data rate (MDR), high packet loss rate (HPLR), moderate latency (ML), and moderate jitter (MJ). If the network were represented as a table of all possible combinations, it would be very large; applying Bayesian networks relates the nodes only through causality, which greatly reduces the computation cost by requiring only the probabilities of related parent–child nodes. In addition, these networks are adaptable even with limited evidence, and they can form new knowledge.
For this network, the probability values in each node are derived by assuming 50 patients in a hospital. The first node is the immediate teleconsultation (IT) node, with P(IT) = 0.999. The second node is the delay-sensitive monitoring (DS) node, with P(DS) = 0.888. The third node is the general monitoring (GM) node, with P(GM) = 0.777. The fourth node is emergency monitoring (EM), whose parent is IT. If IT is true, P(EM) = 0.98; if IT is false, P(EM) = 0.02. The fifth node is the very low latency (VLL) node, depending on EM. If EM is true, P(VLL) = 0.90; otherwise, P(VLL) = 0.05.

The sixth node is the high cost (HC) node, depending on EM. If EM is true, P(HC) = 0.70; otherwise, P(HC) = 0.30. The seventh node is the low packet loss rate (LPLR) node, depending on EM. If EM is true, P(LPLR) = 0.95; otherwise, P(LPLR) = 0.05. The eighth node is the low latency (LL) node, depending on DS. If DS is true, P(LL) = 0.94; otherwise, P(LL) = 0.06. The ninth node is the very low packet loss rate (VLPLR) node, depending on DS. If DS is true, P(VLPLR) = 0.97; if DS is false, P(VLPLR) = 0.03. The 10th node is moderate data rate (MDR), depending on GM. If GM is true, P(MDR) = 0.99; if GM is false, P(MDR) = 0.01. The 11th node is high packet loss rate (HPLR), depending on GM. If GM is true, P(HPLR) = 0.95; if GM is false, P(HPLR) = 0.05.

The 12th node is moderate latency (ML), depending on GM. If GM is true, P(ML) = 0.96; if GM is false, P(ML) = 0.04. The 13th node is medium jitter (MJ), depending on GM. If GM is true, P(MJ) = 0.94; if GM is false, P(MJ) = 0.06. The 14th node is the low jitter (LJ) node, depending on EM and DS. If EM and DS are true, P(LJ) = 0.95; if EM is true and DS is false, P(LJ) = 0.94; if EM is false and DS is true, P(LJ) = 0.74; if both EM and DS are false, P(LJ) = 0.60. The 15th node is the high data rate (HDR) node, depending on EM and DS. If EM and DS are true, P(HDR) = 0.96; if EM is true and DS is false, P(HDR) = 0.92; if EM is false and DS is true, P(HDR) = 0.74; if EM and DS are false, P(HDR) = 0.70. The 16th node is the low cost (LC) node, depending on DS and GM. If both DS and GM are true, P(LC) = 0.98; if DS is true and GM is false, P(LC) = 0.94; if DS is false and GM is true, P(LC) = 0.70; if both DS and GM are false, P(LC) = 0.60.
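The CPT entries above can be encoded directly as Python values. The short sketch below (its structure is assumed from the text, with the CPT values copied from it) marginalizes out IT to obtain the marginal probability of emergency monitoring, P(EM) = 0.98 × 0.999 + 0.02 × 0.001 = 0.97904, the value used later in Section 5.

```python
# CPTs for the IT -> EM fragment of the proposed network (values from the text).
p_it = 0.999                               # P(IT), node 1
p_em_given_it = {True: 0.98, False: 0.02}  # P(EM | IT), node 4

# Marginal: P(EM) = P(EM|IT)P(IT) + P(EM|~IT)P(~IT)
p_em = p_em_given_it[True] * p_it + p_em_given_it[False] * (1 - p_it)
print(round(p_em, 5))  # 0.97904
```

The remaining single-parent nodes (VLL, HC, LPLR, LL, VLPLR, MDR, HPLR, ML, MJ) follow the same two-entry CPT pattern with their own parent.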

4. Bayes Network Prediction Model for Telemedicine

Each node is assumed to be conditionally independent of its non-descendants, given its immediate parents. Bayes' theorem then gives a compact representation of the joint distribution. In the DAG below, there are 16 nodes; thus, the full joint distribution would require 2^16 = 65,536 probabilities. However, through the CPTs, only 28 conditional probabilities are needed.
By analyzing the network model of Figure 2, there are three stages of nodes. The first-stage nodes include 1, 2, and 3, which deal with IT, DS, and GM. They have only two alternatives.

4.1. Probabilities of Nodes (1 to 3)

At node 1, the probability of finding emergency monitoring can be found when the network needs IT; then, P(IT) = 0.999, and, when no IT is needed, P(~IT) = 0.001.
At node 2, the probability of finding delay-sensitive monitoring can be found when the network is DS; then, P(DS) = 0.888, and, when there is no DS, P(~DS) = 0.112.
At node 3, the probability of finding general monitoring can be found when the network has GM; then, P(GM) = 0.777, and, when it has no GM, P(~GM) = 0.223.

4.2. Probabilities of Nodes (4 to 16)

The second-stage nodes include 4, 5, 6, 7, 8, 9, 10, 11, 12, and 13, each with a hypothesis and a degree of belief. They have pre-existing beliefs about their hypothesis and calculate their joint probabilities with the help of their priors and likelihood. The posterior probability can be found by using the prior probability and the observed beliefs/evidence, i.e., the updated probability of an event given new evidence, as in Equation (1).
P(A|B) = P(A) · P(B|A) / P(B).    (1)
At node 4, probabilistic reasoning can be applied with the set of candidate hypotheses IT = immediate teleconsultation and ~IT = no immediate teleconsultation, shown in Table 1 as P(IT) = 0.999 = p0 and P(~IT) = 0.001 = p1, where p0 + p1 = 1. If there are 50 patients in a hospital in an emergency condition and immediate consultation is needed, all patients are consulted immediately, which is given by the observed beliefs. The prior probabilities are P(IT, EM) = 49/50 = 0.98; P(IT, ~EM) = 2/50 = 0.04; P(~IT, EM) = 1/50 = 0.02; P(~IT, ~EM) = 48/50 = 0.96. From Equation (1), the posterior probabilities are given by Equation (A1) (Appendix A).
At node 5, probabilistic reasoning can be applied with the set of candidate hypotheses EM = emergency condition and ~EM = no emergency condition, shown in Table 1 as P(EM) = 0.97904 = p0 and P(~EM) = 0.04092 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to its prior probabilities, P(EM, VLL) = 0.99768; P(EM, ~VLL) = 0.71578; P(~EM, VLL) = 0.002316; P(~EM, ~VLL) = 0.284212. From Equation (1), the posterior probabilities are given by Equation (A2) (Appendix A).
At node 6, probabilistic reasoning can be applied with the set of candidate hypotheses EM = emergency condition and ~EM = no emergency condition, shown in Table 1 as P(EM) = 0.97904 = p0 and P(~EM) = 0.002316 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(EM, HC) = 0.98591; P(EM, ~HC) = 0.87879; P(~EM, HC) = 0.000588; P(~EM, ~HC) = 0.121209. From Equation (1), the posterior probabilities are given by Equation (A3) (Appendix A).
At node 7, probabilistic reasoning can be applied with the set of candidate hypotheses EM = emergency condition and ~EM = no emergency condition, shown in Table 1 as P(EM) = 0.97904 = p0 and P(~EM) = 0.04092 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(EM, LPLR) = 0.998248; P(EM, ~LPLR) = 0.554791; P(~EM, LPLR) = 0.0017567; P(~EM, ~LPLR) = 0.44521. From Equation (1), the posterior probabilities are given by Equation (A4) (Appendix A).
At node 8, probabilistic reasoning can be applied with the set of candidate hypotheses DS = delay-sensitive monitoring and ~DS = no delay-sensitive monitoring, shown in Table 1 as P(DS) = 0.888 = p0 and P(~DS) = 0.112 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(DS, LL) = 0.992013; P(DS, ~LL) = 0.408088; P(~DS, LL) = 0.007986; P(~DS, ~LL) = 0.591911. From Equation (1), the posterior probabilities are given by Equation (A5) (Appendix A).
At node 9, probabilistic reasoning can be applied with the set of candidate hypotheses DS = delay-sensitive monitoring and ~DS = no delay-sensitive monitoring, shown in Table 1 as P(DS) = 0.888 = p0 and P(~DS) = 0.112 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(DS, VLPLR) = 0.98938; P(DS, ~VLPLR) = 0.37121; P(~DS, VLPLR) = 0.003885; P(~DS, ~VLPLR) = 0.62626. From Equation (1), the posterior probabilities are given by Equation (A6) (Appendix A).
At node 10, probabilistic reasoning can be applied with the set of candidate hypotheses GM = general monitoring and ~GM = no general monitoring, shown in Table 1 as P(GM) = 0.777 = p0 and P(~GM) = 0.223 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(GM, MDR) = 0.997109; P(GM, ~MDR) = 0.066387; P(~GM, MDR) = 0.0028906; P(~GM, ~MDR) = 0.933612. From Equation (1), the posterior probabilities are given by Equation (A7) (Appendix A).
At node 11, probabilistic reasoning can be applied with the set of candidate hypotheses GM = general monitoring and ~GM = no general monitoring, shown in Table 1 as P(GM) = 0.777 = p0 and P(~GM) = 0.223 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(GM, HPLR) = 0.98511; P(GM, ~HPLR) = 0.09727; P(~GM, HPLR) = 0.014880; P(~GM, ~HPLR) = 0.90272. From Equation (1), the posterior probabilities are given by Equation (A8) (Appendix A).
At node 12, probabilistic reasoning can be applied with the set of candidate hypotheses GM = general monitoring and ~GM = no general monitoring, shown in Table 1 as P(GM) = 0.777 = p0 and P(~GM) = 0.223 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(GM, ML) = 0.98857; P(GM, ~ML) = 0.033998; P(~GM, ML) = 0.01182; P(~GM, ~ML) = 0.96600. From Equation (1), the posterior probabilities are given by Equation (A9) (Appendix A).
At node 13, probabilistic reasoning can be applied with the set of candidate hypotheses GM = general monitoring and ~GM = no general monitoring, shown in Table 1 as P(GM) = 0.777 = p0 and P(~GM) = 0.223 = p1, where p0 + p1 = 1. The observed beliefs are revised, and, according to prior probabilities, P(GM, MJ) = 0.98201; P(GM, ~MJ) = 0.15496; P(~GM, MJ) = 0.017989; P(~GM, ~MJ) = 0.84503. From Equation (1), the posterior probabilities are given by Equation (A10) (Appendix A).
The third-stage nodes include 14, 15, and 16, which depend on the hypotheses of two parent nodes and a degree of belief. They have pre-existing beliefs about their hypothesis, and they calculate their joint probabilities with the help of their priors and likelihood.
At node 14, probabilistic reasoning can be applied with the following set of candidate hypotheses:
Case 1: If EM, DS are true, P(LJ) = 0.95, P(~LJ) = 0.05;
Case 2: If EM is true, DS is false, P(LJ) = 0.94, P(~LJ) = 0.06;
Case 3: If EM is false, DS is true, P(LJ) = 0.74, P(~LJ) = 0.26;
Case 4: If EM, DS are false, P(LJ) = 0.60, P(~LJ) = 0.40.
The observed beliefs are revised as follows according to prior probabilities:
P(EM, DS | LJ) = 0.7; P(EM, DS | ~LJ) = 0.3; P(EM, ~DS | LJ) = 0.6; P(EM, ~DS | ~LJ) = 0.4; P(~EM, DS | LJ) = 0.5; P(~EM, DS | ~LJ) = 0.5; P(~EM, ~DS | LJ) = 0.8; P(~EM, ~DS | ~LJ) = 0.2.
From Equation (1), the posterior probabilities are given by Equation (A11) (Appendix A).
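One standard way to read this two-parent computation is to treat the four (EM, DS) combinations as joint hypotheses, weight them by priors, and apply Bayes' rule to the low-jitter evidence. The sketch below is an illustration under the assumption that EM and DS are independent a priori, reusing marginals that appear elsewhere in the text; the paper's own numbers come from Appendix A.

```python
# Posterior over the four (EM, DS) joint hypotheses given the LJ evidence.
p_em, p_ds = 0.97904, 0.888  # marginals used elsewhere in the text

# Likelihoods P(LJ | EM, DS) from cases 1-4 above.
likelihood = {
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.74, (False, False): 0.60,
}

# Prior over the joint hypotheses, assuming EM and DS are independent.
prior = {(e, d): (p_em if e else 1 - p_em) * (p_ds if d else 1 - p_ds)
         for e in (True, False) for d in (True, False)}

p_lj = sum(likelihood[h] * prior[h] for h in prior)            # evidence P(LJ)
posterior = {h: likelihood[h] * prior[h] / p_lj for h in prior}  # Bayes' rule

print(round(p_lj, 4))
```

Nodes 15 (HDR) and 16 (LC) follow the same pattern with their own case tables and parent pairs.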
At node 15, probabilistic reasoning can be applied with the following set of candidate hypotheses:
Case 1: If EM, DS are true, P(HDR) = 0.96, P(~HDR) = 0.04;
Case 2: If EM is true, DS is false, P(HDR) = 0.92, P(~HDR) = 0.08;
Case 3: If EM is false, DS is true, P(HDR) = 0.74, P(~HDR) = 0.26;
Case 4: If EM, DS are false, P(HDR) = 0.70, P(~HDR) = 0.30.
The observed beliefs are revised as follows according to prior probabilities:
P(EM, DS | HDR) = 0.6; P(EM, DS | ~HDR) = 0.4; P(EM, ~DS | HDR) = 0.5; P(EM, ~DS | ~HDR) = 0.5; P(~EM, DS | HDR) = 0.7; P(~EM, DS | ~HDR) = 0.3; P(~EM, ~DS | HDR) = 0.8; P(~EM, ~DS | ~HDR) = 0.2.
From Equation (1), the posterior probabilities are given by Equation (A12) (Appendix A).
At node 16, probabilistic reasoning can be applied with the following set of candidate hypotheses:
Case 1: If DS, GM are true, P(LC) = 0.98, P(~LC) = 0.02;
Case 2: If DS is true, GM is false, P(LC) = 0.94, P(~LC) = 0.06;
Case 3: If DS is false, GM is true, P(LC) = 0.70, P(~LC) = 0.30;
Case 4: If DS, GM are false, P(LC) = 0.60, P(~LC) = 0.40.
The observed beliefs are revised as follows according to prior probabilities:
P(DS, GM | LC) = 0.7; P(DS, GM | ~LC) = 0.3; P(DS, ~GM | LC) = 0.6; P(DS, ~GM | ~LC) = 0.4; P(~DS, GM | LC) = 0.5; P(~DS, GM | ~LC) = 0.5; P(~DS, ~GM | LC) = 0.8; P(~DS, ~GM | ~LC) = 0.2.
From Equation (1), the posterior probabilities are given by Equation (A13) (Appendix A).

5. Results and Discussion

Bayes' theorem calculates the conditional probabilities used for developing classification models. The scenario for Bayes network prediction is as follows: a patient in a hospital may or may not need immediate consultation (immediate consultation is true or false), and a doctor determines whether the patient is in an emergency condition. The problem statement is: if a patient is randomly selected by a doctor to check for an emergency condition, what is the probability that the patient needs immediate consultation?

Manual Calculation

In this scenario, a patient may need immediate consultation even though the doctor has not yet determined whether the patient is in an emergency condition. Whether a patient needs immediate consultation can be found through the sensitivity (true positive rate). Here, the probability of immediate consultation is assumed to be high, i.e., 999 out of 1000 patients are consulted immediately, given by P(IT = true) = 0.999, with the prior probability obtained from the observed beliefs. Thus, the patient who needs immediate consultation is given priority, and the probability that a patient selected by the doctor is in an emergency condition can be found using Bayes' theorem [43]. The posterior probability is computed as follows:
  • P(A|B) = P(B|A) × P(A)/P(B);
  • As assumed through the CPT, the posterior probability is P(IT|EM) = (P(EM|IT) × P(IT))/P(EM);
  • P(EM|IT) = 0.98 and P(IT) = 0.999;
  • P(IT|EM) = (0.98 × 0.999)/P(EM);
  • P(B) can be found using P(B) = P(B|A) × P(A) + P(B|~A) × P(~A);
  • P(EM) = P(EM|IT) × P(IT) + P(EM|~IT) × P(~IT);
  • P(EM) = 0.98 × 0.999 + 0.02 × 0.001 = 0.97904;
  • Finally, the posterior probability is found using P(IT|EM) = (0.98 × 0.999)/0.97904 = 0.999979 ≈ 1.000.
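The manual calculation above can be reproduced in a few lines. This is a minimal sketch with illustrative variable names; the numbers are the CPT values quoted in the text.

```python
# Bayes' rule for the immediate-teleconsultation example.
p_it = 0.999                # prior P(IT): patient needs immediate consultation
p_em_given_it = 0.98        # likelihood P(EM|IT)
p_em_given_not_it = 0.02    # P(EM|~IT)

# Total probability: P(EM) = P(EM|IT)P(IT) + P(EM|~IT)P(~IT)
p_em = p_em_given_it * p_it + p_em_given_not_it * (1 - p_it)

# Posterior: P(IT|EM) = P(EM|IT)P(IT) / P(EM), approximately 0.999979
p_it_given_em = p_em_given_it * p_it / p_em
```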
Similarly, the posterior probability of every event in the DAG is found and appended in Appendix A. In the given scenario, three pieces of information are needed, i.e., the prior probability, the likelihood (sensitivity or true positive rate), and the evidence (specificity or true negative rate). Thus, the confusion matrix can be defined as shown in Table 2.
From the confusion matrix shown in Table 2, the following statements apply:
  • P(B|A) = sensitivity, true positive rate (TPR) = TP/(TP + FN) = P(EM|IT) = 0.98;
  • P(B|~A) = false positive rate (FPR) = FP/(FP + TN) = P(EM|~IT) = 0.02;
  • P(~B|~A) = specificity, true negative rate (TNR) = TN/(TN + FP) = P(~EM|~IT) = 0.96;
  • P(~B|A) = false negative rate (FNR) = FN/(FN + TP) = P(~EM|IT) = 0.04.
By mapping the prior probabilities for the emergency condition (class) and the immediate consultation (prediction), the following statements apply:
  • P(A) = probability of a positive class (PC) = P(IT) = 0.999;
  • P(~A) = probability of a negative class (NC) = P(~IT) = 0.001;
  • P(B) = probability of a positive prediction (PP) = P(EM) = 0.97904;
  • P(~B) = probability of a negative prediction (NP) = P(~EM) = 0.02096.
Thus, Bayes’ theorem can be written as follows:
  • P(A|B) = (TPR × PC)/PP = (P(EM|IT) × P(IT))/P(EM);
  • P(B) = TPR × PC + FPR × NC = P(EM|IT) × P(IT) + P(EM|~IT) × P(~IT).
The posterior probability calculated using Bayes' theorem is the precision, known as the positive predictive value (PPV), computed from the confusion matrix.
  • PPV = TP/(TP + FP);
  • P(A|B) = PPV = TPR × PC/PP.
Finally, the prediction is given as follows:
  • P(IT|EM) = P(EM|IT) × P(IT)/P(EM);
  • P(IT|EM) = 0.98 × 0.999/0.97904 = 0.999979.
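The PPV identity can also be checked with hypothetical counts: scaling the rates and class priors by an assumed population of N patients (N is illustrative, not from the paper) gives a confusion-matrix precision that matches the Bayes posterior.

```python
# Precision (PPV) from hypothetical counts vs. Bayes' theorem directly.
N = 100_000                  # assumed population size (illustrative)
PC, NC = 0.999, 0.001        # class priors P(IT), P(~IT)
TPR, FPR = 0.98, 0.02        # P(EM|IT), P(EM|~IT)

TP = TPR * PC * N            # IT patients flagged as emergencies
FP = FPR * NC * N            # ~IT patients flagged as emergencies
ppv = TP / (TP + FP)         # precision = TP / (TP + FP)

bayes = TPR * PC / (TPR * PC + FPR * NC)   # posterior P(IT|EM)
```

Both routes give the same value because the population size cancels in the ratio.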
Likewise, the probabilities can be found for the delay-sensitive monitoring and general monitoring cases. Using this method, the probabilities of any number of patient cases can be calculated and evaluated. Given the Bayesian belief network model and its conditional probability tables, the probabilities of uncertain conditions can be determined. The results are the posterior probabilities calculated from the hypotheses and the degree of belief in each node, and they are tabulated in Table 3 and Table 4.

6. Conclusions

Recent WBAN papers applying telemedicine for emergency care were collected and briefly presented. In addition, Bayesian network-based prediction of emergency conditions was analyzed, and its posterior probabilities were found. Bayesian thinking can be applied when there is uncertainty in data: the inputs are linked to the output through probability levels, and these help in decision-making. Here, this technique was applied to decision-making with respect to immediate teleconsultation. There is no confusion matrix for a population of people with and without immediate consultation who do or do not have emergency conditions; instead, only the prior probabilities with regard to the population and the emergency condition are known, which makes this method easily applicable. It is suitable when beliefs are known but direct calculation is not possible in the real world. This helps in determining the probability of sending emergency messages to a remote doctor and how quickly emergency messages can be sent. Machine learning applies Bayes' theorem for classification and prediction: any number of models can be tested on a dataset by finding the probability of each hypothesis given the observed data. In addition, maximum a posteriori (MAP) estimation can be applied for prediction in linear regression and binary classification as an alternative to maximum likelihood estimation (MLE). These networks are also applied in monitoring health outcomes and analysis.

Author Contributions

Conceptualization and methodology, L.R.; validation, V.P.; formal analysis, investigation, and resources, L.R.; writing—original draft preparation, writing—review and editing, L.R.; supervision, V.P.; funding acquisition, L.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Visvesvaraya PhD Scheme by Media Lab Asia, Deity, Government of India, grant number VISPHD-MEITY-1607. The APC was funded by the Visvesvaraya PhD Scheme by Media Lab Asia, Deity, Government of India.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

This appendix contains the posterior probability calculations derived from the observed beliefs in the conditional probability tables.
P(IT|EM) = P(EM|IT) × P(IT)/P(EM) = (0.98 × 0.999)/0.97904 = 0.999979,
P(IT|~EM) = P(~EM|IT) × P(IT)/P(~EM) = (0.04 × 0.999)/0.04092 = 0.976539,
P(~IT|EM) = P(EM|~IT) × P(~IT)/P(EM) = (0.02 × 0.001)/0.97904 = 0.00002,
P(~IT|~EM) = P(~EM|~IT) × P(~IT)/P(~EM) = (0.96 × 0.001)/0.04092 = 0.02346.
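The four node-4 posteriors above can be verified with a short script; the variable names are illustrative, and the CPT values are taken from the text as given.

```python
# Node 4: posteriors for immediate teleconsultation (IT) given EM.
p_it, p_not_it = 0.999, 0.001
p_em_it, p_em_not_it = 0.98, 0.02          # P(EM|IT), P(EM|~IT)
p_not_em_it, p_not_em_not_it = 0.04, 0.96  # P(~EM|IT), P(~EM|~IT)

# Marginals by total probability.
p_em = p_em_it * p_it + p_em_not_it * p_not_it              # 0.97904
p_not_em = p_not_em_it * p_it + p_not_em_not_it * p_not_it  # 0.04092

posteriors = {
    "P(IT|EM)":   p_em_it * p_it / p_em,
    "P(IT|~EM)":  p_not_em_it * p_it / p_not_em,
    "P(~IT|EM)":  p_em_not_it * p_not_it / p_em,
    "P(~IT|~EM)": p_not_em_not_it * p_not_it / p_not_em,
}
```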
P(EM|VLL) = P(VLL|EM) × P(EM)/P(VLL) = (0.90 × 0.97904)/0.883182 = 0.99768,
P(EM|~VLL) = P(~VLL|EM) × P(EM)/P(~VLL) = (0.10 × 0.97904)/0.136778 = 0.71578,
P(~EM|VLL) = P(VLL|~EM) × P(~EM)/P(VLL) = (0.05 × 0.04092)/0.883182 = 0.002316,
P(~EM|~VLL) = P(~VLL|~EM) × P(~EM)/P(~VLL) = (0.95 × 0.04092)/0.136778 = 0.284212.
P(EM|HC) = P(HC|EM) × P(EM)/P(HC) = (0.70 × 0.97904)/0.695118 = 0.98591,
P(EM|~HC) = P(~HC|EM) × P(EM)/P(~HC) = (0.30 × 0.97904)/0.334222 = 0.87879,
P(~EM|HC) = P(HC|~EM) × P(~EM)/P(HC) = (0.01 × 0.04092)/0.695118 = 0.000588,
P(~EM|~HC) = P(~HC|~EM) × P(~EM)/P(~HC) = (0.99 × 0.04092)/0.334222 = 0.121209.
P(EM|LPLR) = P(LPLR|EM) × P(EM)/P(LPLR) = (0.95 × 0.97904)/0.93172 = 0.998248,
P(EM|~LPLR) = P(~LPLR|EM) × P(EM)/P(~LPLR) = (0.05 × 0.97904)/0.088235 = 0.554791,
P(~EM|LPLR) = P(LPLR|~EM) × P(~EM)/P(LPLR) = (0.04 × 0.04092)/0.93172 = 0.0017567,
P(~EM|~LPLR) = P(~LPLR|~EM) × P(~EM)/P(~LPLR) = (0.96 × 0.04092)/0.088235 = 0.44521.
P(DS|LL) = P(LL|DS) × P(DS)/P(LL) = (0.94 × 0.888)/0.84144 = 0.992013,
P(DS|~LL) = P(~LL|DS) × P(DS)/P(~LL) = (0.08 × 0.888)/0.17408 = 0.408088,
P(~DS|LL) = P(LL|~DS) × P(~DS)/P(LL) = (0.06 × 0.112)/0.84144 = 0.007986,
P(~DS|~LL) = P(~LL|~DS) × P(~DS)/P(~LL) = (0.92 × 0.112)/0.17408 = 0.591911.
P(DS|VLPLR) = P(VLPLR|DS) × P(DS)/P(VLPLR) = (0.97 × 0.882)/0.86472 = 0.98938,
P(DS|~VLPLR) = P(~VLPLR|DS) × P(DS)/P(~VLPLR) = (0.07 × 0.882)/0.16632 = 0.37121,
P(~DS|VLPLR) = P(VLPLR|~DS) × P(~DS)/P(VLPLR) = (0.03 × 0.112)/0.86472 = 0.003885,
P(~DS|~VLPLR) = P(~VLPLR|~DS) × P(~DS)/P(~VLPLR) = (0.93 × 0.112)/0.16632 = 0.62626.
P(GM|MDR) = P(MDR|GM) × P(GM)/P(MDR) = (0.99 × 0.777)/0.77146 = 0.997109,
P(GM|~MDR) = P(~MDR|GM) × P(GM)/P(~MDR) = (0.02 × 0.777)/0.23408 = 0.066387,
P(~GM|MDR) = P(MDR|~GM) × P(~GM)/P(MDR) = (0.01 × 0.223)/0.77146 = 0.0028906,
P(~GM|~MDR) = P(~MDR|~GM) × P(~GM)/P(~MDR) = (0.98 × 0.223)/0.23408 = 0.933612.
P(GM|HPLR) = P(HPLR|GM) × P(GM)/P(HPLR) = (0.95 × 0.777)/0.7493 = 0.98511,
P(GM|~HPLR) = P(~HPLR|GM) × P(GM)/P(~HPLR) = (0.03 × 0.777)/0.23962 = 0.09727,
P(~GM|HPLR) = P(HPLR|~GM) × P(~GM)/P(HPLR) = (0.05 × 0.223)/0.7493 = 0.014880,
P(~GM|~HPLR) = P(~HPLR|~GM) × P(~GM)/P(~HPLR) = (0.97 × 0.223)/0.23962 = 0.90272.
P(GM|ML) = P(ML|GM) × P(GM)/P(ML) = (0.96 × 0.777)/0.75454 = 0.98857,
P(GM|~ML) = P(~ML|GM) × P(GM)/P(~ML) = (0.01 × 0.777)/0.22854 = 0.033998,
P(~GM|ML) = P(ML|~GM) × P(~GM)/P(ML) = (0.04 × 0.223)/0.75454 = 0.01182,
P(~GM|~ML) = P(~ML|~GM) × P(~GM)/P(~ML) = (0.99 × 0.223)/0.22854 = 0.96600.
P(GM|MJ) = P(MJ|GM) × P(GM)/P(MJ) = (0.94 × 0.777)/0.74376 = 0.98201,
P(GM|~MJ) = P(~MJ|GM) × P(GM)/P(~MJ) = (0.05 × 0.777)/0.2507 = 0.15496,
P(~GM|MJ) = P(MJ|~GM) × P(~GM)/P(MJ) = (0.06 × 0.223)/0.74376 = 0.017989,
P(~GM|~MJ) = P(~MJ|~GM) × P(~GM)/P(~MJ) = (0.95 × 0.223)/0.2507 = 0.84503.
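Nodes 5 to 13 above all follow the same single-parent pattern, so a generic helper (hypothetical, not from the paper) reproduces each node's four posteriors. The state priors are passed in separately because the appendix uses P(EM) = 0.97904 and P(~EM) = 0.04092 as given; node 5 (VLL) is shown.

```python
# Generic posterior computation for a single-parent node.
# sens = P(obs|state), spec = P(~obs|~state); priors passed explicitly.
def node_posteriors(sens, spec, p_state, p_not_state):
    p_obs = sens * p_state + (1 - spec) * p_not_state        # P(obs)
    p_not_obs = (1 - sens) * p_state + spec * p_not_state    # P(~obs)
    return {
        "P(state|obs)":   sens * p_state / p_obs,
        "P(state|~obs)":  (1 - sens) * p_state / p_not_obs,
        "P(~state|obs)":  (1 - spec) * p_not_state / p_obs,
        "P(~state|~obs)": spec * p_not_state / p_not_obs,
    }

# Node 5: obs = VLL, state = EM, with the values used above.
node5 = node_posteriors(0.90, 0.95, 0.97904, 0.04092)
```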
P(EM, DS|LJ) = P(LJ|EM, DS) × P(EM, DS)/P(LJ) = (0.7 × 0.5)/0.92 = 0.38,
P(EM, ~DS|LJ) = P(LJ|EM, ~DS) × P(EM, ~DS)/P(LJ) = (0.6 × 0.2)/0.92 = 0.13,
P(~EM, DS|LJ) = P(LJ|~EM, DS) × P(~EM, DS)/P(LJ) = (0.5 × 0.2)/0.92 = 0.108,
P(~EM, ~DS|LJ) = P(LJ|~EM, ~DS) × P(~EM, ~DS)/P(LJ) = (0.8 × 0.1)/0.92 = 0.087,
P(EM, DS|~LJ) = P(~LJ|EM, DS) × P(EM, DS)/P(~LJ) = (0.3 × 0.5)/0.92 = 0.163,
P(EM, ~DS|~LJ) = P(~LJ|EM, ~DS) × P(EM, ~DS)/P(~LJ) = (0.4 × 0.2)/0.92 = 0.087,
P(~EM, DS|~LJ) = P(~LJ|~EM, DS) × P(~EM, DS)/P(~LJ) = (0.5 × 0.2)/0.92 = 0.108,
P(~EM, ~DS|~LJ) = P(~LJ|~EM, ~DS) × P(~EM, ~DS)/P(~LJ) = (0.2 × 0.1)/0.92 = 0.022.
P(EM, DS|HDR) = P(HDR|EM, DS) × P(EM, DS)/P(HDR) = (0.6 × 0.7)/0.8 = 0.525,
P(EM, ~DS|HDR) = P(HDR|EM, ~DS) × P(EM, ~DS)/P(HDR) = (0.5 × 0.1)/0.8 = 0.0625,
P(~EM, DS|HDR) = P(HDR|~EM, DS) × P(~EM, DS)/P(HDR) = (0.7 × 0.1)/0.8 = 0.0875,
P(~EM, ~DS|HDR) = P(HDR|~EM, ~DS) × P(~EM, ~DS)/P(HDR) = (0.8 × 0.1)/0.8 = 0.1,
P(EM, DS|~HDR) = P(~HDR|EM, DS) × P(EM, DS)/P(~HDR) = (0.4 × 0.7)/0.8 = 0.35,
P(EM, ~DS|~HDR) = P(~HDR|EM, ~DS) × P(EM, ~DS)/P(~HDR) = (0.5 × 0.1)/0.8 = 0.0625,
P(~EM, DS|~HDR) = P(~HDR|~EM, DS) × P(~EM, DS)/P(~HDR) = (0.3 × 0.1)/0.8 = 0.0375,
P(~EM, ~DS|~HDR) = P(~HDR|~EM, ~DS) × P(~EM, ~DS)/P(~HDR) = (0.2 × 0.1)/0.8 = 0.025.
P(DS, GM|LC) = P(LC|DS, GM) × P(DS, GM)/P(LC) = (0.7 × 0.6)/1.0 = 0.42,
P(DS, ~GM|LC) = P(LC|DS, ~GM) × P(DS, ~GM)/P(LC) = (0.6 × 0.2)/1.0 = 0.12,
P(~DS, GM|LC) = P(LC|~DS, GM) × P(~DS, GM)/P(LC) = (0.5 × 0.1)/1.0 = 0.05,
P(~DS, ~GM|LC) = P(LC|~DS, ~GM) × P(~DS, ~GM)/P(LC) = (0.8 × 0.1)/1.0 = 0.08,
P(DS, GM|~LC) = P(~LC|DS, GM) × P(DS, GM)/P(~LC) = (0.3 × 0.6)/1.0 = 0.18,
P(DS, ~GM|~LC) = P(~LC|DS, ~GM) × P(DS, ~GM)/P(~LC) = (0.4 × 0.2)/1.0 = 0.08,
P(~DS, GM|~LC) = P(~LC|~DS, GM) × P(~DS, GM)/P(~LC) = (0.5 × 0.1)/1.0 = 0.05,
P(~DS, ~GM|~LC) = P(~LC|~DS, ~GM) × P(~DS, ~GM)/P(~LC) = (0.2 × 0.1)/1.0 = 0.02.

References

  1. Misra, S.; Sarkar, S. Priority-based time-slot allocation in wireless body area networks during medical emergency situations: An evolutionary game-theoretic perspective. IEEE J. Biomed. Health Inform. 2014, 19, 541–548. [Google Scholar] [CrossRef] [PubMed]
  2. Ross, P.E. Managing care through the air [remote health monitoring]. IEEE Spectr. 2004, 41, 26–31. [Google Scholar] [CrossRef]
  3. Salayma, M.; Al-Dubai, A.; Romdhani, I.; Nasser, Y. Wireless body area network (WBAN) a survey on reliability, fault tolerance, and technologies coexistence. ACM Comput. Surv. (CSUR) 2017, 50, 1–38. [Google Scholar] [CrossRef] [Green Version]
  4. Bradai, N.; Chaari, L.; Kamoun, L. A comprehensive overview of wireless body area networks (WBAn). In Digital Advances in Medicine, E-Health, and Communication Technologies; IGI Global: Hershey, PA, USA, 2013; pp. 1–32. [Google Scholar]
  5. Elhadj, H.B.; Chaari, L.; Kamoun, L. A survey of routing protocols in wireless body area networks for healthcare applications. Int. J. E Health Med. Commun. (IJEHMC) 2012, 3, 1–18. [Google Scholar] [CrossRef]
  6. Coping with the challenges of Designing Medical Body Area Networks. Available online: https://zmt.swiss/applications/wireless-body-area-networks/ (accessed on 14 January 2020).
  7. Meharouech, A.; Elias, J.; Mehaoua, A. Moving towards body-to-body sensor networks for ubiquitous applications: A survey. J. Sens. Actuator Netw. 2019, 8, 27. [Google Scholar] [CrossRef] [Green Version]
  8. Khan, R.A.; Pathan, A.S.K. The state-of-the-art wireless body area sensor networks: A survey. Int. J. Distrib. Sens. Netw. 2018, 14. [Google Scholar] [CrossRef]
  9. Fallahpour, M. Wireless body area networking: Joint physical-networking layer simulation and modeling. In Medical Internet of Things (m-IoT)-Enabling Technologies and Emerging Applications; IntechOpen: London, UK, 2019. [Google Scholar]
  10. Cicioğlu, M.; Çalhan, A. SDN-based wireless body area network routing algorithm for healthcare architecture. ETRI J. 2019, 41, 452–464. [Google Scholar] [CrossRef] [Green Version]
  11. Maheswar, R.; Kanagachidambaresan, G.R.; Jayaparvathy, R.; Thampi, S.M. (Eds.) Body Area Network Challenges and Solutions; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
  12. Evaluation of Wireless Body Area Networks. Available online: https://www.ijitee.org/wp-content/uploads/papers/v8i9S/I10560789S19.pdf (accessed on 20 January 2020).
  13. Negra, R.; Jemili, I.; Belghith, A. Wireless body area networks: Applications and technologies. Procedia Comput. Sci. 2016, 83, 1274–1281. [Google Scholar] [CrossRef] [Green Version]
  14. Nabi, M.; Geilen, M.; Basten, T. Wireless body area network data delivery. In Telemedicine and Electronic Medicine; CRC Press: Boca Raton, FL, USA, 2018; pp. 246–265. [Google Scholar]
  15. Bu, G.; Potop-Butucaru, M. Ban-gzkp: Optimal zero knowledge proof based scheme for wireless body area networks. Ad Hoc Netw. 2018, 77, 28–41. [Google Scholar] [CrossRef] [Green Version]
  16. Zuhra, F.T.; Bakar, K.A.; Ahmed, A.; Tunio, M.A. Routing protocols in wireless body sensor networks: A comprehensive survey. J. Netw. Comput. Appl. 2017, 99, 73–97. [Google Scholar] [CrossRef]
  17. Shen, J.; Gui, Z.; Ji, S.; Shen, J.; Tan, H.; Tang, Y. Cloud-aided lightweight certificateless authentication protocol with anonymity for wireless body area networks. J. Netw. Comput. Appl. 2018, 106, 117–123. [Google Scholar] [CrossRef]
  18. Hasan, K.; Biswas, K.; Ahmed, K.; Nafi, N.S.; Islam, M.S. A comprehensive review of wireless body area network. J. Netw. Comput. Appl. 2019, 143, 178–198. [Google Scholar] [CrossRef]
  19. Samal, T.K.; Patra, S.C.; Kabat, M.R. An adaptive cuckoo search based algorithm for placement of relay nodes in wireless body area networks. J. King Saud Univ. Comput. Inf. Sci. 2019. [Google Scholar] [CrossRef]
  20. Yi, C.; Zhao, Z.; Cai, J.; de Faria, R.L.; Zhang, G.M. Priority-aware pricing-based capacity sharing scheme for beyond-wireless body area networks. Comput. Netw. 2016, 98, 29–43. [Google Scholar] [CrossRef]
  21. Liu, X.; Zhang, R.; Zhao, M. A robust authentication scheme with dynamic password for wireless body area networks. Comput. Netw. 2019, 161, 220–234. [Google Scholar] [CrossRef]
  22. Arfaoui, A.; Kribeche, A.; Senouci, S.M.; Hamdi, M. Game-based adaptive anomaly detection in wireless body area networks. Comput. Netw. 2019, 163, 106870. [Google Scholar] [CrossRef]
  23. Zhou, Y.; Sheng, Z.; Mahapatra, C.; Leung, V.C.; Servati, P. Topology design and cross-layer optimization for wireless body sensor networks. Ad Hoc Netw. 2017, 59, 48–62. [Google Scholar] [CrossRef] [Green Version]
  24. Kaur, N.; Singh, S. Optimized cost effective and energy efficient routing protocol for wireless body area networks. Ad Hoc Netw. 2017, 61, 65–84. [Google Scholar] [CrossRef]
  25. Badreddine, W.; Khernane, N.; Potop-Butucaru, M.; Chaudet, C. Convergecast in wireless body area networks. Ad Hoc Netw. 2017, 66, 40–51. [Google Scholar] [CrossRef]
  26. Zebboudj, S.; Cherifi, F.; Mohammedi, M.; Omar, M. Secure and efficient ECG-based authentication scheme for medical body area sensor networks. Smart Health 2017, 3, 75–84. [Google Scholar] [CrossRef]
  27. Wang, W.; Shi, X.; Qin, T. Encryption-free authentication and integrity protection in body area networks through physical unclonable functions. Smart Health 2019, 12, 66–81. [Google Scholar] [CrossRef]
  28. Mahmud, M.S.; Fang, H.; Carreiro, S.; Wang, H.; Boyer, E.W. Wearables technology for drug abuse detection: A survey of recent advancement. Smart Health 2019, 13, 100062. [Google Scholar] [CrossRef]
  29. Silvera-Tawil, D.; Hussain, M.S.; Li, J. Emerging technologies for precision health: An insight into sensing technologies for health and wellbeing. Smart Health 2019, 15. [Google Scholar] [CrossRef]
  30. Özderya, H.Y.; Erdöl, H.; Kayıkçıoğlu, T.; Yılmaz, A.Ö.; Kaya, İ. Wireless body area network studies for telemedicine applications using IEEE 802.15.6 standard. In CMBEBIH; Springer: Singapore, Singapore, 2017; pp. 666–670. [Google Scholar]
  31. Agrahari, R.; Foroushani, A.; Docking, T.R.; Chang, L.; Duns, G.; Hudoba, M.; Karsan, A.; Zare, H. Applications of Bayesian network models in predicting types of hematological malignancies. Sci. Rep. 2018, 8, 6951. [Google Scholar] [CrossRef] [PubMed]
  32. Bellot, D.; Boyer, A.; Charpillet, F.E.F. Designing smart agent based telemedicine systems using dynamic bayesian networks: An application to kidney disease people. In Proceedings of the 4th International Workshop on Enterprise Networking and Computing in Health Care Industry, Nancy, France, 6–7 June 2002; p. 8. [Google Scholar]
  33. Beretta, S.; Castelli, M.; Gonçalves, I.; Henriques, R.; Ramazzotti, D. Learning the structure of Bayesian Networks: A quantitative assessment of the effect of different algorithmic schemes. Complexity 2018, 1–12. [Google Scholar] [CrossRef]
  34. Chan, F.; Wong, C.; Hon, T.; Choi, A. Bayesian network model for reducing accident rates of electrical and mechanical (E&M) work. Int. J. Environ. Res. Public Health 2018, 15, 2496. [Google Scholar]
  35. Christophersen, A.; Deligne, N.I.; Hanea, A.M.; Chardot, L.; Fournier, N.; Aspinall, W.P. Bayesian network modeling and expert elicitation for probabilistic eruption forecasting: Pilot study for Whakaari/White Island, New Zealand. Front. Earth Sci. 2018, 6, 211. [Google Scholar] [CrossRef] [Green Version]
  36. Li, M.; Liu, K. Application of intelligent dynamic bayesian network with wavelet analysis for probabilistic prediction of storm track intensity index. Atmosphere 2018, 9, 224. [Google Scholar] [CrossRef] [Green Version]
  37. Pandey, S.K.; Mishra, R.B.; Triphathi, A.K. Software bug prediction prototype using bayesian network classifier: A comprehensive model. Procedia Comput. Sci. 2018, 132, 1412–1421. [Google Scholar] [CrossRef]
  38. Park, E.; Chang, H.J.; Nam, H.S. A bayesian network model for predicting post-stroke outcomes with available risk factors. Front. Neurol. 2018, 9, 699. [Google Scholar] [CrossRef]
  39. Scanagatta, M.; Corani, G.; De Campos, C.P.; Zaffalon, M. Approximate structure learning for large Bayesian networks. Mach. Learn. 2018, 107, 1209–1227. [Google Scholar] [CrossRef] [Green Version]
  40. Scutari, M.; Auconi, P.; Caldarelli, G.; Franchi, L. Bayesian networks analysis of malocclusion data. Sci. Rep. 2017, 7, 15236. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Wang, Y.; Cao, J.; Liu, L.; Feng, K.; Hong, S.; Xi, B.E.F. Framework of telemedicine diagnosis decision-making with Bayesian network based on multi-agent system. In Proceedings of the 7th International Conference on Computer Science & Education (ICCSE), Melbourne, VIC, Australia, 14–17 July 2012; pp. 68–70. [Google Scholar]
  42. Kammar, A.; Hernández-Hernández, M.; López-Moreno, P.; Ortíz-Bueno, A.; Martínez-Montaño, M. Probability and body composition of metabolic syndrome in young adults: Use of the bayes theorem as diagnostic evidence of the waist-to-height ratio. Stats 2018, 1, 3. [Google Scholar] [CrossRef] [Green Version]
  43. A Gentle Introduction to Bayes Theorem for Machine Learning. Available online: https://machinelearningmastery.com/bayes-theorem-for-machine-learning/ (accessed on 29 February 2020).
  44. A Gentle Introduction to Bayesian Belief Networks. Available online: https://machinelearningmastery.com/introduction-to-bayesian-belief-networks/ (accessed on 29 February 2020).
Figure 1. Schematic view of the methodology.
Figure 2. Bayes network model for telemedicine.
Table 1. Conditional probability table (CPT) for each node in the directed acyclic graph (DAG). IT—immediate teleconsultation; DS—delay-sensitive monitoring; GM—general monitoring; EM—emergency monitoring; VLL—very low latency; HC—high cost; LPLR—low packet loss rate; LL—low latency; VLPLR—very low packet loss rate; MDR—moderate data rate; HPLR—high packet loss rate; ML—moderate latency; MJ—moderate jitter; LJ—low jitter; HDR—high data rate; LC—low cost.
Node No. | Conditions | Probabilities
Node 1 | If IT is true | P(IT) = 0.999
Node 2 | If DS is true | P(DS) = 0.888
Node 3 | If GM is true | P(GM) = 0.777
Node 4 | If IT is true | P(EM) = 0.98
| If IT is false | P(~EM) = 0.02
Node 5 | If EM is true | P(VLL) = 0.90
| If EM is false | P(~VLL) = 0.05
Node 6 | If EM is true | P(HC) = 0.70
| If EM is false | P(~HC) = 0.30
Node 7 | If EM is true | P(LPLR) = 0.95
| If EM is false | P(~LPLR) = 0.05
Node 8 | If DS is true | P(LL) = 0.94
| If DS is false | P(~LL) = 0.06
Node 9 | If DS is true | P(VLPLR) = 0.97
| If DS is false | P(~VLPLR) = 0.03
Node 10 | If GM is true | P(MDR) = 0.99
| If GM is false | P(~MDR) = 0.01
Node 11 | If GM is true | P(HPLR) = 0.95
| If GM is false | P(~HPLR) = 0.05
Node 12 | If GM is true | P(ML) = 0.96
| If GM is false | P(~ML) = 0.04
Node 13 | If GM is true | P(MJ) = 0.94
| If GM is false | P(~MJ) = 0.06
Node 14 | If EM and DS are true | P(LJ) = 0.95; P(~LJ) = 0.05
| If EM is true and DS is false | P(LJ) = 0.94; P(~LJ) = 0.06
| If EM is false and DS is true | P(LJ) = 0.74; P(~LJ) = 0.26
| If EM and DS are false | P(LJ) = 0.60; P(~LJ) = 0.40
Node 15 | If EM and DS are true | P(HDR) = 0.96; P(~HDR) = 0.04
| If EM is true and DS is false | P(HDR) = 0.92; P(~HDR) = 0.08
| If EM is false and DS is true | P(HDR) = 0.74; P(~HDR) = 0.26
| If EM and DS are false | P(HDR) = 0.70; P(~HDR) = 0.30
Node 16 | If DS and GM are true | P(LC) = 0.98; P(~LC) = 0.02
| If DS is true and GM is false | P(LC) = 0.94; P(~LC) = 0.06
| If DS is false and GM is true | P(LC) = 0.70; P(~LC) = 0.30
| If DS and GM are false | P(LC) = 0.60; P(~LC) = 0.40
Table 2. Confusion matrix.
| Positive Class | Negative Class
Positive prediction | True positive (TP) | False positive (FP)
Negative prediction | False negative (FN) | True negative (TN)
Table 3. Posterior probabilities for nodes (4–13).
Node No. | Conditions | Posterior Probabilities
Node 4 | P(EM) = 0.97904; P(~EM) = 0.04092 | P(IT|EM) = 0.999979; P(IT|~EM) = 0.976539; P(~IT|EM) = 0.00002; P(~IT|~EM) = 0.02346
Node 5 | P(VLL) = 0.883182; P(~VLL) = 0.136778 | P(EM|VLL) = 0.99768; P(EM|~VLL) = 0.71578; P(~EM|VLL) = 0.002316; P(~EM|~VLL) = 0.284212
Node 6 | P(HC) = 0.695118; P(~HC) = 0.334222 | P(EM|HC) = 0.98591; P(EM|~HC) = 0.87879; P(~EM|HC) = 0.000588; P(~EM|~HC) = 0.121209
Node 7 | P(LPLR) = 0.93172; P(~LPLR) = 0.088235 | P(EM|LPLR) = 0.998248; P(EM|~LPLR) = 0.554791; P(~EM|LPLR) = 0.0017567; P(~EM|~LPLR) = 0.44521
Node 8 | P(LL) = 0.84144; P(~LL) = 0.17408 | P(DS|LL) = 0.992013; P(DS|~LL) = 0.408088; P(~DS|LL) = 0.007986; P(~DS|~LL) = 0.591911
Node 9 | P(VLPLR) = 0.86472; P(~VLPLR) = 0.16632 | P(DS|VLPLR) = 0.98938; P(DS|~VLPLR) = 0.37121; P(~DS|VLPLR) = 0.003885; P(~DS|~VLPLR) = 0.62626
Node 10 | P(MDR) = 0.77146; P(~MDR) = 0.23408 | P(GM|MDR) = 0.997109; P(GM|~MDR) = 0.066387; P(~GM|MDR) = 0.0028906; P(~GM|~MDR) = 0.933612
Node 11 | P(HPLR) = 0.7493; P(~HPLR) = 0.23962 | P(GM|HPLR) = 0.98511; P(GM|~HPLR) = 0.09727; P(~GM|HPLR) = 0.014880; P(~GM|~HPLR) = 0.90272
Node 12 | P(ML) = 0.75454; P(~ML) = 0.22854 | P(GM|ML) = 0.98857; P(GM|~ML) = 0.033998; P(~GM|ML) = 0.01182; P(~GM|~ML) = 0.96600
Node 13 | P(MJ) = 0.74376; P(~MJ) = 0.2507 | P(GM|MJ) = 0.98201; P(GM|~MJ) = 0.15496; P(~GM|MJ) = 0.017989; P(~GM|~MJ) = 0.84503
Table 4. Posterior probabilities for nodes (14–16).
Node No. | Conditions | Posterior Probabilities
Node 14 | If EM is true and DS is true and {P(LJ) = 0.95 and P(~LJ) = 0.05} | P(EM, DS|LJ) = 0.380 and P(EM, DS|~LJ) = 0.163
| If EM is true and DS is false and {P(LJ) = 0.94 and P(~LJ) = 0.06} | P(EM, ~DS|LJ) = 0.130 and P(EM, ~DS|~LJ) = 0.087
| If EM is false and DS is true and {P(LJ) = 0.74 and P(~LJ) = 0.26} | P(~EM, DS|LJ) = 0.108 and P(~EM, DS|~LJ) = 0.108
| If EM is false and DS is false and {P(LJ) = 0.60 and P(~LJ) = 0.40} | P(~EM, ~DS|LJ) = 0.087 and P(~EM, ~DS|~LJ) = 0.022
Node 15 | If EM is true and DS is true and {P(HDR) = 0.96 and P(~HDR) = 0.04} | P(EM, DS|HDR) = 0.525 and P(EM, DS|~HDR) = 0.35
| If EM is true and DS is false and {P(HDR) = 0.92 and P(~HDR) = 0.08} | P(EM, ~DS|HDR) = 0.0625 and P(EM, ~DS|~HDR) = 0.0625
| If EM is false and DS is true and {P(HDR) = 0.74 and P(~HDR) = 0.26} | P(~EM, DS|HDR) = 0.0875 and P(~EM, DS|~HDR) = 0.0375
| If EM is false and DS is false and {P(HDR) = 0.70 and P(~HDR) = 0.30} | P(~EM, ~DS|HDR) = 0.1 and P(~EM, ~DS|~HDR) = 0.025
Node 16 | If DS is true and GM is true and {P(LC) = 0.98 and P(~LC) = 0.02} | P(DS, GM|LC) = 0.42 and P(DS, GM|~LC) = 0.18
| If DS is true and GM is false and {P(LC) = 0.94 and P(~LC) = 0.06} | P(DS, ~GM|LC) = 0.12 and P(DS, ~GM|~LC) = 0.08
| If DS is false and GM is true and {P(LC) = 0.70 and P(~LC) = 0.30} | P(~DS, GM|LC) = 0.05 and P(~DS, GM|~LC) = 0.05
| If DS is false and GM is false and {P(LC) = 0.60 and P(~LC) = 0.40} | P(~DS, ~GM|LC) = 0.08 and P(~DS, ~GM|~LC) = 0.02

Share and Cite

MDPI and ACS Style

R, L.; P, V. Wireless Body Area Network (WBAN)-Based Telemedicine for Emergency Care. Sensors 2020, 20, 2153. https://doi.org/10.3390/s20072153
