Review

State-of-the-Art Review on Wearable Obstacle Detection Systems Developed for Assistive Technologies and Footwear

Institute for Health and Sport, Victoria University, Melbourne, VIC 3000, Australia
*
Author to whom correspondence should be addressed.
Sensors 2023, 23(5), 2802; https://doi.org/10.3390/s23052802
Submission received: 30 December 2022 / Revised: 9 February 2023 / Accepted: 10 February 2023 / Published: 3 March 2023
(This article belongs to the Special Issue Feature Papers in Wearables 2022)

Abstract

Walking independently is essential to maintaining our quality of life, but safe locomotion depends on perceiving hazards in the everyday environment. To address this problem, there is an increasing focus on developing assistive technologies that can alert the user to the risk of destabilizing foot contact with either the ground or obstacles, leading to a fall. Shoe-mounted sensor systems designed to monitor foot-obstacle interaction are being employed to identify tripping risk and provide corrective feedback. Advances in smart wearable technologies, integrating motion sensors with machine learning algorithms, have led to developments in shoe-mounted obstacle detection. The focus of this review is gait-assisting wearable sensors and hazard detection for pedestrians. This literature represents a research front that is critically important in paving the way towards practical, low-cost, wearable devices that can make walking safer and reduce the increasing financial and human costs of fall injuries.

1. Introduction

Extensive epidemiological and medical research now shows that it is imperative to prevent falls in older people and across a broad range of gait-impaired populations. The World Health Organization (WHO) identified falling as the second highest cause of unintentional death, after road accidents, and approximately 37.3 million falls requiring medical attention occur each year [1]. In Australia, the cost of treating injury is the third highest category of healthcare spending after musculoskeletal disorders and cardiovascular diseases, and AUD 3.6 billion is spent annually by the Australian healthcare system on fall-related physical injuries [2]. Many falls do not cause serious injuries, but one out of every five falls causes significant injury, with approximately 3 million older people treated in emergency departments each year due to a fall [3,4]. WHO statistics show that globally at least 2.2 billion people have a visual impairment, which in the case of older adults contributes to social isolation and walking difficulties, increasing their risk of falling and consequent likelihood of entering residential care [5]. Fear of falling also affects an individual's quality of life by restricting everyday mobility and decreasing opportunities for recreation and social connection [6].
Biomechanically, falls result from any balance perturbation that cannot be restored [7], but tripping over obstacles is the principal cause of balance loss, accounting for more than 53% of falls [8]. Tripping can be defined as forceful, unanticipated contact with obstacles or irregularities in the walking surface, and reduced foot-ground clearance during the mid-swing phase of the gait cycle is highly hazardous [9]. The ability to continuously adapt foot trajectory to clear obstacles, such as roadside curbs or steps, is critical to safe locomotion. In order to detect environmental hazards in the path of the user, obstacle detection technologies that perceive the environment, utilising sensors such as ultrasound, camera, infrared, radar, and laser range finders, have been extensively investigated in different domains [10,11,12,13]. Effective sensor integration with computational vision and environmental scanning is facilitating developments in real-time motion monitoring and effective interventions for fall prevention [14].
Because the shoe is the interface between the foot and the ground, research in shoe-integrated technology began more than three decades ago, incorporating comfort and convenience into the design [15]. Technological transformations to the 'smart shoe' began with variables associated with walking speed and calorie consumption [16,17], advancing to rehabilitation applications [18]. Progress in wearable sensors, microfabrication, data acquisition, and processing, combined with low-power, portable, wireless systems, led to assistive footwear designed for individuals with visual or physical impairments.
Previous reviews have summarized the application and functionality of assistive devices, including shoe-based systems. Gokalgandhi, Kamdar [19] provided a review of smart technologies embedded in shoes, including electronic, mechanical, and electromechanical devices. Hegde, Bries [20] also summarized advances in footwear-based wearable systems, with applications in gait monitoring, plantar pressure measurement, posture and activity classification, body weight and energy expenditure estimation, biofeedback, navigation, and fall risk. Assistive technologies developed for visually impaired users have also been described [13,21,22]. Our aim here was to outline and critically evaluate shoe-integrated systems that incorporate obstacle detection to identify environmental hazards that pose a tripping risk. Despite technological advancements in assistive devices for collision avoidance in visually impaired individuals, the long cane and guide dog remain the most commonly used. The white cane is preferred due to its reliability, simplicity, low cost, and minimal maintenance [22], but it does not entirely protect against collisions, and guide dogs only provide support on familiar routes. There have, therefore, been developments in smart sensor-incorporated adaptations to assist navigation, including electronic canes/sticks [23,24], glasses [25], belts [26], caps [27], bracelets [28], and gloves [29]. Despite these advances, designers have not adapted them to the user's gait characteristics [30], and people with navigational difficulties therefore remain less active.

Obstacle Detection Smart Shoe System

As shown in Figure 1, the fundamental requirements for an obstacle detection smart shoe are sensing, processing, and alerting. The sensing system has active or passive sensors to detect obstacles, while the processing unit for portable devices includes a microcontroller to trigger the sensors, process the data, and analyse the risk of object contact. When an obstacle is detected, the alerting system triggers an auditory or vibrotactile stimulus to prompt an avoiding action. Selection of the hardware for each of these three principal units determines the effectiveness, reliability, and acceptance of the shoe system. Selected state-of-the-art shoe obstacle-detection systems are shown in Table 1 for illustration.
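As a minimal illustration of this sense-process-alert loop (a generic sketch, not code from any reviewed system; the threshold, sensing, and alerting functions below are hypothetical placeholders):

```python
import random
import time

OBSTACLE_THRESHOLD_M = 1.0  # hypothetical alert distance; a real system would tune this


def read_distance_m():
    """Stand-in for the sensing unit (e.g., an ultrasonic ranger); simulated here."""
    return random.uniform(0.2, 4.0)


def trigger_alert(distance_m):
    """Stand-in for the alerting unit (buzzer or vibration motor)."""
    print(f"Obstacle at {distance_m:.2f} m - alert the user")


def detection_loop(cycles=10, period_s=0.1):
    """Processing unit: poll the sensor, assess contact risk, alert when needed."""
    for _ in range(cycles):
        distance = read_distance_m()
        if distance < OBSTACLE_THRESHOLD_M:
            trigger_alert(distance)
        time.sleep(period_s)


if __name__ == "__main__":
    detection_loop()
```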

2. Sensor Technology in Obstacle Detection Systems

Obstacle detection and avoidance systems are widely prevalent in robotics and autonomous vehicles [39,40,41] for detecting hazards in the environment. Sensors form an integral part of an obstacle detection system by perceiving the surroundings and converting that information into real-time data for further processing. Sensors can be classified as active or passive. Active sensors emit a signal and receive a distorted copy of the same signal, while passive sensors pick up an external signal to provide a corresponding output [42]. Active sensors for obstacle detection include radar (radio detection and ranging), lidar (light imaging detection and ranging), and sonar (sound navigation and ranging). Cameras, which mimic the human eye (vision), are the most commonly used passive sensor. Table 2 provides an overview of sensors commonly employed in obstacle detection and their characteristics.
Ultrasonic sensors, emitting high-frequency sound waves above the range of human hearing, measure the distance to an object using a time-of-flight (TOF) technique [45,46], but they are limited by low angular resolution and cannot detect obstacle dimensions. Ultrasound sensors can provide accurate short-range obstacle distance measurements by emitting high-frequency (40 kHz) sound waves as a conical beam and detecting the reflected pulses [47]. Ultrasound sensors are widely used for obstacle detection because they are also compact and easily implemented in wearable assistive devices. Multiple sensors may, however, be required to obtain a more complete picture of the environment, and they can be confused by environmental noise and specular reflections [40].
Radar sensors emit high-frequency electromagnetic radio waves and typically adopt frequency-modulated continuous wave (FMCW) technology to estimate the target distance using the round-trip time principle, i.e., measuring the frequency shift between emitted and reflected signals [12]. Three types of radar are employed in automotive systems: long-range radar (LRR) for cruise control and collision avoidance, medium-range radar (MRR) for blind spot detection, and short-range radar (SRR) for parking assistance and proximity detection [10]. Long-range radars measure the vehicle's speed and detect obstacles up to 200 m away using 77 GHz microwave radar, but with low resolution. Short/medium range radars in the 24 GHz and 76 GHz bands also measure velocity and distance, but with limited resolution and complex return signals [10]. Despite these limitations, radar sensors are employed very effectively in autonomous systems due to their reliable, accurate performance both day and night and in adverse weather conditions [48].
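For reference, FMCW radars commonly convert the measured frequency shift (beat frequency) into range via the chirp slope. The sketch below applies that standard relation with illustrative parameters; the bandwidth, chirp duration, and beat frequency are assumptions, not values from the cited automotive radars.

```python
SPEED_OF_LIGHT_M_S = 3.0e8


def fmcw_range_m(beat_frequency_hz, bandwidth_hz, chirp_duration_s):
    """Target range from an FMCW beat frequency: R = c * f_b / (2 * slope)."""
    chirp_slope_hz_per_s = bandwidth_hz / chirp_duration_s
    return SPEED_OF_LIGHT_M_S * beat_frequency_hz / (2.0 * chirp_slope_hz_per_s)


# Illustrative chirp: 4 GHz bandwidth swept in 40 us; a 100 kHz beat tone -> 0.15 m
print(fmcw_range_m(beat_frequency_hz=100e3, bandwidth_hz=4e9, chirp_duration_s=40e-6))
```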
Lidar sensors have also been adopted in autonomous driving applications. They utilise shorter wavelength light sources, giving high resolution, a wide field of view, and a fast sweep frequency [40,49,50]. Lidar sensors can be categorized as 1D, 2D, or 3D, differing in the number of laser beams used and whether the beam is fixed or scanning. Whereas 1D lidar is range-only, 2D lidar has a single beam that spins, enabling 360° views and x- and y-axis data, and 3D lidar has several beams and a 360° spin, providing an object's x, y, and z coordinates [51]. With high accuracy and long-distance measurement, 3D lidars play a central role in obstacle detection for autonomous vehicles [52]. The point cloud obtained from lidar has high processing requirements, however, and the sensors are heavy and expensive; work is therefore in progress to reconstruct a 3D environment using 2D lidar, reducing processing requirements and cost [53]. Lidars can also be either mechanical or solid-state (SSL), with the mechanical version the most popular, using high-grade optics and rotary encoders driven by electric motors to capture a 360° field of view. SSLs use micro-structured waveguides to direct the laser beams, eliminating the rotating lenses and reducing mechanical failures [48]. Even though SSLs have a narrower field of view than mechanical lidars, typically 120° or less, they have gained interest in recent years due to being robust, reliable, and less expensive [54]. The higher frequency (10–20 Hz) and shorter wavelength of lidar enable more accurate measurement than radar sensors, with a typical accuracy of 1.5–10 cm, vertical angular resolution of 0.35–2 degrees, and horizontal angular resolution of 0.2 degrees [54].
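To illustrate how a spinning 2D lidar yields x- and y-axis data, the short sketch below converts (angle, range) returns into Cartesian points; a 3D unit would add a z component from each beam's vertical angle. This is generic geometry rather than code for any particular sensor.

```python
import math


def scan_to_points(scan):
    """Convert 2D lidar returns [(angle_deg, range_m), ...] to (x, y) points in metres."""
    points = []
    for angle_deg, range_m in scan:
        theta = math.radians(angle_deg)
        points.append((range_m * math.cos(theta), range_m * math.sin(theta)))
    return points


# Illustrative 360-degree sweep sampled every 90 degrees
print(scan_to_points([(0, 1.0), (90, 2.0), (180, 1.5), (270, 0.8)]))
```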
Low-cost vision-based sensors such as cameras have been developed as the primary sensor for high-resolution obstacle detection. A camera collects light from the surroundings through a lens and focuses it onto a photosensitive surface, producing clear images [48]. Being relatively inexpensive, widely available, and providing rich contextual information similar to human vision, vision sensors have been used in several obstacle detection applications. Passive cameras do not emit signals that may cause interference [42], and the information, in the form of pixel intensities captured in high-definition videos or images, can be used to extract shape, colour, and texture information, providing considerable environmental detail [48,55]. The considerable computational demands of data processing are, however, a constraint, with the latest high-definition cameras processing multi-megabytes of real-time data [10], and cameras are susceptible to ambient light and weather conditions, with low illumination giving low-quality images. Conventional monochrome cameras lack the depth information required for accurate size and position estimation of obstacles [42,55], although some applications can calculate depth using complex algorithms [56]. A stereo camera with two image sensors can imitate 3D depth perception using epipolar geometry and triangulation methods [48] but demands more processing power. Time-of-flight cameras use an active sensor to measure the time taken for an infrared light beam to reach the object and reflect back to the camera, giving pixel depth and intensity, but distance accuracy and image quality are relatively low compared to other 3D sensors [57]. Microsoft Kinect is a popular range camera, capturing images and depth at high frame rates using a combination of an RGB camera and an infrared sensor. It has applications in mobile robotic mapping, navigation and localization, industrial robot collision avoidance, and human motion tracking [57], but it is too large for most wearable systems.
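For a rectified stereo pair, the triangulation mentioned above reduces to the standard depth-from-disparity relation Z = f × B / d; the sketch below uses illustrative focal length, baseline, and disparity values rather than parameters of any reviewed device.

```python
def stereo_depth_m(focal_length_px, baseline_m, disparity_px):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px


# Illustrative values: 700 px focal length, 6 cm baseline, 30 px disparity -> 1.4 m
print(stereo_depth_m(focal_length_px=700, baseline_m=0.06, disparity_px=30))
```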
Infrared (IR) sensors are available in both active and passive designs: active sensors emit infrared light and detect the reflections from obstacles, while passive IR sensors detect changes in the infrared radiation striking them and are used primarily for motion detection [42]. There are advantages and limitations associated with each sensor type, and when considering applications to smart shoe systems, the size, weight, range, and real-time performance are also important. Table 3 gives detailed specifications for a common sensor in each category, taken from the literature and datasheets.

Sensor Fusion

A single sensor may not provide sufficient information to measure the exact shoe-object distance and object size (height and width). Combining the information from multiple sensors, known as sensor fusion, provides a more veridical picture of the environment, and this technique has also been demonstrated to be effective for obstacle detection in autonomous cars and mobile robots [44,70,71,72,73]. Integrating the acquired data from multiple modalities reduces detection uncertainties and overcomes the shortcomings of individual sensors operating independently [48]. A highly effective approach is to combine information from different types of sensors. In general, a passive sensor such as a camera, cloning human vision, will give richer information regarding the obstacle features and appearance, while active sensors, e.g., ultrasound, lidar, and radar, will be more accurate in estimating obstacle distance. Shahdib, Bhuiyan [74] used both an ultrasonic sensor and a Canon 550 DSLR camera to guide an autonomous mobile robot, detecting obstacles and estimating their distance and dimensions. The advantages of data integration from multiple sensors are summarized below:
  • Information obtained through fusion has richer semantics and higher resolution than a single sensor measurement.
  • Joint information from multiple sources reduces the ambiguity and uncertainty in the measured values.
  • Multiple sensors give extended spatial and temporal coverage.
  • Increased confidence due to the availability of redundant information from many sensors scanning the same environment, and improved reliability due to sufficient information even if there is a partial failure.
  • Reduced noise and errors through the fusion of multiple data sources, thus improving accuracy [75,76].
The advantages of sensor fusion make it the optimal choice for a complex system, but the number and type of sensors must be selected considering their cost, size, and the system's application. In addition, the factors to consider for a wearable system differ from those for autonomous cars and robots. Considerations for shoe-integrated obstacle detection are whether sensors can provide environmental data for safe navigation while also being compact, lightweight, and portable. In addition, fast, reliable data processing is required to alert the user to hazards in their walking path. The following section examines sensors used for obstacle detection while walking.
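As a minimal sketch of the fusion idea, assuming independent range readings with known noise variances (the figures below are illustrative, not measured values from any cited system), inverse-variance weighting combines an ultrasonic and an infrared distance estimate into a single, lower-uncertainty value:

```python
def fuse_ranges(measurements):
    """Inverse-variance weighted fusion of independent range estimates.

    measurements: list of (distance_m, variance_m2) tuples, one per sensor.
    Returns (fused_distance_m, fused_variance_m2).
    """
    weights = [1.0 / variance for _, variance in measurements]
    fused_variance = 1.0 / sum(weights)
    fused_distance = fused_variance * sum(
        w * d for (d, _), w in zip(measurements, weights)
    )
    return fused_distance, fused_variance


# Illustrative: ultrasound reads 1.20 m (var 0.01), infrared reads 1.05 m (var 0.04)
print(fuse_ranges([(1.20, 0.01), (1.05, 0.04)]))  # -> (~1.17 m, 0.008)
```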

3. Walking with Obstacle Detection

Walking safely through the everyday environment is essential to healthy, productive lives, and developments in gait-assisting technologies incorporating wearable sensors are progressing rapidly [21]. Assistive technology encompasses devices, services, systems, and environmental modifications that enable individuals with locomotor impairments to overcome barriers to independence [77]. Research into navigational aids and obstacle detection systems in assistive technologies is quite extensive. Portable assistive devices can be moved from place to place, providing safer navigation for independent living and rehabilitation. Wearable assistive devices can be attached to wristbands [78], eyeglasses [79,80], head-mounted devices [81], vests [82], belts [83], shoes [34], and any other practical point of attachment, allowing hands-free interaction. Smartphones provide portability and convenience and have become a core assistive tool in supporting navigation by obtaining information and interacting with the user's environment [84]. Microsoft Kinect, initially developed for gaming, has become popular among vision researchers and in assistive technologies due to its low cost, detection capability, and data acquisition software [85]. With the miniaturization of electronic devices and advances in computer vision and machine learning algorithms, research in wearable navigation devices has also incorporated these techniques and sensor fusion methods [86]. Figure 2 illustrates some of the wearable assistive devices developed for safe navigation by the visually impaired. Commonly used obstacle detection sensors in wearable assistive devices, with their specifications and feedback mechanisms, are presented in Table 4.
Despite progress in the design of obstacle detection devices, their acceptance by users in daily life is limited. Many navigation systems have been proposed for the visually impaired, but few allow successful dynamic interactions and adaptability to changes, and none can work seamlessly both indoors and outdoors [97]. Poor user interface, functional complexity, weight, size, and cost have been identified as contributing to the low acceptance of electronic travel aids [31]. For these systems to meet user requirements, clear and precise object detection from close proximity out to a minimum of 3 m is necessary. The section below outlines developments in shoe-integrated assistive systems.

4. Obstacle Detection in Shoe-Based Assistive Devices

Incorporating technological features into a shoe avoids adding an item to be worn or carried, and with optimal design the shoe can be lightweight, affordable, and comfortable. In addition to obstacle detection, smart shoe systems have incorporated features such as live location tracking, heat sensing, slippery surface detection, fall detection, electricity generation while walking as an alternate power source, health and fitness tracking, and pothole detection [33,98,99]. Table 5 highlights the main features of smart shoes with obstacle-detection technology, collected from the reviewed papers.
The most frequently targeted application for obstacle detection in smart shoes is visual impairment [30,31,33,34,35,36,37,38,91,98,99,100,101,102,103,104,105,108], with less work on obstacle detection for the physically challenged, where the gait impairments must also be considered. With the elderly population expected to almost double from 15% of the total population in 2015 to 22% by 2050 [109], there is a need for continuing research into devices that can detect hazards and prevent falls in individuals with walking impairments.
Smart shoe systems designed primarily for obstacle detection may have additional features, such as communication with a smartphone allowing location tracking and contacting someone for emergency assistance [34]. As well as smart shoes for obstacle detection, the Smart Bottine, designed to help individuals with autism, incorporates an android smartphone with the Blynk application for notifying the caregiver in case of emergency [32]. Additional features to provide support for the visually impaired include a location tracking service using GPS-GSM, an android application for locating missing persons, and emergency calls (SOS) with location tracking using Google Maps [33]. Navigational guidance has also been facilitated with a Bluetooth transceiver mounted on the shoes, synchronized to a smartphone application using Google Maps [35]. COMPASS [105], an indoor positioning system, is targeted towards visually impaired university students, with the smart shoes detecting obstacles and an additional smart bracelet with a camera to verify the correct classroom.
Smart shoe systems can work well on their own [30,31,33,34,35,36,37,38,91,98,99,100,101,102,104,105,106,107,108], but some researchers have used them in conjunction with other obstacle-detection devices [91,103]. Chava, Srinivas [103] incorporated smart glasses with smart shoes, with sensors attached to the spectacles to detect head-level objects, integrated with a Bluetooth hearing device for voice commands. 'Vision Navigator' [91] was designed as an assistive interface for users with low or fluctuating vision for indoor and outdoor navigation. Its Smart-alert walker consists of sensor-equipped sneakers with two built-in ultrasonic sensors to identify short-range obstacles. It is used only as an emergency backup, with the Smart-fold cane as the primary system for detecting obstacles and containing all the major hardware components. The obstacle detection model was deployed on images obtained from the camera in the cane, using the SSD-RNN (single shot detection-recurrent neural network) approach, achieving accuracies of 95.06% and 87.68% indoors and outdoors, respectively [91]. A considerably different approach uses a thin, flexible metal wire antenna running along the shoelace for collision avoidance in front, but without feedback to alert the user [110].

4.1. Sensors in Obstacle Detection Shoes

Of the 20 reviewed papers, 17 reported using high-frequency ultrasound sensors, which apply the echolocation principle of transmitting and receiving sound waves to detect and locate obstacles [32,33,34,35,36,91,98,99,100,101,102,103,104,105,108]. Objects must be either directly in front of or at a minimum angle to the transmitter to reflect the waves back to the ultrasonic receiver [31], as shown in Figure 3. Accurate ultrasound detection of up to 4 m allows obstacles to be detected reliably, and to avoid false detections in crowded scenarios, a customizable mode with a range of 0 to 1 m has also been deployed [108]. With a wide field of view, ultrasound sensors are efficient in detecting the distance to an obstacle but lack the ability to accurately determine its direction.
Rather than mounting a single ultrasound sensor on the shoe, multiple units, favoured for their small size, low cost, and high reliability [98], can alert to obstacles in the direction of attachment [37,103]. NavGuide [31] uses six ultrasonic sensors to identify obstacles in their respective scanning fields and map the position of each obstacle in front of and on both sides of the shoe, detecting floor-level obstacles, knee-level obstacles, and the risers of an ascending staircase [31].
While most shoe-based systems have utilised ultrasound, the feasibility of wearable radar sensors for detecting non-conductive obstacles and floor/wall surfaces has been demonstrated using a 60 GHz system-on-chip mm-wave radar, a Texas Instruments IWR6843ISK with MMWAVEICBOOST, connected to a laptop via USB [111]. A frequency-modulated continuous wave (FMCW) radar transmits a linear chirp signal and calculates object distance from the round-trip time delay. The shoe prototype consists of two wearable K-band radars mounted on the shoes (see Figure 4) to detect the absolute distance to an object and the shoe-ground clearance [107]. In another approach, Yang, Jung [30] utilised direction-controlled infrared sensors with a narrow detection range to distinguish the object direction.
Lin, Yang [112] proposed a camera-based line-laser obstacle detection system, using a Logitech C310 webcam operating at 29 frames per second with 640 × 480 resolution and a 405 nm wavelength laser. A fall prevention strategy was implemented with a sum-of-absolute-differences threshold to trigger the obstacle detection event, line-laser pattern segmentation, homography transformation, and obstacle danger-level estimation, showing the possibility of installation on shoes. Staircases, potholes, and ditches can also be predicted using a YOLOv3 model from the video output of a simple 5 MP fisheye camera with a 120-degree field of view [36], as illustrated in Figure 5.
While a camera can provide reliable obstacle information, the need for a high-performing processor remains a problem when it is deployed as a vision sensor [30]. The implementation practicalities and accuracy of compact cameras for image capture in other assistive devices [91,103] show potential for smart shoe applications. The SSD-RNN model, based on input from the camera deployed in the walking cane system described earlier, gave optimum performance, generating 95.54% recognition accuracy for common obstacles [91]. Compact, lightweight cameras, such as the Raspberry Pi camera, and high-performing microcontrollers could be deployed in shoe systems, avoiding the requirement for a walking cane. Many other sensors also have potential for implementation in shoes, as outlined below.

4.2. Additional Sensing Deployed

In addition to obstacle detection, the reviewed systems also deploy further sensing methods to provide additional safety features:
  • Pothole Detection: Potholes can be detected with infrared sensors [33,99].
  • Water Detection: When the surface of a water sensor comes into contact with water, it returns a non-zero value, indicating any water source including wet floors [31,33,37,91]. Soil dampness sensors measure the volumetric water content in soil [33,113].
  • Heat Sensing: The LM35 temperature sensor detects fire or a hot object near the user and the information can be relayed using voice commands [33].
  • Insect Detection: The movement of insects/reptiles in a shoe triggers the infrared sensors in the shoe, notifying the user via the Blynk app in a smartphone or through email [32]. This insect detection is activated when the user is not using the shoes.
  • Fall Detection: Foot motion data detected with motion sensors can be used to recognise a fall and alert an emergency contact via smartphone [32,34].
  • Location Tracking: Whether the location is tracked for safety or when a person is missing, the use of smartphones ensures reliability and portability. Google's Geolocation API uses Wi-Fi to determine the coordinates of the device, and with a request to Google Maps, the user's location can be mapped [32]. Bhongade, Girhay [33] reported a system that tracks the location by sending an SMS code to receive the coordinates and then navigating to the location using Google Maps.
  • Height Detection: The height of the distant object in the frontal plane can be detected with a sensor integrated knee band [101].
  • Text Detection: Verification of the classroom number utilizes a Raspberry Pi with an 8 MP Sony IMX219 image sensor to capture images of the classroom tags and convert them to the desired form using OpenCV and Tesseract [105].
  • Gait Detection: To detect the gait phases, on the basis of which obstacle detection could be triggered, motion data from an accelerometer [34] and the difference between two successive camera frames [112] were utilized.
  • Health Tracking: Daily activity such as the number of footsteps, distance travelled, and calories burned can be recorded for one week and accessed by the user [33].

4.3. Microcontroller Unit

A microcontroller unit is the core of a wireless obstacle detection system, consisting of a processing unit, memory, and input/output peripherals. It receives input signals from the sensor(s), processes the input data, estimates the target parameters, and makes a prediction. Desirable features such as low cost, power efficiency, and small size make microcontrollers with ARM processors a suitable option for a wide range of portable applications. Table 6 shows the technical features of the microcontrollers used in the reviewed obstacle detection shoes; where unreported, the principal features were obtained from the corresponding datasheets issued by the manufacturers.
Other high-performing microcontroller boards on the market, such as the Google Coral Dev Board [118], Raspberry Pi 4B [119], and Jetson Nano [120], have specifications that show the possibility of implementation in obstacle detection shoes with high-performing sensors and machine learning algorithms for reliable and accurate fall prevention systems. The Google Coral development board [118] is a single-board computer with an NXP i.MX 8M SoC processor based on the Arm Cortex-A53 and an Edge TPU co-processor, providing accelerated machine learning processing. It includes peripheral connections such as USB 2.0/3.0 ports, a DSI display interface, a CSI-2 camera interface, an Ethernet port, speaker terminals, and a 40-pin GPIO header, all useful in developing a prototype smart shoe system. The NXP i.MX 8M system-on-chip (SoC) and the Edge TPU co-processor, together with LPDDR4 memory, eMMC storage, and dual-band Wi-Fi, form a removable system-on-module (SoM), enabling prototype development on the Coral development board and subsequent combination of the SoM with a custom baseboard. The Google Dev Board Mini [121] also provides fast machine learning (ML) inferencing in a small form factor. The Coral USB Accelerator [122] can be added as an Edge TPU co-processor to a computer system to perform accelerated ML inferencing; it is capable of performing 4 trillion operations per second at 2 watts over USB 3.0, enabling on-device machine learning processing.
While the specifications of the Google Coral devices show the possibility of implementing machine learning algorithms on embedded devices, the support and guides provided for the Raspberry Pi make it easier and more convenient to use. The latest Raspberry Pi 4 Model B [119] includes a high-performance 64-bit quad-core processor, dual-display support at resolutions up to 4K via a pair of micro-HDMI ports, up to 8 GB of RAM, dual-band 2.4/5.0 GHz wireless LAN, Bluetooth 5.0, Gigabit Ethernet, and USB 3.0. For additional performance, it can also be used with the Coral USB Accelerator. The Jetson Nano Developer Kit [120] is a small, powerful computer with a 64-bit quad-core ARM A57 @ 1.43 GHz, a 128-core NVIDIA Maxwell GPU @ 921 MHz, 4 GB of 64-bit LPDDR4, four high-speed USB 3.0 ports, a MIPI CSI-2 camera connector, HDMI 2.0 and DisplayPort 1.3, Gigabit Ethernet, an M.2 Key-E module, a MicroSD card slot, and a 40-pin GPIO header, and it is capable of running multiple neural networks in parallel for applications such as image classification, object detection, segmentation, and speech processing.
Researchers have evaluated these hardware developments for their capability in embedded object detection [123,124,125,126]. Real-time implementations of edge-based obstacle detection models on advanced processors for robotics [127,128], autonomous vehicles [129,130], wheelchairs [131,132], and the visually impaired [133,134,135] call for investigation of the same processors for shoe-based detection systems. Even with these high specifications, when selecting a microcontroller for a wearable system, the size and weight, along with performance, are important determinants of acceptance by a user.

4.4. Feedback/Alerting Technique

Acoustic and vibrotactile feedback are the most common methods to alert the user to obstacles. Acoustic feedback uses sound to capture the user's attention, while tactile or haptic feedback uses embedded vibrators applying pressure to the skin. Auditory warnings can be a tone, a buzzer, or audio messages. Piezo buzzers are output devices containing piezo crystals that expand and contract in proportion to the applied voltage, producing sounds to alert the user [32,37]. Based on proximity, the signal intensity can be controlled using pulse width modulation to produce a louder noise for closer obstacles [38]. Audio messages are either synthesized or digitized. Bhongade, Girhay [33] used an android text-to-speech application to alert the user to obstacles and provide the date and time via headphones. A difficulty with audio outputs is interference with environmental information, and they may be aversive for some individuals.
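A minimal sketch of that proximity-scaled alert, assuming a maximum alert range and a linear mapping from closeness to PWM duty cycle (both assumptions for illustration, not the cited implementation):

```python
def buzzer_duty_cycle(distance_m, max_range_m=2.0):
    """Map obstacle distance to a PWM duty cycle (0-100%); closer obstacles sound louder."""
    if distance_m >= max_range_m:
        return 0.0  # out of range: buzzer off
    closeness = 1.0 - (distance_m / max_range_m)
    return round(100.0 * closeness, 1)


for d in (0.2, 0.5, 1.0, 1.5, 2.5):
    print(d, "m ->", buzzer_duty_cycle(d), "% duty cycle")
```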
A vibrotactile warning mechanism, such as that used in smartphone alerts, can be embedded in the insole or on the shoe, sometimes combined with actuators. A signal enabling the vibration to alert the user is sent from the microcontroller when an obstacle is detected [34]. A coin vibrator can alert with a vibration amplitude proportional to the distance to the obstacle [98]. Feedback of obstacle direction can be achieved using four vibrators, one each for right, left, and forward, with all four signalling a stop [103].
With the implementation of features additional to obstacle detection, other feedback mechanisms have been employed. NavGuide [31] alerts with audio and tactile output: the audio feedback plays an audio file corresponding to the detection, and the tactile feedback involves vibration motors at the front, left, and right corresponding to the obstacle direction, with a fourth for wet surface detection. Smartphone-based voice guidance and vibrational feedback have been implemented with vibration motors in both shoe insoles, alerting the user to potholes and pedestrians and their directions, with simultaneous activation of both motors indicating staircase detection [36]. Appropriate choice of feedback method affects the implementation of the system in the real world. While incorporating all feedback methods in a single system might not be fruitful, restricting a system to one feedback method can also be challenging in some situations; for example, audio feedback might not be suitable for noisy environments but may be preferred in other situations [21].

4.5. Analysis of Obstacle Detection Techniques

Ultrasound being the most common obstacle detection sensor, the time-of-flight technique has been widely utilized to sense obstacles in the path [32,33,34,35,36,91,98,99,100,101,102,103,104,105,108]. A microcontroller triggers the ultrasound to emit waves at short intervals, and obstacles reflect these waves back to the sensors [31]. The object distance is computed from the period between emission and reception of the ultrasound waves, D = ½ × C × T, where C is the speed of sound in air and T is the measured time of flight of the sound wave [105]. The distance information is passed on to the processor to alert the user to the presence of an obstacle. A comparison between the ultrasound-measured distance and the actual distance showed an accuracy of 98%, which decreased with increasing distance, falling to 94.78% at 300 cm [37].
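A direct implementation of that relation, assuming the speed of sound in air at room temperature (approximately 343 m/s):

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 degrees C


def ultrasound_distance_m(time_of_flight_s):
    """Obstacle distance from the echo round-trip time: D = 0.5 * C * T."""
    return 0.5 * SPEED_OF_SOUND_M_S * time_of_flight_s


# A round trip of 5.8 ms corresponds to roughly 1 m
print(ultrasound_distance_m(5.8e-3))  # -> ~0.99 m
```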
Detection of obstacles at different levels was achieved with multiple ultrasound sensors and appropriate calculations. In NavGuide [31], a logical map of the surrounding environment was constructed with six ultrasonic sensors divided into two groups: Group 1 (S1, S2, S3) for floor-level obstacles and Group 2 (S4, S5, S6) for knee-level obstacles. Front-facing S1 and S4, left-facing S2 and S5, and right-facing S3 and S6 detect obstacles in the corresponding directions in which they are placed. The obstacle x-coordinate for each sensor was calculated from its measured distance as Six = cos θ × Di, where Di is the distance measured by the ith ultrasonic sensor (i = 1, 2, ..., 6) and θ is the angle of the sensor with the horizontal. The presence of knee-level obstacles was determined with the equations below.
[(S1x < S4x) && ((S4x − S1x) ≤ δ)]
[(S2x < S5x) && ((S5x − S2x) ≤ δ)]
[(S3x < S6x) && ((S6x − S3x) ≤ δ)]
where S1x represents the x-coordinate value calculated by S1, δ is the width difference between S1 and S4, and γ the height difference.
Additionally, an ascending staircase was detected in front, to the left, and to the right using the equations below.
[((S1x < S4x) && ((S4x − (S1x + δ)) ≥ Td))]
[((S2x < S5x) && ((S5x − (S2x + δ)) ≥ Td))]
[((S3x < S6x) && ((S6x − (S3x + δ)) ≥ Td))]
Td represents tread depth (25 cm) and Rh is the value of the riser height (19.6 cm) [31].
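The front-facing pair of conditions above can be expressed compactly; the sketch below reproduces the published knee-level and staircase checks for S1 and S4, treating the sensor angles, δ, and Td as configurable parameters rather than the exact NavGuide calibration (the readings shown are invented for illustration).

```python
import math


def x_coordinate(distance_m, angle_deg):
    """Project a sensor's measured distance onto the horizontal (x) axis."""
    return distance_m * math.cos(math.radians(angle_deg))


def knee_level_obstacle(s1_x, s4_x, delta):
    """Knee-level obstacle in front: S1x < S4x and (S4x - S1x) <= delta."""
    return s1_x < s4_x and (s4_x - s1_x) <= delta


def ascending_staircase(s1_x, s4_x, delta, tread_depth_m=0.25):
    """Ascending staircase in front: S1x < S4x and S4x - (S1x + delta) >= Td."""
    return s1_x < s4_x and (s4_x - (s1_x + delta)) >= tread_depth_m


# Invented readings: floor-level sensor S1 angled at 20 deg, knee-level sensor S4 at 5 deg
s1_x = x_coordinate(1.10, 20)
s4_x = x_coordinate(1.45, 5)
print(knee_level_obstacle(s1_x, s4_x, delta=0.10))   # False for these readings
print(ascending_staircase(s1_x, s4_x, delta=0.10))   # True for these readings
```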
Additional sensing techniques, such as wet surface detection and pothole detection, running in parallel with obstacle detection, and custom smartphone applications communicating with the smart shoe processor [32,33,35], extend the capabilities of the shoe system. In COMPASS [105], in addition to ultrasound-based obstacle detection, computer vision coupled with an indoor positioning system was used for the independent navigation of visually impaired students on a university campus. An android application reminded the student of class timings while also providing navigational assistance. Shoe-mounted sensors detected obstacles, and Bluetooth beacons tracked the user's location. At the destination, to verify the classroom number, an image captured with the camera on the smart bracelet is converted to text using OpenCV and Tesseract. After text-to-voice conversion with the Google library, the audio output confirms the classroom number to the user [105].
Vision Navigator [91] uses the ultrasound in the shoe system (Smart-alert walker) in conjunction with a Smart-fold cane. An Arduino embedded with a Raspberry Pi camera acts as the heart of the system. These were attached to the smart cane to take a live camera feed, and an SSD algorithm trained on the MS COCO dataset was used to detect potential obstacles. Detections were validated with the trained deep learning model and transferred to an RNN for sentence generation. Appropriate sentences were framed using the Flickr30k dataset and then forwarded to a text-to-speech application interface for audio alerts through an earpiece. Two ultrasonic sensors in the Smart-alert walker served as an emergency alert provision, detecting any obstacles that are too close and enhancing the accuracy of the system [91].
Although ultrasound was used in most systems, two other sensor types have also applied the signal round-trip travel technique to estimate obstacle distance. Yang, Jung [30] used infrared sensors and six-axis motion sensors mounted on the shoe to estimate obstacle distance and direction. The direction of the infrared sensor was adapted according to the foot angle to detect, and alert to, the presence of obstacles in a corridor environment. Tang, Peng [107] proposed a radar-based fall prevention approach that constantly measures the distance between surrounding objects and the user's feet. While the radar measurements showed the same spectrum characteristics in a normal walking scenario, the appearance of an obstacle in the radar detection range was captured as a fragmented decrease in distance. Comparison of the distances measured by a prototype K-band FMCW radar with ground truth values showed a 1.76 cm average error and a 4.5 cm worst-case error.
Rao and Singh [36] implemented a computer vision approach for obstacle detection and avoidance, guiding the user with appropriate haptic feedback and navigational support through smartphone voice assistance. The image captured with a fisheye camera on the shoe was streamed to an android application on the smartphone. Obstacle detection was performed with a YOLOv3 model, trained with the Darknet suite, converted into a TensorFlow Lite file, and integrated into the smartphone application. The application detected potholes, ditches, staircases, and people, to recognise crowded places. The distance to the object was estimated with the principle of triangle similarity, represented as F = P × D/W, where W is the actual width, P is the perceived width, and D is the distance from the camera. F remains constant and was calculated for some standard examples, from which the distance was estimated as D = W × F/P. A haptic alert was initiated whenever the obstacle distance fell below a certain threshold, informing the user of the next direction to move [36].
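A short worked version of this triangle-similarity calibration and estimation step (the object width, distances, and pixel widths are illustrative placeholders, not the calibration figures reported in [36]):

```python
def focal_constant(perceived_width_px, known_distance_m, actual_width_m):
    """Calibrate F = P * D / W from a reference measurement."""
    return perceived_width_px * known_distance_m / actual_width_m


def estimate_distance_m(actual_width_m, focal_const, perceived_width_px):
    """Estimate distance to an object of known width: D = W * F / P."""
    return actual_width_m * focal_const / perceived_width_px


# Calibration: an object 0.40 m wide imaged at 2.0 m appears 160 px wide -> F = 800
F = focal_constant(perceived_width_px=160, known_distance_m=2.0, actual_width_m=0.40)
# Later, the same class of object appears 320 px wide -> it is about 1.0 m away
print(estimate_distance_m(actual_width_m=0.40, focal_const=F, perceived_width_px=320))
```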
It is clear from this review that shoe-based obstacle detection systems have yet to adopt advanced sensing and detection techniques. The implementation of deep learning and machine learning algorithms in other wearable assistive devices [86,136] shows scope for the same in shoe-based systems.

4.6. Communication

Communication between the sensors, microcontroller boards, and other devices can be achieved via wired or wireless protocols. With advances in wireless protocols, microcontrollers are now available with built-in wireless systems, or separate wireless modules can be used. Bluetooth, a short-range wireless transmission between 2.4 GHz and 2.485 GHz, enables low-cost communication with minimal power consumption [137]. The HC-05 Bluetooth module is ideal for transferring real-time shoe data between the microcontroller and the smartphone [33,35,37], but the devices must be within a specified range to function effectively. Kamaruddin, Mahmood [102] controlled a buzzer from the microcontroller over the internet using a Wi-Fi module. Using internet communications, the user's location information can also be made available to supervising personnel [38]. Wireless communication between smartphones and smart shoes has, therefore, paved the way for safety interventions using the shoe system, such as location tracking and navigation [32,33,35].
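On the receiving side, a paired HC-05 module typically appears as a plain serial port, so data streamed from the shoe microcontroller can be read with a few lines of pyserial. The port name, baud rate, and message format below are assumptions for illustration only.

```python
import serial  # pyserial

# Assumed settings: the HC-05 default baud rate and a Linux RFCOMM port created after pairing
with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as link:
    for _ in range(20):
        line = link.readline().decode(errors="ignore").strip()
        # Assumed message format sent by the shoe firmware: "DIST:<centimetres>"
        if line.startswith("DIST:"):
            distance_cm = float(line.split(":", 1)[1])
            print(f"Obstacle distance: {distance_cm:.0f} cm")
```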

4.7. Power Supply

Hardware components, including processors and sensors, demand a portable power supply of sufficient duration. In NavGuide [31], with current consumption between 48 mA and 120 mA depending on the surroundings, a compact battery enabled 850–1000 min of operation [31]. A rechargeable 12 V Li-ion battery with a sleep-mode consumption of less than 2 mA can support the sensors and other hardware components [35]. Daou, Chehade [98] used two batteries to power their system, switching from the main battery to the second when the former's level reached 10%. With a current consumption of 708.9 mA, the average life cycle of a 600 mAh, 9 V battery is about 40 min [98].
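As a rough check on such figures, the nominal run time follows from capacity divided by average current draw; conversion losses and voltage cut-off typically reduce the usable time below this ideal value, consistent with the ~40 min reported above.

```python
def nominal_battery_life_min(capacity_mah, average_draw_ma):
    """Idealised run time in minutes: capacity / current draw, ignoring losses."""
    return 60.0 * capacity_mah / average_draw_ma


# 600 mAh battery at an average draw of 708.9 mA -> about 51 min nominal
print(round(nominal_battery_life_min(600, 708.9)))
```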
While a lithium polymer battery powers the system, piezoelectric plates in the shoe sole can generate and store backup power from the foot pressure applied when walking and running [33], allowing the wearer to maintain power to the sensors and smartphone [113]. Piezoelectric sensors convert applied pressure into electrical energy, filter the AC content to produce DC, and store the generated power in a rechargeable battery [138] (see Figure 6). For powering AC systems, an inverter can convert the stored energy, showing the possibility of supporting both DC and AC loads. At 100% capacity, a rechargeable battery was observed to support the shoe system for 3–4 h, with the piezoelectric plates producing additional power but less current [33]. The walking energy module of the Smart Shoes [34] consists of a MAX17710 energy-harvesting charger and protector, collecting electrical power from a piezoelectric transducer with a polyvinylidene fluoride (PVDF) thin film under the heel. Energy harvesting from human locomotion promises to be a convenient way to power versatile smart shoe systems.

4.8. Experimental Evaluation with Human Participants

The performance of wearable shoe systems should be confirmed in trials with human participants to ensure effectiveness in detecting obstacles. Daou, Chehade [98] tested a system with five participants, for an average duration of 2 h per user, obtaining an accuracy of 95.33% with a sensitivity of 98% and a false detection rate of 5.3%; most errors occurred when the battery level was low. In another study, the effectiveness of smart shoes in gait event recognition and obstacle detection was assessed with six participants, with gait event recognition showing an overall accuracy of 90.9% and low variability in real time, together with high detection accuracy and a low false-alarm rate [34].
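For clarity, accuracy, sensitivity, and false detection rate in such evaluations are typically derived from confusion-matrix counts; the generic sketch below uses invented counts, not those of the cited trials, and definitions of false detection rate vary between papers.

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and false detection rate from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)            # true positive rate
    false_detection_rate = fp / (tp + fp)   # one common definition: false alarms per alert
    return accuracy, sensitivity, false_detection_rate


# Invented counts for illustration
print(detection_metrics(tp=92, fp=5, tn=88, fn=3))
```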
Two studies have tested shoe-based obstacle detection systems against the white cane, the most commonly used assistance for the visually impaired. To test NavGuide [31], experiments were conducted in a controlled environment with 70 participants. Following a training trial, performance was measured using the number of obstacles contacted, the time and speed to complete the walking task, and success in wet floor detection. Results showed that NavGuide assistance reduced both collisions and completion time compared to the white cane. Yang, Jung [30] also compared shoe performance with white cane walking in 12 participants. Their results generally supported the earlier findings, although the time to pass increased using the obstacle detection footwear, while more collisions were avoided and plantar pressure distribution and muscle activity improved. In summary, limited research has experimentally evaluated the performance of the developed shoe systems. While bench testing can estimate reliability, it is critical to establish how the system performs in simulated and real-world environments.

5. Conclusions

This review has described recent developments in portable and wearable obstacle-detection shoe systems designed to detect hazards and reduce accidents while walking. The sensors used for detecting hazards were examined, with the major advantages and disadvantages of each, and a detailed analysis was provided of the sensors and hardware components used in obstacle detection shoes. Wearable sensors have the potential to improve the quality of life for individuals with locomotor deficits and to enable new scientific approaches to hazard detection and gait monitoring. Audio or tactile stimuli are the most common feedback methods used to alert the user to hazards. Integration of additional safety features, for example, the detection of wet surfaces, potholes, and heat, in addition to location tracking and navigation, promises more secure and safe independent mobility.
Our review of shoe-based obstacle detection systems reveals that while research has progressed, more advanced motion prediction algorithms and techniques are required. Despite technological advancements in sensors suitable for obstacle detection, most smart shoe systems have used only ultrasound-based sensors, with visual impairment as the major application. Adopting more advanced sensor technologies and data processing may help in designing more efficient diagnostic methods, leading to practical, cost-effective, technology-based fall prevention interventions.
To pave the way for high-quality, efficient, reliable obstacle detection on shoes in real time, bridging the gap between research and practice, we make the following recommendations:
  • Design sensor systems that use multiple data sources to reliably detect obstacles, and only those obstacles that pose a hazard to the user, i.e., with few false positives.
  • Implement advanced wearable sensors and fast processing boards on the shoe without compromising user comfort and ease of use.
  • For prototype development, microcontrollers such as Arduino may be suitable, but for real-world applications smaller processors/boards with equivalent or more advanced processing capabilities are needed.
  • Ensure the additional hardware and weight do not interfere with the gait and normal locomotion of the user.
  • Examine temperature ranges for sensors (see Table 3) and other hardware components to determine the performance in extreme weather conditions.
  • Evaluate prototypes in the real world to ensure comfort and acceptance for everyday use.
Application of properly developed smart shoes can be envisioned in any discipline promoting independence, convenience, individualized comfort, and healthy living. Innovative smart shoes have the potential to revolutionize the footwear industry and create a new interdisciplinary science of sensor technology, computing, and gait biomechanics.

Author Contributions

Conceptualization, A.M.J., A.K. and R.B.; methodology, A.M.J. and A.K.; investigation, A.M.J.; validation, A.M.J.; visualization, A.M.J.; writing—original draft preparation, A.M.J.; writing—review and editing, A.M.J., A.K. and R.B.; supervision, R.B. and A.K.; project administration, R.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to acknowledge the Institute for Health and Sport, Victoria University, through which this review was conducted.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. WHO. Falls; WHO: Geneva, Switzerland, 16 January 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/falls#:~:text=Falls%20are%20the%20second%20leading,greatest%20number%20of%20fatal%20falls (accessed on 10 October 2020).
  2. AIHW. Injury Expenditure in Australia 2015–16; AIHW: Darlinghurst, NSW, Australia, 2019. Available online: https://www.aihw.gov.au/reports/health-welfare-expenditure/injury-expenditure-in-australia-2015-16/contents/summary (accessed on 10 October 2020).
  3. Alexander, B.H.; Rivara, F.P.; Wolf, M.E. The cost and frequency of hospitalization for fall-related injuries in older adults. Am. J. Public Health 1992, 82, 1020–1023. [Google Scholar] [CrossRef] [Green Version]
  4. CDC. Keep on Your Feet—Preventing Older Adult Falls; CDC: Atlanta, GA, USA, 2020. Available online: https://www.cdc.gov/injury/features/older-adult-falls/index.html#:~:text=Every%20second%20of%20every%20day,particularly%20among%20the%20aging%20population (accessed on 10 October 2020).
  5. WHO. Blindness and Vision Impairment: Fact Sheet; WHO: Geneva, Switzerland, 14 October 2021. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 3 October 2022).
  6. Schoene, D.; Heller, C.; Aung, Y.N.; Sieber, C.C.; Kemmler, W.; Freiberger, E. A systematic review on the influence of fear of falling on quality of life in older people: Is there a role for falls? Clin. Interv. Aging 2019, 14, 701. [Google Scholar] [CrossRef] [Green Version]
  7. Nagano, H.; Begg, R.K. Shoe-Insole Technology for Injury Prevention in Walking. Sensors 2018, 18, 1468. [Google Scholar] [CrossRef] [Green Version]
  8. Blake, A.J.; Morgan, K.; Bendall, M.J.; Dallosso, H.; Ebrahim, S.B.J.; Arie, T.H.D.; Fentem, P.H.; Bassey, E.J. Falls by elderly people at home: Prevalence and associated factors. Age Ageing 1988, 17, 365–372. [Google Scholar] [CrossRef]
  9. Nagano, H.; Begg, R.K.; Sparrow, W.A.; Taylor, S. Ageing and limb dominance effects on foot-ground clearance during treadmill and overground walking. Clin. Biomech. 2011, 26, 962–968. [Google Scholar] [CrossRef]
  10. Kocić, J.; Jovičić, N.; Drndarević, V. Sensors and Sensor Fusion in Autonomous Vehicles. In Proceedings of the 2018 26th Telecommunications Forum (TELFOR), Belgrade, Serbia, 20–21 November 2018. [Google Scholar]
  11. Niculescu, V.; Muller, H.; Ostovar, I.; Polonelli, T.; Magno, M.; Benini, L. Towards a Multi-Pixel Time-of-Flight Indoor Navigation System for Nano-Drone Applications. In Proceedings of the 2022 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Ottawa, ON, Canada, 16–19 May 2022. [Google Scholar]
  12. Marti, E.D.; de Miguel, M.A.; Garcia, F.; Perez, J. A Review of Sensor Technologies for Perception in Automated Driving. IEEE Intell. Transp. Syst. Mag. 2019, 11, 94–108. [Google Scholar] [CrossRef] [Green Version]
  13. Elmannai, W.; Elleithy, K. Sensor-Based Assistive Devices for Visually-Impaired People: Current Status, Challenges, and Future Directions. Sensors 2017, 17, 565. [Google Scholar] [CrossRef] [Green Version]
  14. Martinez, M.; Roitberg, A.; Koester, D.; Stiefelhagen, R.; Schauerte, B. Using Technology Developed for Autonomous Cars to Help Navigate Blind People. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017. [Google Scholar]
  15. Digibarn. DigiBarn Weird Stuff: Puma RS Computer Tennis Shoes (Pedometer, 1980s); Digibarn: Boulder Creek, CA, USA, 2022; Available online: https://www.digibarn.com/collections/weirdstuff/computer-tennis-shoes/ (accessed on 15 December 2022).
  16. Wu, C.-J.; Tsun, F.-S.; Hsiang, S.-H.; Hsien, Y.-L. Electronic Pace and Distance Counting Shoe. U.S. Patent 4466204, 21 August 1984. [Google Scholar]
  17. Yang, S.; Li, Q. Inertial sensor-based methods in walking speed estimation: A systematic review. Sensors 2012, 12, 6102–6116. [Google Scholar] [CrossRef] [Green Version]
  18. Zhang, W.; Tomizuka, M.; Byl, N. A wireless human motion monitoring system for smart rehabilitation. J. Dyn. Syst. Meas. Control. 2016, 138, 111004. [Google Scholar] [CrossRef] [Green Version]
  19. Gokalgandhi, D.; Kamdar, L.; Shah, N.; Mehendale, N. A review of smart technologies embedded in shoes. J. Med. Syst. 2020, 44, 150. [Google Scholar] [CrossRef]
  20. Hegde, N.; Bries, M.; Sazonov, E. A Comparative Review of Footwear-Based Wearable Systems. Electronics 2016, 5, 48. [Google Scholar] [CrossRef]
  21. Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review. IETE Tech. Rev. 2020, 39, 3–18. [Google Scholar] [CrossRef]
  22. Hersh, M. Wearable Travel Aids for Blind and Partially Sighted People: A Review with a Focus on Design Issues. Sensors 2022, 22, 5454. [Google Scholar] [CrossRef]
  23. Agrawal, M.P.; Gupta, A.R. Smart Stick for the Blind and Visually Impaired People. In Proceedings of the 2018 Second International Conference on Inventive Communication and Computational Technologies (ICICCT), Coimbatore, India, 20–21 April 2018. [Google Scholar]
  24. Nada, A.; Mashelly, S.; Fakhr, M.A.; Seddik, A.F. Effective fast response smart stick for blind people. In Proceedings of the Second International Conference on Advances in Bioinformatics and Environmental Engineering–ICABEE, Rome, Italy, 18–19 April 2015. [Google Scholar]
  25. Bhuniya, A.; Laha, S.; Maity, D.K.; Sarkar, A.; Bhattacharyya, S. Smart Glass for Blind People. Amse J. Iieta 2017, 38, 102–110. [Google Scholar]
  26. Hossain, E.; Khan, R.; Ali, A. Design and data analysis for a belt-for-blind for visual impaired people. Int. J. Adv. Mechatron. Syst. 2011, 3, 384. [Google Scholar] [CrossRef] [Green Version]
  27. Hengle, A.; Kulkarni, A.; Bavadekar, N.; Kulkarni, N.; Udyawar, R. Smart Cap: A Deep Learning and IoT Based Assistant for the Visually Impaired. In Proceedings of the 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 August 2020. [Google Scholar]
  28. Fallah, M.Y. Smart Bracelet Using Haptic System with Fuzzy Controller in Deminer Quad-rotor Robots. J. Instrum. Autom. Syst. 2017, 3, 22–30. [Google Scholar] [CrossRef]
  29. Srivastava, N.K.; Singh, S. Netra: Smart Hand Gloves Comprises Obstacle Detection, Object Identification & OCR Text to Speech Converter for Blinds. In Proceedings of the 2018 5th IEEE Uttar Pradesh Section International Conference on Electrical, Electronics and Computer Engineering (UPCON), Gorakhpur, India, 2–4 November 2018. [Google Scholar]
  30. Yang, C.M.; Jung, J.Y.; Kim, J.J. Development of obstacle detection shoes for visually impaired people. Sens. Mater. 2020, 32, 2227. [Google Scholar] [CrossRef]
  31. Patil, K.; Jawadwala, Q.; Shu, F.C. Design and Construction of Electronic Aid for Visually Impaired People. IEEE Trans. Human-Machine Syst. 2018, 48, 172–182. [Google Scholar] [CrossRef]
  32. Nanduri, S.; Umamaheswari, E.; Kishore, R.; Ajaykumar, M. Smart Bottine for autistic people. Mater. Today Proc. 2022, 62, 4788–4794. [Google Scholar] [CrossRef]
  33. Bhongade, P.; Girhay, S.; Sheikh, A.M.; Ghata, R.; Ambadkar, S.; Dusane, C. Internet of Things—Enabled Smart Shoes for Blind People. In Proceedings of the 2022 IEEE Delhi Section Conference, New Delhi, India, 11–13 February 2022. [Google Scholar]
  34. Wu, W.; Lei, N.; Tang, J. Smart Shoes for Obstacle Detection. In The 10th International Conference on Computer Engineering and Networks; Springer: Singapore, 2021; pp. 1319–1326. [Google Scholar]
  35. Kumar, P.; Inchara, K.M.; Lekhashree, S.; Likhith, C.N.; Pavan, U. Real Time Assistive Shoe for Visually Impaired People. In Proceedings of the 2021 6th International Conference for Convergence in Technology, I2CT 2021, Pune, India, 2–4 April 2021. [Google Scholar]
  36. Rao, S.; Singh, V.M. Computer vision and IoT based smart system for visually impaired people. In Proceedings of the Confluence 2021: 11th International Conference on Cloud Computing, Data Science and Engineering, Noida, India, 28–29 January 2021. [Google Scholar]
  37. Rahman, M.M.; Islam, M.M.; Ahmmed, S. “BlindShoe”: An electronic guidance system for the visually impaired people. J. Telecommun. Electron. Comput. Eng. 2019, 11, 49–54. [Google Scholar]
  38. Raja, L.; Santhosh, R. Experimental study on shoe based navigation system for the visually impaired. Mater. Today Proc. 2020, 45, 1713–1716. [Google Scholar] [CrossRef]
  39. Nair, D.; Aggarwal, J.K. Moving obstacle detection from a navigating robot. IEEE Trans. Robot. Autom. 1998, 14, 404–416. [Google Scholar] [CrossRef] [Green Version]
  40. Gray, K.W.; Baker, K. Obstacle Detection and Avoidance for an Autonomous farm Tractor; Citeseer: Princeton, NJ, USA, 2000. [Google Scholar]
  41. Prabhakar, G.; Kailath, B.; Natarajan, S.; Kumar, R. Obstacle detection and classification using deep learning for tracking in high-speed autonomous driving. In Proceedings of the 2017 IEEE Region 10 Symposium (TENSYMP), Kerala, India, 5–8 November 2017. [Google Scholar]
  42. Discant, A.; Rogozan, A.; Rusu, C.; Bensrhair, A. Sensors for obstacle detection—A survey. In Proceedings of the 2007 30th International Spring Seminar on Electronics Technology (ISSE), Cluj-Napoca, Romania, 9–13 May 2007. [Google Scholar]
  43. Mohammad, T. Using ultrasonic and infrared sensors for distance measurement. World Acad. Sci. Eng. Technol. 2009, 51, 293–299. [Google Scholar]
  44. Mustapha, B.; Zayegh, A.; Begg, R.K. Ultrasonic and infrared sensors performance in a wireless obstacle detection system. In Proceedings of the 2013 1st International Conference on Artificial Intelligence, Modelling and Simulation, Sabah, Malaysia, 3–5 December 2013. [Google Scholar]
  45. Azeta, J.; Bolu, C.; Hinvi, D.; Abioye, A.A. Obstacle detection using ultrasonic sensor for a mobile robot. IOP Conf. Series: Mater. Sci. Eng. 2019, 707, 012012. [Google Scholar] [CrossRef]
  46. Ruzaij, M.F.; Poonguzhali, S. Design and implementation of low cost intelligent wheelchair. In Proceedings of the 2012 International Conference on Recent Trends in Information Technology, Chennai, India, 19–21 April 2012. [Google Scholar]
  47. Adarsh, S.; Kaleemuddin, S.M.; Bose, D.; Ramachandran, K.I. Performance comparison of Infrared and Ultrasonic sensors for obstacles of different materials in vehicle/ robot navigation applications. IOP Conf. Ser. Mater. Sci. Eng. 2016, 149, 012141. [Google Scholar] [CrossRef] [Green Version]
  48. Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and sensor fusion technology in autonomous vehicles: A review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
  49. Han, J.; Kim, D.; Lee, M.; Sunwoo, M. Enhanced Road Boundary and Obstacle Detection Using a Downward-Looking LIDAR Sensor. IEEE Trans. Veh. Technol. 2012, 61, 971–985. [Google Scholar] [CrossRef]
  50. Baras, N.; Nantzios, G.; Ziouzios, D.; Dasygenis, M. Autonomous Obstacle Avoidance Vehicle Using LIDAR and an Embedded System. In Proceedings of the 2019 8th International Conference on Modern Circuits and Systems Technologies (MOCAST), Thessaloniki, Greece, 13–15 May 2019. [Google Scholar]
  51. Mazzari, V. What Is LiDAR Technology? Lidar 2019. Available online: https://blog.generationrobots.com/en/what-is-lidar-technology/ (accessed on 3 May 2021).
  52. Asvadi, A.; Premebida, C.; Peixoto, P.; Nunes, U. 3D Lidar-based static and moving obstacle detection in driving environments: An approach based on voxels and multi-region ground planes. Robot. Auton. Syst. 2016, 83, 299–311. [Google Scholar] [CrossRef]
  53. Fang, Z.; Zhao, S.; Wen, S.; Zhang, Y. A real-time 3d perception and reconstruction system based on a 2d laser scanner. J. Sensors 2018, 2018, 2937694. [Google Scholar]
  54. Jahromi, B.S.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar]
  55. Arnold, E.; Al-Jarrah, O.Y.; Dianati, M.; Fallah, S.; Oxtoby, D.; Mouzakitis, A. A survey on 3d object detection methods for autonomous driving applications. IEEE Trans. Intell. Transp. Syst. 2019, 20, 3782–3795. [Google Scholar] [CrossRef] [Green Version]
  56. Bhoi, A. Monocular depth estimation: A survey. arXiv 2019, arXiv:1901.09402. [Google Scholar]
  57. Yu, H.; Zhu, J.; Wang, Y.; Jia, W.; Sun, M.; Tang, Y. Obstacle classification and 3D measurement in unstructured environments based on ToF cameras. Sensors 2014, 14, 10753–10782. [Google Scholar] [CrossRef] [Green Version]
  58. Slamtech. RPLIDAR A2. Introduction and Datasheet 2016. Available online: https://www.slamtec.com/en/Lidar/A2Spec (accessed on 15 March 2021).
  59. Benewake. TFmini Infrared Module Specification. n.d. Available online: https://cdn.sparkfun.com/assets/5/e/4/7/b/benewake-tfmini-datasheet.pdf (accessed on 14 March 2021).
  60. IbeoAutomotiveSystems. ibeo LUX 4L/ibeo LUX 8L/ibeo LUX HD. 2017. Available online: https://cdn.www.ibeo-as.com/74ff7abf85e5f139f1f57020579ebf9d0436b25e/ibeo%2BLUX%2BFamily%2BData%2BSheet.pdf.pdf (accessed on 2 December 2022).
  61. AutonomousStuff. Velodyne LiDAR PUCK. 2015. Available online: https://www.amtechs.co.jp/product/VLP-16-Puck.pdf (accessed on 3 July 2021).
  62. AutonomousStuff. Delphi Electronically Scanning RADAR. n.d. Available online: https://hexagondownloads.blob.core.windows.net/public/AutonomouStuff/wp-content/uploads/2019/05/delphi-esr-whitelabel.pdf (accessed on 14 March 2021).
  63. Stanislas, L.; Peynot, T. Characterisation of the Delphi Electronically Scanning Radar for robotics applications. In Proceedings of the ICRA 2015, Seattle, WA, USA, 26–30 May 2015. [Google Scholar]
  64. Roboception. rc_visard 3D Stereo Sensor Assembly and Operating Manual. 2022. Available online: https://doc.rc-visard.com/latest/pdf/rc_visard_manual_en.pdf (accessed on 10 November 2022).
  65. Roboception. 3D Stereo Sensor. 2020. Available online: https://howtorobot.com/sites/default/files/2021-09/2020_Roboception_ProductSpecs.pdf (accessed on 10 November 2022).
  66. RaspberryPi. Camera Module. n.d. Available online: https://www.raspberrypi.org/documentation/hardware/camera/ (accessed on 20 March 2021).
  67. ifm. 3D Camera O3D303. 2015. Available online: https://www.ifm.com/au/en/product/O3D303 (accessed on 15 November 2022).
  68. Elecfreaks. Ultrasonic Ranging Module HC-SR04. n.d. Available online: https://cdn.sparkfun.com/datasheets/Sensors/Proximity/HCSR04.pdf (accessed on 10 March 2021).
  69. Sharp. Sharp GP2Y0A02YK0F Analog Distance Sensor. 2006. Available online: https://www.pololu.com/file/0J156/gp2y0a02yk_e.pdf (accessed on 14 March 2021).
  70. Wang, D.; Li, W.; Liu, X.; Li, N.; Zhang, C. UAV environmental perception and autonomous obstacle avoidance: A deep learning and depth camera combined solution. Comput. Electron. Agric. 2020, 175, 105523. [Google Scholar] [CrossRef]
  71. John, V.; Mita, S. RVNet: Deep Sensor Fusion of Monocular Camera and Radar for Image-Based Obstacle Detection in Challenging Environments. In Image and Video Technology; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar]
  72. Kato, T.; Ninomiya, Y.; Masaki, I. An obstacle detection method by fusion of radar and motion stereo. IEEE Trans. Intell. Transp. Syst. 2002, 3, 182–188. [Google Scholar] [CrossRef]
  73. Wu, S.; Decker, S.; Chang, P.; Camus, T.; Eledath, J. Collision Sensing by Stereo Vision and Radar Sensor Fusion. IEEE Trans. Intell. Transp. Syst. 2009, 10, 606–614. [Google Scholar]
  74. Shahdib, F.; Bhuiyan, M.W.U.; Hasan, M.K.; Mahmud, H. Obstacle detection and object size measurement for autonomous mobile robot using sensor. Int. J. Comput. Appl. 2013, 66, 28–33. [Google Scholar]
  75. Fung, M.L.; Chen, M.Z.Q.; Chen, Y.H. Sensor fusion: A review of methods and applications. In Proceedings of the 2017 29th Chinese Control And Decision Conference (CCDC), Chongqing, China, 28–30 May 2017. [Google Scholar]
  76. Elmenreich, W. An Introduction to Sensor Fusion; Vienna University of Technology: Vienna, Austria, 2002; Volume 502, pp. 1–28. [Google Scholar]
  77. Bhowmick, A.; Hazarika, S. An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends. J. Multimodal User Interfaces 2017, 11, 149–172. [Google Scholar] [CrossRef]
  78. Shetara, J.; Majumder, S.; Acharjee, S.; Dhar, S. Smart Wrist Band for Obstacle Detection. In Proceedings of the International Conference on Materials, Electronics & Information Engineering, ICMEIE-2015, Rajshahi, Bangladesh, 27–28 February 2015. [Google Scholar]
  79. Busaeed, S.; Katib, I.; Albeshri, A.; Corchado, J.M.; Yigitcanlar, T.; Mehmood, R. LidSonic V2.0: A LiDAR and Deep-Learning-Based Green Assistive Edge Device to Enhance Mobility for the Visually Impaired. Sensors 2022, 22, 7435. [Google Scholar] [CrossRef]
  80. Al-Khalifa, S.; Al-Razgan, M. Ebsar: Indoor guidance for the visually impaired. Comput. Electr. Eng. 2016, 54, 26–39. [Google Scholar] [CrossRef]
  81. Schwarze, T.; Lauer, M.; Schwaab, M.; Romanovas, M.; Böhm, S.; Jürgensohn, T. A camera-based mobility aid for visually impaired people. KI-Künstliche Intell. 2016, 30, 29–36. [Google Scholar] [CrossRef] [Green Version]
  82. Khan, A.; Khan, A.; Waleed, M. Wearable navigation assistance system for the blind and visually impaired. In Proceedings of the 2018 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Sakhier, Bahrain, 18–20 November 2018. [Google Scholar]
  83. Katzschmann, R.K.; Araki, B.; Rus, D. Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 583–593. [Google Scholar] [CrossRef]
  84. Elgendy, M.; Sik-Lanyi, C.; Kelemen, A. A Novel Marker Detection System for People with Visual Impairment Using the Improved Tiny-YOLOv3 Model. Comput. Methods Programs Biomed. 2021, 205, 106112. [Google Scholar] [CrossRef]
  85. Kanwal, N.; Bostanci, E.; Currie, K.; Clark, A.F. A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor. Appl. Bionics Biomech. 2015, 2015, 479857. [Google Scholar] [CrossRef] [Green Version]
  86. Kuriakose, B.; Shrestha, R.; Sandnes, F.E. DeepNAVI: A deep learning based smartphone navigation assistant for people with visual impairments. Expert Syst. Appl. 2023, 212, 118720. [Google Scholar] [CrossRef]
  87. Mocanu, B.; Tapu, R.; Zaharia, T. When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition. Sensors 2016, 16, 1807. [Google Scholar] [CrossRef] [Green Version]
  88. Chen, Z.; Liu, X.; Kojima, M.; Huang, Q.; Arai, T. A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System. Sensors 2021, 21, 1536. [Google Scholar] [CrossRef]
  89. Bai, J.; Liu, Z.; Lin, Y.; Li, Y.; Lian, S.; Liu, D. Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People. Electronics 2019, 8, 697. [Google Scholar] [CrossRef] [Green Version]
  90. Siddhartha, B.; Chavan, A.P.; Uma, B. An Electronic Smart Jacket for the Navigation of Visually Impaired Society. Mater. Today Proc. 2018, 5, 10665–10669. [Google Scholar] [CrossRef]
  91. Suman, S.; Mishra, S.; Sahoo, K.S.; Nayyar, A. Vision Navigator: A Smart and Intelligent Obstacle Recognition Model for Visually Impaired Users. Mob. Inf. Syst. 2022, 2022, 1–15. [Google Scholar] [CrossRef]
  92. Tudor, D.; Dobrescu, L.; Dobrescu, D. Ultrasonic electronic system for blind people navigation. In Proceedings of the 2015 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 19–21 November 2015. [Google Scholar]
  93. Cardillo, E.; Di Mattia, V.; Manfredi, G.; Russo, P.; De Leo, A.; Caddemi, A.; Cerri, G. An electromagnetic sensor prototype to assist visually impaired and blind people in autonomous walking. IEEE Sens. J. 2018, 18, 2568–2576. [Google Scholar] [CrossRef]
  94. Nada, A.A.; Fakhr, M.A.; Seddik, A.F. Assistive infrared sensor based smart stick for blind people. In Proceedings of the 2015 Science and Information Conference (SAI), London, UK, 28–30 July 2015. [Google Scholar]
  95. Perumal, C.; Balamurugan, V.; Manickam, S.; Natarajamoorthy, M. Voice Navigation Based guiding Device for Visually Impaired People. In Proceedings of the 2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS), Coimbatore, India, 25–27 March 2021. [Google Scholar]
  96. Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans. Haptics 2021, 14, 109–122. [Google Scholar] [CrossRef]
  97. Ran, L.; Helal, S.; Moore, S. Drishti: An integrated indoor/outdoor blind navigation system and service. In Proceedings of the Second IEEE Annual Conference on Pervasive Computing and Communications, 2004, Orlando, FL, USA, 14–17 March 2004. [Google Scholar]
  98. Daou, R.A.Z.; Chehade, J.; Haydar, G.A.; Hayek, A.; Boercsoek, J.; Olmedo, J.J.S. Design and Implementation of Smart Shoes for Blind and Visually Impaired People for More Secure Movements. In Proceedings of the International Conference on Microelectronics, ICM, Aqaba, Jordan, 14–17 December 2020. [Google Scholar]
  99. Singh, V.; Sindhu, S.; Arora, R. BUZZFEET: Blind Man Shoes. In Proceedings of the International Conference on Machine Learning, Big Data, Cloud and Parallel Computing: Trends, Prespectives and Prospects, COMITCon 2019, Faridabad, Haryana, 14–16 February 2019. [Google Scholar]
  100. Rakshith, M.N.; Sundar, D.R.S.; Shanmugasundaram, M. An efficient assistive system for the visually impaired. ARPN J. Eng. Appl. Sci. 2017, 12, 5574–5577. [Google Scholar]
  101. Mishra, A.R.; Pippal, S.K.; Asif; Kumar, A.; Singh, D.; Singh, A. Clear Vision—Obstacle detection using Bat Algorithm Optimization Technique. In Proceedings of the 2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions), ICRITO 2021, Noida, India, 7–9 September 2021. [Google Scholar]
  102. Kamaruddin, F.S.; Mahmood, N.H.; Razak, M.A.A.; Zakaria, N.A. Smart Assistive Shoes with Internet of Things Implementation for Visually Impaired People. J. Physics: Conf. Ser. 2021, 2107, 012030. [Google Scholar] [CrossRef]
  103. Chava, T.; Srinivas, A.T.; Sai, A.L.; Rachapudi, V. IoT based Smart Shoe for the Blind. In Proceedings of the 6th International Conference on Inventive Computation Technologies, ICICT 2021, Coimbatore, India, 20–22 January 2021. [Google Scholar]
  104. Anisha, M.; Kirthika, S.; Harline, J.D.; Thenmozhi, P.; Rubala, R.; Pragathi, T.G.; Benisha, M.; Elliot, C.J. Low-cost smart shoe for visually impaired. In Proceedings of the 3rd International Conference on Intelligent Communication Technologies and Virtual Mobile Networks, ICICV 2021, Tirunelveli, India, 4–6 February 2021. [Google Scholar]
  105. Alzamil, M.; AlBugmi, R.; AlOtaibi, S.; AlAnazi, G.; AlZubaidi, L.; Bashar, A. COMPASS: IPS-based navigation system for visually impaired students. In Proceedings of the 2020 IEEE 9th International Conference on Communication Systems and Network Technologies, CSNT 2020, Gwalior, India, 10–12 April 2020. [Google Scholar]
  106. Tang, Y.; Peng, Z.; Li, C. An experimental study on the feasibility of fall prevention using a wearable K-band FMCW radar. In Proceedings of the 2017 United States National Committee of URSI National Radio Science Meeting, USNC-URSI NRSM 2017, Boulder, CO, USA, 4–7 January 2017. [Google Scholar]
  107. Tang, Y.; Peng, Z.; Ran, L.; Li, C. IPrevent: A novel wearable radio frequency range detector for fall prevention. In Proceedings of the RFIT 2016—2016 IEEE International Symposium on Radio-Frequency Integration Technology, Taipei, Taiwan, 24–26 August 2016. [Google Scholar]
  108. Parmar, V.S.; Inkoolu, K.S. Designing smart shoes for obstacle detection: Empowering visually challenged users through ICT. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer International Publishing: Cham, Switzerland, 2017; pp. 258–266. [Google Scholar]
  109. WHO. Ageing and Health; WHO: Geneva, Switzerland, 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/ageing-and-health (accessed on 2 June 2021).
  110. Li, G.; Tian, Z.; Gao, G.; Zhang, L.; Fu, M.; Chen, Y. A Shoelace Antenna for the Application of Collision Avoidance for the Blind Person. IEEE Trans. Antennas Propag. 2017, 65, 4941–4946. [Google Scholar] [CrossRef] [Green Version]
  111. Argañarás, J.G.; Wong, Y.T.; Pendharkar, G.; Begg, R.K.; Karmakar, N.C. Obstacle detection with MIMO 60 GHz radar for fall prevention. In Proceedings of the Asia-Pacific Microwave Conference, Singapore, 10–13 December 2019. [Google Scholar]
  112. Lin, T.H.; Yang, C.Y.; Shih, W.P. Fall Prevention Shoes Using Camera-Based Line-Laser Obstacle Detection System. J. Healthc. Eng. 2017, 2017, 8264071. [Google Scholar] [CrossRef] [Green Version]
  113. Dharme, R.K.; Surywanshi, J.R.; Kunwar, H.C.; Palve, Y.H. Smart Shoe Provides Vision to Visionless Person. In ICT Systems and Sustainability. Lecture Notes in Networks and Systems; Springer: Singapore, 2022; Volume 321, pp. 131–137. [Google Scholar]
  114. Arduino. Arduino Nano. 2022. Available online: https://docs.arduino.cc/static/282dbc4f4a12cec8da26375adf8c210d/A000005-datasheet.pdf (accessed on 2 November 2022).
  115. RobotShop. Arduino Mega 2560 Datasheet. Available online: https://www.robotshop.com/media/files/PDF/ArduinoMega2560Datasheet.pdf (accessed on 10 October 2022).
  116. STMicroelectronics. STM32L432KB STM32L432KC. 2018. Available online: https://www.st.com/resource/en/datasheet/stm32l432kc.pdf (accessed on 5 November 2022).
  117. Pi, R. Raspberry Pi Zero 2 W. 2021. Available online: https://datasheets.raspberrypi.com/rpizero2/raspberry-pi-zero-2-w-product-brief.pdf (accessed on 3 November 2022).
  122. Coral. Coral Dev Board Datasheet. July 2020. Available online: https://coral.ai/static/files/Coral-Dev-Board-datasheet.pdf (accessed on 25 October 2022).
  119. Foundation, R.P. Raspberry Pi 4 Computer Model B. 2021. Available online: https://datasheets.raspberrypi.com/rpi4/raspberry-pi-4-product-brief.pdf (accessed on 25 October 2022).
  120. NVIDIA. Jetson Nano. Available online: https://developer.nvidia.com/embedded/jetson-nano (accessed on 25 October 2022).
  121. Coral. Coral Dev Board Mini Datasheet. 2020. Available online: https://coral.ai/docs/dev-board-mini/datasheet/ (accessed on 25 October 2022).
  122. Coral. Coral USB Accelerator Datasheet. 2019. Available online: https://coral.ai/docs/accelerator/datasheet/ (accessed on 25 October 2022).
  123. Kang, P.; Somtham, A. An Evaluation of Modern Accelerator-Based Edge Devices for Object Detection Applications. Mathematics 2022, 10, 4299. [Google Scholar] [CrossRef]
  124. Kovács, B.; Henriksen, A.D.; Stets, J.D.; Nalpantidis, L. Object Detection on TPU Accelerated Embedded Devices; Springer International Publishing: Cham, Switzerland, 2021. [Google Scholar]
  125. Park, K.; Jang, W.; Lee, W.; Nam, K.; Seong, K.; Chai, K.; Li, W.S. Real-time mask detection on google edge TPU. arXiv 2020, arXiv:2010.04427. [Google Scholar]
  126. Winzig, J.; Almanza, J.C.A.; Mendoza, M.G.; Schumann, T. Edge AI—Use Case on Google Coral Dev Board Mini. In Proceedings of the 2022 IET International Conference on Engineering Technologies and Applications (IET-ICETA), Changhua City, Taiwan, 26–27 October 2022. [Google Scholar]
  127. Abraham, G.; Nithya, M. Multi-Functional Personal Assistant Robot Using Raspberry Pi and Coral Accelerator. In Proceedings of the 2021 5th International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 29–31 March 2022. [Google Scholar]
  128. Melo, J.G.; Barros, E. An Embedded Monocular Vision Approach for Ground-Aware Objects Detection and Position Estimation. arXiv 2022, arXiv:2207.09851. [Google Scholar]
  129. Devi, Y.S.; Sathvik, S.; Ananya, P.; Tharuni, P.; Vamsi, N.N.K. Vision-Based Obstacle Detection and Collision Prevention in Self-Driving Cars. J. Phys. Conf. Ser. 2022, 2335, 012019. [Google Scholar] [CrossRef]
  130. Fang, R.; Cai, C. Computer vision based obstacle detection and target tracking for autonomous vehicles. MATEC Web Conf. 2021, 336, 07004. [Google Scholar] [CrossRef]
  131. Tawil, Y.; Hafez, A.H.A. Deep Learning Obstacle Detection and Avoidance for Powered Wheelchair. In Proceedings of the 2022 Innovations in Intelligent Systems and Applications Conference (ASYU), Biarritz, France, 7–9 September 2022. [Google Scholar]
  132. Farheen, N.; Jaman, G.G.; Schoen, M.P. Object Detection and Navigation Strategy for Obstacle Avoidance Applied to Autonomous Wheel Chair Driving. In Proceedings of the 2022 Intermountain Engineering, Technology and Computing (IETC), Orem, UT, USA, 14–15 May 2022. [Google Scholar]
  133. Bergengruen, L.; Duran, D.; Sotelo, R. Effatá: Obstacle Identification System to help the Blind in Urban Displacement. In Proceedings of the 2021 IEEE Global Humanitarian Technology Conference (GHTC), Seattle, WA, USA, 19–23 October 2021. [Google Scholar]
  134. Sherpa, T.T.; Kimura, A. Pedestrian Crossing Lights and Obstacles Detections for Visually Impaired Person. In Proceedings of the 2022 Nicograph International (NicoInt), Tokyo, Japan, 4–5 June 2022. [Google Scholar]
  135. Bouteraa, Y. Smart real time wearable navigation support system for BVIP. Alex. Eng. J. 2023, 62, 223–235. [Google Scholar] [CrossRef]
  136. Hsieh, I.-H.; Cheng, H.-C.; Ke, H.-H.; Chen, H.-C.; Wang, W.-J. A CNN-Based Wearable Assistive System for Visually Impaired People Walking Outdoors. Appl. Sci. 2021, 11, 10026. [Google Scholar] [CrossRef]
  137. Pascale, F.; Adinolfi, E.; Avagliano, M.; Giannella, V.; Salas, A. A Low Energy IoT Application Using Beacon for Indoor Localization. Appl. Sci. 2021, 11, 4902. [Google Scholar] [CrossRef]
  138. Ramalingam, M.; Chinnavan, E.; Puviarasi, R.; Yu, N.H. Assistive technology for harvesting footstep energy in IoT enabled Smart shoe for the visually impaired. In Proceedings of the 2021 International Conference on Software Engineering and Computer Systems and 4th International Conference on Computational Science and Information Management, ICSECS-ICOCSIM 2021, Yokohama, Japan, 24–26 August 2021. [Google Scholar]
Figure 1. Schematic block diagram illustrating the main components of obstacle detection shoes, as used in the reviewed literature.
Figure 2. Illustration of some wearable assistive technologies with obstacle detection incorporated. (a) Smart Belt [87]. (b) Head-mounted device [88]. (c) Smart Glass [89]. (d) Smart Jacket. Reprinted with permission from Ref. [90]. Copyright 2018, Elsevier. (e) Wrist-worn navigation device [22]. (f) Vision Navigator with Smart-fold Cane and Smart-alert walker [91].
Figure 3. Obstacle detection based on reflection of a sound wave from an ultrasonic sensor. Reprinted with permission from Ref. [31]. Copyright 2018, IEEE.
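To make the ranging principle in Figure 3 concrete, the sketch below shows the standard trigger/echo timing for a low-cost ultrasonic module such as the HC-SR04 on an Arduino-class board, converting the round-trip time of the reflected sound into a distance estimate. It is a minimal illustration only; the pin assignments, timeout, and sample rate are assumptions rather than values taken from any of the reviewed shoe systems.

// Illustrative Arduino-style sketch (not from the reviewed papers) of the
// HC-SR04 ranging principle shown in Figure 3: a short trigger pulse starts
// an eight-pulse 40 kHz burst, and the echo pin stays HIGH for the
// round-trip time of the reflected sound. Pins and timings are assumptions.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // A 10 us pulse on the trigger pin starts one measurement cycle.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo round-trip time in microseconds; a 30 ms timeout (~5 m) keeps the loop from blocking.
  unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);

  // Distance = (speed of sound x time) / 2; at ~343 m/s this is roughly echo time / 58 in cm.
  float distanceCm = echoUs / 58.0;

  Serial.println(distanceCm);
  delay(60);  // allow residual echoes to fade before the next ping
}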
Figure 4. Using an FMCW radar for fall prevention. Reprinted with permission from Ref. [107]. Copyright 2016, IEEE.
Figure 5. Block diagram of the smart shoe system with the fisheye camera. Adapted with permission from Ref. [36]. Copyright 2021, IEEE.
Figure 6. Flow of energy generation from piezoelectric plates. Adapted with permission from Ref. [138]. Copyright 2021, IEEE.
Table 1. Examples of smart shoe systems selected from the reviewed articles. Only the sensors used for obstacle detection by the shoe systems are listed in this table.
Patil, Jawadwala, 2018 [31] (copyrights authorized by IEEE)
Obstacle Detection Sensor: Ultrasound
Processing Board: Customized Microcontroller
Feedback: Audio, Vibration
Functionality:
  • Detects floor-level to knee-level obstacles in front of, to the left of, and to the right of the shoe system
  • Wet floor detection
  • Tactile feedback through vibration motors and auditory feedback through user’s wired or wireless headphones
Nanduri, Umamaheswari, 2022 [32] (copyrights authorized by Elsevier)
Obstacle Detection Sensor: Ultrasound
Processing Board: Node MCU
Feedback: Smartphone, Piezo buzzer
Functionality:
  • Detects obstacles in the path, and detects insects inside the shoe while it is not being worn
  • Fall detection with notification to the parent/caretaker via smartphone
  • Location tracking through Google’s Geolocation API
Bhongade, Girhay, 2022 [33] (copyrights authorized by IEEE)
Obstacle Detection Sensor: Ultrasound
Processing Board: Arduino Atmega328
Feedback: Audio
Functionality:
  • Obstacle detection
  • Navigation and location tracking
  • Pothole detection, slippery surface detection and hot objects or fire detection
  • Emergency SOS to family members
  • Electricity generation while walking
  • Health and Fitness tracking
  • Alerting through voice instructions
Wu, Lei, 2021 [34] (copyrights authorized by Springer Nature)
Obstacle Detection Sensor: Ultrasound
Processing Board: STM32L432KC
Feedback: Vibration
Functionality:
  • Detects low obstacles and alerts through vibration
  • Gait event detection to determine the motion state of the feet
  • Fall detection and contacting emergency contacts through a cell phone
  • Battery recharged with electricity generated while walking
Pradeep Kumar, Inchara, 2021 [35] (copyrights authorized by IEEE)
Obstacle Detection Sensor: Ultrasound
Processing Board: Renesas microcontroller
Feedback: Smartphone audio output
Functionality:
  • Obstacle detection and moisture detection
  • Location notification using Google Maps
  • User alert through audio feedback
Rao, Singh, 2021 [36] (copyrights authorized by IEEE)
Obstacle Detection Sensor: Ultrasound, Fisheye camera
Processing Board: Raspberry Pi Zero
Feedback: Vibration
Functionality:
  • Pothole and staircase detection, people detection to determine crowded places
  • Vibration on the left or right to signal a detection on the corresponding side, and vibration of both motors to signal staircases
  • Navigation using the Google Maps API integrated with a voice-based interface for audio guidance
Rahman, Islam, 2019 [37]
Obstacle Detection Sensor: Ultrasound
Processing Board: Arduino Uno
Feedback: Buzzer, Smartphone
Functionality:
  • Detects obstacles in front of and behind the user
  • Wet or slippery surface detection
  • Feedback through a buzzer and smartphone audio notification via earphone
Raja and Santhosh, 2021 [38] (copyrights authorized by Elsevier)
Obstacle Detection Sensor: Ultrasound, Infrared
Processing Board: ARM cortex M3 LPC1768
Feedback: Buzzer
Functionality:
  • Obstacle detection and false positive detection
  • Detect whether the shoe is worn or not
  • Varying sound intensity based on the distance to the obstacle
  • Distance parameters sent to the caretaker through the cloud using a Wi-Fi module
Yang, Jung, 2020 [30]
Obstacle Detection Sensor: Infrared
Processing Board: Arduino
Feedback: Piezoelectric Buzzer
Functionality:
  • Detects obstacles and the direction of the shoes
  • Direction control of the infrared sensor using data from the accelerometer and gyroscope
  • Buzzer alert for obstacles within 60 cm
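Across the systems in Table 1, the common pattern is a simple control loop: read a range, compare it with an alert threshold, and drive a buzzer and/or vibration motor, sometimes grading the output with distance (for example, the 60 cm buzzer alert in [30] or the distance-dependent sound intensity in [38]). The sketch below is a minimal, hedged illustration of that loop on an Arduino-class board; the pins, thresholds, and tone mapping are assumptions and not a reimplementation of any single reviewed shoe.

// Minimal sketch of the threshold-based feedback pattern common to the shoes
// in Table 1. Pins, thresholds, and the simple HC-SR04 helper are
// illustrative assumptions only.
const int TRIG_PIN   = 9;
const int ECHO_PIN   = 10;
const int BUZZER_PIN = 3;    // piezo buzzer
const int MOTOR_PIN  = 5;    // vibration motor via a transistor driver
const float ALERT_CM = 60.0; // e.g., the 60 cm alert distance reported in [30]

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);
  return echoUs / 58.0;  // returns 0 when no echo arrives before the timeout
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  float d = readDistanceCm();
  if (d > 0 && d < ALERT_CM) {
    // Closer obstacles produce a higher tone plus continuous vibration.
    int pitchHz = map((int)d, 0, (int)ALERT_CM, 2000, 500);
    tone(BUZZER_PIN, pitchHz);
    digitalWrite(MOTOR_PIN, HIGH);
  } else {
    noTone(BUZZER_PIN);
    digitalWrite(MOTOR_PIN, LOW);
  }
  delay(100);
}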
Table 2. A Summary table of Sensor Features [40,42,43,44].
LIDAR (3D)
Perceived energy: laser signal [emitted]
Advantages:
  • Accurate distance measurement
  • Wide field of view
  • Precise measurement of depth
  • 360° high-resolution mapping
  • Can measure outlines of objects
  • Unaffected by lighting conditions
Disadvantages:
  • Expensive
  • Affected by dust, rain, and snowy conditions
  • Only objects in the scanning plane are detected
  • 3D point cloud storage requires large memory
  • The point cloud is sparse

RADAR
Perceived energy: millimeter-wave radio waves [emitted]
Advantages:
  • Reliable
  • Accurate distance and relative speed measurement
  • Suitable for medium- to long-range detection (up to 200 m)
  • 150° wide field of view
  • Good angular resolution
  • Robust in different weather and environmental conditions
Disadvantages:
  • Expensive
  • Heterogeneous reflectivity of materials makes processing tricky
  • Lower processing speed compared with camera and lidar
  • Lacks the fine resolution needed for obstacle detection

Camera
Perceived energy: visible light [ambient]
Advantages:
  • Low cost, compact size
  • Rich contextual information
  • Vision similar to human eyes
  • No interference problems with the environment
  • Can estimate boundaries of objects
Disadvantages:
  • Requires ambient light to illuminate the field of view
  • Susceptible to changes in light, dust, rain, and snow
  • High computation cost
  • No depth information provided

Ultrasound
Perceived energy: sound waves above 20 kHz [emitted]
Advantages:
  • Low cost, simple to operate
  • Lightweight, robust, and fast response time
  • Good performance in poor lighting and with transparent objects
  • Detects a wide range of materials
Disadvantages:
  • Not suitable for medium to long ranges, normally beyond 5 m
  • Affected by temperature, pressure, and ambient noise in the environment
  • Wide beam width and sensitivity to mirror-like surfaces cause specular reflections
  • Cannot distinguish shape and size
  • Must be as perpendicular to the target as possible to receive correct range data

Infrared
Perceived energy: infrared light [emitted]
Advantages:
  • High resolution, low cost, and light weight
  • Faster response time than ultrasound
  • Can measure temperature
Disadvantages:
  • Sensitive to weather conditions
  • Short detection range
  • Affected by dim light conditions
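Because the strengths and weaknesses in Table 2 are largely complementary (ultrasound is robust to lighting but prone to specular reflections, while infrared is fast but short-range), some of the reviewed shoes pair two range sensors and cross-check their readings, for example the ultrasonic-plus-infrared arrangement with false-positive filtering in [38]; sensor fusion more generally is reviewed in [75,76]. The toy function below illustrates one such cross-check; the agreement and alert thresholds are arbitrary assumptions and are not taken from any cited paper.

// Toy cross-check of two range readings (ultrasound + infrared), motivated by
// the complementary weaknesses in Table 2. Thresholds are illustrative only.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Fused {
  bool obstacle;   // true only when both sensors agree something is close
  float rangeCm;   // conservative (smaller) of the two readings
};

Fused crossCheck(float ultrasoundCm, float infraredCm,
                 float alertCm = 100.0f, float agreeCm = 30.0f) {
  Fused out{false, std::min(ultrasoundCm, infraredCm)};
  const bool bothValid = ultrasoundCm > 0.0f && infraredCm > 0.0f;
  const bool agree = std::fabs(ultrasoundCm - infraredCm) < agreeCm;
  out.obstacle = bothValid && agree && out.rangeCm < alertCm;
  return out;
}

int main() {
  Fused a = crossCheck(72.0f, 80.0f);   // both sensors see ~0.7-0.8 m: alert
  Fused b = crossCheck(72.0f, 400.0f);  // readings disagree: treated as a likely false positive
  std::printf("a: obstacle=%d range=%.1f cm\n", a.obstacle, a.rangeCm);
  std::printf("b: obstacle=%d range=%.1f cm\n", b.obstacle, b.rangeCm);
  return 0;
}

Requiring agreement trades sensitivity for fewer spurious alerts; a device that must never miss a hazard would more likely combine the detections and use agreement only to grade confidence.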
Table 3. A Summary Table of Obstacle Detection Sensor Features (Lidar: [10,50,58,59,60,61]; Radar: [62,63]; Camera: [64,65,66,67]; Ultrasound: [68]; Infrared: [69]).
3D LIDAR [Velodyne VLP-16]
Range: 30–100 m | Mass: 830 g | Operating temperature: −10 °C to 60 °C | Field of view: 360° horizontal, 30° vertical | Accuracy: ±3 cm | Resolution: angular, 2° vertical and 0.1–0.4° horizontal | Dimensions: 103 mm diameter × 72 mm | Power: 9–18 V, 8 W | Cost: AUD 5000+
Output: up to 300,000 points per second over a 100 Mbps Ethernet connection, as UDP packets containing distances, calibrated reflectivities, rotation angles, and synchronized time stamps (µs resolution)

2D LIDAR [RPLIDAR A2M8]
Range: 0.15–12 m | Mass: 190 g | Operating temperature: 0 °C to 40 °C | Field of view: 360° | Accuracy: 1% of range (up to 3 m), 2% of range (3–5 m), 2.5% of range (above 5 m) | Resolution: angular 0.45°; range 1% of range or better (below 12 m), 2% of range or better (12–16 m) | Dimensions: 76 mm diameter × 41 mm | Power: 5 V, 450–600 mA, 2.25–3 W | Cost: AUD 640
Output: 8000 points obtained at a 10 Hz rotational speed

1D LIDAR [TFmini Lidar]
Range: 0.3–12 m | Mass: 6.1 g | Operating temperature: −20 °C to 60 °C | Field of view: 2.3° | Accuracy: 1% (0.3–6 m), 2% (6–12 m) | Resolution: 5 mm | Dimensions: 42 × 15 × 16 mm | Power: 4.5–6 V, 0.12 W | Cost: AUD 75.36
Output: distance readings obtained at 100 Hz

Solid-state LIDAR [ibeo LUX 4L]
Range: up to 50 m | Mass: 998.7 g | Operating temperature: −40 °C to +85 °C | Field of view: 110° (H) × 3.2° (V) | Accuracy: 10 cm | Resolution: angular (H × V) 0.25° × 0.8°; range 4 cm | Dimensions: 164.5 × 93.2 × 88 mm | Power: 9–27 V, 7 W | Cost: not given
Output: distance and echo pulse width

Radar [Delphi ESR]
Range: 1–174 m (long range), 1–60 m (mid-range); velocity range −100 m/s to +40 m/s, lateral −20 m/s to +20 m/s | Mass: 575 g | Operating temperature: not given | Field of view: horizontal ±10° (long range), ±45° (mid-range); vertical 4.2° to 4.75° | Accuracy: range <±0.5 m (long range), <±0.25 m (mid-range); range rate <±0.12 m/s; azimuth angle centroid <±0.3° (corner-reflector targets at long range), <±0.5° (other targets at long range), <±1.0° (mid-range) | Resolution: azimuth angle <3.5° (long range), <12° (mid-range) | Dimensions: 173.7 × 90.2 × 49.2 mm | Power: not given | Cost: AUD 2500
Output: the estimated centroid of the detected object, including the range to the centroid, its bearing angle, its longitudinal and lateral speeds, its acceleration, and the power of the returned signal (Φ)

Camera [Raspberry Pi camera]
Range: not given | Mass: 3 g | Operating temperature: −30 °C to 70 °C | Field of view: 53.50 ± 0.13° (H), 41.41 ± 0.11° (V) | Accuracy: not given | Resolution: 5 MP still images, 2592 × 1944 pixel sensor | Dimensions: 25 × 24 × 9 mm | Power: 325 mW | Cost: AUD 25
Output: video at 1080p @ 30 fps, 720p @ 60 fps, or 480p @ 90 fps

Stereo camera [Roboception rc_visard 160]
Range: 0.5–3 m (depth) | Mass: 850 g | Operating temperature: 0 °C to 50 °C | Field of view: 61° horizontal, 48° vertical | Accuracy: 0.4–13 mm (depth) | Resolution: 1280 × 960 pixel (1.2 MP) images; lateral 0.5–2.8 mm; depth 0.1–3.3 mm | Dimensions: 230 × 75 × 84 mm | Power: 18–30 V, 25 W | Cost: not given
Output: rectified right and left images, depth image, and confidence image at 0.8–25 Hz

ToF camera [ifm O3D303]
Range: 3–8 m | Mass: 766.95 g | Operating temperature: −10 °C to +50 °C | Field of view: 60° × 45° | Accuracy: not given | Resolution: 352 × 264 pixels | Dimensions: 72 × 65 × 82.6 mm | Power: 20.4–28.8 V, <2400 mA, 10 W | Cost: AUD 3000
Output: 3D image data obtained at a reading rate of 25 Hz

Ultrasound [HC-SR04]
Range: 2 cm–4 m | Mass: 8.5 g | Operating temperature: −15 °C to 70 °C | Field of view: 30° conical | Accuracy: 3 mm | Resolution: not given | Dimensions: 45 × 20 × 15 mm | Power: 5 V DC, 15 mA | Cost: under AUD 5
Output: distance estimated from the time between the transmitted burst (eight 40 kHz pulses) and the received echo

Infrared [Sharp GP2Y0A02YK0F]
Range: 20–150 cm | Mass: 4.8 g | Operating temperature: −10 °C to 60 °C | Field of view: 10° | Accuracy: not given | Resolution: not given | Dimensions: 44.5 × 18.9 × 21.6 mm | Power: 4.4–5.5 V, 33 mA | Cost: AUD 30
Output: distance
Table 4. Selected obstacle detection sensors, type of wearables, sensor specifications, and feedback for assistive devices.
Each entry lists: sensor (number of sensors), wearable device: obstacle type; mass; detection range; feedback; cost [reference].
  • Ultrasound (5), wearable jacket: path obstacles; 3 g; 2 cm–400 cm; buzzer and vibrator; low cost [82]
  • Ultrasound (2), belt: near and distant objects; lightweight; 2 cm–400 cm; vibration; low cost [92]
  • Camera (2), bicycle helmet: foreground objects; lightweight; 10–20 m; acoustic feedback; low cost [81]
  • Microwave radar (1), cane: floor and suspended obstacles; lightweight; 1–5 m; acoustic and vibration feedback; low cost [93]
  • Infrared (2), foldable stick: high-level and floor-level obstacles, staircases; lightweight; up to 200 cm; audio through earphone; low cost [94]
  • ToF distance sensor (7), belt: low and high obstacles; 8 g; 0–14 m; vibration belt; high cost [83]
  • TFmini Lidar and ultrasound (1 each), smart glasses: obstacles up to 1.7 m height and descending stairs; lightweight; 10 cm–12 m (lidar), 2 cm–300 cm (ultrasound); buzzer and audio; low cost [79]
  • Lidar and web camera (1 each), haptic strap: chair, person, bottle, bicycle; lightweight; range not given; vibration and audio; low cost [95]
  • Asus Xtion Pro camera (1), strap on chest or handcuff: path obstacles; medium mass; 0.8–3.5 m; vibration; high cost [96]
  • Microsoft Kinect (1), strap on neck: path obstacles; medium mass; range not given; audio; medium cost [85]
Table 5. Main features of the obstacle detection shoe systems taken from the reviewed literature.
Smart-Alert Walker [91]
Application: visually impaired | Sensors: ultrasound on the shoes (obstacle), camera on the cane (obstacle), water sensor (moisture) | Processor board: Arduino | Additional device: Smart-fold cane | Additional sensing: water sensing | Accuracy: 95.54% for common obstacles | Alerting: vibration alert at the leg, audio output for the detected obstacles

Smart Bottine [32]
Application: autistic people | Sensors: ultrasonic HC-SR04 (obstacle), infrared (insect), IMU MPU6050 (fall) | Processor board: NodeMCU (ESP8266) | Additional device: smartphone | Additional sensing: insect/reptile detection, fall detection, location tracking | Alerting: smartphone, piezo buzzer

Smart Shoe [33]
Application: visually impaired | Sensors: ultrasonic HC-SR04 (obstacle), infrared (pothole), moisture sensor (water), temperature sensor LM35 (fire) | Processor board: Arduino Nano | Additional device: smartphone | Additional sensing: location tracking, pothole detection, hot-object detection, slippery surface, health tracking | Alerting: voice sent to the user's headphone | Battery life: 3–4 h | Detection range: 20 cm–4 m

Smart Shoe [34]
Application: visually impaired | Sensors: ultrasonic HC-SR04 (obstacle), accelerometer ADXL335 (foot motion) | Processor board: STM32L432KC | Additional device: smartphone | Additional sensing: gait sensing, fall detection | Alerting: vibration motor

Smart Shoe system [36]
Application: visually impaired | Sensors: ultrasonic sensor, fisheye camera | Processor board: Raspberry Pi Zero (streaming, actuation) and smartphone (detection) | Additional device: smartphone | Additional sensing: navigation | Alerting: vibration motor

Shoe System [38,100]
Application: visually impaired | Sensors: ultrasonic (obstacle), infrared (obstacle), force-sensitive resistor (shoe wearing) | Processor board: ARM Cortex-M3 LPC1768 | Additional sensing: detects whether the shoe is worn by the user | Alerting: buzzer

Real Time Assistive Shoe [35]
Application: visually impaired | Sensors: ultrasonic (obstacle), moisture sensor (soil moisture) | Processor board: Renesas RL78/G13 | Additional device: smartphone | Additional sensing: moisture detection, navigation | Alerting: audio output | Detection range: 2 cm–80 cm

Clear Vision Smart Shoes [101]
Application: visually impaired | Sensors: ultrasound HC-SR04 | Processor board: Arduino Nano (ATmega328) | Additional device: knee band | Additional sensing: height detection | Alerting: vibration

Smart Assistive Shoe [102]
Application: visually impaired | Sensors: ultrasound HC-SR04 | Processor board: NodeMCU | Additional device: smartphone | Additional sensing: shoe position finder | Alerting: vibration

Smart Shoe [103]
Application: visually impaired | Sensors: ultrasound | Processor board: Arduino UNO | Additional device: smart glasses | Alerting: vibration

Smart Shoe [104]
Application: visually impaired | Sensors: ultrasound HC-SR04 | Processor board: Arduino Nano | Alerting: buzzer

Obstacle Detection Shoe [30]
Application: visually impaired | Sensors: infrared (obstacle), accelerometer (shoe direction) | Processor board: Arduino 101 board | Additional sensing: shoe direction | Alerting: piezoelectric buzzer

Smart Shoe [98]
Application: visually impaired | Sensors: ultrasound HC-SR04, water sensor (wet), MPU6050 (fall) | Processor board: Arduino Mega | Additional sensing: wet detection, fall detection | Accuracy: 95.33% overall, 98% sensitivity, 5.3% false detection rate | Alerting: audible notification and vibration motors | Battery life: 2 h | Detection range: 2 cm–300 cm

COMPASS [105]
Application: visually impaired | Sensors: ultrasound (obstacle), Raspberry Pi camera (text) | Processor board: ESP32 development board (shoe) and Raspberry Pi (bracelet) | Additional device: smart bracelet, smartphone | Additional sensing: text detection | Alerting: beeper

Blind Shoe [37]
Application: visually impaired | Sensors: ultrasonic (obstacle), water-level sensor (water) | Processor board: Arduino UNO | Additional device: smartphone | Additional sensing: slippery or wet surface | Accuracy: 97.33% | Alerting: buzzer, audio | Detection range: 2 cm–4 m, 15°

BUZZFEET [99]
Application: visually impaired | Sensors: ultrasound (obstacle), infrared (pit) | Processor board: Arduino LilyPad | Additional device: audio processor module | Additional sensing: pit detection | Alerting: audio

NavGuide [31]
Application: visually impaired | Sensors: ultrasound (obstacle), liquid detector sensors (wet floor) | Processor board: customized microcontroller | Additional sensing: wet floor detection | Alerting: audio, vibration | Battery life: 850–1000 min

Fall Prevention Shoes [112]
Application: elderly | Sensors: line laser (obstacle), camera (obstacle, gait) | Additional sensing: gait detection | Alerting: alarm message | Detection range: 0.5–1 m

IPrevent Shoes [106,107]
Application: senior people | Sensors: radar | Processor board: laptop

Smart Shoes [108]
Application: visually challenged | Sensors: ultrasonic sensor | Accuracy: 89.5% | Alerting: tapping at the foot arch | Battery life: 5 h | Detection range: 0–2 m (regular), 0–1 m (crowded)
Table 6. Microcontroller boards used for obstacle detection shoes. The features are taken from the papers and datasheets.
Microcontroller Board | Description | Reference
Arduino: controls, processes, and generates all inputs and outputs. It receives the echo signals from the ultrasonic sensor, which trigger it to check whether an obstacle is present and take further action, and it generates an immediate alert using a buzzer. It also generates a caption for the image captured by a camera and later converts that caption into speech played through an audio device. [91]
Arduino Nano: a small, complete, breadboard-friendly board based on the ATmega328 CPU clocked at 16 MHz, with 2 KB SRAM, 32 KB flash, 22 digital I/O pins, 8 analog pins, and a mini-USB port. [33,101,104,114]
Arduino UNO: equipped with the well-known ATmega328P and the ATmega16U2 processor. [37,103]
Arduino 101 board: contains Bluetooth, a six-axis accelerometer, and a gyroscope, and was used to adjust the detection range of the sensor according to the walking direction. [30]
Arduino Mega: based on the ATmega2560, with 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button. [98,115]
Arduino LilyPad: ATmega328V, 1 KB SRAM, 512 bytes EEPROM, 8 MHz clock speed. The LilyPad is connected to an APR module, a GSM module, and the sensors; it interprets the signals sent by the ultrasonic and IR sensors and responds accordingly. It operates at 5 V and is programmed using the Arduino IDE. [99]
NodeMCU: 3.3 V operating voltage, 80 MHz clock speed, 4 MB flash memory, 64 KB RAM, 11 digital pins, 1 analogue pin, and built-in 802.11 b/g/n Wi-Fi. The ESP8266 LX106 microcontroller on the NodeMCU receives data from the attached sensors, processes them, and uses the uploaded configuration (such as the Wi-Fi network SSID and password) to communicate with the smartphone. [32,102]
STM32L432KC: an ultra-low-power microcontroller based on the high-performance Arm® Cortex®-M4 32-bit RISC core operating at up to 80 MHz with a floating-point unit; 1.71 V to 3.6 V power supply. [34,116]
Raspberry Pi Zero: the board incorporates a quad-core 64-bit Arm Cortex-A53 CPU clocked at 1 GHz, 512 MB of LPDDR2 RAM, 2.4 GHz 802.11 b/g/n wireless LAN, and Bluetooth 4.2. Streaming of the sensor data and actuation on the shoe are handled by the Raspberry Pi, while obstacle detection is performed on the smartphone. [36,117]
ARM Cortex-M3 LPC1768: 512 KB flash memory, 64 KB data memory, a processor frequency of 100 MHz, 13 general-purpose input/output (GPIO) registers, 6 pulse-width modulation (PWM) pins, and an 8-channel 12-bit analog-to-digital converter (ADC). The ultrasonic and infrared data are passed to the ARM Cortex-M3 microcontroller, which determines whether an obstacle is present. [38]
Renesas microcontroller: low power consumption with a supply voltage of 1.6–5.5 V; the clock speed can be varied from 32 kHz to 32 MHz; 64 pins; includes code flash memory, a DMA controller, a high-speed on-chip oscillator, a serial interface, and data flash memory. [35]
ESP32 development board: incorporates both Wi-Fi and Bluetooth, which makes it a good choice for embedded-systems projects. It has a Tensilica Xtensa dual-core 32-bit LX6 microprocessor operating at either 160 or 240 MHz. [105]
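For the NodeMCU-based shoes above, the board's built-in Wi-Fi is what pushes alerts to a phone or caretaker. The sketch below shows one hypothetical way to do this on the ESP8266 with a plain HTTP POST; the network credentials, the listening endpoint, and the message format are placeholders and are not details reported in [32,102].

// Hypothetical ESP8266/NodeMCU sketch of the Wi-Fi alerting pattern described
// above: connect with stored credentials, then POST an alert to a phone-facing
// endpoint. SSID, password, URL, and message are placeholders.
#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>

const char* WIFI_SSID = "home-network";                     // placeholder
const char* WIFI_PASS = "change-me";                        // placeholder
const char* ALERT_URL = "http://192.168.1.50:8080/alert";   // hypothetical listener

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) {   // wait until the shoe is online
    delay(500);
  }
}

void sendAlert(const String& message) {
  WiFiClient client;
  HTTPClient http;
  http.begin(client, ALERT_URL);            // plain HTTP kept short for illustration
  http.addHeader("Content-Type", "text/plain");
  int status = http.POST(message);          // e.g., "obstacle detected"
  Serial.println(status);
  http.end();
}

void loop() {
  // In a real shoe this call would be driven by the sensor/threshold logic,
  // not a fixed timer.
  sendAlert("obstacle detected");
  delay(10000);
}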
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
