Article

Improvement of the Sensor Capability of the NAO Robot by the Integration of a Laser Rangefinder

1 Department of Industrial Engineering, University of Rome Tor Vergata, I-00133 Rome, Italy
2 ENEA—Robotics and Artificial Intelligence Lab, “Casaccia” Research Centre, I-00123 Rome, Italy
* Author to whom correspondence should be addressed.
Appl. Syst. Innov. 2022, 5(6), 105; https://doi.org/10.3390/asi5060105
Submission received: 11 June 2022 / Revised: 27 September 2022 / Accepted: 29 September 2022 / Published: 24 October 2022
(This article belongs to the Special Issue New Trends in Mechatronics and Robotic Systems)

Abstract

This paper focuses on integrating a laser rangefinder system with an anthropomorphic robot (NAO6—Aldebaran, United Robotics Group) to improve its sensory and operational capabilities, as part of a larger project concerning the use of these systems in “assisted living” activities. This additional sensor enables the robot to reconstruct its surroundings by integrating new information with that identified by the on-board sensors. Thus, it can identify more objects in a scene and detect any obstacles along its navigation path. This feature will improve the efficiency of navigation algorithms, increasing movement competence in environments where people live and work. Indeed, these environments are characterized by details and specificities within a range of distances that best suit the new robot design. The integration project consists of two parts: a mechanical part, which provides a helmet for the NAO robot’s head to house the sensor, and a software part, which provides the robot with the drivers needed to integrate the new sensor with its acquisition system. Some experimental results in an actual environment are presented.

1. Introduction

According to well-known and widespread demographic studies, the number of people who have lost their self-sufficiency continues to increase, due to the typical pathologies associated with old age or with disabilities [1,2,3]. In response, a research field known as “assisted living” has developed in recent years to search for adequate solutions to support this section of the population in their everyday life [4,5,6,7]. Different solutions have been suggested to provide this growing proportion of people with proper and effective support tools that enable them to continue living in their own homes while ensuring an adequate degree of self-sufficiency. Among these solutions, the use of anthropomorphic robots is promising [8,9,10].
The acceptability of any IT (information technology) tool by the elderly (and by fragile users in general) depends mainly on the characteristics of its interface. This interface must fit the specific features and limitations of the user [11,12,13,14,15] if the system is to be accepted as a support for fragile users.
Among the various assistive technologies currently available, socially assistive robots (SARs) have proven to be helpful in elderly care. They can effectively assist frail users in continuing to live in their homes [16,17,18,19,20,21,22]. When an anthropomorphic form is added to these robots, the relationship can be further strengthened because user–robot interactions can adopt the same communication channels that people usually use. These robots can adapt to the specific characteristics of each user, using the person’s most efficient communication channel by applying reinforcements tuned to people’s particular abilities [23,24,25,26,27]. In this study, we used the anthropomorphic robot NAO [28], a well-known platform that is famous for its pleasant and reassuring characteristics [29,30,31,32].
In particular, the purpose of this study was to improve the sensory and operational capabilities of the NAO robot, as part of a larger funded project concerning the use of these systems in “assisted living” activities. The addition of a laser rangefinder sensor enabled the robot to reconstruct its surroundings, by integrating new information with that identified by the on-board sensors. As a consequence, the robot can better detail its surroundings, enabling it to recognize new objects and detect any obstacles that may be present in its path, allowing the navigation algorithms to operate more efficiently and increasing the autonomy of movement inside a house. For this purpose, a new helmet that houses a laser rangefinder sensor has been designed to be worn by the robot. This device inputs measured distance data directly into the robot’s memory, making the data available for other tasks, such as localization and navigation.
The developed system has been tested within living environments where people usually live and work and that are characterized by details and specificities within a range of distances that are well-suited to the chosen device.

2. Materials and Methods

2.1. NAO6—Aldebaran, Part of United Robotics Group

The NAO is a commercially available anthropomorphic robot designed to imitate human movement while offering user-friendly interaction (Figure 1). It is a small robot, 58 cm in height and about 5.5 kg in weight, and is one of the most widely used robotic platforms of its kind in research, with numerous applications of innovative social and creative robotics algorithms built on it.
The NAO robot is compatible with the ROS (Robot Operating System) standards [33], which makes it remarkably versatile and relatively easy to use in various applications. Moreover, it can be considered one of the most advanced humanoid robots that is commercially available today and is capable of expressing itself in several languages. Due to these features, the NAO robot represents a good example of an SAR that demonstrates excellent interaction capabilities when employed with autistic children, the elderly, or persons with disabilities, as reported by numerous studies available in the literature [34,35,36,37,38,39,40,41].
In this study, the H25 model (version 6) was used. It is equipped with two 5-megapixel cameras (2592 × 1944 resolution) located in the forehead, capable of a frame rate of up to 30 images/s in high-definition mode. Moreover, to allow interaction with the user through voice commands, the robot is provided with four directional microphones positioned on the head, a speech synthesizer, and two high-fidelity speakers at ear height. The onboard processing unit, devoted to motor control, communications, and ancillary services (e.g., voice recognition), is built around a quad-core Intel Atom E3845 processor at 1.91 GHz, able to address up to 8 GB of memory, and runs a Linux-based operating system. Data exchange with external devices or with the user is provided by an Ethernet port (positioned on the back of the head), a Wi-Fi transceiver, and two infrared transceivers placed on the robot’s head at eye height.
From a mechanical point of view, the NAO is a robot with 25 DoF (degrees of freedom) that uses four types of electric motors, each with a different torque rating, to move its joints. The robot is also provided with sensors to measure the joint positions (i.e., encoders) and the electrical current absorbed by each motor, allowing complete control of its movements. A lithium polymer battery (LiPo; 21.6 V, 2.9 Ah) provides the robot with an operating time that varies from 45 to 120 min, depending on the performed activities.
Regarding the on-board sensors available for localization and obstacle avoidance, the NAO is provided with several families of sensors. An inertial measurement unit (IMU) with six degrees of freedom is located in the middle of the trunk; with this device, the robot can measure its kinematic parameters and thus also evaluate its posture. Furthermore, it is equipped with contact sensors on its surface that act as additional touch inputs and support collision detection: three capacitive touch sensors installed on the head and on each hand, two bumper sensors placed on the front of each foot, and four FSR (force-sensitive resistor) pressure sensors under each foot to measure the applied pressure.
In addition to the two cameras, the robot performs obstacle detection using an ultrasonic (US) system composed of several transmitters and receivers working at 40 kHz that scan the scene in front of it. The position of the nearest obstacle, within a range of 25–255 cm and a cone of 60°, is computed with trigonometric formulas from the two distances measured by the US sensors and the known distance between them. With this measurement technique, no information about the obstacle’s size is available to the robot’s navigation system. Moreover, below the minimum detection distance of 25 cm, the robot cannot measure the obstacle’s distance; the only available information is that something is in its path. In addition, it is worth noting that each measurement takes about 10 ms to complete.
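To make the procedure concrete, the snippet below sketches the kind of triangulation described above: given the two distances returned by the ultrasonic sensors and the known baseline between them, the obstacle position is obtained as the intersection of two circles. The baseline value and the left/right naming are illustrative assumptions, not the NAO’s actual sensor geometry.

```python
import math

def us_obstacle_position(d_left, d_right, baseline):
    """Estimate the (x, y) position of the nearest obstacle from two ultrasonic
    distance readings (same unit, e.g., cm) and the known sensor baseline.
    x is the lateral offset (positive towards the right sensor), y the distance
    in front of the sensor pair."""
    # Sensors at (-baseline/2, 0) and (+baseline/2, 0): the obstacle lies on the
    # intersection of the circles of radius d_left and d_right around them.
    x = (d_left ** 2 - d_right ** 2) / (2.0 * baseline)
    y_sq = d_left ** 2 - (x + baseline / 2.0) ** 2
    if y_sq < 0:
        return None  # inconsistent readings (e.g., each sensor echoed a different object)
    return x, math.sqrt(y_sq)

# Illustrative readings: 120 cm and 121 cm with an assumed 10 cm baseline
print(us_obstacle_position(120.0, 121.0, 10.0))  # ~(-12.1, 119.8) cm
```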
Regarding motion odometry, the NAO robot computes it by exploiting the relative motion of the legs. Since the robot’s feet often slip on the floor, the odometry error can be significant and grows quickly; thus, its use for robot localization returns relatively poor results.

2.2. Autonomous Navigation and “Range Finder” Sensors

There are several critical issues concerning the autonomous navigation of a robot moving in an unknown environment, such as the accurate mapping of the surrounding area and the localization of each nearby object. These issues can be easily solved when the robot navigates outdoors by exploiting Global Navigation Satellite System (GNSS) signals. Indoors or underwater, however, where such a system is unavailable, more complex navigation algorithms are required.
In the most straightforward cases, the robot has complete knowledge of the environment in which it moves, for example, by knowing the map of the environment or by adding distinctive markers within it. However, on a more general level, this information is not available. In this case, the robot must increase its knowledge while navigating, trying to build a detailed model of the surrounding environment, until obtaining a map and locating itself inside it. These operations can be performed online without needing any a priori knowledge of the area and are usually accomplished using specific and highly performing algorithms, known as SLAM (simultaneous localization and mapping) [42,43,44]. These algorithms exploit different mathematical and probabilistic approaches to solve the SLAM problem, but in all cases, their performance strongly depends on the accuracy and reliability of the scene’s measurements.
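To illustrate why the accuracy of the range measurements matters so much to these algorithms, the following sketch shows a minimal log-odds occupancy-grid update from a single range reading, a textbook building block of grid-based SLAM; it is a generic illustration, not the mapping code used on the robot.

```python
import math

def update_occupancy(grid, pose, beam_angle, dist, cell=0.05,
                     l_free=-0.4, l_occ=0.85, max_range=4.0):
    """Log-odds update of a square occupancy grid (2-D list of floats) from one
    range reading taken at pose = (x, y, heading), in meters and radians.
    Cells crossed by the beam become more likely free; the end cell, if the
    beam actually hit something, becomes more likely occupied."""
    x0, y0, th = pose
    hit = dist < max_range
    d = min(dist, max_range)
    steps = int(d / cell)
    for i in range(steps + 1):
        r = i * cell
        cx = int((x0 + r * math.cos(th + beam_angle)) / cell)
        cy = int((y0 + r * math.sin(th + beam_angle)) / cell)
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]):
            grid[cy][cx] += l_occ if (hit and i == steps) else l_free

grid = [[0.0] * 200 for _ in range(200)]            # 10 m x 10 m map at 5 cm/cell
update_occupancy(grid, (5.0, 5.0, 0.0), 0.0, 1.2)   # one beam hitting a wall 1.2 m ahead
```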
As described above, the NAO robot is equipped with a sonar-type sensor used by the robot’s navigation system to carry out obstacle avoidance procedures. Nevertheless, the limited range of this sensor and its long response time result in poor performance, unsuitable for providing the information usually exploited by algorithms such as SLAM. Furthermore, as described above, the NAO’s odometry cannot be effectively employed for this purpose either.
The placement of the cameras on the robot’s head does not produce any overlap between their fields of view, because the design favors a continuous alternation between a forward gaze and a gaze on the feet. Consequently, stereopsis cannot be used to evaluate the distances and dimensions of the objects visible in the acquired images. The extraction of such parameters from monocular images, although possible through proper algorithms [45], shows limited performance and a computational cost that the NAO processor can hardly sustain. Since most of the SLAM algorithms available in the literature rely on measurements obtained by odometry and/or laser rangefinder scans, adding a laser rangefinder appeared necessary to improve the system’s autonomous navigation capabilities.
Currently, two leading families of laser rangefinders (LRFs) are commercially available. They differ in the measurement principle used, as follows:
  • Time of flight measurement;
  • Measurement of phase shift.
The former family comprises systems in which the distance is obtained from a time-of-flight measurement. This class of sensors offers the best performance in terms of precision and reliability; nevertheless, in most cases, its sensors have sizes and weights unsuitable for small robots. Indeed, this family is commonly used in larger robots, mainly designed for outdoor use. The latter family, on the contrary, derives distances from phase-shift measurements. These sensors generally have a much more compact structure and, in addition to considerably lower energy consumption, have reduced weight and dimensions, which makes them more suitable for small robots mainly designed for indoor use.
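The two measurement principles can be summarized by two standard relations: for time of flight, d = c·Δt/2, and for an amplitude-modulated phase-shift sensor, d = c·Δφ/(4π·f_mod), with an unambiguous range of c/(2·f_mod). The snippet below simply evaluates these formulas; the 50 MHz modulation frequency in the example is an illustrative value, not the one used by any specific sensor.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dist_time_of_flight(delta_t_s):
    """Distance from the round-trip time of a light pulse: d = c * dt / 2."""
    return C * delta_t_s / 2.0

def dist_phase_shift(delta_phi_rad, f_mod_hz):
    """Distance from the phase shift of an amplitude-modulated beam:
    d = c * dphi / (4 * pi * f_mod); unambiguous only up to c / (2 * f_mod)."""
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

print(dist_time_of_flight(10e-9))            # ~1.5 m for a 10 ns round trip
print(dist_phase_shift(math.pi / 2, 50e6))   # ~0.75 m at an assumed 50 MHz modulation
```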
Among the several rangefinder devices commercially available and suitable for use with the NAO robot, we chose a sensor belonging to the phase-shift LRF family: the URG-04LX-UG01 (Hokuyo Automatic Co., Ltd., Osaka, Japan), a device widely employed in both industrial and research fields and intended for indoor applications only [46].
This device, depicted in Figure 2, can scan up to a maximum angle of 240°, over distances ranging between 20 and 5600 mm. It has an angular resolution of 0.36° (about 0.0063 rad), i.e., roughly three readings per degree; a maximum of 683 readings per scan are taken between −120° and +120°, each separated from the next by 0.36°. The acquisition time for the entire viewing angle (−120° to +120°) is 100 ms, and the accuracy depends on the measured distance: ±30 mm in the range of 60 mm to 1 m, and ±3% of the measurement between 1 and 4.095 m. The device communicates through Hokuyo’s SCIP 2.0 protocol and can be connected to the host over a native Universal Serial Bus (USB) link or over the same port configured as a serial link. From a mechanical point of view, the Hokuyo rangefinder measures 50 × 50 × 70 mm and weighs 139 g (without cable).
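As a practical illustration of how such a scan is used, the sketch below converts one scan, taken here as a list of 683 distances in millimeters covering the 240° field of view, into Cartesian points in the sensor frame. The exact step-index conventions and framing of the SCIP 2.0 protocol are not reproduced; the only figures used are those quoted above.

```python
import math

FOV_DEG = 240.0   # field of view quoted for the URG-04LX-UG01
N_BEAMS = 683     # readings per scan, as quoted above

def scan_to_points(ranges_mm, min_mm=20, max_mm=5600):
    """Convert one scan (list of N_BEAMS distances in mm) into (x, y) points in
    meters in the sensor frame (x forward, y to the left); readings outside the
    valid range are discarded."""
    points = []
    for i, r in enumerate(ranges_mm):
        if not (min_mm <= r <= max_mm):
            continue
        ang = math.radians(-FOV_DEG / 2.0 + i * FOV_DEG / (N_BEAMS - 1))
        points.append((r / 1000.0 * math.cos(ang), r / 1000.0 * math.sin(ang)))
    return points
```

In essence, this point set is the information plotted in the polar diagrams presented in Section 3.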
A few years ago, the French company that designed and built the NAO robot put a modified NAO head on the market, equipped with a rangefinder sensor [47]. This customization, for several reasons, is no longer available today, despite its undeniable effectiveness [48,49,50].
This work aims to restore this accessory, making it available again to the community of NAO users. It required both mechanical and electrical design work to solve the issues of a suitable power supply and of the mechanical attachment to the robot’s head. Therefore, the study aims to design and manufacture, by 3D printing, a helmet that is wearable by the robot and suitable for multiple applications [51,52]; the idea is to create a scaffold mountable on the robot’s head that could host different types of payload. For example, this could consist of additional sensors, of digital data storage to complement the data already obtained by the robot, or of a secondary battery to extend operational autonomy. Finally, the payload could also consist of an additional high-performance computing unit to run algorithms whose computational cost the robot cannot otherwise sustain.
The following subsections describe the mechanical and electrical design of a helmet customized to host the previously described laser rangefinder sensor.

2.3. Rangefinder Power Supply Circuit Design

Since the Hokuyo rangefinder has a current draw that the robot cannot reliably supply through its USB port, we decided to use an additional battery. Consequently, the output of a lithium polymer battery (LiPo; 3.7 V, 1500 mAh, weighing only 20 g) had to be adapted to the rangefinder’s supply voltage (5 VDC), which required the design and manufacture of a dedicated electronic board for the static voltage conversion and for recharging the battery. Figure 3 shows the design of the electronic board. The manufactured electronic board, including the USB connector, weighs 26 g.
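As a rough sizing check of this supply chain, the expected runtime can be estimated from the battery energy, the sensor load, and the boost-converter efficiency. The figures below (a ~500 mA draw at 5 V and ~85% conversion efficiency) are illustrative assumptions, not measured values for this board.

```python
def runtime_hours(batt_v=3.7, batt_mah=1500, load_v=5.0, load_ma=500, eff=0.85):
    """Rough runtime estimate for a boost-converted LiPo supply: stored energy
    times converter efficiency, divided by the load power."""
    energy_wh = batt_v * batt_mah / 1000.0
    load_w = load_v * load_ma / 1000.0
    return energy_wh * eff / load_w

print(round(runtime_hours(), 2))   # ~1.89 h under these illustrative assumptions
```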

2.4. Design and Manufacturing of the Helmet

The helmet must provide structural support for the rangefinder sensor, which weighs about 140 g (plus the LiPo battery required for its power supply and the associated charging circuit board). Therefore, the piece must have adequate strength to hold the payload weight; consequently, it must be manufactured with a sufficiently dense material. In addition, fixing the different parts requires the use of self-tapping screws. After several tests performed with other materials, a PLA (polylactic acid) filament was chosen, since it offers a reasonable compromise between density, dimensional accuracy, and sustainability.
The designed structure is composed of the following three pieces:
- The rear hooking bracket (Figure 4a). This part is hooked, by two M2 screws (not visible) placed into two holes, onto the bottom of the back of the robot’s head.
- The payload support base (Figure 4b). This part must hold the payload and rests laterally on the robot’s “ears”.
- The payload protection enclosure (Figure 5a). This part hosts the laser rangefinder device and the LiPo battery, together with the electronic board used for recharging. Figure 5b depicts the protection enclosure with the rangefinder laser sensor.
The particular shape of the helmet employed in this design is dictated by the almost complete absence of flat surfaces and any attachment points on the robot’s head. Indeed, the robot skull is essentially a solid of rotation generated by the rotation of a low arch profile around a horizontal axis. Consequently, an object resting on the head tends to rotate and fall at the slightest movement. Therefore, the rear hooking bracket was designed to make use of the holes for two M2 screws located at the base of the occiput, near the robot’s neck, and was hooked onto this part, avoiding any unwanted rotation. In addition, this part was designed to join the payload support base where the rangefinder laser was placed. The dimensions of the rear hooking bracket are 120 × 100 × 100 mm, and the weight is 49 g.
The payload support base is composed of two parts connected through a flange and three screws. This feature makes it easier to mount the piece on the robot’s head, given the low elasticity of the PLA material, which does not tolerate extensive bending. Moreover, the helmet has been designed so that the position of this piece can be finely adjusted to tune the orientation of the rangefinder beam (not only horizontally, but also tilted downward, to detect obstacles on the ground, or upward). For this purpose, the part has two lateral grooves shaped as arcs of a circle, engaged by two screws placed in holes arranged on both top sides of the rear hooking bracket. In this way, the payload support base is free to rotate around an axis passing through the center of the robot’s “ears” and can be held in place by tightening these screws. In particular, it can be tilted by ±10° with respect to the horizontal plane. Its inclination does not vary during the robot’s motion, since it is fixed in advance based on the conformation of the environment or the measurement requirements; a simple geometric estimate of the effect of this tilt is given below. Finally, the payload support base was provided with different holes so that it can house enclosures of various sizes or a small digital camera, through a hole tailored to host a 1/4” screw. The piece’s dimensions are 75 × 142 × 100 mm, and its weight is 132 g.
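The effect of the ±10° tilt adjustment can be estimated with simple geometry: a scanning plane tilted downward by an angle θ from a sensor mounted at height h meets the floor at a distance of roughly d = h/tan(θ) in front of the robot. The 0.55 m mounting height used below is an assumed, illustrative value, not a measured one.

```python
import math

def floor_hit_distance(sensor_height_m, tilt_down_deg):
    """Distance ahead at which a downward-tilted scanning plane meets the floor:
    d = h / tan(theta)."""
    return sensor_height_m / math.tan(math.radians(tilt_down_deg))

print(round(floor_hit_distance(0.55, 10.0), 2))   # ~3.12 m for the assumed height
```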
Figure 5 depicts the payload protection enclosure that was customized to host the laser rangefinder sensor. The enclosure was designed to hold the base of the sensor and presents a wide hole (diameter 52 mm) in the top for the rangefinder’s head, as shown in Figure 5b. In this figure, it is also possible to notice the switch and the USB plug used to connect the LiPo battery to the board or to a USB cable for recharging. The dimensions of the manufactured payload protection enclosure for this kind of sensor are 90 × 70 × 45 mm, and its weight is 41 g.
In summary, the total weight of the helmet’s structure is 222 g. Therefore, the whole weight of the helmet, including the sensor (rangefinder in this case), the battery, the electronic board, and the double USB cable, reaches 248 g. Figure 6 depicts the exploded-view drawing of the helmet that allows us to analyze its mechanical features in more detail.
Figure 7 shows the front, rear and side views of the manufactured helmet worn by the robot on its head. In particular, at the bottom of Figure 7b, it is possible to note a couple of holes that correspond with those located at the base of the occiput of the robot’s head that are required to hook the piece onto the robot using two M2 screws.
In Figure 7c, the flange and the relative three screws needed to assemble the support base and to help to simplify, as described above, the helmet-wearing operation on the robot’s head are shown. Moreover, it is possible to recognize the groove, in the shape of an arc of a circle, and the relative screw, required for its tilt adjustment.

3. Results

The dynamic control of the robot’s movements is based on the knowledge of the mass distribution and moments of inertia of its different solids. These parameters, which must be considered in the calculation of the robot’s direct dynamics, are contained in three terms named mass (M), center of mass (CoM), and inertia matrix (I0) [53]. The addition of the helmet, the laser rangefinder, and the relative accessories on the robot’s head (an added weight of about 250 g) modified the mass distribution with respect to the original configuration. Since this modification could reduce the robot’s stability, balance, and mobility, the effects of the new weight distribution on the inertia matrix of the robot’s head have been studied. Thus, these inertial parameters have been recomputed, considering the actual distribution of the weights and the dimensions of each part belonging to the helmet, with respect to the coordinate system shown in Figure 3a. The new center of mass and inertia matrix are given in (1).
$$
\mathrm{CoM}_S = \begin{bmatrix} 0.00155 & 0 & 0.0734 \end{bmatrix}, \qquad
I_0^S = \begin{bmatrix}
0.00645 & 9.327\times 10^{-6} & 4.632\times 10^{-5} \\
9.327\times 10^{-6} & 0.00621 & 3.265\times 10^{-5} \\
4.632\times 10^{-5} & 3.265\times 10^{-5} & 0.000161
\end{bmatrix} \quad (1)
$$
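The recomputation behind (1) follows the standard composition rules for rigid bodies: the combined center of mass is the mass-weighted average of the parts’ centers, and each part’s inertia tensor is transported to the common reference point with the parallel-axis (Huygens–Steiner) theorem. The sketch below shows this bookkeeping in general form; the masses, offsets, and inertia values in the example are placeholders, not the actual helmet or NAO head data.

```python
import numpy as np

def combine_inertia(parts):
    """Combine rigid parts, each given as (mass_kg, com_xyz_m, inertia_3x3 about
    its own CoM), into the total mass, the combined CoM, and the inertia tensor
    about the combined CoM (parallel-axis theorem)."""
    m_tot = sum(m for m, _, _ in parts)
    com = sum(m * np.asarray(c, float) for m, c, _ in parts) / m_tot
    I_tot = np.zeros((3, 3))
    for m, c, I in parts:
        d = np.asarray(c, float) - com              # offset of this part's CoM
        I_tot += np.asarray(I, float) + m * (np.dot(d, d) * np.eye(3) - np.outer(d, d))
    return m_tot, com, I_tot

# Placeholder example: a "head" body plus a ~0.25 kg payload 7 cm above its CoM
head = (0.60, [0.0, 0.0, 0.00], np.diag([2.6e-3, 2.4e-3, 5.5e-4]))
payload = (0.25, [0.0, 0.0, 0.07], np.diag([3.0e-4, 3.0e-4, 4.0e-4]))
print(combine_inertia([head, payload]))
```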

3.1. Experimental Setup

The first tests performed on the robot aimed to verify that its behavior, in terms of mechanical stability, remained adequate when the helmet was worn. At the same time, possible problems at the different joints and the relative servo motors, due to the greater weight placed on the robot’s head, were investigated. Subsequently, the performance of the rangefinder sensor was compared with that of the ultrasound sensor in several operating scenarios.
Regarding the former issue, several tests were performed by carrying out a series of movements, starting with simple movements of the head only and later letting the robot walk. In these tests, any loss of stability in the robot’s movement was monitored, together with any critical alteration in the servo motors’ current absorption and operating temperatures. As for the second issue, to verify the improvements obtained by integrating the laser rangefinder sensor, the robot wearing the manufactured helmet was first tested to compare the performance of the rangefinder sensor with that of the sonar sensor embedded in the NAO robot (test #1). The second test (test #2) evaluated three scenarios characterized by the presence of both fixed and moving objects. In these scenarios, the NAO robot was stationary or moving, acquiring information about the position of the obstacles in the surrounding environment with both the camera and the rangefinder sensor. All the tests were performed in a domestic-like environment. In particular, in test #1, the robot was placed in front of a wall in a rest position, and the distance from the wall was measured using both the rangefinder sensor on the robot’s head and the ultrasound sensor embedded in the robot’s trunk. Figure 8a depicts this experimental setup.
The three scenarios employed for test #2 are as follows:
  • Scenario #1: The robot NAO is moving in an environment of fixed objects. It is advancing towards an obstacle that consists of a large box placed at an initial distance of 200 cm (Figure 8b).
  • Scenario #2: The robot NAO is standing in an environment with the presence of moving objects (i.e., the Pepper Robot [54]). In particular, it is standing in front of an open door at a distance of 250 cm while the Pepper robot is moving towards it, passing through the door (see Figure 8c).
  • Scenario #3: The NAO robot is advancing in an environment where other objects are also in movement (i.e., the Pepper Robot—see Figure 8d). In particular, it was initially placed in a wide corridor at a distance from the second robot of more than 500 cm (maximum detectable range for the sensor). The acquisition of data began when both robots started to move toward each other.

3.2. Measurement Results

The acquisition procedure was the same in each of the previously described tests: if moving, the robot was stopped for the duration of the data acquisition and started moving again once it was completed. In this way, it was possible to repeat the measurement procedure several times to assess its accuracy. When other robots were involved in the tests, they moved in the same way. In the following polar diagrams, the NAO robot equipped with the laser is located in the center (at coordinates 0,0) and oriented toward 0°.
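A stop–acquire–move cycle of this kind can be organized as a simple loop. In the sketch below, locomotion uses the ALMotion.moveTo call of the NAOqi Python SDK, while read_scan() is only a placeholder for the laser driver developed in this work, whose actual interface is not detailed here; the IP address is likewise a placeholder.

```python
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # placeholder address

def read_scan():
    """Placeholder for the laser driver developed in this work: it should return
    one scan, e.g., as a list of (angle_deg, distance_mm) tuples."""
    raise NotImplementedError

def stop_acquire_move(n_steps=4, step_m=0.5):
    """Acquire a laser scan while stationary, advance one step, and repeat."""
    motion = ALProxy("ALMotion", ROBOT_IP, 9559)
    scans = []
    for _ in range(n_steps):
        scans.append(read_scan())          # robot is standing still while acquiring
        motion.moveTo(step_m, 0.0, 0.0)    # blocking call: walk one step forward
    return scans
```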

3.2.1. Test #1 Measurement Results

Figure 9 depicts the map of the room where the robot was placed for test #1. In particular, it is possible to identify a window located in a recess in the wall on the upper left side of the map, a radiator (depicted in dark gray) located on the upper part of the left wall, and two doors (shown in dashed red in the drawing) on the bottom part of both left and right walls, with the latter left open. The NAO robot was initially placed (test #1a, as shown in Figure 9) at a distance A = 100 cm from the front wall and B = 140 cm from the wall at its right. The next test (test #1b) was performed, maintaining the same distance B and reducing the distance A to 50 cm.
The laser rangefinder was configured with an opening angle of 240° and a detection range between 50 and 300 cm, and the measurement was repeated one hundred times for each position. Figure 10a,b show the polar diagrams obtained at the two distances.
Both polar diagrams in Figure 10 make it possible to recognize the objects in the room. In particular, at 2 m, 60° (as shown in Figure 10a), one can observe the radiator and the recess of the window in the wall, while at 1.5 m, 240°–270° (as shown in Figure 10a), the shape of the open door can be observed. Table 1 reports, for both tests #1a and #1b, the average of the one hundred measurements of the distance in front of the robot and the angle at which the laser rangefinder identified it, together with the same distance measured by the sonar sensor. The angles at which the distance values were determined are given in brackets.
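The averages and variances reported in Table 1 can be obtained by extracting, from each of the one hundred scans, the reading closest to the robot’s heading and then computing the statistics of that series. The sketch below assumes scans given as lists of (angle in degrees, distance in mm) pairs and a ±5° window around the heading; both choices are illustrative, not the processing actually used by the authors.

```python
import statistics

def front_distance(scan, window_deg=5.0):
    """Smallest valid reading within +/-window_deg of the robot's heading.
    `scan` is a list of (angle_deg, distance_mm) tuples."""
    ahead = [d for a, d in scan if abs(a) <= window_deg and d > 0]
    return min(ahead) / 10.0 if ahead else None   # mm -> cm

def front_stats(scans):
    """Mean and variance (in cm) of the front distance over repeated scans."""
    values = [d for d in (front_distance(s) for s in scans) if d is not None]
    return statistics.mean(values), statistics.variance(values)
```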
Figure 11 depicts the two histograms relative to the statistics of the laser rangefinder’s measurements.

3.2.2. Test #2 Measurement Results

Scenario #1

Figure 12 shows the measurement setup and the results of test #2 (scenario #1). The robot was placed at an initial distance of 200 cm (Point A in Figure 12) from one of the corners of a box sized 60 × 50 × 60 cm. The room is the same as test #1 (as shown in Figure 9), but the robot travels horizontally towards the left wall. In particular, the robot advanced towards this obstacle, while the rangefinder device continuously took measurements and stored them in the proper memory area. Moreover, the robot stopped at three specific distances from the box, (B) 150 cm, (C) 100 cm, and (D) 60 cm, to read the actual distances. The corresponding polar diagrams are shown in Figure 12c, while Figure 12b shows the relative images acquired by the robot’s camera.

Scenario #2

Figure 13 depicts the results of test #2 (scenario #2). In particular, it shows the pictures taken by a camera behind the NAO (Figure 13a) and those taken by the robot’s camera (Figure 13b). Figure 13c shows the polar diagram from the data acquired by the rangefinder device. The NAO robot was placed at an initial distance of about 150 cm from an open door from which the Pepper robot entered the room. The NAO robot remained steady for the entire test duration, while the Pepper robot moved. Figure 13c shows the actual detected positions of the Pepper robot (circled in red).

Scenario #3

Figure 14 shows the results of test #2 (scenario #3). It shows the pictures taken by a camera behind the NAO (Figure 14a) and those taken by the robot’s camera (Figure 14b). Figure 14c depicts the polar diagram from the data acquired by the rangefinder device. In this case, the robot was placed in a corridor with stationary obstacles. The NAO robot moved forward in steps of 50 cm, while the Pepper robot advanced toward it, stopping beside it. Figure 14c shows the actual detected positions of the Pepper robot (circled in red).

4. Discussion

The measurements highlighted how the designed helmet with the rangefinder device can provide a wide range of accurate information about the surrounding environment, which is otherwise not retrievable with the robot’s embedded sensors. At the same time, it does not significantly affect the stability of the robot when accomplishing its usual tasks. In each of the performed tests, despite the new mass distribution described in (1), the robot maintained its balance correctly and executed the given commands without errors, both for the rotational movements of the head and for walking, preserving a good level of balance and mobility.
Thanks to the additional battery used to power the sensor, no change in the current drawn from the robot’s onboard battery was observed. Likewise, no anomalous power consumption of the robot’s servo motors was found, and no increase in supply current was detected in any of the stressed joints. Each servo motor is factory-equipped with a current sensor, and each joint has a specified current limit that protects the motor, the electronic board, and the mechanical parts of the joint itself. Thus, if the current reaches the maximum allowed value, the PWM (pulse width modulation) signal returned by the control loop is modified until the current falls back into the allowed range.
Moreover, by using the NAO APIs, it is possible to retrieve the temperature information from each joint, actuator, CPU, and battery and store the value in the proper corresponding ALMemory key. If at least one of the items presents a non-null status, a HotDeviceDetected() event is triggered. In particular, the NAO software provides an alert code to monitor the temperature status for each device, from 0 (meaning a normal temperature) to 3 (meaning that the joints are critically hot). A non-null value of this code implies an automatic correction of the robot’s stiffness, which will be lowered in the case of a status equal to three, until the zero value is reached.
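Such temperature monitoring can be done by polling ALMemory through the NAOqi Python SDK, as sketched below. The ALProxy and getData calls are part of the standard SDK, whereas the joint list, the IP address, and the exact Device/SubDeviceList key names are assumptions to be checked against the robot’s actual memory map.

```python
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # placeholder address
JOINTS = ["HeadYaw", "HeadPitch", "LShoulderPitch", "RShoulderPitch"]  # example subset

def read_temperatures(ip=ROBOT_IP, port=9559):
    """Poll joint temperatures (deg C) and temperature-status codes (0..3) from
    ALMemory. The key paths follow the usual Device/SubDeviceList layout and
    should be verified against the robot's actual memory map."""
    mem = ALProxy("ALMemory", ip, port)
    report = {}
    for joint in JOINTS:
        temp = mem.getData("Device/SubDeviceList/%s/Temperature/Sensor/Value" % joint)
        status = mem.getData("Device/SubDeviceList/%s/Temperature/Sensor/Status" % joint)
        report[joint] = (temp, status)
    return report
```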
With these embedded controls activated during the tested behaviors, no malfunction warning regarding anomalous current or increased temperature of the joints, actuators, CPU, or battery was reported in any of the tests. In addition, the stiffness protection mechanism never intervened, and the robot’s stiffness was maintained throughout the tests.
The device consumption and the temperatures inside the machine remained stable. Thanks to the good balance and appropriate weight of the proposed system, even the movements around the HeadYaw axis, the only degree of freedom allowed for the robot’s head, were smooth and not slowed down. The same holds for the robot’s locomotion, which, as long as the walking speed is kept at the recommended value rather than increased, remained normal and stable. This is the case in the indoor environments where the tests were carried out and where the factory settings of the NAO allow it to operate in complete safety. Since this is a simple characterization of the system, no speed constraint was imposed, and a setup was chosen in which the robot moves, stops and acquires laser data, and then moves again.
The precise adjusting mechanism provided by the helmet allows for the leveling of the rangefinder sensor. Thus, when the helmet is worn and the sensor is adequately oriented, the robot’s head must no longer be pitch rotated, while yaw rotation is freely allowed. This setup, combined with the firmness of the helmet lock, ensures parallelism with the floor.
The new robot–sensor system was verified to study the behavior of the laser rangefinder within a living environment. In particular, the rangefinder device was tested with the designed power supply, and the laser was controlled through the software functions developed by the robot manufacturers. Moreover, in test #1, the reliability of the sonar and laser rangefinder measurements was studied. The measurement was performed at 50 and 100 cm, respectively, and repeated 100 times. The obtained values (Table 1) are within the measurement error given by the manufacturer, and the values obtained by the US sensor show about the same uncertainty as the laser sensor (see Figure 11).
Nevertheless, the critical point is that, unlike the laser measurements, both sonar measurements fall outside their error range. Furthermore, the laser rangefinder provided much richer information about the surrounding environment: the US sensor can only provide single-point distance measurements, whereas the laser rangefinder can analyze the whole plane around the robot. This is particularly visible in the polar diagrams of Figure 10a,b, where the system detected, at the correct distance, the room’s three walls within an angle of 240°. Furthermore, it located the recess of the window and the radiator to the left of the robot (partially visible in the photo in Figure 8a).
In test #2, the robot correctly performed the assigned task, moving without any stability problems in the three scenarios. Figure 12, Figure 13 and Figure 14 depict the results of these tests. As shown in the polar diagrams, the system can detect the presence and relative distance from the robot of the standing and moving objects in the scene. For example, in Figure 12 (scenario #1), it can be observed that the sensor was able to detect at a precise distance, in addition to the side of the box located at 0°, the three walls visible in the detection range of the sensor, the wall recess and the radiator (located between 330° and 360° at a distance of 200 cm).
In the test depicted in Figure 13 (test #2—scenario #2), it is possible to note that the system can detect (see the top polar diagram) the open door and some other obstacles in the scene. In the polar diagrams below it, one can observe the position (circled in red) of the Pepper robot that is advancing towards the NAO. In particular, in the lowest row of the figure, one can note that the rangefinder sensor detects the exact position of the Pepper robot (1 m, 300°), which lies outside the on-board camera’s field of view.
Finally, Figure 14 describes the test where both robots move toward each other. In this case, it is possible to note that the distance from the wall at the left of the robot decreases as the NAO moves and the Pepper robot (whose position is circled in red in the figure) approaches. As in the previous scenario, the rangefinder sensor detects the Pepper robot’s position (50 cm, 60°) outside of the on-board camera’s field of view.

5. Conclusions

This paper described the integration of a laser rangefinder with the NAO robot to improve the robot’s capabilities within unstructured environments. The study involved a mechanical and electrical design project and the development of software drivers to place a laser sensor on the robot’s head for use by its navigation system. This new information, integrated with that already available to the robot, will allow the navigation algorithms to operate more efficiently, increasing the efficiency and autonomy of movement inside an indoor environment such as a house, which is characterized by furniture and specificities within a range of distances that best suit the proposed robot customization.
As proof of this, the tests carried out show that the robot equipped with the new sensor can effectively identify the operating scenario, allowing it to detect any obstacles that may be present in its path. This feature also allows the robot to perform more complex operations, such as context analysis and object recognition. A further development of this work involves performing additional tests in which measurements are acquired while the robot moves, using the newly available information to implement efficient planning and navigation algorithms.

Author Contributions

Data curation, V.B. and A.Z.; Funding acquisition, V.B. and A.Z.; Methodology, V.B. and A.Z.; Software, V.B. and A.Z.; Validation, A.Z.; Writing—original draft, V.B. and A.Z.; Writing—review & editing, V.B. and A.Z. The authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This paper reports research funded under the “Research on the Electrical System” ENEA-MiSE Program Agreement (PTR 2019–2021), with support from the Italian Ministry of Economic Development.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Population Ageing 2019. United Nations. Available online: https://www.un.org/en/development/desa/population/publications/pdf/ageing/WorldPopulationAgeing2019-Highlights.pdf (accessed on 20 September 2022).
  2. Ageing Europe Statistics on Population Developments. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Ageing_Europe_-_statistics_on_population_developments (accessed on 20 September 2022).
  3. Ageing Europe. Looking at the Lives of Older People in the EU. Eurostat 2019 Report. Available online: https://www.age-platform.eu/publications/ageing-europe-looking-lives-older-people-eu-eurostat-2019-report (accessed on 20 September 2022).
  4. Available online: https://www.enea.it/en/international-activities/eu-activities/strategic-initiatives/ambient-assisted-living-aal (accessed on 20 September 2022).
  5. Tinker, A.; Kellaher, L.; Ginn, J.; Ribe, E. Assisted Living Platform—The Long Term Care Revolution; Housing LIN: London, UK, 2013. [Google Scholar]
  6. Savage, N. Robots rise to meet the challenge of caring for old people. Nature 2022, 601, S8–S10. [Google Scholar] [CrossRef] [PubMed]
  7. Botticelli, M.; Monteriù, A.; Zanela, A.; Romano, S. Smart Homes and Assisted Living as an Additional Service Offered to the Users. In Proceedings of the 2019 IEEE 9th International Conference on Consumer Electronics (ICCE-Berlin), Berlin, Germany, 8–11 September 2019. [Google Scholar]
  8. Betriana, F.; Tanioka, R.; Gunawan, J.; Locsin, R.C. Healthcare Robots and Human Generations: Consequences for Nursing and Healthcare. Collegian 2022, 29, 767–773. [Google Scholar] [CrossRef]
  9. Jecker, N.S. You’ve got a friend in me: Sociable robots for older adults in an age of global pandemics. Ethic Inf. Technol. 2021, 23, 35–43. [Google Scholar] [CrossRef] [PubMed]
  10. Franke, A.; Nass, E.; Piereth, A.-K.; Zettl, A.; Heidl, C. Implementation of Assistive Technologies and Robotics in Long-Term Care Facilities: A Three-Stage Assessment Based on Acceptance, Ethics, and Emotions. Front. Psychol. 2021, 12, 694297. [Google Scholar] [CrossRef] [PubMed]
  11. Shuhaiber, A.; Mashal, I. Understanding users’ acceptance of smart homes. Technol. Soc. 2019, 58, 101110. [Google Scholar] [CrossRef]
  12. Daruwala, N.A.; Oberst, U. Individuals’ Intentions to Use Smart Home Technology: The Role of Needs Satisfaction and Frustration, Technology Acceptance and Technophobia. Available online: https://ssrn.com/abstract=4061510 (accessed on 20 September 2022).
  13. Sohn, K.; Kwon, O. Technology acceptance theories and factors influencing artificial Intelligence-based intelligent products. Telemat. Inform. 2020, 47, 101324. [Google Scholar] [CrossRef]
  14. Ben Jemaa, A.; Irato, G.; Zanela, A.; Brescia, A.; Turki, M.; Jaidane, M. Congruent Auditory Display and Front-Back Confusion in Sound Localization: Case of Elderly Driver. J. Transp. Res. Part F Traffic Psychol. Behav. 2018, 59, 524–534. [Google Scholar] [CrossRef]
  15. Pal, D.; Funilkul, S.; Vanijja, V.; Papasratorn, B. Analyzing the Elderly Users’ Adoption of Smart-Home Services. IEEE Access 2018, 6, 51238–51252. [Google Scholar] [CrossRef]
  16. Aymerich-Franch, L. Why it is time to stop ostracizing social robots. Nat. Mach. Intell. 2020, 2, 364. [Google Scholar] [CrossRef]
  17. Li, R.; Oskoei, M.A.; Hu, H. Towards ROS Based Multi-robot Architecture for Ambient Assisted Living. In Proceedings of the 2013 IEEE International Conference on Systems, Man, and Cybernetics, Manchester, UK, 13–16 October 2013; pp. 3458–3463. [Google Scholar] [CrossRef]
  18. Mayer, P.; Beck, C.; Panek, P. Examples of multimodal user interfaces for socially assistive robots in Ambient Assisted Living environments. In Proceedings of the 2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom), Kosice, Slovakia, 2–5 December 2012; pp. 401–406. [Google Scholar] [CrossRef]
  19. Pirhonen, J.; Tiilikainen, E.; Pekkarinen, S.; Lemivaara, M.; Melkas, H. Can robots tackle late-life loneliness? Scanning of future opportunities and challenges in assisted living facilities. Futures 2020, 124, 102640. [Google Scholar] [CrossRef]
  20. Gomez-Donoso, F.; Escalona, F.; Rivas, F.M.; Cañas, J.M.; Cazorla, M. Enhancing the Ambient Assisted Living Capabilities with a Mobile Robot. Comput. Intell. Neurosci. 2019, 2019, 9412384. [Google Scholar] [CrossRef]
  21. Mahmood, S.; Ampadu, K.O.; Antonopoulos, K.; Panagiotou, C.; Mendez, S.A.P.; Podlubne, A.; Antonopoulos, C.; Keramidas, G.; Hübner, M.; Goehringer, D.; et al. Prospects of Robots in Assisted Living Environment. Electronics 2021, 10, 2062. [Google Scholar] [CrossRef]
  22. Bui, H.-D.; Chong, N.Y. An Integrated Approach to Human-Robot-Smart Environment Interaction Interface for Ambient Assisted Living. In Proceedings of the 2018 IEEE Workshop on Advanced Robotics and Its Social Impacts (ARSO), Genova, Italy, 27–29 September 2018; pp. 32–37. [Google Scholar] [CrossRef]
  23. De Freese, V.; Wright, T.; Robalino, I.; Vesonder, G. Robotic Solutions for Eldercare. In Proceedings of the 2019 IEEE 10th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 10–12 October 2019; pp. 389–394. [Google Scholar] [CrossRef]
  24. Jia, Y.; Zhang, B.; Li, M.; King, B.; Meghdari, A. Human-Robot Interaction. Journal of Robotics 2018, 2018, 3879547. [Google Scholar] [CrossRef]
  25. Torta, E.; Oberzaucher, J.; Werner, F.; Cuijpers, R.H.; Juola, J.F. Attitudes toward socially assistive robots in intelligent homes: Results from laboratory studies and field trials. J. Hum. Robot. Interact. 2012, 1, 76–99. [Google Scholar] [CrossRef] [Green Version]
  26. Onyeulo, E.B.; Gandhi, V. What Makes a Social Robot Good at Interacting with Humans? Information 2020, 11, 43. [Google Scholar] [CrossRef] [Green Version]
  27. Keizer, R.A.O.; van Velsen, L.; Moncharmont, M.; Riche, B.; Ammour, N.; Del Signore, S.; Zia, G.; Hermens, H.; N’Dja, A. Using socially assistive robots for monitoring and preventing frailty among older adults: A study on usability and user experience challenges. Health Technol. 2019, 9, 595–605. [Google Scholar] [CrossRef]
  28. Available online: https://www.aldebaran.com/en/nao (accessed on 20 September 2022).
  29. Recio, D.L.; Segura, L.M.; Segura, E.M.; Waern, A. The NAO models for the elderly. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 187–188. [Google Scholar] [CrossRef]
  30. Robaczewski, A.; Bouchard, J.; Bouchard, K.; Gaboury, S. Socially Assistive Robots: The Specific Case of the NAO. Int. J. Soc. Robot. 2021, 13, 795–831. [Google Scholar] [CrossRef]
  31. Blavette, L.; Rigaud, A.-S.; Anzalone, S.M.; Kergueris, C.; Isabet, B.; Dacunha, S.; Pino, M. A Robot-Mediated Activity Using the Nao Robot to Promote COVID-19 Precautionary Measures among Older Adults in Geriatric Facilities. Int. J. Environ. Res. Public Health 2022, 19, 5222. [Google Scholar] [CrossRef]
  32. Andreasson, R.; Alenljung, B.; Billing, E.; Lowe, R. Affective Touch in Human–Robot Interaction: Conveying Emotion to the Nao Robot. Int. J. Soc. Robot. 2018, 10, 473–491. [Google Scholar] [CrossRef] [Green Version]
  33. Robot Operating System. Available online: https://www.ros.org (accessed on 20 September 2022).
  34. Joglekar, P.; Kulkarni, V. Humanoid Robot as a Companion for the Senior Citizens. In Proceedings of the 2018 IEEE Punecon, Pune, India, 30 November–2 December 2018; pp. 1–4. [Google Scholar]
  35. Ghosh, R.; Khan, N.; Migovich, M.; Wilson, D.; Latshaw, E.; Tate, J.A.; Mion, L.C.; Sarkar, N. Iterative User Centered Design of Robot-Mediated Paired Activities for Older Adults with Mild Cognitive Impairment (MCI). In Proceedings of the International Conference on Human-Computer Interaction, Tashkent, Uzbekistan, 20–22 October 2022; Springer: Cham, Switzerland, 2022; pp. 14–28. [Google Scholar]
  36. Arent, K.; Kruk-Lasocka, J.; Niemiec, T.; Szczepanowski, R. Social robot in diagnosis of autism among preschool children. In Proceedings of the 2019 24th International Conference on Methods and Models in Automation and Robotics (MMAR), Międzyzdroje, Poland, 26–29 August 2019; pp. 652–656. [Google Scholar]
  37. Lytridis, C.; Vrochidou, E.; Chatzistamatis, S.; Kaburlasos, V. Social engagement interaction games between children with Autism and humanoid robot NAO. In Proceedings of the 13th International Conference on Soft Computing Models in Industrial and Environmental Applications, San Sebastian, Spain, 6–8 June 2018; Springer: Cham, Switzerland, 2018; pp. 562–570. [Google Scholar]
  38. Assad-Uz-Zaman, M.; Rasedul Islam, M.; Miah, S.; Rahman, M.H. NAO robot for cooperative rehabilitation training. J. Rehabil. Assist. Technol. Eng. 2019, 6, 2055668319862151. [Google Scholar] [CrossRef] [Green Version]
  39. Siddique, T.; Al Marzooqi, R.; Alleem, H.R.; Fareh, R.; Baziyad, M.S.; Elsabe, A.Y.H. Evaluation of UE Exercises using NAO Robot for Poststroke Disabilities. In Proceedings of the 2022 Advances in Science and Engineering Technology International Conferences (ASET), Dubai, United Arab Emirates, 21–24 February 2022; pp. 1–6. [Google Scholar]
  40. Jiménez, M.; Ochoa, A.; Escobedo, D.; Estrada, R.; Martinez, E.; Maciel, R.; Larios, V. Recognition of Colors through Use of a Humanoid Nao Robot in Therapies for Children with down Syndrome in a Smart City. Res. Comput. Sci. 2019, 148, 239–252. [Google Scholar] [CrossRef]
  41. Pino, O.; Palestra, G.; Trevino, R.; De Carolis, B. The Humanoid Robot NAO as Trainer in a Memory Program for Elderly People with Mild Cognitive Impairment. Int. J. Soc. Robot. 2020, 12, 21–33. [Google Scholar] [CrossRef]
  42. Stachniss, C.; Leonard, J.J.; Thrun, S. Simultaneous Localization and Mapping. In Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1153–1176. [Google Scholar]
  43. Wen, S.; Sheng, M.; Ma, C.; Li, Z.; Lam, H.K.; Zhao, Y.; Ma, J. Camera Recognition and Laser Detection based on EKF-SLAM in the Autonomous Navigation of Humanoid Robot. J. Intell. Robot. Syst. 2018, 92, 265–277. [Google Scholar] [CrossRef] [Green Version]
  44. Pagnottelli, S.; Taraglio, S.; Valigi, P.; Zanela, A. Visual and laser sensory data fusion for outdoor robot localisation and navigation. In Proceedings of the 12th International Conference on Advanced Robotics ICAR 2005, Seattle, WA, USA, 18–20 July 2005; pp. 171–177. [Google Scholar]
  45. Hock, P.S.; Parasuraman, S.; Khan, M.; Elamvazuthi, I. Humanoid Robot: Behaviour Synchronization and Depth Estimation. Procedia Comput. Sci. 2015, 76, 276–282. [Google Scholar] [CrossRef] [Green Version]
  46. Available online: https://www.hokuyo-aut.jp/ (accessed on 20 September 2022).
  47. NAO Software. Available online: http://doc.aldebaran.com/1-14/family/robots/laser.html (accessed on 20 September 2022).
  48. Oßwald, S.; Görög, A.; Hornung, A.; Bennewitz, M. Autonomous climbing of spiral staircases with humanoids. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 4844–4849. [Google Scholar]
  49. Gardecki, A.; Podpora, M.; Kawala-Janik, A. Implementation of an External Laser Scanner into Control System of the NAO Robot. IFAC-Pap. OnLine 2018, 51, 231–237. [Google Scholar] [CrossRef]
  50. Fojtů, Š.; Havlena, M.; Pajdla, T. Nao Robot Localization and Navigation Using Fusion of Odometry and Visual Sensor Data. In Proceedings of the International Conference on Intelligent Robotics and Applications, Montreal, QC, Canada, 3–5 October 2012; Su, C.-Y., Rakheja, S., Liu, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 427–438. [Google Scholar]
  51. Chua, C.K.; Leong, K.F. 3D Printing and Additive Manufacturing: Principles and Applications (with Companion Media Pack)-of Rapid Prototyping; World Scientific Publishing Company: Singapore, 2014. [Google Scholar]
  52. Rayna, T.; Striukova, L. From rapid prototyping to home fabrication: How 3D printing is changing business model innovation. Technol. Forecast. Soc. Chang. 2016, 102, 214–224. [Google Scholar] [CrossRef]
  53. NAO Technical Overview. Available online: http://doc.aldebaran.com/2-1/family/robots/masses_robot.html (accessed on 20 September 2022).
  54. Available online: https://www.aldebaran.com/en/pepper (accessed on 20 September 2022).
Figure 1. Robot NAO.
Figure 2. LRF URG-04LX-UG01 by Hokuyo Automatic Co.
Figure 3. Front (a) and back (b) view of the PCB design for the recharge and voltage regulation board.
Figure 4. (a) Rear hooking bracket; (b) payload support base.
Figure 5. Sensor protection enclosure: without (a) and with (b) the rangefinder laser device.
Figure 6. Exploded-view drawing of the helmet’s design: (1) rangefinder laser device, (2) sensor protection enclosure, (3) payload support base, (4) rear hooking bracket, (5) robot’s head, (6) USB connector; (7) LiPo Battery.
Figure 7. Helmet on the robot’s head: (a) front, (b) rear, and (c) side view.
Figure 8. Measurement setups for test #1 (a) and test #2 scenario #1 (b), #2 (c) and #3 (d).
Figure 9. Test #1: map of the room with the robot.
Figure 10. Polar diagrams relative to the acquisition of the rangefinder device with the robot placed at a distance of 100 cm (a) and 50 cm (b) from the wall. The robot is located in the center (at coordinates 0,0) and oriented toward 0°.
Figure 11. Test #1: Histograms for 100 laser rangefinder measurements of the distance between robot and wall. (a) 50 cm (test #1a) and (b) 100 cm (test #1b).
Figure 12. Test #2—scenario #1: (a) measurement setup, (b) pictures taken by the robot’s camera; (c) polar diagram from the data acquired by the rangefinder device at the distances of (A) 200 cm, (B) 150 cm, (C) 100 cm, and (D) 60 cm. The robot is located in the center (at coordinates 0,0) and oriented toward 0°.
Figure 13. Test #2—scenario #2: (a) pictures taken by a camera located behind the NAO, (b) pictures taken by the robot’s camera; (c) polar diagram from the data acquired by the rangefinder device. The robot is located in the center of the diagram (coordinates 0,0) and oriented toward 0°. Circled in red is the position of the Pepper robot.
Figure 14. Test #2—scenario #3: (a) pictures taken by a camera located behind the NAO, (b) pictures taken by the robot’s camera; (c) polar diagram from the data acquired by the rangefinder device. The robot is located in the center of the diagram (coordinates 0,0) and oriented toward 0°. Circled in red is the position of the Pepper robot.
Table 1. Sonar and laser rangefinder distance measures.

Test | Sonar Mean Value | Sonar Variance | Laser Mean Value | Laser Variance
#1a (50 cm) | 51.98 cm (6.6°) | 0.02 cm (0.3°) | 49.9 cm (6.5°) | 0.9 cm (0.1°)
#1b (100 cm) | 102.41 cm (2.2°) | 0.02 cm (0.3°) | 102.5 cm (2.6°) | 1.1 cm (0.1°)
