Review

Four-Dimensional (4D) Millimeter Wave-Based Sensing and Its Potential Applications in Digital Construction: A Review

1 Department of Building and Real Estate, The Hong Kong Polytechnic University, Hong Kong 999077, China
2 State Key Laboratory of Hydraulic Engineering Simulation and Safety, Tianjin University, Tianjin 300350, China
3 School of Economics and Management, Tongji University, Shanghai 200092, China
4 Department of Computing, The Hong Kong Polytechnic University, Hong Kong 999077, China
* Author to whom correspondence should be addressed.
Buildings 2023, 13(6), 1454; https://doi.org/10.3390/buildings13061454
Submission received: 30 April 2023 / Revised: 20 May 2023 / Accepted: 30 May 2023 / Published: 2 June 2023

Abstract: Digital construction relies on effective sensing to enhance the safety, productivity, and quality of its activities. However, current sensing devices (e.g., cameras, LiDAR, infrared sensors) have significant limitations in different aspects. In light of the substantial advantages offered by emerging four-dimensional millimeter wave (4D mmw) technology, it is believed that this technology can overcome these limitations and serve as an excellent complement to current construction sensing methods due to its robust imaging capabilities, spatial sensing abilities, velocity measurement accuracy, penetrability, and weather resistance. To support this argument, a scientometric review of 4D mmw-based sensing is conducted in this study. A total of 213 articles published after the initial invention of 4D mmw technology in 2019 were retrieved from the Scopus database, and six kinds of metadata were extracted from them: the title, abstract, keywords, author(s), publisher, and year. Since some papers lack keywords, the GPT-4 model was used to extract them from the titles and abstracts of these publications. The preprocessed metadata were then integrated using Python and fed into Citespace 6.2.R3 for further statistical, clustering, and co-occurrence analyses. The results reveal that the primary applications of 4D mmw are autonomous driving, human activity recognition, and robotics. Subsequently, the potential applications of this technology in the construction industry are explored, including construction site monitoring, environment understanding, and worker health monitoring. Finally, the challenges of adopting this emerging technology in the construction industry are also discussed.

1. Introduction

The construction industry is a multidisciplinary field where sensing technology plays a crucial role in enhancing the productivity and safety management of modern construction projects [1,2]. In recent years, advancements in smart construction, the Internet of Things (IoT) for construction, and digital twins in construction have increased the demand for sensing technologies [3]. The prevailing sensing methods in construction can be categorized into device-based and device-free approaches. In device-based approaches [4], workers may carry sensors such as GPS receivers, inertial measurement units, and accelerometers, or have them installed on equipment, to detect locations and actions. These methods can accurately capture the condition and status of workers. However, using multiple sensing devices may compromise personal privacy and hinder construction activities, and installing numerous sensors on equipment is a time-consuming process that significantly increases construction costs. By contrast, device-free approaches can sense objects without interfering with them [5,6]. Nowadays, computer vision is the most widely used and effective sensing method in construction [7]. However, its accuracy is highly dependent on lighting and environmental conditions, which limits its ability to sense the motion-related information of objects on harsh construction sites [8]. Additionally, due to its two-dimensional nature, it struggles to perceive three-dimensional information and thus has difficulty measuring object motion accurately. Other commonly used sensors also have limitations in different aspects, as summarized in Table 1.
A millimeter wave (mmw) is a type of electromagnetic wave with a wavelength between 1 and 10 mm and a frequency between 30 and 300 GHz [11,12]. The history and evolution of mmw radar technology can be traced back to the early 20th century, when researchers first started experimenting with radio waves in the millimeter range for telecommunications [13]. Afterwards, the introduction of frequency-modulated continuous wave (FMCW) technology made it possible to apply mmw technology for measuring the range, angle, and velocity (Doppler information) of objects by analyzing the signals they reflect [14]. With over a century of development, mmw technology has been gradually applied in various fields, such as remote sensing, autonomous driving, and security screening [15]. Compared with other sensors, mmw radar technology has the following advantages: high range resolution, penetration, and velocity measurement [16]. However, traditional mmw radars have limitations that include low angular resolution and issues with coverage, making it difficult to sense elevation information and track multiple targets in a wide field of view [17]. To address these limitations, 4D millimeter wave technology was proposed in 2019 and was initially intended for implementation in driver assistance systems. The 4D mmw radar inherits all the advantages of traditional mmw radars, such as robustness and reliability in accurately measuring the distance and velocity of detected objects in harsh environments. What sets the 4D mmw radar apart is its distinct ability to detect elevation; hence, the "4D" aspect consists of range, velocity, azimuth, and elevation [18]. This ground-breaking improvement opens a world of potential for mmw radar applications for various sensing purposes.
In light of the current limitations of sensing technologies in construction, 4D mmw radars are expected to complement existing construction sensing due to their exceptional capabilities with respect to imaging, spatial sensing, velocity measurement, penetration, and all-weather operation [18]. The superiority of 4D mmw radars over other sensors has been demonstrated in studies in fields such as autonomous driving and robotics [19].
As a nascent technology, there is a limited number of studies on the applications of 4D mmw radars. This study aims to comprehensively review their use across various domains and explore the potential of using emerging 4D mmw technology to enhance the current sensing methods in digital construction. Regarding the subsequent sections of this paper, the methodologies of data collection and preprocessing are first presented in Section 2. After that, an overview of the data is presented in Section 3, followed by a summary of the major applications of 4D mmw technology based on previously identified keywords in Section 4. The potential applications of 4D mmw radars in digital construction are then discussed in Section 5, and finally, Section 6 concludes the study.

2. Methodology

The review strategy is illustrated in Figure 1. First, the scope is defined as 4D mmw-based sensing. The second step determines the databases and keywords for collecting publications. The Scopus database was selected due to its wide coverage and well-recognized quality. The query used to search Scopus is shown below.
Query: (title-abs-key (mmw or “millimeter wave” or “millimetre wave” or “millimeter-wave” or “millimetre-wave”) and title-abs-key (radar) and title-abs-key (4D or 3D or “point cloud” or “imaging”) and title-abs-key (sens* or identif* or perceive* or classif* or monitor*)) and pubyear > 2018 and pubyear < 2024.
The query covers two aspects separated by “and”: (1) the scope, i.e., 4D millimeter wave technology, and (2) the sensing-based application. The query considers synonyms, word variations, and combinations; for example, “millimeter” is sometimes written as “millimetre”. Additionally, as the term “4D” was not yet widely accepted at this nascent stage, researchers may use “imaging” as an alternative or simply use “point cloud” to emphasize the major improvement over traditional mmw technology. Sensing may also be expressed as identification, classification, or monitoring. The symbol “*” is a wildcard that matches any sequence of characters; for example, identif* covers identify, identifying, and identification. As mmw technology may also be used for communication in addition to sensing, partial manual screening was required to filter out communication-related papers. In addition, the publishing years of the reviewed papers span from 2019 (when 4D mmw technology was first presented) to 2023 (when this paper was finished).
The metadata of the selected publications can be exported from Scopus, including titles, abstracts, authors, keywords, publishers, and years. Some publishers (e.g., IET Biometrics and several conference proceedings) do not require keywords, which makes it hard to include the corresponding publications in the subsequent co-occurrence analysis in Citespace. To make full use of these data, GPT-4 was applied to extract the keywords of these publications in an objective way. After that, all the generated keywords were exported into an RIS file accepted by Citespace using Python and the rispy package. The prompt is designed as follows:
Prompt: I am investigating the applications of the 4D millimeter wave sensing technology. Please help to summarize the following paragraphs using five keywords. “Title: (title of the publication). Abstract: (abstract of the publication)”.
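A minimal sketch of this preprocessing step is given below. It assumes the Scopus export has already been parsed into a list of dictionaries and that an OpenAI-compatible Python client is available; the field names, model identifier, and file name are illustrative rather than taken from the authors' actual scripts.
```python
# Sketch of the keyword-completion and RIS-export step (illustrative names).
import rispy
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

PROMPT = (
    "I am investigating the applications of the 4D millimeter wave sensing "
    "technology. Please help to summarize the following paragraphs using five "
    "keywords. Title: {title}. Abstract: {abstract}."
)

def complete_keywords(record: dict) -> dict:
    """Fill in missing keywords with GPT-4; keep existing keywords untouched."""
    if not record.get("keywords"):
        reply = client.chat.completions.create(
            model="gpt-4",
            messages=[{
                "role": "user",
                "content": PROMPT.format(title=record["title"],
                                         abstract=record["abstract"]),
            }],
        )
        # The reply is assumed to be a comma-separated list of five keywords.
        record["keywords"] = [kw.strip() for kw in
                              reply.choices[0].message.content.split(",")]
    return record

# `records` stands in for the metadata exported from Scopus.
records = [{"type_of_reference": "JOUR",
            "title": "Example title",
            "abstract": "Example abstract.",
            "keywords": []}]
records = [complete_keywords(r) for r in records]

# Write an RIS file that Citespace can import.
with open("publications.ris", "w", encoding="utf-8") as ris_file:
    rispy.dump(records, ris_file)
```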
After preprocessing, all of the metadata are imported into the Citespace software for further analysis, including (1) statistics, to analyze the distributions of the publications; (2) co-occurrence analysis, to generate the knowledge map of the research topics; and (3) clustering, to investigate the research trends. These three routine analyses facilitate the identification of the major applications of 4D mmw-based sensing in diverse fields and highlight the advantages of 4D mmw technology that researchers discuss most. Based on these findings, the potential applications of this technology in the construction industry can then be explored.

3. Overview of the Selected Publications

In this section, a routine analysis of the existing literature will be performed to examine the distributions, relationships, and trends of studies. These findings will serve as the foundation for identifying the major applications of 4D mmw-based sensing technology.

3.1. Distribution of Articles

A total of 213 documents were collected after keyword searching and manual screening. Figure 2, Figure 3, Figure 4, Figure 5 and Figure 6 show the distribution of the articles by year, journal, subject, and region. Figure 2 shows an obvious increasing trend in related research since 4D mmw technology was presented in 2019, implying that this technology is gaining more and more attention. It can be seen from Figure 3 that most of the studies pertain to the fields of sensors or electronics. Sensors, IEEE Access, and IEEE Sensors Journal are the most significant journals, together accounting for 17.4% of the dataset, indicating that theoretical research remains a valuable focus. A significant proportion of the published literature is attributable to the proceedings of international conferences, indicating the high level of activity and extensive international collaboration in this field. Therefore, it is anticipated that the performance of 4D mmw radars will experience a significant breakthrough in the near future, despite the radars already demonstrating practical applicability. Figure 4 shows the major subjects of the publications, illustrating that about three-quarters of the studies fall in the areas of Computer Science, Engineering, and Physics and Astronomy. Other fields include Mathematics, Earth and Planetary Sciences, Biochemistry, Genetics and Molecular Biology, Chemistry, Decision Science, and Energy. Figure 5 illustrates that China and the US play a dominant role in this field, accounting for over 60% of the studies conducted. Germany, the UK, and Canada also make significant contributions. As shown in Figure 6, most studies are carried out through international collaborations; however, some regional bias remains. For instance, Germany mainly collaborates with European countries, while Asian countries and regions tend to work within their own area.

3.2. Knowledge Map of Topics

In bibliometric analysis, a knowledge map that is represented by a keyword tree is commonly used to obtain an overall understanding of the selected documents. A keyword tree is generated based on the co-occurrence frequency of keywords in the publications and displays the hierarchical structure and evolution of keywords in a research area. The nodes of the tree represent the keywords and the links represent the co-occurrence relations. The size and color of the nodes indicate the frequency and burstiness (especially regarding citations) of the keywords, respectively. As aforementioned, GPT-4 was used to extract the keywords from publications that did not have any. Figure 7 shows an example of applying this process to reference [20].
A total of 1077 keywords comprising 690 distinct terms were collected from the publications. However, many of them refer to the same thing under different names, for example, mmw, millimeter wave, and millimetre wave. Bibliometric software such as Citespace and VOSViewer cannot resolve these terms reliably. To reduce the bias caused by such mismatches, all the keywords were reviewed carefully and the following measures were taken: (1) all keywords were converted into lowercase; (2) keywords with the same meaning were renamed with a unified term, i.e., an alias; (3) keywords irrelevant to this study, such as “article”, “current”, and “vector”, were renamed so that they are excluded from the analysis; (4) an alias list in the .alias format readable by Citespace was created based on the above rules. Some examples are given in Table 2.
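These normalization rules can also be scripted before export. The sketch below applies them directly to the keyword lists in Python; the alias pairs and stopwords are illustrative examples in the spirit of Table 2, not the full list used in the study.
```python
# Illustrative keyword normalization: lowercase, unify synonyms, drop noise terms.
ALIASES = {
    "millimeter wave": "mmw",
    "millimetre wave": "mmw",
    "mm-wave radar": "mmw",
    "cnn": "convolutional neural network",
}
STOPWORDS = {"article", "current", "vector"}  # terms irrelevant to the study

def normalize_keywords(keywords):
    """Lowercase, map synonyms to a unified alias, and drop irrelevant terms."""
    cleaned = []
    for kw in keywords:
        kw = kw.strip().lower()
        kw = ALIASES.get(kw, kw)
        if kw and kw not in STOPWORDS:
            cleaned.append(kw)
    return sorted(set(cleaned))

print(normalize_keywords(["Millimetre Wave", "CNN", "Article", "Point Cloud"]))
# -> ['convolutional neural network', 'mmw', 'point cloud']
```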
After this processing, 520 unique keywords remained, and Figure 8 shows their co-occurrence. The term “mmw” is hidden from the figure as all publications are mmw-related. It can be seen that “deep learning” and “point cloud” appear frequently, and “point cloud” has gained greater centrality over the last two years, indicating that the main functions of 4D mmw-based sensing are realized by these techniques. Other technical terms such as “imaging” and “convolutional neural network” indicate that the resolution of 4D mmw radars is high enough for their data to be processed by imaging-based approaches, and their strong ties to “lidar” and “computer vision” mean they can be coupled with the prevailing vision-based approaches. It can also be seen that “object detection” is the general goal of the research. Additionally, the high frequency of the term “Doppler radar” indicates that Doppler (velocity) information plays an important role in sensing. The applications mainly include autonomous driving, human activity recognition, gesture recognition, and fall detection.
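To make the co-occurrence notion concrete, the sketch below tallies keyword frequencies and pairwise co-occurrences from a few toy records; Citespace computes equivalent statistics internally, so this only illustrates the underlying counting step.
```python
from itertools import combinations
from collections import Counter

# Toy keyword lists standing in for the cleaned records.
records = [
    ["mmw", "point cloud", "deep learning"],
    ["mmw", "deep learning", "object detection"],
    ["mmw", "point cloud", "autonomous driving"],
]

# Keyword frequency (node size) and pairwise co-occurrence (link weight).
freq = Counter(kw for kws in records for kw in kws)
cooc = Counter(frozenset(pair) for kws in records
               for pair in combinations(sorted(set(kws)), 2))

print(freq.most_common(3))
print(cooc.most_common(3))
```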
A log-likelihood ratio-based clustering operation was then performed on the keywords and titles of all the publications, and seven clusters were identified, as shown in Table 3. The results further demonstrate the practical application of 4D mmw technology in spatial sensing, high-resolution object imaging, human activity identification, and contactless vital sign monitoring.

3.3. Research Trend

The top 25 citation bursts with the strongest impact on the research topic were identified and ranked by their duration, as depicted in Table 4. It can be seen that the term “4D mmw radar technology”, which serves as an alias for related technical terms such as FMCW and MIMO, gained significant traction in the early, developmental stages of 4D mmw technology. In the last two years, researchers have shifted their focus towards exploring various applications of this technology, including human posture estimation and robot sensing systems.

4. Summary of Current Applications

According to the above results, particularly the knowledge map and clustering results, as well as manual article checking, four exemplary topics concerning 4D mmw-based sensing applications have been identified for further discussion, including autonomous driving and transport, human activity recognition, robotics, and others.

4.1. Autonomous Driving and Transportation

Traditional mmw technology has been applied in autonomous driving systems for many years, and the initial applications of 4D mmw technology have also primarily been in autonomous driving and transport. It is therefore no surprise that studies in this field remain predominant. When the additional clause “and title-abs-key (transport or driv* or navigat*)” is appended to the initial query presented in Section 2, 61 publications are returned, meaning that about a quarter of the studies are related to autonomous driving and transportation. Compared with the other sensing devices used in autonomous driving, 4D mmw technology can provide real-time imaging of the environment, allowing for the accurate detection of objects and surroundings even in adverse weather conditions. The benefits of using 4D mmw technology in autonomous driving thus include improved safety, reliability, and enhanced performance in challenging environments. The keywords related to “autonomous driving” are shown in Figure 9. Object detection and imaging are recognized as the major missions of 4D mmw technology in this field. Four research directions can be summarized as follows.
(1) Advanced driver-assistance systems (ADASs). The excellent ranging and velocity measurement abilities of 4D mmw radars can be utilized in an ADAS for various features, including (1) adaptive cruise control, which adjusts the vehicle’s speed and distance to the preceding vehicle [21]; (2) lane change assist, which detects vehicles in adjacent lanes and warns the driver of potential collisions [22]; (3) blind spot detection, which monitors the areas behind and beside the vehicle that are not visible to the driver and alerts the driver to nearby vehicles [21,23]; and (4) forward collision warning, which detects imminent collisions with vehicles or obstacles and warns the driver or activates the brakes [24]. Ranging and velocity measurement are the traditional strengths of mmw techniques, while 4D mmw technology can yield the azimuth and elevation as well, thus providing richer measurements.
(2) Object detection and recognition. 4D mmw radars can be used to identify objects such as cars, pedestrians, cyclists, and animals based on high-resolution radar images that contain spatial, Doppler (velocity), and intensity information relevant to the objects [25]. For example, Bai et al. [26] proposed the Radar Transformer, a deep learning network that uses self-attention to fuse the local and global features of the radar point cloud and classify objects into different categories. Paek et al. [27] proposed a 4D mmw dataset named K-Radar and a baseline deep neural network to detect objects on roads. Dalbah et al. [28] developed a lightweight deep learning model named RadarFormer to identify objects in real time.
(3) Object tracking and motion prediction. This can be regarded as an extension of object detection and recognition. Based on Doppler information, 4D mmw technology can be applied to estimate the position, velocity, and even acceleration of moving objects [29]. The prevailing approaches can be divided into two categories: radar-only tracking and multi-sensor fusion tracking [9]. In radar-only tracking, a Kalman filter or deep learning model [30] may be applied to the multiple objects detected by 4D mmw radars (a minimal tracking sketch is given after this list). In multi-sensor fusion studies, data from LiDAR, cameras, and GPS may be combined to improve the accuracy and robustness of object tracking [9,31].
(4) Self-localization and mapping. This task involves the estimation of the position and orientation of the ego-vehicle relative to the surrounding environment and the construction of a map of the environment based on sensor data [32]. Additionally, 4D mmw radars can be used for self-localization and mapping by matching the radar point clouds with a pre-built map or by creating a map online using the simultaneous localization and mapping (SLAM) technique [33,34,35]. Self-localization and mapping can help to improve the accuracy and robustness of autonomous driving by providing reliable information about the vehicle’s location and road geometry.
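As an illustration of the radar-only tracking mentioned in item (3), the sketch below runs a constant-velocity Kalman filter on a single target. It is a simplified example under stated assumptions (Cartesian position measurements plus one Doppler-derived velocity component, illustrative noise covariances), not the pipeline of any cited work.
```python
import numpy as np

dt = 0.05                               # frame interval (s), e.g. a 20 Hz radar
F = np.eye(6)                           # state: [x, y, z, vx, vy, vz]
F[:3, 3:] = dt * np.eye(3)              # position integrates velocity

H = np.zeros((4, 6))                    # measure position and vx (Doppler-derived)
H[:3, :3] = np.eye(3)
H[3, 3] = 1.0

Q = 0.01 * np.eye(6)                    # process noise (illustrative)
R = np.diag([0.1, 0.1, 0.1, 0.05])      # measurement noise (illustrative)

x = np.zeros(6)                         # initial state
P = np.eye(6)                           # initial covariance

def kf_step(x, P, z):
    """One predict-update cycle for a radar measurement z = [x, y, z, vx]."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Example: feed one synthetic detection through the filter.
x, P = kf_step(x, P, np.array([1.2, 0.4, 0.1, 0.8]))
print(x)
```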

4.2. Human Activity Recognition

The application of 4D mmw technology in human activity recognition is significant, as it can be utilized in various aspects of daily life, such as security, smart home systems, and healthcare [36]. Figure 10 shows the high-frequency keywords related to human activity recognition. The specific applications are outlined as follows.
(1) Gesture recognition. Radars based on 4D millimeter waves are capable of recognizing various gestures, including waving, pointing, and nodding, by analyzing the Doppler shifts and angles of the radar signals reflected by different parts of the human body [37,38,39]. This gesture recognition technology can facilitate natural and intuitive interaction between humans and machines, such as controlling smart devices or playing games with simple hand movements [40,41,42,43,44,45].
(2) Posture recognition. By analyzing the range and elevation information of radar signals reflected by the human body, 4D mmw radars are capable of recognizing various postures, such as standing, sitting, lying down, and bending [46,47]. Posture recognition can provide valuable insights into monitoring the daily activities and health status of elderly or disabled individuals, as well as help to detect falls or abnormal behaviors [48,49]. Figure 11 is an example of human posture tracking using a 4D mmw radar (performed by the authors of this paper), where the green and red points are 3D point clouds interpreted from 4D mmw signals.
(3) Vital sign monitoring. By extracting the periodic fluctuations of radar signals caused by cardiac and respiratory motions, 4D mmw radars can continuously and non-invasively monitor vital signs such as heart rate [50,51] and respiration rate [50,52,53]. This technology enables the early detection of cardiac or respiratory anomalies or emergencies, providing valuable health assessments for elderly or disabled individuals [36,54]. Figure 12 shows a case study conducted by the authors of this paper, which aimed to measure the respiration of humans using a mmw radar.
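As a simplified illustration of how a respiration rate can be recovered from such signals, the sketch below estimates the dominant breathing frequency from a synthetic chest-motion phase series; with a real radar, the unwrapped phase of the range bin covering the chest would be extracted first, a device-specific step that is omitted here.
```python
import numpy as np

fs = 20.0                                    # radar frame rate (Hz), illustrative
t = np.arange(0, 60, 1 / fs)                 # one minute of data
phase = 0.3 * np.sin(2 * np.pi * 0.25 * t)   # synthetic chest motion (~15 breaths/min)
phase += 0.02 * np.random.randn(t.size)      # measurement noise

# Search only the plausible respiration band (0.1-0.5 Hz, i.e. 6-30 breaths/min).
spectrum = np.abs(np.fft.rfft(phase - phase.mean()))
freqs = np.fft.rfftfreq(phase.size, d=1 / fs)
band = (freqs >= 0.1) & (freqs <= 0.5)
resp_freq = freqs[band][np.argmax(spectrum[band])]

print(f"Estimated respiration rate: {resp_freq * 60:.1f} breaths per minute")
```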

4.3. Application in Robotics

In robotics, a 4D mmw radar can be integrated as a perception sensor to enable robots to detect and track objects, relocalize themselves, and avoid obstacles in their environment [55,56,57,58], which is similar to the applications in autonomous driving. In addition, some researchers have applied this technology to develop human-following robots [59] and to estimate the velocity of robots in visually degraded environments [60].

4.4. Other Applications

Some other applications of 4D mmw technology can be found in the literature. For instance, due to its exceptional penetration capability, it has been extensively adopted for security screening to detect concealed weapons, explosives, and other dangerous items [18,61]. Additionally, by leveraging the micro-Doppler information provided by mmw signals, biologists can employ 4D mmw radars to identify and quantify airborne insects [62,63], while structural engineers can use them to precisely measure the vibrations of industrial systems [64]. Furthermore, 4D mmw technology can be installed on heavy equipment to detect obstacles in complex working environments.

5. Potential Applications in Digital Construction

5.1. Sensing Requirements on Construction Sites

Before delving into the potential usages of 4D mmw-based sensing in digital construction, it is necessary to first ascertain the specific sensing tasks that are currently required on construction sites.
It is well known that the most talked-about factors on construction sites are safety, productivity, and progress, so there is no doubt that current sensing practices primarily focus on these three aspects. According to the sensing subjects, the sensing practices on construction sites can be divided into three categories: (1) construction site monitoring [65,66], which mainly concerns moving objects on site; (2) environment understanding [67,68], which concerns static objects; and (3) health monitoring [4,69], which mainly pertains to workers. Table 5 summarizes the major sensing practices from these three aspects, along with the corresponding factors of concern on construction sites.
By combining the sensing requirements on construction sites with the major applications of 4D mmw technology that are presented in Section 4, we can state that 4D mmw technology holds great potential in enhancing the construction process due to its superior capabilities over traditional sensors, as elaborated in the following sections.

5.2. Site Monitoring

Site monitoring is crucial for optimizing productivity and ensuring safety in contemporary construction projects. 4D mmw technology can support construction monitoring by providing high-resolution point clouds with Doppler information, which can be used for object classification tasks such as identifying different types of workers and construction equipment based on their radar signatures. This can help monitor the movement and activity of workers and equipment on construction sites and detect potential hazards or conflicts.
As cameras are the most prevalent monitoring devices on modern construction sites, 4D mmw radars could serve as a powerful complement given the susceptibility of cameras to lighting conditions and weather. In challenging situations, as depicted in Figure 13, 4D mmw technology is capable of sensing workers and equipment with high accuracy despite obstacles and poor visibility caused by rain, fog, dust, or mud splashes on the lens. Moreover, the excellent velocity-sensing capability of 4D mmw technology enables effective object tracking on construction sites, surpassing the accuracy of camera-based velocity estimation, which relies on frame-to-frame differencing. The micro-Doppler effect of 4D mmw radar signals can provide micro-motion information of workers or equipment, thus helping to improve traditional activity recognition approaches. Figure 14 shows the authors’ attempts at sensing workers and construction equipment on construction sites using a 4D mmw radar.
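A hedged sketch of the micro-Doppler signature extraction mentioned above is shown below. It computes a short-time Fourier transform of a synthetic slow-time signal from one range bin; with a real 4D mmw radar, this series would come from the device's signal-processing chain, which is not reproduced here.
```python
import numpy as np
from scipy.signal import stft

fs = 1000.0                                  # slow-time sampling rate (Hz), illustrative
t = np.arange(0, 2, 1 / fs)
# Body motion at 2 Hz plus a limb micro-motion at 15 Hz modulating the phase.
signal = np.exp(1j * (2 * np.pi * 2 * t + 0.5 * np.sin(2 * np.pi * 15 * t)))

f, seg_t, Zxx = stft(signal, fs=fs, nperseg=128, return_onesided=False)
spectrogram = np.abs(Zxx)                    # time-frequency map of the micro-Doppler

print(spectrogram.shape)                     # (frequency bins, time segments)
```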

5.3. Environment Understanding

Besides site monitoring, which mainly focuses on dynamic objects, the sensing and understanding of the static objects on construction sites are also important for construction management [67]. In view of the various 3D reconstruction and holography technologies based on 4D mmw technology [70,71], it is possible to create realistic and accurate models of buildings and structures at different stages of the construction process. These models can be used for progress evaluation and safety assessment. Besides playing a complementary role to cameras, 4D mmw radars can also be used to enhance the visual and semantic information of 3D models. Watts et al. [72] rebuilt the 3D shapes of building structures using mmw technologies, and Yang and Zhu [73] reconstructed close-range 3D scenes using mmw technologies.
In the realm of autonomous driving, as previously mentioned, it has been demonstrated that utilizing SLAM algorithms to process point clouds generated from 4D mmw signals can facilitate the reconstruction of the surrounding environment.
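As a toy illustration of environment mapping from radar point clouds, the sketch below rasterizes a synthetic point cloud into a coarse 2D occupancy grid; actual SLAM pipelines such as those cited above are far more involved, so this only conveys the basic idea of turning accumulated radar returns into a map of static structures.
```python
import numpy as np

# Synthetic point cloud; in practice these points would come from accumulated
# 4D mmw radar frames, optionally after SLAM-based registration.
points = np.random.uniform(-10, 10, size=(500, 3))   # x, y, z in metres

resolution = 0.5                                      # cell size (m)
extent = 10.0                                         # half-width of the map (m)
n_cells = int(2 * extent / resolution)
grid = np.zeros((n_cells, n_cells), dtype=int)

ix = ((points[:, 0] + extent) / resolution).astype(int)
iy = ((points[:, 1] + extent) / resolution).astype(int)
valid = (ix >= 0) & (ix < n_cells) & (iy >= 0) & (iy < n_cells)
np.add.at(grid, (ix[valid], iy[valid]), 1)            # count returns per cell

occupied = grid >= 3                                  # simple threshold on returns
print(f"{occupied.sum()} of {n_cells * n_cells} cells marked occupied")
```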

5.4. Health Monitoring of Workers

Monitoring the physical condition of construction workers is important to ensure safe and healthy working activities [74]. First, it can help detect or prevent medical problems, such as respiratory and cardiac abnormalities, that could affect their performance or safety, as well as some occupational diseases. Additionally, it can help ensure the well-being of workers and their colleagues by identifying those who may need medical attention or isolation. Furthermore, it can help reduce accidents that occur due to work stress, poor health conditions, inattention, or falls from heights by alerting supervisors or emergency services when abnormalities are detected [75].
The 4D mmw-based vital sign monitoring technology described above can be used to monitor the health status of construction workers by measuring their respiratory and cardiac motion patterns. It detects the micrometric vibrational movements of the workers, including chest wall movements and heartbeats, thus indicating their levels of fatigue or stress. This allows supervisors to continuously and accurately monitor the vital signs of workers, enabling them to intervene in case of an abnormality or emergency. Furthermore, as the monitoring is non-invasive and contactless, it brings more comfort and convenience to the workers and avoids exposing sensitive personal information such as skin type, clothing, or body shape [76]. Moreover, a 4D mmw radar can operate at a long distance and cover a wide area, which enables the remote and simultaneous monitoring of multiple workers without requiring their explicit consent or cooperation [77]. This can reduce the risk of intrusion or interference from other workers or devices.

5.5. Limitations of Using 4D mmw-Based Sensing in Construction

Although 4D mmw technology holds great potential for applications in construction, its current level of development presents certain limitations that cannot be overlooked, and these limitations can be summarized as follows:
(1) Complex restrictions among parameters. According to the principles of FMCW radar measurement, there are complicated interrelationships and mutual restraints between measurement performances [78], including the maximum measurable distance (dmax), distance resolution (dres), maximum measurable velocity (vmax), velocity resolution (vres), maximum angular field of view (θmax), and angle resolution (θres). For instance, monitoring on construction sites may require a large dmax, which will result in a small vmax, as the maximum bandwidth of a given radar is generally fixed. Similarly, for a given dmax, vmax will be reduced if dres is improved, meaning that the motion information is weakened if attempts are made to enhance the point cloud quality of the objects on construction sites (a numerical illustration of these couplings is given after this list). Therefore, it is challenging to determine the appropriate parameter configuration of a given radar to ensure the accuracy and reliability of the various monitoring activities on construction sites.
(2) High computing requirement. A mmw radar may generate a large amount of data, which require efficient and effective processing and analysis, such as signal processing, feature extraction, object classification, or tracking [79]. As a result, it is difficult to conduct real-time monitoring with ultra-high resolution. This poses a challenge for the design and implementation of 4D mmw radar systems, especially for applications that demand a high level of accuracy and timeliness. Moreover, the high computing requirement also increases the power consumption and cost of mmw radar systems, which may limit their scalability and deployment.
(3) Lack of commercialization. It is important to note that 4D mmw technology is still in the research and development stage and has not yet been widely adopted by the industry. Another challenging issue is that the existing radars are primarily designed for autonomous driving. This means that they may not be suitable for the specific requirements and challenges of construction sensing, such as detecting small movements, measuring deformation, identifying materials, and operating in harsh environments.
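To make the parameter couplings in item (1) concrete, the sketch below evaluates the standard FMCW chirp relations for an illustrative 77 GHz configuration; the numbers are typical textbook values, not the specifications of any particular radar.
```python
# Standard FMCW chirp relations (c: speed of light, B: sweep bandwidth,
# S: chirp slope, Tc: chirp duration, N: chirps per frame, fc: carrier frequency).
c = 3e8                      # speed of light (m/s)
fc = 77e9                    # carrier frequency (Hz)
wavelength = c / fc

B = 1e9                      # sweep bandwidth (Hz)
Tc = 50e-6                   # chirp duration (s)
N = 128                      # chirps per frame
S = B / Tc                   # chirp slope (Hz/s)
f_if_max = 10e6              # maximum IF bandwidth of the receiver (Hz)

d_res = c / (2 * B)                      # distance resolution
d_max = f_if_max * c / (2 * S)           # maximum measurable distance
v_max = wavelength / (4 * Tc)            # maximum unambiguous velocity
v_res = wavelength / (2 * N * Tc)        # velocity resolution

print(f"d_res = {d_res:.2f} m, d_max = {d_max:.1f} m")
print(f"v_max = {v_max:.2f} m/s, v_res = {v_res:.3f} m/s")
# Improving d_res requires a larger B; keeping d_max with a fixed IF bandwidth
# then forces a longer chirp Tc, which lowers v_max: the trade-off noted above.
```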
Nevertheless, technically speaking, the current challenges and issues do not involve significant technical bottlenecks. The growing demand for sensing in autonomous driving and daily human life will expedite the commercialization of 4D mmw radars. In view of the great benefits of 4D mmw technology, it is believed that its application in the construction industry is not far away.

6. Conclusions

Sensing is an increasingly critical task in digital construction, as appropriate sensing approaches are widely acknowledged to enhance the safety, productivity, and quality of construction activities. Considering the limitations of current sensing devices on construction sites, we believe that the 4D mmw technology has great potential for further sensing tasks in construction due to its strong imaging capability, spatial sensing capability, velocity measurement accuracy, penetration, and operational reliability in various weather conditions. Therefore, a comprehensive review of 4D mmw-based sensing was carried out in this study, generating the following four outcomes:
(1) Regarding the methodology, the major innovation is that the large language model GPT-4 is applied, for the first time, to enhance traditional scientometric analysis. It is used to generate keywords for the selected publications that lack them (e.g., those in IET Biometrics and some proceedings). The complete bibliometric workflow involving GPT-4 comprises (1) data searching, (2) metadata extraction, (3) metadata processing using GPT-4 and Python, and (4) Citespace-based analysis.
(2) In total, 213 publications were collected according to the query rules. The publication counts demonstrate a rapid increase in attention towards 4D mmw-based sensing in recent years. The distribution of publishers shows that theoretical research on 4D mmw technology is still a point of focus for researchers, and the significant proportion of international conference papers indicates the high level of activity and extensive international collaboration in this field. The major research efforts fall within Computer Science, Engineering, and Physics and Astronomy. Additionally, China, the US, and Germany are identified as the most significant contributors in this field.
(3) The major applications of 4D mmw-based sensing are identified based on the knowledge map and clustering results from Citespace. As a technology derived from autonomous driving, 4D mmw-based sensing has been widely utilized for ADASs, object detection and recognition, object tracking and motion prediction, and self-localization and mapping. As a high-resolution and robust sensor, the 4D mmw radar has also been applied to human activity recognition, including gesture recognition, posture recognition, and even vital sign monitoring. Robotics is the third application scenario of this technology, in which the radar mainly serves as a perception sensor for detecting and tracking objects, relocalization, and obstacle avoidance. Other applications include security screening and biometrics.
(4) Combining the sensing requirements on construction sites with the major applications of 4D mmw technology, the potential applications of this technology in the construction industry are discussed. Site monitoring is where 4D mmw-based sensing is expected to be most valuable, acting as a powerful complement to the prevailing camera surveillance on construction sites for monitoring the positions and recognizing the activities of workers and heavy equipment. Secondly, the excellent imaging capability of 4D mmw radars allows the 3D environments of construction sites to be rebuilt, thus aiding safety management and progress evaluation. Additionally, the micro-Doppler effect of mmw signals can be used to monitor the health status of workers by detecting their heart rate and respiration.
Although there are still limitations to promoting 4D mmw technology in digital construction due to issues of parameter design, computing, and commercialization, it is evident that 4D mmw technology has developed rapidly in recent years. Furthermore, the increasing demand for sensing and digital technology is facilitating practical applications of 4D mmw technology. Therefore, it is believed that 4D mmw technology will play a significant role in the construction industry in the near future.

Author Contributions

Conceptualization, S.H. and J.Z.; methodology, S.H.; software, J.Z.; validation, Z.S.S.; formal analysis, J.W.; data curation, W.R.; writing—original draft preparation, S.H.; writing—review and editing, J.Z.; visualization, Z.S.S.; supervision, J.Z.; funding acquisition, S.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Start-up Fund for RAPs under the Strategic Hiring Scheme of the Hong Kong Polytechnic University (grant number P0042478) and Internal Research Fund of PolyU (UGC) (grant number P0045933).

Data Availability Statement

The data used in this research were collected from the Scopus database. Readers can retrieve these data using the search parameters provided in the article.

Conflicts of Interest

The authors declare that they have no known competing financial interest or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Awolusi, I.; Nnaji, C.; Marks, E.; Hallowell, M. Enhancing Construction safety monitoring through the application of internet of things and wearable sensing devices: A review. In Computing in Civil Engineering 2019; American Society of Civil Engineers: Atlanta, GA, USA, 13 June 2019; pp. 530–538. [Google Scholar]
  2. Zhang, Y.; Chen, J.; Han, S.; Li, B. Big data-based performance analysis of tunnel boring machine tunneling using deep learning. Buildings 2022, 12, 1567. [Google Scholar] [CrossRef]
  3. Kanan, R.; Elhassan, O.; Bensalem, R. An IoT-based autonomous system for workers’ safety in construction sites with real-time alarming, monitoring, and positioning strategies. Autom. Constr. 2018, 88, 73–86. [Google Scholar] [CrossRef]
  4. Guo, H.; Yu, Y.; Xiang, T.; Li, H.; Zhang, D. The Availability of Wearable-device-based physical data for the measurement of construction workers’ psychological status on site: From the perspective of safety management. Autom. Constr. 2017, 82, 207–217. [Google Scholar] [CrossRef]
  5. Zhang, R.; Jing, X.; Wu, S.; Jiang, C.; Mu, J.; Yu, F.R. Device-Free Wireless sensing for human detection: The deep learning perspective. IEEE Internet Things J. 2021, 8, 2517–2539. [Google Scholar] [CrossRef]
  6. Faulkner, N.; Konings, D.; Alam, F.; Legg, M.; Demidenko, S. Machine learning techniques for device-free localization using low-resolution thermopiles. IEEE Internet Things J. 2022, 9, 18681–18694. [Google Scholar] [CrossRef]
  7. Jeelani, I.; Asadi, K.; Ramshankar, H.; Han, K.; Albert, A. Real-time vision-based worker localization & hazard detection for construction. Autom. Constr. 2021, 121, 103448. [Google Scholar] [CrossRef]
  8. Zhang, S.; Meng, W.; Li, H.; Cui, X. Multimodal spatiotemporal networks for sign language recognition. IEEE Access 2019, 7, 180270–180280. [Google Scholar] [CrossRef]
  9. Zhou, T.; Yang, M.; Jiang, K.; Wong, H.; Yang, D. MMW radar-based technologies in autonomous driving: A review. Sensors 2020, 20, 7283. [Google Scholar] [CrossRef]
  10. Rao, A.S.; Radanovic, M.; Liu, Y.; Hu, S.; Fang, Y.; Khoshelham, K.; Palaniswami, M.; Ngo, T. Real-time monitoring of construction sites: Sensors, methods, and applications. Autom. Constr. 2022, 136, 104099. [Google Scholar] [CrossRef]
  11. van Berlo, B.; Elkelany, A.; Ozcelebi, T.; Meratnia, N. Millimeter Wave Sensing: A review of application pipelines and building blocks. IEEE Sens. J. 2021, 21, 10332–10368. [Google Scholar] [CrossRef]
  12. Ren, W.; Qi, F.; Foroughian, F.; Kvelashvili, T.; Liu, Q.; Kilic, O.; Long, T.; Fathy, A.E. Vital sign detection in any orientation using a distributed radar network via modified independent component analysis. IEEE Trans. Microw. Theory Tech. 2021, 69, 4774–4790. [Google Scholar] [CrossRef]
  13. Wang, X.; Kong, L.; Kong, F.; Qiu, F.; Xia, M.; Arnon, S.; Chen, G. Millimeter wave communication: A comprehensive survey. IEEE Commun. Surv. Tutor. 2018, 20, 1616–1653. [Google Scholar] [CrossRef]
  14. Song, Y.; Jin, T.; Dai, Y.; Song, Y.; Zhou, X. Through-Wall human pose reconstruction via UWB MIMO radar and 3D CNN. Remote Sens. 2021, 13, 241. [Google Scholar] [CrossRef]
  15. Patel, V.M.; Mait, J.N.; Prather, D.W.; Hedden, A.S. Computational millimeter wave imaging: Problems, progress, and prospects. IEEE Signal Process. Mag. 2016, 33, 109–118. [Google Scholar] [CrossRef]
  16. Venon, A.; Dupuis, Y.; Vasseur, P.; Merriaux, P. Millimeter Wave FMCW RADARs for Perception, Recognition and Localization in Automotive Applications: A Survey. IEEE Trans. Intell. Veh. 2022, 7, 533–555. [Google Scholar] [CrossRef]
  17. Zhang, W.; Wang, P.; He, N.; He, Z. Super Resolution DOA Based on relative motion for fmcw automotive radar. IEEE Trans. Veh. Technol. 2020, 69, 8698–8709. [Google Scholar] [CrossRef]
  18. Zhang, W.; Martinez-Lorenzo, J.A. Toward 4-D imaging of on-the-move object at 2500 volumetric frames per second by software-defined millimeter-wave MIMO with compressive reflector antenna. IEEE Trans. Microw. Theory Tech. 2023, 71, 1337–1347. [Google Scholar] [CrossRef]
  19. Bai, J.; Li, S.; Tan, B.; Zheng, L.; Huang, L.; Dong, L. Traffic participants classification based on 3d radio detection and ranging point clouds. IET Radar Sonar Navig. 2022, 16, 278–290. [Google Scholar] [CrossRef]
  20. Li, T.; Zhao, Z.; Luo, Y.; Ruan, B.; Peng, D.; Cheng, L.; Shi, C. Gait recognition using spatio-temporal information of 3D point cloud via millimeter wave radar. Wirel. Commun. Mob. Comput. 2022, 2022, 4555136. [Google Scholar] [CrossRef]
  21. Kedzia, J.C.; Strand, B.; Abenius, E. Simulation of automotive radar sensors extended to 77 GHz long range detection. In GeMiC 2014: German Microwave Conference; VDE Verlag GmbH: Aachen, Germany, 2019. [Google Scholar]
  22. Liu, Y.; Liu, Y. A data fusion model for millimeter-wave radar and vision sensor in advanced driving assistance system. Int. J Automot. Technol. 2021, 22, 1695–1709. [Google Scholar] [CrossRef]
  23. Maruta, K.; Takizawa, M.; Fukatsu, R.; Wang, Y.; Li, Z.; Sakaguchi, K. Blind-Spot visualization via AR glasses using millimeter-wave v2x for safe driving. In Proceedings of the 2021 IEEE 94th Vehicular Technology Conference (VTC2021-Fall), Norman, OK, USA, 27 September–1 October 2021; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2021; Volume 2021. [Google Scholar]
  24. Long, T.; Li, Y.; Zhang, W.; Liu, Q.; Chen, X.; Tian, W.; Yang, X. Wideband Radar System Applications. In Wideband Radar; Springer Nature: Singapore, 2022; pp. 173–197. [Google Scholar]
  25. Gu, Y.; Meng, S.; Shi, K. Radar-enhanced image fusion-based object detection for autonomous driving. In Proceedings of the 2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), Xi’an, China, 25–27 October 2022; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2022. [Google Scholar]
  26. Bai, J.; Zheng, L.; Li, S.; Tan, B.; Chen, S.; Huang, L. Radar transformer: An object classification network based on 4D mmw imaging radar. Sensors 2021, 21, 3854. [Google Scholar] [CrossRef]
  27. Paek, D.H.; Kong, S.H.; Wijaya, K.T. K-Radar: 4D radar object detection for autonomous driving in various weather conditions. In Proceedings of the Thirty-Sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track 2022, Virtual, 28 November 2022; pp. 1–11. [Google Scholar]
  28. Dalbah, Y.; Lahoud, J.; Cholakkal, H. RadarFormer: Lightweight and accurate real-time radar object detection model. In Image Analysis: 23rd Scandinavian Conference, SCIA 2023, Sirkka, Finland, 18–21 April 2023, Proceedings, Part I; Springer Nature: Cham, Switzerland, 2023; pp. 341–358. [Google Scholar]
  29. Jiang, M.; Xu, G.; Pei, H.; Feng, Z.; Ma, S.; Zhang, H.; Hong, W. 4D High-resolution imagery of point clouds for automotive MmWave radar. IEEE Trans. Intell. Transp. Syst. 2023, 1–15. [Google Scholar] [CrossRef]
  30. Konev, S.; Brodt, K.; Sanakoyeu, A. MotionCNN: A strong baseline for motion prediction in autonomous driving. arXiv 2022, arXiv:2206.02163. [Google Scholar]
  31. Wang, Y.; Guan, Y.; Li, S.; Wu, J.; Cheng, H. Fusion perception of vision and millimeter wave radar for autonomous driving. In Proceedings of the 8th International Conference on Computing and Artificial Intelligence 2022, Tianjin, China, 18–21 March 2022; pp. 767–772. [Google Scholar] [CrossRef]
  32. Sun, B.; Gao, S.; Zi, H.; Wu, Q. GAN based simultaneous localization and mapping framework in dynamic environment. J. King Saud Univ. Sci. 2022, 34, 102298. [Google Scholar] [CrossRef]
  33. Li, Y.; Liu, Y.; Wang, Y.; Lin, Y.; Shen, W. The millimeter-wave radar slam assisted by the RCS feature of the target and IMU. Sensors 2020, 20, 5421. [Google Scholar] [CrossRef] [PubMed]
  34. Dang, X.; Rong, Z.; Liang, X. Sensor fusion-based approach to eliminating moving objects for slam in dynamic environments. Sensors 2021, 21, 230. [Google Scholar] [CrossRef] [PubMed]
  35. Dang, X.; Liang, X.; Li, Y.; Rong, Z. Moving objects elimination towards enhanced dynamic SLAM fusing LiDAR and MmW-Radar. In Proceedings of the 2020 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Linz, Austria, 23 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–4. [Google Scholar] [CrossRef]
  36. Ha, U.; Assana, S.; Adib, F. Contactless seismocardiography via deep learning radars. In Proceedings of the 26th Annual International Conference on Mobile Computing and Networking 2020, London, UK, 21–25 September 2020; pp. 1–14. [Google Scholar] [CrossRef]
  37. Li, Y.; Gu, C.; Mao, J.F. A 4D gesture sensing technique based on spatiotemporal detection with a 60 GHz FMCW MIMO Radar. In Proceedings of the 2021 IEEE MTT-S International Microwave Symposium (IMS), Atlanta, GA, USA, 6–11 June 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 462–465. [Google Scholar]
  38. Feng, Y.; Wang, C.; Xie, L.; Lu, S. A fine-grained gesture tracking system based on millimeter-wave. CCF Trans. Pervasive. Comp. Interact. 2022, 4, 357–369. [Google Scholar] [CrossRef]
  39. Chen, Q.; Li, Y.; Cui, Z.; Cao, Z. A hand gesture recognition method for Mmwave radar based on angle-range joint temporal feature. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; Institute of Electrical and Electronics Engineers Inc. (IEEE): Piscataway, NJ, USA, 2022; pp. 2650–2653. [Google Scholar]
  40. Li, W.; Jiang, J.; Yao, Y.; Liu, D.; Gao, Y.; Li, Q. Hand character gesture recognition based on a single millimetre-wave radar chip. IET Radar Sonar Navig. 2022, 16, 208–223. [Google Scholar] [CrossRef]
  41. Xie, H.; Han, P.; Li, C.; Chen, Y.; Zeng, S. Lightweight midrange arm-gesture recognition system from MmWave radar point clouds. IEEE Sens. J. 2023, 23, 1261–1270. [Google Scholar] [CrossRef]
  42. Xia, Z.; Luomei, Y.; Zhou, C.; Xu, F. Multidimensional feature representation and learning for robust hand-gesture recognition on commercial millimeter-wave radar. IEEE Trans. Geosci. Remote Sens. 2021, 59, 4749–4764. [Google Scholar] [CrossRef]
  43. Palipana, S.; Salami, D.; Leiva, L.A.; Sigg, S. Pantomime: Mid-air gesture recognition with sparse millimeter-wave radar point clouds. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 1–27. [Google Scholar] [CrossRef]
  44. Hazra, S.; Santra, A. Short-range radar-based gesture recognition system using 3D CNN with triplet loss. IEEE Access 2019, 7, 125623–125633. [Google Scholar] [CrossRef]
  45. Zhang, G.; Lan, S.; Zhang, K.; Ye, L. Temporal-range-doppler features interpretation and recognition of hand gestures using MmW FMCW radar sensors. In Proceedings of the 2020 14th European Conference on Antennas and Propagation (EuCAP), Copenhagen, Denmark, 15–20 March 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–4. [Google Scholar]
  46. Zhao, Y.; Yarovoy, A.; Fioranelli, F. Angle-insensitive human motion and posture recognition based on 4D imaging radar and deep learning classifiers. IEEE Sens. J. 2022, 22, 12173–12182. [Google Scholar] [CrossRef]
  47. Kong, H.; Xu, X.; Yu, J.; Chen, Q.; Ma, C.; Chen, Y.; Chen, Y.C.; Kong, L. M3Track: MmWave-based multi-user 3D posture tracking. In Proceedings of the 20th Annual International Conference on Mobile Systems, Applications and Services 2022, Portland, Oregon, 27 June–1 July 2022; pp. 491–503. [Google Scholar]
  48. Wu, J.; Cui, H.; Dahnoun, N. A voxelization algorithm for reconstructing MmWave radar point cloud and an application on posture classification for low energy consumption platform. Sustainability 2023, 15, 3342. [Google Scholar] [CrossRef]
  49. Sengupta, A.; Jin, F.; Zhang, R.; Cao, S. Mm-Pose: Real-time human skeletal posture estimation using MmWave radars and CNNs. IEEE Sens. J. 2020, 20, 10032–10044. [Google Scholar] [CrossRef]
  50. Lai, J.; Sun, Y.; Luo, Z.; Yang, Y. 3D printed lens antenna for contactless heartbeat and respiration detection using Mm-wave radar sensing. In Proceedings of the 2022 IEEE MTT-S International Microwave Biomedical Conference (IMBioC), Suzhou, China, 16–18 May 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 180–182. [Google Scholar]
  51. Wu, J.; Dahnoun, N. A health monitoring system with posture estimation and heart rate detection based on millimeter-wave radar. Microprocess. Microsyst. 2022, 94, 104670. [Google Scholar] [CrossRef]
  52. Liu, L.; Zhang, S.; Xiao, W. Non-contact vital signs detection using Mm-wave radar during random body movements. In Proceedings of the 2021 IEEE 16th Conference on Industrial Electronics and Applications (ICIEA), Chengdu, China, 1–4 August 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1244–1249. [Google Scholar]
  53. Shah, S.A.; Shah, S.Y.; Shah, S.I.; Haider, D.; Tahir, A.; Ahmad, J. Identifying elevated and shallow respiratory rate using MmWave radar leveraging machine learning algorithms. In Proceedings of the 2019 International Conference on Advances in the Emerging Computing Technologies (AECT) 2020, Al Madinah Al Munawwarah, Saudi Arabia, 10 February 2020; pp. 1–4. [Google Scholar]
  54. Prat, A.; Blanch, S.; Aguasca, A.; Romeu, J.; Broquetas, A. Collimated beam FMCW radar for vital sign patient monitoring. IEEE Trans. Antennas Propag. 2019, 67, 5073–5080. [Google Scholar] [CrossRef] [Green Version]
  55. Wang, L.; Zhang, X.; Xv, B.; Zhang, J.; Fu, R.; Wang, X.; Zhu, L.; Ren, H.; Lu, P.; Li, J.; et al. InterFusion: Interaction-based 4D radar and LiDAR fusion for 3D object detection. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 12247–12253. [Google Scholar]
  56. Cheng, Y.; Su, J.; Jiang, M.; Liu, Y. A novel radar point cloud generation method for robot environment perception. IEEE Trans. Robot. 2022, 38, 3754–3773. [Google Scholar] [CrossRef]
  57. Wu, J.; Gao, J.; Yi, J.; Liu, P.; Xu, C. Environment perception technology for intelligent robots in complex environments: A Review. In Proceedings of the 2022 7th International Conference on Communication, Image and Signal Processing (CCISP), Chengdu, China, 18–20 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 479–485. [Google Scholar]
  58. Cheng, Y.; Pang, C.; Jiang, M.; Liu, Y. Relocalization based on millimeter wave radar point cloud for visually degraded environments. J. Field Robot. 2023, 40, 901–918. [Google Scholar] [CrossRef]
  59. Zhu, Y.; Wang, T.; Zhu, S. A novel tracking system for human following robots with fusion of MMW radar and monocular vision. Ind. Robot. Int. J. Robot. Res. Appl. 2022, 49, 120–131. [Google Scholar] [CrossRef]
  60. Kramer, A.; Stahoviak, C.; Santamaria-Navarro, A.; Agha-Mohammadi, A.; Heckman, C. Radar-inertial Ego-velocity estimation for visually degraded environments. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 5739–5746. [Google Scholar]
  61. Gui, S.; Yang, Y.; Li, J.; Zuo, F.; Pi, Y. THz radar security screening method for walking human torso with multi-angle synthetic aperture. IEEE Sens. J. 2021, 21, 17962–17972. [Google Scholar] [CrossRef]
  62. Nithin, S.B. Researching Feasibility of MmWave Solutions for Insect Detection Applications. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2021. [Google Scholar]
  63. Noskov, A.; Bendix, J.; Friess, N. A review of insect monitoring approaches with special reference to radar techniques. Sensors 2021, 21, 1474. [Google Scholar] [CrossRef] [PubMed]
  64. Jiang, C.; Guo, J.; He, Y.; Jin, M.; Li, S.; Liu, Y. MmVib: Micrometer-level vibration measurement with Mmwave radar. In Proceedings of the 26th Annual International Conference on Mobile Computing and Networking, London, UK, 21–25 September 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
  65. Luo, X.; Li, H.; Yang, X.; Yu, Y.; Cao, D. Capturing and understanding workers’ activities in far-field surveillance videos with deep action recognition and bayesian nonparametric learning. Comput. Aided Civ. Infrastruct. Eng. 2019, 34, 333–351. [Google Scholar] [CrossRef]
  66. Yan, X.; Zhang, H.; Li, H. Computer vision-based recognition of 3D relationship between construction entities for monitoring struck-by accidents. Comput. Aided Civ. Infrastruct. Eng. 2020, 35, 1023–1038. [Google Scholar] [CrossRef]
  67. Arashpour, M.; Ngo, T.; Li, H. Scene understanding in construction and buildings using image processing methods: A comprehensive review and a case study. J. Build. Eng. 2021, 33, 101672. [Google Scholar] [CrossRef]
  68. Liu, Z.; Kim, D.; Lee, S.; Zhou, L.; An, X.; Liu, M. Near Real-time 3D reconstruction and quality 3d point cloud for time-critical construction monitoring. Buildings 2023, 13, 464. [Google Scholar] [CrossRef]
  69. Li, H.; Lu, M.; Hsu, S.C.; Gray, M.; Huang, T. Proactive behavior-based safety management for construction safety improvement. Saf. Sci. 2015, 75, 107–117. [Google Scholar] [CrossRef]
  70. Wang, M.; Wei, S.; Liang, J.; Liu, S.; Shi, J.; Zhang, X. Lightweight FISTA-Inspired sparse reconstruction network for mmw 3-D holography. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–20. [Google Scholar] [CrossRef]
  71. Sun, Y.; Huang, Z.; Zhang, H.; Cao, Z.; Xu, D. 3DRIMR: 3D reconstruction and imaging via mmwave radar based on deep learning. In Proceedings of the 2021 IEEE International Performance, Computing, and Communications Conference (IPCCC), Austin, TX, USA, 29–31 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–8. [Google Scholar]
  72. Watts, C.M.; Pedross-Engel, A.; Reynolds, M.S. Through-wall k-Band and v-Band synthetic aperture radar imaging of building structures and utility infrastructure. In Passive and Active Millimeter-Wave Imaging XXII; SPIE: Bellingham, WA, USA, 2019; Volume 10994, pp. 68–77. [Google Scholar]
  73. Yang, Z.; Zhu, Z. An Ego-motion estimation method using millimeter-wave radar in 3D scene reconstruction. In Proceedings of the 2022 14th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 20–21 August 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 18–21. [Google Scholar]
  74. Xing, X.; Li, H.; Li, J.; Zhong, B.; Luo, H.; Skitmore, M. A Multicomponent and neurophysiological intervention for the emotional and mental states of high-altitude construction workers. Autom. Constr. 2019, 105, 102836. [Google Scholar] [CrossRef]
  75. Ahn, C.R.; Lee, S.; Sun, C.; Jebelli, H.; Yang, K.; Choi, B. Wearable sensing technology applications in construction safety and health. J. Constr. Eng. Manag. 2019, 145, 03119007. [Google Scholar] [CrossRef]
  76. Wang, Y.; Wang, W.; Zhou, M.; Ren, A.; Tian, Z. Remote monitoring of human vital signs based on 77-GHz Mm-Wave FMCW radar. Sensors 2020, 20, 2999. [Google Scholar] [CrossRef] [PubMed]
  77. Vorobyov, A.; Daskalaki, E.; Roux, E.L.; Farserotu, J.; Dallemagne, P. Contactless vital signs sensing: A survey, preliminary results and challenges. In Proceedings of the 2020 XXXIIIrd General Assembly and Scientific Symposium of the International Union of Radio Science, Rome, Italy, 29 August–5 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–4. [Google Scholar]
  78. Jardak, S.; Alouini, M.-S.; Kiuru, T.; Metso, M.; Ahmed, S. Compact MmWave FMCW radar: Implementation and performance analysis. IEEE Aerosp. Electron. Syst. Mag. 2019, 34, 36–44. [Google Scholar] [CrossRef]
  79. Sun, S.; Zhang, Y.D. 4D automotive radar sensing for autonomous vehicles: A sparsity-oriented approach. IEEE J. Sel. Top. Signal Process. 2021, 15, 879–891. [Google Scholar] [CrossRef]
Figure 1. Methodology of this research.
Figure 2. Articles distributed by year.
Figure 3. Articles distributed by journal.
Figure 4. Articles distributed by subject.
Figure 5. Articles distributed by region.
Figure 6. Relationship between the articles published in different regions.
Figure 7. An example of the process used to extract keywords using GPT-4.
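For publications without author-supplied keywords, keywords were extracted from the titles and abstracts using GPT-4 (Figure 7). As an illustration of how such a prompt-based step might be scripted, a minimal Python sketch is given below; it assumes the OpenAI Python SDK (v1.x) with an OPENAI_API_KEY set in the environment, and the prompt wording and the helper name extract_keywords are our own placeholders rather than the exact procedure shown in Figure 7.

```python
# Illustrative sketch only: prompt-based keyword extraction with GPT-4.
# Assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY in the environment;
# the prompt text and function name are hypothetical, not the ones used in this review.
from openai import OpenAI

client = OpenAI()

def extract_keywords(title: str, abstract: str, n_keywords: int = 5) -> list[str]:
    prompt = (
        f"Extract {n_keywords} concise research keywords from the following paper.\n"
        f"Title: {title}\nAbstract: {abstract}\n"
        "Return the keywords as a single comma-separated line."
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # deterministic output so the extracted metadata is reproducible
    )
    # Split the comma-separated reply into a clean, lower-cased keyword list
    reply = response.choices[0].message.content
    return [kw.strip().lower() for kw in reply.split(",") if kw.strip()]

# Example call on hypothetical metadata:
# extract_keywords("4D mmWave radar odometry", "We propose a radar-only odometry method ...")
```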
Figure 8. Co-occurrence of keywords.
Figure 9. Keyword maps regarding the field of autonomous driving.
Figure 10. Keyword map regarding the field of human activity recognition.
Figure 11. Posture tracking using a 4D mmw radar.
Figure 12. Measuring respiration using a mmw radar.
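Figure 12 relies on the standard FMCW phase-based vital-sign principle described in refs. [76,77]: the millimetre-scale chest displacement modulates the phase of the reflected chirp, and the respiration rate is read from the spectrum of that phase signal. The relation below is textbook background summarized by us, not a derivation reproduced from those works.

```latex
% Standard FMCW phase-displacement relation (textbook background; symbols defined here, not taken from the paper).
% A chest displacement x(t) changes the round-trip path by 2x(t), so the unwrapped phase of the beat signal is
\[
  \varphi(t) = \frac{4\pi\, x(t)}{\lambda},
  \qquad
  \lambda = \frac{c}{f_c} \approx 3.9\ \text{mm at } f_c = 77\ \text{GHz},
\]
% so a 1 mm chest movement produces a phase swing of roughly 4\pi/3.9 \approx 3.2 rad, and the respiration
% rate appears as the dominant 0.1--0.5 Hz peak in the Fourier spectrum of \varphi(t).
```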
Figure 13. Construction scenarios captured under poor visibility conditions: (a) water droplets obscuring the lens; (b) mud obscuring the lens; (c) lightless conditions.
Figure 14. Sensing moving objects on construction sites using a 4D mmw radar.
Table 1. Limitations of the commonly used sensors on construction sites [9,10].
Sensor | Limitations
Camera | Accuracy and reliability depend heavily on light and weather conditions; weak in spatial sensing.
LiDAR | Prone to interference in poor weather such as rain or snow; high cost.
Depth camera | Can only measure within a limited range (several meters); susceptible to noise and interference.
Ultrasonic sensor | Limited range and low measuring speed; can be affected by ambient noise.
Infrared sensor | Generally low accuracy and susceptible to interference; sensing range typically within several meters; sensitivity can be influenced by material properties.
Table 2. Some examples of the aliases of keywords.
Unified Term/Alias | Included Keywords
mmw | millimeter wave, millimetre wave, millimeter-wave, millimeter wave (mmw), millimeter waves, millimeter-wave (mm-wave), etc.
3D imaging | 3-D imaging, three-dimensional imaging, 3D radar imaging, etc.
autonomous driving | autonomous driving (ad), autonomous vehicle, automotive, etc.
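Before the co-occurrence and clustering analyses, keyword variants were merged into unified terms, as exemplified in Table 2. A minimal Python sketch of such an alias-mapping step is shown below; the alias lists are copied from Table 2, while the function name unify and the decision to leave unlisted keywords unchanged are our own assumptions, not the preprocessing script used in this review.

```python
# Illustrative alias-unification step (sketch only; not the review's actual preprocessing script).
# The alias lists mirror the examples in Table 2; unknown keywords are kept as-is.
ALIAS_MAP = {
    "mmw": ["millimeter wave", "millimetre wave", "millimeter-wave",
            "millimeter wave (mmw)", "millimeter waves", "millimeter-wave (mm-wave)"],
    "3D imaging": ["3-d imaging", "three-dimensional imaging", "3d radar imaging"],
    "autonomous driving": ["autonomous driving (ad)", "autonomous vehicle", "automotive"],
}

# Invert the table into a lookup from each lower-cased alias to its unified term.
LOOKUP = {alias.lower(): term for term, aliases in ALIAS_MAP.items() for alias in aliases}

def unify(keyword: str) -> str:
    """Map a raw keyword to its unified term, leaving unlisted keywords unchanged."""
    key = keyword.strip().lower()
    return LOOKUP.get(key, keyword.strip())

print(unify("Millimeter-Wave"))   # -> "mmw"
print(unify("point cloud"))       # -> "point cloud" (no alias entry, kept as-is)
```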
Table 3. Clustering results of the keywords and titles.
No. | Mean Year | Top Terms (Log-Likelihood Ratio)
34 | 2021 | 3D SAR; perceptual learning framework; adaptive sparse; mmbody benchmark; 3D body reconstruction dataset
25 | 2020 | mmwave radar point cloud; lightweight midrange arm-gesture recognition system; natural language processing approach
25 | 2021 | camera feature; multi-modal fusion; view disparity; vision fusion; object detection
22 | 2020 | support measurement; sensing design; RIS-assisted wireless network; coal mine safety monitoring; joint communication
21 | 2020 | contactless electrocardiogram monitoring; human activity recognition; multi-person gait recognition; spatio-temporal information; point cloud
21 | 2020 | car interior radar; advanced life-signs detection; robust hand-gesture recognition; hand gesture
20 | 2020 | automotive mmwave radar; point cloud; using 3D-printed helix antenna; circular polarization; epoxy wedge
7 | 2022 | radar point cloud; visual information fusion; center transfuser; 3D object detection
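The cluster labels in Table 3 are ranked by a log-likelihood ratio test on term-versus-cluster contingency tables, as implemented in CiteSpace. For reference, a standard statement of Dunning's log-likelihood ratio statistic is given below (our summary of the usual formulation, not an excerpt from CiteSpace's documentation).

```latex
% Dunning's log-likelihood ratio for ranking candidate label terms (standard form, stated for reference).
% O_{ij} are the observed counts of a 2x2 table (term present/absent x inside/outside the cluster), and
% E_{ij} = (row_i total)(column_j total)/N are the expected counts under independence.
\[
  G^{2} = 2 \sum_{i=1}^{2} \sum_{j=1}^{2} O_{ij} \,\ln\!\frac{O_{ij}}{E_{ij}}.
\]
% Terms with larger G^2 are more strongly associated with a cluster and appear first in the "Top Terms" column.
```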
Table 4. Top 25 keywords with the strongest citation bursts.
Keywords | Year | Strength | Begin | End
2019–2023 | 2019 | 2.13 | 2019 | 2020
5G mobile communication system | 2019 | 1.22 | 2019 | 2020
Nondestructive examination | 2019 | 1.11 | 2019 | 2020
4D mmw radar technology | 2019 | 1.03 | 2019 | 2019
Convolutional neural network | 2020 | 1.71 | 2020 | 2019
Skeletal tracking | 2020 | 1.09 | 2020 | 2021
Indoor localization | 2020 | 1.09 | 2020 | 2020
Vehicle detection | 2020 | 1.09 | 2020 | 2020
Lidar | 2021 | 1.67 | 2021 | 2021
Autonomous driving | 2021 | 1.24 | 2021 | 2021
Sensor fusion | 2021 | 0.96 | 2021 | 2021
Graph neural networks | 2021 | 0.87 | 2021 | 2021
Multi sensor fusion | 2021 | 0.87 | 2021 | 2021
Multi target tracking | 2021 | 0.87 | 2021 | 2021
Radar sensing | 2021 | 0.87 | 2021 | 2021
Machine learning | 2022 | 1.79 | 2022 | 2023
Wireless sensing | 2022 | 1.07 | 2022 | 2023
Scattering | 2022 | 0.71 | 2022 | 2023
Posture estimation | 2022 | 0.71 | 2022 | 2023
Package | 2022 | 0.71 | 2022 | 2023
Point cloud data | 2022 | 0.71 | 2022 | 2023
Multimodal fusion | 2022 | 0.71 | 2022 | 2023
Dataset | 2022 | 0.71 | 2022 | 2023
Robot sensing systems | 2022 | 0.71 | 2022 | 2023
Mapping | 2022 | 0.71 | 2022 | 2023
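The Strength column in Table 4 is produced by CiteSpace's burst detection, which follows Kleinberg's two-state model. As background only (our summary of the standard batched formulation; the tool may rescale the reported values), the emission cost of batch t in state i and the resulting burst weight are:

```latex
% Kleinberg's two-state burst model (standard batched form, summarized as background, not CiteSpace's code).
% Batch t contains d_t records, r_t of which use the keyword; the base and burst states emit it with
% probabilities p_0 = R/D (overall rate) and p_1 = s p_0 with s > 1.
\[
  \sigma(i, r_t, d_t) = -\ln\!\left[\binom{d_t}{r_t}\, p_i^{\,r_t} (1 - p_i)^{\,d_t - r_t}\right],
  \qquad i \in \{0, 1\},
\]
\[
  \text{burst weight over } [t_1, t_2] \;=\; \sum_{t = t_1}^{t_2}\bigl[\sigma(0, r_t, d_t) - \sigma(1, r_t, d_t)\bigr].
\]
% A keyword "bursts" over the interval where the optimal state sequence stays in state 1; larger weights
% indicate stronger bursts, and the Strength values in Table 4 are derived from this quantity.
```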
Table 5. Sensing requirements and reasons on construction sites.
Category | Sensing Requirements on Construction Sites | Reasons
Construction site monitoring | Position and trajectory of workers | S
Construction site monitoring | Position and trajectory of equipment | S, P1
Construction site monitoring | Activity of workers | S, P1
Construction site monitoring | Activity of equipment | S, P1
Construction site monitoring | Operation status | P1
Health monitoring | Health status of workers (heart rate, respiration, etc.) | S
Environment understanding | Environment (noise, air quality, etc.) | S
Environment understanding | Environment 3D modelling | S, P2
Notes: Safety (S); Productivity (P1); Progress (P2).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
