Article

Elaborate Monitoring of Land-Cover Changes in Cultural Landscapes at Heritage Sites Using Very High-Resolution Remote-Sensing Images

Yunwei Tang, Fulong Chen, Wei Yang, Yanbin Ding, Haoming Wan, Zhongchang Sun and Linhai Jing
1 International Research Center of Big Data for Sustainable Development Goals, Beijing 100094, China
2 Key Laboratory of Digital Earth Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
3 International Centre on Space Technologies for Natural and Cultural Heritage under the Auspices of UNESCO, Beijing 100094, China
4 University of Chinese Academy of Sciences, Beijing 100049, China
5 College of Geography and Tourism, Hengyang Normal University, Hengyang 421002, China
6 Research Center of Big Data Technology, Nanhu Laboratory, Jiaxing 314002, China
7 Advanced Institute of Big Data, Beijing 100093, China
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(3), 1319; https://doi.org/10.3390/su14031319
Submission received: 16 December 2021 / Revised: 21 January 2022 / Accepted: 22 January 2022 / Published: 25 January 2022

Abstract

Insufficient data and imperfect methods are the main obstacles to realizing Target 11.4 of the Sustainable Development Goals (SDGs). Very high-resolution (VHR) remote sensing provides a useful tool for the detailed monitoring of land-cover changes in cultural landscapes, so as to evaluate the authenticity and integrity of cultural heritage sites (CHS). In this study, we developed a semi-automatic two-level workflow to efficiently extract delicate land-cover changes from bi-temporal VHR images (with spatial resolution ≤ 1 m), a scale at which most current studies can only interpret changes manually. Based on the monitoring result, we proposed an indicator named the interference degree that quantifies the changes in the cultural landscapes of CHS, serving as a complementary indicator towards achieving Target 11.4 of the SDGs. Three representative types of CHS with different landscapes, including cave temples, ancient architectural buildings, and ancient sites, were studied for 2015 and 2020 based on VHR Google Earth images. The proposed workflow was demonstrated to extract delicate changes efficiently, with an accuracy of around 85%. The interference degree reflects the preservation status of these CHS well and can be observed periodically over the long term as an evaluation indicator. This study shows the potential to produce first-hand global monitoring data of CHS to support Target 11.4, thus serving the sustainable development of the world’s cultural heritage.

1. Introduction

With urban growth and modernization, the growing conflict between human activity and heritage conservation has received considerable attention [1]. Monitoring the preservation status of cultural-heritage sites (CHS) is important for maintaining their sustainable development [2,3,4]. The United Nations proposed 17 Sustainable Development Goals (SDGs) in 2015 [5]. Among them, Target 11.4 aims to strengthen efforts to protect and safeguard the world’s cultural and natural heritage. This target belonged to Tier III (i.e., definitions, methodologies, or standards were under development) when initially proposed and was reclassified as Tier II (i.e., indicators are well defined, but data are not regularly collected at the country level) in 2019 [6]. However, the realization of this target still faces the challenges of insufficient data and imperfect methods [7].
Remote sensing has shown great advantages in cultural-heritage monitoring and management [8,9,10,11,12,13]. Remote-sensing images with low or medium spatial resolution provide monitoring data with wide coverage and long time series, and they have been widely used to monitor changes in the cultural landscapes of CHS [14,15,16,17]. However, observation at this scale cannot reach the heritage property level, and some key elements surrounding a CHS therefore cannot be observed; for example, individual buildings and narrow roads cannot be monitored from low- or medium-resolution images. Very high-resolution (VHR) remote-sensing images derived from commercial satellites [18], unmanned aerial vehicles (UAV) [19], and Google Earth [20,21] have become easily accessible, making it possible to observe CHS at a fine scale.
Currently, visual interpretation of CHS and related targets from VHR images is still the mainstream approach in cultural-landscape monitoring [22,23,24,25], and only a few studies have applied machine-learning methods for image interpretation [26,27]. An important reason is that processing VHR images is rather complicated for non-remote-sensing experts. Target 11.4 has created an urgent demand for global CHS data acquired at a fine monitoring scale. To meet it, more intelligent methods need to be customized and popularized for monitoring CHS from VHR images. Another problem is that only one indicator (i.e., the expenditure per capita spent on preservation) is associated with Target 11.4, which is overly simple and insufficient for achieving this target [28]. Complementary indicators should therefore be developed to quantify the preservation status of CHS.
Therefore, to provide a customized method for elaborately monitoring the status of CHS, a semi-automatic workflow was proposed based on object-based image analysis (OBIA) to identify land-cover changes at two processing levels from bi-temporal VHR images. An indicator named the interference degree was then proposed to quantify the changes in cultural landscapes within a certain time period. The interference degree reflects, to some degree, the impacts of human activities and natural factors on a CHS. Periodically monitoring cultural landscapes in this way provides a straightforward indicator for evaluating the authenticity and integrity of CHS. The developed workflow was applied to three representative types of CHS with distinct cultural landscapes as a pilot test. The results demonstrate that this workflow has the potential to monitor global CHS of different types and thus provide first-hand global monitoring data at the fine scale.

2. Materials and Methods

2.1. Study Areas

Three CHS in China were studied, including the Yungang Grottoes, the Wooden Pagoda of Yingxian, and the Chengcun Ancient City Ruins. Among them, the Yungang Grottoes belong to cave temples; the Wooden Pagoda of Yingxian is an ancient architectural building; and the Chengcun Ancient City Ruins are typical ancient sites.
The Yungang Grottoes, located in the city of Datong, Shanxi Province, are ancient Chinese Buddhist temple grottoes (with 252 caves and 51,000 statues) from the 5th and 6th centuries. They were inscribed on the United Nations Educational, Scientific, and Cultural Organization (UNESCO) World Heritage List in 2001. The Wooden Pagoda of Yingxian, also known as the Sakyamuni Pagoda of Fogong Temple, is a wooden Chinese pagoda in Ying County, Shanxi Province. It was built in 1056 during the Khitan-led Liao Dynasty, and in 2013 it was added to UNESCO’s Tentative List. The Chengcun Ancient City Ruins, located in Wuyishan City, Fujian Province, are the largest and best-preserved Han Dynasty city site found in southern China. In 1999, the ruins were included in the World Heritage List as a major component of Wuyishan’s application for UNESCO World Cultural and Natural Heritage status, and in 2013 the site was approved as a project unit for the establishment of a National Archaeological Park. Figure 1 shows the locations, protected areas, and remote-sensing images of these three CHS.

2.2. Data

Google Earth images have shown great potential for monitoring CHS owing to their global coverage, choice of spatial resolutions, availability of historical data, and low cost [20]. Therefore, bi-temporal VHR Google Earth images were used to monitor land-cover changes in the study areas. The year 2015 was used as the benchmark year because the SDGs were proposed in 2015, and 2020 was chosen as the second observation time, so the changes in cultural landscapes over five years were monitored. The core zone and the buffer zone together constitute the main protected area of a CHS, and the core- and buffer-zone data [29] released by cultural-relic authorities were used to provide boundary information.
Google Earth imagery is actually a mosaic of many images with different spatial resolutions (from 15 m to 10 cm) acquired at different times by multiple providers, such as the Landsat satellites operated by the National Aeronautics and Space Administration (NASA) and the United States Geological Survey (USGS), as well as commercial providers [30]. Google Earth Pro provides a tool to export VHR imagery with three multispectral bands (i.e., red, green, and blue). The true digital number values (and the spectral reflectance) are not available; thus, Google Earth imagery is not suitable for quantitative applications. Nevertheless, compared with the costs of commercial VHR imagery and the extensive pre-processing work (e.g., image mosaicking) required for global monitoring, Google Earth Pro is rather cost effective, and the exported three-band VHR imagery can still be used for visual applications such as land-cover classification [25] and object recognition [27]. Google Earth Pro can export images at different resolution levels. At the same level, the resolution of the exported images may vary from place to place, and not all places are covered by sub-meter imagery. Since elaborate monitoring of the cultural landscapes of CHS was the primary goal of this study, we chose 1 m images for the Yungang Grottoes and the Chengcun Ancient City Ruins, for which the actual resolution of the exported images was 1.07 m. As the Wooden Pagoda of Yingxian has a rather small protected area and dense residential surroundings, we chose a sub-meter image for this area, with an actual exported resolution of 0.27 m. The original satellite sensors behind these Google Earth images can be traced according to the covered area and the acquisition time. The data information is listed in Table 1.
For global monitoring, the selection of resolution directly determines the processing time. If a CHS occupies a very large area, imagery with a lower resolution (e.g., 5 m) is suggested to reduce computational costs. In contrast, if a CHS occupies a small area and is in an urban environment, such as the Wooden Pagoda of Yingxian, imagery with a sub-meter resolution can be used if available.

2.3. Methods

OBIA has shown great advantages in processing VHR images [31,32]. Therefore, a two-level workflow based on OBIA was customized to extract land-cover changes in the cultural landscapes of the CHS. At level I, an unsupervised method was applied to extract potentially changed objects; at level II, a supervised machine-learning method was used to refine the result and obtain the from–to land-cover-change information. The flowchart of the proposed workflow is shown in Figure 2, and the specific steps are as follows.
  • Image co-registration: The bi-temporal Google Earth images were co-registered. Since change detection on VHR images requires high geometric accuracy, image co-registration was performed using the Image Registration Workflow tool in ENVI 5.3 [33] to ensure that the root mean square error (RMSE) of the geometric accuracy was less than two pixels.
  • Change vector analysis (CVA): The CVA [34] was applied to the bi-temporal images to derive a pixel-wise difference image (Equation (1)).
    V = \sqrt{\sum_{i=1}^{K} (p_i - q_i)^2},    (1)
    where V is the Euclidean distance between the spectral vectors of two co-located pixels in the bi-temporal images, p_i and q_i are the values of these pixels in band i, and K is the number of multispectral bands.
  • Thresholding: Otsu thresholding [35] was applied to the difference image. This algorithm determines a threshold t by minimizing the intra-class intensity variance (or, equivalently, maximizing the inter-class variance), so as to divide the pixels into changed and unchanged.
  • Image segmentation: Independently, the spectral bands of the bi-temporal images were stacked to combine information from both dates, and image segmentation based on the fractal net evolution algorithm (FNEA) [36] was performed on the stacked bands. As a key step of OBIA, image segmentation groups pixels into multiple image segments (objects) according to pre-defined homogeneity criteria, so the pixels inside an object are assumed to be highly homogeneous. Therefore, the basic processing unit in OBIA is an object rather than a pixel (an illustration is shown in Figure 3).
  • Extracting potentially changed objects: Since the same element can exhibit great spectral heterogeneity in VHR images, extracting changed elements at the object level rather than the pixel level can avoid introducing the salt-and-pepper noise (i.e., isolated pixels). The changed objects were extracted according to the segmentation and Otsu thresholding results (Equation (2)).
    O = \begin{cases} 1 & \text{if } N_c > N \cdot r \\ 0 & \text{if } N_c \le N \cdot r \end{cases},    (2)
    where O = 1 indicates that an object is a potentially changed object and O = 0 otherwise, N_c is the number of changed pixels in the object, N is the total number of pixels in the object, and r is an empirical threshold. In this study, r was set to 50% through trial-and-error tests, meaning that if more than half of the pixels in an object are changed, the object is defined as a changed object. This step converts the processing unit of the imagery from pixels to objects.
  • Object-based classification: CVA is sensitive to false changes caused by many factors, such as different sensors, illumination conditions, and seasonal changes [37]. Therefore, supervised classification was performed separately on the bi-temporal images to further reduce false alarms. The random-forest (RF) classifier was used in this study; this method uses ensemble learning to combine many decision trees and has been reported to achieve high accuracy [38]. Training objects based on the segmentation results were selected separately on the two images. The mean and standard deviation of the pixel values inside each object across the three spectral bands were used as the object-level features for classification, and the selected training objects provide the prior knowledge of these features. In the training process, five-fold cross-validation was used to automatically determine the optimal parameters of the RF algorithm, such as the number of trees, the criterion measuring the quality of a split, and the maximum depth of the trees (an illustrative parameter-search sketch is given after this list). Based on the selected parameters, the classification was then applied to the potentially changed objects.
  • Refining changed objects: After deriving the classification results, the land-cover classes were compared per object between the bi-temporal images. An object with the same class label in both images was regarded as a false change and was re-categorized as unchanged. Manual editing may then be applied to refine the results; in particular, some changes caused by reconstruction work were difficult to identify with machine-learning methods and were therefore delineated manually.
  • Calculating interference degree: The percentage of changes (Equation (3)) was calculated for the core zone and the buffer zone of a CHS.
    I = \frac{A_c}{A},    (3)
    where A_c is the total area of the changed objects in the core/buffer zone, and A is the area of the core/buffer zone. We define this percentage I as the interference degree, which directly reflects the amount of land-cover change in the cultural landscape surrounding a CHS. We also obtained the from–to land-cover-change information, which shows the conversion types of these changes, from the classification result. (Minimal code sketches of the level-I steps, the interference degree, and the RF refinement are given after this list.)
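The core of level I (Equations (1)–(3)) can be prototyped with open-source tools. The sketch below is a simplified illustration, not the authors’ implementation: it assumes NumPy arrays for the co-registered images, scikit-image’s Otsu threshold, and a label image produced by any segmentation algorithm (e.g., SLIC as a freely available stand-in for FNEA); all variable and function names are hypothetical.

```python
import numpy as np
from skimage.filters import threshold_otsu  # Otsu's method, as in the thresholding step

def change_vector_analysis(img_t1, img_t2):
    """Pixel-wise CVA magnitude (Equation (1)).
    img_t1, img_t2: co-registered float arrays of shape (bands, rows, cols)."""
    return np.sqrt(np.sum((img_t1 - img_t2) ** 2, axis=0))

def potentially_changed_objects(segments, change_mask, r=0.5):
    """Flag an object as potentially changed when the fraction of changed pixels
    inside it exceeds r (Equation (2)).
    segments: integer label image from segmentation; change_mask: boolean image
    obtained by thresholding the CVA magnitude."""
    flags = {}
    for label in np.unique(segments):
        inside = segments == label
        flags[label] = change_mask[inside].mean() > r
    return flags

def interference_degree(changed_area, zone_area):
    """Percentage of changed area within a core or buffer zone (Equation (3))."""
    return 100.0 * changed_area / zone_area

# Usage with two (3, rows, cols) arrays img_2015 and img_2020 and a label image `segments`:
# v = change_vector_analysis(img_2015, img_2020)
# mask = v > threshold_otsu(v)
# flags = potentially_changed_objects(segments, mask, r=0.5)
```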
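For the level-II refinement, one possible realization of the object-based RF classification with five-fold cross-validated parameter tuning is sketched below using scikit-learn; the feature arrays and the parameter grid are placeholders chosen for illustration, not the values used in this study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
# Placeholder object-level features: mean and standard deviation of three bands (6 features).
X_train = rng.random((200, 6))
y_train = rng.integers(0, 5, 200)      # five land-cover classes
X_changed = rng.random((50, 6))        # features of the potentially changed objects

param_grid = {
    "n_estimators": [100, 200, 500],   # number of trees
    "criterion": ["gini", "entropy"],  # split-quality measure
    "max_depth": [None, 10, 20],       # maximum tree depth
}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)           # five-fold cross-validation over the grid
labels = search.best_estimator_.predict(X_changed)
```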

3. Results and Findings

According to UNESCO and the heritage community [29], a core zone has strict protection status, where human intervention must be kept to a minimum. A buffer zone may set limits to protect views, settings, land uses, and other aspects but may also positively encourage developments that would be beneficial to the site and community. Therefore, a high interference degree in the core zone usually implies a high risk to the heritage site. A high value in the buffer zone indicates the heritage site needs to be further evaluated to assess whether these changes have strengthened or weakened the relationship between humans and heritage.
By applying the proposed workflow to the three study sites, the interference degrees of these CHS listed in Table 2 were obtained. As can be seen, the interference degrees in the core zones were all lower than those in the buffer zones, suggesting better protection of the core zones. The Yungang Grottoes had the highest interference degree in the buffer zone (4.93%) among the three study areas and a median value in the core zone (0.64%). The Wooden Pagoda of Yingxian showed no change in the core zone and only minor changes in the buffer zone (0.51%). The interference degree of the Chengcun Ancient City Ruins remained at a high level in both the core zone (1.11%) and the buffer zone (3.54%).
Five classes basically cover the land-cover types in the study areas: built-up land, farmland, barren land, vegetation, and water. Built-up land includes buildings, roads, and some under-developed areas. Farmland includes planted and unplanted farmland; there are no paddy fields in these study areas. Barren land is mostly bare soil and rock, with a few tidal-flat areas. Vegetation includes woodland, bush, and grass. Water includes rivers and reservoirs. Among the five classes, built-up land and farmland are closely related to human activities, and the other three (i.e., barren land, vegetation, and water) correspond to environmental factors. Some studies assign different weights to different land-cover classes for risk assessment [39]. In these three study areas, the changes among the five land-cover classes were interrelated; in particular, the changes in barren land and vegetation were mostly caused by human intervention rather than natural processes. Therefore, without distinguishing the different impacts of land-cover classes on the interference degree, these five classes were used for classification to refine the change-detection result and to obtain the change information (some study sites may not include all five classes). The from–to land-cover-change information is shown in Figure 4, Figure 5 and Figure 6, and heat maps of the transition matrix indicating change quantities and directions [40] are displayed in Figure 7 for all three study sites (a small sketch of how such a matrix can be computed is given below).
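As an illustration of how a from–to transition matrix of the kind shown in Figure 7 can be derived, the following sketch accumulates the area converted from each class at the first date to each class at the second date; the inputs (two integer class maps, a boolean change mask, and the pixel area) are assumptions for this example rather than the authors’ actual implementation.

```python
import numpy as np

def transition_matrix(classes_t1, classes_t2, changed_mask, pixel_area_m2, n_classes=5):
    """Area (ha) converted from class i at time 1 to class j at time 2 within changed pixels."""
    mat = np.zeros((n_classes, n_classes))
    for i in range(n_classes):
        for j in range(n_classes):
            mat[i, j] = np.sum((classes_t1 == i) & (classes_t2 == j) & changed_mask)
    return mat * pixel_area_m2 / 10_000.0  # convert m^2 to hectares
```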
The greatest change at the Yungang Grottoes was from vegetation to built-up land (Figure 7a), appearing in the southwest of the buffer zone (Figure 4). This change was due to a large increase in solar panels, which can be seen clearly in the VHR images. The second greatest change was from built-up land to barren land in a village (Wuguantun village) in the northern part of the buffer zone. The Yungang Grottoes are surrounded by coal-mining areas, and people in Wuguantun village used to mine for a living. Because mining posed a serious threat to the grottoes, the activities were stopped in nearby villages; people then moved away from Wuguantun village, and many low flat-roofed buildings in the village were demolished. Additionally, a little construction work was carried out in the southwest of the village and around the Yungang Museum (to the east of the core zone). No construction work occurred in the core zone, and only a little vegetation turned into barren land. We visited the Yungang Grottoes on 25 March 2021 to record the recent situation at the changed places. The on-site photos of the newly built solar panels and the demolished buildings in Wuguantun village shown in Figure 4 accord with the extraction result.
The Wooden Pagoda of Yingxian has a rather small core zone, and no change occurred in this area (Figure 5). Some minor changes in the buffer zone were due to newly built buildings and some reconstruction work (Figure 7b). Vegetation in the park was converted into a few paths and patches of barren land for better accessibility. The farmland barely changed except for different seasonal crops.
The Chongyang River crosses the protected area of the Chengcun Ancient City Ruins (Figure 6). The greatest change was from vegetation to farmland (Figure 7c), indicating an increasing demand for land. This type of change occurred in one of the core zones, causing a high interference level. The changes in the river and along the riverbank were also substantial and were largely attributable to human activities. A sluice was built across the river, causing a decrease in tidal flats in the lower reaches, and much construction work was carried out along the river during the five years: a road was repaired and widened on one side of the river (from barren land to built-up land), and another road was built through the farmland on the other side (from farmland to built-up land).
The cultural landscape of each study site is distinctive. The Yungang Grottoes are set among mountains, with only a few villages in the surroundings; there are no large farmland patches in this area, so impacts from human activity are expected to be small. The Wooden Pagoda of Yingxian lies near the border between the city and a suburban area; densely distributed buildings are to the south and a large patch of farmland is to the north, so urban construction and agricultural activities may both interfere with the pagoda. The Chengcun Ancient City Ruins are in an intermountain basin near Chengcun village, with farmland and thick vegetation on both banks of the river, so natural (e.g., forest and water) and man-made (e.g., construction and farming) factors may both affect the ancient ruins. Interestingly, the Wooden Pagoda of Yingxian has the best preservation status: although located in a relatively vulnerable environment that is easily affected by human activities, its core zone is strictly protected, and its buffer zone remains at a low interference level. The Yungang Grottoes and the Chengcun Ancient City Ruins both show a rather high interference level, but their land-cover-change trends indicate very different situations. For the Yungang Grottoes, the changes include the utilization of new energy and improvements to the rural environment, which should reduce human interference in the long term; these changes were therefore made towards a better preservation of the CHS, although it would be worth evaluating whether they proceeded too fast and posed potential threats to the grottoes. For the Chengcun Ancient City Ruins, however, the changes were caused by a great amount of construction work and forest degradation. These changes may damage the natural ecological balance and break the integrity of the cultural landscape, implying potentially unsustainable development at this site. Therefore, more attention should be paid to the Chengcun Ancient City Ruins; it is suggested to report these abnormal changes and to monitor the cultural landscape periodically in the future. Additionally, other techniques such as interferometric synthetic aperture radar (InSAR) [41,42] and light detection and ranging (LiDAR) [43,44] can be combined to monitor the ruins and provide a comprehensive risk assessment.

4. Analysis and Discussion

4.1. Accuracy Assessment

The number of objects at each key step is listed in Table 3, including segmentation, CVA, classification, training samples, and the final edited results. The results indicate that the potentially changed objects selected by the unsupervised CVA greatly narrow the regions of interest. The supervised classification further removes false changes, accounting for nearly a half of the potentially changed objects. The percentage of manually delineated objects ranged from 15% to 35% of the total objects. Although we edited the final result to derive an accurate interference degree, the semi-automatic workflow was far more efficient than purely manual work. In practice, the segmentation result already provides the boundary information; therefore, missing objects can be copied from the segmentation result without actually delineating their boundaries.
To demonstrate the effectiveness of the proposed workflow, we validated the accuracy of the classification and change-detection results. For classification, we assessed the overall accuracy and the F-score [45] for each class based on all the classified objects. For change detection, three popular measures were used, including the F-score, precision, and recall (Equations (4)–(7)). The accuracy of change detection was based on the whole study site, where the true changes were interpreted visually from VHR images, assisted by field visits.
\text{F-score} = \frac{2 \cdot \text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}},    (4)
\text{overall accuracy} = \frac{TP + TN}{TP + FP + TN + FN},    (5)
\text{precision} = \frac{TP}{TP + FP},    (6)
\text{recall} = \frac{TP}{TP + FN},    (7)
where TP, TN, FP, and FN represent true positive, true negative, false positive, and false negative, respectively. The accuracy of classification and change-detection results at two processing levels of the developed workflow are shown in Figure 8 and Figure 9, respectively.
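As a quick numerical illustration of Equations (4)–(7), the following lines compute the measures from hypothetical object counts; the counts are made up for this example and are not results from this study.

```python
# Hypothetical confusion-matrix counts for changed (positive) vs. unchanged (negative) objects.
TP, TN, FP, FN = 85, 900, 10, 20

precision = TP / (TP + FP)                               # Equation (6)
recall = TP / (TP + FN)                                  # Equation (7)
f_score = 2 * precision * recall / (precision + recall)  # Equation (4)
overall_accuracy = (TP + TN) / (TP + FP + TN + FN)       # Equation (5)
print(precision, recall, f_score, overall_accuracy)
```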
The classification accuracy suggests that the overall performance of the RF classifier was between 0.85 and 0.93, which was generally better than the change-detection accuracy. Since only the changed/unchanged labels from the classification results were used to refine the change-detection results, some objects misclassified in an individual image could still lead to a correct change-detection result. In general, the F-score of the change-detection result was around 0.7 at level I (i.e., the unsupervised CVA method only) and improved to around 0.85 at level II (i.e., with supervised classification). All three cases showed an increase in precision, indicating that some false changes included in the unsupervised CVA result were removed through supervised classification.

4.2. Detecting Delicate Changes

In the developed workflow, the use of OBIA has the advantage of detecting delicate changes and thus quantifying the amount of change accurately. One benefit is that the selection of an optimal scale for image segmentation, which has long been a difficult task in OBIA, does not greatly influence the final result. This is because we focused only on extracting changed elements, so the scale parameter can be set to a relatively small value to detect delicate changes without worrying about spatial discontinuity in the classification results. For example, a large number of solar panels was installed near the Yungang Grottoes. By applying our workflow with the scale parameter set to 40 for image segmentation, most of the solar panels could be extracted individually, even though the shortest width of a solar panel is only 2 m in the image. In an internal monitoring report of the Yungang Grottoes, the changes in cultural landscapes were manually interpreted based on an aerial photograph from 2013 and a Gaofen-2 image from 2018. It is almost impossible to delineate each solar panel over such a large area; therefore, the whole area covered by solar panels was treated as a single changed patch (Figure 10a), and the area covered by sparsely distributed solar panels was treated as unchanged (Figure 10b). Such traditional interpretation fails to delineate delicate changes and leads to an inaccurate estimate of the changed area. In comparison, our workflow takes full advantage of VHR images and extracts delicate changes more efficiently.

4.3. Future Work

Accurately extracting information from VHR images has always been a challenge in remote sensing. For this reason, traditional methods are usually based on manual interpretation to ensure accurate extraction. For individual cultural sites, it may not be troublesome to implement regular monitoring by manually updating changes; notably, interpretation based on machine-learning methods is not as accurate as visual interpretation, and it usually requires expert intervention. However, obtaining global monitoring data is labor-intensive work, so a semi-intelligent process remains a good option to meet the urgent demand for global monitoring data. The developed workflow involves popular machine-learning methods for ease of use: the CVA and RF methods were adopted for change detection and classification, respectively. Other state-of-the-art methods can be substituted at each step to improve performance. In particular, deep learning could be used for the classification in this workflow. A convolutional neural network (CNN), the most popular deep-learning architecture, can be applied by feeding a large number of sample images into the network to derive a well-trained model. Such a model generalizes better than conventional machine-learning methods and can then be used to classify images at various study sites, avoiding sample selection at each study site and each time phase and thus removing the only step that requires expert intervention. Consequently, the developed semi-automatic workflow could become an automatic one and be efficiently applied to global CHS. Additionally, introducing intelligent algorithms makes this workflow easier to popularize among non-remote-sensing researchers.
Changes that do not involve a conversion of land-cover type (e.g., the reconstruction or destruction of buildings within built-up areas) cannot be detected by this workflow. One solution is to involve additional features for classification, such as geometrical and spatial features. Nevertheless, since the classification was performed only on the potentially changed objects, changes missed at level I cannot be recovered, and the recall cannot be further improved (Figure 9). Another alternative is to use more advanced methods that identify changes directly without considering the conversion types, such as a CNN model. This requires a huge number of sample images labeled as changed/unchanged to train a good model, which is worth trying after we generate the first batch of global monitoring data and thereby derive a reliable sample source.
When applying the interference degree to assess the risks of global CHS, different types of CHS should have different evaluation criteria. Ancient sites, for example, are usually in a natural environment, whereas ancient architectural buildings in most cases have more vulnerable structures (e.g., decayed or insect-damaged). Thus, ancient sites may tolerate a higher interference degree than ancient architectural buildings, which fall into a higher-risk category. Moreover, the status of CHS can be closely linked to the local economy: developing areas tend to have a higher demand for land, leading to a high interference degree; under-developed areas may lack the funds to maintain CHS, so the interference is mainly contributed by natural factors; and developed areas usually have established protection systems, so the interference degree is expected to be low.
Finally, despite the many advantages of Google Earth imagery, its three multispectral bands (red, green, and blue) greatly limit the ability to accurately classify land-cover classes with similar spectral features. More detailed land-cover classes could be involved to obtain other indicators related to human activities at the fine scale, such as human-impact intensity [46]. Additionally, bi-temporal Google Earth images may come from different satellite sensors, and large variations between the two images would lead to poor change-detection results; moreover, historical Google Earth images may not be available at the desired time. Recently, satellite constellations composed of many small satellites have gained attention [47], providing four- or eight-band multispectral images with a 3–5 m spatial resolution and covering the entire land surface of the Earth every day [48]. Based on the proposed workflow, such images could be used to monitor CHS with high frequency and at low cost, which is an interesting direction for future research.

5. Conclusions

This study proposed a solution for the elaborate monitoring of cultural landscapes at heritage sites. In particular, a semi-automatic workflow was customized to extract land-cover changes from VHR remote-sensing images. In contrast to traditional methods based on manual interpretation, the developed workflow can monitor delicate changes and greatly improve efficiency, with an accuracy of around 85%. The object-based two-level workflow combines unsupervised CVA and supervised classification algorithms, which do not require advanced technical skills and can therefore be used by non-remote-sensing researchers. The semi-automatic workflow also demonstrates flexibility and good generalization: it can be improved by substituting more advanced algorithms at each step and can even evolve into a fully automatic workflow by introducing deep-learning techniques. Moreover, based on the monitoring result, we proposed the interference degree as an indicator to quantify the amount of changed elements caused by human activities and natural factors. This indicator reflects the different preservation statuses of the three diverse CHS in our case study well. It can be observed periodically as a complementary indicator to evaluate the status of CHS and thus support decision-making in cultural-heritage preservation. This study demonstrates a practical and effective technological solution for generating first-hand global CHS data at the fine monitoring scale, so as to evaluate Target 11.4 and serve the sustainable development of the world’s CHS.

Author Contributions

Conceptualization, F.C.; writing, Y.T. and F.C.; data curation, W.Y., Y.D. and H.W.; visualization, W.Y., H.W. and Y.T.; investigation, W.Y. and F.C.; methodology, Y.T. and F.C.; funding acquisition, L.J., Z.S. and F.C.; supervision, L.J. and F.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key R&D Program of China, grant number 2019YFC1520800; the Strategic Priority Research Program of the Chinese Academy of Sciences, grant number XDA19030502; the National Key Research and Development Program of China, grant number 2017YFE0134400; and the National Natural Science Foundation of China, grant number 41972308, 42071312, 42171291.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank three anonymous reviewers for providing helpful comments and suggestions to improve the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Forino, G.; Mackee, J.; Meding, J.V. A proposed assessment index for climate change-related risk for cultural heritage protection in Newcastle (Australia). Int. J. Disaster Risk Reduct. 2016, 19, 235–248. [Google Scholar] [CrossRef]
  2. Oikonomopoulou, E.; Delegou, E.T.; Sayas, J.; Moropoulou, A. An innovative approach to the protection of cultural heritage: The case of cultural routes in Chios Island, Greece. J. Archaeol. Sci. Rep. 2017, 14, 742–757. [Google Scholar] [CrossRef]
  3. Andretta, M.; Coppola, F.; Modelli, A.; Santopuoli, N.; Seccia, L. Proposal for a new environmental risk assessment methodology in cultural heritage protection. J. Cult. Herit. 2017, 23, 22–32. [Google Scholar] [CrossRef]
  4. Ashrafi, B.; Kloos, M.; Neugebauer, C. Heritage impact assessment, beyond an assessment Tool: A comparative analysis of urban development impact on visual integrity in four UNESCO World Heritage Properties. J. Cult. Herit. 2021, 47, 199–207. [Google Scholar] [CrossRef]
  5. United Nations. Transforming Our World: The 2030 Agenda for Sustainable Development. 2015. Available online: https://sdgs.un.org/2030agenda (accessed on 20 January 2022).
  6. United Nations. IAEG-SDGs Tier Classification for Global SDG Indicators. 2019. Available online: https://unstats.un.org/sdgs/files/Tier%20Classification%20of%20SDG%20Indicators_28%20Dec%202020_web.pdf (accessed on 20 January 2022).
  7. Guo, H.; Chen, F.; Sun, Z.; Liu, J.; Liang, D. Big Earth Data: A practice of sustainability science to achieve the Sustainable Development Goals. Sci. Bull. 2021, 66, 1050–1053. [Google Scholar] [CrossRef]
  8. Negula, I.D.; Sofronie, R.; Virsta, A.; Badea, A. Earth Observation for the World Cultural and Natural Heritage. Agric. Agric. Sci. Proced. 2015, 6, 438–445. [Google Scholar] [CrossRef] [Green Version]
  9. Cuca, B.; Hadjimitsis, D.G. Space technology meets policy: An overview of Earth Observation sensors for monitoring of cultural landscapes within policy framework for Cultural Heritage. J. Archaeol. Sci. Rep. 2017, 14, 727–733. [Google Scholar] [CrossRef] [Green Version]
  10. Xiao, W.; Mills, J.; Guidi, G.; Rodríguez-Gonzálvez, P.; Barsanti, S.G.; González-Aguilera, D. Geoinformatics for the conservation and promotion of cultural heritage in support of the UN Sustainable Development Goals. ISPRS J. Photogramm. Remote Sens. 2018, 142, 389–406. [Google Scholar] [CrossRef]
  11. Levin, N.; Ali, S.; Crandall, D.; Kark, S. World Heritage in danger: Big data and remote sensing can help protect sites in conflict zones. Glob. Environ. Chang. 2019, 55, 97–104. [Google Scholar] [CrossRef]
  12. Luo, L.; Wang, X.; Guo, H.; Lasaponara, R.; Zong, X.; Masini, N.; Wang, G.; Shi, P.; Khatteli, H.; Chen, F.; et al. Airborne and spaceborne remote sensing for archaeological and cultural heritage applications: A review of the century (1907–2017). Remote Sens. Environ. 2019, 232, 232. [Google Scholar] [CrossRef]
  13. Yao, Y.; Wang, X.; Lu, L.; Liu, C.; Wu, Q.; Ren, H.; Yang, S.; Sun, R.; Luo, L.; Wu, K. Proportionated distributions in spatiotemporal structure of the world cultural heritage sites: Analysis and countermeasures. Sustainability 2021, 13, 2148. [Google Scholar] [CrossRef]
  14. Agapiou, A.; Alexakis, D.D.; Lysandrou, V.; Themistocleous, K.; Sarris, A.; Cuca, B.; Themistocleous, K.; Hadjimitsis, D.G. Impact of urban sprawl to cultural heritage monuments: The case study of Paphos area in Cyprus. J. Cult. Herit. 2015, 16, 671–680. [Google Scholar] [CrossRef]
  15. Agapiou, A.; Lysandrou, V.; Alexakis, D.D.; Themistocleous, K.; Cuca, B.; Argyriou, A.; Sarris, A.; Hadjimitsis, D.G. Cultural heritage management and monitoring using remote sensing data and GIS: The case study of Paphos area, Cyprus. Comput. Environ. Urban Syst. 2015, 54, 230–239. [Google Scholar] [CrossRef]
  16. Luo, L.; Wang, X.; Liu, J.; Guo, H.; Lasaponara, R.; Ji, W.; Liu, C. Uncovering the ancient canal-based tuntian agricultural landscape at China’s northwestern frontiers. J. Cult. Herit. 2017, 23, 79–88. [Google Scholar] [CrossRef]
  17. Elagouz, M.H.; Abou-Shleel, S.M.; Belal, A.A.; El-Mohandes, M.A.O. Detection of land use/cover change in Egyptian Nile Delta using remote sensing. Egypt. J. Remote Sens. Space Sci. 2020, 23, 57–62. [Google Scholar] [CrossRef]
  18. Lasaponara, R.; Masini, N. Beyond modern landscape features: New insights in the archaeological area of Tiwanaku in Bolivia from satellite data. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 464–471. [Google Scholar] [CrossRef]
  19. Stek, T.D. Drones over Mediterranean landscapes. The potential of small UAV’s (drones) for site detection and heritage management in archaeological survey projects: A case study from Le Pianelle in the Tappino Valley, Molise (Italy). J. Cult. Herit. 2016, 22, 1066–1071. [Google Scholar] [CrossRef]
  20. Luo, L.; Wang, X.; Guo, H.; Lasaponara, R.; Shi, P.; Bachagha, N.; Li, L.; Yao, Y.; Masini, N.; Chen, F.; et al. Google Earth as a powerful tool for archaeological and cultural heritage applications: A review. Remote Sens. 2018, 10, 1558. [Google Scholar] [CrossRef]
  21. Nebbia, M.; Cilio, F.; Bobomulloev, B. Spatial risk assessment and the protection of cultural heritage in southern Tajikistan. J. Cult. Herit. 2021, 49, 183–196. [Google Scholar] [CrossRef]
  22. Nagendra, H.; Mairota, P.; Marangi, C.; Lucas, R.; Dimopoulos, P.; Honrado, J.P.; Niphadkar, M.; Mücher, C.A.; Tomaselli, V.; Panitsa, M.; et al. Satellite Earth observation data to identify anthropogenic pressures in selected protected areas. Int. J. Appl. Earth Obs. Geoinf. 2015, 37, 124–132. [Google Scholar] [CrossRef]
  23. Morehart, C.T.; Millhauser, J.K. Monitoring cultural landscapes from space: Evaluating archaeological sites in the Basin of Mexico using very high resolution satellite imagery. J. Archaeol. Sci. Rep. 2016, 10, 363–376. [Google Scholar] [CrossRef]
  24. Deroin, J.-P.; Kheir, R.B.; Abda, C. Geoarchaeological remote sensing survey for cultural heritage management. Case study from Byblos (Jbail, Lebanon). J. Cult. Herit. 2017, 23, 37–43. [Google Scholar] [CrossRef] [Green Version]
  25. Prokop, P. Remote sensing of severely degraded land: Detection of long-term land-use changes using high-resolution satellite images on the Meghalaya Plateau, Northeast India. Remote Sens. Appl. Soc. Environ. 2020, 20, 100432. [Google Scholar] [CrossRef]
  26. Liu, C.; Cao, Y.; Yang, C.; Zhou, Y.; Ai, M. Pattern identification and analysis for the traditional village using low altitude UAV-borne remote sensing: Multifeatured geospatial data to support rural landscape investigation, documentation and management. J. Cult. Herit. 2020, 44, 185–195. [Google Scholar] [CrossRef]
  27. Liu, Y.; Tang, Y.; Jing, L.; Chen, F.; Wang, P. Remote sensing-based dynamic monitoring of immovable cultural relics, from environmental factors to the protected cultural Site: A case study of the Shunji Bridge. Sustainability 2021, 13, 6042. [Google Scholar] [CrossRef]
  28. Nocca, F. The role of cultural heritage in sustainable development: Multidimensional indicators as decision-making tool. Sustainability 2017, 9, 1882. [Google Scholar] [CrossRef] [Green Version]
  29. World Heritage Centre. World Heritage and Buffer Zones. In Proceedings of the International Expert Meeting on World Heritage and Buffer Zones, Davos, Switzerland, 11–14 March 2008; Martin, O., Piatti, G., Eds.; UNESCO—World Heritage Centre: Paris, France, 2009; p. 25. [Google Scholar]
  30. Lesiv, M.; See, L.; Laso Bayas, J.C.; Sturn, T.; Schepaschenko, D.; Karner, M.; Moorthy, I.; McCallum, I.; Fritz, S. Characterizing the spatial and temporal availability of very high resolution satellite imagery in Google Earth and Microsoft Bing maps as a source of reference data. Land 2018, 7, 118. [Google Scholar] [CrossRef] [Green Version]
  31. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  32. Blaschke, T.; Strobl, J. What’s wrong with pixels? Some recent developments interfacing remote sensing and GIS. GeoBIT/GIS 2001, 14, 12–17. [Google Scholar]
  33. Jin, X. ENVI Automated Image Registration Solutions. Harris Geospatial Systems Whitepaper. 2017. Available online: http://www.l3harrisgeospatial.com/Portals/0/pdfs/ENVI_Image_Registration_Whitepaper.pdf (accessed on 20 January 2022).
  34. Lambin, E.F.; Strahlers, A.H. Change-vector analysis in multitemporal space: A tool to detect and categorize land-cover change processes using high temporal-resolution satellite data. Remote Sens. Environ. 1994, 48, 231–244. [Google Scholar] [CrossRef]
  35. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
  36. Benz, U.C.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogramm. Remote Sens. 2004, 58, 239–258. [Google Scholar] [CrossRef]
  37. Zhang, C.; Yue, P.; Tapete, D.; Jiang, L.; Shangguan, B.; Huang, L.; Liu, G. A deeply supervised image fusion network for change detection in high resolution bi-temporal remote sensing images. ISPRS J. Photogramm. Remote Sens. 2020, 166, 183–200. [Google Scholar] [CrossRef]
  38. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  39. Bai, X.; Du, P.; Guo, S.; Zhang, P.; Lin, C.; Tang, P.; Zhang, C. Monitoring land cover change and disturbance of the Mount Wutai world cultural landscape heritage protected area, based on remote sensing time-series images from 1987 to 2018. Remote Sens. 2019, 11, 1332. [Google Scholar] [CrossRef] [Green Version]
  40. Lu, D.; Mausel, P.; Brondízio, E.; Moran, E. Change detection techniques. Int. J. Remote Sens. 2004, 25, 2365–2407. [Google Scholar] [CrossRef]
  41. Chen, F.; Guo, H.; Tapete, D.; Masini, N.; Cigna, F.; Lasaponara, R.; Piro, S.; Lin, H.; Ma, P. Interdisciplinary approaches based on imaging radar enable cutting-edge cultural heritage applications. Natl. Sci. Rev. 2021, 8, nwab123. [Google Scholar] [CrossRef]
  42. Chen, F.; Guo, H.; Ma, P.; Lin, H.; Wang, C.; Ishwaran, N.; Hang, P. Radar interferometry offers new insights into threats to the Angkor site. Sci. Adv. 2017, 3, e1601284. [Google Scholar] [CrossRef] [Green Version]
  43. Altuntas, C.; Hezer, S.; Kırlı, S. Image based methods for surveying heritage of masonry arch bridge with the example of Dokuzunhan in Konya, Turkey. Sci. Cult. 2017, 3, 13–20. [Google Scholar]
  44. Trier, Ø.; Reksten, J.H.; Løseth, K. Automated mapping of cultural heritage in Norway from airborne lidar data using faster R-CNN. Int. J. Appl. Earth Obs. Geoinf. 2021, 95, 102241. [Google Scholar] [CrossRef]
  45. Sasaki, Y. The truth of the F-measure. Teach. Tutor Mater. 2007, 1, 1–5. [Google Scholar]
  46. Quan, R.-C.; Wen, X.; Yang, X. Effects of human activities on migratory water birds at Lashihai Lake, China. Biol. Conserv. 2002, 108, 273–279. [Google Scholar] [CrossRef]
  47. Sakuma, A.; Yamano, H. Satellite constellation reveals crop growth patterns and improves mapping accuracy of cropping practices for subtropical small-scale fields in Japan. Remote Sens. 2020, 12, 2419. [Google Scholar] [CrossRef]
  48. Li, J.; Schill, S.R.; Knapp, D.E.; Asner, G.P. Object-based mapping of coral reef habitats using Planet Dove satellites. Remote Sens. 2019, 11, 1445. [Google Scholar] [CrossRef] [Green Version]
Figure 1. The study areas of three cultural-heritage properties.
Figure 2. The flowchart of the developed workflow.
Figure 3. An illustration of an image at the pixel level (a), image segmentation (b), and the image at the object level (c) (different objects are distinguished by colors), and a segmentation result of a remote-sensing sub-scene (d).
Figure 4. Land-cover changes in Yungang Grottoes (a). Zoomed images show the increase in solar panels in the southwest part of the buffer zone in Zone A (b); many low flat-roofed buildings were demolished, and a few buildings were built in the northern part of the buffer zone in Zone B (c). On-site photos show solar panels in Zone A (d), and most buildings have been demolished in Zone B (e).
Figure 5. Land-cover changes in Wooden Pagoda of Yingxian (a). The zoomed images show that there is a little construction work in Zone A (b); vegetation was converted into a few paths and barren land in the park in Zone B (c).
Figure 6. Land-cover changes in Chengcun Ancient City Ruins (a). The zoomed images show that a sluice was built across the river and that the road was widened and repaired on one side of the river in Zone A (b); vegetation was converted to farmland, and a new road was built on the other side of the river in Zone B (c).
Figure 7. Heat maps of transition matrix for three study sites (unit: ha): (a) Yungang Grottoes, (b) Wooden Pagoda of Yingxian, and (c) Chengcun Ancient City Ruins.
Figure 8. F-score of each class and overall classification accuracy (OA) for the three study sites.
Figure 9. F-score, precision, and recall of the change-detection results based on the proposed workflow for three study sites.
Figure 10. A comparison of the ability to detect delicate changes using traditional monitoring based on manual interpretation and the proposed workflow. (a) The whole patch covered by solar panels is identified as the changed patch (delineated in yellow), while the patch covered by sparsely distributed solar panels is missing (in blue rectangle); (b) most solar panels can be solely extracted (delineated in red).
Table 1. Data information of the three cultural-heritage properties.
| Site | Acquisition Time (Satellite), 2015 | Acquisition Time (Satellite), 2020 | Spatial Resolution | Core Area (ha) | Buffer Area (ha) |
|---|---|---|---|---|---|
| Yungang Grottoes | 2015/01/11 (WorldView-2) | 2020/04/12 (WorldView-3) | 1.07 m | 115 | 800 |
| Wooden Pagoda of Yingxian | 2015/11/01 (GeoEye-1) | 2020/04/25 (GeoEye-1) | 0.27 m | 7 | 395 |
| Chengcun Ancient City Ruins | 2015/04/05 (WorldView-3) | 2020/03/15 (WorldView-2) | 1.07 m | 59 | 1418 |
Table 2. Interference degree of three cultural-heritage properties.
| Site | Interference Degree (%) in Core Zone | Interference Degree (%) in Buffer Zone |
|---|---|---|
| Yungang Grottoes | 0.64 | 4.93 |
| Wooden Pagoda of Yingxian | 0 | 0.51 |
| Chengcun Ancient City Ruins | 1.11 | 3.54 |
Table 3. The number of objects after each processing step.
| Processing Step | Yungang Grottoes | Wooden Pagoda of Yingxian | Chengcun Ancient City Ruins |
|---|---|---|---|
| Segmentation | 15,877 | 19,995 | 7286 |
| CVA | 1942 | 135 | 647 |
| Training samples (2015) | 145 | 48 | 82 |
| Training samples (2020) | 162 | 46 | 78 |
| Classification | 1304 | 96 | 440 |
| Manual deletion of false changes | 185 | 19 | 118 |
| Manual delineation of missing changes | 147 | 9 | 16 |
| Final result | 1272 | 85 | 338 |
