Article

Towards an Automated Approach for Monitoring Tree Phenology Using Vehicle Dashcams in Urban Environments

1 School of Geography, University of Nottingham, Nottingham NG7 2RD, UK
2 Nottingham Geospatial Institute, University of Nottingham, Nottingham NG7 2RD, UK
* Author to whom correspondence should be addressed.
Sensors 2022, 22(19), 7672; https://doi.org/10.3390/s22197672
Submission received: 6 September 2022 / Revised: 3 October 2022 / Accepted: 8 October 2022 / Published: 10 October 2022
(This article belongs to the Special Issue Sensors and Their Application in Phenological Studies)

Abstract

Trees in urban environments hold significant value in providing ecosystem services, which will become increasingly important as urban populations grow. Tree phenology is highly sensitive to climatic variation, and the resultant phenological shifts have significant impacts on ecosystem function; data on urban tree phenology are therefore important to collect. Typical remote methods for monitoring tree phenological transitions, such as satellite remote sensing and fixed digital camera networks, are limited by financial costs and by coarse spatial and temporal resolutions, so a data gap exists in urban settings. Here, we report on a pilot study evaluating the potential to estimate phenological metrics from imagery acquired with a conventional dashcam fitted to a car. Dashcam images were acquired daily in spring 2020 (March to May) for a 2000 m stretch of road in Melksham, UK. The pilot study indicates that time series imagery of urban trees, from which meaningful phenological data can be extracted, is obtainable from a car-mounted dashcam. A method based on the YOLOv3 deep learning algorithm proved suitable for automating stages of processing towards deriving a greenness metric, from which the date of tree green-up was calculated. These dates of green-up were similar to those obtained by visual analyses, with a maximum difference of 4 days, and differences in green-up between trees (species-dependent) were evident. Further work is required to fully automate such an approach, to extend it to other remote sensing capture methods, and to scale it up through authoritative and citizen science agencies.

1. Introduction

Street trees and urban forests influence quality of life for urban citizens. Trees lower air temperature, improve air quality, provide carbon sequestration, intercept precipitation, and regulate run-off and water quality [1]. Urban trees are a necessity for sustaining urban ecology [2]. They provide health benefits to humans, but can also negatively impact human well-being [3]. With 68% of the global population projected to be urbanised by 2050 [4], there will be an increased dependence on urban trees [5]. Urban trees thus require measurement and monitoring, with their phenological characteristics being a key required dataset [6,7].
Phenology has long been monitored through field observations, with this perspective supplemented recently by fixed digital cameras capturing hyper-temporal imagery (e.g., [8,9]). This is useful for object-based (individual tree) analysis, but deployment of fixed sensors across space at fine spatial resolutions is challenging. Sensors on satellites (Earth observation, EO) have also been usefully deployed [10,11], but may not be suitable for individual tree phenological study within the urban context [12]. Sensors on unoccupied aerial vehicles (UAVs) have also been shown to be useful, especially in fusion with satellite data [13,14], but their deployment can be subject to regulatory restrictions. In this paper, we suggest that image data captured by dashcams deployed in vehicles offer a new opportunity for urban-situated phenological measurement of trees. Such a remote sensing system offers a ground (road) level view of urban trees and, if deployed as part of a crowd-sourced sensor network or an authoritative network (e.g., work vans), provides the opportunity to measure the phenology of urban trees beyond the local scale. Furthermore, using dashcams in this way may present an opportunity to collect training and validation data for UAV and satellite-based phenological studies. We present a pilot study to explore the potential of using dashcams for phenological studies. As an initial step, this paper focuses on the extraction of tree green-up dynamics in an urban setting using a single vehicle dashcam sensor.

2. Materials and Methods

2.1. Data Capture

A survey transect along a road (2000 m in total) in Melksham, UK (51.3704, −2.1376) was driven daily (between 07:45 and 08:30 GMT) across a 90-day capture period (3 March 2020 to 31 May 2020). Silver Birch, Lime, Chestnut, Hazel, Rowan, and London Plane trees featured along the route, and six locations (Areas of Interest, AOIs), comprising single or multiple trees, were selected for further study (Table 1). A commercially available dashcam, the Nextbase 322GW (NBDVR322GW), was mounted to the car interior, connected by a 12–24 V DC car power cable, and used for data capture. The sensor resolution was 2.12 megapixels (1936 × 1097), with a recording resolution of 1920 × 1080 (High Definition) at 60 RGB frames per second (fps), a 140° lens angle, and an f/1.6 focal ratio. Pixel dimensions of a single extracted frame were 1366 × 768, with a resolution of 96 dpi and a 24-bit depth.

2.2. Automatic Detection of Roadside Trees at Different Phenological Stages

For each of the 90 capture days, the video stream frame containing the tree(s) at each of the six AOIs was manually extracted using the open-source VideoLAN Client (VLC) media player: video clips were panned through and then viewed frame by frame to identify the frame providing what was visually deemed the optimum view (herein referred to as an image). There was slight variation in this ‘optimum view’ across the 90-day capture period for each AOI, due in part to minor movement of the dashcam position, and in part to vehicles obscuring the trees. The six AOIs provided a 523-image dataset (see Table 1). Frames for some days were lost due to connection failures in the car (speed bumps, etc.).
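Frame selection in this pilot was manual; a first step towards automating it would be to sample candidate frames directly from the video stream. The sketch below is a minimal illustration with a hypothetical file name and sampling interval, not the procedure used in this study.

```python
# Sketch: sampling candidate frames from a dashcam clip with OpenCV.
# In the study itself, the optimum frame per AOI was chosen manually in VLC.
import cv2

def sample_frames(video_path, every_n=60):
    """Return roughly one frame per second (at 60 fps) from a dashcam clip."""
    cap = cv2.VideoCapture(video_path)
    frames, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:                    # end of stream or read failure
            break
        if index % every_n == 0:
            frames.append(frame)
        index += 1
    cap.release()
    return frames

candidates = sample_frames("dashcam_2020-03-03.mp4")  # hypothetical file name
```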
The You Only Look Once (YOLO) v.3 deep learning algorithm [15] was used to automatically extract the tree(s) at each AOI from each of the 523 images. Microsoft’s Visual Object Tagging Tool (VoTT) was used to annotate a subset of 126 images, periodically selected across the 90 days (Table 2). For each of the 126 training images, a bounding box was manually drawn around each tree and tagged with the ‘Tree’ class. The YOLOv3 algorithm was then deployed on the remaining images, outputting the original images with bounding boxes and confidence scores for each detected tree. The performance of the object detection model was evaluated with the Mean Average Precision (mAP) metric [16] on the AOIs used in the image analysis, which equated to 32 bounding boxes (some AOIs featured multiple trees).
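For illustration, the sketch below shows how YOLOv3 inference might be run with OpenCV’s DNN module to obtain per-tree bounding boxes and confidence scores. The cfg/weights file names are hypothetical, and the single ‘Tree’ class follows the annotation scheme described above; the exact training and inference pipeline used in the study is not reproduced here.

```python
# Sketch: YOLOv3 inference via OpenCV DNN, assuming a Darknet model trained
# on a single 'Tree' class. Model file names are hypothetical.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3-tree.cfg", "yolov3-tree.weights")
layer_names = net.getLayerNames()
out_layers = [layer_names[i - 1] for i in net.getUnconnectedOutLayers().flatten()]

def detect_trees(image, conf_threshold=0.30, nms_threshold=0.45):
    """Return [(box, score), ...] with box = (x, y, w, h) in pixels."""
    h, w = image.shape[:2]
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for output in net.forward(out_layers):
        for det in output:              # det = [cx, cy, bw, bh, objectness, class...]
            score = float(det[5])       # score for the single 'Tree' class
            if score > conf_threshold:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                scores.append(score)
    if not boxes:
        return []
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_threshold, nms_threshold)
    return [(tuple(boxes[i]), scores[i]) for i in np.array(keep).flatten()]
```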

2.3. Image Post-Processing and Calculation of Phenological Metrics

Following a visual review of the object detector outputs, the detection results output by the YOLO algorithm were filtered to remove outliers. A total of 64 bounding boxes had a minimum Y coordinate of 0, indicating that the box had reached the edge of the image; based on inspection of the optimum-view test images, it was assumed that any bounding box with a minimum Y coordinate of 0 had not adequately enclosed a tree. A further 62 bounding boxes were removed where the tree class confidence score was <0.30 (this threshold was determined via visual analysis, in which accurate tree bounding boxes were observed despite low confidence scores of between 0.30 and 0.50). The remaining bounding box coordinates were then used to crop all test images, reducing the image size for analysis to a new area of interest: a single tree.
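A minimal sketch of this filtering step, assuming detections in the (box, score) form produced by the detection sketch above; the field layout is illustrative.

```python
# Sketch of the outlier filtering described above: discard boxes that touch
# the top edge of the frame (minimum Y of 0) and boxes scored below 0.30,
# then crop the image to each surviving box.
def filter_and_crop(detections, image):
    crops = []
    for (x, y, bw, bh), score in detections:
        if y <= 0:          # box reached the image edge: unreliable
            continue
        if score < 0.30:    # below the visually determined confidence threshold
            continue
        crops.append(image[y:y + bh, x:x + bw])
    return crops
```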
Images were then converted from the RGB colour space to the Hue, Saturation, Value (HSV) colour space to mitigate potential error due to solar variance and brightness affecting scene illuminance [17,18]. For any images still containing background features (i.e., not the tree), a user-defined region of interest (ROI) was generated. The Green Chromatic Coordinate (GCC = green/(red + green + blue)) was then derived for each of the ~90 cropped HSV ROI images for each detected tree.
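The text does not fully specify how the HSV conversion and the GCC computation are combined, so the following sketch makes one plausible choice: the HSV value channel masks out very dark (shadowed) pixels, and GCC is computed from the RGB channels of the remaining pixels. The brightness threshold is an assumption.

```python
# Sketch: illumination-screened GCC for a cropped tree ROI.
import cv2
import numpy as np

def mean_gcc(bgr_roi, v_min=40):
    """Mean GCC = G / (R + G + B) over sufficiently bright ROI pixels."""
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
    bright = hsv[:, :, 2] > v_min                 # V channel masks shadow pixels
    b, g, r = cv2.split(bgr_roi.astype(np.float64))
    total = r + g + b
    gcc = np.divide(g, total, out=np.zeros_like(g), where=total > 0)
    return float(gcc[bright].mean()) if bright.any() else 0.0
```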
To further smooth the effects of variable scene illuminance, the temporal data for each tree were calculated as a three-day rolling mean [19]. The start of the leaf-on season was calculated for each tree as per previous studies [9,10]. The image archive was also visually inspected as part of a validation process to ascertain the quality of the automatically computed phenological metric of green-up (following [9]).
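A minimal sketch of the smoothing and green-up estimation, assuming a date-indexed pandas Series of daily GCC values; the baseline-plus-margin rule for the onset of green-up is an illustrative stand-in for the curve analysis used in the cited studies.

```python
# Sketch: 3-day rolling mean of daily GCC, then first rise above a pre-season
# baseline as the estimated green-up date. The margin and baseline window are
# illustrative assumptions.
import pandas as pd

def greenup_date(daily_gcc: pd.Series, margin: float = 0.02):
    smoothed = daily_gcc.rolling(window=3, min_periods=1).mean()  # 3-day rolling mean
    baseline = smoothed.iloc[:14].mean()        # assumed pre-season (leaf-off) level
    rising = smoothed > baseline + margin       # first exceedance marks green-up
    return rising.idxmax() if rising.any() else None

# e.g., greenup_date(pd.Series(gcc_values, index=pd.date_range("2020-03-03", periods=90)))
```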

3. Results

Tree Detection Evaluation Metrics

The evaluation dataset showed that the YOLOv3 object detector model obtained a mAP accuracy of 87.32% with the IoU threshold set at 50%, and a mAP accuracy of 70.73% with the IoU threshold set at 75%. The detector consistently identified trees in the images; however, there was some variation in the accuracy of the bounding boxes produced. For example, the detector sometimes output multiple bounding boxes for a single tree (23% of test images in AOI_2; 18% in AOI_3; 22% in AOI_4; and 12% in AOI_5). These were straightforward to eliminate from further analysis: typically, where a tree had two or more bounding boxes, one box had a much higher confidence score, and the others were removed. A trend observed across all AOIs was that the detector performed worse on trees with leaf off and better with leaf on (see Figure 1 for an example). AOI_1 and AOI_6 were omitted from further analysis following the detector inference, due to the added complexity of obtaining and organising the bounding boxes for image processing where a single image contained >2 trees, particularly due to overlapping bounding boxes.
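For reference, the mAP figures above depend on the intersection-over-union (IoU) between predicted and reference bounding boxes; a minimal implementation for boxes in (x, y, w, h) form:

```python
# Sketch: IoU between two boxes given as (x, y, w, h), the overlap measure
# underlying the mAP@50 and mAP@75 figures reported above.
def iou(box_a, box_b):
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))   # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))   # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0
```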
The GCC data for the HSV ROI images, and examples of the images used, are plotted for AOI_2, AOI_3, AOI_4, and AOI_5a and 5b (Figure 2). The phenological signal is well-defined for each AOI: some AOIs display the classic trapezoid phenology profile, and for all AOIs the greenness values increase rapidly from the start of the time period (March) through to its end (May). This corresponded to a period of leafing-on in the deciduous trees and associated increased photosynthetic activity. The HSV processing had a stretching effect on the GCC values up until the start of green-up, after which it smoothed the values.
The computed start of green-up (i.e., leaf on) based on the 3-day rolling mean GCC of each HSV ROI is shown in Table 3. Both the Horse Chestnut tree in AOI_2 and the Rowan tree in AOI_3 showed the onset of greenness in early to mid-April (AOI_2 on 10 April and AOI_3 on 14 April). AOI_2 plateaued in greening just four days later, on 18 April, while AOI_3 greened more slowly, reaching a plateau of greenness on approximately 9 May. The Plane tree of AOI_4 began green-up later in April, on 24 April, reaching maximum greenness on approximately 29 April. It was difficult to distinguish the onset of greenness in the trees in AOI_5. For both trees in AOI_5 there was an evident increase in greenness over time, although with significant fluctuations; nonetheless, green-up could be pinpointed to 23 April for AOI_5a and 4 May for AOI_5b. In both cases, the fluctuations in the data made it impossible to ascertain whether greenness had plateaued, so it was assumed that increased productivity for these trees continued beyond the study period. In all cases, these dates of green-up were similar to those obtained by visual analyses, with a maximum difference of 4 days (Table 3).

4. Discussion

The potential of dashcams deployed on a vehicle (e.g., a commuter car) to capture high-resolution, repeat imagery, subsequently enabling object-based analysis of urban trees (extraction of phenological metrics), has been demonstrated. Of particular note is that local variability in the green-up start date was captured, which is extremely important considering the heterogeneity of urban fabric and morphology. The use of dashcams in this way offers a particular advantage over other proximal remote sensing approaches, for example Google Street View data [20,21], in the high temporal frequency afforded (and determined by the user). Further research is required, however, and aspects of this are discussed below.
The mAP accuracies obtained by the YOLOv3 detector were of sufficient quality to calculate the green-up of extracted trees in subsequent image analyses. The YOLOv3 algorithm’s capacity to repeatedly detect the same tree from multiple images over time was largely successful, though further work using the YOLO detector should be undertaken across a much wider range of urban situations and across the whole year. Particular attention should be focused on ensuring the precise intersection of the validation and detection bounding boxes, as this is a determining factor in data consistency and is important when deriving the greenness metric over time; inconsistency in detector precision introduced error into the greenness metric because the analysis could not always be made on the same portion of the tree. Nonetheless, when used on dashcam data, this deep learning approach enabled object-based analysis, which is not viable from other remote sensing data-capture approaches, such as satellite remote sensing, at the temporal resolutions required. Additionally, the deep learning approach largely reduced positioning issues related to the sensor itself. The nature of the dashcam imagery and the urban environment meant that the data were subject to changing views: sensor movement and a changing road position of the platform (vehicle), due to overtaking cyclists, avoiding roadworks, etc., meant that a specific tree was not repeatedly captured from the same position. However, by obtaining the bounding box coordinates of a given tree, that tree could be extracted from the data captured by the sensor, and all further analysis could be made solely on the tree; thus, the ‘optimum view’ of a tree was not a necessity.
A limitation of using deep learning algorithms is the requirement for image annotations to produce a training dataset for the detector. Pre-existing labelled image catalogues are frequently used for training or testing (e.g., [22,23]), but this was not possible for this application. Unlike previous work, this study aimed to detect trees across a section of the phenological cycle, capturing trees from late winter through to early summer. Hence, the training dataset used to train the YOLOv3 detector needed to include annotated images across the phenological timeframe of interest: trees with leaf off, trees with leaf on, and the growth stages between these. The performance of the detector on trees with leaf off was comparably weaker than once trees had greened up. It is possible that this is due to the added complexity of visible bare branches, which requires more learning for the detector to work effectively; more research on this is required.
The greenness results reported in this study were largely successful in informing the dates of green-up in different species; thus, a better representation of phenological variation could be obtained and incorporated into an improved detector. The difference between the visual analyses and the automated calculation of green-up date (four days maximum) concurs with the timings of previous studies focused on measuring phenological shifts over time (e.g., [24,25]). Developing an optimum model will require balancing the training dataset size in addition to ensuring proportional representation of tree phenology. Additionally, when considering the potential reproducibility of the detector in other geographical locations, increased tree species heterogeneity in the training data is an advised area of improvement. Once the desired portion of a tree was extracted from the dashcam RGB data, the Green Chromatic Coordinate metric successfully captured the green-up of the tree. Unlike visual observations of phenological events throughout the seasons, the collection of repeat digital imagery allows for quantitative analysis of the change in greenness from leaf-off to leaf-on. The data for this study were largely captured during a morning commute, and consequently there was significant change in scene illuminance as a result of changing daylight, particularly up until the transition to British Summer Time. The HSV colour space used was effective in mitigating this, though further work would be useful.
The potential demonstrated here for low-cost deployment, passive integration of the sensor into a driver’s daily routine, and the use of vehicles of both authoritative (e.g., buses and local council vehicles) and non-authoritative bodies (e.g., a commuter) suggests that further research on establishing a dashcam network should be conducted. This should consider how citizen science could be used effectively (e.g., as per Citizens4EO [26]). Crimmins and Crimmins [27,28] have highlighted the opportunities for engaging the public by encouraging the sharing of repeat photography for capturing phenological events. OpenStreetCam (openstreetcam.org) and Treezilla (treezilla.org) are examples of platforms for collecting and providing free and open data on trees, with the former affording the sharing of dashcam footage; once shared, the automated extraction of phenological metrics could be used to support satellite-based analyses (as training/validation data). A challenge would be to co-ordinate citizen scientists on a larger scale. However, Crooks et al. [29] have recognised that sensor-derived content from crowd-sourced data has great potential in bettering the understanding of city systems, with Park et al. [30] and Rashid et al. [31] acknowledging that sharing dashcam footage will greatly extend urban surveillance. In 2019, AA Limited, a British motoring association, found that 24% of respondents already owned a dashcam, with an additional 18% stating that they were seriously considering purchasing one (www.theaa.com/about-us/public-affairs/aa-populus-driver-poll-summaries-2019#june2019; accessed on 10 May 2021). This potential volume of temporal data would be unmatched and offers real scope for the data capture of trees at a national level. Data captured by those who make repeated commuter journeys would be of particular interest.

5. Conclusions

The dashcam-capture approach presented in this paper points towards an automated process for deriving green-up metrics for urban trees and indicates the scope to ascertain further metrics relating to other aspects of vegetation, such as species, health, and size. Further methodological development is required to offer a fully automated process, including improving detector performance, scaling up capture, and the calculation of the phenological metric of green-up (and potentially senescence). Nonetheless, the results obtained in this pilot study suggest that this data capture method, and mobile computing advances in general, could be employed to produce an input dataset required by urban planners for decision making, in addition to acting as a suitable measure for key actions under the UN SDG targets (i.e., sustainable cities (SDG 11), life on land (SDG 15), and climate action (SDG 13)).

Author Contributions

Conceptualization, D.S.B., S.C. and G.F.; methodology, D.S.B., S.C. and G.F.; software, S.C.; formal analysis, S.C.; investigation, S.C.; resources, D.S.B.; data curation, S.C.; writing—original draft preparation, S.C. and D.S.B.; writing—review and editing, D.S.B. and G.F.; visualization, S.C.; supervision, D.S.B. and G.F.; project administration, D.S.B.; funding acquisition, D.S.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the UKRI (EPSRC) under Grant (EP/S023577/1).

Data Availability Statement

Dashcam data are available on request.

Acknowledgments

This research was undertaken as part of the Geospatial Data Science MRes within the EPSRC Centre for Doctoral Training in Geospatial Systems. The authors would like to thank all participating staff at the University of Nottingham, UK, who provided the training and expertise which enabled this study to be conducted.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Meng, L. Green with phenology. Science 2021, 374, 1065–1066. [Google Scholar] [CrossRef]
  2. Mullaney, J.; Lucke, T.; Trueman, S.J. A review of benefits and challenges in growing street trees in paved urban environments. Landsc. Urban Plan. 2015, 134, 157–166. [Google Scholar] [CrossRef]
  3. Wolf, K.L.; Lam, S.T.; McKeen, J.K.; Richardson, G.R.A.; van den Bosch, M.; Bardekjian, A.C. Urban trees and human health: A scoping review. Int. J. Environ. Res. Public Health 2020, 17, 4371. [Google Scholar] [CrossRef] [PubMed]
  4. United Nations. Department of Economic and Social Affairs. In World Urbanisation Prospects; United Nations: New York, NY, USA, 2019. [Google Scholar]
  5. Wood, S.L.R.; Jones, S.K.; Johnson, J.A.; Brauman, K.A.; Chaplin-Kramer, R.; Fremier, A.; Girvetz, E.; Gordon, L.J.; Kappel, C.V.; Mandle, L.; et al. Distilling the role of ecosystem services in the Sustainable Development Goals. Ecosyst. Serv. 2018, 29, 70–82. [Google Scholar] [CrossRef] [Green Version]
  6. Dallimer, M.; Tang, Z.; Gaston, K.J.; Davies, Z.G. The extent of shifts in vegetation phenology between rural and urban areas within a human-dominated region. Ecol. Evol. 2016, 6, 1942–1953. [Google Scholar] [CrossRef] [Green Version]
  7. Li, X.; Yuyu, Z.; Asrar, G.; Mao, J.; Li, X.; Li, W. Response of vegetation phenology to urbanization in the conterminous United States. Glob. Chang. Biol. 2017, 23, 2818–2830. [Google Scholar] [CrossRef]
  8. Richardson, A.D.; Jenkins, J.P.; Braswell, B.H.; Hollinger, D.Y.; Ollinger, S.V.; Smith, M.L. Use of digital webcam images to track spring green-up in a deciduous broadleaf forest. Oecologia 2007, 152, 323–334. [Google Scholar] [CrossRef]
  9. Morris, D.E.; Boyd, D.S.; Crowe, J.A.; Johnson, C.S.; Smith, K.L. Exploring the potential for automatic extraction of vegetation phenological metrics from traffic webcams. Remote Sens. 2013, 5, 2200–2218. [Google Scholar] [CrossRef] [Green Version]
  10. Li, F.; Song, G.; Liujun, Z.; Yanan, Z.; Di, L. Urban vegetation phenology analysis using high spatio-temporal NDVI time series. Urban For. Urban Green. 2017, 25, 43–57. [Google Scholar] [CrossRef]
  11. Moon, M.; Richardson, A.D.; Milliman, T.; Friedl, M.A. A high spatial resolution land surface phenology dataset for AmeriFlux and NEON sites. Sci. Data 2022, 9, 448. [Google Scholar] [CrossRef] [PubMed]
  12. Jaeger, D.M.; Looze, A.C.M.; Raleigh, M.S.; Miller, B.W.; Friedman, J.-M.; Wessman, C.A. From flowering to foliage: Accelerometers track tree sway to provide high-resolution insights into tree phenology. Agric. For. Meteorol. 2022, 318, 108900. [Google Scholar] [CrossRef]
  13. Park, J.Y.; Muller-Landau, H.C.; Lichstein, J.W.; Rifai, S.W.; Dandois, J.P.; Bohlman, S.A. Quantifying leaf phenology of individual trees and species in a tropical forest using unmanned aerial vehicle (UAV) images. Remote Sens. 2019, 11, 1534. [Google Scholar] [CrossRef] [Green Version]
  14. Thapa, S.; Garcia Millan, V.E.; Eklundh, L. Assessing forest phenology: A multi-scale comparison of near-surface (UAV, spectral reflectance sensor, phenocam) and satellite (MODIS, SENTINEL-2) remote sensing. Remote Sens. 2021, 13, 1597. [Google Scholar] [CrossRef]
  15. Redmon, J.; Farhadi, A. Yolov3: An incremental improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
  16. Padilla, R.; Netto, S.L.; da Silva, E.A. A Survey on Performance Metrics for Object-Detection Algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Niteroi, Brazil, 1–3 July 2020; pp. 237–242. [Google Scholar]
  17. Yang, W.; Wang, S.; Zhao, X.; Zhang, J.; Feng, J. Greenness identification based on HSV decision tree. Inf. Process. Agric. 2015, 2, 149–160. [Google Scholar] [CrossRef] [Green Version]
  18. Chaves-González, J.M.; Vega-Rodríguez, M.A.; Gómez-Pulido, J.A.; Sánchez-Pérez, J.M. Detecting skin in face recognition systems: A colour spaces study. Digit. Signal Process. 2010, 20, 806–823. [Google Scholar] [CrossRef]
  19. Sonnentag, O.; Hufkens, K.; Teshera-Sterne, C.; Young, A.M.; Friedl, M.; Braswell, B.H.; Milliman, T.; O’Keefe, J.; Richardson, A.D. Digital repeat photography for phenological research in forest ecosystems. Agric. For. Meteorol. 2012, 152, 159–177. [Google Scholar] [CrossRef]
  20. Branson, S.; Wegner, J.D.; Hall, D.; Lang, N.; Schindler, K.; Perona, P. From Google Maps to a fine-grained catalog of street trees. ISPRS J. Photogramm. Remote Sens. 2018, 135, 3–30. [Google Scholar] [CrossRef] [Green Version]
  21. Rodríguez-Puerta, F.; Barrera, C.; García, B.; Pérez-Rodríguez, F.; García-Pedrero, A.M. Mapping tree canopy in urban environments using point clouds from airborne laser scanning and street level imagery. Sensors 2022, 22, 3269. [Google Scholar] [CrossRef] [PubMed]
  22. Wegner, J.D.; Branson, S.; Hall, D.; Schindler, K.; Perona, P. Cataloging public objects using aerial and street-level images-urban trees. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 6014–6023. [Google Scholar]
  23. Jodas, D.S.; Yojo, T.; Brazolin, S.; Velasco, G.D.N.; Papa, J.P. Detection of trees on street-view images using a convolutional neural network. Int. J. Neural Syst. 2022, 32, 2150042. [Google Scholar] [CrossRef] [PubMed]
  24. Menzel, A.; Sparks, T.; Estrella, N.; Koch, E.; Aasa, A.; Ahas, R.; Alm-Kübler, K.; Bissolli, P.; Braslavská, O.; Briede, A.; et al. European phenological response to climate change matches the warming pattern. Glob. Chang. Biol. 2006, 12, 1969–1976. [Google Scholar] [CrossRef]
  25. Berra, E.F.; Gaulton, R. Remote sensing of temperate and boreal forest phenology: A review of progress, challenges and opportunities in the intercomparison of in-situ and satellite phenological metrics. For. Ecol. Manag. 2021, 480, 118663. [Google Scholar] [CrossRef]
  26. Boyd, D.S.; Foody, G.M.; Brown, C.; Mazumdar, S.; Marshall, H.; Wardlaw, J. Citizen science for Earth Observation (Citizens4EO): Understanding current use in the UK. Int. J. Remote Sens. 2022, 43, 2965–2985. [Google Scholar] [CrossRef]
  27. Crimmins, M.A.; Crimmins, T.M. Monitoring plant phenology using digital repeat photography. Environ. Manag. 2008, 41, 949–958. [Google Scholar] [CrossRef]
  28. Crimmins, T.M.; Crimmins, M.A. Large-scale citizen science programs can support ecological and climate change assessments. Environ. Res. Lett. 2022, 17, 065011. [Google Scholar] [CrossRef]
  29. Crooks, A.T.; Croitoru, A.; Jenkins, R.; Mahabir, P.; Agouris, P.; Stefanidis, A. User-generated big data and urban morphology. Built Environ. 2016, 42, 396–414. [Google Scholar] [CrossRef]
  30. Park, S.; Kim, J.; Mizouni, R.; Lee, U. Motives and concerns of dashcam video sharing. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 4758–4769. [Google Scholar]
  31. Rashid, M.T.; Zhang, D.Y.; Wang, D. DASC: Towards a road Damage-Aware Social-media-driven Car sensing framework for disaster response applications. Pervasive Mob. Comput. 2020, 67, 101207. [Google Scholar] [CrossRef]
Figure 1. Example comparison of detector outputs between leaf-off and leaf-on trees.
Figure 2. Temporal GCC plotted for AOIs 2, 3, 4, and 5 (two trees) calculated using the ROI HSV; examples of clipped RGB images, the converted HSV image, and the ROI in the HSV colour space, which was then also clipped to new coordinates from each image’s centroid.
Table 1. Description of Areas of Interest.

AOI | Contextual Descriptor | Number of Images | Single or Multiple Trees | Species
AOI_1 | Lime Row | 88 | Multiple | Lime
AOI_2 | Chestnut Tree | 88 | Single | Horse Chestnut
AOI_3 | Library Tree | 87 | Single | Rowan
AOI_4 | Roundabout Tree | 87 | Single | Plane
AOI_5 | Fire Station | 87 | Multiple | Acer
AOI_6 | Semington Rd. | 86 | Multiple | Birch, Long-Leaved Lime
Table 2. Days on which a sample of images was annotated for each AOI, with the aim of representing the vegetation at various growth stages.

Day of Capture Period | Day of Year | Date
DAY 1 | 63 | 3 March 2020
DAY 15 | 77 | 17 March 2020
DAY 30 | 92 | 1 April 2020
DAY 45 | 107 | 16 April 2020
DAY 60 | 122 | 1 May 2020
DAY 75 | 137 | 16 May 2020
DAY 90 | 152 | 31 May 2020
Table 3. Start-of-green-up date for each AOI.

AOI | Species of Tree | Dashcam Estimated | Visual Inspection Estimated
2 | Horse Chestnut | 10 April 2020 | 13 April 2020
3 | Rowan | 14 April 2020 | 10 April 2020
4 | Plane | 24 April 2020 | 22 April 2020
5a | Acer | 23 April 2020 | 20 April 2020
5b | Acer | 4 May 2020 | 4 May 2020