Article

Airborne Hyperspectral Data Acquisition and Processing in the Arctic: A Pilot Study Using the HySpex Imaging Spectrometer for Wetland Mapping

1 Geophysical Institute, University of Alaska Fairbanks, 903 Koyukuk Dr., Fairbanks, AK 99775-7320, USA
2 Efficient Use of Water in Agriculture Program, Institute of Agrifood Research and Technology, Fruitcentre, Parc Científic i Tecnològic Agroalimentari de Lleida 23, 25003 Lleida, Spain
3 Remote Sensing Unit, Flemish Institute for Technological Research (VITO), Boeretang 200, 2400 Mol, Belgium
4 Alaska Satellite Facility, Geophysical Institute, University of Alaska Fairbanks, 903 Koyukuk Dr., Fairbanks, AK 99775-7320, USA
5 Yukon Flats National Wildlife Refuge, U.S. Fish and Wildlife Service, 101 12th Avenue, Fairbanks, AK 99701, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(6), 1178; https://doi.org/10.3390/rs13061178
Submission received: 25 February 2021 / Revised: 15 March 2021 / Accepted: 16 March 2021 / Published: 19 March 2021

Abstract
A pilot study for mapping Arctic wetlands was conducted in the Yukon Flats National Wildlife Refuge (Refuge), Alaska. It included commissioning the HySpex VNIR-1800 and HySpex SWIR-384 imaging spectrometers in a single-engine Found Bush Hawk aircraft; planning the flight times, direction, and speed to minimize the strong bidirectional reflectance distribution function (BRDF) effects present at high latitudes; and establishing improved data processing workflows for high-latitude environments. Hyperspectral images were acquired on two clear-sky days in early September 2018 over three pilot study areas that together represented a wide variety of vegetation and wetland environments. Steps to further minimize BRDF effects and achieve a higher geometric accuracy were added to adapt and improve the HySpex data processing workflow, developed by the German Aerospace Center (DLR), for high-latitude environments. One-meter spatial resolution hyperspectral images that included a subset of only 120 selected spectral bands were used for wetland mapping. A six-category legend was established based on previous U.S. Geological Survey (USGS) and U.S. Fish and Wildlife Service (USFWS) information and maps, and three different classification methods (hybrid classification, spectral angle mapper, and maximum likelihood) were used at two selected sites. The best classification performance was obtained with the maximum likelihood classifier, with an averaged Kappa index of 0.95; followed by the spectral angle mapper (SAM) classifier, with a Kappa index of 0.62; and, lastly, by the hybrid classifier, which showed a lower performance with a Kappa index of 0.51. Recommendations for future work include the concurrent acquisition of LiDAR or RGB photo-derived digital surface models as well as detailed spectra collection for Alaska wetland cover to improve classification efforts.


1. Introduction

In Alaska, wetlands cover twenty-two percent of the state’s area, according to the most recent survey carried out by [1]. However, over the past 200 years Alaska has lost less than one percent of its wetland area compared to an estimated fifty-three percent loss in other states in the US [2]. Climate change is likely to alter the historic stability of wetland conditions, particularly in the Arctic [1], as the hydrological inputs necessary for wetland formation change in response. The many climate destabilizers that Interior Alaska is experiencing may cause wetland degradation specifically through thermal erosion, permafrost thaw, changing snow cover amounts and duration, shifting precipitation patterns, and paludification [3,4,5,6]. In addition, the degradation or loss of wetlands may result in significant changes in weather systems and alter precipitation patterns themselves [7]. Wetlands in Alaska represent an important habitat that provide essential ecosystem functions and many benefits to humans, plants, and animals from local to continental scales. These benefits include food and habitat for vegetation, wildlife, fish and shellfish species, food and habitat for human subsistence gathering, flood storage and stormflow modification, ground-water recharge and discharge, and the maintenance of water quality [7,8].
Monitoring and describing changes to all physical, chemical, and biological parameters of Alaska's wetlands is unrealistic due to their scope and complexity. However, using identified and interpreted proxies to assess the condition of an environment is a proven ecological monitoring method [9]. Complex interactions between geology; topography; climate; and physical, biological, and chemical systems result in various hydrological regimes and wetland types, each with their own plant and animal species assemblages. As hydrological regimes become more extreme, the degree of specialization and fidelity of plant species increases [10]. Therefore, vegetation species assemblages may be used to infer wetland type, and this is the basis for the widely utilized Cowardin wetland classification system [11,12]. This ecological classification system was developed by the U.S. Fish and Wildlife Service to establish consistent terms and definitions for wetland inventories and to provide standard measurements for wetland mapping. It is based on vegetation cover interpreted from aerial photos and is also widely used in Canada for wetland mapping [11,12]. Traditionally, vegetation mapping and wetland inventories require labor-intensive, costly, and time-consuming field work, including taxonomical information, collateral and ancillary data analysis, and the visual estimation of percentage cover for each species [13]. This workload is exacerbated in Alaska by the added difficulty of accessing wetland areas that are far from population centers, road systems, or aircraft landing strips, and by the need to traverse difficult terrain. The national wetland inventory for Alaska is based almost entirely on 1978–1986, 1:60,000-scale, color-infrared imagery collected as part of the Alaska High Altitude Photography Acquisition Program (AHAP), with only 42 percent of the state having been mapped as of 2019 [14]. More recent inventories and studies of Alaskan wetlands [1,3,5,15,16] are often based on medium-resolution multi-spectral earth observation imagery, such as Landsat. Although orbital hyperspectral sensors such as Hyperion on board the EO-1 satellite have acquired high-latitude imagery, there are several drawbacks to their use, especially in wetland studies, including inadequate spatial resolution for upland wetland delineation and mapping [17] and substantial periods of cloud cover during summer at high latitudes that reduce the chance of quality data acquisition during satellite overpass.
Prior to 2015, there was no direct access to a research-grade hyperspectral imaging system in Alaska that could be deployed for airborne hyperspectral remote sensing [18]. It was only in 2017–2018 that the National Aeronautics and Space Administration (NASA) acquired Airborne Visible Infrared Imaging Spectrometer Next Generation (AVIRIS-NG) data over Alaska's boreal forests [19]. The HySpex airborne dataset, such as that presented in this pilot study, complements the NASA AVIRIS-NG dataset and has the advantage that it can be collected locally at targeted sites with a greater and controlled frequency, and at lower cost, than the NASA AVIRIS-NG dataset [20]. The campaign is also subject to the general advantages and disadvantages of airborne remote sensing for wetland mapping [21].
Compared to field-based wetland mapping, remote sensing techniques offer an economical and practical alternative to discriminate and estimate biochemical and biophysical parameters of wetland species [13] and to assist academic researchers and government agencies in mapping and monitoring wetlands [17,22]. In particular, hyperspectral imaging, also called imaging spectroscopy, has the advantage of capturing the distinct spectral signatures of land covers associated with wetlands within the 400 to 2500 nm spectral region [23]. However, hyperspectral imaging in high latitudes does come with challenges from low sun angles that introduce pronounced anisotropic effects to images [18]. The goal of this study was to demonstrate the capability of airborne hyperspectral imaging for wetland mapping using the Yukon Flats as a test case by means of (a) commissioning a hyperspectral imaging system, HySpex, in a small aircraft for airborne data acquisition in high-latitude environments; (b) correcting for geometric and radiometric distortions that are uniquely inherent in a high-latitude environment; (c) developing image processing protocols to generate prototypes of seamless mosaics, hypercubes, and thematically classified image products; and (d) mapping major wetland types in selected sites in the Yukon Flats.

2. Study Area

The Yukon Flats (Figure 1, top panel) is a 25,900 km2 wetland located within the Yukon Flats National Wildlife Refuge in eastern Interior Alaska. It is of international significance as a breeding area for migratory birds that use all the major flyways in North America. Three pilot study areas were selected within the Yukon Flats, representing the wide variety of available wetland habitats.
The Yukon Flats area is bisected by the Yukon River and the Arctic Circle, situated between the Brooks Range to the north and the White and Crazy mountains to the south, and extends 360 km east to west (65°45′ to 67°30′ north latitude and 142°30′ to 150°00′ west longitude). The area has low relief and is situated upon a broad plain of active and abandoned, poorly drained alluvial floodplain deposits with a high water table [24]. The Yukon Flats experience a cold continental subarctic climate, with extreme differences in temperature and solar radiation between summer and winter. Although the Yukon Flats lack extensive meteorological data records, the Fairbanks International Airport, located roughly 150 km south of the refuge, provides National Climatic Data Center values from 1951 to 2009 [3], with a mean annual temperature of −3 °C, a mean January temperature of −23 °C, and a mean July temperature of 17 °C. The annual precipitation at Fairbanks International Airport is 26.7 cm water equivalent, with snow cover from October through April [3].
Much of the Yukon Flats is underlain by discontinuous permafrost [16], which substantially affects the surface and subsurface hydrology. Approximately 40,000 shallow lakes cover an area of 1147 km2, and are mainly oxbow or thermokarst-type lakes [25]. Although the area receives little annual precipitation, permafrost layers inhibit water movement and drainage, and permafrost thaw leads to the development of shallow thermokarst lakes [3,24,25,26]. In addition, the short ice-free season in the region limits water loss due to evapotranspiration [27]. There are closed basin lakes (no outlet) and open basin lakes (with outlets) found within the Yukon Flats, with some lakes having direct contact with the groundwater table [24]. Due to the shallowness of the lakes and diurnal mixing, there is little stratification within the water column during summer. In the winter, many lakes freeze to the bottom, leaving these waterbodies devoid of permanent fish stock. The shallow lakes, developed in the complex carbonate-rich alluvial sediments of the basin, have highly variable nutrient concentrations, but most water bodies within the Yukon Flats are rich in nutrients and are either eutrophic or hypereutrophic. The high nutrient levels of many lakes allow for high populations of phytoplankton and invertebrates. Moreover, many lakes (~25%) are slightly brackish [24], with the potential of increased salinity concentrations and eutrophication as annual temperatures continue to rise and evaporation and permafrost degradation increase [28].
In addition to the abundant lakes and waterways, the Yukon Flats host a variety of habitats, with mixed boreal forest dominated by black spruce (Picea mariana) and white spruce (Picea glauca) covering much of the area. Stands of Alaska birch (Betula neoalaskana) and quaking aspen (Populus tremuloides) are common, along with willow (Salix ssp.) and alder (Alnus ssp.) thickets. Graminoid and sedge (Carex spp.) grasslands occupy many areas, while emergent plants (Equisetum spp., Typha spp., Scirpus spp.) are found within the wetlands and peripheries of lakes. The abundance and diversity of land cover types, along with the large number of shallow, nutrient-rich water bodies, have made the Yukon Flats an important breeding habitat for waterfowl [28], with over 100 different species and an estimated 0.5 to 1.5 million ducks, geese, and swans nesting there annually, including lesser scaup (Aythya affinis), white-winged scoters (Melanitta fusca), and horned grebes (Podiceps auritus) [29].

3. HySpex System Commissioning and Data Acquisition

3.1. HySpex Hyperspectral Imaging System

The HySpex hyperspectral imaging system, manufactured by Norsk Elektro Optikk (NEO), is configured for airborne acquisition using two sensors connected to a data acquisition unit (DAU) and an inertial measurement unit (IMU). The sensors are pushbroom (or along-track) scanners. Both sensors have low stray light levels, low polarization sensitivity, and high signal-to-noise ratios that allow for imaging highly dynamic scenes. The VNIR-1800 sensor samples spectra in the 400 to 1000 nm region, with 182 spectral channels sampling 3.26 nm per channel and a 16-bit radiometric resolution. It has a 17° field of view, with 1800 spatial pixels of 0.16 mrad across-track by 0.32 mrad along-track. The SWIR-384 sensor samples spectra in the 950 to 2500 nm region, with 288 spectral channels sampling 5.45 nm per channel and a 16-bit radiometric resolution. It has 384 spatial pixels with a resolution of 0.73 mrad along-track by 0.73 mrad across-track, and a 16° field of view. Both sensors were equipped with field expander optics that doubled the field of view (FOV), to 34° for the VNIR sensor and 32° for the SWIR sensor.
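As a rough illustration of this viewing geometry, the short Python sketch below estimates ground sample distance and swath width from the per-pixel IFOVs and the expanded fields of view. The 750 m flight altitude is an assumed, illustrative value (close to the altitude used in Section 3.3), and the calculation uses the small-angle approximation; it is not part of the published workflow.

```python
import math

# Illustrative HySpex viewing geometry (values from Section 3.1).
# The field expander optics double both the per-pixel IFOV and the total FOV.
ALTITUDE_AGL_M = 750.0   # assumed flight altitude above ground, for illustration only

sensors = {
    # name: (spatial pixels, across-track IFOV [mrad], along-track IFOV [mrad], FOV [deg] with expander)
    "VNIR-1800": (1800, 0.16 * 2, 0.32 * 2, 34.0),
    "SWIR-384":  (384,  0.73 * 2, 0.73 * 2, 32.0),
}

for name, (pixels, ifov_across, ifov_along, fov_deg) in sensors.items():
    gsd_across = ALTITUDE_AGL_M * ifov_across * 1e-3   # small-angle approximation, metres
    gsd_along = ALTITUDE_AGL_M * ifov_along * 1e-3
    swath = 2.0 * ALTITUDE_AGL_M * math.tan(math.radians(fov_deg / 2.0))
    print(f"{name}: GSD ~{gsd_across:.2f} m x {gsd_along:.2f} m, swath ~{swath:.0f} m")
```

With these assumed numbers, the along-track GSD of the VNIR-1800 (~0.5 m) and the SWIR-384 GSD (~1.1 m) are consistent with the targeted resolutions reported in Section 3.3.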
The DAU is a purpose-built Windows 7 machine that runs the HySpex acquisition and control software, HySpex AIR, and provides power to the HySpex sensors. The DAU software is monitored and controlled by the system operator in air via a retro-reflective touch screen. The IMU is an IMAR iTrace RT-F400 IMU/Global Positioning System (GPS). Coupled with a GPS receiver antenna, the IMU provided all kinematic measurements during acquisition, such as the acceleration, angular rate and roll, pitch, and yaw of the aircraft, as well as position and velocity.

3.2. Integration of HySpex into Aircraft

A Found Bush Hawk, owned and operated by the U.S. Fish and Wildlife Service, served as the acquisition platform for the HySpex system (Figure 2a). The VNIR-1800 and SWIR-384 sensors, along with the IMAR iTrace RT-F400 IMU/GPS, were attached to a passive vibration-dampening mount (Figure 2b) and secured to the aircraft with the across-track sensor line perpendicular to the aircraft's flight direction (Figure 2c). A GPS antenna, connected to the IMU by coaxial cable, was mounted to the roof of the aircraft, and the X and Y offsets between the antenna and the IMU were measured and input into the IMU for the georectification of images during post-processing. Field of view expander lenses were attached to each sensor (Figure 2d), increasing the VNIR-1800 and SWIR-384 across-track fields of view from 17° and 16° to 34° and 32°, respectively. This allowed for maintaining a flight altitude that would capture an approximately one-meter ground pixel resolution in the SWIR image while reducing the number of flight lines needed to cover a selected flight area with the required 40 to 75% side-lap for the later mosaicking of the scene. Finally, the system was controlled in flight by the compact, high-performance DAU, connected to a 1-terabyte solid-state hard drive and a compact, touch screen flat-panel monitor for in-flight operation and system monitoring (Figure 2e).
A QUINT-UPS 24-volt/3.4 amp-hour uninterruptible power supply provided power to the HySpex system in flight and was fed by a 12-volt to 24-volt DC converter plugged into the aircraft's electrical system. The system was powered and tested prior to flight to ensure reliable operation and an unobstructed view through the aircraft's floor aperture. While performing a ground test acquisition of the HySpex system, a visible and near-infrared high-contrast checkerboard was placed below the aircraft aperture and moved perpendicular to the detector lines of the HySpex sensors to simulate aircraft movement over the ground. The VNIR and SWIR images were then inspected visually to determine whether the view angles of both sensors would be unobstructed by the aircraft fuselage during flight acquisition (Figure 2e).
During camera integration in the aircraft, the offset between the IMU and GPS antenna receiver and the offset between the two HySpex cameras were measured with sub-centimeter precision. This information was important to incorporate in the processing stage to achieve the precise co-registration of the images acquired by both cameras.

3.3. Flight Planning and Data Acquisition

On 2 September and 3 September 2018, three areas in the Yukon Flats Refuge were flown (see Figure 1). The targeted spatial resolution was 0.5 m for the VNIR-1800 and 1 m for the SWIR-384, flying at 2451 feet (approximately 747 m) above ground level at a flight speed of 165 km·hr−1. These areas were chosen based on their water chemistry and ecological habitat. Two flight areas, A and B (Figure 1), were identified by Refuge personnel as "high priority" targets based on their historical knowledge of the plant and waterfowl communities. The camera frame period settings were provided by the system manufacturer, NEO; the maximum frame period, which prevents under-sampling of the ground scene, was determined by flight altitude and ground speed. A 40% side-lap between flight lines was used to adjust for aircraft banking and to aid in the georectification process. A test acquisition flight line was flown over the target area with increasing integration times until at least one spectral band was saturated for the relevant ground cover. The integration time was then reduced by roughly 10% of the saturation value to optimize the data quality for the ground cover type, while still leaving a margin for variability in reflectance levels throughout the spectral bands.
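The relationship between flight altitude, ground speed, and the maximum usable frame period can be illustrated with the simple sketch below. The saturating integration time is a hypothetical placeholder, and the calculation is a simplified reading of the planning logic described above rather than NEO's actual planning procedure.

```python
# Back-of-the-envelope check of the maximum camera frame period
# (flight parameters follow Section 3.3; the 10% headroom mirrors the text).
altitude_m = 2451 * 0.3048        # 2451 ft AGL in metres (~747 m)
ground_speed_ms = 165 / 3.6       # 165 km/h in m/s (~45.8 m/s)
ifov_along_mrad = 0.32 * 2        # VNIR along-track IFOV with field expander

gsd_along_m = altitude_m * ifov_along_mrad * 1e-3      # ~0.48 m on the ground
max_frame_period_s = gsd_along_m / ground_speed_ms     # longer periods under-sample along track
print(f"Along-track GSD: {gsd_along_m:.2f} m")
print(f"Maximum frame period: {max_frame_period_s * 1e3:.1f} ms "
      f"({1.0 / max_frame_period_s:.0f} frames per second or faster)")

# Integration time: start from the value that just saturates the brightest band
# over the target cover type, then back off by roughly 10% to keep headroom.
saturating_integration_ms = 10.0   # hypothetical value found during the test line
chosen_integration_ms = 0.9 * saturating_integration_ms
print(f"Chosen integration time: {chosen_integration_ms:.1f} ms")
```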
Images were acquired within two hours before and after solar noon. Study areas A and B were flown either east to west or north to south to account for and reduce the BRDF effects which are pronounced in areas of high latitude. A total of 7, 7, and 2 flight lines were taken for areas A, B, and C, respectively, with a total area coverage of 14.2, 12.5, and 3 km2, respectively. Raw image files for the VNIR and SWIR sensors and navigational data files from the IMU/GPS were recorded and stored on the DAU for processing.

4. Hyperspectral Data Processing

The HySpex data processing chain, originally developed by the German Aerospace Center (DLR) [30], was adapted for this study to address the challenges of hyperspectral imaging at high latitudes (see Appendix A for a graphic depiction of the image processing flow chain). This processing method uses raw imagery from the two HySpex cameras and positional and navigational data to produce robust, geo-registered surface radiance and reflectance products. In addition to the standard geometric and radiometric corrections, processing workflows were also added to minimize the effects of BRDF.

4.1. Raw Images to At-Sensor Radiance Images

Both the VNIR and SWIR cameras were radiometrically calibrated by the manufacturer, NEO, using an integrating sphere. This same calibration was applied prior to the aerial surveys to ensure that the acquired images had high radiometric quality. The 16-bit raw integer data acquired per flight line were converted to at-sensor radiance imagery according to [31], as summarized below. The first step is image acquisition in the raw VNIR and SWIR file format using image metadata. For each flight line, data were recorded on the DAU during image acquisition in 16-bit digital number (DN) format as follows:
DN[i,j] = N_i[i,j] · QE[i] · SF · RE[i,j] + BG[i,j],
where i is the spectral band number; j is the spatial pixel number; DN are the digital numbers (0 to 65,535); N_i is the number of incoming photons corresponding to spatial pixel j and band i during the integration time t; QE is the quantum efficiency (photoelectron-to-photon ratio) of the total system, including optics and detector, for band i; SF is a scaling factor expressing DN per photoelectron (determined during the radiometric calibration of the instrument); RE is the pixel response (relative responsivity) matrix; and BG is the background matrix.
Raw data were then converted to real time calibrated data, CN, using the relationship between real-time calibrated DN values and the incoming light, expressed as:
CN[i,j] = ((DN[i,j] − BG[i,j]) / RE[i,j]) · dw.
Finally, the at-sensor absolute radiance in W·m−2·sr−1·nm−1 was computed as follows:
LN[i,j] = (CN[i,j] · h · c · SL) / (QE[i] · SF · dw · t · A · Ω · Δλ[i] · λ[i]),
where t is the integration time, h is the Planck constant, c is the speed of light, A is the area of the light entrance aperture, Ω is the solid angle of a single pixel, Δλ[i] is the spectral sampling of the camera in nanometers, λ[i] is the center wavelength of band i in nanometers, and SL is a scaling factor determined during calibration.
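A minimal sketch of the digital-number-to-radiance conversion summarized by the two formulas above is given below, assuming all calibration quantities (QE, SF, RE, BG, SL, dw, band centers, and spectral sampling) come from the NEO calibration files; the values used in the example call are placeholders, and any unit bookkeeping between nanometers and meters is assumed to be absorbed into the calibration constants.

```python
import numpy as np

h = 6.62607015e-34      # Planck constant [J s]
c = 2.99792458e8        # speed of light [m/s]

def dn_to_radiance(DN, BG, RE, QE, SF, SL, dw, t, A, omega, d_lambda, lam):
    """DN, BG, RE: arrays of shape (bands, pixels); QE, d_lambda, lam: (bands,);
    the remaining arguments are scalars. Returns at-sensor radiance per band and pixel."""
    CN = (DN - BG) / RE * dw                               # calibrated counts CN
    denom = (QE * SF * dw * t * A * omega * d_lambda * lam)[:, None]
    LN = CN * h * c * SL / denom                           # at-sensor radiance LN
    return LN

if __name__ == "__main__":
    # Tiny synthetic example with placeholder calibration values.
    bands, pixels = 4, 5
    rng = np.random.default_rng(0)
    L = dn_to_radiance(DN=rng.integers(100, 60000, (bands, pixels)).astype(float),
                       BG=np.full((bands, pixels), 90.0), RE=np.ones((bands, pixels)),
                       QE=np.full(bands, 0.5), SF=10.0, SL=1.0, dw=1.0,
                       t=5e-3, A=1e-4, omega=1e-7,
                       d_lambda=np.full(bands, 3.26), lam=np.linspace(400, 1000, bands))
    print(L.shape)
```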

4.2. Image Orthorectification

The orthorectification and geocoding of the VNIR and SWIR at-sensor radiance flight lines were performed with PARGE (ReSe Applications software) using the navigation files generated by the IMU/GPS for each flight area. A subset of the 90 m resolution GLSDEM was used as the terrain elevation input and resampled to the final spatial resolution of the hyperspectral imagery, which was 1 m for both the VNIR and SWIR bands.
To ensure a proper alignment between both the VNIR and SWIR cameras, a boresight calibration was performed before the orthorectification process by flying over the University of Alaska Fairbanks campus prior to the image acquisition at the study area. Ground control points with a high GPS accuracy were identified and then used for aerotriangulation to determine the image attitude with respect to a local mapping frame and compared to the IMU-derived attitude matrix to derive a boresight matrix. This resulted in a boresight calibration offset that was applied in the georectification process to correct for the angular misalignment between the frames of reference of the IMU and the VNIR and SWIR cameras.
Orthorectification accuracy was evaluated by performing an automatic image-to-image registration between the VNIR band 171 (λ = 954 nm) and SWIR band 2 (λ = 954 nm) flight line orthoimages. One hundred tie points were generated between corresponding pixels of the two images, and their root mean square error (RMSE) was evaluated. An RMSE of less than 0.5 m, indicating sub-pixel accuracy between tie points, was deemed satisfactory and indicated that both cameras were successfully aligned. Flight lines were then layer-stacked into a single hypercube (see Figure 3).
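The sub-pixel co-registration check can be summarized by the small sketch below, which computes the RMSE of the offsets between matched tie points; the tie-point coordinates here are random placeholders standing in for the roughly 100 automatically generated matches.

```python
import numpy as np

# RMSE of VNIR/SWIR tie-point offsets (map units, metres). The coordinates are
# simulated with ~0.2 m of misregistration purely for illustration.
rng = np.random.default_rng(0)
vnir_xy = rng.uniform(0, 1000, size=(100, 2))
swir_xy = vnir_xy + rng.normal(0, 0.2, size=(100, 2))

residuals = np.linalg.norm(vnir_xy - swir_xy, axis=1)
rmse = np.sqrt(np.mean(residuals ** 2))
print(f"Tie-point RMSE: {rmse:.2f} m  (sub-pixel if < 0.5 m at 1 m resolution)")
```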
Although the VNIR images were collected at a nominal spatial resolution of 0.5 m, they were resampled using the nearest-neighbor method to a 1 m resolution to match the spatial resolution of the SWIR images.

4.3. Radiometric Correction

After generating an orthorectified hypercube, a radiometric correction was applied using ATCOR-4 (ReSe Applications), which is specially designed for HySpex hyperspectral data. ATCOR-4 performs both atmospheric and topographic corrections for airborne sensors in the optical region (0.4 to 2.5 μm) using topography, image geometry, aerosols, water vapor information, and sensor model information [32]. This procedure was necessary to correct for the effects of atmospheric water vapor, the optical thickness of the atmosphere, the position of the sun, and differences in illumination caused by topography. A brief explanation of ATCOR can be found in [33]. The DEM used in the orthorectification process also provides information on the elevation, slope, aspect, and scan angle for the ATCOR-4 corrections. The Aqua MODIS water vapor product (MYD05), acquired at about the same time as the airborne imagery, was used as a proxy for water vapor information. This product has previously shown good agreement when compared with ground data [34]. Finally, the flight line navigation files and the sensor model generated during calibration were also used by ATCOR-4. Images were collected on days with clear skies, and a default visibility value of 23 km was used to account for aerosol load.
An additional consideration when correcting radiometry is to minimize the BRDF impact arising from the variation in viewing and solar illumination geometry. Anisotropic reflectance can significantly alter the measured radiometry and surface reflectance of the same land cover type depending on the solar illumination, wavelength, surficial properties, and viewing angle of the sensor, and must be corrected to ensure consistent radiometry across a scene. The high solar zenith angles found in high-latitude regions such as Alaska [35] further exacerbate this effect. BRDF effects, such as those in our study area, are apparent in scenes where the view and/or solar zenith angles vary over a large angular range, causing the normally processed reflectance value for vegetated surfaces to deviate by up to 30% [32]. The influence of the BRDF effects can be split into topographic effects, dependent on the slope and aspect of the terrain, and effects arising from the structural and optical properties of the vegetation.
The topographic effects were partly accounted for in the topographic correction (image orthorectification); thus, the BRDF effects correction (BREFCOR) method was applied to the radiometrically corrected hypercube imagery in ATCOR-4 to account for the vegetation structure effects. The BREFCOR method corrects observer BRDF effects by applying a fuzzy classification index that covers all surface types as a unified continuous index, and then fits the Ross-Thick Li-Sparse reciprocal BRDF model to all flight line images within a flight area and land cover classes to obtain a generic BRDF correction function [32,36].
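To make the idea of a kernel-driven, nadir-normalizing BRDF adjustment concrete, the simplified sketch below uses only the Ross-Thick volumetric kernel with assumed kernel weights (f_iso, f_vol); the operational BREFCOR correction fits the full Ross-Li model per fuzzy cover class within ATCOR-4, which this sketch deliberately does not reproduce.

```python
import numpy as np

def ross_thick(sun_zen, view_zen, rel_az):
    """Ross-Thick volumetric scattering kernel (all angles in radians)."""
    cos_xi = (np.cos(sun_zen) * np.cos(view_zen)
              + np.sin(sun_zen) * np.sin(view_zen) * np.cos(rel_az))
    xi = np.arccos(np.clip(cos_xi, -1.0, 1.0))
    return (((np.pi / 2 - xi) * np.cos(xi) + np.sin(xi))
            / (np.cos(sun_zen) + np.cos(view_zen)) - np.pi / 4)

def normalize_to_nadir(refl, sun_zen, view_zen, rel_az, f_iso, f_vol):
    """Scale observed reflectance to its nadir-view equivalent for the same sun angle."""
    model_obs = f_iso + f_vol * ross_thick(sun_zen, view_zen, rel_az)
    model_nadir = f_iso + f_vol * ross_thick(sun_zen, 0.0, 0.0)
    return refl * model_nadir / model_obs

# Example: a pixel viewed 15 degrees off-nadir under a 60-degree solar zenith,
# with placeholder kernel weights standing in for the fitted model.
refl_corr = normalize_to_nadir(refl=0.30,
                               sun_zen=np.radians(60), view_zen=np.radians(15),
                               rel_az=np.radians(90), f_iso=0.25, f_vol=0.10)
print(f"Nadir-normalized reflectance: {refl_corr:.3f}")
```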

4.4. Spectral Binning and Final Mosaic

The hypercube output by BREFCOR contained 457 discrete spectral bands. Spectral binning was applied to the dataset to reduce the significant noise generated by high instrument sensitivity. After applying spectral binning, the hypercube was reduced to a total of 229 bands (see Table 1 and Figure 4). Additionally, atmospheric water vapor bands (from 1340 to 1411 nm and from 1786 to 1829 nm) were masked out of the data, as they contained no valuable information for the purposes of ground cover classification.
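A minimal sketch of 2x spectral binning (averaging adjacent band pairs) on a synthetic cube is shown below; the binning actually applied in this study was performed within the processing chain, and the cube dimensions here are placeholders.

```python
import numpy as np

# Average adjacent band pairs to raise the signal-to-noise ratio, roughly
# halving the band count (457 -> 229 in the published workflow; the water
# vapour bands were masked separately). The cube below is synthetic.
cube = np.random.rand(457, 128, 128).astype(np.float32)   # (bands, rows, cols)

n_bands = cube.shape[0]
pairs = n_bands // 2
binned = 0.5 * (cube[0:2 * pairs:2] + cube[1:2 * pairs:2])  # average band pairs
if n_bands % 2:                                             # keep a trailing odd band
    binned = np.concatenate([binned, cube[-1:]], axis=0)
print(binned.shape)   # (229, 128, 128)
```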
Once all flight lines were orthorectified, radiometrically corrected, and spectrally binned, they were mosaicked together to produce a single georeferenced hypercube image of the flight area. It was ensured that at least 25 tie points were found between two adjacent flight lines, leading to an RMSE for each flight area of less than 0.5 pixels.

5. Wetland Mapping

5.1. Category Definition

Flight areas A and B were classified following a six-category legend (see Table 2). These categories included the most relevant and common vegetation types found in the study area and were those that could be detected in 1 m resolution pixels. The accessibility issues and constraints posed by the remote study sites made it impossible to collect field spectra.

5.2. Training and Test Areas Selection and Band Selection

Training areas were photointerpreted and digitized using the 30 m spatial resolution "Vegetation Map and Classification: Northern, Western, and Interior Alaska" [37], which served as an auxiliary dataset. This map, developed in 2012, was derived from 18 previous regional maps and is the most recent and best vegetation map available for the study area. The visual interpretation of a HySpex natural color composite generated from bands 12 (λ = 487 nm), 41 (λ = 671 nm), and 55 (λ = 760 nm) and a HySpex false color composite generated from bands 41 (λ = 671 nm), 55 (λ = 760 nm), and 62 (λ = 804 nm) further aided the training area selection for vegetation classification. A total of 70% of the training areas were used as the training set for supervised classification, with the remaining 30% used in a post-classification accuracy assessment. The assignment of training and validation samples was performed randomly for each category.
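The per-category 70/30 random split can be sketched as follows; the polygon identifiers are placeholders for the digitized training areas of the six categories.

```python
import random

# Per-class random 70/30 split of training polygons into classification-training
# and accuracy-assessment sets. `polygons_by_class` is a placeholder structure.
random.seed(42)
polygons_by_class = {cls: [f"{cls}_{k}" for k in range(20)]
                     for cls in ["water", "equisetum", "bog", "spruce", "deciduous", "bare"]}

train, test = {}, {}
for cls, polys in polygons_by_class.items():
    shuffled = random.sample(polys, len(polys))      # random order, no repeats
    cut = int(round(0.7 * len(shuffled)))
    train[cls] = shuffled[:cut]                      # 70% for training
    test[cls] = shuffled[cut:]                       # 30% for accuracy assessment

print({cls: (len(train[cls]), len(test[cls])) for cls in train})
```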
Before image classification, a detailed spectral analysis of the defined categories was carried out to reduce the data redundancy of the hyperspectral imagery and to derive a subset of significant bands. For this purpose, the BandMax algorithm in ENVI was used. This algorithm increases classification accuracy by determining an optimal subset of bands that helps separate user-defined targets from known background materials [38]. Using the training areas in A and B as user-defined target and background materials, the BandMax algorithm identified a subset of 120 optimal bands for further wetland mapping.
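BandMax is a proprietary ENVI routine, so the sketch below substitutes a plainly simpler approach: ranking bands by a two-class separability score (difference of class means over the pooled standard deviation) and keeping the top 120. The target and background spectra are synthetic placeholders, and this is not the BandMax algorithm itself.

```python
import numpy as np

# Rank bands by a simple per-band separability score and keep the best k bands.
rng = np.random.default_rng(1)
n_bands = 229
target = rng.normal(0.4, 0.05, size=(200, n_bands))      # placeholder target spectra
background = rng.normal(0.3, 0.05, size=(500, n_bands))  # placeholder background spectra

pooled_std = np.sqrt(0.5 * (target.var(axis=0) + background.var(axis=0)))
separability = np.abs(target.mean(axis=0) - background.mean(axis=0)) / (pooled_std + 1e-12)

k = 120
selected_bands = np.sort(np.argsort(separability)[::-1][:k])  # indices of retained bands
print(selected_bands[:10])
```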

5.3. Image Classification Methods: Hybrid Classification, Maximum Likelihood and Spectral Angle Mapper (SAM)

Three methods were used for classifying wetlands: (a) maximum likelihood supervised classification, (b) a hybrid classification (unsupervised + supervised), and (c) spectral angle mapper (SAM). All classifications were run with the ENVI software.
For the maximum likelihood classification, the probability threshold can be set between 0 and 1, and a single probability threshold value of 0.5 was used for all classes. The hybrid classification consisted of an IsoData unsupervised classification combined with a supervised classification [39]. We ran IsoData several times with a minimum number of classes ranging from 7 to 48, 20 to 30 iterations, a minimum of 25 to 50 pixels per class, minimum class distances from 5 to 15, and 2 to 6 maximum merge pairs. Finally, in the supervised step, depending on the fidelity and representativeness thresholds, the clusters resulting from the unsupervised classification were linked to a corresponding thematic class or left unclassified. SAM is a physically based spectral classification method that uses an n-dimensional angle to match pixels to reference spectra [40]. The average spectra from the training areas were derived, and different threshold configurations (ranging from 0.15 to 1.75) for each category were used to run SAM.
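Of the three methods, SAM is simple enough to sketch compactly. The minimal implementation below assigns each pixel to the class with the smallest spectral angle, subject to a per-class threshold, and leaves the remaining pixels unclassified; the reference spectra, thresholds, and image are synthetic placeholders, and ENVI's implementation differs in its details.

```python
import numpy as np

def sam_classify(cube, references, thresholds):
    """cube: (rows, cols, bands); references: (classes, bands); thresholds: (classes,).
    Returns a label image; pixels above their best class threshold are coded -1."""
    pixels = cube.reshape(-1, cube.shape[-1])
    # cosine of the angle between every pixel spectrum and every reference spectrum
    cos = (pixels @ references.T) / (
        np.linalg.norm(pixels, axis=1, keepdims=True)
        * np.linalg.norm(references, axis=1)[None, :] + 1e-12)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))            # (n_pixels, n_classes)
    best = np.argmin(angles, axis=1)
    best_angle = angles[np.arange(angles.shape[0]), best]
    labels = np.where(best_angle <= thresholds[best], best, -1)
    return labels.reshape(cube.shape[:2])

# Synthetic usage with 3 classes and 120 bands.
rng = np.random.default_rng(0)
refs = rng.random((3, 120))
img = refs[rng.integers(0, 3, size=(50, 50))] + rng.normal(0, 0.01, (50, 50, 120))
labels = sam_classify(img, refs, thresholds=np.full(3, 0.15))
print(np.unique(labels, return_counts=True))
```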

6. Results and Discussion

6.1. Commissioning and Data Acquisition

The HySpex system was successfully commissioned into a Found Bush Hawk aircraft. The designed flexibility and modular nature of the HySpex system made it relatively straightforward to test, transport, install, and remove multiple times before the acquisition flights. This allowed us to perform preflight measurements and test components of the system at a local hangar. The custom-made aluminum mounting plate for the vibration-dampening mount and the wooden mounting board for the DAU and power system, with measured mounting hardpoints for the camera system and aircraft floor plates, ensured that the HySpex system could be easily mounted and unmounted as needed. This is a particularly useful feature for aircraft with multiple concurrent missions and for remote sensing in high-latitude regions such as Alaska that witness rapidly changing weather conditions. This flexible setup also reduced costs by not committing aircraft use solely to data acquisition. The experience gained in this study will be helpful for commissioning similar airborne data acquisition systems.

6.2. Image Processing

6.2.1. Systematic VNIR Sensor Response Drop Correction and Systematic Striping in VNIR and SWIR Spectral Bands

During initial geometric processing, a systematic drop in the VNIR sensor response was discovered along the edges of the acquired imagery (Figure 5). To evaluate the systematic drop, the VNIR camera was tested using a calibrated integrating sphere with and without the field of view expander optics attached. The drop was only present when the expander optics were attached. To maintain high radiometric fidelity, the low sensor-response areas along the edges of the raw image files were masked out for all flight lines. However, this resulted in a minor data gap of around 0.14 ha in the final mosaic for area B. This illustrates the importance of increasing the side-lap during flight planning to account for HySpex sensor response drop areas when using the field expander on the VNIR camera.

6.2.2. Geometric and Radiometric Corrections

The Found Bush Hawk proved to be an excellent acquisition platform for HySpex imagery that led to a robust geometric correction, with an RMSE of less than 0.5 m between flight lines. However, issues with the geometric correction were detected due to the relative instability that is typical of flight in smaller single-engine aircraft. Image artifacts were due to the inherent turbulence (high roll, yaw, and pitch) associated with small aircraft, which could not be automatically compensated for by the IMU. These errors were corrected through data post-processing. This is a quality versus convenience tradeoff. Although larger aircraft platforms may provide more stability and may need fewer flight attitude corrections while collecting imagery along a flight line, smaller aircraft provide more flexibility when commissioning instrumentation and flying over remote areas. Overall, the low RMSE of less than 0.5 m in the geometric correction obtained for the whole dataset collected in this study shows the robustness of both the platform and the geometric correction choice, which yielded high-end hyperspectral orthomosaics for high latitudes (Figure 6).
Real-Time Kinematic (RTK) GPS measurements can be used to obtain more accurate GPS readings and, therefore, a more accurate geometric correction. Subscriptions to satellite-based augmentation services, such as OmniSTAR, which utilizes L1/L2 carrier-phase correction signals with dual-frequency compatible receivers such as the IMAR iTrace, or the use of post-processed differential GPS corrections in regions with GPS receiver base stations, may also be used to increase the accuracy of the georectification.
Carefully planned flight lines reduced the differential illumination effects due to the position of the sun relative to the flight path (Figure 6). The specific flight plan designed for image acquisition in high-latitude areas (flying north to south and east to west) was successfully applied and BRDF effects were minimized in the dataset, thus allowing for seamless mosaics. However, the flight lines in area C displayed strong BRDF effects, as they were not collected using the same specific directions as in the other areas. This was particularly visible in the final mosaic (Figure 6, panel C), where the lower flight line was brighter than the upper one, yielding approximately a 32% reflectance difference between the two flight lines. This strongly suggests that for high-latitude hyperspectral image acquisition, flight plans should be designed in advance to fly north-to-south or east-to-west to minimize BRDF effects. Another factor that impacted the data quality in this study was the presence of a low cirrus cloud on an otherwise clear-sky day that obstructed one of the flight lines in area B, altering reflectance values. During geometric and atmospheric processing, several bands within the VNIR and SWIR images were found to exhibit systematic striping across the image in the along-track (or flight) direction. To ensure high radiometric fidelity, bands exhibiting systematic striping and spectral noise from water vapor, namely bands 1 to 3 (416 to 422 nm), 99 to 102 (726 to 736 nm), 122 to 128 (799 to 818 nm), 136 to 150 (843 to 888 nm), 156 (907 nm), and 200 to 204 (1112 to 1133 nm), were omitted from the final dataset. This was carried out before spectral binning and classification processing.
After removing these bands, the averaged spectra (training and test spectra dataset) for all land categories displayed a good visual performance except for water, which showed negative values (Figure 7; negative values for water were excluded for a better spectral comparison). In the water bodies, ATCOR-4 undercorrected the water spectra, giving negative values beyond 820 nm, where water reflectance is expected to be close to 0, similar to the results reported by [41]. For image classification purposes, negative values beyond 820 nm were set to 0. To compute absolute reflectance, in situ spectra as well as better water vapor estimates are needed for the radiometric correction.
Image-derived spectra for spruce and deciduous vegetation were visually compared to similar spectra taken from available spectral libraries. For the spruce category, 12 samples of white spruce (Picea glauca) and black spruce (Picea mariana) spectra [42] were averaged. For the deciduous category, 12 samples of paper birch (Betula neoalaskana) and aspen (Populus tremuloides) spectra were also averaged [42]. A total of 100 pixels representative of both categories in study areas A and B were sampled, excluding shadowed areas for better spectra comparison.
HySpex spruce spectra were similar in shape to published spruce spectra (Figure 8 lower panel) but had a lower overall reflectance, likely due to the differences in phenology. The spectral libraries included spectra from the peak summer whereas the HySpex images were collected in early fall. Although more research is needed to fully compare field and HySpex spectra for wetland characterization, the radiometric correction applied to HySpex imagery produced appropriate spectra for wetland land categories.

6.3. Image Classification: Results

The best classification performance for the A and B flight areas was obtained with the maximum likelihood classifier, with a Kappa agreement coefficient ranging from 0.94 to 0.96, followed by the SAM classifier, with a Kappa value ranging from 0.60 to 0.64, and, lastly, by the hybrid classifier, which showed a lower performance with a Kappa value ranging from 0.46 to 0.56 (Figure 9 and Figure 10, Table 3 and Table 4, and Appendix B Table A1, Table A2, Table A3 and Table A4). Moreover, the producer's and user's accuracies were also higher for the maximum likelihood classifier, with average values from 72% to 100%. SAM classification relies on external reference spectra; therefore, the lack of field spectra or of a spectral library specific to high-latitude vegetation, water, and soil types may have caused this method to perform poorly in this study. The training areas were used in place of a spectral reference dataset; however, this approach did not yield better results than the maximum likelihood classification. Pixel-based classifiers such as SAM perform best for hyperspectral imagery when extracting individual spectra for different plant species [43], which we lacked. Moreover, when these training areas were used with the SAM classifier, more than 35% of the final image was not classified due to the lack of proper spectra. To assign non-classified pixels to categories, a selective mode filter with a 3 by 3 convolution matrix was used. This removed non-classified pixels and increased image accuracy, but the performance was still poorer than that of the maximum likelihood method. In [44], the authors similarly found that maximum likelihood classification achieved a higher accuracy than SAM based on image-derived spectral endmembers utilizing airborne hyperspectral CASI data in an inland wetland complex near the Grand River, a tributary of Lake Erie in Ontario, Canada. That study also grouped different wetland vegetation species into broader land cover classes.
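For reference, the Kappa agreement coefficient can be computed from a confusion matrix as in the short sketch below; the matrix shown is a small made-up example, not one of the published matrices.

```python
import numpy as np

def cohens_kappa(cm):
    """Cohen's Kappa from a confusion matrix (rows = mapped class, columns = reference class)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    return (p_observed - p_expected) / (1.0 - p_expected)

example = [[50, 2, 1],     # hypothetical counts, for illustration only
           [3, 40, 5],
           [1, 4, 44]]
print(f"Kappa: {cohens_kappa(example):.2f}")
```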
The hybrid classification showed the lowest performance, with an averaged Kappa value of 0.5 for both flight areas. The misclassification between the equisetum, spruce (in area A), bog (in area B), and deciduous classes was the main reason for this lower performance. Finally, it is interesting to note that the low cirrus cloud that affected the bottom right section of area B caused misclassification in the hybrid and maximum likelihood classifications but did not impact the final SAM classification map (Figure 9). An advantage of SAM classification is that, because it depends on the angle between two vectors in n-dimensional space and not on the vector magnitudes, it is insensitive to varying magnitudes of illumination for the same cover type [40,45].
For all classifications, there was confusion between the spruce and equisetum categories in areas with emergent vegetation, although this confusion was smallest in the maximum likelihood classification. Increasing the training dataset over these areas a posteriori contributed to the better results obtained with maximum likelihood. Expanded training datasets were also used in both the hybrid and SAM classifications; however, the results did not improve. The mixing of plant and water signals in areas of flooded emergent vegetation results in a decrease in total reflected radiation, and the intensity of this effect depends on vegetation density, water depth, and canopy structure [46]. Due to the wide range of reflectance values represented in areas of emergent vegetation, the spectral signature of this class may overlap with the signals from water, terrestrial vegetation, and soil. The lowered reflectance values of the equisetum class in some areas were similar to the spruce class spectral signature, causing misclassification between the two when using spectrally based methods. Similar misclassifications between conifer species and emergent vegetation when using SAM on hyperspectral imagery were also described by [43] and in other wetland studies [22,44,46]. To solve this issue, a buffer of approximately 100 m along the lakes could be applied to mask out spruce pixels within these areas and reclassify them into the equisetum category. If available, a LiDAR flight or an optical camera used to derive a digital surface model may also help to discriminate between the two categories, as spruce trees are taller than equisetum.
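A minimal sketch of the kind of selective 3 x 3 mode filter mentioned above for filling unclassified pixels is given below, assuming unclassified pixels are coded as -1; ENVI's implementation may differ in its details.

```python
import numpy as np

def fill_unclassified(labels, unclassified=-1):
    """Assign each unclassified pixel the most frequent classified label among its
    3 x 3 neighbours (if any); classified pixels are left untouched."""
    filled = labels.copy()
    for r, c in zip(*np.where(labels == unclassified)):
        window = labels[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].ravel()
        window = window[window != unclassified]
        if window.size:
            values, counts = np.unique(window, return_counts=True)
            filled[r, c] = values[np.argmax(counts)]
    return filled

demo = np.array([[0, 0, 1],
                 [0, -1, 1],
                 [2, 2, 1]])
print(fill_unclassified(demo))
```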
Timing image acquisition during periods of maximum spectral separability between plant communities or species of interest may further enhance classification efforts. In [47], the authors found that spectral reflectance differences were statistically significant between stands of hydrophytes, with maximum separation of species occurring during flowering or early seed stages. This, however, requires a detailed characterization of spectral responses over time between land cover types, which currently does not exist for Alaska in published spectral libraries.
The deciduous and spruce classes also showed some confusion. Deciduous trees and spruce usually form mixed forest, which is not easy to classify in this landscape. Moreover, due to the low angle of the sun above the horizon, the resulting imagery contains self-cast shadows projected onto the same vegetation, which could lead to misclassification between the two categories. The spruce and deciduous classes were notably misinterpreted as equisetum in these shadowed forested areas.
Therefore, resampling the whole imagery at a 5 m spatial resolution could be useful to increase the accuracy for both categories by integrating the shadowed regions within a pixel and averaging the spectral response between shadowed and non-shadowed areas of the same class. The addition of a shadow class could also be useful here, especially for any spatial statistical analyses that end users might perform on the thematic maps for land management or other purposes.
The bare ground category showed an intermediate accuracy in area A but a low accuracy in area B and had the smallest area representation of all the categories. Thus, it was not an easy category to classify and needs to be better defined. Bare ground showed spectra similar to dead vegetation, and the spectral endmember of this category may not be static.
Finally, water, bog, and equisetum categories yielded a high averaged accuracy of more than 90% in the user and producer accuracy for the maximum likelihood classifier. These categories each covered broad land cover types and incorporated different vegetation types with similar spectral, spatial, and canopy characteristics. The further segmentation of these classes into narrower categories would require more in situ knowledge of and data for the imaged areas, as well as detailed spectral characterization, and possibly alternate classification techniques to accurately identify and assign narrower classes.

7. Conclusions

In this study, the viability of airborne hyperspectral imaging for wetland mapping in the high latitudes of Alaska is demonstrated. High-resolution, orthorectified imagery with good radiometry was produced for selected areas where only low-resolution and decades-old imagery existed. The classified vegetation maps derived from airborne hyperspectral images are an important contribution that helps to further the understanding of how vegetation is responding to the rapidly changing climate in Interior Alaska.
A hyperspectral imaging system (HySpex) was configured in a Found Bush Hawk aircraft and used to acquire data over selected sites in the Yukon Flats. Custom designed mounting provided the flexibility needed for the system to be installed in different aircraft and ensured fast setup and break down before and after each flight, freeing the aircraft for other flights. This flexible yet robust mounting system helped with reducing commissioning costs and ensures the long-term viability of airborne data acquisition. Data were acquired with optimized flight configuration over three study areas that together represented the variety of wetland vegetation cover and water chemistry.
The hyperspectral data processing chain developed by the German Aerospace Center (DLR) was adapted for high latitudes by including a step for BRDF removal. Boresight calibration helped in geometrically correcting both the VNIR and SWIR images with an RMSE of less than 0.5 m at a 1 m spatial resolution. Data were successfully corrected radiometrically using ATCOR-4 and ancillary water vapor data from the MODIS water vapor product. Although field spectra were not collected during image acquisition due to logistical constraints, visual comparison with available spectral libraries suggested a good radiometric correction.
For wetland mapping, a six-category legend was established based on previous USGS and USFWS information and previously available maps. Three different classification methods (hybrid classification, spectral angle mapper, and maximum likelihood) were applied to the two targeted areas using a spectral subset of 120 selected bands. Final wetland maps were successfully classified using the maximum likelihood method, with Kappa values higher than 0.94 and average user's and producer's accuracies of more than 90% for almost all categories. The best classification performance was obtained with the maximum likelihood classifier, followed by the SAM classifier and, lastly, the hybrid classifier, which showed a lower performance, with averaged Kappa indices of around 0.95, 0.62, and 0.5, respectively, for the two study areas. Although the SAM methodology is specifically suited to hyperspectral mapping, the lack of field spectra hampered the final outcome. It is important to note that the spruce and equisetum spectra in emergent areas were quite similar due to the decrease in reflectance caused by the integration of water in pixels covering areas of emergent vegetation. This led to the misclassification of pixels, especially when using the SAM and hybrid classifiers. Misclassification also occurred between the spruce and deciduous categories, although it was minor.
In order to improve these results, future work should focus on the integration of LIDAR data or a digital surface model derived from standard RGB data to distinguish between equisetum and spruce categories. In addition, building a spectral library for Alaska wetlands vegetation will improve classification efforts and allow for the application of other hyperspectral classification techniques.

Author Contributions

The research was conceived and designed jointly by P.G., J.C., and A.P. M.B. (Marcel Buchhorn) led the data processing and was assisted by P.G. and R.G. Data analysis was led by J.C. and assisted by P.G. M.B. (Mark Bertram) and N.G. assisted with the commissioning of the HySpex sensor and the image acquisition, which was led by M.B. (Marcel Buchhorn), J.C., and P.G. All authors contributed to the manuscript writing and editing. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to acknowledge the funding support from the National Science Foundation (NSF) award number MRI-1338193 and the U.S. Fish and Wildlife Service award number F15AC01133. Patrick Graham also received additional support from NSF award number OIA-1208927.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

We thank Martin Bachmann from DLR, Ray Kokaly from USGS, and Chris Waigl from UAF for sharing their expertise in hyperspectral data processing. We would also like to thank Dr. Don Hampton at the Geophysical Institute-UAF for his help in the HySpex camera calibration.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Hyperspectral Data Processing Workflow

The hyperspectral data processing workflow, adapted from the DLR data processing workflow (Habermeyer et al., 2012), consists of three steps: raw image processing and image orthorectification (Figure A1), radiometric and BRDF corrections (Figure A2), and mosaicking and spectral binning (Figure A3). The specific methods used in each step are described in Section 4. “Hyperspectral data processing”.
Step 1: Raw image processing and image orthorectification: For quality control, general sensor characteristics (e.g., spectral smile), sensor calibration and performance issues (e.g., striping, data drops), and external conditions during the overflight (e.g., cloud cover) are assessed. Using the HySpex RAD module (software provided by the manufacturer), the laboratory-derived calibration coefficients are applied to convert brightness values (DN) to at-sensor radiance. The IMU/GPS data are then input into the HySpex NAV module. The images are subsequently orthorectified in the PARGE software using the NAV data, the HySpex sensor model, and a digital elevation model (DEM). Additionally, the offset angles from the boresight calibration are applied. The outputs of Step 1 are VNIR and SWIR orthorectified images, scan angle files, and DEMs stacked into a hyperspectral supercube (see Figure A1).
Figure A1. Raw image processing and image orthorectification workflow.
Step 2: Complete radiometric corrections, including atmospheric, topographic, and BRDF corrections: This step brings together several input parameters measured or derived in Step 1 into the SPECTRA module (Figure A2). Atmospheric correction is performed using a radiative transfer-based approach with the ATCOR software to convert at-sensor radiance to surface reflectance values. Reliable atmospheric correction of the hyperspectral data requires a DEM and the robust parameterization of atmospheric column properties, including atmospheric gases (water vapor and oxygen) and aerosol optical thickness (AOT). The DEM products are also used to apply the BRDF corrections at this stage. The user can provide atmospheric parameters based on atmospheric profiles of the study site (if available); otherwise, the processing chain uses a modeled standard atmosphere for the specific geographic region. The output from this step is a data cube corrected for geometric, atmospheric, and BRDF effects.
Figure A2. Radiometric correction workflow.
Step 3: Mosaicking and spectral binning: After the radiometric correction step, the individual corrected flight lines are mosaicked. Finally, to further increase the signal to noise ratio, spectral binning is applied (Figure A3).
Figure A3. Mosaicking and spectral binning workflow.

Appendix B

Classification results for the hybrid and SAM classifiers for both A and B flight areas.
Table A1. Confusion matrix for area A hybrid classification. Results in %.
Wetlands Map to Test
Evaluation dataset     a      b      c      d      e      f      Total   Commission error   User's accuracy
Water (a)              100.0  0.0    0.0    0.0    0.0    0.0    100.0   0.0                100.0
Equisetum (b)          0.0    48.9   0.0    12.2   38.9   0.0    100.0   51.1               48.9
Bog (c)                0.0    5.8    66.2   0.5    26.1   1.4    100.0   33.8               66.2
Spruce (d)             0.0    30.7   18.4   44.7   6.1    0.0    100.0   55.3               44.7
Deciduous (e)          0.0    16.5   45.6   2.5    35.4   0.0    100.0   64.6               35.4
Bare ground (f)        0.0    0.0    82.9   0.0    0.6    16.6   100.0   83.4               16.6
Total                  100.0  100.0  100.0  100.0  100.0  100.0
Omission error         0.0    73.8   65.6   9.7    80.7   9.4    Kappa: 0.46
Producer's accuracy    100.0  26.2   34.4   90.3   19.3   90.6
Table A2. Confusion matrix for area B hybrid classification. Results in %.
Wetlands Map to Test
Evaluation dataset     a      b      c      d      e      f      Total   Commission error   User's accuracy
Water (a)              100.0  0.0    0.0    0.0    0.0    0.0    100.0   0.0                100.0
Equisetum (b)          0.0    53.4   0.0    5.4    41.2   0.0    100.0   46.6               53.4
Bog (c)                0.0    0.0    85.9   0.0    7.7    6.4    100.0   14.1               85.9
Spruce (d)             0.0    0.0    0.0    79.5   20.5   0.0    100.0   20.5               79.5
Deciduous (e)          0.0    9.1    17.8   1.7    71.4   0.0    100.0   28.6               71.4
Bare ground (f)        0.0    0.0    88.1   0.0    0.0    11.9   100.0   88.1               11.9
Total                  100.0  100.0  100.0  100.0  100.0  100.0
Omission error         0.6    8.2    82.6   12.8   59.0   11.9   Kappa: 0.56
Producer's accuracy    99.4   91.8   17.4   87.2   41.1   88.1
Table A3. Confusion matrix for area A SAM classification. Results in %.
Wetlands Map to Test
Evaluation dataset     a      b      c      d      e      f      Total   Commission error   User's accuracy
Water (a)              100.0  0.0    0.0    0.0    0.0    0.0    100.0   0.0                100.0
Equisetum (b)          0.0    40.5   16.9   24.9   17.6   0.0    100.0   59.5               40.5
Bog (c)                0.0    2.0    90.7   1.7    5.6    0.0    100.0   9.3                90.7
Spruce (d)             0.0    36.1   0.0    57.4   6.5    0.0    100.0   42.6               57.4
Deciduous (e)          0.0    1.4    0.0    4.3    94.2   0.0    100.0   5.8                94.2
Bare ground (f)        0.0    0.0    67.9   0.0    2.8    29.4   100.0   70.6               29.4
Total                  100.0  100.0  100.0  100.0  100.0  100.0
Omission error         0.0    27.4   31.4   57.2   55.2   0.0    Kappa: 0.64
Producer's accuracy    100.0  72.6   68.6   42.8   44.8   100.0
Table A4. Confusion matrix for area B SAM classification. Results in %.
Wetlands Map to Test
Evaluation dataset     a      b      c      d      e      f      Total   Commission error   User's accuracy
Water (a)              100.0  0.0    0.0    0.0    0.0    0.0    100.0   0.0                100.0
Equisetum (b)          0.0    48.8   0.0    18.8   32.4   0.0    100.0   59.5               40.5
Bog (c)                0.0    0.0    73.5   3.1    19.7   3.7    100.0   9.3                90.7
Spruce (d)             0.0    1.6    0.0    63.4   35.0   0.0    100.0   42.6               57.4
Deciduous (e)          0.0    44.1   7.3    3.2    45.1   0.3    100.0   5.8                94.2
Bare ground (f)        0.0    0.0    0.0    0.0    0.0    100.0  100.0   70.6               29.4
Total                  100.0  100.0  100.0  100.0  100.0  100.0
Omission error         0.0    27.4   31.4   57.2   55.2   0.0    Kappa: 0.64
Producer's accuracy    100.0  72.6   68.6   42.8   44.8   100.0

Figure 1. Map of the Yukon Flats National Wildlife Refuge, Alaska, showing flight areas A, B, and C. The upper panel is shown in a geographic coordinate system and the lower panels in the UTM Zone 6N projection; all panels use the WGS84 datum.
Figure 2. Commissioning the HySpex system for airborne acquisition. (a) Refuge Found Bush Hawk; (b) VNIR-1800, SWIR-384, and IMU/GPS units mounted to vibration dampening plate; (c) field of view expander optic attached to VNIR-1800 and SWIR-384 sensors; (d) detail of HySpex sensors in aircraft; (e) overhead view of the system secured and operating in aircraft; (f) testing system visibility through floor aperture of the aircraft.
Figure 3. Example of a HySpex hypercube for flight area C. A flight-line hypercube has two spatial dimensions (x and y) and one spectral dimension; each spatial pixel in the flight line has a corresponding spectral signal from 0.4 to 2.5 μm. Dark horizontal regions in the spectral dimension were masked out due to atmospheric water absorption bands.
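Figure 3 treats each flight line as a three-dimensional hypercube. The snippet below is a minimal sketch of that layout, assuming the cube is already loaded as a NumPy array ordered (lines, samples, bands); the array sizes, wavelength grid, and masked water-vapour ranges are illustrative placeholders rather than the actual HySpex flight-line values.

```python
import numpy as np

# Illustrative hypercube: (lines, samples, bands), not the real flight-line size.
n_lines, n_samples, n_bands = 2000, 1800, 229
cube = np.zeros((n_lines, n_samples, n_bands), dtype=np.float32)
wavelengths = np.linspace(418.0, 2508.0, n_bands)   # nm, spanning VNIR + SWIR

# The spectrum of one spatial pixel is a 1-D slice along the spectral axis.
row, col = 1000, 900
spectrum = cube[row, col, :]

# Masking atmospheric water-vapour regions (dark bands in Figure 3) keeps the
# usable part of the spectrum; the exact ranges here are assumptions.
water_vapour = ((wavelengths > 1350) & (wavelengths < 1450)) | \
               ((wavelengths > 1800) & (wavelengths < 1950))
usable_spectrum = spectrum[~water_vapour]
usable_wavelengths = wavelengths[~water_vapour]
```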
Figure 4. An example of the effect of spectral binning on the spectral profile of a single vegetation pixel. The top panel shows the unbinned profile with 457 bands; the bottom panel shows the 2x-binned profile with 229 bands. Note the increased noise in the unbinned profile (top panel).
Figure 5. Sensor response drop at the edge pixels of a VNIR camera flight line, indicated by the red line.
Figure 6. True color (red: band 38 (652 nm); green: band 22 (550 nm); blue: band 11 (481 nm)) orthomosaics for areas A, B, and C at 1 m spatial resolution. Flight area A shows a darkened flight line caused by low cirrus clouds that went undetected during image acquisition. Flight area C shows the bidirectional reflectance distribution function (BRDF) effect between two flight lines flown northeast to southwest: the flight line at the top of the image is darker for the same ground cover type. The position of the sun is shown below the north arrow in each image. Coordinates are in UTM Zone 6N; the datum is WGS84.
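The composites in Figure 6 use the bands closest to nominal red, green, and blue wavelengths (652, 550, and 481 nm). The sketch below shows one hedged way to select those bands and stretch them for display; the cube, wavelength grid, and percentile stretch are illustrative choices, not the study's exact visualization settings.

```python
import numpy as np

def true_color(cube, wavelengths, targets_nm=(652.0, 550.0, 481.0)):
    """Pick the bands nearest to the target RGB wavelengths and stretch them."""
    idx = [int(np.argmin(np.abs(wavelengths - t))) for t in targets_nm]
    rgb = cube[:, :, idx].astype(np.float32)
    # Simple 2nd-98th percentile stretch per channel, for display only.
    for k in range(3):
        lo, hi = np.percentile(rgb[:, :, k], (2, 98))
        rgb[:, :, k] = np.clip((rgb[:, :, k] - lo) / (hi - lo + 1e-9), 0, 1)
    return rgb, idx

cube = np.random.rand(200, 200, 229).astype(np.float32)   # placeholder mosaic
wavelengths = np.linspace(418.0, 2508.0, 229)               # placeholder grid (nm)
rgb, band_indices = true_color(cube, wavelengths)
```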
Figure 7. Averaged spectra extracted from training and test areas for all classification categories. Note that negative values for water were removed for better visual comparison of the spectra.
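The curves in Figure 7 are class-averaged spectra over training and test pixels. A minimal sketch of that averaging is given below, assuming a reflectance cube and a co-registered label raster of the same spatial size; both arrays and the class coding (0 = unlabelled, 1-6 = the legend classes) are illustrative assumptions.

```python
import numpy as np

def class_mean_spectra(cube, labels):
    """Mean spectrum per class from a cube (lines, samples, bands) and a label raster."""
    spectra = {}
    for cls in np.unique(labels):
        if cls == 0:                                   # skip unlabelled pixels
            continue
        mask = labels == cls
        spectra[int(cls)] = cube[mask].mean(axis=0)    # average over class pixels
    return spectra

cube = np.random.rand(100, 100, 120).astype(np.float32)   # 120 selected bands
labels = np.random.randint(0, 7, size=(100, 100))          # classes 1-6, 0 = none
mean_spectra = class_mean_spectra(cube, labels)
```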
Figure 8. Comparison of HySpex spectra (solid lines) and spectral library data (dashed lines) for the deciduous (upper panel) and spruce (lower panel) categories. Atmospheric water vapor and noisy bands were removed from the spectral library data for a better visual comparison with the HySpex spectra. Deciduous spectral library data were only available up to 2300 nm.
Figure 9. Wetland classification results for area A using a maximum likelihood classifier (right panel A), a spectral angle mapper (SAM) classifier (middle panel B), and a hybrid classifier (left panel C).
Figure 10. Wetland classification results for area B using a maximum likelihood classifier (top panel A), a spectral angle mapper (SAM) classifier (middle panel B), and a hybrid classifier (bottom panel C).
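The SAM results in Figures 9 and 10 assign each pixel to the reference spectrum with the smallest spectral angle. The sketch below is a generic NumPy implementation of that rule, not the software used in the study; the reference spectra, the 0.35 rad rejection threshold, and the array sizes are illustrative assumptions.

```python
import numpy as np

def sam_classify(cube, references, max_angle=0.35):
    """Assign each pixel to the class whose reference spectrum has the smallest angle."""
    n_lines, n_samples, n_bands = cube.shape
    pixels = cube.reshape(-1, n_bands).astype(np.float64)
    refs = np.asarray(references, dtype=np.float64)          # (n_classes, n_bands)

    # cos(angle) between every pixel and every reference spectrum
    dots = pixels @ refs.T
    norms = np.linalg.norm(pixels, axis=1, keepdims=True) * \
            np.linalg.norm(refs, axis=1)
    angles = np.arccos(np.clip(dots / (norms + 1e-12), -1.0, 1.0))

    best = np.argmin(angles, axis=1)
    labels = best + 1                                         # classes 1..n_classes
    labels[np.min(angles, axis=1) > max_angle] = 0            # 0 = unclassified
    return labels.reshape(n_lines, n_samples)

cube = np.random.rand(50, 60, 120).astype(np.float32)        # placeholder cube
references = np.random.rand(6, 120)                           # e.g., Figure 7 class means
classified = sam_classify(cube, references)
```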
Table 1. Number of bands, spectral range, and spectral resolution before and after the binning process.

Binning                    Sensor      Bands per hypercube   Spectral range (nm)   Spectral resolution per band (nm)
Without spectral binning   VNIR-1800   1–171                 416–955               3.26
                           SWIR-384    172–457               960–2509              5.45
2x spectral binning        VNIR-1800   1–85                  418–950               6.33
                           SWIR-384    86–229                957–2508              10.86
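The band counts in Table 1 result from averaging adjacent spectral bands. The sketch below illustrates a generic 2x spectral binning of a single cube, which roughly halves the band count (457 bands before binning, 229 after) and reduces per-band noise as in Figure 4; in the actual workflow the VNIR and SWIR cubes are binned separately, and the function name and array sizes here are assumptions for illustration only.

```python
import numpy as np

def spectral_bin_2x(cube, wavelengths):
    """Average adjacent band pairs of a (lines, samples, bands) cube."""
    n_bands = cube.shape[2]
    n_pairs = n_bands // 2
    binned = 0.5 * (cube[:, :, 0:2 * n_pairs:2] + cube[:, :, 1:2 * n_pairs:2])
    wl = 0.5 * (wavelengths[0:2 * n_pairs:2] + wavelengths[1:2 * n_pairs:2])
    if n_bands % 2:                       # carry over an unpaired trailing band
        binned = np.concatenate([binned, cube[:, :, -1:]], axis=2)
        wl = np.append(wl, wavelengths[-1])
    return binned, wl

cube = np.random.rand(100, 120, 457).astype(np.float32)   # illustrative size
wavelengths = np.linspace(416.0, 2509.0, 457)               # illustrative grid (nm)
binned_cube, binned_wl = spectral_bin_2x(cube, wavelengths)
print(binned_cube.shape)                                     # (100, 120, 229)
```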
Table 2. Wetland mapping classes and descriptions.

Water: Areas of open water lacking emergent vegetation.
Equisetum spp. and emergent vegetation: Areas where perennial herbaceous vegetation accounts for 75–100% of the cover and the soil or substrate is periodically saturated with or covered by water.
Bog, grasses, and sedge: Areas characterized by natural herbaceous vegetation, including grasses and forbs, that accounts for 75–100% of the cover.
White/black spruce: Areas of open or closed evergreen forest dominated by tree species (primarily Picea mariana and Picea glauca) that retain their leaves all year, so the canopy is never without green foliage.
Deciduous vegetation (including shrubs): Areas dominated by tree species (primarily Betula neoalaskana and Populus tremuloides) and shrubs, characterized by natural or semi-natural woody vegetation with aerial stems generally less than 6 m tall, with individuals or clumps ranging from not touching to interlocking (including Salix spp. and Alnus spp.), that shed foliage simultaneously in response to seasonal change.
Bare ground: Areas characterized by bare rock, gravel, sand, silt, clay, or other earthen material, with little or no "green" vegetation present regardless of its inherent ability to support life. Vegetation, if present, was more widely spaced and scrubby than that in the "green" vegetated categories.
Table 3. Confusion matrix for area A maximum likelihood classification. Results in %.

                        Wetlands Map to Test
Evaluation dataset      a      b      c      d      e      f    Total  Commission error  User's accuracy
Water (a)           100.0    0.0    0.0    0.0    0.0    0.0    100.0       0.0            100.0
Equisetum (b)         0.0   96.0    0.0    4.0    0.0    0.0    100.0       4.0             96.0
Bog (c)               0.0    0.0   97.8    0.3    0.3    1.7    100.0       2.2             97.8
Spruce (d)            0.0    0.0    0.0  100.0    0.0    0.0    100.0       0.0            100.0
Deciduous (e)         0.0    0.0    5.5    6.1   88.3    0.0    100.0      11.7             88.3
Bare ground (f)       0.0    0.0   55.6    0.0    0.0   44.4    100.0      55.6             44.4
Total               100.0  100.0  100.0  100.0  100.0  100.0
Omission error        0.0    0.0    8.4   14.0    0.7   27.3    Kappa: 0.94
Producer's accuracy 100.0  100.0   91.6   86.0   99.3   72.7
Table 4. Confusion matrix for area B maximum likelihood classification. Results in %.

                        Wetlands Map to Test
Evaluation dataset      a      b      c      d      e      f    Total  Commission error  User's accuracy
Water (a)           100.0    0.0    0.0    0.0    0.0    0.0    100.0       0.0            100.0
Equisetum (b)       100.0    0.0    0.0    0.0    0.0    0.0    100.0       4.3             95.7
Bog (c)               0.0   95.7    4.3    0.0    0.0    0.0    100.0       1.1             98.9
Spruce (d)            0.0    0.0   98.9    0.0    0.0    1.1    100.0       0.0            100.0
Deciduous (e)         0.0    0.0    0.0  100.0    0.0    0.0    100.0       7.9             92.1
Bare ground (f)       0.0    0.0    0.0    6.8   92.1    1.1    100.0       0.0            100.0
Total               100.0  100.0  100.0  100.0  100.0  100.0
Omission error        0.6    0.0    3.1   13.7    0.0   21.4    Kappa: 0.96
Producer's accuracy  99.4  100.0   96.9   86.3  100.0   78.6
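Tables 3 and 4 come from a Gaussian maximum likelihood classification, in which each class is described by the mean vector and covariance matrix of its training pixels and every image pixel receives the class with the highest log-likelihood. The sketch below is a minimal, generic implementation under equal prior probabilities; the regularization term, training data, and array sizes are illustrative assumptions rather than the study's settings.

```python
import numpy as np

def train_ml(training_pixels):
    """training_pixels: dict {class_id: (n_pixels, n_bands) array of training spectra}."""
    stats = {}
    for cls, X in training_pixels.items():
        mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])   # regularise
        stats[cls] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def ml_classify(cube, stats):
    """Assign each pixel the class with the highest Gaussian log-likelihood."""
    pixels = cube.reshape(-1, cube.shape[2]).astype(np.float64)
    class_ids = sorted(stats)
    scores = np.empty((pixels.shape[0], len(class_ids)))
    for k, cls in enumerate(class_ids):
        mean, inv_cov, logdet = stats[cls]
        d = pixels - mean
        mahal = np.einsum('ij,jk,ik->i', d, inv_cov, d)   # squared Mahalanobis distance
        scores[:, k] = -0.5 * (logdet + mahal)             # log-likelihood up to a constant
    best = np.argmax(scores, axis=1)
    return np.array(class_ids)[best].reshape(cube.shape[:2])

rng = np.random.default_rng(0)
training = {c: rng.normal(loc=c, size=(200, 120)) for c in range(1, 7)}   # six classes
cube = rng.normal(loc=3.0, size=(40, 50, 120))                             # placeholder cube
labels = ml_classify(cube, train_ml(training))
```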
