
Coupling UAV Hyperspectral and LiDAR Data for Mangrove Classification Using XGBoost in China’s Pinglu Canal Estuary

Ou, Jinhai; Tian, Yichao; et al.
In: Forests, Vol. 14 (2023-09-01), Issue 9, Article 1838


The fine classification of mangroves plays a crucial role in enhancing our understanding of their structural and functional aspects, which has significant implications for biodiversity conservation, carbon sequestration, water quality enhancement, and sustainable development. Accurate classification aids in effective mangrove management, protection, and preservation of coastal ecosystems. Previous studies predominantly relied on passive optical remote sensing images as data sources for mangrove classification, often overlooking the intricate vertical structural complexities of mangrove species. In this study, we address this limitation by incorporating unmanned aerial vehicle-LiDAR (UAV-LiDAR) point cloud 3D data with UAV hyperspectral imagery to perform multivariate classification of mangrove species. Five distinct variable scenarios were employed: band characteristics (S1), vegetation indices (S2), texture measures (S3), fused hyperspectral characteristics (S4), and a canopy height model (CHM) derived from LiDAR point cloud data combined with the UAV hyperspectral characteristics (S5). To execute this classification task, an extreme gradient boosting (XGBoost) machine learning algorithm was employed. Our investigation focused on the estuary of the Pinglu Canal, situated within the Maowei Sea of the Beibu Gulf in China. By comparing the classification outcomes of the five variable scenarios, we assessed the unique contributions of each variable to the accurate classification of mangrove species. The findings underscore several key points: (1) The fusion of multiple features in the image scenario led to a higher overall accuracy (OA) compared to models that employed individual features. Specifically, scenario S4 achieved an OA of 88.48% and scenario S5 exhibited an even more impressive OA of 96.78%. These figures surpassed those of the individual feature models, where the results were S1 (83.35%), S2 (83.55%), and S3 (71.28%). (2) Combining UAV hyperspectral and LiDAR-derived CHM data yielded improved accuracy in mangrove species classification. This fusion ultimately resulted in an OA of 96.78% and a kappa coefficient of 95.96%. (3) Notably, the incorporation of data from individual bands and vegetation indices into texture measures can enhance the accuracy of mangrove species classification. The approach employed in this study—a combination of the XGBoost algorithm and the integration of UAV hyperspectral and CHM features from LiDAR point cloud data—proved to be highly effective and exhibited strong performance in classifying mangrove species. These findings lay a robust foundation for future research efforts focused on mangrove ecosystem services and ecological restoration of mangrove forests.

Keywords: UAV hyperspectral; UAV-LiDAR; XGBoost; mangrove species classification; Maowei Sea

1. Introduction

Mangroves are a specific type of tree that flourish in coastal areas and possess an array of ecological values, such as functioning as windbreaks, stabilizing shorelines, and purifying seawater. As complex coastal ecosystems, mangroves exhibit significant carbon sequestration capabilities, establishing themselves as vital blue carbon ecosystems and thereby playing an essential role in mitigating global warming [[1]]. Mangrove ecosystems are characterized by species and structural diversity, ecosystem complexity, and spatial and temporal variability, factors that collectively influence the growth of different mangrove species [[3], [5]]. Analyzing the distribution of species within mangrove ecosystems elucidates their biodiversity by evaluating biomass and carbon storage capacities [[7]]. However, the extent of mangrove ecosystems has gradually diminished in recent years owing to human activities, particularly those associated with agricultural production [[9]]. The protection and restoration of mangrove forests not only have profound implications for the ecological environment and economic development of nations but have also become a focal point of international concern. The Ramsar Convention on Wetlands (https://www.ramsar.org/, accessed on 12 June 2023) underscores the global importance of mangroves [[11]], and many countries have enacted protective measures for mangrove ecosystems [[12]]. In 2020, China proposed a targeted action plan for mangrove protection and restoration (2020–2025) [[14]], aiming to plant and restore 18,800 hectares (ha) of mangroves by 2025. Research on mangrove species classification, together with an understanding of mangrove structure and function, is therefore urgently needed to provide a scientific foundation for their protection and restoration.

Various methods are available for investigating the diversity of mangrove species. The early traditional technique necessitated in-field classification, a process frequently found to be challenging, inefficient, and cumbersome, owing to the dense nature of mangroves and the uniqueness of their growth environments [[15]]. This direct approach could have unanticipated impacts on mangrove growth conditions. By contrast, remote sensing technology enables the collection of ground information without direct contact, providing multiband capability, efficiency, cost-effectiveness, and expansive coverage [[16]]. As a result, it has become widely utilized in mangrove monitoring [[11]]. Classifying mangrove forests through remote sensing imagery data not only enhances efficiency, but also minimizes manual intervention with forests, thereby averting adverse effects on mangrove ecological reserves. Common sources for mangrove classification include Landsat [[17]], IKONOS, QuickBird [[18]], SPOT [[19]], and other multispectral data, with bands ranging from four to seven. However, the complexity and diversity of mangrove communities may constrain multispectral characterization, yielding similar spectral responses among species [[20]], leading to potential misclassification. This challenge can be mitigated by incorporating hyperspectral remote sensing data [[22]].

Hyperspectral remote sensing images, characterized by substantial band information, high spectral resolution, and rich vegetation spectral reflectance, are applicable to relatively complex research areas [[23]]. In feature classification, hyperspectral data can offer elevated classification performance, particularly in the study of small-area features, thus breaking through the constraints of classification pixel size and exhibiting substantial potential [[24]]. For instance, Jia et al. [[24]] used hyperspectral data from the EO-1 Hyperion sensor and SPOT-5 data to classify mangrove forests, achieving an overall accuracy (OA) of 88%. However, that study covered a large area of 319.3 ha at a spatial resolution of 30 m × 30 m, hindering refined classification within smaller regions. Consequently, utilizing unmanned aerial vehicle (UAV) hyperspectral imagery for classifying mangrove species in confined areas has become an emerging research trend. Yi et al. [[25]] used UAV hyperspectral data with a 9 m × 9 m spatial resolution to classify mangroves in China's Shenzhen area and achieved an OA of 85.58%, and Hati et al. [[26]], who compared airborne hyperspectral data with spaceborne hyperspectral and multispectral data for mangrove classification, achieved the highest accuracy of 87.61% at a 5 m × 5 m resolution. Compared with Yi et al. [[25]], Hati et al. [[26]] used higher-resolution hyperspectral data and thereby improved the classification accuracy by 2.03%. These studies underscore the promising capabilities of hyperspectral data, especially UAV hyperspectral data, in achieving a highly precise classification of genera and species within complex mangrove forests.

Some researchers have begun to integrate multisource remote sensing data into their research and applications to enhance the precision and intricacy of mangrove classification [[27]]. With ongoing advancements in remote sensing imagery and the multifaceted nature of resolution selection, various types of remote sensing data sources, such as high-resolution remote sensing data and other multispectral data [[29]], can be chosen to synthesize and analyze mangrove attributes and transformations. The fusion of multiple data sources can offset the limitations inherent in individual data sources and provide more comprehensive and precise mangrove classification outcomes. Studies have shown that multisource remote sensing data can significantly augment mangrove classification accuracy. For example, Jiang et al. [[31]] utilized a combination of UAV-based imagery and WorldView-2 satellite imagery to classify mangroves and obtained a remarkable OA of 95.89%. Similarly, Cao et al. [[32]] used a fusion of UAV-based hyperspectral data and digital surface model (DSM) data for mangrove species classification, achieving an impressive OA of 97.22%. These results substantiate that UAV-based hyperspectral data can substantially enhance mangrove species classification accuracy and play a vital role in research.

The use of hyperspectral optical data alone has become inadequate for mangrove classification because of the intricate structure of mangrove species and high spectral resemblance among tree species. Moreover, the acquisition of optical data can be hampered by inclement weather conditions such as cloud cover and rain [[33]]. These constraints can be overcome by employing LiDAR, an active remote-sensing technology known for its high accuracy, resolution, and efficiency. LiDAR has become an integral component of remote sensing technology and has been widely adopted in mangrove monitoring because it can provide precise vertical 3D structural information [[34], [36]]. Studies such as those by Li et al. [[38]], who compared the contribution of image information from WorldView-3 and LiDAR in mangrove classification, and Wang et al. [[39]], who combined Sentinel-2 data with LiDAR for mangrove species classification, found that LiDAR significantly improved the classification accuracy and solidified its essential role in the process. Additional investigations, such as those by Dian et al. [[40]], Naidoo et al. [[41]], and Cao et al. [[42]], further corroborate the importance of integrating LiDAR and optical data for mangrove species classification. They highlighted how the addition of LiDAR height feature information could enhance the classification accuracy and kappa coefficient.

Nevertheless, the prohibitive costs associated with acquiring UAV LiDAR data and UAV hyperspectral imagery have deterred many scholars from using these resources. The majority opt for using LiDAR-fused or UAV hyperspectral-fused multispectral data for mangrove categorization [[43]]. However, the synergistic employment of high-precision UAV hyperspectral and UAV LiDAR data for mangrove species classification remains relatively uncommon, resulting in an untapped potential in this field. This lack of integration hampers the ability to leverage the advantages of superior classification results. Therefore, the exploration and development of methodologies that combine UAV hyperspectral and UAV LiDAR data for mangrove species research are crucial for advancing this field.

Based on the available remote sensing image information, the methodology employed for classifying mangrove species is of critical significance. Some scholars have utilized RGB features of images, relying on visual interpretation for manual classification, to visually assess and categorize mangrove forests [[45]]. However, this approach may result in misinterpretations, misclassifications, or omissions, stemming from the constraints posed by the low resolution of remote sensing images and the resemblances in symbiosis among mangrove species. The integration of machine learning algorithms with remote sensing images for species classification has emerged as a contemporary trend [[43]] because it can diminish the time spent on field investigation while providing a reliable and efficient solution. Many researchers have classified mangrove species by extracting vegetation indices or textural metrics from remote sensing images and applying machine-learning algorithms [[31], [41], [46]]. For example, Li et al. [[38]] exploited WorldView-3 data fused with LiDAR data and applied support vector machine (SVM) and random forest (RF) algorithms, achieving high OAs of 88% and 87%, respectively. However, the 2 m × 2 m resolution was inadequate for more nuanced mangrove classification. Jiang et al. [[31]] combined hyperspectral and multispectral imagery using the RF algorithm and achieved an OA of 95.89%. Cao et al. [[32]] juxtaposed the performance of the K-nearest neighbors (KNN) and SVM algorithms, with SVM outperforming KNN by attaining a classification OA of 88.66%, a gain of 6.57%. Xu et al. [[47]] employed the extreme gradient boosting (XGBoost) algorithm to achieve a mangrove classification accuracy of 96.41%, surpassing that of Jiang et al. [[31]]. Fu et al. [[48]] examined five fundamental models (RF, XGBoost, LightGBM, CatBoost, and AdaBoost) in a comparison of machine learning algorithms and ascertained that the XGBoost algorithm was suitable for mangrove classification, with the highest accuracy of 92.2%. Additionally, the XGBoost implementations by Fu et al. [[48]] and Xu et al. [[47]] demonstrated superior performance compared with those of Cao et al. [[32]]. XGBoost is a robust machine-learning algorithm extensively employed for classification, regression, and sorting tasks and is favored by researchers for its precision and computational efficiency [[49]]. Despite its popularity in inland tree classification [[50]], few studies have utilized XGBoost to classify mangrove species, particularly in the mangrove area of the Maowei Sea of the Beibu Gulf. However, both Fu et al. [[48]] and Xu et al. [[47]] showed that the XGBoost algorithm can be effectively deployed for mangrove classification, exhibiting commendable performance.

Construction of the Pinglu Canal commenced in August 2022, with its estuary situated in the mangrove region of the Maowei Sea in the Beibu Gulf. Canal development inevitably induces alterations in the landscape along its banks and potentially harms adjacent unspoiled mangrove habitats [[52]]. Unfortunately, there has been a dearth of scholarly attention directed towards examining the distribution pattern of mangrove species in the estuary of the Pinglu Canal, leaving a void in basic information regarding the canal's impact on the local mangrove species. To fill this research gap, the present study leverages UAV-based hyperspectral data and LiDAR point cloud data, in conjunction with XGBoost, to undertake fine spatial resolution classification of mangrove species in the estuary of the Pinglu Canal in the Maowei Sea of the Beibu Gulf. The principal objectives of this study were to (1) evaluate the influence of diverse characterization variables on mangrove classification, and (2) probe the practicability of amalgamating UAV hyperspectral imagery with UAV-LiDAR data for mangrove classification.

2. Materials and Methods

2.1. Study Area

The study area is situated in the coastal region of Qinnan District, Qinzhou City, Guangxi, southwestern China (21°51′57″ N–21°51′43″ N, 108°34′41″ E–108°35′7″ E), encompassing a total area of 19.70 ha, as illustrated in Figure 1. This region is characterized by a South Asian tropical monsoon climate, with an average annual temperature of 22 °C. This locale functions as the estuary of the Pinglu Canal, where large expanses of mudflats have amassed, providing a natural environment conducive to mangrove proliferation. Local inhabitants exploit this area for oyster farming, consequently influencing the mangrove ecosystem in the vicinity [[54]].

Human activities related to production and daily life have had a substantial impact on mangroves. In 2005, this region was officially designated as a natural reserve specifically for mangroves, encompassing four mangrove species: Sonneratia apetala (SA) Linn., Acanthus ilicifolius (AI) Linn., Aegiceras corniculatum (AC) Linn., and Kandelia candel (KC) Linn., in addition to a non-mangrove species, Cyperus malaccensis (CM) Lam. Among these, SA is an introduced alien species noted for its rapid growth and robust vitality [[35]], typically overshadowing other mangroves. AI is multifunctional, playing an ecological role while also being employed in traditional Chinese medicine for its antitussive and detoxifying properties. AC, with its elegant appearance, carries significant aesthetic value, whereas KC, distinguished by its thick branches and strong wind resistance, produces mature fruits that bear a resemblance to eggplants. CM, although not a mangrove, coexists within a similar growth environment. Collectively, the four mangrove species are widely dispersed along the coastline, serving as guardians of the Qinnan District and as natural sanctuaries for various coastal organisms.

2.2. Data Acquisition

2.2.1. Hyperspectral Data Acquisition

Data were acquired under clear and cloudless weather conditions on 15 December 2022. Mangrove forests were observed during the low-tide phase, which facilitated data collection. Prior to the aerial survey, potential obstacles such as utility poles were identified to ensure flight safety. A DJI Matrice 300 quadcopter UAV equipped with a Roving Spectrum Hyperspectral Imager supplied by OptTrace Co. Ltd. (Hangzhou, China) was deployed. A total of 220 effective hyperspectral bands were captured at a spatial resolution of 0.1 m × 0.1 m within a spectral range of 400–1000 nm (Figure 2). Table 1 lists the specific acquisition parameters. Post-acquisition image pre-processing included stitching, radiometric calibration, and atmospheric correction. The images were then resampled to a resolution of 0.5 m × 0.5 m through bilinear interpolation to optimize computational efficiency without compromising data integrity [[55]].
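For illustration, the final pre-processing step can be sketched as a bilinear downsampling of a hyperspectral GeoTIFF from 0.1 m to 0.5 m pixels. The rasterio library and the file names below are assumptions for illustration only; the paper does not specify the software used for resampling.

```python
# Hedged sketch: bilinear resampling of a hyperspectral cube from 0.1 m to
# 0.5 m pixels. File names and the use of rasterio are illustrative assumptions.
import rasterio
from rasterio.enums import Resampling

SCALE = 0.1 / 0.5  # ratio of source pixel size to target pixel size

with rasterio.open("mangrove_hyperspectral_0p1m.tif") as src:
    out_height = int(src.height * SCALE)
    out_width = int(src.width * SCALE)
    data = src.read(
        out_shape=(src.count, out_height, out_width),
        resampling=Resampling.bilinear,
    )
    # Adjust the geotransform so the coarser grid keeps its georeference.
    transform = src.transform * src.transform.scale(
        src.width / out_width, src.height / out_height
    )
    profile = src.profile
    profile.update(height=out_height, width=out_width, transform=transform)

with rasterio.open("mangrove_hyperspectral_0p5m.tif", "w", **profile) as dst:
    dst.write(data)
```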

2.2.2. UAV-LiDAR Data Acquisition

UAV-LiDAR technology was employed to acquire real-time, high-precision structural parameters of mangrove forests [[35]]. An AS-900HL laser scanner system featuring a high-performance, high-precision, and stable sensor was utilized to capture LiDAR point cloud data. This system incorporates multiple echo technologies, enabling the penetration of LiDAR signals through the vegetation. The scanner supports both individual and dual antenna modes, facilitating the rapid acquisition of high-precision point cloud data.

Data collection was synchronized with hyperspectral data acquisition on 15 December 2022. The flight parameters included a speed of 6 m/s, an altitude of 150 m, and forward and side overlaps of 80% and 70%, respectively. Pre-takeoff and post-landing standstill durations of three minutes were instituted to satisfy the data quality requirements. Subsequent data processing was performed using Inertial Explorer 8.9 software (NovAtel Inc., Calgary, AB, Canada) to calculate the position solution in the position and orientation system (POS), yielding high-precision navigation information. The data were further processed using Quick Terrain Modeler 8.2 software (Applied Imagery Inc., Chevy Chase, MD, USA) to generate point cloud products.

2.2.3. Mangrove Species Sample Data Acquisition

The acquisition of mangrove species sample data is critical for the accuracy of the classification model. A hand-held GPS device, the Jisibao UG802 (Beijing Unistrong Science & Technology Co., Ltd., Beijing, China), was utilized to obtain sample point coordinates accurate to eight decimal places for both latitude and longitude, and UAVs were also deployed to assist in sample acquisition. A total of 56 stations were selected, as shown in Figure 1c. The study area encompasses the mangrove species SA, KC, AI, and AC, in addition to the non-mangrove species CM. Given the scarcity of SA, only 39 sample pixels were obtained from three sites, whereas the more prevalent species AC yielded 11 sampling locations with 1683 pixels. Overall, 3193 sample quadrats, each measuring 0.5 m × 0.5 m, were acquired (Table 2). The sample data were partitioned using MATLAB 9.9 R2020B (MathWorks Inc., Natick, MA, USA); 70% was designated for model training and the remaining 30% for testing. To address the issue of data imbalance in model training, a sample weight adjustment technique was implemented.
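For readers working in Python rather than MATLAB, the 70/30 partition and the sample weight adjustment can be reproduced with scikit-learn as in the minimal sketch below; the feature matrix and labels are placeholders, and the use of a stratified split is an assumption not stated in the paper.

```python
# Hedged sketch: 70/30 split plus class-balanced sample weights, mirroring the
# partition and weighting described above (originally performed in MATLAB).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.utils.class_weight import compute_sample_weight

rng = np.random.default_rng(0)
X = rng.random((3193, 238))            # placeholder feature matrix (pixels x features)
y = rng.integers(0, 6, size=3193)      # placeholder labels: SA, AI, CM, AC, KC, MF

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42
)

# "Balanced" weights give the rare classes (e.g., SA with 39 pixels) more influence.
train_weights = compute_sample_weight(class_weight="balanced", y=y_train)
```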

2.3. Methods

The primary datasets employed for classification consisted of aerial UAV hyperspectral imagery and LiDAR point-cloud data. From the UAV hyperspectral data, we extracted 10 vegetation indices, seven texture measures, and 220 individual spectral bands pertaining to mangrove ecosystems. Concurrently, canopy height model (CHM) data were derived from the LiDAR point cloud dataset. The extracted feature parameters were amalgamated into five distinct scenarios. Subsequently, the XGBoost algorithm was deployed to execute species classification, drawing on a compiled dataset of mangrove species samples. Classification efficacy was assessed using an array of metrics, including OA, kappa coefficient, user accuracy (UA), and producer accuracy (PA). The evaluation was conducted using a confusion matrix methodology. Figure 3 illustrates the primary components and technical workflow of this study.

2.3.1. UAV-Hyperspectral Feature Extraction

The reflective properties of vegetation are influenced by various intrinsic attributes, such as leaf area index and foliar moisture content, thereby inducing disparities in spectral imagery. These variations provide unique characteristics for vegetation indices and texture measures [[56]], making them efficacious tools for species classification. Numerous studies on mangrove species classification underscore the robust utility of vegetation indices in this endeavor [[32], [41], [57]]. From the aerial UAV hyperspectral dataset, ten specific vegetation indices were extracted: blue green pigment index 2 (BGI2) [[59]], normalized difference vegetation index (NDVI) [[32]], renormalized difference vegetation index (RDVI) [[59]], transformed chlorophyll absorption in reflectance index (TCARI) [[60]], optimized soil-adjusted vegetation index (OSAVI) [[46]], modified chlorophyll absorption ratio index 1 (MCARI1) [[57]], modified chlorophyll absorption ratio index 2 (MCARI2) [[57]], photochemical reflectance index (PRI) [[61]], simple ratio index (SR) [[62]], and clumping index (CI) [[63]]. The mathematical formulations of these vegetation indices are presented in Table 3.
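As an illustration of how such indices are computed per pixel from the band cube, the sketch below implements three of the formulas in Table 3 with NumPy; the band numbers follow Table 3, and the cube is a random placeholder rather than the study data.

```python
# Hedged sketch: per-pixel vegetation indices computed from the 220-band cube
# using the band combinations of Table 3 (placeholder data, 1-based band ids).
import numpy as np

cube = np.random.rand(220, 100, 100)            # placeholder reflectance cube

def band(c: np.ndarray, k: int) -> np.ndarray:
    """Return band k (1-based, as in Table 3) as a 2D reflectance array."""
    return c[k - 1].astype(np.float64)

def ndvi(c):
    b75, b121 = band(c, 75), band(c, 121)
    return (b75 - b121) / (b75 + b121 + 1e-12)

def osavi(c):
    b75, b121 = band(c, 75), band(c, 121)
    return 1.16 * (b75 - b121) / (b75 + b121 + 0.16)

def simple_ratio(c):
    return band(c, 75) / (band(c, 121) + 1e-12)

# Stack the index layers that feed scenario S2.
indices = np.stack([ndvi(cube), osavi(cube), simple_ratio(cube)], axis=0)
```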

Texture measures provide localized spatial information by capturing tonal variations within an image [[64]]. Although the extraction of texture features from hyperspectral data could engender redundant band information, thereby attenuating the precision of classification models, this issue was mitigated by applying principal component analysis (PCA). PCA preserves the essential spectral information while simultaneously mitigating data redundancy and effectively performing a denoising function [[65]]. Prior to the extraction of the texture features, PCA was performed on 220 hyperspectral bands. The first principal component (PC1) accounted for a substantial 97.59% contribution rate, indicating that the preponderance of spectral information was retained within this component. The seven texture measures extracted based on PC1 were texture mean (TM), homogeneity (HOM), dissimilarity (Dis), correlation (Cor), variance (Var), entropy (Ent), and contrast (Con). The computational formulae for these texture measures are presented in Table 4.
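The PC1 computation and the gray-level co-occurrence matrix (GLCM) texture measures can be sketched as follows. A single whole-image GLCM is used here instead of the moving-window extraction typically applied in practice, the data are placeholders, and only a subset of the seven measures is taken from skimage's built-in properties, with entropy computed directly from the matrix.

```python
# Hedged sketch: PC1 from the 220 bands, then GLCM texture measures on PC1.
import numpy as np
from sklearn.decomposition import PCA
from skimage.feature import graycomatrix, graycoprops

cube = np.random.rand(220, 120, 120)              # placeholder reflectance cube
pixels = cube.reshape(220, -1).T                  # (n_pixels, n_bands)

pca = PCA(n_components=1)
pc1 = pca.fit_transform(pixels).reshape(120, 120)
print("PC1 explained variance ratio:", pca.explained_variance_ratio_[0])

# Rescale PC1 to 8-bit gray levels for the co-occurrence matrix.
gray = np.interp(pc1, (pc1.min(), pc1.max()), (0, 255)).astype(np.uint8)
glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

textures = {p: graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "dissimilarity", "homogeneity", "correlation")}
p = glcm[:, :, 0, 0]
textures["entropy"] = -np.sum(p[p > 0] * np.log(p[p > 0]))
```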

2.3.2. UAV-LiDAR Height Feature Extraction

LiDAR data serve as an invaluable resource for assessing the vertical stratification of forest ecosystems [[67]]. The CHM, a pivotal parameter for gauging this vertical structure, is constructed by acquiring high-resolution terrain and vegetation height data through methodologies such as LiDAR or radar beam technologies. The CHM is ascertained by subtracting the digital terrain model from the DSM [[35]]. Within the CHM, each pixel value symbolizes the vertical height from the ground surface to the apex of vegetation at a given coordinate. This provides multifaceted insights into various forest attributes, including vertical structure [[68]], biomass [[35], [69]], vegetation types [[70]], and ecosystem services [[71]].
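The CHM definition amounts to a per-pixel raster difference, as in the sketch below; the file names are illustrative placeholders, and the authors generated their CHM with the LiMARS software described in the next paragraph.

```python
# Hedged sketch: canopy height model as the per-pixel difference between the
# digital surface model (DSM) and the digital terrain model (DTM).
import numpy as np
import rasterio

with rasterio.open("dsm_0p5m.tif") as f_dsm, rasterio.open("dtm_0p5m.tif") as f_dtm:
    dsm = f_dsm.read(1).astype(np.float32)
    dtm = f_dtm.read(1).astype(np.float32)
    profile = f_dsm.profile

chm = np.clip(dsm - dtm, 0, None)      # clamp small negative values caused by noise

profile.update(dtype="float32", count=1)
with rasterio.open("chm_0p5m.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```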

This study utilized the CHM with a spatial resolution of 0.5 m × 0.5 m derived from LiDAR point cloud data. CHM was acquired using the mangrove aboveground biomass retrieval system based on UAV-LiDAR, which is a laser imaging and ranging system (LiMARS, Qinzhou, China) that was independently developed by Professor Yichao Tian's research team from Beibu Gulf University, China [[35]]. During processing, LiMARS applied denoising techniques to the LiDAR data. It separated the point cloud data into ground and non-ground points, generating a digital elevation model for normalization. Subsequently, a DSM was generated. Finally, the LiMARS 1.0 software was used to produce the CHM [[35]] (Figure 4). This comprehensive workflow provided accurate and reliable vegetation height information for further analysis and research.

2.3.3. XGBoost Machine Learning Algorithm

XGBoost is an efficient gradient-boosting algorithm that is widely applied in machine learning and data mining [[56]]. The algorithm uses decision trees as weak learners to accomplish gradient boosting. Uniquely tailored to handle high-dimensional and sparse datasets, XGBoost automatically eliminates redundant features and selects optimal computational parameters, thereby effectuating dimensionality reduction [[49]]. Within the XGBoost framework, the feature importance is ascertained, ranked, and normalized.

Hyperspectral remote sensing image classification often encounters challenges associated with data redundancy, owing to the myriad of correlated spectral bands. This surfeit of information can precipitate issues such as model overfitting and diminished prediction accuracy. To mitigate these limitations, this study leveraged the XGBoost library for training, prediction, feature importance calculation, and feature selection functionalities. The model configuration and validation were orchestrated using the scikit-learn library. To counteract imbalanced sample datasets, the "class_weight" parameter was set to "balanced", enabling the class weights to be adjusted in accordance with the sample size of each class. Hyperparameter tuning was performed using GridSearchCV, which iteratively explored diverse hyperparameter configurations and trained and validated the model before selecting the optimal hyperparameter combination for enhanced performance. The XGBoost algorithm was configured using the following parameters: learning_rate = 0.1, eta = 0.1, max_depth = 5, min_child_weight = 3, and n_estimators = 300. To further refine the model, noise reduction techniques were applied to the high-dimensional hyperspectral data. These adjustments were aimed at precluding model overfitting while augmenting both the accuracy and the mangrove classification capabilities of the model. The computational platform was an Intel Xeon Silver 4210 processor equipped with 64 GB RAM; model training was conducted in Spyder 4.2.5 (an open-source Python IDE, Spyder Community) with Python 3.8.
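A minimal sketch of this configuration is given below. The class balancing is reproduced with per-sample weights (the XGBoost scikit-learn wrapper itself has no "class_weight" argument), the grid is a small illustrative subset of the tuned hyperparameters, and the training data are placeholders.

```python
# Hedged sketch: XGBoost classifier with balanced sample weights and a
# GridSearchCV hyperparameter search, mirroring the set-up described above.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.utils.class_weight import compute_sample_weight
from xgboost import XGBClassifier

rng = np.random.default_rng(0)                      # placeholder training data
X_train = rng.random((600, 238))
y_train = rng.integers(0, 6, size=600)

param_grid = {
    "max_depth": [3, 5, 7],
    "min_child_weight": [1, 3, 5],
    "n_estimators": [100, 300],
}
xgb = XGBClassifier(learning_rate=0.1, objective="multi:softprob",
                    eval_metric="mlogloss", random_state=42)

weights = compute_sample_weight(class_weight="balanced", y=y_train)
search = GridSearchCV(xgb, param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train, sample_weight=weights)

# Reported optimum in the paper: max_depth=5, min_child_weight=3, n_estimators=300.
model = search.best_estimator_
```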

2.3.4. Feature Variable Scenario Selection

In this study, different feature variables were used for prediction, and five feature variable scenarios were established. Scenario 1 (S1) includes the 220 individual hyperspectral bands, S2 includes the 10 vegetation indices generated from the hyperspectral data, and S3 includes the seven texture measures extracted from the first principal component of the hyperspectral data. S4 and S5 are combined-variable scenarios. S4 combines the 220 individual hyperspectral bands, the 10 vegetation indices, and the seven texture measures. S5 adds active LiDAR data to S4, that is, the CHM generated from the LiDAR point cloud, which provides a distinctive height characterization. Scenario S5 therefore contains 238 variables. A sketch of how these scenarios can be assembled as feature matrices is given below.
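The placeholder arrays below are sized to the feature counts used in this study (220 bands, 10 indices, 7 textures, and 1 CHM layer); the stacking itself is the only point being illustrated.

```python
# Hedged sketch: per-pixel feature matrices for the five scenarios.
import numpy as np

n_pixels = 3193
bands    = np.random.rand(n_pixels, 220)   # S1: individual hyperspectral bands
vi       = np.random.rand(n_pixels, 10)    # S2: vegetation indices
textures = np.random.rand(n_pixels, 7)     # S3: GLCM texture measures on PC1
chm      = np.random.rand(n_pixels, 1)     # LiDAR-derived canopy height

scenarios = {
    "S1": bands,
    "S2": vi,
    "S3": textures,
    "S4": np.hstack([bands, vi, textures]),        # 237 variables
    "S5": np.hstack([bands, vi, textures, chm]),   # 238 variables
}
for name, X in scenarios.items():
    print(name, X.shape)
```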

Furthermore, based on the variable importance ranking of scenario S5 using the XGBoost algorithm, the importance of different variable classifications is discussed. XGBoost is based on decision trees and improves the prediction performance by iteratively adding trees [[49]]. In the model training process, the importance of the features is evaluated by calculating the information gain brought about by each feature split. Information gain represents the performance improvement of a model after segmentation using a feature [[73]]. Then, the information gain of each feature on all decision tree nodes is added to obtain the total information gain of each feature. To compare the relative importance of the different features, the total information gains are normalized to a percentage of the importance of the variable. Finally, the optimal model and the effects of individual and fused features on mangrove classification were evaluated by comparing the five scenarios from S1 to S5.
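The gain-based ranking and its normalization to percentages can be sketched as follows; the model fitted on placeholder data stands in for the tuned classifier, and XGBoost's default feature names (f0, f1, ...) stand in for the 238 variable labels.

```python
# Hedged sketch: total-gain feature importance from a fitted XGBoost model,
# normalized to percentages and listed for the 30 highest-ranked features.
import numpy as np
from xgboost import XGBClassifier

X = np.random.rand(500, 238)                       # placeholder data
y = np.random.randint(0, 6, 500)
model = XGBClassifier(n_estimators=50, max_depth=5, learning_rate=0.1).fit(X, y)

gain = model.get_booster().get_score(importance_type="total_gain")
names = np.array(list(gain.keys()))
values = np.array(list(gain.values()))
percent = 100.0 * values / values.sum()            # normalize to % importance

for i in np.argsort(percent)[::-1][:30]:
    print(f"{names[i]:>8}  {percent[i]:5.2f}%")
```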

2.3.5. Accuracy Assessment

This study employs a confusion matrix to assess the accuracy and performance of the XGBoost algorithm in classifying the mangrove forest species across the different scenarios, with the aim of evaluating the predictive model comprehensively [[74]]. The standard evaluation metrics derived from the confusion matrix include OA, which is the ratio of correctly classified samples to the total number of samples, and UA, which is the ratio of correctly classified samples to the total number of samples predicted to belong to a specific category by the classifier. PA is defined as the ratio of correctly classified samples to the total number of reference samples in a given class. The kappa coefficient [[75]], a statistical measure used to assess classifier performance, accounts for the agreement expected from random classification; a high kappa therefore indicates that the classifier's accuracy surpasses the accuracy expected from random assignment. These evaluation criteria yield values within the range of 0–1, where a value close to 1 signifies a high degree of classification consistency. To compute the confusion matrix, additional species data collected from the field samples were utilized as regions of interest. These samples were then compared against the classification outcomes to establish the accuracy evaluation metrics.
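A compact sketch of these metrics, computed from a scikit-learn confusion matrix on placeholder labels:

```python
# Hedged sketch: OA, per-class PA and UA, and the kappa coefficient derived
# from a confusion matrix (rows = reference classes, columns = predictions).
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

y_true = np.array([0, 0, 1, 1, 2, 2, 2, 3, 3, 4])   # placeholder reference labels
y_pred = np.array([0, 0, 1, 2, 2, 2, 2, 3, 1, 4])   # placeholder predictions

cm = confusion_matrix(y_true, y_pred)
oa = np.trace(cm) / cm.sum()                         # overall accuracy
pa = np.diag(cm) / cm.sum(axis=1)                    # producer's accuracy per class
ua = np.diag(cm) / cm.sum(axis=0)                    # user's accuracy per class
kappa = cohen_kappa_score(y_true, y_pred)

print(f"OA = {oa:.4f}, kappa = {kappa:.4f}")
print("PA:", np.round(pa, 3), "UA:", np.round(ua, 3))
```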

3. Results

3.1. Variable Importance Analysis

In this study, scenario S5 comprises 238 features, encompassing all the attributes found in scenarios S1 through S4. Consequently, all features were incorporated into the XGBoost model and ranked according to their respective degrees of importance. Subsequently, the top 30 most significant features were selected for visualization purposes (Figure 5). Within the framework of the mangrove classification model, the CHM attribute was the most prominent contributor. CHM provides insights into the spatial arrangement of mangrove forests, as derived from LiDAR data, and accounts for 3.03% of the influence of the overall feature set. Following CHM, TM assumed the second highest importance, with a contribution of 2.91%. TM represents the average gray value of the remotely sensed image and reflects the overall brightness of the image. "Near-Infrared Band 937" (NIR937) emerged as the third most crucial attribute, contributing 2.03%. NIR937 is centered at a wavelength of 937.72 nm and belongs to the near-infrared region.

Notably, among the top thirteen attributes in terms of importance, seven are related to texture measures. This observation indicates that texture measures play a substantial role in mangrove classification within the context of the fusion variables. Within the list of the thirty most significant variables, six correspond to vegetation indices: MCARI2, BGI2, CI, PRI, NDVI, and TCARI. The remaining variables within this subset pertain to the individual spectral bands.

3.2. Comparison of Mangrove Classification Accuracy

Subsequently, the OA and Kappa coefficients of scenarios S1–S5 were evaluated using the test samples, as presented in Table 5. Among the five feature-variable scenarios in this study, multi-feature fusion variable S5 exhibited the highest OA (96.78%, Kappa = 95.96%), surpassing S4 by 8.3% in OA and 10.54% in Kappa. The highest accuracy among the individual features was observed in S2 (OA = 83.55%, Kappa = 79.07%), which was similar to S1, but with only a marginal difference of 0.2% in OA and 0.26% in Kappa. A positive correlation was evident between the OA and Kappa coefficients across all feature scenarios, ranked from highest to lowest accuracy as follows: S5 > S4 > S2 > S1 > S3. Notably, the prediction accuracy of the seven texture features extracted from the first principal component, S3, exhibited a relatively low performance (OA = 71.28%, Kappa = 63.36%).

The introduction of multiple features considerably improved the accuracy of mangrove species prediction, particularly after incorporating LiDAR-derived CHM data. This enhancement led to the highest OA. This finding aligns with the outcomes of Cao's study [[42]], although their approach involved rotation forest and logistic model tree algorithms, resulting in less significant OA improvements of 2.09% and 2.78%, respectively. By contrast, this study, utilizing the XGBoost algorithm, which is renowned for its adeptness in feature selection, yielded a more substantial accuracy improvement. The addition of CHM data in S5 resulted in an OA surpassing that of the individual feature scenarios S1, S2, and S3 by 13.43%, 13.23%, and 25.50%, respectively.

The UA and PA values of the classification results were calculated using a confusion matrix (Table 5). In scenarios S4 and S5, the UA reached 100% for the SA and AI categories, and in S5 the UA of all other classes also exceeded 93%. Notably, scenario S3, which employed an individual hyperspectral texture approach, demonstrated low PA values for AI and KC. A visual representation of the normalized confusion matrices for each category across the five scenarios is presented in Figure 6. In S3, AI and CM experienced significant misclassifications, followed by AC and KC. In this study, the AI classification was notably affected by the non-mangrove CM. An improvement in classification accuracy was observed when vegetation indices and individual hyperspectral bands were added to S3 to form S4. Furthermore, the incorporation of LiDAR-derived CHM data in S5, building on S4, significantly enhanced the model's ability to identify the AI, CM, AC, and KC populations. Consequently, the UA and PA values for these species all reached at least 93.06%.

3.3. Mapping Mangrove Forests

The classification results for the five scenarios were visualized based on the model results (Figure 7). Notably, the S3 scenario displayed a limited distinction between the AI and KC categories, whereas CM exhibited subdivisions. The classification outcomes of S3 indicated fewer pixels classified as AI, KC, or CM. The densely clustered forest area of SA resulted from its more pronounced spectral characteristics and CHM data distinctions compared to those of other species. A closer examination of the localized classification outcomes (Figure 8) revealed that the multi-feature fusion variable classification showed enhanced visual performance. Conversely, individual feature forest scores demonstrated greater refinement in zoom area 1. Zooming further on the classification results for the SA area in zoom area 2, it became evident that the location of the SA was identifiable across all five scenarios. However, scenario S5 demonstrated an accurate identification of the extent of SA. In contrast, the other scenarios identified only a small number of pixels as SA. Within the study area, SA forms a concentrated forest stand, and its classification performance outperforms that of other mangrove species.

4. Discussion

4.1. Effects of Individual and Fused Feature Scenarios on Classification

In this study, three scenarios involving individual feature analysis and two scenarios employing fused-feature approaches were used as the foundational data for mangrove classification. When an individual UAV hyperspectral band data feature was employed in S1, the classification of mangrove species achieved high accuracy, akin to that achieved through the utilization of vegetation indices in S2. Notably, the accuracy is 0.2% higher than that observed for S1. Vegetation indices, which serve as crucial indicators of surface vegetation conditions, effectively reflect the growth, health, and coverage of mangrove vegetation by quantifying reflectance or radiation data across different spectral bands [[76]]. The performance of the vegetation indices in this study is noteworthy and plays a pivotal role in mangrove classification.

Texture features of an image also play a significant role in the classification process [[21]]. For instance, Jiang et al. [[31]] found that the influence of textural features on mangrove classification diminishes as the image resolution increases. It is essential to acknowledge that the efficacy of texture measures is influenced by the spatial resolution of the image, and inefficient texture measurements can lead to redundant information. Determining the optimal spatial resolution for employing texture measures is crucial [[77]]. In our study, a fixed spatial resolution of 0.5 m × 0.5 m was used, potentially limiting the overall effectiveness of texture-based classification. Interestingly, Jiang et al. [[31]] found that combining texture measures with vegetation indices tended to enhance classification accuracy. Our study corroborates this finding.

It is important to highlight that our analysis revealed that using individual texture features in S3, coupled with an image resolution of 0.5 m × 0.5 m, resulted in a lower OA and kappa coefficient of classification. Additionally, certain mangrove species (AI, CM, AC, and KC) were misclassified in the S3 scenario due to their similar spectral curves. Such misclassifications can be attributed to the challenges of optical remote sensing in accurately distinguishing species with similar spectral characteristics [[48], [58]], especially when non-mangrove species coexist within mangrove habitats [[31], [42]]. This underscores the complexity of employing high-resolution imagery alone for precise mangrove species classification, particularly when considering textural feature data. However, it is worth acknowledging that texture measures are sensitive to the spatial resolution of an image, and exploring the optimal resolution for their application remains a valuable avenue for future research.

The hyperspectral bands, vegetation indices, and texture measures were fused in S4 to further explore the potential of the fused features. The results revealed that fused features not only achieved higher accuracy than individual features but also reduced the likelihood of misclassifying species with similar spectral features. Normalizing the confusion matrix demonstrated that fused features could decrease the misclassification rate between AI and CM as well as improve PA and UA. Xu et al. [[47]] fused hyperspectral features and achieved a mangrove classification accuracy of 96.74%, surpassing individual band features, vegetation features, and texture features by 6.43%, 4.09%, and 4.78%, respectively. In our study, the accuracy of the fused feature S4 improved by 5.13%, 4.93%, and 17.2% compared with the individual features S1, S2, and S3, respectively. The substantial improvement in the texture feature accuracy in S3 could be attributed to the fact that only seven texture measures were considered, providing room for enhancement. In contrast, Xu et al. [[47]] employed 28 texture features with more variables, leading to a high classification accuracy of 91.69%. Remarkably, when LiDAR-derived CHM features were fused with S4, the final classification accuracy was equal to that reported by Xu et al. [[47]], who employed hyperspectral and LiDAR data. This finding substantiates the notion that fused features can compensate for the limitations of individual feature training datasets and enhance the classification accuracy. Upon evaluating the importance of the fused features, 238 features were deemed to contribute positively. Consequently, fused features exhibit substantial advantages over individual features in mangrove classification, offering favorable conditions for further accuracy improvement.

Mangrove forests thrive in coastal regions, with each mangrove species displaying distinct salt tolerance mechanisms and growth zones that are influenced by tidal variations [[78]]. In future classification endeavors, expanding the study area to encompass seawater salinity, soil salinity, and tidal fluctuations will enable the exploration of the effects of feature integration and mangrove species distribution in greater detail.

4.2. Contribution of LiDAR Data in Classification

LiDAR data, as an active remote sensing technique, are progressively gaining recognition for studying mangrove forest structures [[34]]. Each image of the derived CHM encapsulates valuable information regarding the vertical structure of the forest, including the height from the ground to the top of the vegetation at a specific location. These data offer insights into forest biomass, vegetation type, and ecosystem services [[80]]. Although previous studies primarily employed optical data for mangrove classification, the resulting accuracy mostly ranged between 70% and 89% [[24], [26]]. In this study, the integration of LiDAR-derived CHM data was applied in conjunction with UAV hyperspectral optical data to address the limitations of the latter and enhance the accuracy of mangrove forest classification. This integration yielded an impressive classification accuracy of 96.78%.

The significance of 238 variables was determined using the XGBoost algorithm, with the top five variables identified as CHM, TM, NIR937, Var, and MCARI2. CHM emerged as the variable with the highest contribution (3.03%). This finding underscores the pivotal role of LiDAR data in classification and aligns with the outcomes of prior research on mangrove classification [[31], [42]]. Cao et al. [[42]] also corroborated that extracting CHM information from LiDAR data greatly enhances species classification accuracy, thus highlighting the considerable advantages of incorporating LiDAR data into mangrove classification. Within the study area, the spectral features of the AC and KC, as well as the AI and CM, exhibited pairwise similarities. Furthermore, the AC and KC share similar heights. Scenario S5 achieved PA and UA values exceeding 93% by incorporating the CHM data into the classification process. This finding signifies that the inclusion of CHM data significantly amplifies the discrimination capabilities compared with scenarios devoid of CHM data. Non-native SA species were artificially planted along the Maowei Sea coastline in the west and the Kangxi Ling River estuary in the northwest. Its seeds were disseminated to the study area via seawater currents, resulting in a relatively modest number of SA trees. The mean CHM (Figure 4b) of the mangrove sample dataset revealed substantial discrepancies in the mean heights of mangrove species. Notably, SA exhibited the highest mean height (8.56 m), whereas the mean heights of AI and CM were similar, differing by a mean of 0.69 m. In the fused-feature scenario S5, the addition of CHM information notably enhanced the OA for multiple mangrove species. The UA of SA reached 100%, and this enhanced differentiation was visually discernible in the classification results (Figure 8, zoom area 2).

In contrast to conventional remote sensing data sources, the incorporation of LiDAR data enables us to move beyond relying solely on spectral information for species classification [[37]]. LiDAR data provide spatial and structural insights into mangrove forests, significantly augmenting the identification of mangrove species and showcasing tremendous potential across various facets of mangrove research. The features derived from LiDAR-acquired CHM data for SA are influenced by edge height, approximating the height of other tree species at the forest edge. Subsequent studies could incorporate additional data, such as LiDAR-derived DSMs [[32]], canopy cover [[81]], and leaf area index [[82]], to refine the accuracy of edge classification for different tree species.

4.3. Performance of XGBoost Algorithm Applied to Mangrove Classification

The inherent high dimensionality of hyperspectral data introduces the challenge of information redundancy [[83]], leading to an increased risk of misclassifying mangrove species and presenting a formidable hurdle for classification algorithms. Our data acquisition encompassed UAV hyperspectral and UAV-LiDAR point cloud data, yielding a substantial number of pixels at a resolution of 0.5 m × 0.5 m. To mitigate this issue, we employed the XGBoost algorithm with feature filtering. This approach facilitates the elimination of redundant information and prevents overfitting through iterative tree building [[49]]. Consequently, even though each feature layer contains as many as 787,405 pixels and the high-resolution image was merged with 220 bands, 10 vegetation indices, seven texture measures, and LiDAR-derived CHM data, the XGBoost algorithm processed the full input without reducing the pixel count. Fu et al. [[48]] highlighted the merits of the XGBoost algorithm in terms of its high performance, notably surpassing alternative algorithms in their comparison. In our study, XGBoost was similarly employed. Although the comparison is indirect, it outperformed the KNN (OA = 82.09%) and SVM (OA = 88.66%) classification algorithms used by Cao et al. [[32]].

In the realm of larger-scale studies, Zhang et al. [[84]] classified a mangrove forest spanning 797.58 ha, achieving an accuracy of 86.16% and a kappa coefficient of 0.8359 through optimal feature selection. Our study, encompassing a research area of 19.70 ha, achieved even higher classification accuracy (96.78%) and kappa coefficient (0.9596). This distinction may stem from the diverse remote sensing data sources employed in our study, specifically UAV hyperspectral and LiDAR data, with a resolution of 0.5 m × 0.5 m. In contrast, Zhang et al. [[84]] used multispectral and synthetic aperture radar (SAR) data with resolutions ranging from 2 m × 2 m to 8 m × 8 m. This divergence underscores the reliability of the XGBoost algorithm for handling extensive datasets and its proficiency in species classification across smaller areas of mangrove forests.

Gao et al. [[85]] combined WorldView-2 and Zhuhai-1 optical images to compare the XGBoost algorithm against the extreme random forest and successive projection algorithms. The XGBoost algorithm emerged as the superior performer (OA = 88.98%, kappa = 0.846). In our study, employing optical data alone with the XGBoost algorithm also yielded highly accurate results (OA = 88.48%, Kappa = 0.8542), further confirming the stability of XGBoost for mangrove classification. Xu et al. [[47]] fused UAV hyperspectral and LiDAR data to classify mangrove forests using the XGBoost algorithm, achieving a classification accuracy (OA = 96.41%) similar to our findings. These collective results further substantiate that the amalgamation of UAV hyperspectral and UAV LiDAR point cloud data using the XGBoost algorithm is well-suited for accurate mangrove species classification. The algorithm exhibits consistent performance in processing large-scale, high-resolution images, presenting a robust foundation for future investigations into long-term variations in mangrove species.

5. Conclusions

We used UAV hyperspectral and LiDAR point cloud data to enhance the classification of mangrove species within the estuary of the Pinglu Canal, Maowei Sea, Beibu Gulf, Guangxi, China. The XGBoost machine-learning algorithm was leveraged with multiple feature scenarios configured as training parameters. From this analysis, the following conclusions were drawn by evaluating the classification performance across various feature scenarios and incorporating LiDAR-derived CHM data:

  • (1) The integration of UAV-based hyperspectral data with LiDAR-derived CHM data has significant positive implications for classification performance. The inclusion of CHM data effectively mitigates the risk of misclassifying mangrove species with similar spectral attributes. This study achieved the highest overall accuracy of 96.78% and a kappa coefficient of 95.96%.
  • (2) The incorporation of texture measures leads to a reduction in the accuracy of mangrove species classification owing to the presence of redundant information. Enhancing the accuracy of mangrove species classification can be achieved by supplementing textural measures with data features derived from individual bands and vegetation indices.
  • (3) The XGBoost algorithm adeptly assesses the importance of features, producing highly accurate classification outcomes, while maintaining a stable operational framework. Its performance is exceptional, showing significant potential for widespread application in the field of mangrove classification.
Figures and Tables

Graph: Figure 1 The location of Guangxi Autonomous Region, China (a); The location of the estuary of the Pinglu Canal in the Maowei Sea of the Beibu Gulf and the scope of influence of the construction of the Pinglu Canal (b); the image of the study area and the location of the sample points acquired by UAV hyperspectral (R: band 92; G: band 53; B: band 28) (c). Note: SA: Sonneratia apetala Linn.; KC: Kandelia candel Linn.; AI: Acanthus ilicifolius Linn.; AC: Aegiceras corniculatum Linn.; CM: Cyperus malaccensis Lam.; MF: Mudflat.

Graph: Figure 2 3D composite of 220 UAV-hyperspectral bands (a) and UAV-hyperspectral curves of four mangrove species, Cyperus malaccensis, and mudflat extracted from 3193 sample pixels (b).

Graph: Figure 3 Workflow of this study.

Graph: Figure 4 CHM data derived from LiDAR, 3D vertical exaggeration 10% (a); CHM Mean Height (m) derived from LiDAR data of the sample (b).

Graph: Figure 5 Importance of the top 30 features in the XGBoost model, sorted in descending order (%).

Graph: Figure 6 Normalized Confusion Matrix for Mangrove Species Identification using XGBoost Algorithm in Various Feature Scenarios, including S1–S5 scenarios.

Graph: Figure 7 Classification maps of mangrove species using the five feature scenarios S1 to S5 and a UAV image showing the zoomed-in areas for Figure 8.

Graph: Figure 8 Partially Magnified Classification Results showing mangroves growing along the mudflat (Zoom Area 1) and SA along with other mangrove growth areas (Zoom Area 2).

Table 1 Hyperspectral Image Specifications.

Parameter | Value
Capture Date | 15 December 2022
Flight Altitude | 150 m
Spectral Range | 400–1000 nm
Spatial Resolution | 0.1 m × 0.1 m
Bands | 220
Flight Speed | 8 m/s

Table 2 Distribution of Sampling Stations and Sample Types in the Study Area.

Sample Type | SA | AI | CM | AC | KC | MF
Number of stations | 3 | 6 | 8 | 11 | 20 | 8
Pixel points | 39 | 179 | 349 | 1683 | 322 | 621
UAV aerial photography | (sample photographs not reproduced)
UAV hyperspectral imagery | (sample image chips not reproduced)

Table 3 Vegetation Indices Extracted from Hyperspectral Data.

Vegetation Index | Formula | Reference
Blue Green Pigment Index 2 (BGI2) | $b_{200}/b_{165}$ | [59]
Normalized Difference Vegetation Index (NDVI) | $(b_{75}-b_{121})/(b_{75}+b_{121})$ | [32]
Renormalized Difference Vegetation Index (RDVI) | $(b_{75}-b_{121})/\sqrt{b_{75}+b_{121}}$ | [59]
Transformed Chlorophyll Absorption in Reflectance Index (TCARI) | $3\left[(b_{110}-b_{121})-0.2\,(b_{110}-b_{165})\,(b_{110}/b_{121})\right]$ | [60]
Optimized Soil Adjusted Vegetation Index (OSAVI) | $(1+0.16)\,(b_{75}-b_{121})/(b_{75}+b_{121}+0.16)$ | [46]
Modified Chlorophyll Absorption Ratio Index 1 (MCARI1) | $1.2\left[2.5\,(b_{75}-b_{121})-1.3\,(b_{75}-b_{165})\right]$ | [57]
Modified Chlorophyll Absorption Ratio Index 2 (MCARI2) | $\dfrac{1.5\left[2.5\,(b_{75}-b_{121})-1.3\,(b_{75}-b_{165})\right]}{\sqrt{(2b_{75}+1)^{2}-(6b_{75}-5\sqrt{b_{121}})-0.5}}$ | [57]
Photochemical Reflectance Index (PRI) | $(b_{168}-b_{172})/(b_{168}+b_{172})$ | [61]
Simple Ratio Index (SR) | $b_{75}/b_{121}$ | [62]
Clumping Index (CI) | $b_{55}/b_{99}-1$ | [63]

Note: $b_{k}$ denotes the reflectance of hyperspectral band k.

Table 4 Texture Measures Generated from the First Principal Component of PCA.

Texture Measure | Formula | Reference
Textural Mean (TM) | $\mu_i=\sum_{i,j} i\,P_{i,j}$ | [15]
Homogeneity (HOM) | $\mathrm{HOM}=\sum_{i,j}\dfrac{P_{i,j}}{1+(i-j)^{2}}$ |
Dissimilarity (Dis) | $\mathrm{Dis}=\sum_{i,j}P_{i,j}\,|i-j|$ |
Correlation (Cor) | $\mathrm{Cor}=\sum_{i,j}\dfrac{ij\,P_{i,j}-\mu_i\mu_j}{\sigma_i\sigma_j}$ |
Variance (Var) | $\sigma_i^{2}=\sum_{i,j} i^{2}P_{i,j}-\mu_i^{2}$ |
Entropy (Ent) | $\mathrm{Ent}=-\sum_{i,j}P_{i,j}\ln P_{i,j}$ |
Contrast (Con) | $\mathrm{Con}=\sum_{i,j}(i-j)^{2}P_{i,j}$ |

Note: $P_{i,j}$ is the normalized gray-level co-occurrence probability of gray levels i and j.

Table 5 Confusion Matrix for Mangrove Classification in Various Feature Scenarios.

Class | S1 PA (%) | S1 UA (%) | S2 PA (%) | S2 UA (%) | S3 PA (%) | S3 UA (%) | S4 PA (%) | S4 UA (%) | S5 PA (%) | S5 UA (%)
SA | 74.75 | 97.37 | 65.66 | 98.48 | 71.21 | 99.30 | 74.75 | 100 | 88.38 | 100
AI | 53.25 | 91.11 | 83.77 | 93.48 | 24.03 | 38.54 | 94.81 | 100 | 94.16 | 100
CM | 96.23 | 80.00 | 91.98 | 88.64 | 66.04 | 48.28 | 97.64 | 91.19 | 94.81 | 93.06
AC | 97.53 | 67.81 | 97.53 | 68.26 | 96.02 | 62.09 | 99.05 | 75.87 | 99.81 | 97.23
KC | 56.30 | 92.92 | 52.01 | 83.62 | 24.13 | 75.00 | 61.93 | 86.19 | 98.12 | 93.61
MF | 95.23 | 98.42 | 95.23 | 97.27 | 95.99 | 95.81 | 96.37 | 98.83 | 97.52 | 98.27
OA (%) | S1: 83.35 | S2: 83.55 | S3: 71.28 | S4: 88.48 | S5: 96.78
Kappa (%) | S1: 78.81 | S2: 79.07 | S3: 63.36 | S4: 85.42 | S5: 95.96

Author Contributions

Conceptualization, Y.T. and J.O.; methodology, Y.T.; software, Q.Z. and X.X.; validation, Y.T., J.O., Y.Z. and J.L.; formal analysis, Y.T.; investigation, Y.T., Q.Z., J.O., X.X. and J.T.; resources, Y.T.; data curation, Y.Z. and J.O.; writing—original draft preparation, J.O.; writing—review and editing, Y.T.; visualization, J.O.; supervision, Y.T.; project administration, Y.T. All authors have read and agreed to the published version of the manuscript.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Footnotes

1. Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

By Jinhai Ou; Yichao Tian; Qiang Zhang; Xiaokui Xie; Yali Zhang; Jin Tao and Junliang Lin

Title: Coupling UAV Hyperspectral and LiDAR Data for Mangrove Classification Using XGBoost in China’s Pinglu Canal Estuary
Authors/Contributors: Ou, Jinhai; Tian, Yichao; Zhang, Qiang; Xie, Xiaokui; Zhang, Yali; Tao, Jin; Lin, Junliang
Journal: Forests, Vol. 14 (2023-09-01), Issue 9, p. 1838
Publisher: MDPI AG, 2023
Media type: academicJournal
ISSN: 1999-4907 (print)
DOI: 10.3390/f14091838
Keywords:
  • UAV hyperspectral
  • UAV-LiDAR
  • XGBoost
  • mangrove species classification
  • Maowei Sea
  • Plant ecology
  • QK900-989
Other:
  • Indexed in: Directory of Open Access Journals
  • Collection: LCC:Plant ecology
  • Document Type: article
  • File Description: electronic resource
  • Language: English
