Title:
顾及地理位置特征的近海水深遥感反演方法. (Chinese)
Alternate Title:
Remote sensing inversion method for offshore water depth considering geographical location characteristics. (English)
Source:
Journal of Remote Sensing; Aug 2025, Vol. 29 Issue 8, p2559-2574, 16p
Database:
Complementary Index

Further Information

Bathymetric maps with high spatial resolution can display topographic details and provide data support for maritime navigation, coastline management, and marine resource utilization and development. This study conducted experiments in the sea areas of Weizhou Island in China and Molokai Island in the USA. With the support of Sentinel-2 and Landsat-9 images, a water depth inversion method that incorporates geographic location features as modeling elements was first proposed. An optimal water depth inversion model based on a Back Propagation Neural Network (BPNN) was then constructed. Finally, different remote sensing data were used to test the accuracy of the proposed inversion method in various sea areas. During model selection, the machine learning models consistently outperformed the empirical models on all accuracy metrics, and the BPNN model achieved the highest modeling accuracy among them. The machine learning models were also more stable: their inverted water depth maps better reflected the actual water depth variation in the experimental areas and were smoother. Introducing geographic location features significantly improved the accuracy of water depth inversion. Experimental results show that the inversion accuracy in the Weizhou Island area improved from an R² of 0.7666 to 0.9952, while the RMSE was reduced from 2.5016 m to 0.3578 m. In the validation experiment for the Molokai Island area, R² reached 0.9939 and the RMSE decreased from 3.0165 m to 1.0189 m. Introducing geographic location features also eliminated the influence of some clouds and fog in the remote sensing images, producing more accurate water depth inversion results. The results further showed that modeling with all image bands yields a smoother inverted water depth map; this approach better captures regional bathymetry trends, with fewer outliers and more accurate inversion results. After geographic location features were incorporated, adding vegetation index features did not yield better results and instead slightly decreased the modeling accuracy. Therefore, it is important to analyze the autocorrelation among candidate features and make comprehensive decisions on the modeling factors. In summary, the water depth inversion model constructed in this study has high accuracy, strong reliability, and good portability, and can be effectively used to measure shallow sea depth. [ABSTRACT FROM AUTHOR]

Efficient and accurate acquisition of high-spatial-resolution shallow-water bathymetry can provide data support for maritime shipping and for the survey and protection of marine resources. This paper proposes a water depth inversion method that introduces geographic location features as modeling elements and constructs a water depth inversion model based on a Back Propagation Neural Network (BPNN). Using different remote sensing images such as Sentinel-2 and Landsat 9, the accuracy and applicability of the proposed method were tested in the Weizhou Island sea area of China and the Molokai Island sea area of the USA. The results show that, during model selection, the machine learning models achieved higher modeling accuracy than all the empirical models, with the BPNN model reaching the highest accuracy. Introducing geographic location features substantially improved water depth inversion accuracy: in the Weizhou Island area, R² rose from 0.7666 to 0.9952 and the RMSE fell from 2.5016 m to 0.3578 m; in the Molokai Island area, R² reached 0.9939 and the RMSE fell from 3.0165 m to 1.0189 m. This indicates that the water depth inversion model constructed in this paper has high accuracy, strong reliability, and good portability, and can be effectively used for shallow-water bathymetry. In addition, adding vegetation index features alongside the geographic location features did not produce better results and instead slightly reduced the modeling accuracy, showing that blindly adding modeling elements does not improve model accuracy; the autocorrelation among the features should be analyzed and the modeling factors chosen through comprehensive analysis. [ABSTRACT FROM AUTHOR]
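
The abstracts above outline the modeling idea: per-pixel spectral reflectances and the pixel's geographic location (longitude, latitude) are fed jointly into a backpropagation neural network that regresses water depth, and the fit is evaluated with R² and RMSE. The sketch below is a minimal illustration of that idea, not the authors' implementation; it assumes scikit-learn's MLPRegressor as the BPNN and uses synthetic data in place of Sentinel-2/Landsat-9 reflectances and sounding depths, so the band choice, network size, coordinate ranges, and synthetic depth relationship are all assumptions.

```python
# Minimal sketch (not the authors' code): regress water depth from spectral
# bands plus geographic location with a backpropagation neural network (MLP).
# The synthetic data, band names, and network size are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n = 2000

# Stand-ins for per-pixel reflectances (e.g. blue, green, red, NIR) and coordinates.
bands = rng.uniform(0.02, 0.30, size=(n, 4))
lon_lat = rng.uniform([109.0, 21.0], [109.2, 21.1], size=(n, 2))

# Synthetic "true" depth: a band-ratio signal plus a smooth spatial trend and noise.
depth = (25.0 * bands[:, 0] / (bands[:, 0] + bands[:, 1])
         + 30.0 * (lon_lat[:, 0] - 109.0)
         + rng.normal(0.0, 0.3, n))

# Feature vector with geographic location appended, as the paper proposes.
X = np.hstack([bands, lon_lat])
X_train, X_test, y_train, y_test = train_test_split(X, depth, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)
bpnn = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
bpnn.fit(scaler.transform(X_train), y_train)

pred = bpnn.predict(scaler.transform(X_test))
print(f"R^2  = {r2_score(y_test, pred):.4f}")
print(f"RMSE = {mean_squared_error(y_test, pred) ** 0.5:.4f} m")
```

In an actual workflow the synthetic arrays would be replaced by reflectances sampled at the sounding locations and the measured depths from those soundings; the design choice highlighted by the paper is simply appending longitude and latitude to the spectral feature vector before training.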

Copyright of Journal of Remote Sensing is the property of Editorial Office of Journal of Remote Sensing & Science Publishing Co. and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)