UDAMSR Net: An Unsupervised Degradation-Aware Network for Enhancing the Spatial Resolution of Spectral Images for Crop Sensing

Weijie Tang , Ruomei Zhao , Hong Sun , Minzan Li , Lang Qiao , Mingjia Liu , Guohui Liu , Yang Liu , Di Song

Engineering ›› 202601031 · DOI: 10.1016/j.eng.2026.01.031
Research article

Abstract

Low-spatial-resolution (LR) remote sensing data are widely adopted because of their lower cost, but their limited analytical precision constrains their full use in precision agriculture. By contrast, acquiring high-spatial-resolution (HR) data often requires substantial expense. To address this limitation, this study proposes an unsupervised degradation-aware multi-channel super-resolution network (UDAMSR) that enhances LR spectral images without requiring paired HR-LR training data. The main contributions are as follows: ① the original framework is extended with dedicated queue and reconstruction layers to process multispectral and hyperspectral image (HSI) cubes, and a contrastive-learning-based degradation-aware module is integrated to handle unknown real-world degradation; ② a comprehensive evaluation is conducted using image quality metrics, spectral consistency analysis, and performance on crop remote sensing tasks such as chlorophyll content estimation; ③ the generalization capability of the model is assessed using data from three imaging devices, two spatial scales (near-ground and unmanned aerial vehicle (UAV)), and two geographic regions. The results show that the proposed method achieves the best overall performance in the comprehensive evaluation, with a mean peak signal-to-noise ratio ($\bar{PSNR}$) of 32.78, a mean root mean squared error ($\bar{RMSE}$) of 6.93, a mean structural similarity index ($\bar{SSIM}$) of 0.89, and a mean spectral angle mapper ($\bar{SAM}$) of 0.131. The method effectively reduces the loss of chlorophyll detection accuracy caused by reduced spatial resolution. The generalization assessment further shows that the proposed method transfers well across spatial scales, geographic regions, devices, and data types.
These results indicate that UDAMSR provides a robust, efficient, and cost-effective software solution that can compensate for hardware limitations and support high-quality crop phenotyping in diverse application scenarios.
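The paper's exact metric implementations are not given in this excerpt; as a point of reference, the quantities reported above (PSNR, RMSE, SAM) are commonly computed on spectral image cubes as in the following minimal numpy sketch. This is an illustrative assumption, not the authors' code: it assumes an 8-bit dynamic range (peak = 255) for PSNR and reports SAM as the mean per-pixel spectral angle in radians; SSIM is omitted because it requires a windowed implementation (e.g., `skimage.metrics.structural_similarity`).

```python
import numpy as np

def rmse(ref, est):
    """Root mean squared error over all pixels and bands of two (H, W, B) cubes."""
    diff = ref.astype(np.float64) - est.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(ref, est, peak=255.0):
    """Peak signal-to-noise ratio in dB, assuming the given peak intensity."""
    e = rmse(ref, est)
    return float("inf") if e == 0.0 else float(20.0 * np.log10(peak / e))

def sam(ref, est, eps=1e-12):
    """Spectral angle mapper: mean angle (radians) between per-pixel spectra."""
    r = ref.reshape(-1, ref.shape[-1]).astype(np.float64)
    e = est.reshape(-1, est.shape[-1]).astype(np.float64)
    cos = np.sum(r * e, axis=1) / (
        np.linalg.norm(r, axis=1) * np.linalg.norm(e, axis=1) + eps
    )
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))
```

Under these conventions, a reported $\bar{SAM}$ of 0.131 corresponds to an average spectral angle of about 7.5° between super-resolved and reference spectra.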

Keywords

UDAMSR Net / Spatial resolution enhancement / Remote sensing image / Crop phenotyping / Crop sensing / Transfer learning

Cite this article

Weijie Tang, Ruomei Zhao, Hong Sun, Minzan Li, Lang Qiao, Mingjia Liu, Guohui Liu, Yang Liu, Di Song. UDAMSR Net: An Unsupervised Degradation-Aware Network for Enhancing the Spatial Resolution of Spectral Images for Crop Sensing. Engineering 202601031. DOI: 10.1016/j.eng.2026.01.031


