A post-processing technique for Lagrangian artificial neural network approach to hyperspectral image classification

Qian Du, Harold Szu, Hsuan Ren

Research output: Contribution to journal › Conference article › peer-review



The Lagrangian artificial neural network (LANN) has recently been proposed for hyperspectral image classification. It is an unsupervised technique that simultaneously estimates the endmembers and their abundance fractions without any prior information. Because the LANN is completely unsupervised, the number of estimated abundance fraction images (AFIs), which display the spatial distribution of the corresponding endmember materials in the image scene, equals the number of spectral bands. We find that many AFIs are highly correlated and visually similar. To facilitate subsequent data assessment, a two-stage post-processing approach is proposed. First, the number of endmembers ns present in the image scene is estimated with a Neyman-Pearson hypothesis-testing-based eigen-thresholding method. Second, an automatic search algorithm finds the most distinct AFIs using divergence as the criterion, where the threshold is adjusted until the number of selected AFIs equals the ns estimated in the first stage. Experimental results on AVIRIS data show the effectiveness of the proposed post-processing technique for distinct-AFI selection.
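The two-stage procedure described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: the eigen-thresholding step is simplified to a plain eigenvalue cutoff (a stand-in for the Neyman-Pearson test), and the threshold-adjustment search is replaced by an equivalent greedy farthest-point selection of exactly `ns` AFIs under a symmetric KL divergence; all function names, the `noise_floor` parameter, and the synthetic data are assumptions for illustration.

```python
import numpy as np

def estimate_num_endmembers(X, noise_floor=1e-2):
    """Stage 1 (simplified): count eigenvalues of the band correlation
    matrix whose normalized magnitude exceeds a noise floor.  A stand-in
    for the Neyman-Pearson eigen-thresholding test in the abstract.
    X has shape (bands, pixels)."""
    R = np.corrcoef(X)                       # bands x bands correlation matrix
    eigvals = np.linalg.eigvalsh(R)[::-1]    # eigenvalues, descending
    return int(np.sum(eigvals / eigvals.sum() > noise_floor))

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric KL divergence between two AFIs treated as distributions."""
    p = p.ravel() + eps; p = p / p.sum()
    q = q.ravel() + eps; q = q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def select_distinct_afis(afis, ns):
    """Stage 2 (simplified): greedily pick ns mutually distinct AFIs by
    maximizing each new AFI's minimum divergence to those already chosen
    (a greedy variant of the threshold-adjustment search)."""
    n = len(afis)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = symmetric_kl(afis[i], afis[j])
    # seed with one member of the most-divergent pair
    selected = [int(np.unravel_index(D.argmax(), D.shape)[0])]
    while len(selected) < ns:
        remaining = [k for k in range(n) if k not in selected]
        best = max(remaining, key=lambda k: min(D[k, s] for s in selected))
        selected.append(best)
    return sorted(selected)

# Usage on synthetic data (real input would be LANN output AFIs):
rng = np.random.default_rng(0)
X = rng.random((10, 200))                    # 10 bands, 200 pixels
ns = estimate_num_endmembers(X)
afis = [rng.random((8, 8)) for _ in range(10)]
chosen = select_distinct_afis(afis, min(ns, 3))
```

The greedy selection is a common practical substitute for adjusting a divergence threshold until exactly ns images survive: both yield a set of AFIs that are pairwise far apart under the chosen divergence.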

Original language: English
Pages (from-to): 17-24
Number of pages: 8
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 2003
Event: Independent Component Analyses, Wavelets, and Neural Networks - Orlando, FL, United States
Duration: 22 Apr 2003 - 25 Apr 2003


  • Blind source separation
  • Classification
  • Hyperspectral imaging
  • Independent component analysis
  • Lagrangian artificial neural network


