Vehicle detection in aerial surveillance using dynamic Bayesian networks

Hsu Yung Cheng, Chih Chia Weng, Yi Ying Chen

Research output: Contribution to journal › Article › peer-review

160 Scopus citations

Abstract

We present an automatic vehicle detection system for aerial surveillance in this paper. The system departs from the existing frameworks for vehicle detection in aerial surveillance, which are either region based or sliding-window based, and instead uses a pixelwise classification method. The novelty lies in the fact that, although classification is performed pixelwise, relations among neighboring pixels in a region are preserved in the feature extraction process. The features considered include vehicle colors and local features. For vehicle color extraction, we utilize a color transform to separate vehicle colors and nonvehicle colors effectively. For edge detection, we apply moment-preserving thresholding to adjust the thresholds of the Canny edge detector automatically, which increases the adaptability and accuracy of detection across diverse aerial images. Afterward, a dynamic Bayesian network (DBN) is constructed for classification: regional local features are converted into quantitative observations that can be referenced when applying pixelwise classification via the DBN. Experiments were conducted on a wide variety of aerial videos. The results demonstrate the flexibility and good generalization ability of the proposed method on a challenging data set of aerial surveillance images taken at different heights and under different camera angles.
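The moment-preserving thresholding mentioned in the abstract is a classic technique (Tsai, 1985) for picking a threshold so that a two-level image preserves the first three gray-level moments of the input. The sketch below is a minimal pure-Python illustration of how such automatic threshold selection could drive a Canny detector's hysteresis thresholds; it is not the authors' implementation. In particular, applying the method to the raw intensity histogram and using `low = 0.5 * high` are assumptions for illustration, not details taken from the paper.

```python
import math

def moment_preserving_threshold(pixels):
    """Tsai's moment-preserving threshold selection.

    Picks the threshold t so that a two-level image with representative
    gray values z0 (below t) and z1 (above t) preserves the first three
    moments of the input. `pixels` is a flat list of values in [0, 255].
    """
    n = len(pixels)
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    p = [h / n for h in hist]                    # normalized histogram

    # First three gray-level moments (m0 == 1 by construction).
    m1 = sum(i * p[i] for i in range(256))
    m2 = sum(i * i * p[i] for i in range(256))
    m3 = sum(i * i * i * p[i] for i in range(256))

    cd = m2 - m1 * m1                            # m0*m2 - m1^2 (variance)
    c0 = (m1 * m3 - m2 * m2) / cd
    c1 = (m1 * m2 - m3) / cd

    # z0 < z1 are the roots of z^2 + c1*z + c0 = 0: the two
    # representative gray levels of the moment-preserving binary image.
    disc = math.sqrt(c1 * c1 - 4.0 * c0)
    z0 = 0.5 * (-c1 - disc)
    z1 = 0.5 * (-c1 + disc)

    p0 = (z1 - m1) / (z1 - z0)                   # fraction below threshold

    # The threshold is the p0-tile of the cumulative histogram
    # (small epsilon guards against floating-point ties).
    cum = 0.0
    for t in range(256):
        cum += p[t]
        if cum >= p0 - 1e-9:
            return t
    return 255

def canny_thresholds(pixels, low_ratio=0.5):
    """Derive (low, high) Canny hysteresis thresholds from the
    moment-preserving threshold. The low_ratio heuristic is an
    assumption, not a value from the paper."""
    high = moment_preserving_threshold(pixels)
    return int(low_ratio * high), high

# Bimodal toy "image": dark background (~40) and bright vehicles (~200).
img = [40] * 700 + [200] * 300
low, high = canny_thresholds(img)
```

For this two-valued toy input, the recovered representative levels are exactly 40 and 200, so the selected high threshold sits at the boundary of the dark mode. On real aerial imagery the thresholds would adapt to each frame's histogram, which is the adaptability property the abstract attributes to the moment-preserving step.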

Original language: English
Article number: 6054051
Pages (from-to): 2152-2159
Number of pages: 8
Journal: IEEE Transactions on Image Processing
Volume: 21
Issue number: 4
DOIs
State: Published - Apr 2012

Keywords

  • Aerial surveillance
  • dynamic Bayesian networks (DBNs)
  • vehicle detection

