A system for automating identification of biological echoes in NEXRAD level II radar data


Date

2009


Publisher

Montana State University - Bozeman, College of Engineering

Abstract

Since its inception in the mid-twentieth century, radar ornithology has provided scientists with new tools for studying the behavior of birds, especially with regard to migration. A number of studies have shown that birds can be detected using a wide variety of radar devices. Generally, these studies have focused on small portable radars that typically have a finer resolution than large weather surveillance radars. Recently, however, a number of researchers have presented qualitative evidence suggesting that birds, or at least migration events, can be identified using large broad-scale radars such as the WSR-88D used in the NEXRAD weather surveillance system. This is potentially a boon for ornithologists because NEXRAD data covers a large portion of the country, is constantly being produced, is freely available, and is archived back into the early 1990s. A major obstacle is that identifying birds in NEXRAD data currently requires a trained technician to manually inspect a graphically rendered radar sweep. The immense amount of available data makes manual classification of radar echoes infeasible over any practical span of space or time. In this thesis, a system is presented for automating this process using machine learning techniques. This approach begins with classified training data that has been interpreted by experts or collected from direct observations. The data is preprocessed to ensure quality and to emphasize relevant features. A classifier is then trained using this data, and cross-validation is used to measure performance. The experiments in this thesis compare neural network, naïve Bayes, and k-nearest neighbor classifiers. Empirical evidence is provided showing that this system can achieve classification accuracies of roughly 80 to 90 percent.
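For illustration only, the sketch below shows one way to compare the three classifier families named in the abstract (neural network, naïve Bayes, k-nearest neighbor) using k-fold cross-validation in scikit-learn. The feature matrix and labels are synthetic placeholders standing in for preprocessed radar-echo features; this is not the thesis's actual data or implementation.

# Hedged sketch: compare the three classifiers from the abstract with
# 10-fold cross-validation. X and y below are placeholder data, NOT the
# preprocessed NEXRAD echo features used in the thesis.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))        # placeholder: 8 echo features per sample
y = rng.integers(0, 2, size=500)     # placeholder: 1 = biological echo, 0 = other

classifiers = {
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000),
    "naive Bayes": GaussianNB(),
    "k-nearest neighbor": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")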

