Automatic geospatial content summarization and visibility enhancement by dehazing in aerial imagery
Abstract
[ACCESS RESTRICTED TO THE UNIVERSITY OF MISSOURI AT REQUEST OF AUTHOR.] The objective of this work is to develop a methodology for summarizing the content of a long sequential geospatial video with a few geospatial coverage maps, or mini-mosaics, and for improving the visibility of the generated mini-mosaics by haze removal. The mini-mosaics provide an overview of the areas visited over many hours by an aerial imaging vehicle such as a drone. This helps analysts obtain a quick summary of the data without reviewing the full video manually by fast-forwarding, rewinding, or skimming, which is time consuming and error prone. Automatic mosaic and coverage map generation is a challenging computer vision problem, especially in scenes with significant structure and motion-induced parallax. In the traditional approach, mosaic generation can be very time consuming because its cost is quadratic in the number of images; moreover, frame-to-frame matching suffers from error accumulation, or drift. In contrast, we develop a linear-time approach that reduces the consecutive single-frame matching problem to consecutive reference-frame mapping. This approach is purely image based and uses no additional metadata. A common visual problem in aerial data is the presence of haze. We develop a dehazing method, Faster Dark Channel Prior (FDCP), that leverages the spatial smoothness of the medium transmission: it downsamples the original image, estimates the transmission from the downsampled image, and applies that transmission to recover the haze-free radiance from the original image. FDCP is applied to improve the visibility of the obtained mini-mosaics by removing haze either from the individual images or from the coverage maps themselves. Experimental results demonstrate the efficiency and precision of the proposed methods for automatic mosaicking and dehazing of aerial data.
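The FDCP idea described in the abstract, estimating the transmission map on a downsampled copy and reusing it at full resolution, can be sketched roughly as follows. This is a minimal illustration of the general dark-channel-prior pipeline, not the thesis implementation; the function names, parameter values (`scale`, `omega`, `t0`), the brightest-pixel estimate of atmospheric light, and the nearest-neighbour upsampling are all illustrative assumptions.

```python
import numpy as np

def dark_channel(img, patch=7):
    """Per-pixel minimum over RGB, then a min filter over a patch x patch window."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode="edge")
    out = np.empty_like(mins)
    for i in range(mins.shape[0]):
        for j in range(mins.shape[1]):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def fdcp_dehaze(img, scale=4, patch=7, omega=0.95, t0=0.1):
    """Sketch of FDCP: estimate transmission on a downsampled copy of `img`
    (floats in [0, 1], shape HxWx3), upsample it, and recover radiance from
    the full-resolution image."""
    small = img[::scale, ::scale]               # cheap strided downsample
    dark_small = dark_channel(small, patch)
    # Atmospheric light A: mean colour of the brightest ~0.1% dark-channel pixels.
    flat = dark_small.ravel()
    n = max(1, flat.size // 1000)
    idx = np.argpartition(flat, -n)[-n:]
    A = small.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission from the dark channel prior, computed on the small image only.
    t_small = 1.0 - omega * dark_channel(small / A, patch)
    # Nearest-neighbour upsample back to full resolution, then crop.
    t = np.repeat(np.repeat(t_small, scale, axis=0), scale, axis=1)
    t = t[:img.shape[0], :img.shape[1]]
    t = np.maximum(t, t0)[..., None]            # avoid division by tiny t
    J = (img - A) / t + A                       # invert the haze imaging model
    return np.clip(J, 0.0, 1.0)
```

Because the dark channel and transmission are computed on an image `scale * scale` times smaller, the expensive min-filtering step shrinks accordingly, which is the source of the speed-up the abstract claims; the prior's assumption that transmission varies slowly across a patch is what justifies reusing the coarse map at full resolution.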
Degree
M.S.
Thesis Department
Rights
Access is limited to the campuses of the University of Missouri.