Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/120659
Type: Thesis
Title: Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression
Author: Williams, Jerome Oskar
Issue Date: 2019
School/Discipline: School of Computer Science
Abstract: Improving efficiency in deep learning models means achieving a more accurate model for a given computational budget, or conversely a faster, leaner model without losing accuracy. To improve efficiency, we can use regularization to improve generalization to the real world, and compression to improve speed. Because regularization restricts the information a model can carry, these two methods are related. First, we present a novel autoencoder architecture as a method of regularization for pedestrian detection. Second, we present a hyperparameter-free, iterative compression method that measures the information content of the model using the Information Bottleneck principle.
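Note: The following is a minimal illustrative sketch of the autoencoder-as-regularizer idea described in the abstract, not the thesis implementation. The network sizes, the classification task, and the reconstruction-loss weight of 0.1 are assumptions made for the example only.

    # Illustrative sketch (assumed PyTorch toy example): an autoencoder branch whose
    # reconstruction loss is added to the main task loss as a regularizer.
    import torch
    import torch.nn as nn

    class RegularizedModel(nn.Module):
        def __init__(self, in_dim=128, hidden=32, n_classes=2):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
            self.decoder = nn.Linear(hidden, in_dim)        # autoencoder branch
            self.classifier = nn.Linear(hidden, n_classes)  # main task branch

        def forward(self, x):
            z = self.encoder(x)
            return self.classifier(z), self.decoder(z)

    model = RegularizedModel()
    x = torch.randn(8, 128)
    y = torch.randint(0, 2, (8,))
    logits, recon = model(x)
    # Total loss = task loss + weighted reconstruction loss (the regularizing term).
    loss = nn.functional.cross_entropy(logits, y) + 0.1 * nn.functional.mse_loss(recon, x)
    loss.backward()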
Advisor: Carneiro, Gustavo
Suter, David
Sasdelli, Michele
Dissertation Note: Thesis (MPhil) -- University of Adelaide, School of Computer Science, 2019
Keywords: Machine learning
neural network
deep learning
computer vision
regularization
compression
information bottleneck
autoencoder
pedestrian detection
region of interest
convolutional
statistics
efficiency
Provenance: This electronic version is made publicly available by the University of Adelaide in accordance with its open access policy for student theses. Copyright in this thesis remains with the author. This thesis may incorporate third party material which has been used by the author pursuant to Fair Dealing exceptions. If you are the owner of any included third party copyright material you wish to be removed from this electronic version, please complete the take down form located at: http://www.adelaide.edu.au/legals
Appears in Collections: Research Theses

Files in This Item:
File: Williams2019_MPhil.pdf
Description: Thesis
Size: 1.98 MB
Format: Adobe PDF

