Unsupervised pre-training for fully convolutional neural networks

Date
2016
Publisher
Institute of Electrical and Electronics Engineers
Abstract
Unsupervised pre-training of neural networks has been shown to act as a regularization technique, improving performance and reducing model variance. Recently, fully convolutional networks (FCNs) have shown state-of-the-art results on various semantic segmentation tasks. Unfortunately, no efficient approach is available for FCNs to benefit from unsupervised pre-training. Given the unique property of FCNs to output segmentation maps, we explore a novel variation of unsupervised pre-training specifically designed for FCNs. We extend an existing FCN, called U-net, to facilitate end-to-end unsupervised pre-training and apply it to the ISBI 2012 EM segmentation challenge data set. We perform a battery of significance tests for both equality of means and equality of variances, and show that our results are consistent with previous work on unsupervised pre-training obtained from much smaller networks. We conclude that end-to-end unsupervised pre-training for FCNs adds robustness to random initialization, thus reducing model variance.
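To make the idea concrete, here is a minimal sketch of reconstruction-based unsupervised pre-training for an encoder-decoder FCN, followed by supervised fine-tuning for segmentation. This is not the authors' code: the network shape, hyperparameters, and two-stage recipe are illustrative assumptions consistent with the general pre-training literature, not values taken from the paper.

```python
# Hedged sketch: pre-train an FCN as an autoencoder (no labels), then
# fine-tune it on labeled segmentation data. All sizes are assumptions.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self, out_channels):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_channels, 3, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Stage 1: unsupervised pre-training -- the FCN reconstructs its own
# input, so no segmentation labels are needed.
net = TinyFCN(out_channels=1)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
images = torch.rand(8, 1, 64, 64)  # stand-in for unlabeled EM images
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(images), images)
    loss.backward()
    opt.step()

# Stage 2: supervised fine-tuning -- swap the output layer for a
# two-class segmentation head, reusing the pre-trained weights.
net.decoder[-1] = nn.Conv2d(16, 2, 3, padding=1)
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
labels = torch.randint(0, 2, (8, 64, 64))  # stand-in membrane masks
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(net(images), labels)
    loss.backward()
    opt.step()
```

The abstract also mentions significance tests for equality of means and equality of variances across runs. One standard way to run such tests (again, a sketch, with hypothetical per-run scores rather than the paper's data) is via SciPy:

```python
# Welch's t-test (equality of means) and Levene's test (equality of
# variances) on per-run scores; the arrays below are hypothetical.
from scipy import stats

pretrained = [0.91, 0.92, 0.90, 0.93]
baseline = [0.89, 0.93, 0.88, 0.90]
print(stats.ttest_ind(pretrained, baseline, equal_var=False))
print(stats.levene(pretrained, baseline))
```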
Description
CITATION: Wiehman, S., Kroon, S. & De Villiers, H. 2016. Unsupervised pre-training for fully convolutional neural networks. Pattern Recognition Association of South Africa and Robotics and Mechatronics International Conference (PRASA-RobMech), 30 November-2 December 2016, Stellenbosch, South Africa.
The original publication is available at http://ieeexplore.ieee.org
Keywords
Neural networks (Computer science), Convolutions (Mathematics), Map segmentation -- Semantics
Citation
Wiehman, S., Kroon, S. & De Villiers, H. 2016. Unsupervised pre-training for fully convolutional neural networks. Pattern Recognition Association of South Africa and Robotics and Mechatronics International Conference (PRASA-RobMech), 30 November-2 December 2016, Stellenbosch, South Africa.