NTRS - NASA Technical Reports Server
Classification improvement by optimal dimensionality reduction when training sets are of small size

A computer simulation was performed to test the conjecture that, when training sets are small, classification in a subspace of the original data space may yield a smaller probability of error than classification in the data space itself. This is because the gain in accuracy of estimating the likelihood functions used for classification in the lower-dimensional space (subspace) offsets the loss of information associated with dimensionality reduction (feature extraction). A number of pseudo-random training and data vectors were generated from two four-dimensional Gaussian classes. A special algorithm was used to create an optimal one-dimensional feature space onto which to project the data. When the training sets are small, classification of the data in the optimal one-dimensional space is found to yield lower error rates than classification in the original four-dimensional space.
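The experiment described in the abstract can be sketched as follows. This is a minimal illustration, not the report's actual code: the class means, covariances, and sample sizes are assumed for demonstration, and Fisher's linear discriminant stands in for the report's unspecified "special algorithm" for constructing the optimal one-dimensional feature space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 4-D Gaussian classes (hypothetical parameters; the report's actual
# class statistics are not given in the abstract).
d = 4
mu0, mu1 = np.zeros(d), np.full(d, 1.0)
cov = np.eye(d)

n_train = 8     # small training set per class, per the paper's premise
n_test = 2000   # large test set for a stable error estimate

def gaussian_ml_classify(x, means, covs):
    """Gaussian maximum-likelihood classification with estimated parameters."""
    scores = []
    for m, c in zip(means, covs):
        diff = x - m
        inv = np.linalg.inv(c)
        _, logdet = np.linalg.slogdet(c)
        # log-likelihood up to a constant: -0.5 * (Mahalanobis + log|C|)
        scores.append(-0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff)
                      - 0.5 * logdet)
    return np.argmax(np.stack(scores), axis=0)

# Pseudo-random training and test vectors from the two classes.
train0 = rng.multivariate_normal(mu0, cov, n_train)
train1 = rng.multivariate_normal(mu1, cov, n_train)
test = np.vstack([rng.multivariate_normal(mu0, cov, n_test),
                  rng.multivariate_normal(mu1, cov, n_test)])
labels = np.repeat([0, 1], n_test)

# Likelihood functions estimated from the small training sets.
m0, m1 = train0.mean(axis=0), train1.mean(axis=0)
c0 = np.cov(train0, rowvar=False)
c1 = np.cov(train1, rowvar=False)

# Classification in the original four-dimensional space.
pred4 = gaussian_ml_classify(test, [m0, m1], [c0, c1])
err4 = np.mean(pred4 != labels)

# Projection onto one dimension: Fisher direction w = S^-1 (m1 - m0),
# an assumed stand-in for the report's optimal 1-D feature extractor.
w = np.linalg.solve(0.5 * (c0 + c1), m1 - m0)
p0, p1, ptest = train0 @ w, train1 @ w, test @ w

# Re-estimate the (now scalar) likelihoods in the 1-D feature space.
pred1 = gaussian_ml_classify(
    ptest[:, None],
    [np.array([p0.mean()]), np.array([p1.mean()])],
    [np.array([[p0.var(ddof=1)]]), np.array([[p1.var(ddof=1)]])])
err1 = np.mean(pred1 != labels)

print(f"4-D error rate: {err4:.3f}")
print(f"1-D error rate: {err1:.3f}")
```

With only a handful of training vectors per class, the four-dimensional covariance estimates are noisy, while the single scalar mean and variance in the projected space are estimated far more accurately, which is the trade-off the report's conjecture rests on.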
Document ID
19760020814
Acquisition Source
Legacy CDMS
Document Type
Contractor Report (CR)
Authors
Starks, S. A.
(Rice Univ. Houston, TX, United States)
Defigueiredo, R. J. P.
(Rice Univ. Houston, TX, United States)
Vanrooy, D. L.
(Rice Univ. Houston, TX, United States)
Date Acquired
September 3, 2013
Publication Date
April 1, 1976
Subject Category
Computer Programming And Software
Report/Patent Number
NASA-CR-147822
ICSA-TR-275-025-025
EE-TR-7603
Accession Number
76N27902
Funding Number(s)
CONTRACT_GRANT: NAS9-12776
CONTRACT_GRANT: AF-AFOSR-2777-75
Distribution Limits
Public
Copyright
Work of the US Gov. Public Use Permitted.