Using Slowness Principle for Feature Selection: Relevant Feature Analysis

2014-04-25
Celikkanat, Hande
Kalkan, Sinan
We propose a novel relevant feature selection technique which makes use of the slowness principle. The slowness principle holds that physical entities in real life are subject to slow and continuous changes. Therefore, to make sense of the world, the highly erratic and fast-changing signals arriving at our sensors must be processed in order to extract slower, more meaningful, high-level representations of the world. This principle has been successfully utilized in the previous work of Wiskott and Sejnowski to implement a biologically plausible vision architecture that allows for robust object recognition. In this work, we propose that the same principle can be extended to distinguish relevant features in the classification of a high-dimensional space. We compare our initial results with the state-of-the-art ReliefF feature selection method, as well as with a variant of Principal Component Analysis that has been modified for feature selection. To the best of our knowledge, this is the first application of the slowness principle for the sake of relevant feature selection or classification.
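As a rough illustration of how slowness can serve as a relevance score, the sketch below ranks features of a temporally ordered data matrix by the variance of their first-order time differences, normalized by each feature's overall variance, in the spirit of the Slow Feature Analysis objective. This is an assumption-based sketch, not the exact procedure of the paper; the function names and the choice of finite differences are illustrative only.

```python
# Minimal sketch: slowness-based feature scoring (assumed formulation, not the
# authors' exact method). Features whose values change slowly over consecutive
# time steps receive low scores and are treated as more relevant.
import numpy as np

def slowness_scores(X):
    """X: (T, D) array of D features observed over T consecutive time steps.
    Returns a (D,) array; lower values indicate slower features."""
    X = np.asarray(X, dtype=float)
    dX = np.diff(X, axis=0)          # first-order temporal differences
    var = X.var(axis=0) + 1e-12      # guard against constant features
    return dX.var(axis=0) / var      # normalized "Delta" value: slow features score low

def select_slowest(X, k):
    """Indices of the k slowest (presumably most relevant) features."""
    return np.argsort(slowness_scores(X))[:k]

# Hypothetical usage: keep the 10 slowest of 100 features
# X = np.random.randn(500, 100)     # placeholder data, temporally ordered rows
# selected = select_slowest(X, 10)
```

A scoring rule of this kind could then be compared against ReliefF or a PCA-based selector by training the same classifier on each method's selected subset, which is the style of comparison the abstract describes.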

Suggestions

Exploiting result diversification methods for feature selection in learning to rank
Djafari Naini, Kaweh; Altıngövde, İsmail Sengör (2014-01-01)
In this paper, we adapt various greedy result diversification strategies to the problem of feature selection for learning to rank. Our experimental evaluations using several standard datasets reveal that such diversification methods are quite effective in identifying feature subsets in comparison to the baselines from the literature.
Novel Optimization Models to Generalize Deep Metric Learning
Gürbüz, Yeti Ziya; Alatan, Abdullah Aydın; Department of Electrical and Electronics Engineering (2022-8-24)
Deep metric learning (DML) aims to fit a parametric embedding function to data carrying semantic information (e.g., images) so that the l2-distance between embedded samples is low whenever they share similar semantic entities. An embedding function of such behavior is attained by minimizing an empirical expected pairwise loss that penalizes inter-/intra-class proximity violations in the embedding space. Proxy-based methods, which use a learnable embedding vector per class in their loss formulation, are state-of-the-art. We fir...
Comparison of regression techniques via Monte Carlo simulation
Mutan, Oya Can; Ayhan, Hüseyin Öztaş; Department of Statistics (2004)
Ordinary least squares (OLS) is one of the most widely used methods for modelling the functional relationship between variables. However, this estimation procedure relies on several assumptions, and the violation of these assumptions may lead to nonrobust estimates. In this study, the simple linear regression model is investigated for conditions in which the distribution of the error terms is Generalised Logistic. Some robust and nonparametric methods such as modified maximum likelihood (MML), least absolut...
Analysis of stochastic and non-stochastic volatility models.
Özkan, Pelin; Ayhan, Hüseyin Öztaş; Department of Statistics (2004)
Changes in variance or volatility over time can be modeled deterministically by using autoregressive conditional heteroscedastic (ARCH) type models, or stochastically by using stochastic volatility (SV) models. This study compares these two kinds of models, estimated on Turkish/USA exchange rate data. First, a GARCH(1,1) model is fitted to the data by using the package E-views, and then a Bayesian estimation procedure is used for estimating an appropriate SV model with the help of Ox code. In order...
A memetic algorithm for clustering with cluster based feature selection
Şener, İlyas Alper; İyigün, Cem; Department of Operational Research (2022-8)
Clustering is a well-known unsupervised learning method which aims to group similar data points and separate dissimilar ones. Data sets that are subject to clustering are mostly high dimensional, and these dimensions include both relevant and redundant features. Therefore, selecting the relevant features is a significant problem for obtaining successful clusters. In this study, it is considered that the relevant features for each cluster may vary, as each cluster in a data set is grouped by a different set of fe...
Citation Formats
H. Celikkanat and S. Kalkan, “Using Slowness Principle for Feature Selection: Relevant Feature Analysis,” 2014, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/54731.