Feature selection and extraction are critical steps in many areas where pattern recognition techniques are applied, and both rest on identifying and maximizing dependency relations. Gebelein's Maximal Correlation (GMC) is the most general such measure of dependence in that it makes no statistical assumptions about the nature of the dependencies. Unfortunately, exploiting this measure in practice is generally infeasible, as explicit formulae for computing it exist in only a few special cases. In this paper, we point out a parallel between GMC and the SINBAD algorithm, originally developed as a model of feature extraction by neurons in the cerebral cortex. We use SINBAD as a robust approximation to GMC to perform feature selection and extraction on a number of artificial and real datasets, and show that SINBAD estimates of GMC compare favorably to well-known feature selection and extraction methods based on mutual information, kernel canonical correlation analysis, and principal component analysis.
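Since explicit formulae for GMC exist only in special cases, one way to build intuition is a numerical sketch. The snippet below is an illustrative approximation (not the paper's SINBAD implementation): it estimates maximal correlation by power iteration on the conditional-expectation operator, with conditional expectations computed as per-bin averages over equal-frequency bins. For a bivariate Gaussian the maximal correlation is known to equal |ρ|, which gives a check on the estimate. The sample size, bin count, and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, k = 200_000, 0.6, 30

# Bivariate Gaussian with correlation rho; its maximal correlation is |rho|.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def to_bins(v, k):
    # Discretize into k equal-frequency bins so conditional
    # expectations reduce to simple per-bin averages.
    edges = np.quantile(v, np.linspace(0, 1, k + 1)[1:-1])
    return np.digitize(v, edges)

xi, yi = to_bins(x, k), to_bins(y, k)

def cond_mean(idx, vals, k):
    # E[vals | bin] for each bin, mapped back onto the samples.
    sums = np.bincount(idx, weights=vals, minlength=k)
    cnts = np.bincount(idx, minlength=k)
    return (sums / np.maximum(cnts, 1))[idx]

def standardize(v):
    return (v - v.mean()) / v.std()

# Power iteration: alternately replace f(X) and g(Y) by conditional
# expectations of each other; the fixed-point correlation is the GMC.
fv = standardize(rng.standard_normal(k)[xi])   # random initial f(X)
for _ in range(30):
    gv = standardize(cond_mean(yi, fv, k))     # g(Y) proportional to E[f(X) | Y]
    fv = standardize(cond_mean(xi, gv, k))     # f(X) proportional to E[g(Y) | X]

gmc_est = float(np.mean(fv * gv))
print(round(gmc_est, 3))   # close to |rho| = 0.6
```

The alternation here is the same fixed-point structure that makes a neural approximation such as SINBAD plausible: each side repeatedly learns to predict the other's output.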