Evolutionary support vector machines (ESVMs) are a novel technique that adopts the learning engine of state-of-the-art support vector machines (SVMs) but evolves the coefficients of the decision function by means of evolutionary algorithms (EAs). The new method has accomplished the purpose for which it was initially developed, namely to provide a simpler alternative to the canonical SVM approach for solving the optimization component of training. ESVMs, like SVMs, are naturally suited first of all to classification. However, since the latter have also been extended to handle regression, the scope of this paper is to present the corresponding evolutionary paradigm. In particular, we consider the hybridization with the classical epsilon-support vector regression (epsilon-SVR) introduced by Vapnik and the subsequent evolution of the coefficients of the regression hyperplane. The resulting epsilon-evolutionary support vector regression (epsilon-ESVR) is validated on the Boston housing benchmark problem, and the obtained results demonstrate the promise of ESVMs for regression as well.
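To make the general scheme concrete, the sketch below illustrates the idea under stated assumptions: the coefficients (w, b) of a linear regression hyperplane f(x) = w·x + b are evolved by a simple (mu + lambda) evolution strategy so as to minimize an epsilon-SVR-style objective, i.e. the epsilon-insensitive training error plus a flatness (norm) penalty. The synthetic data, population sizes, mutation scale, and penalty constant C are illustrative assumptions and not the exact configuration or benchmark setup used in the paper; a kernelized variant evolving the coefficients of the decision function in feature space is omitted for brevity.

```python
# Illustrative sketch (not the paper's exact algorithm): evolving the
# coefficients (w, b) of a linear regression hyperplane with a simple
# (mu + lambda) evolution strategy under an epsilon-insensitive loss,
# in the spirit of epsilon-SVR. All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data standing in for a benchmark such as Boston housing.
n_samples, n_features = 200, 5
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
y = X @ true_w + 0.5 + 0.1 * rng.normal(size=n_samples)

EPSILON = 0.1   # width of the epsilon-insensitive tube
C = 1.0         # trade-off between flatness and training error
MU, LAMBDA = 20, 80
GENERATIONS = 300
SIGMA = 0.1     # Gaussian mutation step size

def fitness(individual):
    """epsilon-SVR-style objective: 0.5*||w||^2 + C * sum of epsilon-insensitive errors."""
    w, b = individual[:-1], individual[-1]
    residual = np.abs(X @ w + b - y)
    eps_loss = np.maximum(residual - EPSILON, 0.0).sum()
    return 0.5 * np.dot(w, w) + C * eps_loss

# Initialise a population of candidate coefficient vectors (w, b).
population = rng.normal(scale=1.0, size=(MU, n_features + 1))

for gen in range(GENERATIONS):
    # Create lambda offspring by Gaussian mutation of randomly chosen parents.
    parents = population[rng.integers(0, MU, size=LAMBDA)]
    offspring = parents + SIGMA * rng.normal(size=parents.shape)
    # (mu + lambda) survivor selection: keep the mu best of parents and offspring.
    pool = np.vstack([population, offspring])
    scores = np.array([fitness(ind) for ind in pool])
    population = pool[np.argsort(scores)[:MU]]

best = population[0]
w_best, b_best = best[:-1], best[-1]
pred = X @ w_best + b_best
print("epsilon-insensitive objective:", fitness(best))
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

The evolutionary loop replaces the quadratic-programming solver of canonical epsilon-SVR: only an evaluable objective is required, so the fitness function can be exchanged or augmented without reformulating a constrained optimization problem.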