We address the estimation of the Integrated Squared Error (ISE) of a predictor $\eta(x)$ of an unknown function $f$ learned from data acquired on a given design ${\mathbf{X}_{n}}$. We consider ISE estimators that are weighted averages of the residuals of the predictor $\eta(x)$ at a set of selected points ${\mathbf{Z}_{m}}$. We show that, under a stochastic model for $f$, minimisation of the mean squared error of these ISE estimators is equivalent to minimisation of a Maximum Mean Discrepancy (MMD) for a non-stationary kernel adapted to the geometry of ${\mathbf{X}_{n}}$. Sequential Bayesian quadrature then yields sequences of nested validation designs that minimise, at each step of the construction, the relevant MMD. The optimal ISE estimate can be written as the integral, over the domain of $f$, of a linear reconstruction (for the assumed model) of the squared interpolator residuals. An extensive set of numerical experiments demonstrates the good performance and robustness of the proposed solution. Moreover, we show that the validation designs obtained are space-filling continuations of ${\mathbf{X}_{n}}$, and that correct weighting of the observed interpolator residuals matters more than the precise configuration ${\mathbf{Z}_{m}}$ of the points at which they are observed.
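To make the idea of sequentially building a validation design by MMD reduction concrete, here is a minimal greedy (kernel-herding-style) sketch. It is not the paper's construction: the Gaussian kernel, the uniform target measure on $[0,1]^2$, the candidate grid, and the Monte Carlo approximation of the kernel potential $P(x) = \int k(x,y)\,dy$ are all illustrative assumptions, and the kernel here is stationary rather than adapted to ${\mathbf{X}_{n}}$.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(A, B, ell=0.2):
    """Gaussian kernel matrix between point sets A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell**2))

# Candidate grid over the (assumed) domain [0,1]^2 and a Monte Carlo
# sample used to approximate the kernel potential P(x) = ∫ k(x, y) dy.
g = np.linspace(0.0, 1.0, 21)
cand = np.array([[x, y] for x in g for y in g])
mc = rng.random((2000, 2))
potential = gauss_kernel(cand, mc).mean(axis=1)

# Greedy herding: each new point maximises the potential minus the
# average kernel similarity to the points already selected, which
# decreases the MMD between the empirical measure on Z and the target.
Z = []
for m in range(10):
    if Z:
        crit = potential - gauss_kernel(cand, np.array(Z)).sum(axis=1) / (m + 1)
    else:
        crit = potential
    Z.append(cand[np.argmax(crit)])

Z = np.array(Z)
print(Z)
```

In this toy setting the selected points spread out over the square, giving a rough picture of the space-filling behaviour the paper establishes for its nested validation designs.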