One of the main goals of machine learning is to study the generalization performance of learning algorithms. Most existing results describing the generalization ability of learning algorithms are based on independent and identically distributed (i.i.d.) samples. However, the independence assumption is restrictive both in theory and in real-world applications. In this paper we go beyond this classical framework by establishing bounds on the rate of relative uniform convergence for the Empirical Risk Minimization (ERM) algorithm with uniformly ergodic Markov chain samples. We not only obtain generalization bounds for the ERM algorithm, but also show that the ERM algorithm with uniformly ergodic Markov chain samples is consistent. The established theory underpins the application of ERM-type learning algorithms.
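As a hedged illustration of the setting (not the paper's construction), the sketch below trains an ERM least-squares fit on samples drawn from a uniformly ergodic Markov chain instead of i.i.d. draws. The chain (a Doeblin-type mixture that regenerates with fixed probability, which guarantees uniform ergodicity), the target function, and the polynomial hypothesis class are all hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def markov_chain_samples(n, rho=0.5):
    """Generate x_1..x_n on [0, 1] from a uniformly ergodic chain:
    with probability rho take a small local step from the previous state,
    otherwise draw a fresh uniform sample (a regeneration step that
    yields a Doeblin minorization, hence uniform ergodicity)."""
    x = np.empty(n)
    x[0] = rng.uniform()
    for t in range(1, n):
        if rng.uniform() < rho:
            x[t] = np.clip(x[t - 1] + 0.1 * rng.normal(), 0.0, 1.0)
        else:
            x[t] = rng.uniform()  # independent regeneration
    return x

def erm_polynomial_fit(x, y, degree=5):
    """Empirical Risk Minimization over polynomials of fixed degree
    with squared loss: minimizes (1/n) * sum_i (p(x_i) - y_i)^2."""
    X = np.vander(x, degree + 1)          # columns x^d, ..., x^0
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

target = lambda x: np.sin(2 * np.pi * x)  # "unknown" regression function
x = markov_chain_samples(2000)
y = target(x) + 0.1 * rng.normal(size=x.size)

coef = erm_polynomial_fit(x, y)
grid = np.linspace(0, 1, 200)
risk = np.mean((np.polyval(coef, grid) - target(grid)) ** 2)
print(f"approximate excess risk of the ERM fit: {risk:.5f}")
```

Despite the dependence between successive samples, the chain mixes quickly, so the empirical risk minimizer still recovers the target well; the paper's bounds quantify this kind of behavior for uniformly ergodic chains in general.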
Some mathematical models in geophysics and graphics processing require computing integrals of scattered data on the sphere; cubature formulas therefore play an important role in evaluating such spherical integrals. This paper is devoted to establishing an exact positive cubature formula for spherical basis function networks. The authors give an existence proof of the exact positive cubature formula for spherical basis function networks, and prove that the number of cubature points required by the formula does not exceed the number of scattered data points.
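To make the notion of a positive cubature formula concrete, here is a standard textbook example (not the scattered-data rule constructed in the paper): the six octahedron vertices with equal positive weights summing to the surface area of the unit sphere integrate every spherical polynomial of degree at most 3 exactly.

```python
import numpy as np

# Positive cubature rule on S^2: 6 octahedron vertices, equal weights.
# Positivity of the weights is the property the paper's formula guarantees.
points = np.array([
    [ 1, 0, 0], [-1, 0, 0],
    [ 0, 1, 0], [ 0, -1, 0],
    [ 0, 0, 1], [ 0, 0, -1],
], dtype=float)
weights = np.full(6, 4 * np.pi / 6)  # positive, summing to |S^2| = 4*pi

def cubature(f):
    """Approximate the surface integral of f over S^2 by the weighted sum."""
    return float(np.sum(weights * np.apply_along_axis(f, 1, points)))

# Exact surface integrals for comparison:
#   int_{S^2} 1 dS = 4*pi,  int_{S^2} x^2 dS = 4*pi/3,  int_{S^2} x*y dS = 0.
approx_const = cubature(lambda p: 1.0)
approx_x2 = cubature(lambda p: p[0] ** 2)
approx_xy = cubature(lambda p: p[0] * p[1])
print(approx_const, approx_x2, approx_xy)
```

The rule reproduces all three integrals exactly (odd monomials vanish by symmetry), illustrating "exact positive cubature" at low degree; the paper's contribution is to construct such rules adapted to spherical basis function networks with no more points than the given scattered data.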