The relevance vector machine (RVM) is a Bayesian sparse kernel method for regression in statistical learning theory; it avoids several principal limitations of the support vector machine (SVM) and is faster at test time while maintaining comparable generalization error. In this paper, we develop a logical framework for approximating the evidence function associated with the RVM based on Taylor expansion, rather than the traditional technique of "completing the square." Completing the square requires finding the completed-square term through ad hoc manipulation, which in practice complicates the treatment of the evidence function approximation for the RVM. The Taylor-expansion framework proposed here offers advantages over the conventional method: it is easier to apply because of its clear logical structure, and it avoids having to search for the completed-square term deliberately. Using the symmetry of the covariance matrix of a multivariate Gaussian distribution together with elementary algebra, we derive the approximation and maximization of the evidence function associated with the RVM within this new logical framework, and the result agrees with the previous one. Finally, we derive the EM algorithm for the RVM, which also agrees with the previous result, except that we work with the precision matrix rather than the covariance matrix.
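The evidence maximization summarized above can be illustrated with the standard RVM hyperparameter re-estimation loop (Tipping's updates for the weight precisions α and noise precision β); this is a minimal sketch of that well-known procedure, not the paper's Taylor-expansion derivation itself, and all function and variable names are illustrative.

```python
import numpy as np

def rvm_evidence_maximization(Phi, t, n_iter=100):
    """Sketch of RVM evidence (type-II ML) maximization for regression.

    Phi : (N, M) design matrix of basis functions
    t   : (N,) target vector
    Returns posterior mean m, weight precisions alpha, noise precision beta.
    """
    N, M = Phi.shape
    alpha = np.ones(M)   # prior precision for each weight
    beta = 1.0           # noise precision
    for _ in range(n_iter):
        # Posterior over weights: Sigma = (A + beta * Phi^T Phi)^{-1}, m = beta * Sigma Phi^T t
        A = np.diag(alpha)
        Sigma = np.linalg.inv(A + beta * Phi.T @ Phi)
        m = beta * Sigma @ Phi.T @ t
        # gamma_i = 1 - alpha_i * Sigma_ii measures how well weight i is determined by the data
        gamma = 1.0 - alpha * np.diag(Sigma)
        # Evidence-maximizing re-estimates of the hyperparameters
        alpha = gamma / (m ** 2 + 1e-12)
        beta = (N - gamma.sum()) / (np.sum((t - Phi @ m) ** 2) + 1e-12)
    return m, alpha, beta
```

Weights whose α grows without bound are pruned in a full implementation, which is the source of the RVM's sparsity.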