Jinxing Li, Bob Zhang, Guangming Lu, Hu Ren, David Zhang
Multiview learning methods often achieve improvements over single-view approaches in many applications. Owing to the powerful nonlinear modeling ability and probabilistic perspective of the Gaussian process (GP), several GP-based multiview methods have been presented. However, most of these methods impose a strong assumption on the kernel function (e.g., the radial basis function), which limits their capacity to model real data. To address this issue, in this paper we propose a novel multiview approach that combines multiple kernels with the GP latent variable model. Instead of designing a single deterministic kernel function, multiple kernel functions are combined to automatically adapt to various types of data. To allow latent variables to be obtained simply at the testing stage, a projection from the observed space to the latent space is simultaneously introduced into the proposed method as a back constraint. Additionally, unlike some existing methods that apply classifiers offline, a hinge loss is embedded into the model to jointly learn the classification hyperplane, encouraging latent variables belonging to different classes to be separated. An efficient algorithm based on the gradient descent technique is constructed to optimize our method. Finally, we apply the proposed approach to three real-world datasets, and the results demonstrate the effectiveness and superiority of our model compared with other state-of-the-art methods.
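The abstract's multikernel idea — replacing one fixed kernel with a combination of several base kernels — can be illustrated with a minimal sketch. The function and weight names below are hypothetical and not taken from the paper; in the actual model the combination weights would be learned jointly with the latent variables rather than fixed, as they are here for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Radial basis function (squared-exponential) kernel.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def linear_kernel(X, Y):
    # Linear (dot-product) kernel.
    return X @ Y.T

def multikernel(X, Y, weights=(0.5, 0.5), gamma=1.0):
    # Convex combination of base kernels. A weighted sum of valid
    # kernels is itself a valid (positive semidefinite) kernel, so it
    # can serve directly as a GP covariance function.
    return weights[0] * rbf_kernel(X, Y, gamma) + weights[1] * linear_kernel(X, Y)

# Example: covariance matrix over a small set of latent points.
X = np.random.RandomState(0).randn(5, 3)
K = multikernel(X, X)  # 5x5 symmetric PSD matrix
```

Because each base kernel captures a different notion of similarity (local for RBF, global for linear), adapting the weights lets the model fit heterogeneous data without committing to a single kernel in advance.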