State space models (SSMs) have been widely applied to the analysis and
visualization of large sequential datasets. Sequential Monte Carlo (SMC) is a
popular particle-based method for sampling latent states from intractable
posteriors. However, the performance of SMC depends heavily on the choice of
proposal distribution. Recently, Hamiltonian Monte Carlo (HMC) sampling has
shown success in many practical problems. In this paper, we propose an SMC
augmented by HMC (HSMC) for inference and model learning in nonlinear SSMs,
which removes the need to learn proposals and significantly reduces model
complexity. Owing to the measure-preserving property of HMC, the particles
generated directly by the transition function can approximate the posterior
over the latent states arbitrarily well. To better adapt to the local geometry
of the latent space, the HMC updates are carried out on a Riemannian manifold
defined by a positive-definite metric. In addition, we show that the proposed
HSMC method improves SSMs realized by both Gaussian processes (GPs) and neural
networks (NNs).
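As a rough illustration of the general idea, the following is a minimal sketch (not the paper's exact algorithm) of a bootstrap particle filter augmented with an HMC move after resampling, on a toy 1-D nonlinear SSM. The model, step size, leapfrog count, and all other parameter values are illustrative assumptions:

```python
import numpy as np

# Toy 1-D nonlinear SSM (an illustrative assumption, not the paper's model):
#   x_t = 0.5*x_{t-1} + sin(x_{t-1}) + N(0, Q),   y_t = x_t + N(0, R)
rng = np.random.default_rng(0)
Q, R = 0.5, 0.1  # process / observation noise variances

def f(x):
    """Transition mean."""
    return 0.5 * x + np.sin(x)

def log_target(x, x_prev, y):
    # log p(x_t | x_{t-1}) + log p(y_t | x_t), up to an additive constant
    return -(x - f(x_prev)) ** 2 / (2 * Q) - (y - x) ** 2 / (2 * R)

def grad_log_target(x, x_prev, y):
    return -(x - f(x_prev)) / Q + (y - x) / R

def hmc_move(x, x_prev, y, eps=0.1, n_leap=10):
    """One HMC update per particle (vectorized over particles); the
    Metropolis accept/reject step leaves each particle's target invariant."""
    p0 = rng.standard_normal(x.shape)
    q, p = x.copy(), p0.copy()
    # Leapfrog integration of Hamiltonian dynamics
    p = p + 0.5 * eps * grad_log_target(q, x_prev, y)
    for _ in range(n_leap - 1):
        q = q + eps * p
        p = p + eps * grad_log_target(q, x_prev, y)
    q = q + eps * p
    p = p + 0.5 * eps * grad_log_target(q, x_prev, y)
    log_acc = (log_target(q, x_prev, y) - 0.5 * p ** 2) \
            - (log_target(x, x_prev, y) - 0.5 * p0 ** 2)
    accept = np.log(rng.random(x.shape)) < log_acc
    return np.where(accept, q, x)

def hsmc_filter(ys, n_particles=500):
    """Bootstrap SMC with an HMC move after resampling (resample-move style)."""
    x = rng.standard_normal(n_particles)
    means = []
    for y in ys:
        x_prev = x
        x = f(x_prev) + np.sqrt(Q) * rng.standard_normal(n_particles)  # propagate
        logw = -(y - x) ** 2 / (2 * R)                                 # weight by likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)           # resample
        x, x_prev = x[idx], x_prev[idx]                                # keep pairs aligned
        x = hmc_move(x, x_prev, y)                                     # HMC rejuvenation
        means.append(x.mean())
    return np.array(means)
```

In the paper's method the HMC step plays a more central role, replacing a learned proposal and running on a Riemannian manifold whose metric adapts to the local latent geometry; here, HMC appears only as a rejuvenation kernel applied after resampling, using a flat (Euclidean) metric.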