We propose a new framework based on optimization on manifolds to approximate the solution of a Lyapunov matrix equation by a low-rank matrix. The method minimizes the approximation error over the Riemannian manifold of symmetric positive semidefinite matrices of fixed rank. We detail how objects from differential geometry, such as the Riemannian gradient and Hessian, can be computed efficiently for this manifold. As the minimization algorithm we use the Riemannian Trust-Region method of [Found. Comput. Math., 7 (2007), pp. 303--330], which is based on a second-order model of the objective function on the manifold. Combined with an efficient preconditioner, this method can find low-rank solutions using very little memory. We illustrate our results with numerical examples.
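The setting can be illustrated with a small sketch (not the paper's method): for a stable coefficient matrix and a low-rank right-hand side, the solution of the Lyapunov equation is typically well approximated by a low-rank symmetric positive semidefinite matrix, which is what the manifold method exploits. The matrix sizes, the rank, and the construction of `A` and `C` below are illustrative assumptions; the dense solve and eigendecomposition stand in for the low-rank target that the Riemannian method would compute without ever forming the full solution.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
n, rank = 100, 5  # illustrative problem size and target rank

# Stable A (eigenvalues in the left half-plane) and low-rank C = b b^T.
A = -np.eye(n) + 0.05 * rng.standard_normal((n, n))
b = rng.standard_normal((n, 1))
C = b @ b.T

# Dense reference solution of A X + X A^T + C = 0.
X = solve_continuous_lyapunov(A, -C)

# Best rank-`rank` approximation of the symmetric X via its
# eigendecomposition; the manifold method seeks this target
# directly on the fixed-rank manifold.
w, V = np.linalg.eigh(X)
idx = np.argsort(np.abs(w))[::-1][:rank]
Y = (V[:, idx] * w[idx]) @ V[:, idx].T

rel_err = np.linalg.norm(X - Y) / np.linalg.norm(X)
print(f"relative error of rank-{rank} approximation: {rel_err:.2e}")
```

The rapid decay of the eigenvalues of `X` for low-rank right-hand sides is what makes a fixed-rank parametrization effective in practice.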