The problem of estimating sparse eigenvectors of a symmetric matrix has attracted significant attention in many applications, especially those involving high-dimensional data sets. While classical eigenvectors can be obtained as the solution of a maximization problem, existing approaches formulate the sparse variant by adding a penalty term to the objective function that encourages a sparse solution. These methods, however, achieve sparsity at the expense of the orthogonality property. In this paper, we develop a new method to estimate dominant sparse eigenvectors without trading off their orthogonality. The resulting problem is highly non-convex and difficult to solve directly, so we apply the minorization-maximization (MM) framework, in which we iteratively maximize a tight lower bound (surrogate function) of the objective function over the Stiefel manifold. The inner maximization problem turns out to be a rectangular Procrustes problem, which admits a closed-form solution. Numerical experiments show that the proposed method matches or outperforms existing algorithms in terms of recovery probability and explained variance.
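The closed-form solution of the rectangular Procrustes step can be sketched as follows. This is a minimal illustration, not the paper's implementation: assuming the inner step has the form "maximize tr(W^T M) subject to W^T W = I_k", its maximizer is W = U V^T, where M = U diag(s) V^T is the thin SVD; the matrix M here is a random stand-in for the surrogate's linear term.

```python
import numpy as np

# Hypothetical sketch: maximize tr(W^T M) over the Stiefel manifold
# {W in R^{n x k} : W^T W = I_k} -- the rectangular Procrustes problem.
rng = np.random.default_rng(0)
n, k = 8, 3
M = rng.standard_normal((n, k))   # stand-in for the surrogate's linear term

U, s, Vt = np.linalg.svd(M, full_matrices=False)  # thin SVD of M
W = U @ Vt                                        # closed-form maximizer

# W has orthonormal columns (it lies on the Stiefel manifold), and the
# optimal objective value equals the sum of the singular values of M.
assert np.allclose(W.T @ W, np.eye(k))
assert np.isclose(np.trace(W.T @ M), s.sum())
```

Because each MM iteration only requires one thin SVD of an n-by-k matrix, the per-iteration cost stays modest even in high dimensions.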