We consider the problem of diffusing cached content in an intermittently connected mobile network, starting from a given initial configuration and ending at a desirable goal state in which every node interested in particular contents holds a copy of its desired contents. The objective is to minimize the time for the diffusion process to terminate at a goal state. Due to bandwidth and storage constraints, whenever two nodes encounter each other, they must decide which content, if any, to transfer to each other. Whereas most prior work on this topic has focused on practically realizable heuristics, we take a more formal approach. Our main contribution is to show that, assuming global state information is available, this problem can be formulated as a stochastic shortest path problem, a special class of Markov decision process (MDP). Using this formulation, we numerically explore some small-scale examples for which we are able to obtain the optimal solution. The results show that the optimal diffusion strategy depends strongly on the underlying encounter graph.
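To make the formulation concrete, the sketch below solves a toy stochastic shortest path problem by value iteration. It is purely illustrative and not the paper's actual model: the states, actions, transition probabilities, and costs are hypothetical stand-ins for network configurations, transfer decisions, and encounter dynamics.

```python
# Illustrative sketch (not the paper's model): value iteration for a small
# stochastic shortest path (SSP) problem, an MDP whose objective is the
# expected cost to reach an absorbing goal state.
# All states, actions, probabilities, and costs here are hypothetical.

# transitions[state][action] = list of (probability, next_state, cost)
transitions = {
    "start": {
        "push": [(0.7, "goal", 1.0), (0.3, "mid", 1.0)],
        "wait": [(1.0, "mid", 2.0)],
    },
    "mid": {
        "push": [(0.9, "goal", 1.0), (0.1, "mid", 1.0)],
    },
    "goal": {},  # absorbing goal state: zero cost-to-go
}

def value_iteration(transitions, goal="goal", tol=1e-9, max_iter=10_000):
    """Return the optimal expected cost-to-go V and a greedy policy."""
    V = {s: 0.0 for s in transitions}
    for _ in range(max_iter):
        delta = 0.0
        for s, actions in transitions.items():
            if s == goal or not actions:
                continue
            # Bellman update: minimize expected immediate cost plus cost-to-go.
            best = min(
                sum(p * (c + V[t]) for p, t, c in outcomes)
                for outcomes in actions.values()
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    policy = {
        s: min(actions,
               key=lambda a: sum(p * (c + V[t]) for p, t, c in actions[a]))
        for s, actions in transitions.items()
        if s != goal and actions
    }
    return V, policy

V, policy = value_iteration(transitions)
# In this toy instance, "push" is optimal in every non-goal state.
```

In the paper's setting, a state would encode which nodes currently hold which contents, an action would be the content-transfer decision at an encounter, and the goal states are the configurations where every interested node has its desired content; the same Bellman recursion applies, but the state space grows combinatorially with the number of nodes and contents, which is why only small-scale examples are solved to optimality.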