Photos crowdsourced from mobile devices can be used in many applications, such as disaster recovery, to obtain information about a target area. However, such applications often face resource constraints in terms of bandwidth, storage, and processing capability, which limit the number of photos that can be crowdsourced. Thus, it is a challenge to use the limited resources to crowdsource the photos that best cover the target area. In this paper, we leverage various geographical and geometrical information about photos, called metadata, to address this challenge. This metadata includes the camera's location, orientation, field of view, and effective range. Based on metadata, we define photo utility to measure how well a target area is covered by a set of photos. We propose various techniques to analyze such coverage and to calculate photo utility accurately and efficiently. We also study the problem of selecting the photos with the largest utility under a resource budget, and propose an efficient algorithm that achieves a constant approximation ratio. With our design, the crowdsourcing server can select photos based on metadata instead of the actual images, and thus use the limited resources to crowdsource the most useful photos. Both simulation and experimental results demonstrate the effectiveness of our design.
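The abstract's core ideas can be illustrated with a small sketch: model each photo's metadata as a coverage sector (location, orientation, field of view, range), measure utility as the number of discretized target points covered, and select photos greedily by marginal utility per unit cost under a budget. All names and the grid-based utility here are illustrative assumptions, not the paper's actual formulation, and this plain cost-benefit greedy is only a common heuristic baseline; the constant-approximation algorithm the abstract mentions is not detailed in this text.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class PhotoMeta:
    """Hypothetical metadata record: location, orientation, FOV, range."""
    x: float          # camera x-coordinate
    y: float          # camera y-coordinate
    heading: float    # orientation angle, radians
    fov: float        # field-of-view angle, radians
    rng: float        # effective camera range

def covers(m: PhotoMeta, px: float, py: float) -> bool:
    """True if point (px, py) lies inside the photo's coverage sector."""
    dx, dy = px - m.x, py - m.y
    d = math.hypot(dx, dy)
    if d > m.rng:
        return False
    if d == 0:
        return True
    # angular offset between the camera heading and the point direction
    diff = abs((math.atan2(dy, dx) - m.heading + math.pi) % (2 * math.pi) - math.pi)
    return diff <= m.fov / 2

def utility(photos, grid) -> int:
    """Assumed utility: number of grid points covered by at least one photo."""
    return sum(any(covers(m, px, py) for m in photos) for px, py in grid)

def greedy_select(photos, costs, budget, grid):
    """Baseline budgeted selection: repeatedly take the photo with the
    best marginal-utility-per-cost ratio that still fits the budget."""
    chosen, spent = [], 0.0
    remaining = list(photos)
    while remaining:
        base = utility(chosen, grid)
        best, best_ratio = None, 0.0
        for m in remaining:
            c = costs[m]
            if spent + c > budget or c <= 0:
                continue
            gain = utility(chosen + [m], grid) - base
            if gain / c > best_ratio:
                best, best_ratio = m, gain / c
        if best is None:
            break
        chosen.append(best)
        spent += costs[best]
        remaining.remove(best)
    return chosen
```

For example, two photos facing each other along a line of target points can jointly cover all points that neither covers alone, and the greedy rule picks them in order of coverage gained per unit of budget spent.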