Level set estimation (LSE) is the process of using noisy observations of an unknown function to estimate the region(s) where the function values lie above a given threshold. It has a wide range of applications in many scientific and engineering areas, such as spectrum sensing and environmental monitoring. In this paper, we study the optimum LSE of a time-varying random field under a total power constraint. A sensor performs uniform sampling of the random field and sends the samples to a fusion center, which estimates the level set from distorted observations of the samples. Under a total power constraint, a higher sampling rate means less energy per sample, which may degrade the estimation performance, but also a stronger correlation between adjacent samples, which can improve the estimation accuracy. It is thus critical to identify the optimum sampling rate that minimizes the LSE error probability. With the help of a Gaussian process (GP) prior model, we first develop an optimum LSE algorithm based on GP regression. The exact analytical error probability of the LSE algorithm is then derived by accounting for a number of factors, such as the power consumption of both sensing and transmission, the power constraint of the sensor, the sampling rate, and the probability distribution of the random field. To simplify the analysis, we also obtain a closed-form upper bound on the LSE error probability. The optimum sampling rate is identified by using the analytical error probabilities.
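To make the GP-regression step concrete, the following is a minimal sketch of level set estimation from noisy uniform samples. It is not the paper's algorithm: the squared-exponential kernel, the hyperparameter values, the toy sinusoidal field, and the rule of declaring "above threshold" wherever the GP posterior mean exceeds the level are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.5, var=1.0):
    # Squared-exponential covariance between two 1-D sample grids.
    # The kernel choice and hyperparameters here are illustrative assumptions.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_level_set(x_obs, y_obs, x_grid, threshold, noise_var=0.1):
    # GP posterior mean at the grid points, given noisy observations.
    K = rbf_kernel(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    Ks = rbf_kernel(x_grid, x_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    # Estimate the level set as the region where the posterior mean
    # exceeds the threshold (a simple plug-in rule, not the optimum one).
    return mean > threshold, mean

# Toy example: a smooth field observed under additive Gaussian noise.
rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 1.0, 20)
y_obs = np.sin(2 * np.pi * x_obs) + 0.1 * rng.standard_normal(20)
x_grid = np.linspace(0.0, 1.0, 200)
above, mean = gp_level_set(x_obs, y_obs, x_grid, threshold=0.0)
```

In this sketch, increasing the number of samples on a fixed interval strengthens the correlation between adjacent observations, which is the effect the sampling-rate trade-off in the abstract refers to; the energy-per-sample cost of a higher rate is not modeled here.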