Level set estimation (LSE) is the process of classifying the region(s) where the values of an unknown function exceed a certain threshold. It has a wide range of applications, such as spectrum sensing and environmental monitoring. In this paper, we study the optimal LSE of a linear random field that changes with respect to time. A linear sensor network is used to take discrete samples of the spatially and temporally correlated random field in both the space and time domains, and the sensors operate under a total power constraint. The samples are aggregated at a fusion center (FC), which performs LSE of the random field using the noisy observations of the samples. Under the Gaussian process (GP) framework, we first develop an optimal LSE algorithm that minimizes the LSE error probability. The results are then used to derive the exact LSE error probability with the assistance of frequency domain analysis. The analytical LSE error probability is expressed as an explicit function of a number of system parameters, such as the distance between two adjacent nodes, the sampling period in the time domain, the signal-to-noise ratio (SNR), and the spatial-temporal correlation of the random field. With the analytical results, we can identify the optimum node distance and sampling period that minimize the LSE error probability.
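
To illustrate the general idea of GP-based LSE described above, the following is a minimal sketch (not the paper's exact algorithm): a one-dimensional field is sampled by a linear array of sensors, a GP posterior is formed at the fusion center from the noisy observations, and each point is declared to exceed the threshold when its posterior exceedance probability is above one half, which minimizes the per-point error probability. The threshold, node distance, noise level, and the sinusoidal demo field are all hypothetical placeholders, and scikit-learn's GP regressor stands in for the paper's spatio-temporal model.

```python
# Hedged sketch of GP-based level set estimation on a linear (1-D) field.
# All parameter values and the demo field below are assumptions for illustration.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

THRESHOLD = 1.0        # level to be exceeded (assumed)
NODE_DISTANCE = 0.5    # spacing between adjacent sensor nodes (assumed)
NOISE_STD = 0.3        # observation noise standard deviation (sets the SNR)

# Noisy samples taken by a linear sensor network at one time instant.
x_sensors = np.arange(0.0, 10.0, NODE_DISTANCE).reshape(-1, 1)
field = lambda x: 1.5 * np.sin(0.8 * x).ravel()   # stand-in "unknown" field
y_obs = field(x_sensors) + NOISE_STD * np.random.randn(len(x_sensors))

# GP posterior given the noisy observations (spatial correlation via an RBF kernel).
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=NOISE_STD**2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_sensors, y_obs)

# Dense grid on which the level set is reconstructed at the fusion center.
x_grid = np.linspace(0.0, 10.0, 500).reshape(-1, 1)
mean, std = gp.predict(x_grid, return_std=True)

# Per-point Bayes decision: declare "exceeds threshold" when the posterior
# probability of exceedance is above 1/2, minimizing the pointwise error probability.
p_exceed = 1.0 - norm.cdf((THRESHOLD - mean) / std)
level_set_estimate = p_exceed > 0.5
```

Decreasing NODE_DISTANCE in this sketch tightens the posterior and shrinks the misclassified region near the level-set boundary, which mirrors the trade-off between node spacing, sampling period, and LSE error probability analyzed in the paper.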