Correlation detection is an essential ingredient in noise radar. Such detection is achieved via coherent signal processing, which arguably provides the best enhancement in the signal-to-noise ratio. Over the years, considerable research and progress has been made on the use of noise radar systems as a means of effective through-wall detection. Information about a particular target's range and/or velocity is often acquired by comparing and analyzing the transmitted and received waveforms. One widely used technique for measuring the degree of similarity between two signals is correlation. This method determines the extent to which two waveforms match by multiplying one signal by a time-lagged version of the other and summing the products over a range of lags. Correlation is well suited to radar signals, since the signal reflected from a target arrives at the receiving antenna delayed by the round-trip propagation time. Transmission and reflection impairments distort the propagating signals and degrade the correlation. Thus, it is essential to study the effects that such degradations have on the signals used in the correlation process. This paper presents some concepts of a noise radar system, simulation studies, and an analysis of the results obtained.
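The delay-estimation idea described above can be illustrated with a minimal simulation sketch. The code below is not the system described in this paper; it assumes a hypothetical Gaussian noise waveform, an illustrative attenuation factor of 0.5, an arbitrary delay of 37 samples, and additive receiver noise, all chosen only for demonstration. The cross-correlation of the transmitted waveform with the received echo peaks at the lag corresponding to the round-trip delay.

```python
import random

def cross_correlation(tx, rx, max_lag):
    """Correlate rx against time-shifted copies of tx.

    Returns a list where index `lag` holds the sum of products
    tx[n] * rx[n + lag], i.e. the unnormalized cross-correlation.
    """
    window = len(tx) - max_lag  # keep the summation window fixed for all lags
    return [sum(tx[n] * rx[n + lag] for n in range(window))
            for lag in range(max_lag + 1)]

random.seed(0)

# Hypothetical transmitted noise waveform (zero-mean Gaussian samples).
tx = [random.gauss(0.0, 1.0) for _ in range(1000)]

# Received echo: delayed by 37 samples, attenuated by 0.5,
# and corrupted by additive receiver noise (illustrative values).
true_delay = 37
rx = [0.0] * true_delay + [0.5 * s + random.gauss(0.0, 0.2) for s in tx]

corr = cross_correlation(tx, rx, max_lag=100)
est_delay = max(range(len(corr)), key=lambda k: corr[k])
print(est_delay)  # the correlation peak recovers the round-trip delay
```

Because the transmitted waveform is itself noise-like, its autocorrelation is sharply peaked at zero lag, so the correlation output exhibits a single dominant peak at the true delay even in the presence of moderate receiver noise; distortion of the echo lowers and broadens that peak, which is the degradation studied in this paper.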