TY - GEN
T1 - Cross-correlation analysis of noise radar signals propagating through lossy dispersive media
AU - Smith, Sonny
AU - Narayanan, Ram Mohan
PY - 2011
Y1 - 2011
N2 - Correlation detection is an essential ingredient of noise radar. Such detection is achieved via coherent signal processing, which conceivably gives the best enhancement of the signal-to-noise ratio. Over the years, much research and progress have been made on the use of noise radar systems for effective through-wall detection. Information about a particular target's range and/or velocity is often acquired by comparing and analyzing the transmitted and received waveforms. Correlation is one of the most widely used techniques for measuring the degree of similarity between two such signals: it determines the extent to which two waveforms match by multiplying one signal by successively time-lagged versions of the other and summing the products. This property makes correlation well suited to radar signals, since the signal returned from a target reaches the receiving antenna only after a propagation delay. Transmission and reflection impairments distort the propagating signals and degrade the correlation. It is therefore essential to study the effects that such degradations have on the signals used in the correlation process. This paper presents concepts of a noise radar system, simulation studies, and an analysis of the results obtained.
AB - Correlation detection is an essential ingredient of noise radar. Such detection is achieved via coherent signal processing, which conceivably gives the best enhancement of the signal-to-noise ratio. Over the years, much research and progress have been made on the use of noise radar systems for effective through-wall detection. Information about a particular target's range and/or velocity is often acquired by comparing and analyzing the transmitted and received waveforms. Correlation is one of the most widely used techniques for measuring the degree of similarity between two such signals: it determines the extent to which two waveforms match by multiplying one signal by successively time-lagged versions of the other and summing the products. This property makes correlation well suited to radar signals, since the signal returned from a target reaches the receiving antenna only after a propagation delay. Transmission and reflection impairments distort the propagating signals and degrade the correlation. It is therefore essential to study the effects that such degradations have on the signals used in the correlation process. This paper presents concepts of a noise radar system, simulation studies, and an analysis of the results obtained.
UR - http://www.scopus.com/inward/record.url?scp=79960090203&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=79960090203&partnerID=8YFLogxK
U2 - 10.1117/12.887347
DO - 10.1117/12.887347
M3 - Conference contribution
AN - SCOPUS:79960090203
SN - 9780819485953
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Radar Sensor Technology XV
T2 - Radar Sensor Technology XV
Y2 - 25 April 2011 through 27 April 2011
ER -
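The abstract above describes correlation detection: the received echo is compared against time-lagged versions of the transmitted noise waveform, and the lag of the correlation peak gives the round-trip delay. The Python/NumPy sketch below is an illustration of that general idea only, not code from the cited paper; the record length, delay, attenuation factor, and noise level are arbitrary assumptions chosen for the demonstration.

import numpy as np

# Illustrative sketch only; not code from the cited paper.
# A random-noise transmit waveform is cross-correlated with a delayed,
# attenuated copy (the "echo") plus receiver noise; the lag of the
# correlation peak recovers the round-trip delay.

rng = np.random.default_rng(0)

n = 4096                          # samples in the transmit record (assumed)
tx = rng.standard_normal(n)       # Gaussian noise transmit waveform

true_delay = 250                  # round-trip delay in samples (assumed)
attenuation = 0.3                 # transmission/reflection loss factor (assumed)

# Received signal: delayed, attenuated transmit waveform plus receiver noise.
rx = np.zeros(n)
rx[true_delay:] = attenuation * tx[:n - true_delay]
rx += 0.5 * rng.standard_normal(n)

# Cross-correlate: multiply the received signal by time-lagged versions of
# the transmit signal and sum; the peak lag estimates the delay.
xcorr = np.correlate(rx, tx, mode="full")
lags = np.arange(-(n - 1), n)
est_delay = lags[np.argmax(xcorr)]

print(f"true delay = {true_delay} samples, estimated delay = {est_delay} samples")

The degradation studied in the paper could be approximated in such a sketch by replacing the single attenuation factor with a frequency-dependent filter applied to the echo, which would broaden and reduce the correlation peak.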