Abstract
Normalized LMS algorithms offer low computational complexity and inexpensive implementations for FIR adaptive filters, but their convergence rate degrades as the eigenvalue ratio (condition number) of the input autocorrelation matrix increases. Recursive least squares methods improve the convergence rate significantly, but at the expense of increased computational complexity. In this paper, we present a class of algorithms, collectively called Projection Methods, which offers flexibility in the tradeoff between computational complexity and convergence rate. These methods are related to the traditional normalized data-reusing algorithms described by Schnaufer and Jenkins. Using conjugate gradient and Tchebyshev methods, we develop algorithms that accelerate the convergence of traditional normalized data-reusing algorithms while maintaining excellent tracking performance.
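To make the tradeoff the abstract describes concrete, the sketch below contrasts a standard NLMS update with an order-P affine projection update, a well-known relative of the normalized data-reusing family. This is an illustrative sketch only, not the paper's conjugate-gradient or Tchebyshev-accelerated algorithms; all function names, parameters, and the system-identification setup are assumptions for demonstration.

```python
import numpy as np

def nlms(x, d, M, mu=0.5, eps=1e-8):
    """Normalized LMS: w <- w + mu * e(n) * u(n) / (||u(n)||^2 + eps).

    One projection per sample; cost O(M), but convergence slows as the
    input autocorrelation matrix becomes ill-conditioned.
    """
    w = np.zeros(M)
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]       # current input vector u(n)
        e = d[n] - w @ u                   # a priori error
        w += mu * e * u / (u @ u + eps)
    return w

def apa(x, d, M, P=4, mu=0.5, eps=1e-6):
    """Order-P affine projection: update along the projection onto the
    span of the P most recent input vectors.

    Cost grows with P (a P x P solve per sample), buying faster
    convergence for correlated inputs; P = 1 reduces to NLMS.
    """
    w = np.zeros(M)
    for n in range(M + P - 2, len(x)):
        # U: M x P matrix whose k-th column is the input vector u(n-k)
        U = np.column_stack(
            [x[n - k - M + 1:n - k + 1][::-1] for k in range(P)])
        dv = np.array([d[n - k] for k in range(P)])
        e = dv - U.T @ w                   # vector of a priori errors
        w += mu * U @ np.linalg.solve(U.T @ U + eps * np.eye(P), e)
    return w

# Hypothetical usage: identify an unknown FIR system h from input x.
rng = np.random.default_rng(0)
M = 8
h = rng.standard_normal(M)                 # unknown system (assumed)
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]             # noiseless desired signal
w_nlms = nlms(x, d, M)                     # both should approach h;
w_apa = apa(x, d, M, P=4)                  # APA at higher per-sample cost
```

The `eps` regularizers guard the normalization and the P x P solve against near-singular input windows; choosing P trades computation against convergence speed, mirroring the flexibility the abstract claims for Projection Methods.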
Original language | English (US)
---|---
Pages | 746-749
Number of pages | 4
State | Published - 1997
Event | Proceedings of the 1997 40th Midwest Symposium on Circuits and Systems. Part 1 (of 2) - Sacramento, CA, USA
Duration | Aug 3 1997 → Aug 6 1997
Other | Proceedings of the 1997 40th Midwest Symposium on Circuits and Systems. Part 1 (of 2) |
---|---|
City | Sacramento, CA, USA |
Period | 8/3/97 → 8/6/97 |
All Science Journal Classification (ASJC) codes
- Electronic, Optical and Magnetic Materials
- Electrical and Electronic Engineering