RLS Adaptive Filtering with Sparsity Regularization
Asst. Prof. Ender M. Ekşioğlu
Istanbul Technical University, Electronics and Communications Engineering Department
ISSPA 2010, Malaysia
Main Headings
Introduction
ℓ1-RLS Algorithm
Simulation Results
Conclusions
Introduction
Sparse adaptive filtering, where the impulse response for the system to be identified is sparse, has been a topic of recent interest.
The sparsity prior has applications in acoustic and network echo cancellation.
The proportionate adaptive algorithm is a well-known approach for exploiting sparsity in adaptive filtering.
Recently, novel LMS-type algorithms which incorporate the sparsity of the underlying system into the cost function have been proposed.
The common idea is to add a penalty term in the form of an ℓ1 norm to the cost function of the adaptive algorithm.
Sparsity-based adaptive algorithms have so far been mostly confined to the LMS family.
Recursive least squares (RLS) adaptive filtering is another main branch of adaptive filtering, known for its fast convergence.
In this paper, we propose an RLS adaptive algorithm for the identification of sparse systems.
The algorithm utilizes a modified RLS cost function with an additive ℓ1 norm penalty on the weight vector.
We derive the recursive minimization procedure in a manner analogous to the conventional RLS derivation.
The difference occurs in the weight vector update equation, which gains an additional sparsity-inducing term.
We call this new algorithm the ℓ1-RLS.
Firstly, we give a brief outline of the adaptive system identification setting.
Then, we develop the novel ℓ1-RLS algorithm by outlining the recursive minimization of the regularized cost function.
We give the final form of the ℓ1-RLS algorithm.
We present simulation results comparing the novel algorithm with conventional RLS, LMS and ZA-LMS.
ℓ1-RLS Algorithm
Consider the system identification setting given by the observation model y(n) = hᵀx(n) + v(n), where h is the N-tap impulse response of the unknown system, x(n) is the input regressor vector and v(n) is observation noise.
The aim of the adaptive system identification algorithm is to estimate the unknown weight vector h from the input and output observations.
In conventional RLS, the cost function to be minimized by the weight estimate h(n) at time n is the exponentially weighted squared error Σ_{k=0}^{n} λ^{n−k} |ξ(k)|², where ξ(k) = y(k) − hᵀ(n)x(k) and 0 < λ ≤ 1 is the forgetting factor.
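To make the setting concrete, below is a minimal NumPy sketch of a sparse system identification scenario. The filter length, sparsity level and SNR are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16          # filter length (assumed for illustration)
n_iter = 1000   # number of time steps
snr_db = 30     # observation SNR in dB (assumed)

# Sparse unknown system: a few nonzero taps at random positions.
h_true = np.zeros(N)
support = rng.choice(N, size=3, replace=False)
h_true[support] = rng.standard_normal(3)

# White Gaussian input; regressor rows x(n) hold the N most recent samples.
x = rng.standard_normal(n_iter + N - 1)
X = np.array([x[n:n + N][::-1] for n in range(n_iter)])

# Noisy observations y(n) = h^T x(n) + v(n).
noise_var = np.sum(h_true**2) / 10**(snr_db / 10)
y = X @ h_true + np.sqrt(noise_var) * rng.standard_normal(n_iter)
```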
We assume that the underlying filter coefficient vector h has a sparse structure, that is, only a few of its coefficients are significantly different from zero.
Hence, we want to modify the cost function in a manner that favors sparse weight estimates.
A tractable way to force sparsity is by using the ℓ1 norm of the weight vector, the closest convex surrogate for the ℓ0 quasi-norm.
Hence, we regularize the RLS cost function by including the ℓ1 norm of the weight vector as a penalty term.
The regularized cost function is
J(n) = Σ_{k=0}^{n} λ^{n−k} |ξ(k)|² + γ ‖h(n)‖₁.    (3)
Here, γ > 0 is a parameter that governs the tradeoff between the estimation error and the sparsity of the solution.
‖h(n)‖₁ is the ℓ1 norm of the weight vector and is given by ‖h(n)‖₁ = Σ_{i=1}^{N} |hᵢ(n)|.
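Continuing the NumPy sketch above, a small hypothetical helper that evaluates J(n) for a candidate weight vector; it is a direct transcription of (3):

```python
def l1_rls_cost(h, X, y, lam, gamma, n):
    """Exponentially weighted squared error up to time n plus gamma*||h||_1."""
    errors = y[:n + 1] - X[:n + 1] @ h           # xi(k) = y(k) - h^T x(k)
    weights = lam ** (n - np.arange(n + 1))      # lambda^(n-k), k = 0, ..., n
    return np.sum(weights * errors**2) + gamma * np.sum(np.abs(h))
```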
We want to minimize this regularized cost function J(n) with respect to the weight vector h(n).
In the standard RLS case, when the cost function is simply the exponentially weighted squared error, minimization proceeds by setting the gradient with respect to h(n) equal to zero.
However, the ℓ1 norm term ‖h(n)‖₁ in J(n) in (3) is not differentiable at points where any coefficient equals zero.
A substitute for the gradient in the case of nondifferentiable convex functions is the subgradient.
One subgradient vector of the penalized cost function J(n) is obtained by combining the gradient of the squared-error term with a subgradient of the ℓ1 penalty.
The ith element of this vector is calculated as
∇ᵢˢJ(n) = −2 Σ_{k=0}^{n} λ^{n−k} ξ(k) xᵢ(k) + γ sgn(hᵢ(n)),
where sgn(·) denotes the sign function with sgn(0) = 0.
We set the subgradient equal to zero to find the optimal least squares solution at time n.
Written for all i = 1, . . . , N together in matrix form, this results in the modified normal equations
Φ(n) h(n) = r(n) − (γ/2) sgn(h(n)).
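As an illustration only (not the recursive procedure developed next), the modified normal equations can be solved in batch form by iterating on the sign vector. This fixed-point heuristic is an assumption of the sketch, not a method from the paper:

```python
def solve_modified_normal_eqs(X, y, lam, gamma, n, n_iters=50):
    """Batch solve Phi(n) h = r(n) - (gamma/2) sgn(h) by fixed-point
    iteration on sgn(h) (illustrative heuristic)."""
    weights = lam ** (n - np.arange(n + 1))
    Phi = (X[:n + 1] * weights[:, None]).T @ X[:n + 1]  # weighted autocorrelation
    r = (X[:n + 1] * weights[:, None]).T @ y[:n + 1]    # weighted cross-correlation
    h = np.linalg.solve(Phi, r)                         # plain least squares start
    for _ in range(n_iters):
        h = np.linalg.solve(Phi, r - 0.5 * gamma * np.sign(h))
    return h
```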
Here, Φ(n) is the exponentially weighted deterministic autocorrelation matrix of the input, Φ(n) = Σ_{k=0}^{n} λ^{n−k} x(k)xᵀ(k).
r(n) is the deterministic cross-correlation estimate between the input vector and the desired output, r(n) = Σ_{k=0}^{n} λ^{n−k} y(k)x(k).
These two quantities can be updated by rank-one recursive equations:
Φ(n) = λΦ(n−1) + x(n)xᵀ(n),    r(n) = λ r(n−1) + y(n)x(n).
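A minimal sketch of these rank-one recursions, continuing the NumPy snippets above (real-valued signals assumed, so x∗(n) = x(n)):

```python
def update_correlations(Phi, r, x_n, y_n, lam):
    """One step: Phi(n) = lam*Phi(n-1) + x(n)x(n)^T, r(n) = lam*r(n-1) + y(n)x(n)."""
    Phi = lam * Phi + np.outer(x_n, x_n)
    r = lam * r + y_n * x_n
    return Phi, r
```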
Instead of solving the normal equations for the optimal least squares solution from scratch at each time step, we seek a recursive update as in conventional RLS.
We assume that the signs of the weight values do not change appreciably between consecutive iterations, so that sgn(h(n)) ≈ sgn(h(n−1)).
The normal equation can then be rewritten as
Φ(n) h(n) = r(n) − (γ/2) sgn(h(n−1)).
We come up with the following result:
h(n) = Φ⁻¹(n) [r(n) − (γ/2) sgn(h(n−1))].
Using the matrix inversion lemma, it can be shown that the inverse P(n) = Φ⁻¹(n) obeys the same rank-one recursion as in conventional RLS.
The recursive update for the tap weight vector assumes its final form:
h(n) = h(n−1) + k(n) ξ(n) + (γ/2)(λ − 1) P(n) sgn(h(n−1)).
This update equation finalizes the ℓ1-RLS algorithm.
The ℓ1-RLS algorithm:
inputs: λ, γ, x(n), y(n)
initial values: h(−1) = 0, P(−1) = δ⁻¹I
for n := 0, 1, 2, . . .
    kλ(n) = P(n−1) x∗(n)
    k(n) = kλ(n) / (λ + xᵀ(n) kλ(n))
    ξ(n) = y(n) − hᵀ(n−1) x(n)
    P(n) = (1/λ) [P(n−1) − k(n) xᵀ(n) P(n−1)]
    h(n) = h(n−1) + k(n) ξ(n) + (γ/2)(λ − 1) P(n) sgn(h(n−1))
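Putting the listing together, here is a compact, runnable Python version of ℓ1-RLS for real-valued signals (so x∗(n) = x(n)); the default λ, γ and δ values are illustrative assumptions:

```python
def l1_rls(X, y, lam=0.999, gamma=0.01, delta=0.01):
    """l1-RLS for real signals. X: (T, N) regressor rows x(n); y: (T,)
    desired outputs. Returns the (T, N) trajectory of weight estimates."""
    T, N = X.shape
    h = np.zeros(N)              # h(-1) = 0
    P = np.eye(N) / delta        # P(-1) = delta^{-1} I
    H = np.zeros((T, N))
    for n in range(T):
        x_n = X[n]
        k_lam = P @ x_n                       # k_lambda(n) = P(n-1) x(n)
        k = k_lam / (lam + x_n @ k_lam)       # gain vector k(n)
        xi = y[n] - h @ x_n                   # a priori error xi(n)
        P = (P - np.outer(k, x_n) @ P) / lam  # P(n) via the matrix inversion lemma
        # Conventional RLS update plus the zero-attracting l1 term:
        h = h + k * xi + 0.5 * gamma * (lam - 1.0) * (P @ np.sign(h))
        H[n] = h
    return H
```

The per-iteration cost stays on the order of conventional RLS; the sparsity term adds one matrix-vector product and a sign evaluation per step.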
When we compare the ℓ1-RLS weight update with the conventional RLS update, the only difference is the additional term (γ/2)(λ − 1) P(n) sgn(h(n−1)), which, since λ < 1, attracts the tap weights toward zero.
Simulation Results
We compare the performance of the novel ℓ1-RLS algorithm with conventional RLS, LMS and the sparsity-aware ZA-LMS.
The first experiment considers the tracking capabilities of the algorithms for a sparse system.
[Figure 1: Learning curves (MSD versus iteration) for ℓ1-RLS, RLS, ZA-LMS and LMS.]
ℓ1-RLS presents convergence and steady-state error performance superior to the competing algorithms.
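With the synthetic setup and the l1_rls sketch above, a comparable (hypothetical) experiment can be scripted as follows; since γ = 0 reduces ℓ1-RLS exactly to conventional RLS, both curves come from the same routine:

```python
H_l1 = l1_rls(X, y, lam=0.999, gamma=0.01)
H_rls = l1_rls(X, y, lam=0.999, gamma=0.0)   # gamma = 0: standard RLS

# Mean square deviation (MSD) learning curves, as in Figure 1.
msd_l1 = np.mean((H_l1 - h_true)**2, axis=1)
msd_rls = np.mean((H_rls - h_true)**2, axis=1)
```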
In the second experiment we compare the performance of ℓ1-RLS and conventional RLS at different noise levels.
[Figure 2: Learning curves (MSE versus iteration) for ℓ1-RLS and RLS for SNR = 40, 30, 20 and 10 dB.]
The ℓ1-RLS has better convergence and steady-state performance than RLS at all the SNR values considered.
Conclusions
This paper introduced a new RLS algorithm, namely the ℓ1-RLS, which regularizes the RLS cost function with the ℓ1 norm of the weight vector.
The novel update equations for this algorithm are developed using subgradient analysis of the regularized cost function.
Numerical simulations demonstrate that the algorithm improves on conventional RLS and on sparsity-aware LMS variants for sparse system identification.
Future work might include theoretical analysis of the steady-state performance of the algorithm.