Gradient-based ICA-CPA
Lieven De Lathauwer, Maarten De Vos
February 1, 2012
Recently, we proposed a combination of Independent Component Analysis and Parallel Factor Analysis, which we called ICA-CPA. The computation was based on an ELSCS optimization framework. An alternative way of minimizing the cost function (18) is gradient descent. In this technical report, we provide the required gradient formulas explicitly:
For the cost function φ in (18) we have
φ(A + ∆A, B + ∆B, C + ∆C) = φ(A, B, C)
    + <∆A^∗, ∇_A φ> + <∆A, ∇_{A^∗} φ>
    + <∆B^∗, ∇_B φ> + <∆B, ∇_{B^∗} φ>
    + <∆C^∗, ∇_C φ> + <∆C, ∇_{C^∗} φ>                                (1)
up to first-order terms, in which ∇_{A^∗}φ, ∇_{B^∗}φ, ∇_{C^∗}φ are the partial complex gradients of φ [1]. Due to the symmetry of φ, we have ∇_A φ = (∇_{A^∗}φ)^∗, ∇_B φ = (∇_{B^∗}φ)^∗ and ∇_C φ = (∇_{C^∗}φ)^∗. Computation of the expansion (1) yields:
∇_{A^∗}φ = A·(B^∗ ⊙ A ⊙ B ⊙ C^∗)^H·(B^∗ ⊙ A ⊙ B ⊙ C^∗)
         + A·(A ⊙ B ⊙ B^∗ ⊙ C)^H·(A ⊙ B ⊙ B^∗ ⊙ C)
         − U^{[1;2,3,4,5]}·(B^∗ ⊙ A ⊙ B ⊙ C^∗)
         − (U^{[3;1,2,4,5]})^∗·(A ⊙ B ⊙ B^∗ ⊙ C)                      (2)

∇_{B^∗}φ = B·(A^∗ ⊙ A ⊙ B ⊙ C^∗)^H·(A^∗ ⊙ A ⊙ B ⊙ C^∗)
         + B·(A ⊙ B ⊙ A^∗ ⊙ C)^H·(A ⊙ B ⊙ A^∗ ⊙ C)
         − U^{[2;1,3,4,5]}·(A^∗ ⊙ A ⊙ B ⊙ C^∗)
         − (U^{[4;1,2,3,5]})^∗·(A ⊙ B ⊙ A^∗ ⊙ C)                      (3)

∇_{C^∗}φ = C·(A^∗ ⊙ B^∗ ⊙ A ⊙ B)^H·(A^∗ ⊙ B^∗ ⊙ A ⊙ B)
         − U^{[5;1,2,3,4]}·(A^∗ ⊙ B^∗ ⊙ A ⊙ B).                       (4)
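The gradient (4) can be checked numerically. The NumPy sketch below is a minimal illustration, assuming (our reading, not spelled out in this report) that φ is the squared Frobenius norm of the residual of the structured decomposition U ≈ [[A, B, A^∗, B^∗, C]]; the `khatri_rao` helper and the unfolding convention are illustrative choices.

```python
import numpy as np

def khatri_rao(mats):
    # Column-wise Kronecker (Khatri-Rao) product; first factor varies slowest.
    R = mats[0].shape[1]
    out = mats[0]
    for M in mats[1:]:
        out = np.einsum('ir,jr->ijr', out, M).reshape(-1, R)
    return out

rng = np.random.default_rng(0)
I, R = 3, 2
A, B, C = [rng.standard_normal((I, R)) + 1j * rng.standard_normal((I, R))
           for _ in range(3)]
# Generic complex 5th-order tensor U (no exact CPD structure needed here).
U = rng.standard_normal((I,) * 5) + 1j * rng.standard_normal((I,) * 5)

# Mode-5 unfolding U^{[5;1,2,3,4]}: mode 5 indexes the rows, modes 1..4 the
# columns (mode 1 varying slowest, matching the Khatri-Rao ordering above).
U5 = np.transpose(U, (4, 0, 1, 2, 3)).reshape(I, -1)

def phi(Cm):
    # Assumed cost: squared Frobenius norm of the residual of
    # U ~ [[A, B, A*, B*, C]], expressed via the mode-5 unfolding.
    N = khatri_rao([A, B, A.conj(), B.conj()])
    return np.linalg.norm(U5 - Cm @ N.T) ** 2

# Gradient (4): nabla_{C*} phi = C (M^H M) - U^{[5;1,2,3,4]} M,
# with M = A* (kr) B* (kr) A (kr) B.
M = khatri_rao([A.conj(), B.conj(), A, B])
G = C @ (M.conj().T @ M) - U5 @ M

# First-order expansion (1): phi(C + dC) - phi(C) ~ 2 Re <dC, nabla_{C*} phi>.
dC = rng.standard_normal((I, R)) + 1j * rng.standard_normal((I, R))
eps = 1e-6
numeric = (phi(C + eps * dC) - phi(C - eps * dC)) / (2 * eps)
analytic = 2 * np.real(np.sum(dC.conj() * G))
rel_err = abs(numeric - analytic) / abs(analytic)
```

Since φ is quadratic in C, the central difference is exact up to rounding, so `rel_err` should be at the level of floating-point noise.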
Expressions (2)–(4) can be efficiently computed as follows, where ∗ denotes the Hadamard (entry-wise) product, ·_n the mode-n product, and U^{[i,j,k,l,m]} the tensor obtained by permuting the modes of U accordingly:
∇_{A^∗}φ = 2A·[(A^H·A) ∗ (B^H·B) ∗ (B^H·B) ∗ (C^H·C)]
         − U ·_2 B^H ·_3 A^T ·_4 B^T ·_5 C^H
         − (U^{[3,2,1,4,5]})^∗ ·_2 B^T ·_3 A^T ·_4 B^H ·_5 C^T

∇_{B^∗}φ = 2B·[(A^H·A) ∗ (A^H·A) ∗ (B^H·B) ∗ (C^H·C)]
         − U^{[2,1,3,4,5]} ·_2 A^H ·_3 A^T ·_4 B^T ·_5 C^H
         − (U^{[4,2,3,1,5]})^∗ ·_2 B^T ·_3 A^H ·_4 A^T ·_5 C^T

∇_{C^∗}φ = C·[(A^H·A) ∗ (A^H·A) ∗ (B^H·B) ∗ (B^H·B)]
         − U^{[5,2,3,4,1]} ·_2 B^H ·_3 A^T ·_4 B^T ·_5 A^H.
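The permuted mode-n products in these efficient expressions amount to contracting four modes of U with the columns of one factor each, all sharing a single column index r; this can be done in one multilinear contraction, without forming the tall Khatri-Rao matrices of (2)–(4). A sketch for the U-term of (4), with an illustrative `khatri_rao` helper and dimensions:

```python
import numpy as np

def khatri_rao(mats):
    # Column-wise Kronecker (Khatri-Rao) product; first factor varies slowest.
    R = mats[0].shape[1]
    out = mats[0]
    for M in mats[1:]:
        out = np.einsum('ir,jr->ijr', out, M).reshape(-1, R)
    return out

rng = np.random.default_rng(1)
I, R = 3, 2
A, B = [rng.standard_normal((I, R)) + 1j * rng.standard_normal((I, R))
        for _ in range(2)]
U = rng.standard_normal((I,) * 5) + 1j * rng.standard_normal((I,) * 5)

# Naive evaluation of the U-term of (4): U^{[5;1,2,3,4]} (A* kr B* kr A kr B),
# which materializes an I^4-by-R Khatri-Rao matrix as intermediate.
U5 = np.transpose(U, (4, 0, 1, 2, 3)).reshape(I, -1)
naive = U5 @ khatri_rao([A.conj(), B.conj(), A, B])

# Efficient evaluation: contract modes 1..4 of U directly with the columns of
# A*, B*, A, B (one shared column index r), i.e. the permuted mode-n products
# above restricted to equal column indices in modes 2-5.
efficient = np.einsum('ijklm,ir,jr,kr,lr->mr', U, A.conj(), B.conj(), A, B)
```

Both evaluations produce the same I-by-R matrix; the contraction avoids the O(I^4 R) intermediate.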
References
[1] A. Hjorungnes and D. Gesbert. Complex-Valued Matrix Differentiation: Techniques and Key Results. IEEE Trans. Signal Process., 55:2740–2746, 2007.