relation between determinant and trace


Statement

If $A$ is an $n \times n$ square matrix, then the determinant of its exponential equals the exponential of its trace:

$$\det\big(\exp(A)\big) \;=\; \exp\big(tr(A)\big) \,.$$
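
As a quick numerical sanity check (a minimal sketch, assuming NumPy and SciPy; not part of the statement itself), one can compare both sides on a random matrix:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a random real 4x4 matrix

lhs = np.linalg.det(expm(A))      # det(exp(A))
rhs = np.exp(np.trace(A))         # exp(tr(A))
assert np.isclose(lhs, rhs)       # equal up to floating-point error
```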

More generally, the determinant of $A$ is a polynomial in the traces of the powers of $A$:

For $2 \times 2$-matrices:

$$\det(A) \;=\; \tfrac{1}{2}\left( tr(A)^2 - tr(A^2) \right)$$

For $3 \times 3$-matrices:

$$\det(A) \;=\; \tfrac{1}{6} \left( (tr(A))^3 - 3\, tr(A^2)\, tr(A) + 2\, tr(A^3) \right)$$

For $4 \times 4$-matrices:

$$\det(A) \;=\; \tfrac{1}{24} \left( (tr(A))^4 - 6\, tr(A^2)\,(tr(A))^2 + 3\, (tr(A^2))^2 + 8\, tr(A^3)\, tr(A) - 6\, tr(A^4) \right)$$
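
These low-dimensional formulas are easy to test numerically. The sketch below (assuming NumPy; the helper `tr_pow` is a name chosen here for illustration) compares each expression against `numpy.linalg.det` on random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)

def tr_pow(A, k):
    """tr(A^k), via repeated matrix multiplication."""
    return np.trace(np.linalg.matrix_power(A, k))

A2, A3, A4 = (rng.standard_normal((n, n)) for n in (2, 3, 4))

det2 = (tr_pow(A2, 1)**2 - tr_pow(A2, 2)) / 2
det3 = (tr_pow(A3, 1)**3 - 3*tr_pow(A3, 2)*tr_pow(A3, 1) + 2*tr_pow(A3, 3)) / 6
det4 = (tr_pow(A4, 1)**4 - 6*tr_pow(A4, 2)*tr_pow(A4, 1)**2
        + 3*tr_pow(A4, 2)**2 + 8*tr_pow(A4, 3)*tr_pow(A4, 1) - 6*tr_pow(A4, 4)) / 24

assert np.isclose(det2, np.linalg.det(A2))
assert np.isclose(det3, np.linalg.det(A3))
assert np.isclose(det4, np.linalg.det(A4))
```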

Generally, for $n \times n$-matrices (Kondratyuk-Krivoruchenko 92, appendix B):

$$\det(A) \;=\; \sum_{\substack{ k_1, \cdots, k_n \in \mathbb{N} \\ \sum_{l = 1}^{n} l\, k_l \,=\, n }} \; \prod_{l = 1}^{n} \frac{ (-1)^{k_l + 1} }{ l^{k_l}\, k_l ! } \left( tr(A^l) \right)^{k_l} \qquad (1)$$
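
Formula (1) also lends itself to a direct machine check. The following sketch (an illustrative implementation, assuming NumPy; the name `det_from_traces` is chosen here, not taken from the literature) enumerates the tuples $(k_1, \ldots, k_n)$ with $\sum_l l\, k_l = n$ and compares the result with `numpy.linalg.det`; note that factors with $k_l = 0$ still contribute a sign $(-1)^{0+1} = -1$ to their term.

```python
import numpy as np
from itertools import product as cartesian
from math import factorial

def det_from_traces(A):
    """Evaluate formula (1): det(A) as a polynomial in tr(A^l), l = 1..n."""
    n = A.shape[0]
    # p[l] = tr(A^l) for l = 1..n (index 0 unused)
    p = [None] + [np.trace(np.linalg.matrix_power(A, l)) for l in range(1, n + 1)]
    total = 0.0
    # enumerate all (k_1, ..., k_n) with k_l >= 0 and sum_l l*k_l = n
    for ks in cartesian(*(range(n // l + 1) for l in range(1, n + 1))):
        if sum(l * k for l, k in zip(range(1, n + 1), ks)) != n:
            continue
        term = 1.0
        for l, k in zip(range(1, n + 1), ks):
            # factors with k = 0 still contribute (-1)^(0+1) = -1
            term *= (-1) ** (k + 1) / (l ** k * factorial(k)) * p[l] ** k
        total += term
    return total

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
assert np.isclose(det_from_traces(A), np.linalg.det(A))
```
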
Proof of (1)

It is enough to prove this for semisimple matrices $A$ (matrices that are diagonalizable upon passing to the algebraic closure of the ground field), because this subset of matrices is Zariski-dense (using, for example, the non-vanishing of the discriminant of the characteristic polynomial) and the set of $A$ for which the equation holds is Zariski-closed.

Thus, without loss of generality we may suppose that $A$ is diagonal with $n$ eigenvalues $\lambda_1, \ldots, \lambda_n$ along the diagonal, in which case the statement can be rewritten as follows. Letting $p_k = tr(A^k) = \lambda_1^k + \ldots + \lambda_n^k$, the following identity holds:

$$\prod_{i=1}^n \lambda_i \;=\; \sum_{\substack{ k_1, \cdots, k_n \in \mathbb{N} \\ \sum_{l = 1}^{n} l\, k_l \,=\, n }} \; \prod_{l = 1}^{n} \frac{ (-1)^{k_l + 1} }{ l^{k_l}\, k_l ! } \, p_l^{k_l}$$

This is of course just a polynomial identity, closely related to the Newton identities concerning symmetric polynomials in indeterminates $x_1, \ldots, x_n$. Thus we again let $p_k = x_1^k + \ldots + x_n^k$, and define the elementary symmetric polynomials $\sigma_k = \sigma_k(x_1, \ldots, x_n)$ via the generating function identity

$$\sum_{k \geq 0} \sigma_k t^k \;=\; \prod_{i=1}^n (1 + x_i t) \,.$$

Then we compute

$$\begin{aligned}
\sum_{k \geq 0} \sigma_k t^k
& = \prod_{i=1}^n (1 + x_i t) \\
& = \exp\left( \sum_{i=1}^n \log(1 + x_i t) \right) \\
& = \exp\left( \sum_{i=1}^n \sum_{k \geq 1} (-1)^{k+1} \frac{x_i^k}{k} t^k \right) \\
& = \exp\left( \sum_{k \geq 1} (-1)^{k+1} \frac{p_k}{k} t^k \right)
\end{aligned}$$
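
This formal manipulation can also be verified symbolically for small $n$; the following sketch (a minimal check, assuming SymPy, with $n = 3$ chosen for illustration) truncates the argument of the exponential at order $t^n$, which does not affect the coefficients of $t^0, \ldots, t^n$ being compared:

```python
import sympy as sp

t = sp.symbols('t')
x = sp.symbols('x1 x2 x3')
n = len(x)

# left-hand side: generating function of the elementary symmetric polynomials
lhs = sp.expand(sp.prod(1 + xi * t for xi in x))

# right-hand side: exp of the power-sum series, truncated at t^n
p = {k: sum(xi**k for xi in x) for k in range(1, n + 1)}
arg = sum((-1)**(k + 1) * p[k] * t**k / k for k in range(1, n + 1))
rhs = sp.series(sp.exp(arg), t, 0, n + 1).removeO()

# the two sides agree in every power of t up to t^n
assert sp.expand(lhs - rhs) == 0
```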

and simply match coefficients of $t^n$ in the initial and final series expansions, where we easily compute

$$x_1 x_2 \ldots x_n \;=\; \sum_{n = k_1 + 2 k_2 + \ldots + n k_n} \; \prod_{l=1}^n \frac{1}{k_l!} \left( \frac{p_l}{l} \right)^{k_l} (-1)^{k_l + 1} \,.$$

(The sign is as stated because $\sum_l l\, k_l = n$, so that the factor $\prod_{l} (-1)^{(l+1) k_l}$ arising from the expansion equals $(-1)^{n + \sum_l k_l} = \prod_{l} (-1)^{k_l + 1}$.)
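
For instance, for $n = 2$ the admissible tuples are $(k_1, k_2) = (2, 0)$ and $(0, 1)$, and matching the coefficient of $t^2$ gives

$$x_1 x_2 \;=\; \tfrac{1}{2} p_1^2 - \tfrac{1}{2} p_2 \,,$$

which is the $2 \times 2$ case of the formula in the Statement section.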

This completes the proof.

References

The proof of (1) is spelled out in

  • L. A. Kondratyuk, I. Krivoruchenko, Appendix B of: Superconducting quark matter in $SU(2)$ colour group, Zeitschrift für Physik A Hadrons and Nuclei, Volume 344, Issue 1 (March 1992), pp. 99–115 (doi:10.1007/BF01291027)
