If you’ve taken linear algebra and noticed that the trace and determinant of a matrix appear, up to sign, as coefficients of the characteristic polynomial of the matrix, then you may have wondered if the other coefficients have a similarly nice description. This turns out to be the case, but to explain it we’ll need to assume some knowledge about exterior powers of a vector space. Let $k$ be a field, let $V$ denote a finite dimensional vector space over $k$ of dimension $n$, and let $T : V \to V$ denote a $k$-linear map. Recall that $\Lambda^0 V = k$ and $\Lambda^1 V = V$. Recall furthermore that the exterior power $\Lambda^i$ is a functor, where $\Lambda^i(\mathrm{id}_V)$ is the identity map of $\Lambda^i V$ and for $T : V \to V$ one has $$\Lambda^i T\,(v_1 \wedge \cdots \wedge v_i) = Tv_1 \wedge \cdots \wedge Tv_i.$$
Let $p_T(t) = \det(tI - T)$ be the characteristic polynomial of the linear operator $T$. Then it turns out that $$p_T(t) = \sum_{i=0}^{n} (-1)^i \operatorname{tr}(\Lambda^i T)\, t^{n-i}.$$ Our class was asked to prove this when I was a fourth year undergraduate at Queen’s, and I did so via a messy computation involving a choice of basis. I’d like a more sophisticated proof of this fact, and while I don’t have one yet, I have managed to give a nice proof of the following fact: let $\lambda$ be an eigenvalue of $T$. Then $$\sum_{i=0}^{n} (-1)^i \operatorname{tr}(\Lambda^i T)\, \lambda^{n-i} = 0.$$ This fact follows if you accept my description of the coefficients of $p_T(t)$ simply by substituting $t = \lambda$ into the characteristic polynomial. We’ll give an independent proof using the following cool lemma:
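As a quick numerical sanity check of the coefficient description (my addition, not part of the original argument), here is a short NumPy sketch. It uses the standard fact that $\operatorname{tr}(\Lambda^i T)$ equals the sum of the $i \times i$ principal minors of a matrix representing $T$:

```python
import itertools
import numpy as np

def trace_lambda(T, i):
    """tr(Λ^i T): the sum of the i x i principal minors of T (1 when i == 0)."""
    if i == 0:
        return 1.0
    n = T.shape[0]
    return sum(np.linalg.det(T[np.ix_(S, S)])
               for S in itertools.combinations(range(n), i))

rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n))

# Coefficients of det(tI - T), highest degree first.
char_coeffs = np.poly(T)

# The claimed coefficients: (-1)^i tr(Λ^i T) in front of t^(n-i).
claimed = [(-1) ** i * trace_lambda(T, i) for i in range(n + 1)]

assert np.allclose(char_coeffs, claimed)
```

The $i = 1$ and $i = n$ terms recover the familiar trace and determinant coefficients.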
Lemma: let $$0 \to V_0 \to V_1 \to \cdots \to V_n \to 0$$ be an exact sequence of $k$-vector spaces, and let $T_i : V_i \to V_i$ for $0 \le i \le n$ be endomorphisms commuting with the arrows of the exact sequence above. Then $$\sum_{i=0}^{n} (-1)^i \operatorname{tr}(T_i) = 0.$$
This lemma is not difficult to prove. For a five term short exact sequence $0 \to V_0 \to V_1 \to V_2 \to 0$, build a nice basis for $V_1$ which induces bases on $V_0$ and $V_2$ such that $T_0$ and $T_2$ are upper triangular in these bases. Then $T_1$ will also be upper triangular, and you can compute its trace to be the sum of the traces of $T_0$ and $T_2$. Note that one should really base-change to an algebraically closed field, so that the matrices can be brought to upper-triangular form. To deduce the general result from the case of a five term short exact sequence, simply break the longer sequence up into five term short exact sequences; see also the corresponding proposition in Atiyah and MacDonald’s Introduction to Commutative Algebra.
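The key point in the short exact sequence case can be seen concretely (my illustration, not the post’s proof): in a basis of $V_1$ adapted to $0 \to V_0 \to V_1 \to V_2 \to 0$, an endomorphism commuting with the arrows is block upper triangular, so its trace splits:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))  # T_0, the induced map on the subspace V_0
C = rng.standard_normal((3, 3))  # T_2, the induced map on the quotient V_2
B = rng.standard_normal((2, 3))  # arbitrary off-diagonal block

# T_1 in a basis of V_1 extending a basis of V_0: block upper triangular.
T1 = np.block([[A, B], [np.zeros((3, 2)), C]])

# tr(T_1) = tr(T_0) + tr(T_2), regardless of B.
assert np.isclose(np.trace(T1), np.trace(A) + np.trace(C))
```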
With the lemma out of the way, recall that we now want to use it to prove that if $\lambda$ is an eigenvalue for our operator $T$, then $$\sum_{i=0}^{n} (-1)^i \operatorname{tr}(\Lambda^i T)\, \lambda^{n-i} = 0.$$
When $\lambda = 0$ this amounts to proving simply that $\operatorname{tr}(\Lambda^n T) = 0$. Recall that $\operatorname{tr}(\Lambda^n T) = \det(T)$, and in this case $\det(T) = 0$ since $T$ is not invertible. So we may now suppose that $\lambda \neq 0$, and in this case the displayed formula may be rearranged (divide through by $\lambda^n$ and use $\lambda^{-i}\operatorname{tr}(\Lambda^i T) = \operatorname{tr}(\Lambda^i(\lambda^{-1} T))$) to read $$\sum_{i=0}^{n} (-1)^i \operatorname{tr}\big(\Lambda^i(\lambda^{-1} T)\big) = 0.$$
When expressed in this form, it is clear that we should apply the homological lemma proved above taking $V_i = \Lambda^i V$ and $T_i = \Lambda^i(\lambda^{-1} T)$ for $0 \le i \le n$. The exact sequence $$0 \to \Lambda^0 V \to \Lambda^1 V \to \cdots \to \Lambda^n V \to 0$$
is defined in the following way: let $v$ denote an eigenvector for $T$ of eigenvalue $\lambda$. Let $d_0 : \Lambda^0 V = k \to \Lambda^1 V = V$ be defined by mapping $1 \mapsto v$. Then for $i \ge 1$ define $d_i : \Lambda^i V \to \Lambda^{i+1} V$ by sending $\omega \mapsto v \wedge \omega$. Completing $v$ to a basis for $V$ allows one to easily show that the sequence above is exact. Now one simply needs to check that the sequence of maps $\Lambda^i(\lambda^{-1} T)$ commutes with the maps of the exact sequence, which is a simple calculation boiling down to the fact that $\lambda^{-1} T$ fixes the vector $v$. This completes the proof.
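Rather than building the Koszul-type complex explicitly, one can at least spot-check numerically (my addition) the conclusion the lemma delivers, namely that the alternating sum $\sum_i (-1)^i \operatorname{tr}(\Lambda^i(\lambda^{-1} T))$ vanishes for an eigenvalue $\lambda \neq 0$:

```python
import itertools
import numpy as np

def trace_lambda(T, i):
    """tr(Λ^i T): the sum of the i x i principal minors of T (1 when i == 0)."""
    if i == 0:
        return 1.0
    n = T.shape[0]
    return sum(np.linalg.det(T[np.ix_(S, S)])
               for S in itertools.combinations(range(n), i))

rng = np.random.default_rng(2)
n = 4
T = rng.standard_normal((n, n))

# Pick the eigenvalue of largest modulus, to stay safely away from lam == 0.
lam = max(np.linalg.eigvals(T), key=abs)

# Λ^i scales by lam**(-i) under T ↦ T/lam, so the alternating sum of
# tr(Λ^i(T/lam)) should vanish when lam is an eigenvalue.
total = sum((-1) ** i * trace_lambda(T, i) / lam ** i for i in range(n + 1))

assert abs(total) < 1e-6
```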
There are several things that I’m unhappy with in this post. Firstly, I haven’t shown that $\sum_{i=0}^{n} (-1)^i \operatorname{tr}(\Lambda^i T)\, t^{n-i}$ is the characteristic polynomial of $T$. All I’ve shown is that this polynomial has all of the eigenvalues of $T$ as roots. This isn’t even enough to prove that the minimal polynomial of $T$ divides my polynomial! What I’d like to prove is that $$\det(tI - T) = \sum_{i=0}^{n} (-1)^i \operatorname{tr}(\Lambda^i T)\, t^{n-i}$$
via some kind of “homological” argument which avoids the use of the Cayley–Hamilton theorem (from which this result does follow). This fact has always looked to me like some sort of Lefschetz fixed-point formula, and I think it’s intriguing that the proof of the partial result above relied on considering the operator $\lambda^{-1} T$ and its fixed point $v$.
Anyway, I’ll end with one other thing that I’d like to see proved in an elegant way. For this we’ll need to assume that the characteristic of our base field is zero. It’s not too hard to show that $T$ is nilpotent if and only if $\operatorname{tr}(T^i) = 0$ for all $i \ge 1$. Since $T$ is nilpotent if and only if its characteristic polynomial is of the form $t^n$, the description of the coefficients of the characteristic polynomial above suggests that one might be able to express $\operatorname{tr}(\Lambda^i T)$ as a polynomial expression in the various $\operatorname{tr}(T^j)$. Without too much difficulty I was able to prove that $$\operatorname{tr}(\Lambda^2 T) = \tfrac{1}{2}\big(\operatorname{tr}(T)^2 - \operatorname{tr}(T^2)\big).$$
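This $i = 2$ identity is easy to confirm numerically (my addition), again writing $\operatorname{tr}(\Lambda^2 T)$ as the sum of the $2 \times 2$ principal minors:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
T = rng.standard_normal((n, n))

# tr(Λ^2 T) as the sum of all 2x2 principal minors of T.
tr_l2 = sum(T[i, i] * T[j, j] - T[i, j] * T[j, i]
            for i in range(n) for j in range(i + 1, n))

# Compare with (1/2)(tr(T)^2 - tr(T^2)).
assert np.isclose(tr_l2, 0.5 * (np.trace(T) ** 2 - np.trace(T @ T)))
```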
I quickly gave up trying to deduce a general formula, since it was getting late, and instead turned to the internet for help. During my brief search I was only able to find a link describing “Bôcher’s formula” for the coefficients of the characteristic polynomial. In our notation it reads $$\operatorname{tr}(\Lambda^m T) = \frac{1}{m} \sum_{i=1}^{m} (-1)^{i-1} \operatorname{tr}(\Lambda^{m-i} T)\, \operatorname{tr}(T^i)$$
for $1 \le m \le n$. I’ve never seen this proved, so don’t quote this result from me. However, when $m = 2$ it recovers the formula I found, and for general $m$ it gives a recursive formula for $\operatorname{tr}(\Lambda^m T)$ as a polynomial expression in $\operatorname{tr}(T^i)$ for $1 \le i \le m$. So it’d be nice to have an elegant proof of this version of “Bôcher’s formula” as well.
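For what it’s worth, the recursion checks out numerically (my sketch; it is Newton’s identities in disguise, with $\operatorname{tr}(\Lambda^m T)$ playing the role of the elementary symmetric polynomials in the eigenvalues and $\operatorname{tr}(T^i)$ the power sums):

```python
import itertools
import numpy as np

def trace_lambda(T, i):
    """tr(Λ^i T): the sum of the i x i principal minors of T (1 when i == 0)."""
    if i == 0:
        return 1.0
    n = T.shape[0]
    return sum(np.linalg.det(T[np.ix_(S, S)])
               for S in itertools.combinations(range(n), i))

rng = np.random.default_rng(4)
n = 5
T = rng.standard_normal((n, n))

# Check tr(Λ^m T) = (1/m) Σ_{i=1}^{m} (-1)^(i-1) tr(Λ^(m-i) T) tr(T^i).
for m in range(1, n + 1):
    rhs = sum((-1) ** (i - 1) * trace_lambda(T, m - i)
              * np.trace(np.linalg.matrix_power(T, i))
              for i in range(1, m + 1)) / m
    assert np.isclose(trace_lambda(T, m), rhs)
```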