Conversely, if $\exp(tX) \in U(n)$ for all $t$, we have
\[ \exp(tX)^* \exp(tX) = I \]
for all $t$. Taking the derivative at $t = 0$ gives $X^* + X = 0$. Thus the Lie algebra of $U(n)$ is
\[ \mathfrak{u}(n) = \{ X \in M_n(\mathbb{C}) : X^* = -X \}. \]
We have $X^* = -X$ if and only if $X_{jj}$ is imaginary for all $j$ and
\[ X_{kj} = -\overline{X_{jk}} \]
for all $j < k$. Thus $X$ is determined by its imaginary diagonal entries and its complex entries above the diagonal. Its real dimension is thus
\[ n + 2 \cdot \tfrac{n(n-1)}{2} = n^2. \]
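As a sanity check on this dimension count, one can exhibit an explicit basis: $i$ times the diagonal matrix units, together with $E_{jk} - E_{kj}$ and $i(E_{jk} + E_{kj})$ for $j < k$. A quick numerical sketch (assuming NumPy; the helper name is ours):

```python
import numpy as np

def skew_hermitian_basis(n):
    """Basis of u(n): i on the diagonal, and the pairs
    E_jk - E_kj, i(E_jk + E_kj) above the diagonal."""
    basis = []
    for j in range(n):
        M = np.zeros((n, n), dtype=complex)
        M[j, j] = 1j
        basis.append(M)
    for j in range(n):
        for k in range(j + 1, n):
            A = np.zeros((n, n), dtype=complex)
            A[j, k], A[k, j] = 1, -1
            basis.append(A)
            B = np.zeros((n, n), dtype=complex)
            B[j, k], B[k, j] = 1j, 1j
            basis.append(B)
    return basis

n = 4
basis = skew_hermitian_basis(n)
# every element is skew-Hermitian ...
assert all(np.allclose(M.conj().T, -M) for M in basis)
# ... and the n^2 of them are linearly independent over R: flatten
# real and imaginary parts and check the rank
flat = np.array([np.concatenate([M.real.ravel(), M.imag.ravel()]) for M in basis])
assert np.linalg.matrix_rank(flat) == n * n
```

The rank computation confirms that these $n^2$ matrices are linearly independent over $\mathbb{R}$.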
If $X \in \mathfrak{u}(n)$ is nonzero, then
\[ (iX)^* = -iX^* = iX \neq -iX, \]
so $iX \notin \mathfrak{u}(n)$. Thus $\mathfrak{u}(n)$ is not a complex subspace of $M_n(\mathbb{C})$.
For a challenge, try to show that there is no complex structure on $\mathfrak{u}(n)$: there is no $\mathbb{R}$-linear map
\[ J \colon \mathfrak{u}(n) \to \mathfrak{u}(n) \]
such that
\[ J^2(X) = -X \]
for all $X$ and
\[ J([X, Y]) = [J(X), Y] \]
for all $X, Y$ (for $n$ odd this is easy, but it is trickier for $n$ even).
whence $\exp(tX) \in O(n)$. Conversely, if $\exp(tX) \in O(n)$ for all $t$ then
\[ \exp(tX)^\top \exp(tX) = I \]
for all $t$. Differentiating with respect to $t$ at $t = 0$ gives
\[ X^\top + X = 0, \]
as required.
For the last part, we need only show that $X^\top = -X$ implies $\operatorname{tr} X = 0$. But if $X^\top = -X$ then
\[ \operatorname{tr} X = \operatorname{tr}(X^\top) = -\operatorname{tr} X, \]
so $\operatorname{tr} X = 0$ as required.
1. Problem 55 suggests that we consider the following basis of infinitesimal rotations around the $x$-, $y$- and $z$-axes:
\[ X = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix}, \quad Y = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{pmatrix}, \quad Z = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}. \]
In fact, these are the images of the standard basis vectors under an isomorphism $(\mathbb{R}^3, \times) \to \mathfrak{so}(3)$. By calculation we have
\[ [X, Y] = Z, \quad [Y, Z] = X, \quad [Z, X] = Y. \]
These may remind you of the quaternion group, whose irreducible two-dimensional representation leads us to consider the following basis for $\mathfrak{su}(2)$:
\[ x = \frac{1}{2}\begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \quad y = \frac{1}{2}\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}, \quad z = \frac{1}{2}\begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix}. \]
We have
\[ [x, y] = z, \quad [y, z] = x, \quad [z, x] = y \]
— we need the factors of $\frac{1}{2}$ for this, otherwise the right hand sides would be doubled.
It follows that the linear map $\mathfrak{su}(2) \to \mathfrak{so}(3)$ taking $x$ to $X$, $y$ to $Y$ and $z$ to $Z$ is an isomorphism of Lie algebras.
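Both sets of commutation relations are easy to verify numerically; here is a quick check (a sketch assuming NumPy; the `bracket` helper is ours):

```python
import numpy as np

def bracket(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# infinitesimal rotations about the three coordinate axes, spanning so(3)
X = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float)
Y = np.array([[0, 0, 1], [0, 0, 0], [-1, 0, 0]], dtype=float)
Z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)

# quaternion-flavoured basis of su(2), with the crucial factors of 1/2
x = 0.5 * np.array([[1j, 0], [0, -1j]])
y = 0.5 * np.array([[0, 1], [-1, 0]], dtype=complex)
z = 0.5 * np.array([[0, 1j], [1j, 0]])

# both triples satisfy the same cyclic commutation relations
assert np.allclose(bracket(X, Y), Z) and np.allclose(bracket(x, y), z)
assert np.allclose(bracket(Y, Z), X) and np.allclose(bracket(y, z), x)
assert np.allclose(bracket(Z, X), Y) and np.allclose(bracket(z, x), y)
```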
2. By the problems class (or problem 7), we know that $\mathfrak{sl}(2,\mathbb{C})$ has a basis $(e, h, f)$ with
\[ [h, e] = 2e, \quad [h, f] = -2f, \quad [e, f] = h. \]
We calculate that $[2iZ, iX - Y] = 2(iX - Y)$, $[2iZ, iX + Y] = -2(iX + Y)$, and $[iX - Y, iX + Y] = 2iZ$. It follows that $H = 2iZ$, $E = iX - Y$, $F = iX + Y$ satisfy the same commutation relations as $h$, $e$, $f$,
so that there is a Lie algebra isomorphism $\mathfrak{sl}(2,\mathbb{C}) \to \mathfrak{so}(3,\mathbb{C})$ sending $h \mapsto H$, $e \mapsto E$, $f \mapsto F$.
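Assuming the basis $X, Y, Z$ of infinitesimal rotations from part 1, one concrete choice of images (conventions differ by scalar factors) can be checked numerically:

```python
import numpy as np

def bracket(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# standard basis of sl(2, C)
e = np.array([[0, 1], [0, 0]], dtype=complex)
f = np.array([[0, 0], [1, 0]], dtype=complex)
h = np.array([[1, 0], [0, -1]], dtype=complex)
assert np.allclose(bracket(h, e), 2 * e)
assert np.allclose(bracket(h, f), -2 * f)
assert np.allclose(bracket(e, f), h)

# infinitesimal rotations spanning so(3), now over C
X = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=complex)
Y = np.array([[0, 0, 1], [0, 0, 0], [-1, 0, 0]], dtype=complex)
Z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 0]], dtype=complex)

# one possible choice of images in so(3, C); they satisfy the
# same relations as h, e, f, so they define an isomorphism
H, E, F = 2j * Z, 1j * X - Y, 1j * X + Y
assert np.allclose(bracket(H, E), 2 * E)
assert np.allclose(bracket(H, F), -2 * F)
assert np.allclose(bracket(E, F), H)
```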
How might you think of this? Well, the eigenvalues of the linear map $\operatorname{ad} h$ are $0, \pm 2$, with $e$ and $f$ the eigenvectors. So you might look for an element $H$ of $\mathfrak{so}(3,\mathbb{C})$ such that the eigenvalues of $\operatorname{ad} H$ are $0, \pm 2$, and $H = 2iZ$ above works; $E$ and $F$ are then the eigenvectors!
For a more conceptual approach, let $\beta(P, Q) = \operatorname{tr}(PQ)$, a bilinear form on $\mathfrak{sl}(2,\mathbb{C})$. For each $g \in SL(2,\mathbb{C})$, $\operatorname{Ad}(g) \colon P \mapsto gPg^{-1}$ is a linear map preserving this bilinear form. But it is possible to write down a basis $P_1, P_2, P_3$ of $\mathfrak{sl}(2,\mathbb{C})$ such that
\[ \beta(P_j, P_k) = \delta_{jk}. \]
With respect to this basis, $\beta$ is the bilinear form determined by the identity matrix and so $\operatorname{Ad}(g) \in O(3,\mathbb{C})$ for all $g \in SL(2,\mathbb{C})$. The derived map on Lie algebras $\mathfrak{sl}(2,\mathbb{C}) \to \mathfrak{so}(3,\mathbb{C})$ is the desired isomorphism.
3. Here is a possible approach. Show that $SL(2,\mathbb{C})$ acts on the four-dimensional real vector space $W$ of $2 \times 2$ Hermitian matrices by $g \cdot A = gAg^*$ for $g \in SL(2,\mathbb{C})$ and $A$ a Hermitian matrix. The quadratic form $q(A) = \det A$ on $W$ is preserved by this action. It has signature $(1, 3)$; indeed, it is positive definite on the space of matrices
\[ \begin{pmatrix} t & 0 \\ 0 & t \end{pmatrix} \quad (t \in \mathbb{R}) \]
and negative definite on the subspace of matrices
\[ \begin{pmatrix} z & x - iy \\ x + iy & -z \end{pmatrix} \quad (x, y, z \in \mathbb{R}). \]
We obtain a map $SL(2,\mathbb{C}) \to O(1,3)$; its derivative is the required isomorphism.
We deduce that the map
\[ \begin{pmatrix} a & b \\ -\bar b & \bar a \end{pmatrix} \mapsto (a, b) \]
is a diffeomorphism $SU(2) \to \{ (a, b) \in \mathbb{C}^2 : |a|^2 + |b|^2 = 1 \}$ (it and its inverse are clearly smooth). Moreover, writing $a = x_1 + ix_2$, $b = x_3 + ix_4$, the latter space is
\[ \{ x \in \mathbb{R}^4 : x_1^2 + x_2^2 + x_3^2 + x_4^2 = 1 \}, \]
which is the three-sphere.
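The correspondence with the three-sphere can be sketched numerically: a unit vector in $\mathbb{R}^4$ determines a matrix of the displayed form, which is then unitary with determinant $1$ (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)

# a random point on the three-sphere ...
v = rng.normal(size=4)
v /= np.linalg.norm(v)
a, b = v[0] + 1j * v[1], v[2] + 1j * v[3]

# ... gives a matrix [[a, b], [-conj(b), conj(a)]] ...
g = np.array([[a, b], [-b.conjugate(), a.conjugate()]])

# ... which lies in SU(2): unitary with determinant 1
assert np.allclose(g.conj().T @ g, np.eye(2))
assert np.isclose(np.linalg.det(g), 1)
```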
Firstly we will show that the Lie algebra of $Z(G)$ is contained in $\mathfrak{z}(\mathfrak{g})$. Indeed, suppose that $X \in \mathfrak{g}$ with
\[ \exp(tX) \in Z(G) \]
for all $t$. Then for all $g \in G$,
\[ g \exp(tX) g^{-1} = \exp(tX) \]
for all $t$. Taking the derivative at $t = 0$ gives
\[ \operatorname{Ad}(g) X = X \]
for all $g \in G$, and taking the derivative of this at $g = \exp(sY)$, $s = 0$, gives $[Y, X] = 0$ for all $Y \in \mathfrak{g}$, whence $X \in \mathfrak{z}(\mathfrak{g})$.
Conversely, if $G$ is connected and $X \in \mathfrak{z}(\mathfrak{g})$, then I claim that $\exp(X) \in Z(G)$. Indeed, for $Y \in \mathfrak{g}$,
\[ \exp(Y) \exp(X) \exp(-Y) = \exp(\operatorname{Ad}(\exp Y)X) = \exp(e^{\operatorname{ad} Y} X) = \exp(X) \]
as $(\operatorname{ad} Y)(X) = [Y, X] = 0$. So $\exp(X)$ commutes with all elements of $G$ of the form $\exp(Y)$. Since these generate $G$ by the connectedness assumption, we see $\exp(X) \in Z(G)$.
For typesetting reasons I’ll write $(x, y)^\top$ for the column vector $\begin{pmatrix} x \\ y \end{pmatrix}$.
Starting with $e$:
\[ (e \cdot \phi)((x, y)^\top) = \left.\frac{d}{dt}\right|_{t=0} \phi(\exp(-te)(x, y)^\top) = \left.\frac{d}{dt}\right|_{t=0} \phi((x - ty, y)^\top) = -y\,\frac{\partial \phi}{\partial x}((x, y)^\top) \]
by the multivariable chain rule. Thus $e$ acts as $-y\frac{\partial}{\partial x}$ and a very similar calculation shows that $f$ acts as $-x\frac{\partial}{\partial y}$.
Finally,
\[ (h \cdot \phi)((x, y)^\top) = \left.\frac{d}{dt}\right|_{t=0} \phi((e^{-t}x, e^{t}y)^\top) = -x\,\frac{\partial \phi}{\partial x}((x,y)^\top) + y\,\frac{\partial \phi}{\partial y}((x,y)^\top), \]
so that $h$ acts as $-x\frac{\partial}{\partial x} + y\frac{\partial}{\partial y}$.
An alternative solution would be to compute
\[ (X \cdot \phi)(v) = \left.\frac{d}{dt}\right|_{t=0} \phi(\exp(-tX)v) \]
using the multivariate chain rule. The derivative of $t \mapsto \exp(-tX)v$ at $t = 0$ is $-Xv$ and the derivative of $\phi$ is $D\phi$, so we get that the required derivative is
\[ (X \cdot \phi)(v) = -D\phi(v)(Xv), \]
which one can check agrees with the answers from before.
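Assuming the convention $(g \cdot \phi)(v) = \phi(g^{-1}v)$, the infinitesimal action of $e$ can also be checked symbolically (a sketch assuming SymPy; the sample polynomial is arbitrary):

```python
import sympy as sp

x, y, t = sp.symbols('x y t')
phi = x**3 + 2 * x * y**2 + y**3   # an arbitrary sample polynomial

# e = [[0, 1], [0, 0]] is nilpotent, so exp(-t*e) = I - t*e, which
# sends the column vector (x, y) to (x - t*y, y)
pulled_back = phi.subs(x, x - t * y)
infinitesimal = sp.diff(pulled_back, t).subs(t, 0)

# the derived action of e should be -y * d/dx
assert sp.expand(infinitesimal + y * sp.diff(phi, x)) == 0
```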
Remark. Another possible convention is to use $g^\top$ rather than $g^{-1}$, which leads to slightly different formulas. This second convention is the same as if we considered elements of $\mathbb{C}^2$ as row vectors, with matrices acting on the right, and defined instead
\[ (g \cdot \phi)(v) = \phi(vg). \]
1. Let $e_k$ be the $k$th standard basis vector. Multiplying out, and comparing the terms on each side, we get exactly the formula we want.
2. This is similar: multiplying out and simplifying gives the required formula.
3. Compute the action of $e$, $h$ and $f$ on each of the basis vectors; we skip the working.
We will show that $\Omega = ef + fe + \tfrac{1}{2}h^2$ commutes with each of $e$, $f$ and $h$. Note that, since we have a Lie algebra representation, the operators by which $e$, $f$ and $h$ act (which we denote by the same letters) satisfy
\[ ef - fe = h, \quad he - eh = 2e, \quad hf - fh = -2f. \]
For instance, the first equation follows from $[e, f] = h$. So we get
\[ [\Omega, e] = [ef, e] + [fe, e] + \tfrac{1}{2}[h^2, e] = e[f, e] + [f, e]e + \tfrac{1}{2}\left(h[h, e] + [h, e]h\right) = -eh - he + he + eh = 0, \]
and therefore
\[ \Omega e = e\Omega. \]
Similarly, $\Omega$ commutes with $f$. Finally,
\[ [\Omega, h] = [ef, h] + [fe, h] = e[f, h] + [e, h]f + f[e, h] + [f, h]e = 2ef - 2ef - 2fe + 2fe = 0. \]
If $V$ is irreducible, then since $\Omega$ commutes with all elements of $\mathfrak{sl}(2,\mathbb{C})$ it is a homomorphism of representations $V \to V$, and so is scalar by Schur’s lemma.
The representation $V_n$ is irreducible, and so $\Omega$ acts on it as a scalar. To find the scalar, we just need to evaluate $\Omega$ on a single element of $V_n$; I will use the highest weight vector $v_n$. We have
\[ \Omega v_n = ef v_n + fe v_n + \tfrac{1}{2}h^2 v_n = (fe + h)v_n + 0 + \tfrac{1}{2}n^2 v_n = \left(n + \tfrac{n^2}{2}\right)v_n, \]
so $\Omega$ acts as the scalar $\tfrac{1}{2}n(n+2)$ on $V_n$. Here we see why we might want to use $\tfrac{1}{2}\Omega$ instead: then it acts as $\tfrac{n}{2}\left(\tfrac{n}{2}+1\right)$.
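The scalar $\tfrac{1}{2}n(n+2)$ can be confirmed numerically by writing down matrices for $h$, $e$, $f$ on the $(n+1)$-dimensional irreducible representation in one standard weight-basis normalization (in which $f$ shifts the basis with coefficient $1$; the helper name is ours):

```python
import numpy as np

def irrep(n):
    """Matrices of h, e, f on the (n+1)-dimensional irreducible
    representation V_n, in a weight basis v_n, v_{n-2}, ..., v_{-n}."""
    d = n + 1
    H = np.diag([n - 2 * j for j in range(d)]).astype(float)
    E = np.zeros((d, d))
    F = np.zeros((d, d))
    for j in range(d - 1):
        F[j + 1, j] = 1.0                # f . v_{n-2j}   = v_{n-2j-2}
        E[j, j + 1] = (j + 1) * (n - j)  # e . v_{n-2j-2} = (j+1)(n-j) v_{n-2j}
    return H, E, F

for n in range(6):
    H, E, F = irrep(n)
    # sanity check: the sl(2) commutation relations hold
    assert np.allclose(H @ E - E @ H, 2 * E)
    assert np.allclose(H @ F - F @ H, -2 * F)
    assert np.allclose(E @ F - F @ E, H)
    # the Casimir acts as the scalar n(n+2)/2
    Omega = E @ F + F @ E + H @ H / 2
    assert np.allclose(Omega, n * (n + 2) / 2 * np.eye(n + 1))
```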
Recall that $e$ acts as $-y\frac{\partial}{\partial x}$, $f$ acts as $-x\frac{\partial}{\partial y}$, and $h$ acts as $-x\frac{\partial}{\partial x} + y\frac{\partial}{\partial y}$. We see that
\[ \Omega = ef + fe + \tfrac{1}{2}h^2 \text{ acts as } x\frac{\partial}{\partial x} + y\frac{\partial}{\partial y} + 2xy\frac{\partial^2}{\partial x \partial y} + \frac{1}{2}\left(x\frac{\partial}{\partial x} - y\frac{\partial}{\partial y}\right)^2. \]
If we apply this to a monomial $x^a y^b$ of degree $n = a + b$, we find that
\[ \Omega(x^a y^b) = \left(a + b + 2ab + \tfrac{(a-b)^2}{2}\right) x^a y^b = \left(n + \tfrac{n^2}{2}\right) x^a y^b. \]
We can explain this as follows: the space of homogeneous polynomial functions of degree $n$ is isomorphic to $V_n$ and so the calculation from the previous part applies!
We use the notation $v_n, v_{n-2}, \ldots, v_{-n}$ for the usual weight basis of $V_n$, so $v_k$ has weight $k$, for $k = n, n-2, \ldots, -n$. We have the formulas $f \cdot v_k = v_{k-2}$ and $e \cdot v_k = \tfrac{n-k}{2}\left(\tfrac{n+k}{2}+1\right)v_{k+2}$ (with $v_{n+2} = v_{-n-2} = 0$). We also abbreviate $v_a \otimes v_b$ to $v_a v_b$.
The weights of $V_2$ are $2, 0, -2$. To obtain the weights of $\operatorname{Sym}^2 V_2$ we add together all possible (unordered, possibly equal) pairs of these and get:
\[ 4, 2, 0, 0, -2, -4. \]
Thus
\[ \operatorname{Sym}^2 V_2 \cong V_4 \oplus V_0. \]
A highest weight vector of weight 4 is $v_2 v_2$ (clear as it is a symmetric product of highest weight vectors). To get a highest weight vector of weight 0 we must take a linear combination of $v_2 v_{-2}$ and $v_0 v_0$ which is killed by $e$. Since $e \cdot (v_2 v_{-2}) = 2 v_2 v_0$ and $e \cdot (v_0 v_0) = 4 v_2 v_0$ we see that
\[ v_0 v_0 - 2 v_2 v_{-2} \]
is a weight vector of weight 0 killed by $e$, so a highest weight vector of weight 0.
This time we must take all sums of unordered pairs of distinct elements of $2, 0, -2$. This gives
\[ 2, 0, -2, \]
so that $\Lambda^2 V_2 \cong V_2$. A highest weight vector of weight $2$ is $v_2 \wedge v_0$.
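Both decompositions can be checked mechanically by comparing weight multisets (a sketch in Python; the greedy `irreducible_weights` helper is ours and assumes the multiset really is a sum of irreducible weight strings):

```python
from itertools import combinations, combinations_with_replacement
from collections import Counter

def weights(n):
    """Weights of the irreducible representation V_n."""
    return list(range(n, -n - 1, -2))

def irreducible_weights(multiset):
    """Greedily strip off copies of V_m from a weight multiset,
    returning the highest weights that occur."""
    counts = Counter(multiset)
    highest = []
    while any(counts.values()):
        m = max(w for w, c in counts.items() if c > 0)
        highest.append(m)
        counts.subtract(weights(m))
    return highest

V2 = weights(2)                                     # [2, 0, -2]
sym2 = [a + b for a, b in combinations_with_replacement(V2, 2)]
alt2 = [a + b for a, b in combinations(V2, 2)]
assert sorted(irreducible_weights(sym2)) == [0, 4]  # Sym^2 V_2 = V_4 + V_0
assert sorted(irreducible_weights(alt2)) == [2]     # Alt^2 V_2 = V_2
```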
We have to add together all pairs of weights from $3, 1, -1, -3$ and $2, 0, -2$, giving
\[ 5, 3, 3, 1, 1, 1, -1, -1, -1, -3, -3, -5 \]
as the weights of $V_3 \otimes V_2$. Thus the decomposition is
\[ V_3 \otimes V_2 \cong V_5 \oplus V_3 \oplus V_1. \]
A highest weight vector of weight 5 is $v_3 \otimes v_2$. We can now apply $f$ repeatedly (and divide out by constant factors where possible to keep the numbers small) to obtain a weight basis of the copy of $V_5$ in the representation, as shown in the table.
We have $e \cdot (v_3 \otimes v_0) = 2\, v_3 \otimes v_2$ and $e \cdot (v_1 \otimes v_2) = 3\, v_3 \otimes v_2$, so that
\[ 3\, v_3 \otimes v_0 - 2\, v_1 \otimes v_2 \]
is a highest weight vector of weight 3. We apply $f$ repeatedly (and divide out scalars where possible) to obtain a weight basis of the copy of $V_3$ in the representation:
Notice that we can ‘cheat’ and obtain just the weight vectors with nonpositive weight, and then apply the symmetry sending $v_k$ to $v_{-k}$ to obtain those of nonnegative weight.
Finally, we have $e \cdot (v_3 \otimes v_{-2}) = 2\, v_3 \otimes v_0$, $e \cdot (v_1 \otimes v_0) = 3\, v_3 \otimes v_0 + 2\, v_1 \otimes v_2$, and $e \cdot (v_{-1} \otimes v_2) = 4\, v_1 \otimes v_2$, so that
\[ 3\, v_3 \otimes v_{-2} - 2\, v_1 \otimes v_0 + v_{-1} \otimes v_2 \]
is a highest weight vector of weight 1. Applying $f$ (or the symmetry discussed above) we see that this vector together with
\[ 3\, v_{-3} \otimes v_2 - 2\, v_{-1} \otimes v_0 + v_1 \otimes v_{-2} \]
is a weight basis for the copy of $V_1$.
We must add together all unordered triples of (not necessarily distinct) elements of $2, 0, -2$. We get that the weights of $\operatorname{Sym}^3 V_2$ are:
\[ 6, 4, 2, 2, 0, 0, -2, -2, -4, -6, \]
so that
\[ \operatorname{Sym}^3 V_2 \cong V_6 \oplus V_2. \]
A highest weight vector of weight $6$ is $v_2 v_2 v_2$. We have $e \cdot (v_2 v_2 v_{-2}) = 2\, v_2 v_2 v_0$ and $e \cdot (v_2 v_0 v_0) = 4\, v_2 v_2 v_0$, so that
\[ 2\, v_2 v_2 v_{-2} - v_2 v_0 v_0 \]
is a highest weight vector of weight 2.
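Again the weight multiset can be compared mechanically with that of $V_6 \oplus V_2$ (a short Python check):

```python
from itertools import combinations_with_replacement
from collections import Counter

# weights of V_2, and of its third symmetric power: all sums of
# unordered triples with repetition allowed
V2 = [2, 0, -2]
sym3 = Counter(sum(t) for t in combinations_with_replacement(V2, 3))

# weights of V_6 plus weights of V_2
expected = Counter(range(-6, 7, 2)) + Counter([-2, 0, 2])
assert sym3 == expected
```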
The weights of $V_n$ are $n, n-2, \ldots, -n$ and the weights of $V_m$ are $m, m-2, \ldots, -m$. Without loss of generality, $n \geq m$. Adding these lists together, remembering multiplicity, we see that in the tensor product:
For weights $n+m, n+m-2, \ldots, n-m+2$, each $n+m-2j$ occurs $j+1$ times, as
\[ (n - 2a) + (m - 2b) \quad \text{with } a + b = j; \]
the same holds for their negatives.
Each weight $k$ with $|k| \leq n - m$ (and $k \equiv n + m \bmod 2$) occurs $m+1$ times; specifically, $k$ occurs as
\[ (k - m + 2b) + (m - 2b) \quad \text{for } b = 0, 1, \ldots, m. \]
This agrees with the weights of
\[ V_{n+m} \oplus V_{n+m-2} \oplus \cdots \oplus V_{n-m}, \]
and so this is the decomposition of $V_n \otimes V_m$ into irreducibles.
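The Clebsch–Gordan pattern $V_n \otimes V_m \cong V_{n+m} \oplus V_{n+m-2} \oplus \cdots \oplus V_{n-m}$ (for $n \geq m$) can be verified on small cases by comparing weight multisets (a sketch in Python):

```python
from collections import Counter

def weights(n):
    """Weights of the irreducible representation V_n."""
    return list(range(-n, n + 1, 2))

# compare the weight multiset of V_n (x) V_m with that of the
# direct sum V_{n+m} + V_{n+m-2} + ... + V_{n-m}
for n in range(6):
    for m in range(n + 1):
        tensor = Counter(a + b for a in weights(n) for b in weights(m))
        direct_sum = Counter()
        for k in range(n - m, n + m + 1, 2):
            direct_sum.update(weights(k))
        assert tensor == direct_sum
```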
Omitted (for now).