Today I would like to discuss an open “toy problem” proposed by Jairo Bochi and Bassam Fayad about the Lyapounov exponents of certain products of matrices. However, let me reassure everyone unfamiliar with Lyapounov exponents and cocycles: I will avoid technical definitions, so I hope this post will be accessible to a broad audience.
Basically, the “toy problem” can be stated as follows:
Problem (Bochi and Fayad, 2006). Let $A \in SL(2,\mathbb{R})$ be an arbitrary hyperbolic matrix. Is it true that there are some constants $\delta > 0$ and $c > 0$ such that for almost every matrix $B \in SL(2,\mathbb{R})$ and for every word $(w_1, \dots, w_n) \in \{0,1\}^n$ satisfying the frequency condition (1):

$$\#\{1 \leq i \leq n : w_i = 1\} \leq \delta \cdot n,$$

the estimate (2):

$$\|C_n \cdots C_1\| \geq e^{cn}, \quad \text{where } C_i = A \text{ when } w_i = 0 \text{ and } C_i = B \text{ when } w_i = 1,$$

holds for all $n \in \mathbb{N}$?
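To make the statement concrete, here is a minimal numerical sketch of the matrix products involved, written with bare 2×2 matrices and no external libraries. The encoding $C_i = A$ for $w_i = 0$ and $C_i = B$ for $w_i = 1$ is my reading of the statement, and all helper names are mine.

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def frob_norm(X):
    """Frobenius norm, comparable to the operator norm in dimension 2."""
    return math.sqrt(sum(X[i][j] ** 2 for i in range(2) for j in range(2)))

def word_product(A, B, word):
    """Compute C_n ... C_1, with C_i = A if word[i] == 0 and C_i = B otherwise."""
    P = [[1.0, 0.0], [0.0, 1.0]]   # identity
    for w in word:                  # multiply on the left: C_n ... C_1
        P = mat_mul(A if w == 0 else B, P)
    return P

lam = 2.0
A = [[lam, 0.0], [0.0, 1.0 / lam]]   # a hyperbolic matrix in SL(2, R)

# A word consisting only of 0's produces A^n, whose norm grows like lam^n:
print(frob_norm(word_product(A, A, [0] * 10)))   # about lam**10 = 1024
```

The all-zero word already exhibits the exponential growth (2) asks for; the interesting case is what the occasional 1’s do to this growth.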
Before proceeding further, let me explain the italicized terms in the statement of the problem and what makes this question very intuitive and natural.
First of all, I recall the concept of hyperbolic matrix. The usual way to define hyperbolicity is the following: we say that an invertible matrix $A$ is hyperbolic if and only if none of its eigenvalues lies on the unit circle $S^1 \subset \mathbb{C}$. In other words, $A$ is hyperbolic whenever the modulus of each of its eigenvalues is different from 1. Of course, hyperbolic matrices are pleasant objects since they admit a simple description of their dynamics: we can decompose $\mathbb{R}^d$ into a direct sum $E \oplus F$ of two generalized eigenspaces ($E$ being the sum of the eigenspaces of eigenvalues with modulus less than 1 and $F$ being the sum of the eigenspaces of eigenvalues with modulus greater than 1) so that the action of $A$ restricted to these two subspaces is very simple – $A|_E$ is a contraction and $A|_F$ is an expansion.
It turns out that when dealing with $SL(2,\mathbb{R})$ matrices (i.e., $2 \times 2$ real matrices with determinant 1), we can define hyperbolicity in a shorter manner: $A \in SL(2,\mathbb{R})$ is hyperbolic if and only if $|\mathrm{tr}(A)| > 2$ (where $\mathrm{tr}(A)$ stands for the trace of $A$). It is an easy exercise in linear algebra, left to the reader, to check that these two concepts of “hyperbolicity” are equivalent for $SL(2,\mathbb{R})$ matrices.
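The reason the trace suffices is that, for determinant 1, the characteristic polynomial is $x^2 - \mathrm{tr}(A)\,x + 1$, so the eigenvalues are determined by the trace alone. The quick check below compares the two definitions on a few examples (the function names are mine):

```python
import cmath

def eigenvalues_det1(A):
    """Eigenvalues of a 2x2 matrix with det = 1: roots of x^2 - t*x + 1 = 0."""
    t = A[0][0] + A[1][1]
    d = cmath.sqrt(t * t - 4)   # purely imaginary when |t| < 2
    return ((t + d) / 2, (t - d) / 2)

def hyperbolic_via_eigenvalues(A):
    """No eigenvalue on the unit circle (up to rounding)."""
    return all(abs(abs(e) - 1) > 1e-12 for e in eigenvalues_det1(A))

def hyperbolic_via_trace(A):
    """|tr(A)| > 2."""
    return abs(A[0][0] + A[1][1]) > 2

examples = [
    [[2.0, 0.0], [0.0, 0.5]],    # trace 2.5: hyperbolic
    [[0.0, -1.0], [1.0, 0.0]],   # rotation by pi/2, trace 0: not hyperbolic
    [[1.0, 1.0], [0.0, 1.0]],    # shear, trace 2: not hyperbolic
]
for A in examples:
    print(hyperbolic_via_trace(A), hyperbolic_via_eigenvalues(A))
# prints: True True / False False / False False
```

When $|t| < 2$ the two roots are complex conjugates of product 1, hence of modulus 1; when $|t| = 2$ the only eigenvalue is $\pm 1$; only $|t| > 2$ gives a real pair $\lambda, \lambda^{-1}$ with $|\lambda| \neq 1$.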
Next, let me explain the meaning of “almost every matrix of $SL(2,\mathbb{R})$”. It is well-known that $SL(2,\mathbb{R})$ is a three-dimensional manifold. (I think that the best way to see this is via the polar decomposition theorem from linear algebra: it implies that any matrix $A \in SL(2,\mathbb{R})$ can be written as $A = R_a \cdot D_b \cdot R_c$, where $R_\theta$ denotes the rotation by angle $\theta$ and $D_b$ is the diagonal matrix with entries $e^b$ and $e^{-b}$. Hence, the three parameters $a, b, c$ serve as a local coordinate system.) In particular, we can speak about subsets of “zero measure” of this 3-manifold (by just looking at the zero Lebesgue measure sets in the local charts). A more advanced way to introduce the notion of zero measure subsets of $SL(2,\mathbb{R})$ uses its Haar measure, a natural measure generalizing the notion of Lebesgue measure (in the sense that this measure is “invariant by translations”) to the setting of locally compact groups (such as $SL(2,\mathbb{R})$ and $SO(2,\mathbb{R})$).
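As a sanity check of this three-parameter chart (with my assumed normalization $D_b = \mathrm{diag}(e^b, e^{-b})$, which may differ from the convention in the original post), one can verify that any choice of $(a, b, c)$ produces a matrix of determinant 1:

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rot(t):
    """Rotation by angle t."""
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def diag_exp(b):
    """Diagonal matrix with entries e^b and e^{-b} (determinant 1)."""
    return [[math.exp(b), 0.0], [0.0, math.exp(-b)]]

def det(X):
    return X[0][0] * X[1][1] - X[0][1] * X[1][0]

# chart: (a, b, c) -> R_a * D_b * R_c lands in SL(2, R)
a, b, c = 0.3, 1.2, -0.8
M = mat_mul(rot(a), mat_mul(diag_exp(b), rot(c)))
print(det(M))   # 1.0 up to rounding error
```

Both factors $R_\theta$ and $D_b$ have determinant 1, so their product does too; conversely the decomposition recovers the three coordinates locally.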
After these preliminaries, we are ready to discuss some reasons making the Bochi–Fayad question a very natural one. We begin by noticing that any hyperbolic matrix $A \in SL(2,\mathbb{R})$ verifies $\|A^n\| \geq \lambda^n$, where $\lambda > 1$ is the biggest eigenvalue of $A$. Hence, if our word possesses only 0’s, the product is simply $A^n$ and the estimate (2) holds. Now consider more general words satisfying the frequency condition (1) above. To simplify our explanation, let’s assume that $B = R_\theta$ is a rotation of angle $\theta$. In this situation, the problem asks about the growth rate of the norm of certain products of the matrices $A$ and $R_\theta$ for Lebesgue almost every $\theta$. Of course, we can’t hope to control the norm of a product of these two matrices, at least when the proportion of 0’s and 1’s is more or less equal (i.e., when the word is balanced). In fact, if $A = \mathrm{diag}(\lambda, \lambda^{-1})$ (with $\lambda > 1$) and $\theta = \pi/2$, then it is not hard to see that the condition (2) is not verified for the balanced word $0101\dots01$. Indeed, by direct computation, one sees immediately that $(R_{\pi/2} A)^2 = -\mathrm{Id}$, so that $\|(R_{\pi/2} A)^{2n}\| = 1$ for all $n$. Looking more closely at this simple example, we note that there are two effects playing a role here: while the hyperbolic matrix $A$ tends to expand the norm of almost all vectors (i.e., it expands every vector except those along its stable direction $e_2 = (0,1)$), the rotation $R_{\pi/2}$ sends the expanded vectors to the vectors contracted by $A$, so that any growth gained in a previous step is completely lost in the next turn of the multiplication. In particular, in the case of a balanced word, these two effects fight against each other, but neither of them wins at the end (and our products of matrices have bounded norm during this process).
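The balanced-word counterexample is easy to check numerically. This snippet (with my own tiny 2×2 helpers) verifies that $(R_{\pi/2} A)^2 = -\mathrm{Id}$ and that the norms of the powers of $R_{\pi/2} A$ stay bounded:

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def frob_norm(X):
    """Frobenius norm, comparable to the operator norm in dimension 2."""
    return math.sqrt(sum(X[i][j] ** 2 for i in range(2) for j in range(2)))

lam = 2.0
A = [[lam, 0.0], [0.0, 1.0 / lam]]   # diag(lam, 1/lam)
R = [[0.0, -1.0], [1.0, 0.0]]        # rotation by pi/2

RA = mat_mul(R, A)
print(mat_mul(RA, RA))               # [[-1.0, 0.0], [0.0, -1.0]], i.e. -Id

# norms of (R A)^n for n = 1..8: bounded, no exponential growth
P, norms = [[1.0, 0.0], [0.0, 1.0]], []
for _ in range(8):
    P = mat_mul(RA, P)
    norms.append(frob_norm(P))
print(max(norms))                    # sqrt(4.25), about 2.06
```

The norms just oscillate between $\sqrt{2}$ (even powers, where the product is $\pm\mathrm{Id}$) and $\sqrt{\lambda^2 + \lambda^{-2}}$ (odd powers), exactly the “expansion cancelled by rotation” phenomenon described above.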
On the other hand, if we impose a strong frequency condition, such as (1) with $\delta$ sufficiently small (perhaps depending on the size of the biggest eigenvalue $\lambda$ of $A$), we can hope to get the estimate (2) by the following intuitive argument: whenever we see a long string of 0’s in our word, we are multiplying the matrix $A$ by itself several times, so that we see a lot of expansion along the unstable vector of $A$ (i.e., the eigenvector associated to the biggest eigenvalue $\lambda$; for instance, this vector is $e_1 = (1,0)$ for the diagonal matrix considered in the previous paragraph). Of course, since we are eventually multiplying by rotations, there is an inevitable loss of norm at certain points of the word. However, since there are only a few rotations involved here, it is reasonable to expect an exponential growth of the norm of our products of matrices, except if we are sufficiently unlucky to meet a rotation which sends our best expanding vectors to the stable direction of $A$ (for instance, the stable direction of the diagonal matrix of the previous paragraph is $e_2 = (0,1)$). But the probability of meeting the “bad” rotations described above should be small (as the size of our words grows) because we have just a few rotations to deal with.
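A quick simulation illustrates this heuristic. The angle $\theta = 0.7$, the hyperbolic matrix, and the word with one rotation every 20 letters are arbitrary choices of mine, just to show one “generic-looking” low-frequency instance; they prove nothing, of course.

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def frob_norm(X):
    """Frobenius norm, comparable to the operator norm in dimension 2."""
    return math.sqrt(sum(X[i][j] ** 2 for i in range(2) for j in range(2)))

def rot(t):
    """Rotation by angle t."""
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

lam, theta, n = 2.0, 0.7, 200
A = [[lam, 0.0], [0.0, 1.0 / lam]]
R = rot(theta)

# low-frequency word: a 1 (rotation) every 20 letters, i.e. frequency 1/20
word = [1 if i % 20 == 0 else 0 for i in range(n)]

P = [[1.0, 0.0], [0.0, 1.0]]
for w in word:
    P = mat_mul(R if w == 1 else A, P)

growth_rate = math.log(frob_norm(P)) / n
print(growth_rate)   # positive, not far from (190/200) * log(lam) ~ 0.66
```

Each rotation costs at most a bounded factor here, while each block of 0’s multiplies the norm by roughly $\lambda^{\text{block length}}$, so the exponential growth survives – for this particular angle.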
Unfortunately, it doesn’t seem easy to formalize this naive argument. Nevertheless, we should point out that Fayad and Krikorian were able to partly solve this problem under a very restrictive frequency condition. By the way, I plan to discuss this recent theorem of Fayad and Krikorian later in this blog.
Let me close this post by saying that the conclusion “for almost every $B$” can’t be improved: Bochi and Fayad proved that the set of matrices $B$ such that (2) doesn’t hold is residual (of course, this result doesn’t answer the problem, because it is well-known that there are residual sets of zero measure).