Posted by: matheuscmss | April 19, 2008

## What’s the size of certain unbalanced products of SL(2,R) matrices?

Today I would like to discuss an open “toy problem” proposed by Jairo Bochi and Bassam Fayad about the Lyapunov exponents of certain products of $SL(2,\mathbb{R})$ matrices. However, let me point out to everyone unfamiliar with Lyapunov exponents and cocycles that I will avoid any technical definitions. In particular, I hope this post will be accessible to a broad audience.

Basically, the “toy problem” can be stated as follows:

Problem (Bochi and Fayad, 2006). Let $A_0\in SL(2,\mathbb{R})$ be an arbitrary hyperbolic matrix. Is it true that there are constants $0<\theta<1$ and $\gamma>1$ such that, for almost every matrix $A_1\in SL(2,\mathbb{R})$ and every word $w\in\{0,1\}^{\mathbb{N}}$ satisfying the frequency condition:

(1) $\#\{j\in\{1,\dots,N\}: w_j=1\}\leq \theta N,$

we have

(2) $\|A_{w_N}\dots A_{w_1}\|>\gamma^N$

for all $N\in\mathbb{N}$?

Before proceeding further, let me explain the italic terms in the statement of the problem and what makes this question very intuitive and natural.

First of all, I recall the concept of a hyperbolic matrix. The usual definition of hyperbolicity is the following: we say that an invertible $n\times n$ matrix $A$ is hyperbolic if and only if none of its eigenvalues lies on the unit circle $S^1$. In other words, $A$ is hyperbolic whenever all of its eigenvalues have modulus different from 1. Of course, hyperbolic $n\times n$ matrices are pleasant objects since they admit a simple description of their dynamics: we can decompose $\mathbb{R}^n = E\oplus F$ into the sum of two generalized eigenspaces ($E$ being the sum of the eigenspaces of eigenvalues with modulus less than 1 and $F$ being the sum of the eigenspaces of eigenvalues with modulus greater than 1) so that the action of $A$ restricted to these two subspaces is very simple: $A|_E$ is a contraction and $A|_F$ is an expansion.

It turns out that when dealing with $SL(2,\mathbb{R})$ matrices (i.e., $2\times 2$ real matrices with determinant 1), we can define hyperbolicity in a shorter manner: $A\in SL(2,\mathbb{R})$ is hyperbolic if and only if $|tr A|>2$ (where $tr A$ stands for the trace of $A$). It is an easy linear algebra exercise, left to the reader, to check that these two notions of “hyperbolicity” for $SL(2,\mathbb{R})$ matrices are equivalent.
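For concreteness, here is a small numerical sanity check that the two criteria agree on a few sample matrices (a sketch in Python; the test matrices are illustrative choices):

```python
import numpy as np

def is_hyperbolic_trace(A):
    """Trace criterion for A in SL(2,R): hyperbolic iff |tr A| > 2."""
    return abs(np.trace(A)) > 2

def is_hyperbolic_spectrum(A):
    """General criterion: no eigenvalue on the unit circle."""
    return all(abs(abs(ev) - 1) > 1e-12 for ev in np.linalg.eigvals(A))

# A few determinant-1 test matrices (the diagonal one reappears later in
# the post; the others are illustrative):
hyperbolic = np.diag([3.0, 1 / 3.0])         # eigenvalues 3 and 1/3
rotation = np.array([[0.0, -1.0],            # R_{pi/2}: eigenvalues +-i on S^1
                     [1.0, 0.0]])
parabolic = np.array([[1.0, 1.0],            # |tr| = 2: not hyperbolic
                      [0.0, 1.0]])

for A in (hyperbolic, rotation, parabolic):
    assert np.isclose(np.linalg.det(A), 1.0)
    assert is_hyperbolic_trace(A) == is_hyperbolic_spectrum(A)
```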

Next, let me explain the meaning of “almost every matrix of $SL(2,\mathbb{R})$“. It is well-known that $SL(2,\mathbb{R})$ is a three-dimensional manifold. I think that the best way to see this is the polar decomposition theorem from linear algebra: it implies that any matrix $A\in SL(2,\mathbb{R})$ can be written as $A = R_{b} H_c R_{a}$, where $R_{\mu}$ denotes the rotation by angle $\mu$ and $H_c = diag (c, 1/c)$ is the diagonal matrix with entries $c, c^{-1}$ and $c\geq 1$; hence the three parameters $a, b, c$ serve as a local coordinate system. In particular, we can speak about subsets of “zero measure” on this 3-manifold (by just looking at the zero Lebesgue measure sets in the local charts). A more advanced way to introduce the notion of zero measure subsets of $SL(2,\mathbb{R})$ uses its Haar measure, a natural measure generalizing the Lebesgue measure (in the sense that it is “invariant by translations”) to the setting of locally compact groups (such as $\mathbb{R}$ and $SL(2,\mathbb{R})$).
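The decomposition $A = R_b H_c R_a$ can be computed numerically via the singular value decomposition; the sketch below (in Python, with sign conventions chosen for illustration) recovers the three factors:

```python
import numpy as np

def polar_like_decomposition(A):
    """Write A in SL(2,R) as R_b @ H_c @ R_a with rotations R_b, R_a and
    H_c = diag(c, 1/c), c >= 1, via the SVD.  A sketch: the sign fix below
    is one possible convention, chosen so both orthogonal factors become
    honest rotations."""
    U, s, Vt = np.linalg.svd(A)
    if np.linalg.det(U) < 0:
        # Since det A = 1, det U = det Vt; flipping one column of U and one
        # row of Vt leaves U @ diag(s) @ Vt unchanged but makes det = +1.
        D = np.diag([1.0, -1.0])
        U, Vt = U @ D, D @ Vt
    return U, np.diag(s), Vt

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])                    # an illustrative det-1 matrix
R_b, H_c, R_a = polar_like_decomposition(A)
assert np.allclose(R_b @ H_c @ R_a, A)
assert np.isclose(np.linalg.det(R_b), 1.0)
assert np.isclose(np.linalg.det(R_a), 1.0)
# Singular values of a det-1 matrix come in a pair (c, 1/c) with c >= 1:
assert H_c[0, 0] >= 1.0 and np.isclose(H_c[0, 0] * H_c[1, 1], 1.0)
```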

After these preliminaries, we are ready to discuss some reasons making Bochi and Fayad’s question a very natural one. We begin by noticing that any hyperbolic matrix $A_0\in SL(2,\mathbb{R})$ verifies $\|A_0^N\|\geq \lambda^N$, where $\lambda>|tr A_0|/2>1$ is the biggest eigenvalue of $A_0$ (the norm of a matrix is at least its spectral radius). Hence, if our word $w\in\{0,1\}^{\mathbb{N}}$ possesses only 0’s, the estimate (2) holds for any $1<\gamma<\lambda$. Now consider more general words satisfying the frequency condition (1) above. To simplify our explanation, let’s assume that $A_1\in SO(2,\mathbb{R})$ is a rotation $R_{\mu}$ of angle $\mu$. In this situation, the problem asks about the growth rate of the norm of certain products of the matrices $A_0$ and $R_{\mu}$ for Lebesgue almost every $\mu$. Of course, we can’t hope to control the norm of a product of these two matrices when the proportion of 0’s and 1’s is more or less equal (i.e., when the word is balanced). In fact, if $A_0 = diag (3, 1/3)$ and $A_1=R_{\pi/2}$, then it is not hard to see that condition (2) is not verified for the word $w = (1010101010\dots)$. Indeed, a direct computation shows that $\|A_{w_N}\dots A_{w_1}\|\leq 3$ for all $N\in\mathbb{N}$. Looking more closely at this simple example, we note that two effects play a role here: while the hyperbolic matrix $A_0$ tends to expand the norm of most vectors (it expands every vector outside a small cone around its stable direction $e_2 = (0,1)$), the rotation $R_{\pi/2}$ sends the vectors expanded by $A_0$ to the vectors contracted by it, so that any growth gained in a previous step is completely lost in the next turn of the multiplication. In particular, in the case of a balanced word, these two effects fight against each other, but neither wins in the end (and our products of matrices keep bounded norm during this process).
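The bounded-norm phenomenon for the balanced word is easy to verify numerically (a quick Python check of the example above):

```python
import numpy as np

A0 = np.diag([3.0, 1 / 3.0])                 # hyperbolic, eigenvalues 3, 1/3
A1 = np.array([[0.0, -1.0],                  # the rotation R_{pi/2}
               [1.0, 0.0]])

# Balanced word w = (1, 0, 1, 0, ...): build the products A_{w_N} ... A_{w_1}
# by left-multiplying, as in the post's convention.
P = np.eye(2)
norms = []
for j in range(1, 41):
    P = (A1 if j % 2 == 1 else A0) @ P
    norms.append(np.linalg.norm(P, 2))       # spectral norm

# The norms stay bounded by 3 for every N, so (2) fails for this word.
assert max(norms) <= 3.0 + 1e-9
```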

On the other hand, if we impose a strong frequency condition, such as (1) with $\theta\ll 1$ sufficiently small (perhaps depending on the size of the biggest eigenvalue $\lambda$ of $A_0$), we can hope to get the estimate (2) by the following intuitive argument: whenever we see a long string of 0’s in our word, we are multiplying the matrix $A_0$ by itself several times, so that we see a lot of expansion along the unstable vector of $A_0$ (i.e., the eigenvector associated to the biggest eigenvalue $\lambda$; for instance, this vector is $e_1=(1,0)$ for the diagonal matrix considered in the previous paragraph). Of course, since we eventually multiply by rotations, there is an inevitable loss of norm at certain points of the word. However, since there are only a few rotations involved, it is reasonable to expect an exponential growth of the norm of our products of matrices, unless we are unlucky enough to hit a rotation which sends our best expanding vectors to the stable direction of $A_0$ (for instance, the stable direction of the diagonal matrix of the previous paragraph is $e_2=(0,1)$). But the probability of these “bad” rotations should be small (as the size of our words grows) because we have only a few rotations to deal with.
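This heuristic can be explored numerically. The sketch below takes the diagonal matrix from the previous paragraph, a rotation by an arbitrarily chosen angle ($\mu = 0.7$ is an assumption of this illustration), and a word with one 1 every ten symbols, so that condition (1) holds with $\theta = 0.1$. The norm is observed to grow exponentially; of course, this is mere numerical exploration, not a proof:

```python
import numpy as np

A0 = np.diag([3.0, 1 / 3.0])                 # expansion rate lambda = 3
mu = 0.7                                     # an arbitrary angle (assumption)
A1 = np.array([[np.cos(mu), -np.sin(mu)],
               [np.sin(mu), np.cos(mu)]])    # the rotation R_mu

# Word with one '1' every 10 symbols: frequency condition (1), theta = 0.1.
N = 100
word = [1 if j % 10 == 0 else 0 for j in range(1, N + 1)]

P = np.eye(2)
for w in word:
    P = (A1 if w == 1 else A0) @ P           # product A_{w_N} ... A_{w_1}

# Each rotation costs at most a bounded factor, while each run of nine A0's
# multiplies the unstable component by 3^9, so the norm grows exponentially.
assert np.linalg.norm(P, 2) > 1.5 ** N
```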

Unfortunately, it doesn’t seem easy to formalize this naive argument. Nevertheless, we should point out that Fayad and Krikorian were able to partly solve this problem under a very restrictive frequency condition. I plan to discuss this recent theorem of Fayad and Krikorian later in this blog.

Let me close this post by saying that the conclusion “for almost every $A_1\in SL(2,\mathbb{R})$” can’t be improved: Bochi and Fayad proved that the set of matrices $A_1\in SL(2,\mathbb{R})$ such that (2) doesn’t hold is residual (of course, this result doesn’t answer the problem, because it is well-known that there are residual sets of zero measure).

## Responses

1. Hi Matheus! Nice blog, I didn’t know about it.

I have two comments that I think will complement the discussion:
(1) It’s not even known whether there is a single pair of matrices, $A_0$ hyperbolic and $A_1$ elliptic, such that the conclusion holds.

(2) The proof that the conclusion does not hold for generic pairs $(A_0, A_1)$ (with $A_0$ hyperbolic, $A_1$ elliptic) is very easy. First, one checks that the set in question is a $G_\delta$. Second, let us see that it is dense: for each $A_0$, every $A_1$ can be perturbed so that some power $A_1^k$ sends the expanding eigendirection of $A_0$ to the contracting one. Then one considers words $A_0^n A_1^k A_0^n$ with $k$ as above and $n$ much bigger. These have small norm despite the small proportion of $A_1$’s.
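The mechanism described above can be checked numerically in the special case where the elliptic matrix is the post’s $R_{\pi/2}$, whose single application ($k = 1$) exactly swaps the eigendirections of the hyperbolic matrix $diag(3, 1/3)$ (a quick Python illustration):

```python
import numpy as np

A0 = np.diag([3.0, 1 / 3.0])                 # expanding e1, contracting e2
A1 = np.array([[0.0, -1.0],                  # R_{pi/2} sends e1 to e2 (k = 1)
               [1.0, 0.0]])

n = 10
P = np.linalg.matrix_power(A0, n) @ A1 @ np.linalg.matrix_power(A0, n)

# The word has length 2n + 1 with a single '1' (proportion ~ 1/(2n)), yet the
# product collapses back to a rotation: all the expansion is cancelled.
assert np.isclose(np.linalg.norm(P, 2), 1.0)
```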

Bassam and I worked hard on this problem for two months, before I gave up and started to ask smarter people… Good luck!

Um abraco,
Jairo
