Recurrence Relations

(A more technical post follows.)

By the way, in both sets of talks that I mentioned in the previous post, early on I started talking about orthogonal polynomials P_n(\lambda)=\lambda^n+\mbox{lower powers}, and how they generically satisfy a three-term recurrence relation (or recursion relation):

\lambda P_n(\lambda) = P_{n+1}(\lambda) +S_n P_n(\lambda) +R_n P_{n-1}(\lambda) \ .

Someone raised their hand and asked why it truncates to three terms, and on the spot I said that I’d forgotten the detailed proof (it has been many years since I thought about it) but recalled that it follows straightforwardly from orthogonality. Lack of time meant that I did not want to try to reconstruct the proof on the board in real time, but it is a standard thing that we all use a lot because it is true for all the commonly used families of polynomials in lots of physics, whether it be Hermite, Legendre, Laguerre, Chebyshev, etc. Anyway, I finally got around to reminding myself of it and thought I’d record it here. Now all I have to do in future is point people here as a handy place to look it up. (Although you can find equivalent discussions in several sources, for example this nice YouTube lecture here, which is part of a lovely series of lectures on polynomial approximation of functions, a fascinating topic in its own right.)
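As a quick concrete check before the proof, here is a short numerical sketch (assuming you have numpy to hand) using the monic probabilists’ Hermite polynomials He_n, which satisfy the three-term recurrence \lambda \mathrm{He}_n = \mathrm{He}_{n+1} + n\,\mathrm{He}_{n-1}, i.e. S_n=0 and R_n=n:

```python
import numpy as np
from numpy.polynomial import polynomial as P
from numpy.polynomial.hermite_e import herme2poly

# The monic (probabilists') Hermite polynomials He_n satisfy the
# three-term recurrence  lambda He_n = He_{n+1} + n He_{n-1},
# i.e. S_n = 0 and R_n = n for this family.
def He(n):
    """Standard-basis coefficients of He_n (lowest degree first)."""
    return herme2poly([0.0] * n + [1.0])

for n in range(1, 8):
    lhs = P.polymulx(He(n))                    # lambda * He_n
    rhs = P.polyadd(He(n + 1), n * He(n - 1))  # He_{n+1} + n He_{n-1}
    assert np.allclose(lhs, rhs)
print("three-term recurrence verified for He_1 .. He_7")
```

The comparison is done coefficient-by-coefficient in the standard polynomial basis, so it is an exact (up to floating point) check of the recurrence, not just a check at sample points.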

Ok, I should quickly remind you of the setup. The polynomials are normalised so that the nth one is P_n(\lambda)=\lambda^n+\mbox{lower powers} (they’re “monic”) and they are orthogonal with respect to the measure w(\lambda)d\lambda where w(\lambda) is called the “weight function” (it has some suitable properties we won’t worry about here). In the case of random matrix models we have w(\lambda) = \exp\{-N V(\lambda)\} for some potential V(\lambda) (here N is the size of the matrix; in this problem it is just a normalisation choice – you can just as well absorb it into the potential).

So we have the inner product:

\langle P_n, P_m\rangle\equiv \int w(\lambda) P_m(\lambda) P_n(\lambda) d\lambda = h_n\delta_{mn}\ ,

defining the orthogonality, where the h_n are some positive non-vanishing normalisation constants. Ok now we are ready for the proof.

Imagine there are terms in the recursion beyond the three terms. Let’s write these “remainder” terms as a linear combination of all lower polynomials up to degree n-2, so the recursion is tentatively:

\lambda P_n = P_{n+1} +S_n P_n +R_n P_{n-1} + \sum_{k=0}^{n-2} T_kP_k.

Taking the inner product \langle P_m, \lambda P_n\rangle for m=n-1, n or n+1 just recovers the definitions of the recursion coefficients as ratios of inner products for those m (the coefficient of P_{n+1} is fixed to unity because the polynomials are monic), and for any higher m you get zero, since \lambda P_n is then of too low a degree to have any overlap with P_m.

So S_n = \frac{\langle \lambda P_n,P_n\rangle}{\langle P_n,P_n\rangle} and R_n = \frac{\langle \lambda P_n,P_{n-1}\rangle}{\langle P_{n-1},P_{n-1}\rangle} .
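These ratios can be used constructively: starting from P_0=1 you generate the whole family, computing S_n and R_n at each step (the Stieltjes procedure). Here is a sketch for the weight w(\lambda)=e^{-\lambda^2/2}, for which the exact answers are known to be S_n=0 and R_n=n (the quadrature size is just an illustrative choice):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# Stieltjes procedure for the weight w(lambda) = exp(-lambda^2 / 2):
# at each step compute
#   S_n = <lambda P_n, P_n> / <P_n, P_n>
#   R_n = <lambda P_n, P_{n-1}> / <P_{n-1}, P_{n-1}>
# then step forward with  P_{n+1} = (lambda - S_n) P_n - R_n P_{n-1}.
# For this weight the exact values are S_n = 0 and R_n = n.
x, wts = hermegauss(60)      # quadrature nodes/weights (weight absorbed)

def inner(f, g):
    """Inner product <f, g> = int w(lambda) f g dlambda, by quadrature."""
    return np.sum(wts * f * g)

P_prev = np.zeros_like(x)    # P_{-1} = 0, evaluated on the grid
P_cur = np.ones_like(x)      # P_0 = 1
for n in range(6):
    Sn = inner(x * P_cur, P_cur) / inner(P_cur, P_cur)
    Rn = 0.0 if n == 0 else inner(x * P_cur, P_prev) / inner(P_prev, P_prev)
    print(f"n={n}: S_n = {Sn:+.2e}, R_n = {Rn:.6f}")
    P_prev, P_cur = P_cur, (x - Sn) * P_cur - Rn * P_prev
```

The printed S_n come out at machine-precision zero and the R_n come out equal to n, as they should for this (even) weight.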

Then you take the inner product \langle P_m, \lambda P_n\rangle for the cases m \leq n-2.

But this is also (by definition; I can let the \lambda act in the opposite direction inside the integral) \langle\lambda P_m, P_n\rangle, which vanishes since the degree of the first entry, m+1, is at most n-1, and so it can only contain polynomials of degree less than n, all of which are orthogonal to P_n. Therefore the inner product gives T_m \langle P_m,P_m\rangle=0 in all those cases, and since the h_m are non-vanishing this means that T_k=0 for k=0,1,\dots,n-2.
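You can watch orthogonality kill the would-be T_k numerically too. Here is a sketch for the (non-even) weight w(\lambda)=e^{-\lambda} on [0,\infty) (the Laguerre case), building the monic polynomials by Gram–Schmidt on a quadrature grid and checking that \langle P_k, \lambda P_n\rangle vanishes for all k\leq n-2:

```python
import numpy as np
from numpy.polynomial.laguerre import laggauss

# With the (non-even) weight w(lambda) = exp(-lambda) on [0, inf),
# build monic P_0 .. P_5 by Gram-Schmidt and check that
# <P_k, lambda P_n> = 0 for all k <= n - 2, so every T_k vanishes.
x, wts = laggauss(40)        # quadrature nodes/weights (weight absorbed)

def inner(f, g):
    return np.sum(wts * f * g)

polys = [np.ones_like(x)]    # P_0 = 1, evaluated on the grid
for n in range(1, 6):
    p = x * polys[-1]        # monic, degree n
    for q in polys:          # subtract projections onto P_0 .. P_{n-1}
        p = p - (inner(p, q) / inner(q, q)) * q
    polys.append(p)

n = 5
for k in range(n - 1):       # k = 0 .. n-2
    val = inner(polys[k], x * polys[n])
    scale = np.sqrt(inner(polys[k], polys[k])
                    * inner(x * polys[n], x * polys[n]))
    assert abs(val) < 1e-6 * scale   # zero up to quadrature roundoff
print("all would-be T_k vanish for n = 5")
```

The check is normalised by the sizes of the two entries, so "vanishes" means small relative to the natural scale of the inner product, which is the honest numerical statement of T_k=0.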

That’s it. All done. Except for the remark that given the expression for S_n above, when the weight function is even, the S_n vanish. (This is the case for even potentials in the case of random matrix models.)

Ok, one more useful thing: it is clear from the definition of the inner product integral that h_{n+1}=\langle P_{n+1},\lambda P_n\rangle. But you can also write this as h_{n+1}=\langle \lambda P_{n+1}, P_n\rangle and use the recursion relation \lambda P_{n+1} = P_{n+2}+S_{n+1}P_{n+1}+R_{n+1}P_n; all these terms vanish in the integral except the last, and so we get h_{n+1}= R_{n+1}\langle P_n,P_n\rangle = R_{n+1}h_n.

Hence we’ve derived an important relation: R_n=\frac{h_n}{h_{n-1}}\ .

(We essentially got this already in the earlier equation for R_n; just rearrange the action of \lambda up there again.)
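That relation is easy to check numerically as well. For the weight w(\lambda)=e^{-\lambda^2/2} the norms of the monic Hermite polynomials are h_n=n!\sqrt{2\pi}, so the ratio h_n/h_{n-1}=n matches the R_n=n of the Hermite recurrence; here is a sketch (again assuming numpy):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Check R_n = h_n / h_{n-1} for the monic Hermite polynomials He_n,
# whose norms under w = exp(-lambda^2 / 2) are h_n = n! * sqrt(2*pi),
# so the ratio should come out to exactly n, matching R_n = n.
x, wts = hermegauss(40)

def h(n):
    """Norm-squared h_n = <He_n, He_n>, computed by quadrature."""
    vals = hermeval(x, [0.0] * n + [1.0])
    return np.sum(wts * vals * vals)

for n in range(1, 7):
    assert np.isclose(h(n) / h(n - 1), n)   # R_n = h_n / h_{n-1} = n
print("R_n = h_n / h_{n-1} confirmed for n = 1 .. 6")
```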

–cvj
