[Update: Thanks to Andreas for pointing out that I may have been a little sloppy in stating the maximum modulus principle! The version below is an updated and correct one. Also, Andreas pointed out an excellent post on "amplification, arbitrage and the tensor power trick" (by Terry Tao) in which the "tricks" discussed are indeed very useful and far more powerful generalizations of the "method" of E. Landau discussed in this post. The Landau method mentioned here, it seems, is just one of the many examples of the "tensor power trick".]
The maximum modulus principle states that if $f: D \to \mathbb{C}$ (where $D \subseteq \mathbb{C}$ is open) is a holomorphic function, then on any compact set $K \subset D$, the modulus $|f|$ attains its maximal value on the boundary of $K$. (If $|f|$ attains its maximal value anywhere in the interior of $K$, then $f$ is constant. However, we will not bother about this part of the theorem in this post.)
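As a quick numerical sanity check of the inequality part of the principle (the function and domain below are my own illustrative choices, not from the post), one can sample $|e^z|$ on the closed unit disc and verify that the maximum occurs on the boundary:

```python
import numpy as np

# Assumed example (illustrative only): f(z) = exp(z) on the closed unit
# disc.  We check that max |f| over the disc occurs on the boundary r = 1.
r = np.linspace(0.0, 1.0, 200)            # radii, including the boundary
theta = np.linspace(0.0, 2.0 * np.pi, 400)
R, T = np.meshgrid(r, theta)
Z = R * np.exp(1j * T)                    # polar grid on the disc

modulus = np.abs(np.exp(Z))               # |f| sampled on the grid
interior_max = modulus[R < 1.0].max()
boundary_max = modulus[R == 1.0].max()

# For a non-constant holomorphic f, the interior maximum cannot exceed
# the boundary maximum.
print(interior_max <= boundary_max)       # expected: True
```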
Problems and Theorems in Analysis II, by Polya and Szego, provides a short proof of the “inequality part” of the principle. The proof by E. Landau employs Cauchy’s integral formula, and the technique is very interesting and useful indeed. The proof is as follows.
From Cauchy's integral formula, we have

$$f(z) = \frac{1}{2\pi i} \oint_{\partial K} \frac{f(\zeta)}{\zeta - z}\, d\zeta$$

for every $z$ in the interior of $K$.
Now, suppose $|f| \le M$ on $\partial K$. Then,

$$|f(z)| \le \frac{1}{2\pi} \oint_{\partial K} \frac{|f(\zeta)|}{|\zeta - z|}\, |d\zeta| \le CM,$$

where the constant $C$ depends only on the curve $\partial K$ and on the position of $z$, and is independent of the specific choice of $f$. Now, this rough estimate can be significantly improved by applying the same argument to $f^n$, where $n \in \mathbb{N}$ (note that $f^n$ is again holomorphic, with $|f^n| \le M^n$ on $\partial K$), to obtain

$$|f(z)|^n \le CM^n, \quad \text{or} \quad |f(z)| \le C^{1/n} M.$$
By allowing $n$ to go to infinity (so that $C^{1/n} \to 1$), we get $|f(z)| \le M$, which is what we set out to prove.
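To make the mechanism concrete, here is a small numerical sketch of Landau's trick (the specific choices below are mine, not from the proof: $K$ the closed unit disc, $z = 0.5$, $M = 1$), showing the rough constant $C$ and the improved bound $C^{1/n} M$ converging to $M$:

```python
import numpy as np

# Assumed setup (illustrative only): K = closed unit disc, evaluation
# point z = 0.5, boundary bound M = 1.
z = 0.5
M = 1.0

# Parametrize the boundary circle; on |zeta| = 1 we have |d zeta| = d theta.
theta = np.linspace(0.0, 2.0 * np.pi, 100_000, endpoint=False)
zeta = np.exp(1j * theta)

# C = (1/2pi) * integral of |d zeta| / |zeta - z|: the constant in the
# rough bound |f(z)| <= C * M.  Note C > 1, so this bound overshoots M.
C = np.mean(1.0 / np.abs(zeta - z))
print(f"C = {C:.6f}")

# Landau's trick: the same estimate applied to f^n gives
# |f(z)| <= C**(1/n) * M, and C**(1/n) -> 1 as n grows.
for n in (1, 10, 100, 1000):
    print(f"n = {n:5d}   bound = {C ** (1.0 / n) * M:.6f}")
```

The prefactor $C^{1/n}$ decays to $1$ quickly, so the crude constant $C$ is harmless in the limit.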
Polya/Szego mention that the proof shows that a rough estimate may sometimes be transformed into a sharper estimate by making appropriate use of the generality for which the original estimate is valid.
I will, perhaps, follow this up with a couple of problems and solutions to demonstrate the effectiveness of this useful technique.