The Harvard College Mathematics Review (HCMR) published its inaugural issue in April 2007, and the second issue came out almost a fortnight ago. The level of exposition in the articles is extremely high, and they are a pleasure to read even when much of the material may be beyond many readers. I would encourage everyone to visit their website and browse the issues. For problem-solvers, the problem section in each issue is a delight!

Anyway, the first issue’s problem section contained a somewhat hard inequality problem (proposed by Shrenik Shah), which I was able to solve and for which my solution was acknowledged in the problem section of the second issue. The HCMR carried Greg Price’s solution to that particular problem, and I must say his solution is somewhat more “natural” and “intuitive” than the one I gave.

Well, I want to present my solution here, but in a more detailed fashion. In particular, I want to develop the familiar AM-GM-HM inequality up to a point where the solution to the aforesaid problem turns out to be somewhat “trivial.” The buildup to the final solution itself is non-trivial, however. This post relies on the material presented in the classic book Problems and Theorems in Analysis I by George Pólya and Gábor Szegő. Again, I strongly recommend that all problem-solvers study this book. Anyway, we will now state the problem and discuss its solution. (I have slightly changed the formulation of the problem in order to avoid any possible copyright issues.)

Problem: For all distinct positive reals a and b, show that

\displaystyle \frac{a+b}{2} > \frac{a^{\frac{a}{a-b}}b^{\frac{b}{b-a}}}{e} > \frac{a-b}{\ln a - \ln b}.
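(Before getting to the proof, a quick numerical sanity check can be reassuring. The Python sketch below is purely illustrative and not part of the argument; the helper name three_means is just for exposition.)

import math

def three_means(a, b):
    # The three quantities appearing in the problem, for distinct positive a and b.
    am = (a + b) / 2                                      # arithmetic mean
    mid = a**(a / (a - b)) * b**(b / (b - a)) / math.e    # the middle expression
    lm = (a - b) / (math.log(a) - math.log(b))            # logarithmic mean
    return am, mid, lm

for a, b in [(1.0, 2.0), (0.5, 7.3), (3.0, 4.0)]:
    am, mid, lm = three_means(a, b)
    print(f"a={a}, b={b}: {am:.6f} > {mid:.6f} > {lm:.6f} -> {am > mid > lm}")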

First, let us discuss some facts.

1. AM-GM-HM inequality: If x_1, x_2, \ldots, x_n are n positive real numbers, then

\displaystyle (x_1 + x_2 + \ldots + x_n)/n \geq \sqrt[n]{x_1x_2\cdots x_n} \geq \frac{n}{1/x_1 + 1/x_2 + \ldots + 1/x_n},

with equality if and only if x_1 = x_2 = \ldots = x_n.

Proof: For a hint on proving the above using mathematical induction, read this. However, we will make use of Jensen’s inequality to prove the above result. We won’t prove Jensen’s inequality here, though it too can be proved using induction.

Jensen’s inequality: Let f : (a, b) \to \mathbb{R} be a continuous convex function. Let \lambda_1, \lambda_2, \ldots, \lambda_n be positive reals such that \lambda_1 + \lambda_2 + \ldots + \lambda_n = 1. Then for all x_1, x_2, \ldots, x_n \in (a,b), we have

\lambda_1f(x_1) + \lambda_2f(x_2) + \ldots + \lambda_nf(x_n) \geq f(\lambda_1x_1 + \lambda_2x_2 + \ldots + \lambda_nx_n),

with equality if and only if x_1 = x_2 = \ldots = x_n, provided f is strictly convex.
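(Here is a small numerical illustration of Jensen's inequality for the convex function f(x) = -\ln x, with randomly chosen points and weights. This Python sketch is an aside, not part of the argument.)

import math
import random

f = lambda x: -math.log(x)    # a convex function on (0, infinity)

random.seed(0)
for _ in range(5):
    xs = [random.uniform(0.1, 10.0) for _ in range(6)]
    ws = [random.uniform(0.1, 1.0) for _ in range(6)]
    lams = [w / sum(ws) for w in ws]                       # positive weights summing to 1
    lhs = sum(l * f(x) for l, x in zip(lams, xs))          # weighted average of f(x_i)
    rhs = f(sum(l * x for l, x in zip(lams, xs)))          # f of the weighted average
    print(f"{lhs:.6f} >= {rhs:.6f} -> {lhs >= rhs}")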

Now, in order to prove (1), consider the function f : (0,\infty) \to \mathbb{R} defined by f(x) = -\ln x. Indeed, f is continuous on the stated domain. Also, f''(x) = 1/x^2 > 0, which implies f is strictly convex on (0, \infty). Therefore, using Jensen’s inequality and setting \lambda_1 = \lambda_2 = \ldots = \lambda_n = 1/n, for positive reals x_1, x_2, \ldots, x_n, we have

\displaystyle -\frac1{n} \left( \ln x_1 + \ln x_2 + \ldots + \ln x_n \right) \geq -\ln \left( \frac{x_1 + x_2 + \ldots + x_n}{n} \right)

\displaystyle \Rightarrow \frac{x_1 + x_2 + \ldots + x_n}{n} \geq \sqrt[n]{x_1x_2\cdots x_n} \quad (since \ln x is monotonically increasing on (0, \infty)).

This proves the first part of the inequality in (1). Now, replace each x_i with 1/x_i for 1 \leq i \leq n: the AM-GM part then reads \displaystyle \frac{1/x_1 + 1/x_2 + \ldots + 1/x_n}{n} \geq \frac1{\sqrt[n]{x_1x_2\cdots x_n}}, and taking reciprocals of both sides yields the second part. And, this proves the AM-GM-HM inequality.
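(The reciprocal trick above is easy to check numerically; the short Python sketch below is only an illustration of that step.)

import math
import random

random.seed(2)
xs = [random.uniform(0.2, 8.0) for _ in range(7)]
n = len(xs)

# AM >= GM applied to the reciprocals 1/x_1, ..., 1/x_n ...
print(sum(1 / x for x in xs) / n >= math.prod(1 / x for x in xs) ** (1 / n))

# ... which, after taking reciprocals, is exactly GM >= HM for x_1, ..., x_n.
print(math.prod(xs) ** (1 / n) >= n / sum(1 / x for x in xs))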

We can generalize this further. Indeed, for any positive reals p_1, p_2, \ldots, p_n and positive reals x_1, x_2, \ldots, x_n, replacing each \lambda_i with p_i/(p_1 + p_2 + \ldots + p_n) for 1 \leq i \leq n, and using Jensen’s inequality for f(x) = -\ln x once again, we obtain the following generalized AM-GM-HM inequality, which we shall call by a different name:

2. Generalized Cauchy Inequality (non-integral version): For any positive reals p_1, p_2, \ldots, p_n and positive reals x_1, x_2, \ldots, x_n, we have

\displaystyle \frac{p_1x_1 + \ldots + p_nx_n}{p_1 + \ldots + p_n} \geq  \left(  x_1^{p_1}\cdots x_n^{p_n}\right)^{1/(p_1 + \ldots + p_n)} \geq \frac{p_1 + \ldots + p_n}{p_1/x_1 + \ldots + p_n/x_n}.
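(Again, a quick numerical check, purely illustrative. The function name weighted_means is just for exposition; computing the weighted geometric mean via logarithms is a convenient equivalent of the product form above.)

import math
import random

def weighted_means(ps, xs):
    # Weighted arithmetic, geometric, and harmonic means for positive weights ps and positive xs.
    P = sum(ps)
    am = sum(p * x for p, x in zip(ps, xs)) / P
    gm = math.exp(sum(p * math.log(x) for p, x in zip(ps, xs)) / P)
    hm = P / sum(p / x for p, x in zip(ps, xs))
    return am, gm, hm

random.seed(1)
ps = [random.uniform(0.5, 3.0) for _ in range(5)]
xs = [random.uniform(0.1, 9.0) for _ in range(5)]
am, gm, hm = weighted_means(ps, xs)
print(am >= gm >= hm, am, gm, hm)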

Now, the remarkable thing is that we can generalize the above inequality even further to obtain the following integral version.

3. Generalized Cauchy Inequality (integral version): Let f(x) and p(x) be continuous and positive functions on the interval [a, b]; also suppose f(x) is not a constant function. Then we have

\displaystyle \frac{\int_{a}^{b} p(x)f(x)\, dx}{\int_{a}^{b} p(x)\, dx} > \exp\left({\frac{\int_{a}^{b} p(x)\ln f(x)\, dx}{\int_{a}^{b} p(x)\, dx}}\right) > \frac{\int_{a}^{b} p(x)\, dx}{\int_{a}^{b} \frac{p(x)}{f(x)}\, dx},

where \exp(x) = e^x.
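(To see the integral version in action, here is a rough numerical check using a simple midpoint rule; the particular choices p(x) = 1 + x and f(x) = 2 + \sin x on [1, 4] are arbitrary, picked only for illustration.)

import math

def integrate(g, a, b, n=100_000):
    # Composite midpoint rule; crude, but plenty accurate for a sanity check.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 1.0, 4.0
p = lambda x: 1 + x              # positive weight function
f = lambda x: 2 + math.sin(x)    # positive, non-constant function

P = integrate(p, a, b)
am = integrate(lambda x: p(x) * f(x), a, b) / P
gm = math.exp(integrate(lambda x: p(x) * math.log(f(x)), a, b) / P)
hm = P / integrate(lambda x: p(x) / f(x), a, b)
print(am > gm > hm, am, gm, hm)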

Okay, now we are finally ready to solve our original problem. First, without any loss of generality, we can assume a < b. Now, we shall use the above version of the Generalized Cauchy Inequality, and so we set p(x) = 1 and f(x) = x. Here f and p are both positive functions on the interval [a, b]. Also, note that f is not a constant function.

Thus, we have \displaystyle \frac{\int_a^b 1\, dx}{\int_a^b \frac1{x} \, dx} = \frac{a-b}{\ln a - \ln b} \quad \ldots (*).

Also, using \int \ln x \, dx = x\ln x - x, we have \exp \left( \frac{\int_a^b \ln x \, dx}{\int_a^b 1 \, dx} \right) = \exp \left(  \frac{b\ln b - a\ln a - (b-a)}{b-a}\right) = \frac{a^{\frac{a}{a-b}} b^{\frac{b}{b-a}}}{e} \quad \ldots (**).

And, \displaystyle \frac{\int_a^b x\, dx}{\int_a^b 1 \, dx} = \frac{(b^2 - a^2)/2}{b-a} = \frac{a+b}{2} \quad \ldots (***).

Combining (*), (**) and (***) with the integral version of the Generalized Cauchy Inequality stated in (3), we immediately obtain the desired inequality. And, we are done.
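(As a final sanity check, one can compare each of the closed forms in (*), (**) and (***) against direct numerical integration. The sample values a = 2, b = 5 and the helper below are for illustration only.)

import math

def integrate(g, a, b, n=100_000):
    # Composite midpoint rule, accurate enough for this comparison.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 2.0, 5.0

# (*): ratio of integrals vs. the logarithmic mean
print(integrate(lambda x: 1.0, a, b) / integrate(lambda x: 1 / x, a, b),
      (a - b) / (math.log(a) - math.log(b)))

# (**): exponential of the average of ln x vs. the closed form
print(math.exp(integrate(math.log, a, b) / (b - a)),
      a**(a / (a - b)) * b**(b / (b - a)) / math.e)

# (***): average of x over [a, b] vs. (a + b)/2
print(integrate(lambda x: x, a, b) / (b - a), (a + b) / 2)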
