# Real Analysis Final Exam Review

This is an interactive website to help you prepare for the cumulative portion of the final exam. You should be able to write the kinds of proofs presented below flawlessly. Pay careful attention to the templates, and be sure to also study the homework from the last third of the course. The final exam will be longer than previous exams. Before you click on the dotted boxes, try to write the proofs yourself.

### Prove the Fundamental Theorem of Calculus.

#### The First Form

We prove the theorem in the special case $$E := \{a, b\}$$, as the general case follows from this result and the Additivity Theorem.

Let $$\varepsilon > 0$$ be given. Because $$f \in R[a, b]$$, there exists $$\delta > 0$$ such that if $$\dot{P}$$ is any tagged partition with $$\left|\left| \dot{P} \right|\right| < \delta$$, then $$\left|S(f; \dot{P}) - \int_a^b f\right| < \varepsilon$$ . Fix such a $$\dot{P}$$ and let its subintervals be $$[x_{i-1}, x_i]$$, $$1 \leq i \leq n$$. The Mean Value Theorem applied to $$F$$ on each $$[x_{i-1}, x_i]$$ gives a point $$u_i \in (x_{i-1}, x_i)$$ such that $$F(x_i) - F(x_{i-1}) = F'(u_i)\cdot(x_i-x_{i-1}) = f(u_i)\cdot(x_i-x_{i-1})$$ , $$1 \leq i \leq n$$. Now let's use these $$u_i$$ as tags on the same subintervals to obtain the tagged partition $$\dot{P}_u := \{([x_{i-1}, x_i], u_i)\}^n_{i=1}$$ , which satisfies $$\left|\left|\dot{P}_u\right|\right| = \left|\left|\dot{P}\right|\right| < \delta$$ because retagging does not change the subintervals, and we compute $$S(f;\dot{P}_u)$$. If we add the terms and evaluate the telescoping sum, we find $$S(f;\dot{P}_u) =$$ $$\sum\limits_{i=1}^n f(u_i)\cdot(x_i - x_{i-1}) =$$ $$\sum\limits_{i=1}^n (F(x_i) - F(x_{i-1})) =$$ $$F(b) - F(a)$$ . Finally, we substitute $$F(b) - F(a) = S(f;\dot{P}_u)$$ into $$\left|S(f; \dot{P}_u) - \int_a^b f\right| < \varepsilon$$ to obtain $$\left|F(b)-F(a) - \int_a^b f\right| < \varepsilon$$ , and because $$\varepsilon > 0$$ was arbitrary, we conclude $$\int_a^b f = F(b) - F(a)$$. $$\Box$$
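The telescoping step can be sanity-checked numerically. Here is a minimal sketch (an illustration, not part of the proof; the helper name `riemann_sum_with_mvt_tags` is invented here), taking $$F(x) := x^2$$ so that $$f(x) = 2x$$ and the MVT tag on each subinterval happens to be its midpoint:

```python
# Illustration of the telescoping step with F(x) = x^2, f(x) = 2x on [0, 2].
# For this F, the MVT tag on [x0, x1] is the midpoint u = (x0 + x1)/2, since
# F(x1) - F(x0) = (x1 + x0)(x1 - x0) = f(u) * (x1 - x0).

def riemann_sum_with_mvt_tags(partition):
    """S(f; P_u) where each tag u_i is chosen via the MVT applied to F(x) = x^2."""
    total = 0.0
    for x0, x1 in zip(partition, partition[1:]):
        u = (x0 + x1) / 2           # MVT tag for F(x) = x^2
        total += 2 * u * (x1 - x0)  # f(u_i) * (x_i - x_{i-1})
    return total

P = [0.0, 0.3, 1.1, 1.6, 2.0]        # an arbitrary partition of [0, 2]
print(riemann_sum_with_mvt_tags(P))  # telescopes to F(2) - F(0) = 4 (up to rounding)
```

No matter how the partition is chosen, the sum collapses to $$F(b) - F(a)$$, which is the heart of the proof.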

### Given Definition 2.1.5, prove that a reasonable sequence converges to a given limit.

#### Prove that if $$x_n := \tfrac{1}{n}$$, then $$(x_n) \rightarrow 0$$.

Given $$\varepsilon > 0$$, let $$K \in \mathbb{N}$$ satisfy $$K > \tfrac{1}{\varepsilon}$$ (such a $$K$$ exists by the Archimedean Property). Then if $$n \geq K$$, we have $$\left|\tfrac{1}{n}-0\right| =$$ $$\tfrac{1}{n} \leq$$ $$\tfrac{1}{K} <$$ $$\tfrac{1}{\tfrac{1}{\varepsilon}} =$$ $$\varepsilon$$. Thus, we find that $$(\tfrac{1}{n}) \rightarrow 0$$ by Definition 2.1.5. $$\Box$$
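Before writing a proof like this, you can sanity-check your candidate $$K$$ numerically. A quick sketch (the helper `works` is a name invented here; a finite sample is evidence, not a proof):

```python
import math

def works(eps, n_max=10_000):
    """Check (on a finite sample) that an integer K > 1/eps forces |1/n - 0| < eps for n >= K."""
    K = math.floor(1 / eps) + 1  # an integer K > 1/eps
    return all(abs(1 / n - 0) < eps for n in range(K, n_max))

print(all(works(eps) for eps in [0.5, 0.1, 0.01]))  # True
```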

#### Prove that if $$y_n := \tfrac{n^2}{3n^2 + 2}$$, then $$(y_n) \rightarrow \tfrac{1}{3}$$.

Given $$\varepsilon > 0$$, we may assume $$\varepsilon < \tfrac{1}{3}$$ (proving the claim for small $$\varepsilon$$ proves it for every larger $$\varepsilon$$ as well), and let $$K := \sqrt{\tfrac{2}{9\varepsilon}-\tfrac{2}{3}}$$, which is real because $$\varepsilon < \tfrac{1}{3}$$. Then if $$n>K$$, we have $$\left|\tfrac{n^2}{3n^2+2}-\tfrac{1}{3}\right| =$$ $$\left|\tfrac{-2}{9n^2+6}\right| =$$ $$\tfrac{2}{9n^2+6} <$$ $$\tfrac{2}{9K^2+6} =$$ $$\varepsilon$$. Thus, we learn that $$(\tfrac{n^2}{3n^2+2}) \rightarrow \tfrac{1}{3}$$ by Definition 2.1.5. $$\Box$$
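The same kind of numerical sketch can check this more elaborate choice of $$K$$ (again, `works` is an invented helper, and a finite sample proves nothing):

```python
import math

def works(eps, samples=10_000):
    """Check (finitely) that n > K = sqrt(2/(9 eps) - 2/3) gives |n^2/(3n^2+2) - 1/3| < eps."""
    K = math.sqrt(2 / (9 * eps) - 2 / 3)  # real because eps < 1/3
    n0 = math.floor(K) + 1                # smallest integer n > K
    return all(abs(n**2 / (3 * n**2 + 2) - 1 / 3) < eps
               for n in range(n0, n0 + samples))

print(all(works(eps) for eps in [0.1, 0.01, 0.001]))  # True
```

Working backwards from $$\tfrac{2}{9K^2+6} = \varepsilon$$ is exactly how the formula for $$K$$ was found in the first place.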

### Given Definition 2.6.3, prove that a reasonable sequence is Cauchy using the definition.

#### Prove that $$x_n := \tfrac{3}{n}$$ is Cauchy.

Given $$\varepsilon > 0$$, let $$H := \tfrac{3}{\varepsilon}$$. Then if $$m > n \geq H$$, we have $$\left|x_m - x_n\right| =$$ $$\left|\tfrac{3}{m} - \tfrac{3}{n}\right| =$$ $$3\left|\tfrac{n-m}{mn}\right| =$$ $$3(\tfrac{m-n}{mn}) <$$ $$3(\tfrac{m}{mn}) =$$ $$\tfrac{3}{n} \leq$$ $$\tfrac{3}{H} =$$ $$\varepsilon$$. Thus, $$(x_n)$$ is Cauchy by Definition 2.6.3. $$\Box$$
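A Cauchy estimate involves two indices, so a numerical sanity check should sample pairs $$m > n \geq H$$ (the helper `works` is a name invented here, and sampling is only an illustration):

```python
import math

def works(eps, samples=200):
    """Check (finitely) that m > n >= H = 3/eps gives |3/m - 3/n| < eps."""
    H = 3 / eps
    n0 = math.ceil(H)
    ns = range(n0, n0 + samples)
    return all(abs(3 / m - 3 / n) < eps for n in ns for m in ns if m > n)

print(works(0.1) and works(0.01))  # True
```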

#### Prove that $$y_n := \tfrac{1}{n+1}$$ is Cauchy.

Given $$\varepsilon > 0$$, let $$H := \tfrac{1}{\varepsilon}$$. Then if $$m > n \geq H$$, we have $$\left|y_m - y_n\right| =$$ $$\left|\tfrac{1}{m+1} - \tfrac{1}{n+1}\right| =$$ $$\tfrac{(m+1)-(n+1)}{(n+1)(m+1)} =$$ $$\tfrac{m-n}{(n+1)(m+1)} <$$ $$\tfrac{m}{(n+1)(m+1)} <$$ $$\tfrac{1}{n+1} <$$ $$\tfrac{1}{n} \leq$$ $$\tfrac{1}{H} =$$ $$\varepsilon$$. Thus, $$(y_n)$$ is Cauchy by Definition 2.6.3. $$\Box$$

### Given Definition 3.1.4, prove that a reasonable function has a given limit.

#### Prove that $$\lim\limits_{x \to 1} (x^2 + 3x) = 4$$.

Given $$\varepsilon > 0$$, let $$\delta := \min\{1, \tfrac{\varepsilon}{6}\}$$. Then if $$0 < \left|x-1\right| < \delta \leq 1$$, the Triangle Inequality gives $$\left|x\right| \leq \left|x-1\right| + 1 < 2$$, so $$\left|x+4\right| \leq$$ $$\left|x\right| + 4 <$$ $$2 + 4 = 6$$. Thus, we find $$\left|(x^2 + 3x) - 4\right| =$$ $$\left|x-1\right|\left|x+4\right| <$$ $$\delta \cdot 6 \leq$$ $$\varepsilon$$. Therefore, we obtain $$\lim\limits_{x \to 1} (x^2 + 3x) = 4$$ by Definition 3.1.4. $$\Box$$
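You can spot-check an $$\varepsilon$$–$$\delta$$ choice on a grid of points near the limit point (the helper `works` is invented here; the grid is an illustration, not a proof):

```python
def works(eps, steps=1000):
    """Check (on a grid) that 0 < |x - 1| < delta = min(1, eps/6) gives |(x^2+3x) - 4| < eps."""
    delta = min(1.0, eps / 6)
    for k in range(1, steps):
        t = delta * k / steps              # 0 < t < delta
        for x in (1 - t, 1 + t):           # points on both sides of 1
            if not abs((x**2 + 3 * x) - 4) < eps:
                return False
    return True

print(works(0.5) and works(0.01))  # True
```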

### Given Definition 4.1.1, prove that a reasonable function is continuous at a point.

#### Prove that $$f(x) := 2x+5$$ is continuous at $$x=3$$.

Given $$\varepsilon > 0$$, let $$\delta := \tfrac{\varepsilon}{2}$$. Then if $$\left|x-3\right| < \delta$$, we have $$\left|f(x)-f(3)\right| =$$ $$\left|(2x+5)-11\right| =$$ $$\left|2x-6\right| =$$ $$2\left|x-3\right| <$$ $$2 \cdot \delta =$$ $$\varepsilon$$. Thus, $$f$$ is continuous at $$x=3$$ by Definition 4.1.1. $$\Box$$

### Given Definition 4.3.3, prove that a reasonable function is uniformly continuous.

#### Prove that $$f(x) := 7x-5$$ is uniformly continuous on $$\mathbb{R}$$.

Given $$\varepsilon > 0$$, let $$\delta := \tfrac{\varepsilon}{7}$$. Then if $$\left|u - v\right| < \delta$$, we have $$\left|f(u) - f(v)\right| =$$ $$\left| (7u-5) - (7v-5)\right| =$$ $$7\left|u - v\right| <$$ $$7 \cdot \delta =$$ $$\varepsilon$$. Thus, $$f$$ is uniformly continuous by Definition 4.3.3. $$\Box$$
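What makes this *uniform* is that $$\delta = \tfrac{\varepsilon}{7}$$ depends only on $$\varepsilon$$, not on where $$u$$ and $$v$$ sit on the line. A randomized sketch of that point (the helper `works` is invented here; random samples only illustrate the claim):

```python
import random

def works(eps, trials=1000):
    """Check that the single delta = eps/7 works at randomly chosen points all over R."""
    delta = eps / 7
    rng = random.Random(0)                          # fixed seed for reproducibility
    for _ in range(trials):
        u = rng.uniform(-1e6, 1e6)                  # u anywhere on the line
        v = u + rng.uniform(-delta, delta) * 0.999  # so |u - v| < delta
        if not abs((7 * u - 5) - (7 * v - 5)) < eps:
            return False
    return True

print(works(0.01))  # True
```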

### Given Definition 5.1.2, prove that a reasonable function has a derivative at a point.

#### Find the derivative of $$f(x) := x^2$$ at $$x=c \in \mathbb{R}$$.

We compute that $$f'(c) = \lim\limits_{x \to c}(\tfrac{x^2-c^2}{x-c}) =$$ $$\lim\limits_{x \to c}(\tfrac{(x+c)(x-c)}{x-c}) =$$ $$\lim\limits_{x \to c}(x+c) =$$ $$2c$$. Thus, we learn that $$f'(c) = 2c$$ by Definition 5.1.2. $$\Box$$
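The algebra can be seen numerically: evaluating the difference quotient for shrinking $$\left|x - c\right|$$ approaches $$2c$$ (a sketch with an invented helper name, not part of the proof):

```python
def diff_quotient(x, c):
    """The difference quotient (x^2 - c^2)/(x - c), which simplifies to x + c for x != c."""
    return (x**2 - c**2) / (x - c)

c = 3.0
for h in (0.1, 0.01, 0.001):
    print(diff_quotient(c + h, c))  # approaches f'(c) = 2c = 6
```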

### Construct an $$\tfrac{\varepsilon}{2}$$ argument.

#### Prove that the sum of two convergent sequences is convergent.

Given $$(x_n) \rightarrow x$$ and $$(y_n) \rightarrow y$$, we will show that $$(x_n + y_n) \rightarrow x + y$$. Since $$(x_n) \rightarrow x$$, given $$\varepsilon > 0$$ there exists a $$K_1$$ such that if $$n \geq K_1$$ then $$\left|x_n - x\right| < \tfrac{\varepsilon}{2}$$. Similarly, because $$(y_n) \rightarrow y$$, there exists a $$K_2$$ such that if $$n \geq K_2$$ then $$\left|y_n-y\right| < \tfrac{\varepsilon}{2}$$. Let $$K := \max\{K_1, K_2\}$$. Then if $$n \geq K$$, we have $$\left|(x_n + y_n) - (x+y)\right| =$$ $$\left|(x_n - x) + (y_n - y)\right| \leq$$ $$\left|x_n-x\right| + \left|y_n - y\right| <$$ $$\tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2} =$$ $$\varepsilon$$. Thus, we learn that $$(x_n + y_n) \rightarrow x + y$$ by definition. $$\Box$$
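The $$\tfrac{\varepsilon}{2}$$ bookkeeping can be sketched concretely with two example sequences chosen here for illustration, $$x_n := \tfrac{1}{n} \rightarrow 0$$ and $$y_n := 1 + \tfrac{1}{n^2} \rightarrow 1$$ (the helper `works` and both sequences are invented for this sketch):

```python
import math

def works(eps, samples=1000):
    """eps/2 bookkeeping for x_n = 1/n -> 0 and y_n = 1 + 1/n^2 -> 1 (finite sample)."""
    K1 = math.floor(2 / eps) + 1             # |1/n - 0| < eps/2 for n >= K1
    K2 = math.floor(math.sqrt(2 / eps)) + 1  # |1/n^2| < eps/2 for n >= K2
    K = max(K1, K2)                          # one K controls both tails...
    return all(abs((1 / n + 1 + 1 / n**2) - (0 + 1)) < eps  # ...so the sum is within eps
               for n in range(K, K + samples))

print(works(0.1) and works(0.01))  # True
```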

#### Prove that a convergent sequence is Cauchy.

Suppose $$(x_n) \rightarrow x$$. Given $$\varepsilon > 0$$, there exists a $$K$$ such that if $$n \geq K$$ then $$\left|x_n-x\right| < \tfrac{\varepsilon}{2}$$. Thus, if $$n, m \geq K$$, we have $$\left|x_n-x_m\right| =$$ $$\left|x_n-x+x-x_m\right| \leq$$ $$\left|x_n-x\right| + \left|x-x_m\right| <$$ $$\tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2} =$$ $$\varepsilon$$. Thus, $$(x_n)$$ is Cauchy by Definition 2.6.3. $$\Box$$