20. Mathematics Review#

20.1. The Exponential Function#

For any real number \(x\), the exponential function \(\exp(\cdot)\) is defined as:

(20.1)#\[\begin{align} \exp(x) = 1 + \frac{x}{1!} + \frac{x^{2}}{2!} + \frac{x^{3}}{3!} + ... \label{exponential_function} \end{align}\]

Equation \(\eqref{exponential_function}\) defines a series that converges absolutely for every \(x\) and uniformly on every bounded interval of \(\mathbb{R},\) implying that \(\exp\) is a continuous function. Furthermore, we can see from \(\eqref{exponential_function}\) that \(\exp(0) = 1.\) The number \(e = \exp(1)\) is called Euler’s number.
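A quick way to see the series in \(\eqref{exponential_function}\) at work is to sum its first few terms numerically and compare against a library implementation. The sketch below is a plain-Python illustration, not part of the formal development; the truncation point `n_terms` is an arbitrary choice.

```python
import math

def exp_series(x, n_terms=30):
    """Approximate exp(x) by summing the first n_terms of its power series."""
    total, term = 0.0, 1.0          # term starts at x^0 / 0! = 1
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)         # next term: x^(k+1) / (k+1)!
    return total

for x in [-2.0, 0.0, 1.0, 5.0]:
    print(x, exp_series(x), math.exp(x))   # the two columns should agree closely
```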

It is possible to prove directly from \(\eqref{exponential_function}\) the important addition formula:

(20.2)#\[\begin{align} \exp(x) \exp(y) = \exp(x + y). \label{exp_addition_formula} \end{align}\]

Equation \(\eqref{exp_addition_formula}\) implies that \(\exp(x) \exp(-x) = \exp(x - x) = 1.\) Thus,

(20.3)#\[\begin{align} \exp(-x) = \frac{1}{\exp(x)}. \label{exponential_inverse} \end{align}\]

For every \(x > 0,\) equation \(\eqref{exponential_function}\) shows that \(\exp(x) > 1.\) Therefore, for \(x \in \mathbb{R}\) and \(h > 0,\) equations \(\eqref{exp_addition_formula}\) and \(\eqref{exponential_inverse}\) imply that

\[\begin{align*} \exp(x) = \frac{\exp(x + h)}{\exp(h)} < \exp(x + h), \end{align*}\]

which shows that the function \(\exp\) is strictly increasing. Furthermore, equation \(\eqref{exponential_inverse}\) also shows that \(\exp(-x) > 0\) for all \(x \geq 0.\) Therefore, \(\exp(x) > 0\) for \(x \in \mathbb{R}.\)

Equation \(\eqref{exponential_function}\) also implies that \(\exp(x) > 1 + x\) for \(x > 0,\) which shows that \(\lim_{x \rightarrow \infty} \exp(x) = \infty.\) Equation \(\eqref{exponential_inverse}\) then shows that \(\lim_{x \rightarrow \infty} \exp(-x) = 0,\) or, equivalently, \(\lim_{x \rightarrow -\infty} \exp(x) = 0.\)

Figure (made with TikZ): the graph of \(y = \exp(x)\).

Property 20.1 (Properties of the Exponential Function)

We can now list the properties of the exponential function that we have discovered so far:

  1. \(\exp(0)=1.\)

  2. \(\exp(x) \exp(y) = \exp(x+y).\)

  3. \(\exp\) is a strictly increasing function.

  4. \(\exp(x) > 0\) for all \(x \in \mathbb{R}.\)

  5. \(\lim_{x \rightarrow -\infty} \exp(x) = 0\) and \(\lim_{x \rightarrow \infty} \exp(x) = \infty.\)

The fact that \(\exp\) is continuous and strictly increasing guarantees the existence of an inverse function, which we call the natural logarithm and denote by \(\ln.\) In most cases we will refer to \(\ln(x)\) simply as the logarithm of \(x.\) Many authors and programming languages denote the natural logarithm by \(\log.\) In these notes, though, I will stick to the \(\ln\) notation.

Figure (made with TikZ): the graph of \(y = \ln(x)\).

We note that even though \(\exp(x)\) is defined for all \(x \in \mathbb{R},\) the function \(\ln(x)\) is only defined for \(x > 0,\) i.e., the domain of \(\ln\) is the range of \(\exp\) and vice-versa. Therefore, the definition of an inverse function implies that \(\ln(\exp(x)) = x\) for any \(x \in \mathbb{R},\) and that \(\exp(\ln(x)) = x\) for \(x > 0.\) In particular, we have that \(\ln(1) = \ln(\exp(0)) = 0.\)

The addition formula for the natural logarithm mirrors the one for the exponential function:

(20.4)#\[\begin{align} \ln(x y) = \ln(x) + \ln(y). \label{log_addition_formula} \end{align}\]

Equation \(\eqref{log_addition_formula}\) implies that \(\ln(b^{n}) = n \ln(b)\) for \(n \in \mathbb{N}\) and \(b > 0.\) We can use this property to extend exponentiation of any positive real number \(b\) to a real power \(x\) by:

(20.5)#\[\begin{align} b^{x} = \exp(\ln(b) x) \quad (b \in \mathbb{R}^{+},\, x \in \mathbb{R}). \label{exponentiation} \end{align}\]

Clearly, \(b^{0} = \exp(\ln(b) \times 0) = 1.\) Also, note that:

(20.6)#\[\begin{align} b^{x} b^{y} & = \exp(\ln(b) x) \exp(\ln(b) y) \notag \\ & = \exp(\ln(b) x + \ln(b) y) \notag \\ & = \exp(\ln(b) (x + y)) \notag \\ & = b^{x + y}, \label{general_exponential} \end{align}\]

where the first and last lines use equation \(\eqref{exponentiation}\) and the second line uses the addition formula \(\eqref{exp_addition_formula}.\) A direct consequence of \(\eqref{general_exponential}\) is \(b^{1} b^{-1} = b^{1 - 1} = 1,\) so that \(b^{-1} = 1 / b.\) Finally, note that for any \(x\) and \(y\) in \(\mathbb{R}\) we have that:

\[\begin{align*} (b^{x})^{y} & = e^{y \ln(b^{x})} \\ & = e^{y \ln(b) x} \\ & = e^{(x y) \ln(b)} \\ & = b^{x y}. \end{align*}\]

Property 20.2 (Properties of Exponents)

Let \(b \in \mathbb{R}^{+},\) and \(x, y \in \mathbb{R}.\) Then we have that:

  1. \(\ln(b^{x}) = x \ln(b).\)

  2. \(b^{x} b^{y} = b^{x + y}.\)

  3. \((b^{x})^{y} = b^{x y}.\)

In particular, note that \(\eqref{exponentiation}\) implies \(e^{x} = \exp(x),\) which is another common way to write the exponential function. In what follows, I will use \(\exp(x)\) or \(e^{x}\) interchangeably to denote the exponential of \(x\).
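A quick numerical sanity check of equation \(\eqref{exponentiation}\) and the exponent rules in Property 20.2 can be written with Python's `math` module; the particular values of `b`, `x`, and `y` below are arbitrary.

```python
import math

b, x, y = 2.5, 1.7, -0.4                 # arbitrary test values, with b > 0

# b**x computed directly vs. through exp(x ln(b)), as in equation (20.5)
print(b ** x, math.exp(x * math.log(b)))

# the three exponent rules of Property 20.2
print(math.isclose(math.log(b**x), x * math.log(b)))
print(math.isclose(b**x * b**y, b**(x + y)))
print(math.isclose((b**x)**y, b**(x * y)))
```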

20.2. Derivatives#

The derivative of a function \(f(x)\) at a point \(x\) measures the change in the function for a very small change in the underlying variable.

Definition 20.1

The derivative of \(f(x)\) at the point \(x\) is denoted by \(f'(x)\) and is defined as:

\[\begin{align*} f'(x) = \lim_{h \rightarrow 0} \frac{f(x + h) - f(x)}{h} \end{align*}\]
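Definition 20.1 suggests a simple way to approximate a derivative numerically: evaluate the difference quotient at a small but finite \(h.\) The sketch below does this for the sample function \(f(x) = x^{2};\) the step size `h` is an arbitrary choice.

```python
def diff_quotient(f, x, h=1e-6):
    """Approximate f'(x) by the difference quotient of Definition 20.1."""
    return (f(x + h) - f(x)) / h

f = lambda x: x**2
for x in [0.0, 1.0, 2.5]:
    print(x, diff_quotient(f, x), 2 * x)   # the exact derivative is 2x
```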

Example 20.1 (Derivative of a Linear Function)

Let us first apply Definition 20.1 to compute the derivative of \(x\) itself. Intuitively, the rate of change of \(x\) with respect to \(x\) is one-to-one, which suggests that this derivative should be 1. To see this formally, note that:

\[\begin{align*} \frac{(x + h) - x}{h} = 1, \end{align*}\]

which shows that \((x)' = 1.\)

Example 20.2 (Derivative of a Square Function)

We can apply Definition 20.1 to compute the derivative of \(x^{2}.\) We first note that:

\[\begin{align*} \frac{(x + h)^{2} - x^{2}}{h} = \frac{x^{2} + 2 xh + h^{2} - x^{2}}{h} = 2x + h, \end{align*}\]

which allows us to compute:

\[\begin{align*} (x^{2})' = \lim_{h \rightarrow 0} (2x + h) = 2x. \end{align*}\]

Property 20.3 (Useful Differentiation Formulas)

In the following expressions, \(f\) and \(g\) are two differentiable functions.

  1. Sum Rule:

(20.7)#\[\begin{align} (f + g)' = f' + g' \label{derivative_addition} \end{align}\]
  2. Product Rule:

(20.8)#\[\begin{align} (f g)' = f' g + f g' \label{derivative_multiplication} \end{align}\]
  3. Inverse Rule (\(f \neq 0\)):

(20.9)#\[\begin{align} (1/f)' = - f'/f^{2} \label{derivative_inverse} \end{align}\]
  4. Chain Rule:

(20.10)#\[\begin{align} (g \circ f)' = (g' \circ f) f' \label{chain_rule} \end{align}\]
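The rules above are easy to check numerically on a pair of test functions. The sketch below uses a central-difference approximation of the derivative (an arbitrary implementation choice) and compares both sides of the product, inverse, and chain rules at a single point.

```python
import math

def d(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x**2 + 1.0        # arbitrary differentiable test functions
g = math.sin
x = 0.7

# Product rule: (f g)' = f' g + f g'
print(d(lambda t: f(t) * g(t), x), d(f, x) * g(x) + f(x) * d(g, x))

# Inverse rule: (1/f)' = -f' / f^2
print(d(lambda t: 1.0 / f(t), x), -d(f, x) / f(x)**2)

# Chain rule: (g o f)' = (g' o f) f'
print(d(lambda t: g(f(t)), x), d(g, f(x)) * d(f, x))
```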

Example 20.3 (Scalar Multiplication)

Consider an arbitrary differentiable function \(f\) and let \(g = c,\) where \(c \in \mathbb{R}\) is a constant. Then, \(g' = 0\) and \(\eqref{derivative_multiplication}\) implies that:

\[\begin{align*} (f g)' = f' g + f g' = c f'. \end{align*}\]

Therefore, \((c f)' = c f'.\)

Example 20.4 (Derivative of a Power Function)

We can use \(\eqref{derivative_multiplication}\) to compute the derivative of \(x^{3}\) as follows:

\[\begin{align*} (x^{3})' & = (x x^{2})' \\ & = x^{2} + x (2 x) \\ & = 3 x^{2}. \end{align*}\]

There seems to be a pattern for powers of \(x,\) namely \((x^{n})' = n x^{n-1}.\) Remember that we saw in Example 20.1 that this is true for \(n=1.\) Proceeding by induction, assume that \((x^{n})' = n x^{n-1}\) for some \(n \in \mathbb{N}.\) Then,

\[\begin{align*} (x^{n+1})' & = (x x^{n})' \\ & = x^{n} + x (n x^{n-1}) \\ & = (n+1) x^{n}. \end{align*}\]

Therefore, we have that \((x^{n})' = n x^{n-1}\) for all \(x \in \mathbb{R}\) and \(n \in \mathbb{N}.\)

Example 20.5 (Derivative of the Exponential Function)

We can also apply Definition 20.1 to compute the derivative of the exponential function. Let \(f(x) = e^{x}\) and note that:

\[\begin{align*} \frac{e^{x + h} - e^{x}}{h} = e^{x} \left( \frac{e^{h} - 1}{h} \right). \end{align*}\]

We can use \(\eqref{exponential_function}\) to re-write the fraction in the expression above as:

\[\begin{align*} \frac{e^{h} - 1}{h} & = \frac{1}{h} \left( 1 + \frac{h}{1!} + \frac{h^{2}}{2!} + \frac{h^{3}}{3!} + ... - 1 \right) \\ & = 1 + \frac{h}{2!} + \frac{h^{2}}{3!} + \frac{h^{3}}{4!} + ..., \end{align*}\]

which shows that:

\[\begin{align*} f'(x) = e^{x} \lim_{h \rightarrow 0} \frac{e^{h} - 1}{h} = e^{x}. \end{align*}\]

Example 20.5 shows that the derivative of the exponential function is the exponential function itself. This implies that the exponential function is smooth in the sense that it has continuous derivatives of all orders.

Example 20.6 (Derivative of the Logarithm Function)

Define \(f(x) = \exp(\ln(x)) = x,\) so that \(f'(x) = 1\) by Example 20.1. The chain rule then implies that:

\[\begin{align*} f'(x) & = \ln'(x) \exp(\ln(x)) \\ & = \ln'(x) x \\ & = 1. \end{align*}\]

Therefore, \(\ln'(x) = 1 / x.\)

Example 20.7

We can use Example 20.6 to prove an important limit representation of the exponential function:

\[\begin{align*} e^{x} = \lim_{n \rightarrow \infty} \left( 1 + \frac{x}{n} \right)^{n} \end{align*}\]

To prove this, let \(a_{n} = (1 + x/n)^{n}.\) Then,

\[\begin{align*} \ln(a_{n}) & = n \ln(1 + x/n) \\ & = x \left( \frac{\ln(1 + x/n) - \ln(1)}{x/n} \right). \end{align*}\]

Therefore,

\[\begin{align*} \lim_{n \rightarrow \infty} \ln(a_{n}) & = x \lim_{n \rightarrow \infty} \frac{\ln(1 + x/n) - \ln(1)}{x/n} \\ & = x \ln'(1) \\ & = x \left(\frac{1}{1}\right) \\ & = x. \end{align*}\]

We can then conclude that:

\[\begin{align*} \lim_{n \rightarrow \infty} \left( 1 + \frac{x}{n} \right)^{n} = \lim_{n \rightarrow \infty} e^{\ln(a_{n})} = e^{x}. \end{align*}\]
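The convergence is easy to visualize numerically. The snippet below evaluates \((1 + x/n)^{n}\) for increasingly large \(n\) at an arbitrarily chosen value of \(x\) and compares it with \(e^{x}.\)

```python
import math

x = 1.0                                  # arbitrary choice; any real x works
for n in [10, 100, 10_000, 1_000_000]:
    print(n, (1 + x / n) ** n)           # approaches exp(x) as n grows
print("exp(x) =", math.exp(x))
```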

Example 20.8

Consider the power function \(f(x) = x^{\alpha},\) where \(x > 0\) and \(\alpha \in \mathbb{R}.\) Equation \(\eqref{exponentiation}\) implies that \(f(x) = e^{\alpha \ln(x)}.\) The chain rule allows us to compute:

\[\begin{align*} f'(x) = \alpha \ln'(x) e^{\alpha \ln(x)} = \alpha x^{-1} x^{\alpha} = \alpha x^{\alpha - 1}. \end{align*}\]

Property 20.4 (List of Common Derivatives)

Below is a list of common derivatives that we will use in this class.

| Function | Derivative |
| --- | --- |
| \(e^{x}\) | \(e^{x}\) |
| \(\ln(x)\) | \(x^{-1}\) |
| \(x^{\alpha}\) | \(\alpha x^{\alpha - 1}\) |

20.3. Differentials#

One of the most common uses of derivatives in finance is to approximate the change of a function with respect to a financial quantity such as a stock price, an interest rate, or a volatility. To do this, we need to introduce the notion of a differential.

Definition 20.2

The differential of \(y = f(x)\) at the point \(x\) is denoted by \(dy\) and is defined as:

\[\begin{align*} dy = f'(x) dx, \end{align*}\]

where \(dx \in \mathbb{R}\) is an arbitrary quantity.

The differential of a function \(y = f(x)\) at a point \(x\) describes the linear relationship between \(dx\) and \(dy.\) It is for this reason that it is common to use \(\frac{dy}{dx}\) and \(f'(x)\) interchangeably. For example, we can write:

\[\begin{align*} \frac{d \ln(x)}{dx} = \frac{1}{x}, \end{align*}\]

to denote the derivative of \(\ln(x).\)

The figure below shows the graphical representation of the differential of a function. The ratio of the differential of \(y\) to the differential of \(x\) is the slope of the line tangent to \(y = f(x)\) at the point \(x.\)

Figure (made with TikZ): the differential \(dy = f'(x) \, dx\) as the change along the tangent line at \(x.\)

It is interesting to see how good notation can make some results easier to understand. Consider for example the chain rule defined in \(\eqref{chain_rule}.\) Let \(z = f(x)\) and \(y = (g \circ f)(x) = g(z).\) The chain rule can then be stated as:

(20.11)#\[\begin{align} \frac{dy}{dx} = \frac{dy}{dz} \frac{dz}{dx}. \label{chain_rule_differential} \end{align}\]

Another useful application of differentials is to derive the derivative of an inverse function. Let \(y = f(x)\) so that \(x = f^{-1}(y).\) We then have:

(20.12)#\[\begin{align} \frac{dy}{dx} = \frac{1}{\frac{dx}{dy}}. \label{inverse_function_derivative} \end{align}\]

Example 20.9

Take \(y = \ln(x)\) so that \(x = e^{y}\). Then we have that:

\[\begin{align*} \frac{dy}{dx} & = \frac{1}{x}, \\ \frac{dx}{dy} & = e^{y} = x. \end{align*}\]

Clearly,

\[\begin{align*} \frac{dy}{dx} = \frac{1}{\frac{dx}{dy}}. \end{align*}\]

It is important to note that in Definition 20.2, the quantity \(dx\) need not be small but can be arbitrarily large. However, when \(\Delta x = dx\) is a small quantity, then we have that \(\Delta y = f(x + \Delta x) - f(x) \approx dy.\) To see this, note that:

\[\begin{align*} \lim_{\Delta x \rightarrow 0} \frac{\Delta y - dy}{\Delta x} & = \lim_{\Delta x \rightarrow 0} \frac{\Delta y}{\Delta x} - \frac{dy}{dx} \\ & = \lim_{\Delta x \rightarrow 0} \frac{f(x + \Delta x) - f(x)}{\Delta x} - f'(x) \\ & = f'(x)- f'(x) \\ & = 0. \end{align*}\]

Property 20.5

Take \(y = f(x)\) and define \(\Delta y = f(x + \Delta x) - f(x)\) for a change \(\Delta x\) in \(x.\) If \(\Delta x\) is small, then we have that:

\[\begin{align*} \Delta y \approx f'(x) \Delta x \end{align*}\]
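The quality of the approximation in Property 20.5 is easy to check numerically. The sketch below uses \(f(x) = \ln(x)\) (an arbitrary choice, with derivative \(1/x\) from Section 20.2) and compares the actual change \(\Delta y\) with the differential \(f'(x) \Delta x\) for progressively smaller \(\Delta x.\)

```python
import math

f, fprime = math.log, lambda x: 1 / x    # sample function and its derivative
x = 2.0

for dx in [0.5, 0.1, 0.01, 0.001]:
    dy_exact = f(x + dx) - f(x)          # actual change in y
    dy_approx = fprime(x) * dx           # differential approximation f'(x) dx
    print(dx, dy_exact, dy_approx)       # the two agree as dx shrinks
```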

20.4. Integrals#

We saw that derivatives describe the changes of a function. Integrals do just the opposite: they aggregate those changes.

Consider an interval \([a,b]\) and points \(\{x_{i}\}_{i=0}^{n}\) such that

\[\begin{align*} a = x_{0} < x_{1} < x_{2} < \cdots < x_{n-1} < x_{n} = b. \end{align*}\]

In each sub-interval \((x_{i-1}, x_{i})\) pick a point \(\xi_{i},\) where \(i = 1, 2, \cdots, n\) and consider the sum:

\[\begin{align*} f(\xi_{1})(x_{1}-x_{0}) + f(\xi_{2})(x_{2}-x_{1}) + \cdots + f(\xi_{n})(x_{n}-x_{n-1}) = \sum_{i=1}^{n} f(\xi_{i}) \Delta x_{i}, \end{align*}\]

where \(\Delta x_{i} = x_{i} - x_{i-1}.\) Geometrically this sum represents the area of the rectangles in the figure below.

Figure (made with TikZ): the rectangles whose areas add up to the sum above.

The definite integral of a function \(f(x)\) from \(a\) to \(b\) is then defined as:

\[\begin{align*} \int_{a}^{b} f(x) \, dx = \lim_{n \rightarrow \infty} \sum_{i=1}^{n} f(\xi_{i}) \Delta x_{i}, \end{align*}\]

where \(\Delta x_{i} \rightarrow 0\) as \(n \rightarrow \infty,\) provided the limit exists and does not depend on how we choose the sub-divisions or the points \(\xi_{i}.\) The integral of \(f\) over \([a,b]\) then represents the area under the curve \(y = f(x)\) from the point \(x = a\) to the point \(x = b.\)

Figure (made with TikZ): the area under the curve \(y = f(x)\) from \(x = a\) to \(x = b.\)
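The definition above translates directly into a numerical approximation: pick \(n\) equal sub-intervals and a point \(\xi_{i}\) in each. The sketch below uses the midpoint of each sub-interval (an arbitrary choice) and the sample integrand \(f(x) = x^{2}\) on \([1, 3],\) for which the exact area is \(26/3 \approx 8.667.\)

```python
def riemann_sum(f, a, b, n):
    """Approximate the integral of f over [a, b] with n equal sub-intervals,
    evaluating f at the midpoint of each one."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

f = lambda x: x**2                         # sample integrand
for n in [10, 100, 10_000]:
    print(n, riemann_sum(f, 1.0, 3.0, n))  # approaches 26/3 = 8.666...
```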

Property 20.6 (Important Properties of the Definite Integral)

If \(f(x)\) and \(g(x)\) are two integrable functions over \([a,b],\) then

  1. \(\displaystyle \int_{a}^{b} f(x) + g(x) \, dx = \int_{a}^{b} f(x) \, dx + \int_{a}^{b} g(x) \, dx.\)

  2. \(\displaystyle \int_{a}^{b} A f(x) \, dx = A \int_{a}^{b} f(x) \, dx\) where \(A \in \mathbb{R}\) is a constant.

  3. \(\displaystyle \int_{a}^{b} f(x) \, dx = \int_{a}^{c} f(x) \, dx + \int_{c}^{b} f(x) \, dx\) as long as \(f(x)\) is integrable in \([a,c]\) and \([c,b].\)

  4. \(\displaystyle \int_{a}^{b} f(x) \, dx = - \int_{b}^{a} f(x) \, dx.\)

  5. \(\displaystyle \int_{a}^{a} f(x) \, dx = 0.\)

Definition 20.3

Any function \(F\) such that \(F'(x) = f(x)\) is called an antiderivative or indefinite integral of \(f.\)

Note that the antiderivative is not unique, since if \(F\) is an antiderivative of \(f\) then so is \(F + c\) for any constant \(c \in \mathbb{R}:\)

\[\begin{align*} (F(x) + c)' = F'(x) = f(x). \end{align*}\]

Property 20.7 (List of Common Antiderivatives)

Below is a list of common antiderivatives that we will use in this class.

| Function | Antiderivative |
| --- | --- |
| \(e^{x}\) | \(e^{x}\) |
| \(1/x\) | \(\ln(x)\) |
| \(x^{\alpha}\) (\(\alpha \neq -1\)) | \(\frac{x^{\alpha + 1}}{\alpha + 1}\) |

Theorem 20.1 (The Fundamental Theorem of Calculus)

First Part

Let \(f\) be a continuous function on \([a,b].\) Let \(F\) be a function defined for all \(x \in [a,b]\) by:

\[\begin{align*} F(x) = \int_{a}^{x} f(t) \, dt. \end{align*}\]

Then \(F\) is uniformly continuous on \([a,b],\) differentiable on \((a,b),\) and \(F'(x) = f(x)\) for all \(x \in (a,b).\)

Second Part

Let \(f\) be a function defined on \([a,b]\) and \(F\) an antiderivative of \(f\) in \((a,b).\) If \(f\) is Riemann integrable on \([a,b]\) then

\[\begin{align*} \int_{a}^{b} f(x) \, dx = F(b) - F(a). \end{align*}\]

Note that the first part of Theorem 20.1 implies that \(F(a) = 0\) and \(F(b) = \int_{a}^{b} f(t) \, dt,\) so that \(\int_{a}^{b} f(t) \, dt = F(b) - F(a).\) However, the second part is stronger than this result since it does not assume that \(f\) is continuous.

Also, note that the second part of Theorem 20.1 can be interpreted as an application of the differential of \(F\):

\[\begin{align*} \int_{a}^{b} f(x) \, dx = \int_{a}^{b} F'(x) \, dx = \int_{a}^{b} dF(x) = F(b) - F(a). \end{align*}\]
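A small numerical illustration of the first part of the theorem: build \(F(x) = \int_{0}^{x} f(t) \, dt\) with a midpoint-rule approximation (an arbitrary implementation choice) and check that a difference quotient of \(F\) recovers \(f.\) The integrand and the evaluation point below are arbitrary.

```python
def integral(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f from a to b."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

f = lambda t: t**2                        # arbitrary continuous integrand
F = lambda x: integral(f, 0.0, x)         # F(x) = integral of f from 0 to x

x, h = 1.5, 1e-3
print((F(x + h) - F(x - h)) / (2 * h))    # numerical F'(x)
print(f(x))                               # the FTC says these should match
```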

Example 20.10 (Integral of a Square Function)

\[\begin{align*} \int_{1}^{3} x^{2} \, dx = \left. \frac{x^{3}}{3} \right|_{1}^{3} = \frac{27}{3} - \frac{1}{3} = \frac{26}{3}. \end{align*}\]

Example 20.11 (Integral of a Polynomial Function)

\[\begin{align*} \int_{-1}^{2} (2x^{2} + 3x - 1) \, dx & = \left. \left( 2 \frac{x^{3}}{3} + 3 \frac{x^{2}}{2} - x \right) \right|_{-1}^{2} \\ & = \frac{28}{3} - \frac{11}{6} \\ & = \frac{15}{2}. \end{align*}\]

Example 20.12 (Integral of an Inverse Function)

\[\begin{align*} \int_{0}^{1} \frac{dx}{x + 1} & = \ln(x + 1) \Big|_{0}^{1} \\ & = \ln(2) - \ln(1) \\ & \approx 0.69. \end{align*}\]

Example 20.13 (Present Value of a Continuous-Time Annuity)

The most common use of integrals in finance is to compute the present value of continuous cash flows. In general, the cash flow is represented by a continuous flow over time. Let us start by splitting the interval \([0, T]\) into \(n\) sub-intervals of length \(\Delta t = t_{i} - t_{i-1},\) where \(t_{0} = 0\) and \(t_{n} = T.\) At each time \(t_{i},\) for \(i = 1, 2, \cdots, n,\) a security pays a cash flow of \(c \Delta t.\)

Figure (made with TikZ): the timeline of cash flows \(c \, \Delta t\) paid at times \(t_{1}, t_{2}, \cdots, t_{n}.\)


If the interest rate is \(r\) expressed with continuous compounding, the present value of these cash flows is:

\[\begin{align*} V_{n} & = \sum_{i=1}^{n} (c \Delta t) e^{-r t_{i}} = \sum_{i=1}^{n} c e^{-r t_{i}} \Delta t. \end{align*}\]

We can now compute the value of this sum as \(n \rightarrow \infty:\)

\[\begin{align*} V & = \lim_{n \rightarrow \infty} V_{n} \\ & = \int_{0}^{T} c e^{-r t} \, dt \\ & = c \left.\left( - \frac{e^{-r t}}{r} \right) \right|_{0}^{T} \\ & = \frac{c}{r} \left( 1 - e^{-r T} \right). \end{align*}\]

This is the continuous-time analog of the present value of an annuity.
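The closed-form value can be checked against a direct numerical discretization of the cash flows, mirroring the construction of \(V_{n}\) above. The cash flow rate, interest rate, and maturity below are hypothetical values chosen only for illustration.

```python
import math

c, r, T = 100.0, 0.05, 10.0       # hypothetical cash flow rate, rate, maturity

# closed-form present value from Example 20.13
pv_exact = (c / r) * (1 - math.exp(-r * T))

# numerical check: Riemann sum of c * e^(-r t) over [0, T]
n = 100_000
dt = T / n
pv_numeric = sum(c * math.exp(-r * (i + 0.5) * dt) * dt for i in range(n))

print(pv_exact, pv_numeric)       # the two values agree closely
```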

Example 20.14 (Pricing a Bond Paying Coupons Continuously)

Consider a bond with face value \(F\) that matures at time \(T\) and pays coupons continuously, i.e., it pays \(c \, dt\) over each instant \(dt\) for \(t \in [0, T].\) The price of this bond is then the present value of all the coupons, given in Example 20.13, plus the present value of the principal:

\[\begin{align*} B = \frac{c}{r} \left( 1 - e^{-r T} \right) + F e^{-r T}. \end{align*}\]

20.5. Improper Integrals#

It is common in mathematics, statistics, and finance to consider integrals over \((-\infty, b)\), \((a, \infty),\) or \((-\infty, \infty).\) Such integrals are called improper integrals.

Example 20.15

Consider the continuous annuity presented in Example 20.13 as \(T \rightarrow \infty.\) This is what we call a perpetuity, which in this case pays a cash flow of \(c \, dt\) at each time \(t.\) The value of this instrument is then:

\[\begin{align*} V & = \int_{0}^{\infty} c e^{-r t} \, dt \\ & = \lim_{T \rightarrow \infty} \int_{0}^{T} c e^{-r t} \, dt \\ & = \lim_{T \rightarrow \infty} \frac{c}{r} \left( 1 - e^{-r T} \right) \\ & = \frac{c}{r}. \end{align*}\]
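Numerically, the perpetuity value is simply the limit of the annuity values from Example 20.13 as the maturity grows. The parameters below are hypothetical.

```python
import math

c, r = 100.0, 0.05                 # hypothetical cash flow rate and interest rate

for T in [10, 50, 100, 500]:
    print(T, (c / r) * (1 - math.exp(-r * T)))   # annuity value for maturity T

print("perpetuity value c/r =", c / r)           # the limit as T -> infinity
```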

Example 20.16

In statistics we frequently use the fact that:

\[\begin{align*} I = \int_{-\infty}^{\infty} e^{-x^{2}} \, dx = \sqrt{\pi}. \end{align*}\]

Note that,

\[\begin{align*} I^{2} & = \left( \int_{-\infty}^{\infty} e^{-x^{2}} \,dx \right)^{2} \\ & = \left( \int_{-\infty}^{\infty} e^{-x^{2}} \,dx \right) \left( \int_{-\infty}^{\infty} e^{-y^{2}} \,dy \right) \\ & = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-(x^{2} + y^{2})} \,dx\,dy. \end{align*}\]

If we change to polar coordinates,

\[\begin{align*} x & = r \cos(\theta), \\ y & = r \sin(\theta), \\ \end{align*}\]

we have that \(dx\,dy = r \,dr\,d\theta.\) Thus,

\[\begin{align*} I^{2} & = \int_{0}^{2\pi} \int_{0}^{\infty} r e^{-r^{2}} \,dr\,d\theta \\ & = \int_{0}^{2\pi} \left( \left. -\frac{1}{2} e^{-r^{2}} \right|_{0}^{\infty} \right) \,d\theta \\ & = \int_{0}^{2\pi} \frac{1}{2} \,d\theta \\ & = \pi. \end{align*}\]

Since the integrand \(e^{-x^{2}}\) is positive, \(I > 0,\) and therefore \(I = \sqrt{\pi}.\)
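The result is also easy to confirm numerically. Since \(e^{-x^{2}}\) decays extremely fast, truncating the infinite range at \(\pm 10\) (an arbitrary but safe cutoff) changes nothing at the displayed precision.

```python
import math

L, n = 10.0, 200_000                # truncation point and number of sub-intervals
dx = 2 * L / n

# midpoint-rule approximation of the integral of e^(-x^2) over [-L, L]
I = sum(math.exp(-((-L + (i + 0.5) * dx) ** 2)) * dx for i in range(n))

print(I, math.sqrt(math.pi))        # both approximately 1.7724538509
```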