# The Wave Function 02

— 1.3. Probability —

In the previous post we were introduced to the Schrödinger equation (equation 1), stated Born’s interpretation of the physical meaning of the wave function, and took a little glimpse into some philosophical positions one might hold regarding Quantum Mechanics.

Since probability plays such an essential role in Quantum Mechanics, a brief revision of some of its concepts is in order, so that we are sure we have the tools needed to do Quantum Mechanics.

— 1.3.1. Discrete variables —

The example used in the book to expound the terminology and concepts of probability is a classroom containing 14 people:

• one person is 14 years old
• one person is 15 years old
• three people are 16 years old
• two people are 22 years old
• two people are 24 years old
• five people are 25 years old

Let ${N(j)}$ represent the number of people with age ${j}$. Hence

• ${N(14)=1}$
• ${N(15)=1}$
• ${N(16)=3}$
• ${N(22)=2}$
• ${N(24)=2}$
• ${N(25)=5}$

One can represent the previous data points by use of a histogram:

The total number of people in the room is given by

$\displaystyle N=\sum_{j=0}^{\infty}N(j) \ \ \ \ \ (2)$
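Equation 2 can be checked with a few lines of Python (a sketch; the variable names are mine, and the tally below is the book's full one, which includes two people aged 24 for a total of fourteen):

```python
# Tally of the classroom example: N_of_j[j] is the number of people of age j.
N_of_j = {14: 1, 15: 1, 16: 3, 22: 2, 24: 2, 25: 5}

# Equation (2): the total number of people is the sum of N(j) over all ages.
N = sum(N_of_j.values())
print(N)  # 14
```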

Adopting a frequentist definition of probability, Griffiths then makes a number of definitions of probability concepts under the assumption that the phenomena under study are discrete.

 Definition 1 The probability of an event ${j}$, ${P(j)}$, is proportional to the number of elements that have the property ${j}$ and inversely proportional to the total number of elements (${N}$) under study. $\displaystyle P(j)=\frac{N(j)}{N} \ \ \ \ \ (3)$

It is easy to see that from equation 3 together with equation 2 it follows that

$\displaystyle \sum_{j=0}^{\infty}P(j)=1 \ \ \ \ \ (4)$
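Equations 3 and 4 can be verified numerically on the classroom data (a sketch; the tally is the book's full one, including two people aged 24):

```python
# Classroom tally: N_of_j[j] is the number of people of age j.
N_of_j = {14: 1, 15: 1, 16: 3, 22: 2, 24: 2, 25: 5}
N = sum(N_of_j.values())

# Equation (3): P(j) = N(j) / N.
P = {j: n / N for j, n in N_of_j.items()}

# Equation (4): the probabilities sum to one.
total = sum(P.values())
print(total)
```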

After defining ${P(j)}$ we can also define the most probable value of ${j}$.

 Definition 2 The most probable value of ${j}$ is the one for which ${P(j)}$ is a maximum.
 Definition 3 The average value of ${j}$ is given by $\displaystyle \langle j\rangle=\sum_{j=0}^{\infty}jP(j) \ \ \ \ \ (5)$

But what if we are interested in computing the average value of ${j^2}$? Then the appropriate expression must be

$\displaystyle \langle j^2\rangle=\sum_{j=0}^{\infty}j^2P(j)$

Hence one can write in full generality that the average value of some function of ${j}$, denoted by ${f(j)}$, is given by

$\displaystyle \langle f(j)\rangle=\sum_{j=0}^{\infty}f(j)P(j) \ \ \ \ \ (6)$
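Applying equation 6 with ${f(j)=j}$ and ${f(j)=j^2}$ to the classroom data (the book's full tally, including two people aged 24) reproduces the averages one computes by hand, ${\langle j\rangle=294/14=21}$ and ${\langle j^2\rangle=6434/14\approx 459.57}$. A sketch:

```python
# Classroom tally and probabilities P(j) = N(j)/N.
N_of_j = {14: 1, 15: 1, 16: 3, 22: 2, 24: 2, 25: 5}
N = sum(N_of_j.values())
P = {j: n / N for j, n in N_of_j.items()}

# Equation (6): <f(j)> is the sum over j of f(j) P(j).
def average(f):
    return sum(f(j) * p for j, p in P.items())

avg_j = average(lambda j: j)        # <j>
avg_j2 = average(lambda j: j * j)   # <j^2>
print(avg_j, avg_j2)
```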

After introducing the definition of the maximum of a probability distribution it is time to introduce a couple of definitions that relate to the symmetry and spread of a distribution.

 Definition 4 The median is the value of ${j}$ for which the probability of having a larger value than ${j}$ is the same as the probability of having a smaller value than ${j}$.
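For the classroom data the median can be found with the standard library (a sketch, assuming the book's full tally of fourteen ages, including the two 24-year-olds):

```python
import statistics

# The fourteen ages listed individually.
ages = [14, 15, 16, 16, 16, 22, 22, 24, 24, 25, 25, 25, 25, 25]

# With an even number of values, statistics.median returns the midpoint
# of the two central values (here 22 and 24).
med = statistics.median(ages)
print(med)  # 23.0
```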

After seeing a definition that relates to the symmetry of a distribution we’ll introduce a definition that is an indication of its spread.

But first we’ll look at two example histograms that will serve as a motivation for that definition.


Both histograms have the same median, the same average, the same most probable value and the same number of elements. Nevertheless it is visually obvious that the two histograms represent two different kinds of phenomena.

The first histogram represents a phenomenon whose values are sharply peaked about the average (central) value.

The second histogram, on the other hand, represents a phenomenon with a broader and flatter distribution.

The existence of such a difference between two otherwise equal distributions makes it necessary to introduce a measure of spread.

A first thought could be to define, for each individual value, its deviation from the average

$\displaystyle \Delta j=j-\langle j\rangle$

This approach doesn’t work since for a random distribution one would expect to find positive and negative values of ${\Delta j}$ in equal measure; in fact the average of ${\Delta j}$ always vanishes.

One way to circumvent this issue would be to use ${|\Delta j|}$; even though this approach does work in theory, it has the problem that the absolute value is not a differentiable function.

These two issues are avoided if one uses the squares of the deviations about the average.

The quantity of interest is called the variance of the distribution.

 Definition 5 The variance of a distribution, ${\sigma^2}$, with average value ${\langle j\rangle}$ is given by the expression $\displaystyle \sigma ^2=\langle(\Delta j)^2\rangle \ \ \ \ \ (7)$
 Definition 6 The standard deviation, ${\sigma}$, of a distribution is given by the square root of its variance.

For the variance it also holds that

$\displaystyle \sigma ^2=\langle j^2\rangle-\langle j\rangle^2 \ \ \ \ \ (8)$
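Equation 8 follows from definition 5 by expanding the square and using equations 4 and 5 (a short derivation, in the same notation):

$\displaystyle \sigma^2=\sum_{j=0}^{\infty}\left(j-\langle j\rangle\right)^2P(j)=\sum_{j=0}^{\infty}j^2P(j)-2\langle j\rangle\sum_{j=0}^{\infty}jP(j)+\langle j\rangle^2\sum_{j=0}^{\infty}P(j)=\langle j^2\rangle-2\langle j\rangle^2+\langle j\rangle^2=\langle j^2\rangle-\langle j\rangle^2$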

Since by definition 5 the variance is manifestly non-negative, it follows that

$\displaystyle \langle j^2\rangle \geq \langle j\rangle^2 \ \ \ \ \ (9)$

where equality only happens when all elements of the distribution are equal to one another.
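The variance and standard deviation of the classroom data can be computed directly (a sketch; again the book's full tally, including two people aged 24):

```python
import math

# Classroom tally and probabilities.
N_of_j = {14: 1, 15: 1, 16: 3, 22: 2, 24: 2, 25: 5}
N = sum(N_of_j.values())
P = {j: n / N for j, n in N_of_j.items()}

avg_j = sum(j * p for j, p in P.items())        # <j>
avg_j2 = sum(j * j * p for j, p in P.items())   # <j^2>

# Equation (8): variance as <j^2> - <j>^2; definition 6: sigma is its root.
variance = avg_j2 - avg_j ** 2
sigma = math.sqrt(variance)

# Equation (9) holds: <j^2> >= <j>^2, since the variance is non-negative.
print(variance, sigma)
```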

— 1.3.2. Continuous variables —

Thus far we’ve assumed that we are dealing with discrete variables. We now generalize our definitions and results to continuous distributions.

One has to take the initial care to note that, when dealing with phenomena that admit a continuous description, the probability of finding any single exact value vanishes; instead one should speak of the probability of a given interval.

With that in mind, and assuming that the distributions are sufficiently well behaved, the probability of an event being between ${x}$ and ${x+dx}$ is given by

$\displaystyle \rho(x)dx \ \ \ \ \ (10)$

The quantity ${\rho (x)}$ is the probability density.

The generalizations for the other results are:

$\displaystyle \int_{-\infty}^{+\infty}\rho(x)dx=1 \ \ \ \ \ (11)$

$\displaystyle \langle x\rangle=\int_{-\infty}^{+\infty}x\rho(x)dx \ \ \ \ \ (12)$

$\displaystyle \langle f(x)\rangle=\int_{-\infty}^{+\infty}f(x)\rho(x)dx \ \ \ \ \ (13)$

$\displaystyle \sigma ^2=\langle(\Delta x)^2\rangle=\langle x^2\rangle-\langle x\rangle^2 \ \ \ \ \ (14)$
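As an illustrative check of equations 11, 12 and 14, one can pick a simple density that is not from the text — say a uniform ${\rho(x)=1}$ on ${[0,1]}$, for which ${\langle x\rangle=1/2}$, ${\langle x^2\rangle=1/3}$ and ${\sigma^2=1/12}$ — and evaluate the integrals numerically (a sketch using the midpoint rule):

```python
# Midpoint-rule integration of a uniform density rho(x) = 1 on [0, 1]
# (an illustrative choice of density, not one used in the text).
n = 10000
dx = 1.0 / n
xs = [(k + 0.5) * dx for k in range(n)]

def rho(x):
    return 1.0

norm = sum(rho(x) * dx for x in xs)              # eq. (11): should be 1
mean_x = sum(x * rho(x) * dx for x in xs)        # eq. (12): should be 1/2
mean_x2 = sum(x * x * rho(x) * dx for x in xs)   # <x^2>: should be 1/3
variance = mean_x2 - mean_x ** 2                 # eq. (14): should be 1/12
print(norm, mean_x, variance)
```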