
Section 21.2 Some History

Somewhat remarkably, given how long humans have been studying primes, the first people we know of who compiled substantial data about them were Gauss and Legendre, around 1800.

Legendre was the first to try to estimate \(\pi(x)\). He claimed that \(\pi(x)\approx \frac{x}{\log(x)-A}\), where \(A\approx 1.08366\) is a constant he fitted to the data. More precisely, he claimed that \(\pi(x)\) is asymptotic to this function.

Definition 21.2.1

We say that two functions \(f(x)\) and \(g(x)\) are asymptotic to each other when \begin{equation*}\lim_{x\to\infty}\frac{f(x)}{g(x)}=1\; .\end{equation*} Essentially, in the long run these functions get as close to each other as you like, on a percentage basis.
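As a quick sanity check (my own illustration, not from the text), consider \(f(x)=x^2+x\) and \(g(x)=x^2\): their difference grows without bound, yet their ratio tends to \(1\), so they are asymptotic in the sense just defined.

```python
# Illustrating the definition of "asymptotic" with a toy pair of functions:
# f(x) = x^2 + x and g(x) = x^2 satisfy f(x)/g(x) = 1 + 1/x -> 1,
# even though the difference f(x) - g(x) = x grows without bound.
def f(x):
    return x**2 + x

def g(x):
    return x**2

for x in (10, 10**3, 10**6):
    print(x, f(x) / g(x), f(x) - g(x))
# ratios: 1.1, 1.001, 1.000001 -- differences: 10, 1000, 1000000
```

This is exactly the "percentage basis" point: the absolute gap explodes while the relative gap vanishes.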

Here is another way to think about this. Consider the average chance of a number of size \(x\) being prime; Legendre guessed this chance was of the form \(\frac{1}{\log(x)-A}\). This general notion was based on the large amount of data he had collected, and the constant \(A\) he finally settled on seemed to give the best fit to that data.

Not long after this, Gauss came up with a more elegant estimate, one that, despite not being ‘fitted’ to the data in the same way, was correct. And he didn't tell anyone for over fifty years! Gauss' conjecture was that \begin{equation*}\lim_{x\to\infty}\frac{\pi(x)}{x/\log(x)}=1\end{equation*} Or, using our new term, \(\pi(x)\) is asymptotic to \(\frac{x}{\log(x)}\).
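To see the two estimates side by side, here is a small sketch (my own, using only Python's standard library): a sieve computes \(\pi(10^6)\) exactly, and we compare Legendre's fitted formula with Gauss' \(x/\log(x)\).

```python
from math import isqrt, log

def primepi(n):
    """pi(n): the exact count of primes <= n, via a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"          # 0 and 1 are not prime
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            # cross out every multiple of p starting at p^2
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sum(sieve)

x = 10**6
pi_x = primepi(x)                    # 78498
legendre = x / (log(x) - 1.08366)    # about 78543, off by about 45
gauss = x / log(x)                   # about 72382, off by about 6116
print(pi_x, round(legendre), round(gauss))
```

At this range Legendre's fitted constant gives a much smaller error, though it is known that asymptotically the best choice of constant is \(1\), not \(1.08366\).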

Subsection 21.2.1 The first really accurate estimate and errors

In fact, Gauss made this estimate even more precise. Here is the general idea.

First, reinterpret the proportion as saying that a fraction \(1/\log(x)\) of the integers near \(x\) are prime. If we do that, then we can think of \(1/\log(x)\) as a probability density function. What do we do with such a function? We integrate it to get the cumulative amount!

That is, we should expect that \(\pi(x)\approx \int_2^x\frac{dt}{\log(t)}\) or equivalently \begin{equation*}\lim_{x\to\infty}\frac{\pi(x)}{\int_2^x\frac{dt}{\log(t)}}=1\; .\end{equation*}

Definition 21.2.2

We give the name logarithmic integral to the (convergent) integral \(Li(x)=\int_2^x \frac{dt}{\log(t)}\).

That a step function as jagged as \(\pi\) would stay close to a smooth integral should sound like it has a 100% probability of being crazy! But Gauss was no fool, and the accuracy is astounding.

Notice how much closer \(Li(x)\) is to the actual value of \(\pi(x)\) than the \(x/\log(x)\) estimate; as \(x\) grows, it is closer by orders of magnitude.
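To put a number on that claim, here is a sketch (again my own, standard library only) that approximates \(Li(x)\) with Simpson's rule and compares both errors at \(x=10^6\), taking the exact value \(\pi(10^6)=78498\) as known.

```python
from math import log

def Li(x, n=2_000_000):
    """Approximate Li(x), the integral of dt/log(t) from 2 to x.

    Uses composite Simpson's rule with n (even) subintervals; a plain
    numerical sketch, not a library routine.
    """
    a, b = 2.0, float(x)
    h = (b - a) / n
    total = 1 / log(a) + 1 / log(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) / log(a + i * h)
    return total * h / 3

x = 10**6
pi_x = 78498                      # the exact value of pi(10^6)
li_err = Li(x) - pi_x             # about 128.5
naive_err = x / log(x) - pi_x     # about -6115.6
print(li_err, naive_err)
```

So at \(10^6\) the \(Li\) error is already roughly fifty times smaller than the \(x/\log(x)\) error, and the gap widens as \(x\) grows.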

Subsection 21.2.2 Exploring \(Li\)

Can we attempt some more analysis? Since we saw that \(x/\log(x)\) is not as good an approximation, we'll leave it out for now. This graphic follows the two functions along a roughly 1000-wide stretch at a time.

Based on this evidence, it seems clear that \(Li(x)\), even if it is a good approximation, should never be less than the actual count of primes. And yet the English mathematician Littlewood proved the following result.

As remarkable as this seems, his student Skewes proved the following even more amazing fact.

In the original paper, this bound had a \(34\) instead of \(1000\) in the last exponent, but that result relied upon a special assumption (the so-called Riemann Hypothesis, see Chapter 25).

Today we know that the first time this “switch” happens is no higher than \(1.4\times 10^{316}\). However, no explicit number for which it happens is known, and computer searches have not come remotely near those bounds.

This sounds terrible, but it is actually good news. After all, if \(\pi\) beats \(Li\) only once in a great while, then \(Li\) must be a great approximation indeed! So, just how great is it?