Quick! Add up all the numbers between 1 and one trillion, inclusive!
The story goes that the great Carl Friedrich Gauss, in his days as a little kid in late-18th-century German primary school, was told along with the rest of his class to add up the numbers from one to one hundred. The teacher was probably tired of the little squirts and wanted to keep them busy for a while. Gauss is in solid contention for the title of Smartest Man Who Ever Lived, so it took him a few seconds. Not because he was an arithmetic savant - most mathematicians are not unusually good with actual numbers - but because he thought of a very clever shortcut. We don't know what the answer is yet, but we'll call it N:
N = 1 + 2 + 3 + ... + 99 + 100
Now for the clever part: write down the equation again, but backwards:
N = 100 + 99 + 98 + ... + 2 + 1
Then add the two equations together. On the left side, N + N = 2N. On the right side, 100 + 1 = 101, and 99 + 2 = 101, and 98 + 3 = 101, etc.
There are 100 of those terms, so 2N = 100*101. Thus N = 5050. Easy!
There's nothing stopping us from going on past 100, so if we want to add all the way up to a trillion, it's clear that there are going to be a trillion terms, each with a value of one trillion and one. In general, the sum of all the numbers from 1 to N is given by this, our Sunday Function:
f(N) = N(N+1)/2
Plugging one trillion into that is pretty easy. It's a big number with a lot of zeros, but you can find it in a few seconds by hand, whereas actually doing addition a trillion times would be quite difficult.
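If you'd rather let a computer do the arithmetic, here's a quick Python sketch (just an illustration, with a made-up helper name gauss_sum) that checks the formula against a brute-force loop for the schoolroom case and then applies it to a trillion:

def gauss_sum(n):
    # Closed form for 1 + 2 + ... + n
    return n * (n + 1) // 2

# Sanity check against brute force for the schoolroom case
assert gauss_sum(100) == sum(range(1, 101))  # 5050

# The sum from 1 to one trillion
print(gauss_sum(10**12))  # 500000000000500000000000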
This formula is a special case of Faulhaber's formula, which lets you add up any given power of the numbers between 1 and N. In the general case, the sum of the nth powers of the numbers between 1 and N scales proportionally to N^(n+1).
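For reference, the first few cases are the standard identities N(N+1)/2, N(N+1)(2N+1)/6, and (N(N+1)/2)^2 for first, second, and third powers. A quick Python sketch (again just an illustrative check, not part of the original post) compares them to brute force; note that each closed form is a polynomial of degree n+1 in N, which is where the N^(n+1) scaling comes from:

def power_sum(n, p):
    # Brute-force sum of p-th powers from 1 to n
    return sum(k**p for k in range(1, n + 1))

n = 1000
assert power_sum(n, 1) == n * (n + 1) // 2
assert power_sum(n, 2) == n * (n + 1) * (2 * n + 1) // 6
assert power_sum(n, 3) == (n * (n + 1) // 2) ** 2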
Useful? Other than for getting little kids out of class, I don't really know. I'm sure there's an application somewhere, but it doesn't matter to me. The fun of math is not contingent on its usefulness.
But you DIDN'T add up all the numbers between one and 1 trillion, only the integers. :0
on one "application" - these summation formula are used (far too often?) by lecturers introducing Riemann sums on the way to the definite integral.
These sorts of sums come up a lot in theoretical computer science--they're useful for analyzing algorithms involving nested "for" loops. For example:
for i=1 to n
    for j=1 to i
        (constant time operation)
This runs in less than C*n(n+1)/2 time (for some constant C), which is usually written O(n^2).
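To make that count concrete, here's a small runnable Python version of the pseudocode above (a sketch, with a simple counter standing in for the constant-time operation):

n = 100
count = 0
for i in range(1, n + 1):
    for j in range(1, i + 1):
        count += 1  # stands in for the constant-time operation
print(count == n * (n + 1) // 2)  # True: the inner body runs n(n+1)/2 times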
Don't you mean "integers"?
If you add up all the "numbers" between 1 and 1,000,000,000,000, you get infinity-- even if you assume "real" numbers.
The old undergrad mathematics major in me wonders if "between ... inclusive" means the same thing as "from ... inclusive".
Good comment on computer science applications. When series are introduced in pre-calc and again in calc 2, the professors of mathematics (or the adjunct instructors or HS teachers in the case of precalc) are often unaware of these important applications. Details such as which of n^2 or 2^n or n! grows fastest with n are often left dangling as a tool along the way to a boring convergence proof.
They are also unlikely to know just how frequently we use Taylor series for "small" expansions in physics.
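For what it's worth, a tiny Python sketch makes that growth comparison vivid:

import math

for n in (5, 10, 20):
    print(n, n**2, 2**n, math.factorial(n))
# n! eventually dwarfs 2**n, which in turn dwarfs n**2.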
There is a nice visual proof for this. Trying to draw it here:
1+2+3+4 = area of the triangle:
O
OO
OOO
OOOO
Copy, rotate 180 degrees and place next to the first one:
OOOOO
OOOOO
OOOOO
OOOOO
And you have an N x (N+1) rectangle (here N = 4) composed of 2*(1+2+3+4) squares.
I also saw a variant for 1^2+2^2+3^2+...+N^2, where you build an N*(N+1)*(N+1) box composed of 3 "pyramids" of 1^2+2^2+...+N^2, plus a triangle of 1+2+...+N to complete it.
Which gives 3*S = N*(N+1)*(N+1) - N*(N+1)/2, where S is the desired sum of squares. That simplifies to 3*S = N*(N+1)*((N+1) - 1/2) = N*(N+1)*(N + 1/2).
And finally: S=N*(N+1)*(2*N+1)/6
I won't even try to draw it, and I don't remember the source, but I was able to work it out and check it on paper by updating the altitude of each square after adding each piece.
So the hard part is making a nice drawing that makes the construction of the box relatively obvious. IIRC the one I originally saw used colors, transparency and an "exploded view". It was convincing enough, but it probably takes some good 3D mental visualization. Alternatively, building it out of wooden blocks could make a nice puzzle/exercise for kids.
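If the 3D picture is hard to visualize, a quick Python sketch (just a numerical check of the algebra above, not the drawing) confirms the box-counting identity and the final formula:

def sum_of_squares(n):
    return sum(k * k for k in range(1, n + 1))

n = 1000
# Box volume = 3 pyramids + 1 triangle
assert 3 * sum_of_squares(n) == n * (n + 1) * (n + 1) - n * (n + 1) // 2
assert sum_of_squares(n) == n * (n + 1) * (2 * n + 1) // 6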
Lucas is right, this sort of function comes up in the analysis of algorithms. Don Knuth, Ron Graham and Oren Patashnik wrote an excellent book called "Concrete Mathematics" which goes into the details.
Numbers, natural numbers, surely there's not enough difference to quibble about! ;)
I like the algorithmic complexity application. That would make a good topic for a post one of these days.
I feel silly for never having looked at the derivation of this function before.
Matt: It's not often that I get the chance to be pedantic towards a Physics grad student. :P