MATH 404/SS 316: History of Mathematics

Ayesha N. Ali

 

M. Omer Sheikh: 2003-02-0129                                                    Wednesday 14th May, 2003

 

DRAFT 1

 

Modern Electronic Computers and New Findings in Number Theory

 

 

HISTORY OF COMPUTERS

 

The first device that might be considered a computer in the modern sense of the word was conceived by the eccentric British mathematician and inventor Charles Babbage.

 

In 1822, Babbage proposed building a machine called the Difference Engine to automatically calculate mathematical tables. The Difference Engine was only partially completed when Babbage conceived the idea of another, more sophisticated machine called the Analytical Engine.

 

The Analytical Engine was intended to use loops of Jacquard's punched cards to control an automatic calculator, which could make decisions based on the results of previous computations. This machine was also intended to employ several features subsequently used in modern computers, including sequential control, branching, and looping.

 

Working with Babbage was Augusta Ada Lovelace, the daughter of the English poet Lord Byron. Ada, who was a splendid mathematician and one of the few people who fully understood Babbage's vision, created a program for the Analytical Engine. Had the Analytical Engine ever actually worked, Ada's program would have been able to compute the mathematical sequence known as the Bernoulli numbers. Based on this work, Ada is now credited as being the first computer programmer and, in 1979, a modern programming language was named Ada in her honor.

 

Babbage worked on his Analytical Engine from around 1830 until he died, but sadly it was never completed. It is often said that Babbage was a hundred years ahead of his time and that the technology of the day was inadequate for the task. Refuting this is the fact that, beginning in 1834, two Swedish engineers, Georg and Edvard Scheutz, built a small Difference Engine based on Babbage's design.

 

The first true general-purpose electronic computer was the ENIAC, which was constructed at the University of Pennsylvania between 1943 and 1946. However, ENIAC's underlying architecture was very different from that of modern computers. During the course of designing ENIAC, its creators, John William Mauchly and J. Presper Eckert Jr., conceived the idea of stored-program computing. Stored programs solved many of the problems that earlier computers had faced and paved the way for doing mathematics on computers, since intermediate results could now conveniently be stored for later computations.

 

WHAT IS NUMBER THEORY?

 

Number theory is one of the oldest branches of pure mathematics, and one of the largest. Of course, it concerns questions about numbers, usually meaning whole numbers or rational numbers.

 

Elementary number theory involves divisibility among integers -- the division "algorithm", the Euclidean algorithm, elementary properties of primes, congruences including Fermat's little theorem and Euler's theorem.
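
To make two of these staples concrete, here is a minimal Python sketch (my own illustration, not taken from any of the sources cited below) of the Euclidean algorithm and a numerical check of Fermat's little theorem, which states that a^(p-1) ≡ 1 (mod p) whenever p is prime and p does not divide a:

    def gcd(a, b):
        """Euclidean algorithm: repeatedly replace (a, b) by (b, a mod b)."""
        while b:
            a, b = b, a % b
        return a

    def fermat_little_theorem_holds(a, p):
        """Check a^(p-1) == 1 (mod p) for a prime p and an a not divisible by p."""
        return pow(a, p - 1, p) == 1

    print(gcd(1071, 462))                       # 21
    print(fermat_little_theorem_holds(2, 101))  # True, since 101 is prime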

 

Combinatorial number theory involves the number-theoretic study of objects which arise naturally from counting or iteration. This includes a study of many specific families of numbers - the binomial coefficients, the Fibonacci numbers, Bernoulli numbers, factorials, perfect squares, partition numbers and so on.
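
As a small illustration (again a Python sketch of my own), a few of these families can be generated directly from their defining recurrences:

    def binomial(n, k):
        """Binomial coefficient C(n, k) built up row by row with Pascal's rule."""
        row = [1]
        for _ in range(n):
            row = [1] + [row[i] + row[i + 1] for i in range(len(row) - 1)] + [1]
        return row[k]

    def fibonacci(n):
        """First n Fibonacci numbers: F(0) = 0, F(1) = 1, F(k) = F(k-1) + F(k-2)."""
        seq = [0, 1]
        while len(seq) < n:
            seq.append(seq[-1] + seq[-2])
        return seq[:n]

    print(binomial(6, 2))   # 15
    print(fibonacci(10))    # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]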

 

Algebraic number theory extends the concept of "number" to mean an element of some ring, usually the ring of integers in a finite algebraic extension of the rational number field. These arise naturally even when considering elementary topics and are also interesting in their own right.

 

Analytic number theory involves the study of the Riemann zeta function and other similar functions such as Dirichlet series. Other areas of number theory are also quite analytic, e.g. additive number theory. Ideas from analysis (measure theory, dimension) can also be used in probabilistic number theory. Finally, a significant amount of analysis is used in sieve methods and other aspects of multiplicative number theory.

 

Transcendental number theory considers proofs of transcendence or algebraicity of numbers, and the extent to which numbers can be approximated by algebraic numbers.

 

Geometric number theory applies geometric ideas of all kinds to questions about numbers; the classical geometry of numbers, for instance, studies lattice points in convex regions. One may also combine algebraic geometry with number theory, that is, study varieties such as algebraic curves and surfaces and ask whether they have rational or integral points. This topic includes the theory of elliptic curves and finiteness results which apply in integral or higher-genus situations.

 

Computational number theory studies the effectiveness of algorithms for computing number-theoretic quantities. Considerable effort has been expended on primality-testing and integer-factorization routines, procedures which are in principle trivial but whose naive solutions are untenable for large inputs. This field also considers integer quantities (e.g. the class number) whose usual definition is nonconstructive, and real quantities (e.g. the values of zeta functions) which must be computed to very high precision; thus it overlaps both computer algebra and numerical analysis.
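
The "in principle trivial" point is easy to see in a sketch. Trial division (shown below in Python, purely as an illustration of mine) factors any integer correctly, but its running time grows roughly with the square root of n in the worst case, which is hopeless for the hundred-digit numbers attacked by modern factoring algorithms:

    def trial_division(n):
        """Return the prime factorization of n as a list of primes (naive method)."""
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)          # whatever remains is prime
        return factors

    print(trial_division(2003))        # [2003], i.e. 2003 is prime
    print(trial_division(1234567))     # [127, 9721]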

 

 

COMPUTER-AIDED DEVELOPMENTS IN NUMBER THEORY

 

Computational number theory studies problems from elementary, algebraic and analytic number theory which require the help of fast computers, particularly vector and parallel systems. This enlarges our knowledge, insight and understanding of the field and leads to mathematical and numerical solution techniques for the problems studied. Many problems in this field are extremely well suited to parallelization and can be used as test cases for high-performance and parallel computing techniques. For example, some algorithms for factoring large numbers can be carried out on a grid of heterogeneous computers where the number of computers in the grid is allowed to vary dynamically.
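
The reason such problems parallelize so well is that the search space splits into independent pieces. The toy Python sketch below (my own example, not the grid software referred to above) hands disjoint ranges of trial divisors to separate worker processes; each worker needs nothing from the others, so machines can join or leave the computation freely:

    from multiprocessing import Pool

    # Product of two primes we pretend not to know (999983 and 1000003).
    N = 999983 * 1000003

    def scan(bounds):
        """Search one range of odd trial divisors for a factor of N."""
        lo, hi = bounds
        for d in range(lo | 1, hi, 2):
            if N % d == 0:
                return d
        return None

    if __name__ == "__main__":
        chunks = [(3 + i * 250000, 3 + (i + 1) * 250000) for i in range(8)]
        with Pool(4) as pool:
            found = [d for d in pool.map(scan, chunks) if d]
        print(found)               # expected: [999983, 1000003]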

 

The emergence of public-key cryptography has particularly triggered the study of algorithms for factorization and primality testing, for computing discrete logarithms, and for the solution of large sparse systems of linear equations over finite fields.
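
The connection to cryptography can be made concrete with a toy RSA example (a Python sketch with deliberately tiny, insecure parameters chosen only for illustration). The security of the scheme rests precisely on the difficulty of factoring the public modulus n:

    # Key generation from two secret primes.
    p, q = 61, 53
    n = p * q                      # public modulus: 3233
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent: 2753 (needs Python 3.8+)

    # Encryption and decryption of a small message m < n.
    m = 65
    c = pow(m, e, n)               # ciphertext: 2790
    print(pow(c, d, n))            # 65 -- anyone who factors n can compute d and do this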

 

 

PI (π)

 

The passage below is taken from the introduction to "Pi: A Source Book" by L. Berggren, J. Borwein and P. Borwein.

 

"The story of pi reflects the most seminal, the most serious and sometimes the silliest aspects of mathematics. A surprising amount of the most important mathematics and a significant number of the most important mathematicians have contributed to its unfolding -- directly or otherwise.

 

Pi is one of the few concepts in mathematics whose mention evokes a response of recognition and interest in those not concerned professionally with the subject. It has been a part of human culture and the educated imagination for more than twenty five hundred years.

The computation of Pi is virtually the only topic from the most ancient stratum of mathematics that is still of serious interest to modern mathematical research. And to pursue this topic as it developed throughout the millennia is to follow a thread through the history of mathematics that winds through geometry, analysis and special functions, numerical analysis, algebra and number theory. It offers a subject which provides mathematicians with examples of many current mathematical techniques as well as a palpable sense of their historical development."

 

Given the above interest in the calculation of Pi, some computer-aided milestones in its computation in recent years are listed below:

 

  • David H. Bailey: 29 million decimal digits; the 10 billionth hexadecimal digit.
  • Fabrice Bellard: the 50 billionth and 100 billionth hexadecimal digits, with the BBP algorithm (sketched after this list).
  • Peter B. Borwein: the 10 billionth hexadecimal digit, with the BBP algorithm.
  • G.V. Chudnovsky and D.V. Chudnovsky: 1, 2 and 4 billion decimal digits with the Chudnovsky formula; by March 1996, more than 8 billion digits had been calculated.
  • William Gosper: 17.5 million digits with a Ramanujan formula.
  • Guilloud and Bouyer: 250,000, 500,000, 1 million and 2 million digits with arctan formulas.
  • Daniel Shanks and John Wrench Jr.: 100,265 digits in 1961 with arctan formulas.
  • Yasumasa Kanada: 2 million and 10 million decimal digits with the arctan method; 100 million hexadecimal digits with the A.G.M.; and further records from 4 million decimal digits in 1982 up to 6,442,000,000 decimal digits in 1995 with A.G.M. methods.
  • Simon Plouffe: the 10 billionth hexadecimal digit, with the BBP algorithm.
  • Daisuke Takahashi: 100 million hexadecimal digits with the A.G.M., and 3.2 billion, 4.2 billion and 6.4 billion decimal digits with A.G.M. methods.
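
Several of the hexadecimal-digit records above rest on the Bailey-Borwein-Plouffe (BBP) formula, pi = sum over k >= 0 of (1/16^k)(4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)), which allows the n-th hexadecimal digit of Pi to be computed without computing any of the earlier digits. The Python sketch below (a simplified illustration of mine, not the optimized code used for the records) shows the idea: the head of each series is reduced modulo 1 with modular exponentiation, so only a handful of floating-point digits ever need to be kept.

    def bbp_series(j, n):
        """Fractional part of the sum over k of 16^(n-k) / (8k + j)."""
        # Head of the series (k <= n): reduce each term mod 1 via modular exponentiation.
        head = 0.0
        for k in range(n + 1):
            denom = 8 * k + j
            head = (head + pow(16, n - k, denom) / denom) % 1.0
        # Tail of the series (k > n): terms shrink by a factor of 16 each step.
        tail = sum(16.0 ** (n - k) / (8 * k + j) for k in range(n + 1, n + 20))
        return (head + tail) % 1.0

    def pi_hex_digits(n, count=8):
        """Hexadecimal digits of Pi starting at position n+1 after the point."""
        x = (4 * bbp_series(1, n) - 2 * bbp_series(4, n)
             - bbp_series(5, n) - bbp_series(6, n)) % 1.0
        out = ""
        for _ in range(count):
            x *= 16
            out += "0123456789ABCDEF"[int(x)]
            x -= int(x)
        return out

    print(pi_hex_digits(0))    # 243F6A88 (Pi = 3.243F6A8885... in hexadecimal)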

 

 

OTHER AREAS OF EXPLORATION

 

Goldbach's conjecture says that every even number (except 2) is the sum of two primes. It has been checked up to 4*10^11 by Sinisalo in 1993, up to 10^14 by Jean-Marc Deshouillers, Yannick Saouter and Herman te Riele in 1997 (paper submitted to ANTS3), and up to 4*10^14 by Joerg Richstein. Olivier Ramaré proved in 1995 that every even integer is the sum of at most 6 primes.

 

Using a carefully optimized segmented sieve and an efficient checking algorithm, the Goldbach conjecture has been verified and is now known to be true up to 4×10^14.
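
A toy version of this kind of verification (a Python sketch of my own, with a naive sieve and naive checking; the record computations use segmented sieves and far more efficient checks) looks like this:

    def primes_up_to(limit):
        """Sieve of Eratosthenes: boolean table is_prime[0..limit]."""
        is_prime = [True] * (limit + 1)
        is_prime[0] = is_prime[1] = False
        for p in range(2, int(limit ** 0.5) + 1):
            if is_prime[p]:
                for m in range(p * p, limit + 1, p):
                    is_prime[m] = False
        return is_prime

    def verify_goldbach(limit):
        """Return the first even n in [4, limit] that is not a sum of two primes, or None."""
        is_prime = primes_up_to(limit)
        for n in range(4, limit + 1, 2):
            if not any(is_prime[p] and is_prime[n - p] for p in range(2, n // 2 + 1)):
                return n
        return None

    print(verify_goldbach(10000))      # None: no counterexample below 10,000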

 

Sums of cubes. Olivier Ramaré and François Bertault proved by exhaustive search that every integer between 1,290,741 and 15000^3 = 3,375,000,000,000 is a sum of five cubes. Using some results on sums of 4 cubes, Jean-Marc Deshouillers, François Hennecart and Bernard Landreau have improved this result to 10^16 (for sums of 5 cubes). The same authors conjecture that every integer greater than 7,373,170,279,850 is a sum of 4 cubes.
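
To make the statements concrete, here is a brute-force Python sketch (my own illustration; the searches above use far more refined methods) that looks for a representation of n as a sum of k cubes of non-negative integers:

    def sum_of_cubes(n, k):
        """Return k non-negative integers whose cubes sum to n, or None if none is found."""
        if k == 0:
            return [] if n == 0 else None
        x = int(round(n ** (1 / 3)))
        while x ** 3 > n:                   # largest cube not exceeding n
            x -= 1
        while x >= 0:
            rest = sum_of_cubes(n - x ** 3, k - 1)
            if rest is not None:
                return [x] + rest
            x -= 1
        return None

    print(sum_of_cubes(100, 4))    # [4, 3, 2, 1]: 100 = 64 + 27 + 8 + 1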

 

These developments would not have been possible without the use of computational methods and equipment.

 

The following are some more areas in which the development of advanced computing methodologies and the advent of fast computers have been instrumental:

 

  • Discovery of Perfect numbers
  • Integer factorization/factoring
  • Factoring large integers with the Number Field Sieve
  • Class number computations
  • Amicable numbers
  • Aliquot Sequences
  • The 3x+1 conjecture (see the sketch after this list)
  • Some generalizations of the 3x+1 conjecture
  • Enumeration of polyominoes and other animals
  • Determination of the minimum width of admissible prime constellations
  • Tables with values of the prime counting functions pi(x) and pi2(x)
  • Statistics about least primitive roots of prime numbers (Artin's conjecture)
  • Statistics about very large solutions of Pell's equation
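
For a flavor of how simple some of these questions are to state and to explore by machine, the Python sketch below (an illustration of mine) follows a 3x+1 trajectory: the conjecture asserts that, from any starting value, repeated application of the map n -> n/2 (n even) or n -> 3n+1 (n odd) eventually reaches 1.

    def collatz_trajectory(n):
        """Iterate the 3x+1 map from n until reaching 1, returning the whole trajectory."""
        path = [n]
        while n != 1:
            n = n // 2 if n % 2 == 0 else 3 * n + 1
            path.append(n)
        return path

    print(collatz_trajectory(27))        # reaches 1 after 111 steps
    print(len(collatz_trajectory(27)))   # 112 values, including the starting value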

 

 

PROVING FERMAT'S LAST THEOREM

 

Despite large prizes being offered for a solution, Fermat's Last Theorem remained unsolved for over three hundred years. It has the dubious distinction of being the theorem with the largest number of published false proofs: for example, over 1000 false proofs were published between 1908 and 1912. Using techniques based on Kummer's work, Fermat's Last Theorem was proved true, with the help of computers, for n up to 4,000,000 by 1993.

 

 

ONLINE DISTRIBUTED COMPUTATION PROJECTS

 

GIMPS

Prime numbers have long fascinated amateur and professional mathematicians. A prime number is an integer greater than 1 that is divisible only by 1 and itself; the first prime numbers are 2, 3, 5, 7, 11, etc. For example, the number 10 is not prime because it is divisible by 2 and 5. A Mersenne prime is a prime of the form 2^P - 1. The first Mersenne primes are 3, 7, 31, 127, etc. There are only 39 known Mersenne primes.

 

GIMPS, the Great Internet Mersenne Prime Search, was formed in January 1996 to discover new world-record-size Mersenne primes. GIMPS harnesses the power of thousands of ordinary personal computers to search for these "needles in a haystack".

 

The most fundamental component of the software is its implementation of the Lucas-Lehmer primality test. An efficient implementation became practical only with the advent of faster processors, as the period in which the project was started suggests.
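
A minimal Python sketch of the test (my own toy version; GIMPS performs the squarings with highly optimized FFT-based multiplication) is given below. For an odd prime p, the Mersenne number 2^p - 1 is prime exactly when s(p-2) is divisible by 2^p - 1, where s(0) = 4 and s(k+1) = s(k)^2 - 2:

    def lucas_lehmer(p):
        """Lucas-Lehmer test for the Mersenne number 2^p - 1, with p an odd prime."""
        m = (1 << p) - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    # Prime exponents below 20 giving Mersenne primes (2^11 - 1 = 2047 = 23 * 89 fails).
    print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])   # [3, 5, 7, 13, 17, 19]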

 

ECMNET

Richard Brent predicted in 1985, in a paper entitled Some Integer Factorization Algorithms using Elliptic Curves, that factors of up to 50 digits could be found by the Elliptic Curve Method (ECM). Indeed, in November 1995 Peter Montgomery found a 47-digit factor of 5^256+1, and in October 1997 Richard Brent set a new genuine record with a 48-digit factor of 24^121+1.

 

The original purpose of the ECMNET project was to make Brent's prediction come true, i.e. to find a factor of 50 digits or more by ECM. This goal was attained on September 14, 1998, when Conrad Curry found a 53-digit factor of the 150-digit composite cofactor (c150) of 2^677-1 using George Woltman's mprime program. The new goal of ECMNET is to find other large factors by ECM, mainly by contributing to the Cunningham project, most likely the longest ongoing computational project in history.
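
The method itself is easy to sketch. In Lenstra's Elliptic Curve Method one picks a random curve y^2 = x^3 + ax + b modulo n and a point on it, then multiplies the point by a highly composite integer; if a modular inverse fails along the way, its gcd with n exposes a factor. The Python code below is a heavily simplified illustration of mine (real implementations such as GMP-ECM or mprime use better curve parametrizations, projective coordinates and a second stage):

    import math
    import random

    class FactorFound(Exception):
        """Raised when a non-invertible denominator reveals a factor of n."""
        def __init__(self, d):
            self.d = d

    def inv_mod(x, n):
        g = math.gcd(x % n, n)
        if g != 1:
            raise FactorFound(g)
        return pow(x % n, -1, n)        # modular inverse (Python 3.8+)

    def ec_add(P, Q, a, n):
        """Add points on y^2 = x^3 + a*x + b over Z/nZ; None is the point at infinity."""
        if P is None:
            return Q
        if Q is None:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % n == 0:
            return None
        if P == Q:
            lam = (3 * x1 * x1 + a) * inv_mod(2 * y1, n) % n
        else:
            lam = (y2 - y1) * inv_mod(x2 - x1, n) % n
        x3 = (lam * lam - x1 - x2) % n
        return (x3, (lam * (x1 - x3) - y1) % n)

    def ec_mul(k, P, a, n):
        """Multiply the point P by k using repeated doubling and adding."""
        R = None
        while k:
            if k & 1:
                R = ec_add(R, P, a, n)
            P = ec_add(P, P, a, n)
            k >>= 1
        return R

    def ecm(n, bound=200, curves=100):
        """Try random curves; return a non-trivial factor of n, or None if unlucky."""
        for _ in range(curves):
            a = random.randrange(n)
            P = (random.randrange(n), random.randrange(n))   # b is implied by the point
            try:
                for k in range(2, bound):                    # overall multiplier is (bound-1)!
                    P = ec_mul(k, P, a, n)
                    if P is None:
                        break
            except FactorFound as f:
                if 1 < f.d < n:
                    return f.d
        return None

    print(ecm(2003 * 1000003))     # expected to print a non-trivial factor, e.g. 2003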

 


REFERENCES

 

http://www-gap.dcs.st-and.ac.uk/~history/Indexes/Number_Theory.html

http://www.msri.org/publications/ln/msri/2000/introant/shallit/2/

http://mathforum.org/library/topics/alg_nt/

http://www.ieeta.pt/~tos/hobbies.html

http://www.math.niu.edu/~rusin/known-math/index/11-XX.html

http://www.math.utah.edu/~alfeld/math/equations/equations.html

http://www.mathpages.com/home/inumber.htm

http://archives.math.utk.edu/topics/numberTheory.html

http://db.cwi.nl/projecten/project.php4?prjnr=84

http://db.cwi.nl/rapporten/index.php?prjnr=84

http://www.loria.fr/~zimmerma/records/aliquot.html

http://www-groups.dcs.st-and.ac.uk/~history/HistTopics/Fermat's_last_theorem.html

http://www.loria.fr/~zimmerma/records/ecmnet.html