## Lecture #7: The difficulty of arithmetic

I remarked that China had announced within the last 10 days that all encryption products must be registered with the Bureau of State Security, and that the Bureau would not even discuss how these encryption products would be regulated. I further noted that many current commercial products, such as spreadsheet programs, store their data in an encrypted format, and that the effect of this new regulation on the use of such products was not clear. It seems that China (and other countries) is rather scared of encryption. In just a few weeks, this course would show us what was scaring these countries.

First we will try to make precise what "easy" and "hard" computations are. The systematic study of the difficulty of computation, called "computational complexity", only began in the late twentieth century. I said we would use some simple arithmetic ideas as building blocks, analyzing how much work addition and multiplication take. I handed out about 12 sheets with arithmetic computations on them [PDF|PS|TeX]; I had filled in the blanks on the sheets with a variety of numbers whose sizes ranged from 2 digits to about 8 digits. I had pairs of students work on each exercise: one did the computation while the other counted the number of elementary operations needed. I emphasized that I was not really interested in the answers to the computations, but in how many elementary operations were needed; these I defined as references to the one-digit addition and multiplication tables. There were several relevant questions during the exercise. I asserted that this was one way to measure how "hard" a computation is. Many students found the question different from anything they had previously encountered.
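The bookkeeping the student pairs did can be sketched in code. The following is a minimal sketch (the function name and the exact tallying rules are my own assumptions, not the handout's): it performs grade-school addition of two decimal strings while counting each reference to the one-digit addition table.

```python
def add_with_count(a: str, b: str):
    """Grade-school addition of two decimal strings.
    Counts each reference to the one-digit addition table
    (a hypothetical bookkeeping scheme, not the handout's exact rules)."""
    a, b = a[::-1], b[::-1]  # work from the least significant digit
    lookups = 0
    carry = 0
    digits = []
    for i in range(max(len(a), len(b))):
        da = int(a[i]) if i < len(a) else 0
        db = int(b[i]) if i < len(b) else 0
        s = da + db        # one reference to the addition table
        lookups += 1
        if carry:
            s += carry     # a second lookup when a carry comes in
            lookups += 1
        digits.append(s % 10)
        carry = s // 10
    if carry:
        digits.append(carry)
    return "".join(str(d) for d in reversed(digits)), lookups

print(add_with_count("1234", "5678"))  # → ('6912', 6)
```

With one lookup per digit position plus one per incoming carry, two numbers of sizes m and n need at most about m + n lookups, which is the estimate we arrived at on the board.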

Then we entered the results systematically on the board. I tried to get an overestimate of the number of one-digit operations required for addition and multiplication. It turned out that this depended on the length or size (I will use the latter term) of the numbers rather than on the numbers themselves. If m and n were the sizes of the two numbers, then addition took about m+n operations, while multiplication took about mn one-digit multiplications and about another 4mn additions. Doubling a number of size m took 2m operations, and squaring it took about 5m² operations.
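The mn count for one-digit multiplications can be checked directly. Here is a sketch (my own counting scheme, and it tallies only multiplication-table lookups, not the roughly 4mn additions) of grade-school multiplication that multiplies every digit of one number by every digit of the other:

```python
def multiply_with_count(a: str, b: str):
    """Grade-school multiplication of two decimal strings.
    Tallies only the one-digit multiplication-table lookups;
    the additions needed to combine partial products are not counted."""
    mults = 0
    total = 0
    for i, da in enumerate(reversed(a)):
        for j, db in enumerate(reversed(b)):
            total += int(da) * int(db) * 10 ** (i + j)  # one table lookup
            mults += 1
    return total, mults

print(multiply_with_count("123", "45"))  # → (5535, 6)
```

A size-3 number times a size-2 number uses exactly 3 × 2 = 6 one-digit multiplications, matching the mn estimate from the board.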

I then asked the following sort of question: if I had a "machine" that squared numbers, and it took 1 second to square a 1,000-digit number (or something like this), how long would it take to square a 2,000-digit number? This led to a considerable amount of discussion, during which we had to work through questions of proportionality. Many students found this difficult, and I think, unfortunately, some never quite understood it.
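The proportionality reasoning can be made concrete. Under the 5m² cost model for squaring, the running time grows with the square of the size, so the answer does not depend on the constant 5 at all, only on the ratio of the sizes (the function below is my own illustration, not part of the class discussion):

```python
def predicted_time(base_digits: int, base_seconds: float, new_digits: int) -> float:
    """Under a cost model proportional to m**2 (such as the 5*m**2
    count for squaring), time scales with the square of the size."""
    return base_seconds * (new_digits / base_digits) ** 2

# 1 second for 1,000 digits: doubling the size quadruples the time.
print(predicted_time(1000, 1.0, 2000))  # → 4.0
```

So the 2,000-digit square takes about 4 seconds, not 2: doubling the input size quadruples the work.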

Then I tried to analyze a "square inquiry" machine, whose input was an integer and whose output, one of the words {Yes, No}, answered the question, "Is the input a square?" There was considerable debate about how to arrange such a machine and how fast it would run. Some sophisticated suggestions were made, but I wanted only simple answers, because those would be easy to analyze. Finally we decided that the machine would take each number up to X (the input, an integer of size m), square it, see if the result equaled X, and then reply. This would take about 5m²·X computations. I did not get a chance to analyze the result of changing the size of the input to this computation.
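The simple machine we settled on looks like this in code (a sketch of the class's decision; the function name is mine). It tries every candidate up to X, squaring each one, which is where the roughly X squarings of about 5m² one-digit operations each come from:

```python
def is_square_naive(x: int) -> str:
    """The class's simple square-inquiry machine: square every number
    from 0 up to X and compare with X.  For an input X of size m this
    is about X squarings of ~5*m**2 one-digit operations each,
    i.e. roughly 5*m**2*X operations in all."""
    for k in range(x + 1):
        if k * k == x:   # one squaring of a number of size at most m
            return "Yes"
    return "No"

print(is_square_naive(49))  # → Yes
print(is_square_naive(50))  # → No
```

Note that X itself, not just its size m, appears in the operation count: a number of size m can be as large as about 10^m, which already hints that this simple machine becomes very slow as the input grows.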