Monday, September 07, 2009

Independent Study: Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I

(This is one of a series of posts about papers I'm reading for an independent study with Prof. Evan Chang at the University of Colorado, Boulder. The format of this and subsequent posts will be similar to that of a review of a paper submitted to a conference. These are already-published papers, so I'll be writing with the obvious benefit of hindsight.)

Submission: Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I

Reviewer: Nicholas Dronen

Please give a brief, 2-3 sentence summary of the main ideas of this paper:
The author presents the basic symbols, syntax, and semantics of a Turing-complete programming language, LISP, providing compelling examples of the language's expressiveness. He then discusses how it is implemented on the IBM 704 computer.
What is the strength of this paper (1-3 sentences):
It describes a new programming language, based in part on Church's lambda calculus, that can be applied in both practical and theoretical domains.
What is the weakness of this paper (1-3 sentences):
This is such a seminal paper in the PL literature that it's hard to find fault with it. It is interesting that the author doesn't himself discuss deficiencies in the language or its implementation.
Evaluation
This paper is excellent. The author explains his ideas clearly and comprehensively.

The contemporary reader does have to adjust his vocabulary at times, because McCarthy's terminology is outdated — for example, where a contemporary paper would use the term memory cells, McCarthy says registers.
Novelty
To my knowledge, the only comparable list-processing language at the time of publication was IPL, and LISP was a big improvement over it. LISP's garbage collection is extremely useful to programmers, freeing them from having to manage memory explicitly.
Convincing
Extremely.
Worth Solving
This paper is seminal in part because it solves several problems at once.
Confidence
How confident are you in your evaluation of this paper?
Detailed Comments

This paper is exemplary in the literature of programming languages. It sets the standard for the formal definition of a programming language: present the basic symbols and syntax, define the semantics, discuss implementation issues.

I'm fascinated by the fact that the LISP function apply "plays the theoretical role of a universal Turing machine and the practical role of an interpreter." Since LISP is based at least in part on the lambda calculus, and the untyped lambda calculus is Turing-complete, it's fair to assume that the universality of apply follows as a consequence, though I'm not capable of providing a proof, nor, if I were, would I have time to do so. However, it's clear that the built-in substitution function subst plays the same role as substitution in the lambda calculus. Another connection between LISP and formal representations of programming languages is that the function eval seems to serve as an operational semantics of the language. The signature of the function is eval[e; a], where e is an expression to be evaluated and a is a list of pairs, each pairing an atomic symbol with the expression for which that symbol stands.
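To make the shape of eval[e; a] concrete, here is a minimal sketch in Common Lisp of an evaluator in this style. The names my-eval, my-apply, and eval-cond, and the tiny subset of forms handled (quoted data, symbols looked up in the association list a, cond, and lambda application), are my own simplifications for illustration, not McCarthy's M-expression definitions.

    ;; A minimal sketch of McCarthy-style eval[e; a]: e is an expression,
    ;; a is an association list pairing atomic symbols with their values.
    (defun my-eval (e a)
      (cond ((member e '(t nil)) e)                     ; T and NIL denote themselves
            ((atom e) (cdr (assoc e a)))                ; look the symbol up in a
            ((eq (car e) 'quote) (cadr e))              ; (QUOTE x) evaluates to x
            ((eq (car e) 'cond) (eval-cond (cdr e) a))  ; conditional expression
            (t (my-apply (car e)
                         (mapcar (lambda (arg) (my-eval arg a)) (cdr e))
                         a))))

    (defun eval-cond (clauses a)
      (if (my-eval (caar clauses) a)
          (my-eval (cadar clauses) a)
          (eval-cond (cdr clauses) a)))

    (defun my-apply (f args a)
      (cond ((eq f 'car)  (caar args))
            ((eq f 'cdr)  (cdar args))
            ((eq f 'cons) (cons (car args) (cadr args)))
            ((eq f 'atom) (atom (car args)))
            ((eq f 'eq)   (eq (car args) (cadr args)))
            ((atom f)     (my-apply (cdr (assoc f a)) args a))  ; named function from a
            ((eq (car f) 'lambda)                               ; ((LAMBDA (params) body) args...)
             (my-eval (caddr f) (append (pairlis (cadr f) args) a)))))

    ;; (my-eval '(cons x (quote (b c))) '((x . a)))  =>  (A B C)

Even in this toy form, the association list a does exactly the job described above: it carries the bindings that give atomic symbols their meaning.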

And, practically speaking, apply is a function implemented in LISP that interprets LISP, which highlights the bootstrap problem that early computer science students grapple with, if only briefly: how do you compile a compiler? One can also view apply as a virtual machine for LISP. The language's use of Polish notation reminds me of the stack-oriented nature of the Java virtual machine.

One of the questions I had from last week's paper is whether LISP was, as that paper claimed, actually capable of performing symbolic differentiation of functions. This paper provides a concrete answer: LISP can indeed compute the derivative of a symbolic expression with respect to a variable. The definition is surprisingly simple: the derivative of the variable itself is ONE, the derivative of any other atom is ZERO, and sums and products are handled recursively by the sum and product rules. So, for example, the derivative of (PLUS 9 y) with respect to y is (PLUS 0 1), and the derivative of (TIMES 5 z) with respect to z is (PLUS (TIMES 0 z) (TIMES 5 1)), both of which a simplifier could reduce further.
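As a sketch of how little machinery this takes, here is a differentiator in Common Lisp restricted to atoms and binary PLUS and TIMES. The name deriv and the binary restriction are my simplifications, not the paper's diff, which handles n-ary sums and products.

    ;; Symbolic differentiation sketch: sum rule and product rule only.
    (defun deriv (e x)
      (cond ((atom e) (if (eq e x) 1 0))        ; dx/dx = 1; any other atom is a constant
            ((eq (car e) 'plus)                 ; sum rule
             (list 'plus (deriv (cadr e) x) (deriv (caddr e) x)))
            ((eq (car e) 'times)                ; product rule
             (list 'plus
                   (list 'times (deriv (cadr e) x) (caddr e))
                   (list 'times (cadr e) (deriv (caddr e) x))))
            (t (error "unsupported expression: ~S" e))))

    ;; (deriv '(plus 9 y) 'y)   =>  (PLUS 0 1)
    ;; (deriv '(times 5 z) 'z)  =>  (PLUS (TIMES 0 Z) (TIMES 5 1))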

Some of the built-in functions are named poorly: car is a mnemonic for "contents of the address part of register" and cdr stands for "contents of the decrement part of register." Today a language designer would give those functions names in harmony with their semantics, not their implementation. Later dialects partly corrected this deficiency: Common LISP provides first and rest as synonyms for car and cdr, while Scheme keeps the original names.
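For example, in Common Lisp the two pairs of names are interchangeable:

    ;; FIRST and REST are standard Common Lisp synonyms for CAR and CDR.
    (first '(a b c))   ; => A,     same as (car '(a b c))
    (rest  '(a b c))   ; => (B C), same as (cdr '(a b c))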

Garbage collection does not occur at fixed intervals. Rather, it occurs when the program needs a cell but the free-storage list has been exhausted; the collector then does a simple reachability analysis and reclaims everything unreachable. In the IBM 704, characters are 6 bits and a machine word is 36 bits, so a word holds six characters. A string of characters like "DIFFERENTIATE" is represented by a linked list of structures: the first field of each structure points to a memory cell holding up to six characters, and the second field points to the cell of the next structure. The string "DIFFERENTIATE" thus decomposes into cells containing "DIFFER", "ENTIAT", and "E".
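To make the packing concrete, here is a small Common Lisp sketch that splits a symbol's print name into the six-character chunks that would each occupy one 36-bit word. The name chunk-print-name is mine, purely for illustration.

    ;; Split a print name into six-character chunks, one per 36-bit word
    ;; (six 6-bit characters), as in the representation described above.
    (defun chunk-print-name (symbol &optional (chars-per-word 6))
      (let ((name (symbol-name symbol)))
        (loop for start from 0 below (length name) by chars-per-word
              collect (subseq name start
                              (min (length name) (+ start chars-per-word))))))

    ;; (chunk-print-name 'differentiate)  =>  ("DIFFER" "ENTIAT" "E")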
