
Compiling LISP Procedures

Dr. Bruce A. Pumplin
California State University, Chico
Chico, California 95929

SIGART Newsletter, January 1987, Number 99

1. Introduction

The programming language LISP is frequently the language of choice for the implementation of artificial intelligence (AI) systems, even though it has earned a reputation for being slow. This lack of speed is usually due to the fact that commonly available LISP implementations are interpretive in nature rather than compile and go. And we all know that interpreting source code is invariably slower than simply executing the compiled object code.

Another factor contributing to the slowness of modern LISP systems is that many of the applications coded in LISP are themselves interpreters which process a higher-level, application-oriented "language." Examples of such applications include programs to solve simple geometric analogy problems, programs to control robots which move blocks around in a three-dimensional world, and programs which understand small subsets of English [33-35, especially Winston 1984].

Thus, oftentimes AI applications are slowed down by two layers or levels of interpretation. The LISP program interprets the statements in the application-oriented language while the LISP program itself is being interpreted by the underlying LISP system. The result, of course, is long program execution times. And if the LISP system is running on a relatively slow microcomputer system, as is more and more often the case, the end result is extremely long execution times.

Sometimes these doubly-interpretive systems are so slow that the results produced by the running application program are only available AFTER they are needed. In real-time AI applications, such systems are clearly not acceptable.

One way to improve the execution times of LISP programs is to compile the LISP code before executing it. Section 2 below covers the necessary background information and lists the assumptions made in the work reported on here. Section 3 defines the subset of LISP to be compiled and presents the compiler in its source form. Section 4 indicates follow-up work already completed or in progress. Section 5 reviews and summarizes the ideas about compiling LISP procedures presented here.

Information on how to compile LISP procedures is not now readily available in an easy-to-understand format. Texts on LISP [1-14] rarely treat the subject. Most are content to present interpreters for LISP (usually a very small subset of the full language) written in LISP. More general texts on programming languages [15-24] and compiler writing [25-32] also rarely even mention the possibility of compiling LISP procedures.

An exception to the general lack of information about compiling LISP is one of the seminal works devoted to LISP by Allen [Allen 1978]. Unfortunately most readers find Allen's treatment of the subject very heady going. One of the aims of the present work is to make Allen's treatment more easily accessible to the ever growing LISP community. Another aim is to bring the treatment up to date. Yet another aim is to extend it in two ways -- to enlarge the size of the subset of the language being compiled and to generate target code for a number of different target machines.

--- The Architecture of the Target Machine

The target machine is assumed to have a conventional architecture. It consists of a main storage unit connected to a central processor with one or more accumulator-type registers. The architecture is powerful enough to simulate a stack data structure. The availability of at least a simulated stack is of first-order significance in processing a language such as LISP, which is heavily dependent on recursion as its fundamental control structure.

--- The Instruction Set of the Target Machine

The instruction set of the target machine is quite modest. It is shown in Figure 1. The accumulator registers are denoted by R1, R2, R3, etc. A main memory word is denoted by LOC. A stack pointer indicating the current top element of the stack is denoted by SP. CONST denotes a constant.

MOVEI Ri CONST   Load register i with a constant
                 (the load immediate instruction)
PUSH SP Ri       Push the contents of Ri onto the stack
POP SP Ri        Pop the top of the stack into Ri
PUSHJ SP LOC     The subroutine entry instruction:
                 push the current contents of the program counter
                 onto the stack and branch to memory location LOC
CALL LOC         The same as PUSHJ SP LOC
POPJ SP          The subroutine return instruction:
                 pop the top of the stack into the program counter
MOVE Ri LOC      Load Ri with the contents of LOC
MOVE Ri Rj       Load Ri with the contents of Rj
MOVEM Ri LOC     Store the contents of Ri in memory location LOC
JUMP LOC         Unconditional branch to memory location LOC
JUMPF Ri LOC     Branch to memory location LOC if Ri contains
                 the constant False
JUMPT Ri LOC     Branch to memory location LOC if Ri does not
                 contain the constant False
                 (Note: anything other than False is assumed to be True)

Figure 1

The target machine instruction set is more than adequate to support the first subset of LISP to be compiled. Larger subsets of LISP will more fully utilize the complete power of the underlying target machine instruction set.

3. Compiling LISP Procedures

--- Introduction

We begin by considering a small, but very important, subset of LISP. After developing a compiler for this subset, additional LISP language features could be added, one or two at a time, until eventually a compiler for full LISP is obtained. This report is concerned only with the general ideas and specific details necessary to compile an initial subset of LISP. Future reports will discuss compilers capable of handling even larger subsets of the language.

--- The First Subset of LISP



The first compiler will be able to compile the following subset of LISP: function application expressions with constant or nested function call arguments. The short BNF grammar shown in Figure 2 formally defines the first subset of LISP to be compiled.

<expression>  ::=  ( <function> <arg-1> ... <arg-n> )
<function>    ::=  <identifier>
<arg-1>       ::=  <constant> | <expression>

Figure 2

Note that Figure 2 uses an extended BNF notation. The three dots are intended to mean "any number of occurrences of." Also note that the grammar is recursive -- that is, the syntactic unit <expression> is defined in terms of <arg-1>, which is defined in terms of <expression>. This circular definition expresses precisely and concisely the structure of the type of LISP expressions making up the first subset of LISP to be compiled.

An example of an expression defined by the grammar of Figure 2 is

( f ( g A ) ( h B ) )

where f, g, and h are assumed to be the names of user-defined or built-in procedures and A and B are constant arguments. Using the LISP convention of quoting constants and reverting to a case-insensitive notation, the expression to be compiled becomes

( F ( G (QUOTE A) ) ( H (QUOTE B) ) )

--- An Aside (Or Four)

Four additional assumptions need to be presented before the compiler can be discussed.

First, the LISP procedure to be compiled is assumed to be syntactically and semantically correct, although the latter is not really important. This assumption is justified by recognizing that the compiler for LISP procedures is itself a procedure built into an otherwise conventional LISP system. (See assumptions two and three below.) The primary result of this assumption is that the compiler for LISP procedures does not have to do any error checking.

Second, the LISP procedure to be compiled is assumed to be in its usual LISP form -- that is, it is a binary tree. Presumably it was developed using a standard LISP system. An important result of this assumption is that the compiler for LISP procedures does not have to do either a lexical analysis or a syntax analysis of the procedure to be compiled. Since the compiler starts with the equivalent of a conventional parse tree, all it really has to be concerned with is the traditional code generation phase associated with the compilation of ordinary high-level languages.

Third, the language used to implement the compiler is going to be LISP. The result of this assumption is that the LISP compiler is just another LISP procedure. Which raises an interesting possibility -- once the compiler is available it could be used to compile itself. The result should be a fast LISP compiler.

Fourth, the target language is also LISP. That is, the first LISP compiler will generate LISP code. LISP aficionados will understand this immediately -- others, please be tolerant and patient. Future reports will incrementally lead to a LISP compiler, written in LISP, which will generate native machine code.

--- Run-time Function Invocation Conventions

Before the compiler for LISP procedures can be written, the conventions associated with run-time function calls must be defined. These conventions must include how to compute argument values, how to pass argument values to functions, and how functions are to return the values they compute.

An argument value will be computed using the available cpu registers and then temporarily stored on top of the stack (so that the cpu registers can be used to compute the next argument value). The final argument value, say argument value n, will not be pushed onto the stack only to be immediately popped off again. Instead, the final argument value will be computed and moved directly into register Rn. See the next paragraph.

Argument values will be passed in the cpu registers -- argument value 1 in R1, argument value 2 in R2, and so forth. Hence, after all of the argument values have been computed and stacked (except argument value n, of course) they must be popped off of the stack and into the correct cpu registers.

Note: the business about not stacking argument value n is, in fact, a first attempt at optimizing the code produced by the compiler. If all argument values were treated the same way, the result would be that immediately after computing the final argument value and pushing it onto the stack it would just have to be popped off of the stack into register Rn. The convention adopted here will save one pair of push-pop instructions. Not much of a saving? The saving is one pair of push-pop instructions FOR EVERY FUNCTION CALLED! The saving is positively worth the effort to achieve it.

Function values are to be returned in R1. When a function is called, it is free to use the cpu registers any way it needs to without having to restore them before returning. If any of the cpu registers contain information to be used after the function call returns, this information must be stacked before the function is called and then restored after the function returns.
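As a concrete illustration (not part of the original article), these conventions dictate the following instruction sequence for a hypothetical three-argument call ( F (QUOTE A) (QUOTE B) (QUOTE C) ) in which all three arguments are constants; it is also exactly the sequence the compiler developed in the next subsection generates for that expression:

(MOVEI 1 (QUOTE A))    ; compute argument 1 in R1 ...
(PUSH SP 1)            ; ... and park it on the stack
(MOVEI 1 (QUOTE B))    ; compute argument 2 in R1 ...
(PUSH SP 1)            ; ... and park it on the stack
(MOVEI 1 (QUOTE C))    ; the final argument is computed but never stacked
(MOVE 3 1)             ; argument 3 belongs in R3
(POP SP 2)             ; argument 2 into R2
(POP SP 1)             ; argument 1 into R1
(CALL F)               ; F's value comes back in R1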

--- The First Compiler for LISP Procedures

The compiler for the initial subset of LISP is shown in Figure 3. The compiler is presented in a subset of LISP which should run on virtually all of the LISP implementations in existence. In particular, you do not need a version of COMMON LISP [Steele 1984] to implement the compiler shown here.

The compiler comprises the main procedure COMPEXP (COMPile EXPression) and two subsidiary procedures COMPLIS (COMPile argument LiSt) and COMPAPPLY (COMPile a function APPLication).

::: THE PRIMARY PROCEDURES

(DEFUN COMPEXP (EXP)
  (COND ((ISCONST EXP)
         (LIST (MKSEND 1 EXP)))
        (T (COMPAPPLY (FUNC EXP)
                      (COMPLIS (ARGLIST EXP))
                      (LENGTH (ARGLIST EXP))))
  ))

(DEFUN COMPLIS (U)
  (COND ((NULL U) '())
        ((NULL (REST U))
         (COMPEXP (FIRST U)))
        (T (APPEND-3 (COMPEXP (FIRST U))
                     (LIST (MKALLOC 1))
                     (COMPLIS (REST U))))
  ))

(DEFUN COMPAPPLY (FN VALS N)
  (APPEND-3 VALS
            (MKLINK N)
            (LIST (MKCALL FN))
  ))

Figure 3
The highest-level procedure, COMPEXP, accepts the expression to be compiled and returns a list representing the sequence of target language instructions to be carried out in order to achieve the effect of evaluating the original input expression. COMPEXP begins by calling ISCONST to see if the expression to be compiled is a constant. If so, COMPEXP calls on MKSEND to generate a "move immediate" instruction. If the expression handed to COMPEXP is not a constant then it must be a function application. COMPEXP calls COMPLIS to compile the argument list and then calls COMPAPPLY to compile the actual function call.

The subsidiary procedure COMPLIS accepts an argument list to be compiled and returns a list representing the sequence of target language instructions to be carried out in order to achieve the effect of evaluating the original argument list. COMPLIS begins by checking to see if the argument list handed to it is empty. (Note: COMPLIS calls itself and this test is necessary to prevent an infinite recursion.) If so, COMPLIS simply returns an empty list -- that is, the compilation of an empty argument list is an empty list of target language instructions.

If the argument list is not empty, COMPLIS next checks to see if it is a list of one element. If so, COMPLIS calls COMPEXP to compile the one argument. COMPLIS then simply returns the list of target language instructions returned to it by COMPEXP.

If the argument list consists of two or more elements, COMPLIS calls on COMPEXP to compile the first argument and then calls itself recursively to compile the rest of the argument list. COMPLIS returns the list of target language instructions that will evaluate the original argument list.

COMPAPPLY accepts a function name, a list of target language instructions that will evaluate an argument list, and an integer equal to the length of the original argument list. COMPAPPLY returns the list of target instructions handed to it, extended with target instructions that (1) will ensure that the argument values are all in the correct cpu registers, and (2) will then call the function. The list of target language instructions returned by COMPAPPLY is the compiled form of the original expression handed to COMPEXP.
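As a worked example (hand-evaluated from Figure 3; not part of the original article), a flat call with two constant arguments compiles as follows:

(COMPEXP '( F (QUOTE A) (QUOTE B) ))
=>
( (MOVEI 1 (QUOTE A))    ; argument 1, computed into R1 ...
  (PUSH SP 1)            ; ... and parked on the stack by COMPLIS
  (MOVEI 1 (QUOTE B))    ; argument 2, the last, stays in R1
  (MOVE 2 1)             ; MKLINK: argument 2 belongs in R2
  (POP SP 1)             ; argument 1 back into R1
  (CALL F)               ; call F; its value is returned in R1
)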
In the spirit of good LISP programming style, the three primary procedures of the compiler are supported by a "recognizer" procedure (ISCONST), two "selector" procedures (FUNC and ARGLIST), and seven "constructor" procedures (MKSEND, MKALLOC, MKCALL, MKLINK, MKLINK1, MKPOP, and MKMOVE). The seven constructor procedures are the target code generation procedures. The definitions for these ten recognizer, selector, and constructor procedures are shown in Figure 4.

::: THE RECOGNIZER PROCEDURE

(DEFUN ISCONST (X)
  (OR (NUMBERP X)
      (EQ X T)
      (EQ X NIL)
      (AND (NOT (ATOM X))
           (EQ (FIRST X) 'QUOTE))
  ))

::: THE SELECTOR PROCEDURES

(DEFUN FUNC (X) (FIRST X))
(DEFUN ARGLIST (X) (REST X))

::: THE CONSTRUCTOR PROCEDURES

(DEFUN MKSEND (DEST VAL) (LIST 'MOVEI DEST VAL))
(DEFUN MKALLOC (DEST) (LIST 'PUSH 'SP DEST))
(DEFUN MKCALL (FN) (LIST 'CALL FN))
(DEFUN MKLINK (N)
  (COND ((= N 1) '())
        (T (CONCAT (MKMOVE N 1)
                   (MKLINK1 (SUB1 N))))
  ))
(DEFUN MKLINK1 (N)
  (COND ((ZEROP N) '())
        (T (CONCAT (MKPOP N)
                   (MKLINK1 (SUB1 N))))
  ))
(DEFUN MKPOP (N) (LIST 'POP 'SP N))
(DEFUN MKMOVE (DEST VAL) (LIST 'MOVE DEST VAL))

Figure 4

The recognizer procedure ISCONST checks to see if the argument handed to it is a constant. ISCONST recognizes the following constants: numbers, the LISP atom T, the LISP atom NIL, and any quoted expression.

The selector procedure FUNC simply returns the first element of the list handed to it. The selector procedure ARGLIST simply deletes the first element of the list handed to it and returns the rest.

The seven constructor procedures are used to generate the actual target language instructions. These procedures contain all of the machine-specific details known to the compiler. (Note: The names of these procedures are those used by Allen -- they relate to earlier, more abstract material in the book [Allen 1978].)

MKSEND generates a move immediate instruction.
MKALLOC generates a push instruction.
MKCALL generates a function call instruction.
MKLINK and MKLINK1 together generate the requisite MOVE POP POP ... POP sequence to put the computed argument values where they belong just prior to an actual function call.
MKPOP generates a pop instruction.
MKMOVE generates a load instruction.
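A few hand-evaluated calls (again, not part of the original article) show what the Figure 4 procedures return:

(ISCONST 7)               =>  T
(ISCONST '(QUOTE A))      =>  T
(ISCONST '(G (QUOTE A)))  =>  NIL      ; a function application, not a constant
(FUNC '(F X Y))           =>  F
(ARGLIST '(F X Y))        =>  (X Y)
(MKSEND 1 5)              =>  (MOVEI 1 5)
(MKLINK 3)                =>  ((MOVE 3 1) (POP SP 2) (POP SP 1))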
The compiler is written in a version of LISP which includes the procedures FIRST, REST, CONCAT, APPEND-3, and LISTP. All of these procedures may or may not be available on a particular LISP system. They weren't available on the author's system, UTLISP (University of Texas LISP) running on a CDC CYBER mainframe. Hence, again in the spirit of good LISP programming style, they were simply defined in terms of the primitive procedures actually available. The definitions for these five auxiliary procedures are shown in Figure 5.



::: THE AUXILIARY PROCEDURES

(DEFUN FIRST (X) (CAR X))
(DEFUN REST (X) (CDR X))
(DEFUN CONCAT (ELEMENT SEQUENCE)
  (COND ((LISTP SEQUENCE)
         (CONS ELEMENT SEQUENCE))
        (T '())
  ))
(DEFUN APPEND-3 (L1 L2 L3)
  (APPEND L1 (APPEND L2 L3))
)
(DEFUN LISTP (X)
  (COND ((CONSP X) T)
        ((NULL X) T)
        (T NIL)
  ))

Figure 5

--- A Sample Compilation

With all of the compiler procedures (finally!) defined, the LISP expression

( F ( G (QUOTE A) ) ( H (QUOTE B) ) )

can be compiled with the LISP command

(COMPEXP '( F ( G (QUOTE A) ) ( H (QUOTE B) ) ))

The compiled form of the expression generated by COMPEXP is a list of eight target language instructions:

( (MOVEI 1 (QUOTE A))
  (CALL G)
  (PUSH SP 1)
  (MOVEI 1 (QUOTE B))
  (CALL H)
  (MOVE 2 1)
  (POP SP 1)
  (CALL F)
)

--- Executing The Target Code

A trace of the execution of this compiled code is shown in Figure 6.

STEP  INSTRUCTION            R1               R2     STACK (Top ... Bottom)
 0                           ??               ??     ---
 1    (MOVEI 1 (QUOTE A))    A                ??     ---
 2    (CALL G)               A                ??     Ret-Add
                             (G A)            ??     ---
 3    (PUSH SP 1)            (G A)            ??     (G A)
 4    (MOVEI 1 (QUOTE B))    B                ??     (G A)
 5    (CALL H)               B                ??     Ret-Add (G A)
                             (H B)            ??     (G A)
 6    (MOVE 2 1)             (H B)            (H B)  (G A)
 7    (POP SP 1)             (G A)            (H B)  ---
 8    (CALL F)               (G A)            (H B)  Ret-Add
                             (F (G A) (H B))  ??     ---

Figure 6

Referring to Figure 6, initially (Step 0) the values of register 1 (R1) and register 2 (R2) are unknown and the stack is empty.

The first instruction (MOVEI 1 (QUOTE A)) moves the constant A into R1.

Immediately after the second instruction (CALL G) has been executed, the return address will have been pushed onto the stack and execution will continue with the code for the function G. Eventually G should execute a subroutine return instruction, the effect of which will be to pop the return address off of the stack into the program counter. Execution will continue with instruction 3 of Figure 6. Note that the function G has left its value (G A) in R1.

The third instruction (PUSH SP 1) pushes the value in R1 onto the stack.

The fourth instruction (MOVEI 1 (QUOTE B)) moves the constant B into R1.

Immediately after the fifth instruction (CALL H) has been executed, the return address will have been pushed onto the stack and execution will continue with the code for the function H. Eventually H should execute a subroutine return instruction, the effect of which will be to pop the return address off of the stack into the program counter. Execution will continue with instruction 6 of Figure 6. Note that the function H has left its value (H B) in R1.

The sixth instruction (MOVE 2 1) copies the value in R1 into R2.

The seventh instruction (POP SP 1) pops the value on top of the stack into R1. After this instruction has been executed, R1 will contain the value of the first argument to the function F and R2 will contain the value of the second argument to F.

The eighth instruction (CALL F) invokes the function F with argument values (G A) and (H B). The value computed by F using these argument values is returned in R1.
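Because the target code is itself LISP data, it can also be executed mechanically rather than by hand. The following is a minimal sketch of such a simulator written in present-day Common Lisp; it is not part of the original article, and the names RUN-CODE, CONST-VALUE, REG, SET-REG, and *REGISTERS* are invented here. CALL is simulated by applying an ordinary LISP function to the values already placed in R1 .. Rn, so no return address is actually pushed.

;;; A sketch only -- not part of the original article.
(DEFPARAMETER *REGISTERS* (MAKE-ARRAY 9 :INITIAL-ELEMENT NIL))

(DEFUN REG (I) (AREF *REGISTERS* I))
(DEFUN SET-REG (I V) (SETF (AREF *REGISTERS* I) V))

(DEFUN CONST-VALUE (X)
  ;; A MOVEI operand is either a quoted form such as (QUOTE A) or a
  ;; self-evaluating constant such as 5, T, or NIL.
  (COND ((AND (CONSP X) (EQ (FIRST X) 'QUOTE)) (SECOND X))
        (T X)))

(DEFUN RUN-CODE (CODE ARITIES)
  ;; CODE is a list of instructions as produced by COMPEXP.  ARITIES is an
  ;; alist such as ((G . 1) (H . 1) (F . 2)) giving the number of arguments
  ;; each called function expects; the compiled code itself does not record
  ;; this, so every called function must appear in the alist.
  (LET ((STACK '()))
    (DOLIST (INSTR CODE (REG 1))              ; the result is left in R1
      (CASE (FIRST INSTR)
        (MOVEI (SET-REG (SECOND INSTR) (CONST-VALUE (THIRD INSTR))))
        (PUSH  (PUSH (REG (THIRD INSTR)) STACK))
        (POP   (SET-REG (THIRD INSTR) (POP STACK)))
        (MOVE  (SET-REG (SECOND INSTR) (REG (THIRD INSTR))))
        (CALL  (LET* ((FN (SECOND INSTR))
                      (N  (CDR (ASSOC FN ARITIES)))
                      (ARGS '()))
                 (DO ((I N (- I 1))) ((< I 1))   ; gather R1 .. Rn in order
                   (PUSH (REG I) ARGS))
                 (SET-REG 1 (APPLY FN ARGS))))))))

With ordinary LISP definitions standing in for F, G, and H -- say (DEFUN G (X) (LIST 'G X)), (DEFUN H (X) (LIST 'H X)), and (DEFUN F (X Y) (LIST 'F X Y)) -- the sample compilation above runs to completion:

(RUN-CODE (COMPEXP '( F ( G (QUOTE A) ) ( H (QUOTE B) ) ))
          '((G . 1) (H . 1) (F . 2)))
=>  (F (G A) (H B))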
4. FUTURE WORK

--- Introduction

The procedure COMPEXP developed above is capable of compiling a small but important subset of LISP into a LISP pseudo-code. The development has been discussed in depth in order to make clear the necessary background material and to present the assumptions and conventions adopted. Work already completed and work in progress is concerned with extending COMPEXP in a number of dimensions.

--- Compilers For Larger Subsets of LISP

Compilers for larger subsets of LISP have already been completed. The first extension was to add the capability of compiling LISP conditional expressions. These expressions have the general form

(COND (P1 E1) (P2 E2) ... (Pn En))

and are the LISP equivalent of the control structure

if P1 then E1 else
if P2 then E2 else
...
if Pn then En
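The article does not show how conditional expressions are compiled, but the JUMPF and JUMP instructions of Figure 1 suggest the general shape such target code might take. The following is speculation, not the author's actual extension; the label atoms L1 and LEXIT are invented for the illustration:

( ...instructions leaving the value of P1 in R1...
  (JUMPF 1 L1)        ; P1 is False: try the next clause
  ...instructions leaving the value of E1 in R1...
  (JUMP LEXIT)        ; done: skip the remaining clauses
  L1
  ...instructions leaving the value of P2 in R1...
  (JUMPF 1 LEXIT)     ; P2 is False: no clause applies, fall through
  ...instructions leaving the value of E2 in R1...
  LEXIT )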

The second extension to COMPEXP was to add the capability of compiling expressions with variables and of compiling LISP lambda expressions.

Compilers for even larger subsets of LISP are currently being developed.



--- Compilers for Different Target Machines

Another way in which the LISP expression compiler discussed in this report can be extended is by modifying the constructor procedures so that they generate code for different target machines. The first extension was to develop a set of constructors for an actual machine. A DEC PDP-11 system was available to the author, so it was the natural first choice.

The second extension in this area was to move COMPEXP to a microcomputer. An IBM XT was available to the author, so it was selected.

Compilers for other machines, some actually available and some of theoretical interest only, are currently being developed.
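Because every machine-specific detail lives in the constructor procedures, retargeting touches nothing else. As a purely illustrative sketch (these are not the author's PDP-11 or IBM XT constructors), a target whose call and push mnemonics differ would only need redefinitions such as:

(DEFUN MKCALL (FN) (LIST 'JSR FN))            ; hypothetical subroutine-call mnemonic
(DEFUN MKALLOC (DEST) (LIST 'PUSHREG DEST))   ; hypothetical push mnemonic

COMPEXP, COMPLIS, and COMPAPPLY are left untouched.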
--- Optimization Possibilities

Another way the compilers for LISP procedures have been extended is by considering some of the many possibilities for optimizing the target code they generate. The goal has been to develop the shortest, fastest target code possible. Early results are promising. Efforts in this area will be the subject of future reports.
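One simple illustration of the kind of pass involved -- a sketch only, not the author's optimizer -- is a peephole walk over the generated instruction list (assumed here to be a flat list of instruction lists, as COMPEXP produces): a PUSH of Ri followed immediately by a POP into Rj is equivalent to a single MOVE, or to nothing at all when i = j. SECOND and THIRD are assumed to be selectors analogous to FIRST.

(DEFUN PEEPHOLE (CODE)
  (COND ((NULL (REST CODE)) CODE)
        ((AND (EQ (FIRST (FIRST CODE)) 'PUSH)
              (EQ (FIRST (SECOND CODE)) 'POP))
         (COND ((= (THIRD (FIRST CODE)) (THIRD (SECOND CODE)))
                ;; (PUSH SP i) (POP SP i): drop both instructions
                (PEEPHOLE (REST (REST CODE))))
               ;; (PUSH SP i) (POP SP j): collapse into (MOVE j i)
               (T (CONS (LIST 'MOVE (THIRD (SECOND CODE)) (THIRD (FIRST CODE)))
                        (PEEPHOLE (REST (REST CODE)))))))
        (T (CONS (FIRST CODE) (PEEPHOLE (REST CODE))))))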
--- Compiling PROLOG Programs

Parallel work in progress is concerned with an investigation of the possibility of producing compilers, in a more or less automatic fashion, for various dialects of PROLOG and various hardware systems.

5. SUMMARY

--- Compiling LISP Procedures Is Necessary

Many real-time AI applications simply can not be run successfully on systems whose basic mode of operation is to interpret the source code. These applications would benefit greatly from the availability of systems which are capable of running in a compile and go mode.

--- Compiling LISP Procedures Is Simple

The development of the COMPEXP procedure described above is an indication that compiling LISP procedures is rather simple.

6. References

--- LISP

1. Allen, J. (1978) Anatomy of LISP. McGraw-Hill, New York.
2. Brooks, R. (1985) Programming in Common LISP. John Wiley & Sons, New York.
3. Danicic, I. (1983) LISP Programming. Blackwell Scientific Publications, Oxford.
4. Gabriel, R. (1985) Performance and Evaluation of LISP Systems. The MIT Press, Cambridge, Massachusetts.
5. Hasemer, T. (1984) Looking At LISP. Addison-Wesley, Reading, Massachusetts.
6. Holtz, F. (1985) LISP: The Language of Artificial Intelligence. Tab Books, Inc., Blue Ridge Summit, Pennsylvania.
7. Narayanan, A. and Sharkey, N. (1985) An Introduction to LISP. John Wiley & Sons, New York.
8. Queinnec, C. (1984) LISP. John Wiley & Sons, New York.
9. Siklossy, L. (1976) Let's Talk LISP. Prentice-Hall, Englewood Cliffs, New Jersey.
10. Steele, G. (1984) Common LISP: The Language. Digital Press, Burlington, Massachusetts.
11. Tatar, D. (1986) A Programmer's Guide To Common LISP. Digital Press, Burlington, Massachusetts.
12. Touretzky, D. (1984) LISP: A Gentle Introduction To Symbolic Computation. Harper & Row, New York.
13. Wilensky, R. (1984) LISPcraft. Norton & Co., New York.
14. Winston, P. and Horn, B. (1984) LISP, Second Edition. Addison-Wesley, Reading, Massachusetts.

--- Programming Languages

15. Backhouse, R. (1980) Syntax of Programming Languages. Prentice-Hall, Englewood Cliffs, New Jersey.
16. Harrison, M. (1978) Introduction To Formal Language Theory. Addison-Wesley, Reading, Massachusetts.
17. Henderson, P. (1980) Functional Programming: Application and Implementation. Prentice-Hall, Englewood Cliffs, New Jersey.
18. MacLennan, B. (1983) Principles of Programming Languages: Design, Evaluation, and Implementation. Holt, Rinehart, and Winston, New York.
19. Marcotty, M. and Ledgard, H. (1986) Programming Language Landscape: Syntax, Semantics, and Implementation, Second Edition. SRA, Chicago.
20. Nicholls, J. (1975) The Structure and Design of Programming Languages. Addison-Wesley, Reading, Massachusetts.
21. Organick, E., Forsythe, A., and Plummer, R. (1978) Programming Language Structures. Academic Press, New York.
22. Pratt, T. (1984) Programming Languages: Design and Implementation. Prentice-Hall, Englewood Cliffs, New Jersey.
23. Tennent, R. (1981) Principles of Programming Languages. Prentice-Hall, Englewood Cliffs, New Jersey.
24. Tucker, A. (1986) Programming Languages, Second Edition.

--- Compiler Writing

25. Aho, A., Sethi, R., and Ullman, J. (1986) Compilers: Principles, Techniques, and Tools. Addison-Wesley, Reading, Massachusetts.
26. Barrett, W., Bates, R., Gustafson, D., and Couch, J. (1986) Compiler Construction: Theory and Practice. SRA, Inc.
27. Bauer, F. and Eickel, J. (1976) Compiler Construction: An Advanced Course, Second Edition. Springer-Verlag, New York.
28. Lewi, J., De Vlaminck, K., Huens, J., and Huybrechts, M. (1979) A Programming Methodology in Compiler Construction (Two Volumes). North-Holland, Amsterdam.
29. Lewis II, P., Rosenkrantz, D., and Stearns, R. (1976) Compiler Design Theory. Addison-Wesley, Reading, Massachusetts.
30. Tremblay, J. and Sorenson, P. (1982) An Implementation Guide To Compiler Writing. McGraw-Hill, New York.
31. Tremblay, J. and Sorenson, P. (1985) The Theory and Practice of Compiler Writing. McGraw-Hill, New York.
32. Waite, W. and Goos, G. (1984) Compiler Construction. Springer-Verlag, New York.

--- Artificial Intelligence

33. Charniak, E. and McDermott, D. (1985) Introduction To Artificial Intelligence. Addison-Wesley, Reading, Massachusetts.
34. Rich, E. (1983) Artificial Intelligence. McGraw-Hill, New York.
35. Winston, P. (1984) Artificial Intelligence, Second Edition. Addison-Wesley, Reading, Massachusetts.

