\documentstyle[11pt,fullpage]{article}
\def\AddSpace#1{\ifcat#1a\ \fi#1} % if next is a letter, add a space
\def\YACC#1{{\sc Yacc}\AddSpace#1}
\def\TWIG#1{{\sc Twig}\AddSpace#1}
\def\PROG#1{{\sc Burg}\AddSpace#1}
\def\PARSER#1{{\sc Burm}\AddSpace#1}
\def\CODEGEN#1{{\sc Codegen}\AddSpace#1}
\title{{\sc Burg} --- Fast Optimal Instruction Selection and Tree Parsing}
Christopher W. Fraser \\
AT\&T Bell Laboratories \\
600 Mountain Avenue 2C-464 \\
Murray Hill, NJ 07974-0636 \\
{\tt cwf@research.att.com}
Tera Computer Company \\
400 N. 34th St., Suite 300 \\
Seattle, WA 98103-8600 \\
Dept. of Computer Sciences \\
University of Wisconsin \\
{\tt todd@cs.wisc.edu}
\bibliographystyle{alpha}
\newcommand\term[1]{{\it #1}}
\newcommand\secref[1]{\S\ref{#1}}
\newcommand\figref[1]{Figure~\ref{#1}}

% rationale table making
{\catcode`\^^M=13 \gdef\Obeycr{\catcode`\^^M=13 \def^^M{\\}}%
\gdef\Restorecr{\catcode`\^^M=5 }} %

% for printing out options
\newcommand\option[1]{% #1=option character
\PROG is a program that generates a fast tree parser using BURS
(Bottom-Up Rewrite System) technology.  It accepts a cost-augmented
tree grammar and emits a C program that discovers in linear time an
optimal parse of trees in the language described by the grammar.  \PROG
has been used to construct fast optimal instruction selectors for use
in code generation.  \PROG addresses many of the problems addressed by
{\sc Twig}~\cite{aho-twig-toplas,appel-87}, but it is somewhat less
flexible and much faster.  \PROG is available via anonymous \var{ftp}
from \var{kaese.cs.wisc.edu}.  The compressed \var{shar} file
\var{pub/burg.shar.Z} holds the complete distribution.
This document describes only that fraction of the BURS model that is
required to use \PROG.  Readers interested in more detail might start
with Reference~\cite{balachandran-complang}.  Other relevant documents
include
References~\cite{kron-phd,hoffmann-jacm,hatcher-popl,chase-popl,pelegri-popl,pelegri-phd,wilhelm-tr,henry-budp,fraser-henry-spe-91,proebsting-91}.
\PROG accepts a tree grammar and emits a BURS tree parser.
\figref{fig-tree-grammar} shows a sample grammar that implements a very
simple instruction selector.
#define NODEPTR_TYPE treepointer
#define OP_LABEL(p) ((p)->op)
#define LEFT_CHILD(p) ((p)->left)
#define RIGHT_CHILD(p) ((p)->right)
#define STATE_LABEL(p) ((p)->state_label)

%term Assign=1 Constant=2 Fetch=3 Four=4 Mul=5 Plus=6
con: Constant = 1 (0);

addr: Plus(con,reg) = 4 (0);
addr: Plus(con,Mul(Four,reg)) = 5 (0);
reg: Fetch(addr) = 6 (1);
reg: Assign(addr,reg) = 7 (1);
\caption{A Sample Tree Grammar\label{fig-tree-grammar}}
\PROG grammars are structurally similar to \YACC's.  Comments follow C
conventions.  Text between ``\var{\%\{}'' and ``\var{\%\}}'' is called
the \term{configuration section}; there may be several such segments.
All are concatenated and copied verbatim into the head of the generated
parser, which is called \PARSER.  Text after the second ``\var{\%\%}'',
if any, is also copied verbatim into \PARSER, at the end.
The configuration section configures \PARSER for the trees being parsed
and the client's environment.  This section must define
\var{NODEPTR\_TYPE} to be a visible typedef symbol for a pointer to a
node in the subject tree.  \PARSER invokes \var{OP\_LABEL(p)},
\var{LEFT\_CHILD(p)}, and \var{RIGHT\_CHILD(p)} to read the operator
and children from the node pointed to by \var{p}.  It invokes
\var{PANIC} when it detects an error.  If the configuration section
defines these operations as macros, they are implemented in-line;
otherwise, they must be implemented as functions.  The section on
diagnostics elaborates on \var{PANIC}.
\PARSER computes and stores a single integral \term{state} in each node
of the subject tree.  The configuration section must define a macro
\var{STATE\_LABEL(p)} to access the state field of the node pointed to
by \var{p}.  A macro is required because \PROG uses it as an lvalue.  A
C \var{short} is usually the right choice; typical code generation
grammars require 100--1000 distinct state labels.
The tree grammar follows the configuration section.
\figref{fig-grammar-grammar} gives an EBNF grammar for \PROG tree
grammars.

grammar: {dcl} '%%' {rule}
dcl: '%start' Nonterminal
dcl: '%term' { Identifier '=' Integer }

rule: Nonterminal ':' tree '=' Integer cost ';'

cost: '(' Integer { ',' Integer } ')'

tree: Term '(' tree ',' tree ')'
tree: Term '(' tree ')'
\caption{EBNF Grammar for Tree Grammars for \PROG\ \label{fig-grammar-grammar}}
Comments, the text between ``\var{\%\{}'' and ``\var{\%\}}'', and the
text after the optional second ``\var{\%\%}'' are treated lexically, so
the figure omits them.  In the EBNF grammar, quoted text must appear
literally, \var{Nonterminal} and \var{Integer} are self-explanatory,
and \var{Term} denotes an identifier previously declared as a
terminal.  {\tt\{$X$\}} denotes zero or more instances of $X$.
Text before the first ``\var{\%\%}'' declares the start symbol and the
terminals or operators in subject trees.  All terminals must be
declared; each line of such declarations begins with \var{\%term}.
Each terminal has fixed arity, which \PROG infers from the rules using
that terminal.  \PROG restricts terminals to have at most two children.
Each terminal is declared with a positive, unique, integral
\term{external symbol number} after a ``\var{=}''.  \var{OP\_LABEL(p)}
must return the valid external symbol number for \var{p}.  Ideally,
external symbol numbers form a dense enumeration.  Non-terminals are
not declared, but the start symbol may be declared with a line that
begins with \var{\%start}.
Text after the first ``\var{\%\%}'' declares the rules.  A tree grammar
is like a context-free grammar: it has rules, non-terminals,
terminals, and a special start non-terminal.  The right-hand side of a
rule, called the \term{pattern}, is a tree.  Tree patterns appear in
prefix parenthesized form.  Every non-terminal denotes a tree.  A chain
rule is a rule whose pattern is another non-terminal.  If no start
symbol is declared, \PROG uses the non-terminal defined by the first
rule.  \PROG needs a single start symbol; grammars for which it is
natural to use multiple start symbols must be augmented with an
artificial start symbol that derives, with zero cost, the grammar's
natural start symbols.  \PARSER will automatically select one
that costs least for any given tree.
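For instance, a grammar whose natural start symbols are \var{stmt} and \var{reg} might add an artificial start symbol via zero-cost chain rules, as in the sketch below; the non-terminal \var{top} and the rule numbers are hypothetical.

```
%start top
%%
top: stmt = 90 (0);  /* zero-cost chain rules from the artificial */
top: reg = 91 (0);   /* start symbol to each natural start symbol */
```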
\PROG accepts no embedded semantic actions like \YACC's, because no one
format suited all intended applications.  Instead, each rule has a
positive, unique, integral \term{external rule number}, after the
pattern and preceded by a ``\var{=}''.  Ideally, external rule numbers
form a dense enumeration.  \PARSER uses these numbers to report the
matching rule to a user-supplied routine, which must implement any
desired semantic action; see below.  Humans may select these integers
by hand, but \PROG is intended as a \term{server} for building BURS
tree parsers.  Thus some \PROG clients will consume a richer
description and translate it into \PROG's simpler input.
Rules end with a vector of non-negative, integer costs, in parentheses
and separated by commas.  If the cost vector is omitted, then all
elements are assumed to be zero.  \PROG retains only the first four
elements of the list.  The cost of a derivation is the sum of the costs
for all rules applied in the derivation.  Arithmetic on cost vectors
treats each member of the vector independently.  The tree parser finds
the cheapest parse of the subject tree.  It breaks ties arbitrarily.
By default, \PROG uses only the \term{principal cost} of each cost
vector, which defaults to the first element, but options described
below provide alternatives.
\PARSER traverses the subject tree twice.  The first pass or
\term{labeller} runs bottom-up and left-to-right, visiting each node
exactly once.  Each node is labeled with a state, a single number that
encodes all full and partial optimal pattern matches viable at that
node.  The second pass or \term{reducer} traverses the subject tree
top-down.  The reducer accepts a tree node's state label and a
\term{goal} non-terminal --- initially the root's state label and the
start symbol --- which combine to determine the rule to be applied at
that node.  By construction, the rule has the given goal non-terminal
as its left-hand side.  The rule's pattern identifies the subject
subtrees and goal non-terminals for all recursive visits.  Here, a
``subtree'' is not necessarily an immediate child of the current node.
Patterns with interior operators cause the reducer to skip the
corresponding subject nodes, so the reducer may proceed directly to
grandchildren, great-grandchildren, and so on.  On the other hand,
chain rules cause the reducer to revisit the current subject node, with
a new goal non-terminal, so \term{x} is also regarded as a subtree of
\term{x}.
As the reducer visits (and possibly revisits) each node, user-supplied
code implements semantic action side effects and controls the order in
which subtrees are visited.  The labeller is self-contained, but the
reducer combines code from \PROG with code from the user, so \PARSER
does not stand alone.
The \PARSER that is generated by \PROG provides primitives for
labelling and reducing trees.  These mechanisms are a compromise
between expressibility, abstraction, simplicity, flexibility, and
efficiency.  Clients may combine primitives into labellers and reducers
that can traverse trees in arbitrary ways, and they may call semantic
routines when and how they wish during traversal.  Also, \PROG
generates a few higher level routines that implement common
combinations of primitives, and it generates mechanisms that help debug
\PROG generates the labeller as a function named \var{burm\_label} with
the signature

extern int burm_label(NODEPTR_TYPE p);
It labels the entire subject tree pointed to by \var{p} and returns the
root's state label.  State zero labels unmatched trees.  The trees may
be corrupt or merely inconsistent with the grammar.
The simpler \var{burm\_state} is \var{burm\_label} without the
code to traverse the tree and to read and write its fields.  It may be
used to integrate labelling into user-supplied traversal code.  A
declaration is

extern int burm_state(int op, int leftstate, int rightstate);

It accepts an external symbol number for a node and the labels for the
node's left and right children.  It returns the state label to assign
to that node.  For unary operators, the last argument is ignored; for
leaves, the last two arguments are ignored.  In general, \PROG
generates a \var{burm\_state} that accepts the maximum number of child
states required by the input grammar.  For example, if the grammar
includes no binary operators, then \var{burm\_state} will have the
signature

extern int burm_state(int op, int leftstate);

This feature is included to permit future expansion to operators with
more than two children.
The user must write the reducer, but \PARSER writes code and data that
help, such as

extern int burm_rule(int state, int goalnt);
which accepts a tree's state label and a goal non-terminal and returns the
external rule number of a rule.  The rule will have matched the tree
and have the goal non-terminal on the left-hand side; \var{burm\_rule}
returns zero when the tree labelled with the given state did not match
the goal non-terminal.  For the initial, root-level call, \var{goalnt}
must be one, and \PARSER exports an array that identifies the values

extern short *burm_nts[] = { ... };

is an array indexed by external rule numbers.  Each element points to a
zero-terminated vector of short integers, which encode the goal
non-terminals for that rule's pattern, left-to-right.  The user needs
only these two externals to write a complete reducer, but a third
external simplifies some applications:

extern NODEPTR_TYPE *burm_kids(NODEPTR_TYPE p, int eruleno, NODEPTR_TYPE kids[]);
accepts the address of a tree \var{p}, an external rule number, and an
empty vector of pointers to trees.  The procedure assumes that \var{p}
matched the given rule, and it fills in the vector with the subtrees (in
the sense described above) of \var{p} that must be reduced recursively.
\var{kids} is returned.  It is not zero-terminated.
The simple user code below labels and then fully reduces a subject tree;
the reducer prints the tree cover.  \var{burm\_string} is defined below.
parse(NODEPTR_TYPE p) {
	burm_label(p);		/* label the tree */
	reduce(p, 1, 0);	/* and reduce it */
}

reduce(NODEPTR_TYPE p, int goalnt, int indent) {
	int eruleno = burm_rule(STATE_LABEL(p), goalnt);	/* matching rule number */
	short *nts = burm_nts[eruleno];	/* subtree goal non-terminals */
	NODEPTR_TYPE kids[10];	/* subtree pointers */
	int i;

	for (i = 0; i < indent; i++)
		printf(".");	/* print indented ... */
	printf("%s\n", burm_string[eruleno]);	/* ... text of rule */
	burm_kids(p, eruleno, kids);	/* initialize subtree pointers */
	for (i = 0; nts[i]; i++)	/* traverse subtrees left-to-right */
		reduce(kids[i], nts[i], indent+1);	/* and print them recursively */
}
The reducer may recursively traverse subtrees in any order, and it may
interleave arbitrary semantic actions with recursive traversals.
Multiple reducers may be written, to implement multi-pass algorithms
or independent single-pass algorithms.
For each non-terminal $x$, \PROG emits a preprocessor directive to
equate \var{burm\_}$x$\var{\_NT} with $x$'s integral encoding.  It also
defines a macro \var{burm\_}$x$\var{\_rule(a)} that is equivalent to
\var{burm\_rule(a,}$x$\var{)}.  For the grammar in
\figref{fig-tree-grammar}, \PROG emits
#define burm_reg_NT 1
#define burm_con_NT 2
#define burm_addr_NT 3
#define burm_reg_rule(a) ...
#define burm_con_rule(a) ...
#define burm_addr_rule(a) ...
Such symbols are visible only to the code after the second
``\var{\%\%}''.  If the symbols \var{burm\_}$x$\var{\_NT} are needed
elsewhere, extract them from the \PARSER source.
The \option{I} option directs \PROG to emit an encoding of the input
that may help the user produce diagnostics.  The vectors

extern char *burm_opname[];
extern char burm_arity[];
hold the name and number of children, respectively, for each terminal.
They are indexed by the terminal's external symbol number.  The vectors

extern char *burm_string[];
extern short burm_cost[][4];

hold the text and cost vector for each rule.  They are indexed by the
external rule number.  The zero-terminated vector

extern char *burm_ntname[];
is indexed by \var{burm\_}$x$\var{\_NT} and holds the name of
non-terminal $x$.  Finally, the procedures

extern int burm_op_label(NODEPTR_TYPE p);
extern int burm_state_label(NODEPTR_TYPE p);
extern NODEPTR_TYPE burm_child(NODEPTR_TYPE p, int index);

are callable versions of the configuration macros.
\var{burm\_child(p,0)} implements \var{LEFT\_CHILD(p)}, and
\var{burm\_child(p,1)} implements \var{RIGHT\_CHILD(p)}.  A sample use
is the grammar-independent expression
\var{burm\_opname[burm\_op\_label(p)]}, which yields the textual name
for the operator in the tree node pointed to by \var{p}.
A complete tree parser can be assembled from just \var{burm\_state},
\var{burm\_rule}, and \var{burm\_nts}, which use none of the
configuration section except \var{PANIC}.  The generated routines that
use the rest of the configuration section are compiled only if the
configuration section defines \var{STATE\_LABEL}, so they can be
omitted if the user prefers to hide the tree structure from \PARSER.
This course may be wise if, say, the tree structure is defined in a
large header file with symbols that might collide with \PARSER's.
\PARSER selects an optimal parse without dynamic programming at compile
time~\cite{aho-johnson-dp-classic}.  Instead, \PROG does the dynamic
programming at compile-compile time, as it builds \PARSER.
Consequently, \PARSER parses quickly.  Similar labellers have taken as
few as 15 instructions per node, and reducers as few as 35 per node
visited~\cite{fraser-henry-spe-91}.
\PARSER invokes \var{PANIC} when an error prevents it from proceeding.
\var{PANIC} has the same signature as \var{printf}.  It should pass its
arguments to \var{printf} if diagnostics are desired and then either
abort (say via \var{exit}) or recover (say via \var{longjmp}).  If it
returns, \PARSER aborts.  Some errors are not caught.
\PROG assumes a robust preprocessor, so it omits full consistency
checking and error recovery.  \PROG constructs a set of states using a
closure algorithm like that used in LR table construction.  \PROG
considers all possible trees generated by the tree grammar and
summarizes infinite sets of trees with finite sets.  The summary
records the cost of those trees but actually manipulates the
differences in costs between viable alternatives using a dynamic
programming algorithm.  Reference~\cite{henry-budp} elaborates.
Some grammars derive trees whose optimal parses depend on arbitrarily
distant data.  When this happens, \PROG and the tree grammar
\term{cost diverge}, and \PROG attempts to build an infinite
set of states; it first thrashes and ultimately exhausts
memory and exits.  For example, the tree grammar in
\figref{fig-diverge-grammar}
%term Const=17 RedFetch=20 GreenFetch=21 Plus=22
%%
reg: GreenFetch(green_reg) = 10 (0);
reg: RedFetch(red_reg) = 11 (0);

green_reg: Const = 20 (0);
green_reg: Plus(green_reg,green_reg) = 21 (1);

red_reg: Const = 30 (0);
red_reg: Plus(red_reg,red_reg) = 31 (2);

\caption{A Diverging Tree Grammar\label{fig-diverge-grammar}}
diverges, since non-terminals \var{green\_reg} and \var{red\_reg}
derive identical infinite trees with different costs.  If the cost of
rule 31 is changed to 1, then the grammar does not diverge.
Practical tree grammars describing instruction selection do not
cost-diverge because infinite trees are derived from non-terminals
that model temporary registers.  Machines can move data between
different types of registers for a small bounded cost, and the rules
for these instructions prevent divergence.  For example, if
\figref{fig-diverge-grammar} included rules to move data between red
and green registers, the grammar would not diverge.  If a bona fide
machine grammar appears to make \PROG loop, try a host with more
memory.  To apply \PROG to problems other than instruction selection,
be prepared to consult the literature on
cost-divergence~\cite{pelegri-phd}.
\section{Running \PROG\ }\label{sec-man-page}
\PROG reads a tree grammar and writes a \PARSER in C.  \PARSER can be
compiled by itself or included in another file.  When suitably named
with the \option{p} option, disjoint instances of \PARSER should link
together without name conflicts.  The command:

\var{burg} [ {\it arguments} ] [ {\it file} ]

invokes \PROG.  If a {\it file} is named, \PROG expects its grammar
there; otherwise it reads the standard input.  The options include:
\newcommand\odescr[2]{% #1=option character, #2=optional argument
\item[\option{#1}\ifx\Arg2\Empty\else{{\it #2}}\fi]

Abort if any relative cost exceeds $N$, which keeps \PROG from looping on
diverging grammars.  Several
references~\cite{pelegri-popl,henry-budp,balachandran-complang,proebsting-91}
explain relative costs.
Report a few statistics and flag unused rules and terminals.

\odescr{o}{} {\it file}
Write parser into {\it file}.  Otherwise it writes to the standard output.
\odescr{p}{} {\it prefix}
Start exported names with {\it prefix}.  The default is \var{burm}.
Generate smaller tables faster, but all goal non-terminals passed to
\var{burm\_rule} must come from an appropriate \var{burm\_nts}.  Using
\var{burm\_}$x$\var{\_NT} instead may give unpredictable results.
Emit code for \var{burm\_arity}, \var{burm\_child}, \var{burm\_cost},
\var{burm\_ntname}, \var{burm\_op\_label}, \var{burm\_opname},
\var{burm\_state\_label}, and \var{burm\_string}.

Change the principal cost to $N$.  Elements of each cost vector are
Compare costs lexicographically, using all costs in the given order.
This option slows \PROG and may produce a larger parser.  The increases
range from small to astronomical.
\section{Acknowledgements}

The first \PROG was adapted by the second author from his \CODEGEN
package, which was developed at the University of Washington with
partial support from NSF Grant CCR-88-01806.  It was unbundled from
\CODEGEN with the support of Tera Computer.  The current \PROG was
written by the third author with the support of NSF grant
CCR-8908355.  The interface, documentation, and testing involved
Comments from a large group at the 1991 Dagstuhl Seminar on Code
Generation improved \PROG's interface.  Robert Giegerich and Susan
Graham organized the workshop, and the International Conference and
Research Center for Computer Science, Schloss Dagstuhl, provided an
ideal environment for such collaboration.  Beta-testers included Helmut
Emmelmann, Dave Hanson, John Hauser, Hugh Redelmeier, and Bill Waite.
\begin{thebibliography}{BMW87}

\bibitem[AGT89]{aho-twig-toplas}
Alfred~V. Aho, Mahadevan Ganapathi, and Steven W.~K. Tjiang.
\newblock Code generation using tree matching and dynamic programming.
\newblock {\em ACM Transactions on Programming Languages and Systems},
11(4):491--516, October 1989.
\bibitem[AJ76]{aho-johnson-dp-classic}
Alfred~V. Aho and Steven~C. Johnson.
\newblock Optimal code generation for expression trees.
\newblock {\em Journal of the ACM}, 23(3):458--501, July 1976.
\bibitem[App87]{appel-87}
Andrew~W. Appel.
\newblock Concise specification of locally optimal code generators.
\newblock Technical Report CS-TR-080-87, Princeton University, 1987.
\bibitem[BDB90]{balachandran-complang}
A.~Balachandran, D.~M. Dhamdhere, and S.~Biswas.
\newblock Efficient retargetable code generation using bottom-up tree pattern
matching.
\newblock {\em Computer Languages}, 15(3):127--140, 1990.
\bibitem[BMW87]{wilhelm-tr}
J\"{u}rgen B\"{o}rstler, Ulrich M\"{o}ncke, and Reinhard Wilhelm.
\newblock Table compression for tree automata.
\newblock Technical Report Aachener Informatik-Berichte No. 87-12, RWTH Aachen,
Fachgruppe Informatik, Aachen, Fed. Rep. of Germany, 1987.
\bibitem[Cha87]{chase-popl}
David~R. Chase.
\newblock An improvement to bottom up tree pattern matching.
\newblock {\em Fourteenth Annual ACM Symposium on Principles of Programming
Languages}, pages 168--177, January 1987.
\bibitem[FH91]{fraser-henry-spe-91}
Christopher~W. Fraser and Robert~R. Henry.
\newblock Hard-coding bottom-up code generation tables to save time and space.
\newblock {\em Software---Practice \& Experience}, 21(1):1--12, January 1991.
\bibitem[HC86]{hatcher-popl}
Philip~J. Hatcher and Thomas~W. Christopher.
\newblock High-quality code generation via bottom-up tree pattern matching.
\newblock {\em Thirteenth Annual ACM Symposium on Principles of Programming
Languages}, pages 119--130, January 1986.
\bibitem[Hen89]{henry-budp}
Robert~R. Henry.
\newblock Encoding optimal pattern selection in a table-driven bottom-up
tree-pattern matcher.
\newblock Technical Report 89-02-04, University of Washington Computer Science
Department, Seattle, WA, February 1989.
\bibitem[HO82]{hoffmann-jacm}
Christoph Hoffmann and Michael~J. O'Donnell.
\newblock Pattern matching in trees.
\newblock {\em Journal of the ACM}, 29(1):68--95, January 1982.
\bibitem[Kro75]{kron-phd}
H.~H. Kron.
\newblock {\em Tree Templates and Subtree Transformational Grammars}.
\newblock PhD thesis, UC Santa Cruz, December 1975.
\bibitem[PL87]{pelegri-phd}
Eduardo Pelegri-Llopart.
\newblock {\em Tree Transformations in Compiler Systems}.
\newblock PhD thesis, UC Berkeley, December 1987.
\bibitem[PLG88]{pelegri-popl}
Eduardo Pelegri-Llopart and Susan~L. Graham.
\newblock Optimal code generation for expression trees: An application of
{BURS} theory.
\newblock {\em Fifteenth Annual ACM Symposium on Principles of Programming
Languages}, pages 294--308, January 1988.
\bibitem[Pro91]{proebsting-91}
Todd~A. Proebsting.
\newblock Simple and efficient {BURS} table generation.
\newblock Technical report, Department of Computer Sciences, University of
Wisconsin, 1991.

\end{thebibliography}