========================================
Kaleidoscope: Code generation to LLVM IR
========================================

Welcome to Chapter 3 of the "`Implementing a language with
LLVM <index.html>`_" tutorial. This chapter shows you how to transform
the `Abstract Syntax Tree <LangImpl02.html>`_, built in Chapter 2, into
LLVM IR. This will teach you a little bit about how LLVM does things, as
well as demonstrate how easy it is to use. It's much more work to build
a lexer and parser than it is to generate LLVM IR code. :)

**Please note**: the code in this chapter and later requires LLVM 3.7 or
later. LLVM 3.6 and before will not work with it. Also note that you
need to use a version of this tutorial that matches your LLVM release:
If you are using an official LLVM release, use the version of the
documentation included with your release or on the `llvm.org releases
page <https://llvm.org/releases/>`_.

Chapter 3 Introduction
======================

In order to generate LLVM IR, we want some simple setup to get started.
First we define virtual code generation (codegen) methods in each AST
class:

.. code-block:: c++

    /// ExprAST - Base class for all expression nodes.
    class ExprAST {
    public:
      virtual ~ExprAST() = default;
      virtual Value *codegen() = 0;
    };

    /// NumberExprAST - Expression class for numeric literals like "1.0".
    class NumberExprAST : public ExprAST {
      double Val;

    public:
      NumberExprAST(double Val) : Val(Val) {}
      Value *codegen() override;
    };

The codegen() method says to emit IR for that AST node along with all
the things it depends on, and they all return an LLVM Value object.
"Value" is the class used to represent a "`Static Single Assignment
(SSA) <http://en.wikipedia.org/wiki/Static_single_assignment_form>`_
register" or "SSA value" in LLVM. The most distinct aspect of SSA values
is that their value is computed as the related instruction executes, and
it does not get a new value until (and if) the instruction re-executes.
In other words, there is no way to "change" an SSA value. For more
information, please read up on `Static Single
Assignment <http://en.wikipedia.org/wiki/Static_single_assignment_form>`_
- the concepts are really quite natural once you grok them.

Note that instead of adding virtual methods to the ExprAST class
hierarchy, it could also make sense to use a `visitor
pattern <http://en.wikipedia.org/wiki/Visitor_pattern>`_ or some other
way to model this. Again, this tutorial won't dwell on good software
engineering practices: for our purposes, adding a virtual method is
simplest.

71 parser, which will be used to report errors found during code generation
72 (for example, use of an undeclared parameter):
76 static std::unique_ptr<LLVMContext> TheContext;
77 static std::unique_ptr<IRBuilder<>> Builder(TheContext);
78 static std::unique_ptr<Module> TheModule;
79 static std::map<std::string, Value *> NamedValues;
81 Value *LogErrorV(const char *Str) {
The static variables will be used during code generation. ``TheContext``
is an opaque object that owns a lot of core LLVM data structures, such as
the type and constant value tables. We don't need to understand it in
detail; we just need a single instance to pass into APIs that require it.

The ``Builder`` object is a helper object that makes it easy to generate
LLVM instructions. Instances of the
`IRBuilder <https://llvm.org/doxygen/IRBuilder_8h_source.html>`_
class template keep track of the current place to insert instructions
and have methods to create new instructions.

``TheModule`` is an LLVM construct that contains functions and global
variables. In many ways, it is the top-level structure that the LLVM IR
uses to contain code. It will own the memory for all of the IR that we
generate, which is why the codegen() method returns a raw Value\*,
rather than a unique_ptr<Value>.

The ``NamedValues`` map keeps track of which values are defined in the
current scope and what their LLVM representation is. (In other words, it
is a symbol table for the code). In this form of Kaleidoscope, the only
things that can be referenced are function parameters. As such, function
parameters will be in this map when generating code for their function
body.

With these basics in place, we can start talking about how to generate
code for each expression. Note that this assumes that the ``Builder``
has been set up to generate code *into* something. For now, we'll assume
that this has already been done, and we'll just use it to emit code.

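If you are curious what that setup looks like, a minimal sketch is shown
below (assuming a small helper, here called ``InitializeModule``, that the
driver calls before any code generation; the module name is arbitrary):

.. code-block:: c++

    static void InitializeModule() {
      // Open a new context and module to hold the generated IR.
      TheContext = std::make_unique<LLVMContext>();
      TheModule = std::make_unique<Module>("my cool jit", *TheContext);

      // Create a new builder for the module.
      Builder = std::make_unique<IRBuilder<>>(*TheContext);
    }
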
Expression Code Generation
==========================

Generating LLVM code for expression nodes is very straightforward: less
than 45 lines of commented code for all four of our expression nodes.
First we'll do numeric literals:

.. code-block:: c++

    Value *NumberExprAST::codegen() {
      return ConstantFP::get(*TheContext, APFloat(Val));
    }

In the LLVM IR, numeric constants are represented with the
``ConstantFP`` class, which holds the numeric value in an ``APFloat``
internally (``APFloat`` has the capability of holding floating point
constants of Arbitrary Precision). This code basically just creates
and returns a ``ConstantFP``. Note that in the LLVM IR, constants are
all uniqued together and shared. For this reason, the API uses the
"foo::get(...)" idiom instead of "new foo(..)" or "foo::Create(..)".

.. code-block:: c++

    Value *VariableExprAST::codegen() {
      // Look this variable up in the function.
      Value *V = NamedValues[Name];
      if (!V)
        LogErrorV("Unknown variable name");
      return V;
    }

References to variables are also quite simple using LLVM. In the simple
version of Kaleidoscope, we assume that the variable has already been
emitted somewhere and its value is available. In practice, the only
values that can be in the ``NamedValues`` map are function arguments.
This code simply checks to see that the specified name is in the map (if
not, an unknown variable is being referenced) and returns the value for
it. In future chapters, we'll add support for `loop induction
variables <LangImpl05.html#for-loop-expression>`_ in the symbol table, and for `local
variables <LangImpl07.html#user-defined-local-variables>`_.

.. code-block:: c++

    Value *BinaryExprAST::codegen() {
      Value *L = LHS->codegen();
      Value *R = RHS->codegen();
      if (!L || !R)
        return nullptr;

      switch (Op) {
      case '+':
        return Builder->CreateFAdd(L, R, "addtmp");
      case '-':
        return Builder->CreateFSub(L, R, "subtmp");
      case '*':
        return Builder->CreateFMul(L, R, "multmp");
      case '<':
        L = Builder->CreateFCmpULT(L, R, "cmptmp");
        // Convert bool 0/1 to double 0.0 or 1.0
        return Builder->CreateUIToFP(L, Type::getDoubleTy(*TheContext), "booltmp");
      default:
        return LogErrorV("invalid binary operator");
      }
    }

Binary operators start to get more interesting. The basic idea here is
that we recursively emit code for the left-hand side of the expression,
then the right-hand side, then we compute the result of the binary
expression. In this code, we do a simple switch on the opcode to create
the right LLVM instruction.

In the example above, the LLVM builder class is starting to show its
value. IRBuilder knows where to insert the newly created instruction;
all you have to do is specify what instruction to create (e.g. with
``CreateFAdd``), which operands to use (``L`` and ``R`` here) and
optionally provide a name for the generated instruction.

One nice thing about LLVM is that the name is just a hint. For instance,
if the code above emits multiple "addtmp" variables, LLVM will
automatically provide each one with an increasing, unique numeric
suffix. Local value names for instructions are purely optional, but they
make the IR dumps much easier to read.

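To see this in action (a sketch, assuming ``L`` and ``R`` are values already
in scope), emitting two instructions with the same name hint produces
``%addtmp`` and ``%addtmp1`` in the printed IR:

.. code-block:: c++

    // The "addtmp" string is only a hint; LLVM renames collisions for us.
    Value *Sum = Builder->CreateFAdd(L, R, "addtmp");    // prints as %addtmp
    Value *Sum2 = Builder->CreateFAdd(Sum, R, "addtmp"); // prints as %addtmp1
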
`LLVM instructions <../../LangRef.html#instruction-reference>`_ are constrained by strict
rules: for example, the Left and Right operands of an `add
instruction <../../LangRef.html#add-instruction>`_ must have the same type, and the
result type of the add must match the operand types. Because all values
in Kaleidoscope are doubles, this makes for very simple code for add,
sub and mul.

On the other hand, LLVM specifies that the `fcmp
instruction <../../LangRef.html#fcmp-instruction>`_ always returns an 'i1' value (a
one bit integer). The problem with this is that Kaleidoscope wants the
value to be a 0.0 or 1.0 value. In order to get these semantics, we
combine the fcmp instruction with a `uitofp
instruction <../../LangRef.html#uitofp-to-instruction>`_. This instruction converts its
input integer into a floating point value by treating the input as an
unsigned value. In contrast, if we used the `sitofp
instruction <../../LangRef.html#sitofp-to-instruction>`_, the Kaleidoscope '<' operator
would return 0.0 and -1.0, depending on the input value.

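If you want to convince yourself of the difference (a sketch, not code from
the tutorial), swap the conversion and look at the IR it produces:

.. code-block:: c++

    // Unsigned conversion: i1 'true' (1) becomes 1.0, which is what we want.
    Builder->CreateUIToFP(L, Type::getDoubleTy(*TheContext), "booltmp");
    // Signed conversion: i1 'true' is the signed value -1, so it becomes -1.0.
    Builder->CreateSIToFP(L, Type::getDoubleTy(*TheContext), "booltmp");
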
.. code-block:: c++

    Value *CallExprAST::codegen() {
      // Look up the name in the global module table.
      Function *CalleeF = TheModule->getFunction(Callee);
      if (!CalleeF)
        return LogErrorV("Unknown function referenced");

      // If argument mismatch error.
      if (CalleeF->arg_size() != Args.size())
        return LogErrorV("Incorrect # arguments passed");

      std::vector<Value *> ArgsV;
      for (unsigned i = 0, e = Args.size(); i != e; ++i) {
        ArgsV.push_back(Args[i]->codegen());
        if (!ArgsV.back())
          return nullptr;
      }

      return Builder->CreateCall(CalleeF, ArgsV, "calltmp");
    }

Code generation for function calls is quite straightforward with LLVM. The code
above initially does a function name lookup in the LLVM Module's symbol table.
Recall that the LLVM Module is the container that holds the functions we are
JIT'ing. By giving each function the same name as what the user specifies, we
can use the LLVM symbol table to resolve function names for us.

Once we have the function to call, we recursively codegen each argument
that is to be passed in, and create an LLVM `call
instruction <../../LangRef.html#call-instruction>`_. Note that LLVM uses the native C
calling conventions by default, allowing these calls to also call into
standard library functions like "sin" and "cos", with no additional
effort.

This wraps up our handling of the four basic expressions that we have so
far in Kaleidoscope. Feel free to go in and add some more. For example,
by browsing the `LLVM language reference <../../LangRef.html>`_ you'll find
several other interesting instructions that are really easy to plug into
our basic framework.

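For instance, supporting a '/' operator would only need one more case in
``BinaryExprAST::codegen()`` (a sketch; you would also have to teach the
parser about the new operator's precedence):

.. code-block:: c++

    case '/':
      return Builder->CreateFDiv(L, R, "divtmp");
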
Function Code Generation
========================

Code generation for prototypes and functions must handle a number of
details, which make their code less beautiful than expression code
generation, but allow us to illustrate some important points. First,
let's talk about code generation for prototypes: they are used both for
function bodies and external function declarations. The code starts
with:

.. code-block:: c++

    Function *PrototypeAST::codegen() {
      // Make the function type: double(double,double) etc.
      std::vector<Type*> Doubles(Args.size(),
                                 Type::getDoubleTy(*TheContext));
      FunctionType *FT =
          FunctionType::get(Type::getDoubleTy(*TheContext), Doubles, false);

      Function *F =
          Function::Create(FT, Function::ExternalLinkage, Name, TheModule.get());

This code packs a lot of power into a few lines. Note first that this
function returns a "Function\*" instead of a "Value\*". Because a
"prototype" really talks about the external interface for a function
(not the value computed by an expression), it makes sense for it to
return the LLVM Function it corresponds to when codegen'd.

The call to ``FunctionType::get`` creates the ``FunctionType`` that
should be used for a given Prototype. Since all function arguments in
Kaleidoscope are of type double, the first line creates a vector of "N"
LLVM double types. It then uses the ``FunctionType::get`` method to
create a function type that takes "N" doubles as arguments, returns one
double as a result, and that is not vararg (the false parameter
indicates this). Note that Types in LLVM are uniqued just like Constants
are, so you don't "new" a type, you "get" it.

The final line above actually creates the IR Function corresponding to
the Prototype. This indicates the type, linkage and name to use, as
well as which module to insert into. "`external
linkage <../../LangRef.html#linkage>`_" means that the function may be
defined outside the current module and/or that it is callable by
functions outside the module. The Name passed in is the name the user
specified: since "``TheModule``" is specified, this name is registered
in "``TheModule``"s symbol table.

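Because the Function was created inside ``TheModule``, it can be found again
later by name, which is exactly what ``CallExprAST::codegen()`` relied on
above (a sketch, reusing ``F`` and ``Name`` from the surrounding code):

.. code-block:: c++

    // The module's symbol table now maps the user-visible name to the function.
    assert(TheModule->getFunction(Name) == F);
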
.. code-block:: c++

      // Set names for all arguments.
      unsigned Idx = 0;
      for (auto &Arg : F->args())
        Arg.setName(Args[Idx++]);

      return F;
    }

Finally, we set the name of each of the function's arguments according to the
names given in the Prototype. This step isn't strictly necessary, but keeping
the names consistent makes the IR more readable, and allows subsequent code to
refer directly to the arguments by their names, rather than having to look
them up in the Prototype AST.

At this point we have a function prototype with no body. This is how LLVM IR
represents function declarations. For extern statements in Kaleidoscope, this
is as far as we need to go. For function definitions however, we need to
codegen and attach a function body.

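You can ask LLVM directly whether a function is still just a declaration (a
sketch, assuming ``F`` is the ``Function*`` returned by
``PrototypeAST::codegen()``):

.. code-block:: c++

    // A body-less function is a declaration: it has no basic blocks yet.
    assert(F->isDeclaration() && F->empty());
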
.. code-block:: c++

    Function *FunctionAST::codegen() {
      // First, check for an existing function from a previous 'extern' declaration.
      Function *TheFunction = TheModule->getFunction(Proto->getName());
      if (!TheFunction)
        TheFunction = Proto->codegen();

      if (!TheFunction)
        return nullptr;

      if (!TheFunction->empty())
        return (Function*)LogErrorV("Function cannot be redefined.");

For function definitions, we start by searching TheModule's symbol table for an
existing version of this function, in case one has already been created using an
'extern' statement. If Module::getFunction returns null then no previous version
exists, so we'll codegen one from the Prototype. In either case, we want to
assert that the function is empty (i.e. has no body yet) before we start.

.. code-block:: c++

      // Create a new basic block to start insertion into.
      BasicBlock *BB = BasicBlock::Create(*TheContext, "entry", TheFunction);
      Builder->SetInsertPoint(BB);

      // Record the function arguments in the NamedValues map.
      NamedValues.clear();
      for (auto &Arg : TheFunction->args())
        NamedValues[std::string(Arg.getName())] = &Arg;

Now we get to the point where the ``Builder`` is set up. The first line
creates a new `basic block <http://en.wikipedia.org/wiki/Basic_block>`_
(named "entry"), which is inserted into ``TheFunction``. The second line
then tells the builder that new instructions should be inserted into the
end of the new basic block. Basic blocks in LLVM are an important part
of functions that define the `Control Flow
Graph <http://en.wikipedia.org/wiki/Control_flow_graph>`_. Since we
don't have any control flow, our functions will only contain one block
at this point. We'll fix this in `Chapter 5 <LangImpl05.html>`_ :).

Next we add the function arguments to the NamedValues map (after first clearing
it out) so that they're accessible to ``VariableExprAST`` nodes.

.. code-block:: c++

      if (Value *RetVal = Body->codegen()) {
        // Finish off the function.
        Builder->CreateRet(RetVal);

        // Validate the generated code, checking for consistency.
        verifyFunction(*TheFunction);

        return TheFunction;
      }

Once the insertion point has been set up and the NamedValues map populated,
we call the ``codegen()`` method for the root expression of the function. If no
error happens, this emits code to compute the expression into the entry block
and returns the value that was computed. Assuming no error, we then create an
LLVM `ret instruction <../../LangRef.html#ret-instruction>`_, which completes the function.
Once the function is built, we call ``verifyFunction``, which is
provided by LLVM. This function does a variety of consistency checks on
the generated code, to determine if our compiler is doing everything
right. Using this is important: it can catch a lot of bugs. Once the
function is finished and validated, we return it.

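``verifyFunction`` returns ``true`` when it finds a problem, and it can also
print the details for you. A stricter driver could treat that as a fatal
error (a sketch of such a variant, not what the tutorial does):

.. code-block:: c++

    if (verifyFunction(*TheFunction, &errs())) {
      // The generated IR is malformed; report it and drop the bad function.
      TheFunction->eraseFromParent();
      return nullptr;
    }
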
.. code-block:: c++

      // Error reading body, remove function.
      TheFunction->eraseFromParent();
      return nullptr;
    }

The only piece left here is handling of the error case. For simplicity,
we handle this by merely deleting the function we produced with the
``eraseFromParent`` method. This allows the user to redefine a function
that they incorrectly typed in before: if we didn't delete it, it would
live in the symbol table, with a body, preventing future redefinition.

This code does have a bug, though: If the ``FunctionAST::codegen()`` method
finds an existing IR Function, it does not validate its signature against the
definition's own prototype. This means that an earlier 'extern' declaration will
take precedence over the function definition's signature, which can cause
codegen to fail, for instance if the function arguments are named differently.
There are a number of ways to fix this bug; see what you can come up with! Here
is a testcase::

    extern foo(a);  # ok, defines foo.
    def foo(b) b;   # Error: Unknown variable name. (decl using 'a' takes precedence).

Driver Changes and Closing Thoughts
===================================

For now, code generation to LLVM doesn't really get us much, except that
we can look at the pretty IR calls. The sample code inserts calls to
codegen into the "``HandleDefinition``", "``HandleExtern``" etc.
functions, and then dumps out the LLVM IR. This gives a nice way to look
at the LLVM IR for simple functions. For example:

.. code-block:: llvm

    ready> 4+5;
    Read top-level expression:
    define double @__anon_expr() {
    entry:
      ret double 9.000000e+00
    }

Note how the parser turns the top-level expression into anonymous
functions for us. This will be handy when we add `JIT
support <LangImpl04.html#adding-a-jit-compiler>`_ in the next chapter. Also note that the
code is very literally transcribed: no optimizations are being performed
except simple constant folding done by IRBuilder. We will `add
optimizations <LangImpl04.html#trivial-constant-folding>`_ explicitly in the next
chapter.

.. code-block:: llvm

    ready> def foo(a b) a*a + 2*a*b + b*b;
    Read function definition:
    define double @foo(double %a, double %b) {
    entry:
      %multmp = fmul double %a, %a
      %multmp1 = fmul double 2.000000e+00, %a
      %multmp2 = fmul double %multmp1, %b
      %addtmp = fadd double %multmp, %multmp2
      %multmp3 = fmul double %b, %b
      %addtmp4 = fadd double %addtmp, %multmp3
      ret double %addtmp4
    }

This shows some simple arithmetic. Notice the striking similarity to the
LLVM builder calls that we use to create the instructions.

.. code-block:: llvm

    ready> def bar(a) foo(a, 4.0) + bar(31337);
    Read function definition:
    define double @bar(double %a) {
    entry:
      %calltmp = call double @foo(double %a, double 4.000000e+00)
      %calltmp1 = call double @bar(double 3.133700e+04)
      %addtmp = fadd double %calltmp, %calltmp1
      ret double %addtmp
    }

This shows some function calls. Note that this function will take a long
time to execute if you call it. In the future we'll add conditional
control flow to actually make recursion useful :).

.. code-block:: llvm

    ready> extern cos(x);
    Read extern:
    declare double @cos(double)

    ready> cos(1.234);
    Read top-level expression:
    define double @__anon_expr() {
    entry:
      %calltmp = call double @cos(double 1.234000e+00)
      ret double %calltmp
    }

This shows an extern for the libm "cos" function, and a call to it.

.. TODO:: Abandon Pygments' horrible `llvm` lexer. It just totally gives up
   on highlighting this due to the first line.

.. code-block:: llvm

    ready> ^D
    ; ModuleID = 'my cool jit'

    define double @__anon_expr() {
    entry:
      %addtmp = fadd double 4.000000e+00, 5.000000e+00
      ret double %addtmp
    }

    define double @foo(double %a, double %b) {
    entry:
      %multmp = fmul double %a, %a
      %multmp1 = fmul double 2.000000e+00, %a
      %multmp2 = fmul double %multmp1, %b
      %addtmp = fadd double %multmp, %multmp2
      %multmp3 = fmul double %b, %b
      %addtmp4 = fadd double %addtmp, %multmp3
      ret double %addtmp4
    }

    define double @bar(double %a) {
    entry:
      %calltmp = call double @foo(double %a, double 4.000000e+00)
      %calltmp1 = call double @bar(double 3.133700e+04)
      %addtmp = fadd double %calltmp, %calltmp1
      ret double %addtmp
    }

    declare double @cos(double)

    define double @__anon_expr.1() {
    entry:
      %calltmp = call double @cos(double 1.234000e+00)
      ret double %calltmp
    }

When you quit the current demo (by sending an EOF via CTRL+D on Linux
or CTRL+Z and ENTER on Windows), it dumps out the IR for the entire
module generated. Here you can see the big picture with all the
functions referencing each other.

This wraps up the third chapter of the Kaleidoscope tutorial. Up next,
we'll describe how to `add JIT codegen and optimizer
support <LangImpl04.html>`_ to this so we can actually start running
code!

Full Code Listing
=================

Here is the complete code listing for our running example, enhanced with
the LLVM code generator. Because this uses the LLVM libraries, we need
to link them in. To do this, we use the
`llvm-config <https://llvm.org/cmds/llvm-config.html>`_ tool to inform
our makefile/command line about which options to use:

.. code-block:: bash

    # Compile
    clang++ -g -O3 toy.cpp `llvm-config --cxxflags --ldflags --system-libs --libs core` -o toy
    # Run
    ./toy

Here is the code:

.. literalinclude:: ../../../examples/Kaleidoscope/Chapter3/toy.cpp
   :language: c++

`Next: Adding JIT and Optimizer Support <LangImpl04.html>`_