Install instructions for Dungeon Crawl Stone Soup (DCSS)
--------------------------------------------------------
(Last updated on 20091025 for Dungeon Crawl Stone Soup 0.6.0.)

This file describes how to compile a runtime executable of DCSS from the
source code. If you're trying to compile Crawl yourself, skip ahead to the
next section, "Building Dungeon Crawl Stone Soup".

If, however, you're having trouble getting a precompiled binary to run:

1) Check whether you've downloaded and extracted the correct version.

   Platform       Tiles?   Download package
   --------       ------   ----------------
   Windows        yes      stone_soup-VERSION-tiles-win32.zip
   Windows        no       stone_soup-VERSION-win32.zip
   Mac OS X       yes      stone_soup-VERSION-tiles-osx.zip
                           or stone_soup-VERSION-tiles-osx-app.dmg
   Mac OS X       no       stone_soup-VERSION-osx.zip
                           or stone_soup-VERSION-osx-app.dmg
   DOS            no       stone_soup-VERSION-dos.zip
   Source code    yes      stone_soup-VERSION-src.zip
                           or stone_soup-VERSION-src.tar.bz2

2) Try removing/renaming your saves/ directory in case older saves aren't
   recognized.

If you still can't get Crawl to run, you can ask for further help on
rec.games.roguelike.misc. Please try to be as detailed as possible about
any error messages you're getting.

The rest of the file deals with compiling from the source.

Building Dungeon Crawl Stone Soup
---------------------------------

Crawl Stone Soup is known to compile successfully on the following
platforms:

- Any Unix with a recent gcc (and g++), GNU make and libncurses, including
  Linux and Mac OS X. "Recent" is defined as gcc 4.1 or newer.
- Microsoft Windows NT/2000/XP.

The only officially supported compiler is gcc (available on almost all
Unixes, as djgpp for DOS, and as MinGW for Windows).

On other platforms, your mileage may vary, but you should be able to build
Crawl on pretty much any operating system with a modern C++ compiler (full
support for the standard C++ library, in particular <string>, the collection
classes and <algorithm>, is necessary) and the curses library.

Windows 9X and ME are no longer supported, but you can probably build for
them with relatively minor changes. We also used to support DOS in the past
and the remnants of DOS code are still present, but getting them to work may
require a substantial effort. If you have any success, please let us know!

Obtaining the source code
-------------------------

If you don't already have the source code downloaded (in the form of a .zip
or .tar.bz2 file), you can obtain the latest source code from Git. MinGW
users can obtain Git by installing msysgit (described in the MinGW section).
Linux users can just install the 'git' or 'git-core' package with whatever
package manager they use. Note that there used to be another package called
'git' (now 'gnuit') going around which stands for 'GNU interactive tools';
this is not what you want.

Once you have Git installed, you just need to do:

  git clone git://crawl-ref.git.sourceforge.net/gitroot/crawl-ref/crawl-ref

Then, to get the contributed libraries, enter the newly cloned repository
via 'cd crawl-ref' and type:

  git submodule update --init

This should get you all you need to build Crawl from source.

Crawl can be built with some optional features:

* Unicode characters for the map (Unix only).

Crawl Stone Soup also uses a level compiler to compile special level
definitions; to make changes to the level compiler, you'll need the flex and
bison/byacc tools (other lex/yacc implementations may also work). More
details are given below.

Sounds must be enabled by editing AppHdr.h (uncomment SOUND_PLAY_COMMAND on
Unixes or WINMM_PLAY_SOUNDS on Windows).

Regular expressions require libpcre on non-Unix operating systems. On
Unixes, the standard POSIX regular expression support is adequate for
Crawl's needs.

Stone Soup 0.6 includes Lua 5.1.4 in its source tree. Crawl uses Lua for
dungeon generation. In addition, Crawl has a (rudimentary) Lua interface
for users to run scripts which can do things such as conditionalise parts of
the .crawlrc/init.txt. Such user Lua scripts are enabled by default, but
can be turned off by appending NO_LUA_BINDINGS=y to the make invocation.
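
For example, a console build with the user Lua bindings disabled would use
the flag described above appended to the usual invocation (a sketch; the
flag name is taken from the text above):

```shell
make NO_LUA_BINDINGS=y
```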

Unicode support needs libncursesw and its header files; these are usually
available in your operating system's package-management system. Unicode is
not supported on Windows or DOS. Some systems, such as Mac OS X, may have
Unicode support available in libncurses itself (i.e., without a separate
libncursesw library).

Building on Unix (Linux, *BSD, Solaris, etc.)
---------------------------------------------

To install or not to install:

If only one user on the system (you) is going to be playing Crawl, you do
not need to use "make install". A simple "make" will build Crawl in the
source directory, where you can run it as "./crawl".

Prerequisites (Debian):

On Debian-based systems (Ubuntu, Mepis, Xandros, ...), you can get all
dependencies by typing the following as root/sudo:

  apt-get install build-essential libncursesw5-dev bison flex \
    liblua5.1-0-dev libsqlite3-dev libz-dev pkg-config \
    libsdl-image1.2-dev libsdl1.2-dev libfreetype6-dev

(the last four are needed only for tiles builds). This is the complete set;
with it, you don't need the bundled "contribs".
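
Since the note above says the last four packages are only needed for tiles
builds, a console-only build can use a trimmed command (package names as
given above):

```shell
apt-get install build-essential libncursesw5-dev bison flex \
  liblua5.1-0-dev libsqlite3-dev libz-dev
```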

Prerequisites (other systems):

GNU gcc and g++, GNU make, libncurses or libcurses. You need the development
headers for ncurses - they may not be installed by default on some Unixes.

For tile builds, you need FreeDesktop's pkg-config and X11 headers.

flex and bison are optional but highly recommended. Recent versions of byacc
are also fine (edit your makefile appropriately).

If you want to use Unicode, you need to link with a curses library that
understands Unicode characters, usually named libncursesw (the development
headers for libncursesw are usually in /usr/include/ncursesw). You also need
to have a UTF-8 locale installed. You can then build Crawl with support for
Unicode by setting USE_UNICODE on the 'make' command line.

Building:

* cd to the source directory (you can safely ignore the dolinks.sh and
  domake.sh scripts).

* Most users can simply type 'make' without any extra flags, and get a
  working build as a result. If just typing 'make' works for you, then you
  shouldn't need to read any further. BSD and Solaris users may have to use
  'gmake' instead of 'make'.

* If you want a graphical (tiled) build, then you should add 'TILES=y' to
  the 'make' command-line, like so:

    make TILES=y

  Note that the graphical build requires that you have development libraries
  installed for SDL, SDL_image, libpng, zlib, and freetype. If your system
  doesn't have these installed, you can usually get them via your package
  manager (apt-get, emerge, yum, etc).

  If you prefer, you can instead go to the source/contrib directory and type
  'make', and the required libraries will be built for you.

* If you want to install Crawl system-wide rather than play from the build
  directory, you need to specify a directory to hold the save and data
  files. Specifying a prefix of /usr or /usr/local will default SAVEDIR to
  "~/.crawl" and DATADIR to share/crawl (relative to the prefix). SAVEDIR
  must be writable and owned by the user running crawl, so except for
  special cases it should be inside "~" (home directory).

* If you're installing Crawl for multiple users, run 'make install' as root.
  Crawl will be copied into the directory specified by 'prefix' (see above).
  The data directory will be created if necessary, and the level layout,
  tile and help files will be copied there.

* If you do not want players to be able to script Crawl with Lua, add
  'NO_LUA_BINDINGS=y' to the 'make' command-line.
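
Putting the Unix steps above together, a typical session might look like
this (a sketch only; the TILES=y flag and the /usr/local prefix are
illustrative choices, not requirements):

```shell
cd crawl-ref/source
make TILES=y          # or just 'make' for a console build
./crawl               # play straight from the build directory
# optionally, as root, for a system-wide install:
make prefix=/usr/local install
```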

Building on Mac OS X
--------------------

For non-graphical builds, you can use the Unix build process described
above, or you can use Xcode, as described below.

For graphical builds, we do not support the use of the Unix build process;
use Xcode instead.

* Crawl is officially built and tested under Xcode 3.2 on OS X 10.6.1, but
  it's highly likely that other versions of Xcode will work fine.

* Make sure Xcode is installed. Xcode should be available on the Mac OS X
  install DVD if you haven't already installed it. You can also download
  Xcode from Apple's website (note that their website often has versions of
  Xcode that are newer than the versions distributed on their Mac OS X
  DVDs): http://developer.apple.com/TOOLS/Xcode/

* Open the Xcode project (Crawl.xcodeproj) under the "source" directory.

* Hit Build in Xcode. This should build all the necessary dependencies,
  including libpng, freetype, SDL, and SDL_image, and then finally build
  Crawl itself. The process may take quite a while, so go grab a coke or a
  cup of coffee.

* The default build configuration, Release, will build a ppc/i386 Universal
  binary suitable for play on all OS X 10.4 or newer systems. The other
  build configurations are intended for development and may not result in a
  binary that can be distributed to others.

* If you do not want users to be able to script Crawl with Lua, you can
  specify NO_LUA_BINDINGS=y when building. See the section on Lua for more
  information.

Building on Windows (MinGW)
---------------------------

NOTE: You cannot build Windows binaries on Windows 9x/ME using MinGW. On
9x/ME, you can use the Cygwin build instructions, or build a binary on a
Windows NT/2k/XP system (the binary will run on 9x), or build a DOS binary.

* To install MinGW, you have two options. You can install via the installer
  provided by the MinGW project (http://www.mingw.org), but this is not
  officially supported by the Crawl team. If you have problems with it, we
  will not help you until you get a supported version of MinGW, which can be
  obtained from the msysgit project. msysgit is a full MinGW setup that
  even includes Git (which happens to be the source code management system
  used by the Crawl team). To get msysgit, be sure to download the
  'netinstall' from here:

    http://code.google.com/p/msysgit/downloads/list

  NOTE: Do NOT get any of the versions that do not have 'netinstall' in the
  filename. The 'netinstall' is the only one used by the Crawl team.

* Start msys by running 'c:\msysgit\msys.bat'. Now you're in a MinGW build
  environment.

* cd to the Crawl source directory. For instance, if you have the crawl
  sources in c:\crawl\source, you would type 'cd /c/crawl/source'.

* Build Crawl by running 'make'. If you want a graphical build, you will
  need to add 'TILES=y' on the 'make' command line.

* When the process finishes, you should be able to run crawl right from the
  source directory by typing './crawl'.

* If you get a message about missing SDL.h even though you do have contribs
  installed, your version of msys may suffer from a bug that has since been
  fixed. Please either update msys, or delete /mingw/bin/sdl-config so it
  won't interfere with the copy shipped with Crawl.
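
The MinGW steps above can be sketched as a single msys session (the source
path is illustrative, matching the example in the text):

```shell
cd /c/crawl/source
make TILES=y   # omit TILES=y for a console build
./crawl
```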

Building on Windows (cygwin)
----------------------------

* Get Cygwin from http://www.cygwin.com/. When installing, ensure that the
  following packages are selected: gcc, g++, make, flex, bison,
  libncurses-devel. If you'd like to build from git, install the git-core
  package. You may also want to install diff, patch, and other such tools.

* Once Cygwin is installed, open a Cygwin bash shell (use the Start menu or
  desktop icon; do not double-click bash.exe in Explorer).

* cd to the Crawl source directory. For instance, if you have the crawl
  sources in c:\crawl\source, you would type 'cd /cygdrive/c/crawl/source'.

* Follow the Linux build instructions to build Crawl.

Building for DOS (djgpp) -- unsupported!
----------------------------------------

* Install djgpp from http://www.delorie.com/djgpp/. Don't forget to include
  C++ support when the Zip picker asks what you want. You may also have
  to download GNU make as a separate package. It's important to follow the
  install instructions carefully, because bad installs can produce rather
  confusing error messages.

* Make sure gxx and make are in your PATH.

* If you want to modify the level compiler, install the djgpp flex, bison
  and m4 packages and set DOYACC := y in makefile.dos.

* cd to the Crawl source directory.

* Build Crawl by running:

    make -f makefile.dos

* When the build is done, crawl.exe should be in the source directory.

Building Tiles versions
-----------------------

* On most platforms, you can simply type:

    make TILES=y

* If you compiled the ASCII binary before, you'll need to run 'make clean'
  before running 'make'.

* All platforms require the same prerequisites listed in the
  platform-specific sections above.

* All platforms additionally require the development versions of the
  following software packages installed:

    * SDL (http://www.libsdl.org/download-1.2.php)
    * SDL_image (http://www.libsdl.org/projects/SDL_image/)
    * libpng (http://www.libpng.org/pub/png/libpng.html)
    * Freetype 2 (http://www.freetype.org/download.html)

  On Linux, these can be installed via a package manager (apt-get, emerge,
  yum, etc).

  On Mac OS X, these will be compiled automatically when you build the Xcode
  project.

  On Windows (MinGW or Cygwin), these will be compiled as needed when you
  run 'make'.

* If you want both ASCII and Tiles binaries you need to compile them
  separately, rename one of them, and copy them into the same Crawl
  directory.

*****************************************************************************

Data files
----------

Crawl looks for several data files when starting up. They include:

* Special level and vault layout (dat/*.des) files.
* Core Lua code (dat/clua/*.lua).
* Descriptions for monsters and game features (dat/descript/*.txt).
* Definitions for monster dialogue and randart names (dat/database/*.txt).

All these files are in the source tree under source/dat.

Crawl will also look for documentation files when players invoke the help
system. These files are available under the docs directory.

Your built Crawl binary must be able to find these files, or it will not
start.

If Crawl is built without an explicit DATA_DIR_PATH (this is the most common
setup), it will search for its data files under the current directory, and
if it can't find them there, one level above the current directory. In
short, it uses these search paths: ., ./dat, ./docs, .., ../dat, ../docs.

*****************************************************************************

The level compiler
------------------

Crawl uses a level compiler to read the level design (.des) files in the
source/dat directory.

If you're using one of the standard makefiles, the steps described in this
section are performed automatically:

The level compiler source is in the source/util directory (levcomp.lpp and
levcomp.ypp). The steps involved in building the level compiler are:

* Run flex on levcomp.lpp to produce the levcomp.lex.cc lexer.
* Run bison on levcomp.ypp to produce the levcomp.tab.cc parser and its
  header.
* Compile the resulting C++ source files and levcomp.cc and link the object
  files into the Crawl executable.
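
Assuming stock flex and bison, the two generator steps above look roughly
like this (a sketch only; the exact flags used by the Crawl makefiles may
differ):

```shell
cd source/util
flex -o levcomp.lex.cc levcomp.lpp
bison -d -o levcomp.tab.cc levcomp.ypp
```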

For convenience on systems that don't have flex/bison, pre-generated
intermediate files are provided under source/prebuilt. The makefiles
provided with the Crawl source distribution will use these pre-generated
files automatically if flex/bison is not available.

*****************************************************************************

Optional Libraries (Lua and PCRE)
---------------------------------

Security on multiuser systems (Unix):

As of 0.8, setuid and setgid installs are no longer supported, thus any
previous concerns are no longer applicable. Lua user scripts are sandboxed
and should be generally safe even on public servers (besides increasing the
attack surface).

As of 0.3, the Lua source is included with Crawl. It will be used if you
don't have Lua headers installed. Note that we don't provide security
support for Lua; thus, if you run a public server or a kiosk, it is
strongly recommended to use the system Lua, which does receive security
updates from whatever distribution you use.

As of 0.6.0, PCRE 7.9 source is included with Crawl. It is enabled by
default. The sources in contrib/pcre are identical to the 7.9 distro except
for the use of a custom-made Makefile instead of the automake system that
was in place previously.

On Unixes, you're better served by the existing POSIX regular expression
support. If you want PCRE, your package management system is again your
best bet. Remember to install the development headers, not just the plain
library.

*****************************************************************************

Unicode
-------

Modern Unixes may support Unicode terminals (particularly xterms). If you
have a terminal that can display Unicode characters, and an ncurses library
that can handle Unicode (libncursesw, and its devel headers), you can build
Crawl to display Unicode in the map: set 'USE_UNICODE=y' when running
'make'.

NOTE: You may have libncursesw, but not have the header files; check that
you have the header files installed as well, or you'll get a lot of errors.
Crawl expects the ncursesw headers to be in /usr/include/ncursesw.

After compiling Crawl with Unicode support, you still need to add the line
"char_set = unicode" to your .crawlrc to tell Crawl to use Unicode. You may
also need to set the locale in your terminal if the encoding does not
default to UTF-8. To check this, run "locale charmap", which should say
"UTF-8". If your encoding is not UTF-8, you can force it to UTF-8 on most
systems by doing "export LC_ALL=en_US.UTF-8" or the equivalent, depending on
your language locale and what shell you're using.
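
The check-then-force sequence above looks like this in practice (en_US.UTF-8
is just one common choice; substitute a UTF-8 locale for your language):

```shell
locale charmap               # prints the terminal encoding, e.g. UTF-8
export LC_ALL=en_US.UTF-8    # force a UTF-8 locale for this shell session
```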

Crawl tries to use en_US.UTF-8 as its default Unicode locale. If you do not
have this locale installed, but do have some other UTF-8 locale, you can
tell Crawl to use your preferred UTF-8 locale by setting
UNICODE_LOCALE=${foo}.UTF-8 on the 'make' command line, where ${foo} is the
name of your preferred locale.

You may not want to embed the locale in Crawl itself, but have Crawl use the
locale as set in the environment LC_* variables. In such cases, you can
omit the UNICODE_LOCALE option from the 'make' command line. Crawl will
then use the locale as set in your environment.
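
For example, to build with Unicode support and a German locale baked in
instead of the default (de_DE is purely illustrative; use your own locale's
name):

```shell
make USE_UNICODE=y UNICODE_LOCALE=de_DE.UTF-8
```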

If you're playing Crawl on a remote machine, the remote Crawl must be built
with Unicode support, the remote system needs to have a UTF-8 locale
installed, *and* your local terminal (where you're running telnet/ssh) needs
to be able to display Unicode.