${HEADER}

For the moment, TESLA requires a bit of effort to build: it has a very heavyweight dependency (recent LLVM/Clang). The steps required to build are:

  1. acquire prerequisites
  2. build (a very recent version of) LLVM
  3. build TESLA
  4. install TESLA (optional)

Prerequisites

Here is a table of dependencies and the names of their ports/packages on various platforms:

Name              FreeBSD       Mac OS X                          Linux
                                HomeBrew      MacPorts            Ubuntu             Fedora
C++ compiler      base system   XCode (with Command Line Tools)   clang              gcc-c++
LLVM              llvm-devel    build from source
Clang             clang-devel   build from source
libexecinfo       libexecinfo   included with libc
Git               git           git           git-core            git
Subversion        subversion
Ninja             devel/ninja   ninja                             ninja-build        from source
CMake             cmake         cmake --enable-ninja              cmake
Protocol Buffers  protobuf      protobuf-cpp                      libprotobuf-dev    from source
                                                                  protobuf-compiler
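
For example, on Ubuntu the packaged dependencies from the table can be installed in one step; the package names below are taken straight from the table, though exact names can vary between releases (LLVM and Clang are still built from source, as described next):

$ sudo apt-get install clang git subversion ninja-build cmake \
      libprotobuf-dev protobuf-compiler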

Build a recent LLVM/Clang

You need to download and build a very recent version of LLVM/Clang from source unless you're using FreeBSD, which has a sufficiently recent version available from ports (see prerequisites, above).

$ cd somewhere/to/stash/1GB/of/LLVM
$ svn co http://llvm.org/svn/llvm-project/llvm/trunk llvm
[...]
 U   llvm
Checked out revision 175226.
$ cd llvm/tools
$ svn co http://llvm.org/svn/llvm-project/cfe/trunk clang
[...]
A    clang/LICENSE.TXT
 U   clang
Checked out revision 175226.
$ cd ../..
$ mkdir build
$ cd build
$ cmake -G Ninja -D CMAKE_C_COMPILER=clang -D CMAKE_CXX_COMPILER=clang++ ../llvm   # or gcc/g++
-- The C compiler identification is [...]
-- Check for working C compiler using: Ninja
-- Check for working C compiler using: Ninja -- works
[...]
-- Generating done
-- Build files have been written to: /home/jonathan/LLVM/build
$ ninja
[1932/1932] Linking CXX executable bin/c-index-test

Build TESLA

Once you have LLVM, make sure that it's at the front of your PATH:

$ export PATH=/path/to/LLVM/build/bin:$PATH
$ llvm-config --libdir   # test that llvm-config works
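
It can also be worth checking that the shell now picks up the freshly built toolchain rather than a system compiler; for example:

$ which clang      # should print a path under /path/to/LLVM/build/bin
$ clang --version  # should report the version you just built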

Next, we download and configure TESLA:

$ git clone https://github.com/CTSRD-TESLA/TESLA.git tesla
Cloning into 'TESLA'...
remote: Counting objects: 3138, done.
[...]
$ cd tesla
$ mkdir build
$ cd build
$ cmake -G Ninja -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ ..
-- The C compiler identification is Clang 3.3.0
-- Check for working C compiler using: Ninja
-- Check for working C compiler using: Ninja -- works
-- Detecting C compiler ABI info
[...]
-- Found PROTOBUF: /usr/local/lib/libprotobuf.so
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jonathan/TESLA/build

If the CMake command fails because of libc++ link failures or an inability to find LLVM-Config or AddLLVM, you may need to run one of:

$ cmake -D USE_LIBCXX=false .                                   # if you have libc++ but didn't link LLVM against it
$ cmake -D CMAKE_MODULE_PATH=/<LLVM prefix>/share/llvm/cmake .  # if CMake can't find LLVM-Config or AddLLVM
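
If you don't know the right value for the /<LLVM prefix> placeholder, llvm-config (which should still be on your PATH from the previous step) can report it; the CMake modules then typically live under share/llvm/cmake beneath that prefix:

$ llvm-config --prefix   # prints the prefix to substitute for /<LLVM prefix> above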

Then we can build TESLA and test it:

$ ninja
[50/50] Linking CXX executable tesla/tools/read/tesla-read
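
As a quick smoke test, the freshly linked tools can be run straight from the build tree; the path below is taken from the link step above, and the assumption that tesla-read accepts the usual LLVM-style -help option matches the tesla graph output shown later:

$ ./tesla/tools/read/tesla-read -help   # run from the build directory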

After that, you'll want to be able to actually run TESLA!

Install TESLA (optional)

TESLA can be run in-place from the build directory, but you may prefer to install it on your PATH:

$ cd build
$ cmake -D CMAKE_INSTALL_PREFIX=/some/sensible/path .   # or else things go in /usr/local
$ ninja install
[1/1] Install the project...
-- Install configuration: "Debug"
-- Up-to-date: /some/sensible/path/lib/libtesla.so
-- Installing: /some/sensible/path/bin/tesla-analyse
-- Installing: /some/sensible/path/bin/tesla-instrument
-- Installing: /some/sensible/path/bin/tesla-graph
-- Installing: /some/sensible/path/bin/llvm-triple
-- Installing: /some/sensible/path/bin/tesla-read
-- Installing: /some/sensible/path/bin/tesla
$ export PATH=/some/sensible/path/bin:$PATH
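
To make that PATH change stick across new shells, the export line can be added to your shell's startup file; which file to use depends on your shell, and ~/.profile here is only one common choice:

$ echo 'export PATH=/some/sensible/path/bin:$PATH' >> ~/.profile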

Having either put the build directory on your PATH or installed TESLA, you should now be able to run TESLA commands:

$ tesla graph -help
USAGE: tesla-graph [options] <input file>

OPTIONS:
  automata determinism:
    -u          - unlinked NFA
    -l          - linked NFA
    -d          - DFA
  -help         - Display available options (-help-hidden for more)
  -o=<string>   - <output file>
  -version      - Display the version of this program
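
As a purely illustrative example of combining those options (the input and output file names here are placeholders, not taken from this page):

$ tesla graph -d -o=example.out example.manifest   # placeholder file names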
${FOOTER}