
WebAssembly Linux Interface (WALI)


This repo prototypes an implementation of the WebAssembly Linux Interface (WALI). For the current range of support, refer here.

Overview

WALI is a complete(-ish) abstraction over Linux for WebAssembly that aims to push lightweight virtualization down to even low-level system applications. WALI takes a layered approach to API design: by virtualizing the Linux interface, it provides the infrastructure on which higher-level APIs can be built, enables seamless build-run-deploy workflows for arbitrary applications, and facilitates WebAssembly-oriented research. We provide a modified C standard library (musl libc) that targets WALI, along with a baseline runtime implementation in WAMR.

Skip the talk, I want to run some WALI apps!

  1. Install dependencies:
  • Ninja
  • Make
  • CMake
  • GCC
  • lld
  • WABT

If using apt, run sudo ./apt-install-deps.sh to install the above dependencies.

  2. Build a WALI runtime following these instructions.

  3. The wasm-apps directory has several popular applications, such as Bash, Lua, and SQLite, with sample scripts/data for each app. As an example, to run sqlite3:

# Increase the stack size if the program runs out of space
./iwasm -v=0 --stack-size=524288 wasm-apps/sqlite/sqlite3.wasm
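
The other bundled apps can be run the same way. As a sketch, assuming the app layout mirrors the sqlite example (the exact .wasm paths below are assumptions and may differ in your checkout):

# Run the prebuilt Bash and Lua apps (paths assumed; adjust to your checkout)
./iwasm -v=0 --stack-size=524288 wasm-apps/bash/bash.wasm
./iwasm -v=0 --stack-size=524288 wasm-apps/lua/lua.wasm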

Building the Entire Toolchain

Before proceeding, make sure all dependencies from the previous section are installed and up to date.

There are four major toolchain components:

  1. WALI runtime
  2. Custom Clang compiler (C -> Wasm-WALI)
  3. C-standard library for WALI
  4. (Optional) AoT Compiler for WAMR (Wasm-WALI -> WAMR AoT)

If compiling WALI applications is not required, only step 1 (the WALI runtime) is needed.

Building WALI runtime

We produce a baseline implementation in WAMR. For details on how these native APIs are implemented in WAMR, refer here.

To build the WAMR-WALI runtime:

git submodule update --init wasm-micro-runtime
make iwasm

An iwasm symlink to the runtime executable should be generated in the root directory.
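
As an optional sanity check, invoking the symlink with no arguments should print the runtime's usage text:

# Print iwasm usage/options to confirm the build
./iwasm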

Building the Wasm-WALI Clang compiler

We use LLVM Clang 16 with compiler-rt builtins for full wasm32 support. To build the LLVM suite:

git submodule update --init llvm-project
make wali-compiler

Future steps use this toolchain. Add the llvm build binary directory (<root-directory>/llvm-project/build/bin) to PATH for convenience.
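
For example, from the repository root:

# Make the freshly built clang and wasm-ld visible to later steps
export PATH="$PWD/llvm-project/build/bin:$PATH"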

Building WALI libc

The wali-musl submodule has detailed information on prerequisites and steps for compiling libc.

To build libc:

git submodule update --init wali-musl
make libc

We currently support 64-bit architectures for x86-64, aarch64, and riscv64. In the future, we will expand to more architectures.
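
To confirm the build succeeded, check that the static libc was produced in the WALI sysroot (the sysroot path below is an assumption; see the wali-musl submodule for the exact location):

# Verify libc.a exists in the WALI sysroot (path assumed)
ls wali-musl/sysroot/lib/libc.a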

(Optional) WAMR AoT Compiler

Refer here for steps to build the AoT compiler.

Once completed, you can create a symlink from the root directory:

ln -sf wasm-micro-runtime/wamr-compiler/wamrc wamrc

Compiling Applications to WALI

Standalones

To compile C to WASM, refer to compile-wali-standalone.sh:

# Compile standalone C file
<path-to-WALI-clang> \
  --target=wasm32-wasi-threads -O3 -pthread \
  `# Sysroot and lib search path` \
  --sysroot=<path-to-wali-sysroot> -L<path-to-wali-sysroot>/lib \
  `# Enable wasm extension features`  \
  -matomics -mbulk-memory -mmutable-globals -msign-ext  \
  `# Linker flags for shared mem + threading` \
  -Wl,--shared-memory -Wl,--export-memory -Wl,--max-memory=67108864 \
  <input-c-file> -o <output-wasm-file>

Since clang/wasm-ld have not yet been modified for the WALI toolchain, we reuse the support enabled by the wasm32-wasi-threads target. This will change once a wasm32-linux target is added for WALI.

To specify compile and link flags independently, refer to compile-wali.sh, which is used for the test suite.
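
As a concrete end-to-end sketch using the same placeholders as the template above (substitute your own clang and sysroot paths):

# Create a trivial C program
cat > hello.c << 'EOF'
#include <stdio.h>

int main(void) {
    printf("Hello from WALI!\n");
    return 0;
}
EOF

# Compile it with the WALI clang
<path-to-WALI-clang> \
  --target=wasm32-wasi-threads -O3 -pthread \
  --sysroot=<path-to-wali-sysroot> -L<path-to-wali-sysroot>/lib \
  -matomics -mbulk-memory -mmutable-globals -msign-ext \
  -Wl,--shared-memory -Wl,--export-memory -Wl,--max-memory=67108864 \
  hello.c -o hello.wasm

# Run it on the WALI runtime built earlier
./iwasm hello.wasm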

Building the Test Suite

make tests

WALI executables are placed in tests/wasm; the corresponding native ELF binaries in tests/elf can be used to compare against the WASM output.
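
For instance, a test can be run both ways and its output compared (the test name is a placeholder, and exact file names may differ):

# Compare WASM output against the native ELF build of the same test
./iwasm tests/wasm/<test-name>.wasm > wasm.out
tests/elf/<test-name> > elf.out
diff wasm.out elf.out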

WASM Bytecode -> AoT Compilation

Use the WAMR compiler wamrc with the --enable-multi-thread flag to generate threaded AoT code.
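
For example, assuming the wamrc symlink created earlier:

# AoT-compile a threaded WASM module
./wamrc --enable-multi-thread -o wasm-apps/sqlite/sqlite3.aot wasm-apps/sqlite/sqlite3.wasm

# The resulting .aot file runs on the same iwasm runtime
./iwasm -v=0 --stack-size=524288 wasm-apps/sqlite/sqlite3.aot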

Running WALI-WASM code

Use any WebAssembly runtime that implements WALI to execute the WASM code generated above.

If you built the baseline WAMR implementation from the Makefile, you can use ./iwasm <path-to-wasm-file> to execute the code.

The wasm-apps directory has several popular prebuilt binaries to run. You may also run the test suite binaries detailed here.

Miscellaneous

Run WASM code like an ELF binary!

Most Linux distros allow registering miscellaneous binary formats (binfmt_misc). For WASM binaries, the OS must know which program to invoke to run the WASM file.

  1. Save all current environment variables to a file
env &> ~/.walienv
  2. Create a wrapper bash script around the runtime invocation, as below. Parameters to iwasm can be configured based on preference:
#!/bin/bash
# /usr/bin/iwasm-wrapper - Wrapper for running WASM programs

exec <absolute-path-to-iwasm> -v=0 --stack-size=524288 --max-threads=30 --env-file=<absolute-path-to-envfile> "$@"
  3. Register WASM as a misc format, using the script from step 2 as the interpreter:
cd misc
sudo ./binfmt_register.sh

NOTE: The above registration is erased on reboot. For a more permanent setup using the systemd binfmt service:

sudo cp misc/iwali.conf /etc/binfmt.d/
sudo systemctl restart systemd-binfmt
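
Once registered, a WALI WASM binary can be executed directly like an ELF executable. For example, using the hello.wasm from the compilation sketch above (or any other WALI binary):

# The kernel now hands WASM files to the iwasm wrapper
chmod +x hello.wasm
./hello.wasm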

More information about miscellaneous binary formats and troubleshooting can be found here.

Resources

Syscall Information Table

This paper (https://cseweb.ucsd.edu/~dstefan/pubs/johnson:2022:wave.pdf) and its related work section, especially the bit labeled "Modeling and verifying system interfaces"
