lurk-lab / yatima-lang-alpha

A programming language for the decentralized web

License: MIT License

Rust 75.08% Nix 24.73% HTML 0.07% JavaScript 0.12% Shell 0.01%

yatima-lang-alpha's Introduction

Yatima: A programming language for the decentralized web

built with nix

NOTE: This repository is no longer maintained. Development has moved to https://github.com/yatima-inc/yatima, a dependently typed and content-addressed compiler from the Lean theorem prover to the Lurk zkSNARK language. Further information can be found in the Yatima Wiki.


In one sense, the Truth Mines were just another indexscape. Hundreds of thousands of specialized selections of the library's contents were accessible in similar ways--and Yatima had climbed the Evolutionary Tree, hopscotched the Periodic Table, walked the avenue-like Timelines for the histories of fleshers, gleisners, and citizens. Half a megatau before, ve'd swum through the Eukaryotic Cell; every protein, every nucleotide, every carbohydrate drifting through the cytoplasm had broadcast gestalt tags with references to everything the library had to say about the molecule in question.

In the Truth Mines, though, the tags weren't just references; they included complete statements of the particular definitions, axioms, or theorems the objects represented. The Mines were self-contained: every mathematical result that fleshers and their descendants had ever proven was on display in its entirety. The library's exegesis was helpful, but the truths themselves were all here.

Diaspora, Greg Egan


Yatima is a pure functional programming language implemented in Rust with the following features:

  • Content-addressing powers reproducible builds and peer-to-peer package management. A Yatima content-address represents an immutable program and all its dependencies. That means if someone shares an address with you, you can perfectly replicate their computation (and in principle even their computing environment!). Since the program is immutable, the way it runs the first time is the way it runs every time (see the sketch after this list).
  • First-class types. These let you, the programmer, tell the compiler what you intend your program to do. Then, like a helpful robot assistant, the compiler checks that what you're actually doing matches those stated intentions. Type-driven programming lets the compiler act as your "correctness exocortex", i.e. a cognitive augmentation that helps you catch your mistakes.
  • Linear, affine and erased types give you fine-grained control over resource usage during execution. Substructural types give you the memory-safety benefits of a high-level language while still letting you work "close to the metal" when you want to.
  • Type-safe dependent metaprogramming gives Yatima the flexibility and extensibility of a dynamically typed language without sacrificing the safety of static typing.
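
To make the content-addressing idea concrete, here is a rough, hedged sketch in Rust using the cid and multihash crates from the same IPLD ecosystem the implementation depends on (via libipld); the codec and hash chosen here are illustrative assumptions, not necessarily what Yatima actually uses, but the principle is the same: the address is a pure function of the bytes, so identical programs always get identical addresses.

use cid::Cid;
use multihash::{Code, MultihashDigest};

// Derive a content address from raw bytes: same bytes in, same address out.
fn content_address(bytes: &[u8]) -> Cid {
  let hash = Code::Sha2_256.digest(bytes);
  // 0x71 is the multicodec code for dag-cbor, chosen here purely as an example.
  Cid::new_v1(0x71, hash)
}

fn main() {
  let a = content_address(b"def id (A: Type) (x: A): A = x");
  let b = content_address(b"def id (A: Type) (x: A): A = x");
  assert_eq!(a, b); // identical source, identical address
  println!("{}", a);
}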

Examples

Algebraic datatypes (ADTs):

type Maybe (A: Type) {
  None,
  Some A,
}

type List (A: Type) {
  Nil,
  Cons A (List A),
}

def List.head (0 A: Type) (a: List A): Maybe A
  = (case a) (λ _ => Maybe A) (Maybe.None A) (λ x _ => Maybe.Some A x)

Generalized algebraic datatypes (GADTs):

type Expr: ∀ Type -> Type {
  N Nat: Expr Nat,
  B Bool: Expr Bool,
  Add (Expr Nat) (Expr Nat): Expr Nat,
  Mut (Expr Nat) (Expr Nat): Expr Nat,
  Eql (Expr Nat) (Expr Nat): Expr Bool,
}

def Expr.checks : Expr Bool = Expr.Eql (Expr.N 1) (Expr.N 2)

Dependent types and proofs:

type Vector (A: Type): ∀ (ω k: Natural) -> Type {
   Nil: Vector A Natural.Z,
   Cons (0 k: Natural) (x: A) (xs: Vector A k): Vector A (Natural.S k),
}

def Vector.head (0 A: Type) (k: Natural) (a : Vector A (Natural.S k)): A
  = ((case a) (λ k' self => ∀ (Equal Natural (Natural.S k) k') -> A)
    (λ e => Empty.absurd A (Natural.Z_isnt_S k e))
    (λ k x xs e => x))
    (Equal.Refl Natural (Natural.S k))

For more examples of Yatima code, please refer to the introit standard library: https://github.com/yatima-inc/introit

Implementation

Come chat with us on Matrix: #yatima:matrix.org or on the Yatima subreddit

Build Instructions:

Clone this repository and cd into it:

git clone git@github.com:yatima-inc/yatima.git
...
cd yatima

Using binary cache (optional):

To speed up builds, use our binary cache from Cachix. Install cachix and run:

cachix use yatima

With Nix flakes (default):

This assumes you have activated flakes for your Nix installation; otherwise, see here.

# Activate shell environment
direnv allow
# Run standalone
nix run
# Build
nix build
# Start dev shell. Handled automatically by direnv
nix develop
# Install into your environment
nix profile install

Compiling to WASM

nix-shell
cd web

Then run the following command to install required dependencies:

npm install

Afterwards, the experimental web version can be hosted with:

npm start

With cargo

Yatima requires nightly Rust:

rustup default nightly

To build yatima:

cargo build

To run the test-suite:

cargo test --all

To install the yatima binary:

cargo install --path cli

Usage Instructions:

Parse a .ya file (like from https://github.com/yatima-inc/introit) with:

λ yatima parse bool.ya 
Package parsed: bafy2bzacedl5jeqjqvvykquxjy53xey2l2hvcye2bi2omddjdwjbfqkpagksi
...

Typecheck with:

λ yatima check bool.ya
Checking package bool at bafy2bzacedl5jeqjqvvykquxjy53xey2l2hvcye2bi2omddjdwjbfqkpagksi
Checking definitions:
✓ Bool: Type
✓ Bool.True: Bool
✓ Bool.False: Bool
✓ Bool.eql: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.lte: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.lth: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.gte: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.gth: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.and: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.or: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.xor: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.not: ∀ (x: Bool) -> Bool
✓ Bool.neq: ∀ (x: Bool) (y: Bool) -> Bool
✓ Bool.if: ∀ (A: Type) (bool: Bool) (t: A) (f: A) -> A

Run the main expression in a Yatima package with:

yatima run HelloWorld.ya

Enter the interactive Yatima REPL with:

yatima repl

Motivation

We're still in the early days of the Computing Revolution. The first general-purpose digital computers were only switched on about 75 years ago. The living memory of your parents and grandparents extends into the past before computers. These machines are shockingly new, and as a species we really have no idea what they're for yet. We're in the middle of an epochal transformation whose nearest precedent is the invention of writing. There are a lot of prognostications of what that means for our future; lots of different, and sometimes contradictory, visions of how computing is going to continue to shape our minds, our bodies, and our relationships with one another.

Yatima, as a project, has an opinionated view of that future. We think computing should belong to individual users rather than corporations or states. A programming language is an empowering medium of individual expression, where the user encounters, and extends their mind through, a computing machine. We believe "Programmer" shouldn't be a job description, anymore than "scribe" is a job description in a world with near-universal literacy. Computing belongs to everyone, and computer programming should therefore be maximally accessible to everyone.

Currently, it's not: there are about 5 billion internet users worldwide, but only an estimated 25 million software developers. That's a "Programming Literacy rate" of less than 1%. Furthermore, that population is not demographically representative: it skews heavily toward men, the Global North, and those from privileged socioeconomic or ethnic backgrounds. This is a disgrace. It is as if we lived in some absurd dystopia where only people with green eyes play music.

A new programming language isn't going to be a panacea that solves this problem on its own, but there are some ways in which a programming language can help:

  1. Build a simple, but powerful programming language. Yatima's core logic is under 500 lines of code, but is incredibly expressive in its type system, runtime and syntax. We want to reduce the language's conceptual overhead, without hindering the language learner's future growth and power.

  2. Make explicit in the language the connection between computing and mathematics. These two seemingly separate fields are, in essence, the same: all proofs are programs, and all programs are proofs. A student doing math homework is programming, even if they don't conceptualize it as such.

    Many people dislike math due to the tedium of manual computation and the unclear relevance of the results. And many people dislike programming because the concrete mechanics often seem arbitrary and frustrating. These are complementary complaints: math is more fun when you have a computer to take care of the detail-work, and computing is much easier when you have a clear notion of the theory behind what you're doing.

  3. Be portable in execution. Run locally, in the browser, on mobile, in a distributed process. People shouldn't have to worry about the details of where they want to do something, only what they want to do.

  4. Be portable in semantics. Pure semantics and reproducible builds let people focus on the actual content of their programs rather than the scut-work of configuring infrastructure.

  5. Integrate with decentralized technologies to remove, as much as possible, social barriers and frictions. Centralized services like most modern package managers raise the question "Who controls the package server?" The famous left-pad incident is commonly presented as a build-system issue (which it absolutely is), but what is less frequently discussed is that the incident was precipitated by the npm administrators transferring ownership of a package from an individual developer to a large company without the developer's consent.

  6. Have a clear code of conduct to combat the endemic toxicity of contemporary programming culture. Some might find this controversial, but it shouldn't be. Computing is a social and cultural project as much as it is a technical one. Cultures which perpetuate cycles of trauma are less successful in the long run than ones which do not.

The future we want to build is one where billions of people use, understand and love their mathematical computing machines, as natural extensions of themselves. A future where users have autonomy and privacy over their own systems and their own data. A future where reliable, type-checked, formally-verified software is the norm, so you can rely on software engineering with the same quotidian confidence you have for civil engineering whenever you drive your car over a bridge.

Thank you to our Supporters!

yatima-lang-alpha's People

Contributors

anderssorby, atanmarko, brightly-salty, dlight, gabriel-barrett, jeremyschlatter, johnchandlerburnham, neauoire, nothingnesses, samestep, samuelburnham


yatima-lang-alpha's Issues

npm start fails with Error: Cannot find module 'webpack-cli/bin/config-yargs'

Hello, I'm on Arch Linux and I followed the nix + flakes setup. When I run npm start it fails with

[nix-shell:/proj/y/yatima/web]$ npm start

> [email protected] start
> webpack-dev-server

node:internal/modules/cjs/loader:936
  throw err;
  ^

Error: Cannot find module 'webpack-cli/bin/config-yargs'
Require stack:
- /proj/y/yatima/web/node_modules/webpack-dev-server/bin/webpack-dev-server.js
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:933:15)
    at Function.Module._load (node:internal/modules/cjs/loader:778:27)
    at Module.require (node:internal/modules/cjs/loader:1005:19)
    at require (node:internal/modules/cjs/helpers:94:18)
    at Object.<anonymous> (/proj/y/yatima/web/node_modules/webpack-dev-server/bin/webpack-dev-server.js:65:1)
    at Module._compile (node:internal/modules/cjs/loader:1101:14)
    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1153:10)
    at Module.load (node:internal/modules/cjs/loader:981:32)
    at Function.Module._load (node:internal/modules/cjs/loader:822:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:79:12) {
  code: 'MODULE_NOT_FOUND',
  requireStack: [
    '/proj/y/yatima/web/node_modules/webpack-dev-server/bin/webpack-dev-server.js'
  ]
}

At first I thought it was because I have webpack 5 installed in my system and the project is configured to use webpack 3. But changing the command in package.json to webpack serve (the webpack 5 command) gives yet another error message

[nix-shell:/proj/y/yatima/web]$ npm start

> [email protected] start
> webpack serve

[webpack-cli] Failed to load '/proj/y/yatima/web/webpack.config.js' config
[webpack-cli] Invalid options object. Copy Plugin has been initialized using an options object that does not match the API schema.
 - options[0] should be an object:
   object { patterns, options? }

Also, installing webpack 3 on Arch doesn't help; it gives the first error.

Integrate with Substrate by making yatima_core no_std (using the sp_std lib)

Since Yatima's core typechecking and evaluation via the λ-DAG machine only require Rust's core, we should be able to refactor yatima_core to build with only sp_std. The dependencies to address:

  • libipld (this is probably the most work, requiring forking our own sp_std version)
  • nom
  • im (im is unmaintained, we should remove this dep bodil/im-rs#185)
  • num-bigint
  • ropey
  • base-x (we should inline this)
  • petgraph (should move this out of yatima_core, tbh)

We will also need to switch over to using https://docs.rs/sp-std/3.0.0/sp_std/collections/index.html everywhere for Maps, Sets, etc.
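
As a rough sketch of that switch-over, assuming the sp-std module layout in the docs linked above, the collections could be pulled in behind aliases so the rest of yatima_core never names std directly (the u64 value type below is just a placeholder):

#![no_std]

use sp_std::collections::btree_map::BTreeMap;
use sp_std::collections::btree_set::BTreeSet;
use sp_std::vec::Vec;

// Aliases keep the std/no_std decision in one place; downstream code would
// use these names instead of std::collections directly.
pub type NameBytes = Vec<u8>;
pub type Defs = BTreeMap<NameBytes, u64>;
pub type Seen = BTreeSet<NameBytes>;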

Position range printing

With

package Bug where

def bad: Type = 1

λ yatima check Bug.ya
Checking package Bug at bafy2bzacedt5r6zbxwanysdh54ppnhcj6udrcysemvrgha5td6c6nazcop2ew
Checking definitions:
✕ bad: Type
                     ▼
3 | def bad: Type = 1
                      ▲
Error: Type Mismatch from 3:17 to 3:18 in bafy2bzaceaux3htprpvupxmdz3o3sbc3vx3p3ifkkf5dcybs7luaenec5ujdy
• Expected: Type
• Detected: #Nat

The column markers are showing 3:18 to 3:19.

Additionally, we should add better context to the ranges by printing surrounding lines, since

package Bug where

def bad: Type 
  = 1

λ yatima check Bug.ya
Checking package Bug at bafy2bzacecmrm7sophvzffvc5nsb6dejmokpcdb54qjgp2muyl4skqe6t22mw
Checking definitions:
✕ bad: Type
         ▼
4 |   = 1
          ▲
Error: Type Mismatch from 4:5 to 4:6 in bafy2bzacebep2mocntr4ljalgor2s2pwy4or3ys2k6y5zify5ebq7hyjl3ewy
• Expected: Type
• Detected: #Nat

is not ideal

JS-IPFS integration in the Web REPL

#72 adds an attempt at integrating JS-IPFS into the web REPL so one can run:

:load bafy2bzaced2f2qsqewwitetx5eqrwa34r2kt7w26ncl6myuy7jbkbaqxswips
# Or even better
:load /ipns/introit.yatima.io

and have the correct defs loaded into the web REPL. For the CLI REPL, importing rust-ipfs as a library may be the simplest solution.

TODO

  • Make ipfs.get(cid) work from inside wasm
  • Make ipfs.get(ipfs-path) work from inside wasm

Literal Test Suite

There are some bugs in the current literal implementation, largely typos such as:

https://github.com/yatima-inc/yatima/blob/ef9108be7e5d2adc282ccf9f429ab0f253320786/yatima_core/src/prim/nat.rs#L158

which implements #Nat.div incorrectly. The literal subsystem is full of a huge amount of boilerplate, so typos like these were kinda inevitable.

So we need a comprehensive test suite that covers every prim::Op, its evaluation, its typechecking, etc., so we can be confident that the implementation is correct.

Essentially, in each module in prim we need something like

  #[test]
  fn test_safe_head() {
    let rope: Rope = Rope::from_str("foo");
    let res = safe_head(rope);
    assert_eq!(res, Some(('f', Rope::from_str("oo"))));
  }

This is a lot of work, but it's not particularly conceptually complicated, so it's a good issue for someone who wants to learn how Yatima's primitives work.
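
As one concrete example of what such a test should pin down, assuming #Nat.div is meant to be ordinary truncating natural division, the expected result can be stated directly against num-bigint; the real test would feed the same operands through the corresponding prim::Op evaluator:

  #[test]
  fn test_nat_div_semantics() {
    use num_bigint::BigUint;
    // Assumed semantics for #Nat.div: truncating natural division.
    let x = BigUint::from(7u64);
    let y = BigUint::from(2u64);
    assert_eq!(x / y, BigUint::from(3u64));
  }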

Suppress `Pos` printing in Debug trait of Term

Currently, printing a term with Debug is very illegible:

Lam(None, Name { inner: "A" }, Lam(None, Name { inner: "k" }, Slf(None, Name {
inner: "Vector.self" }, All(None, None, Name { inner: "P" }, (All(None, Many,
Name { inner: "k" }, (LTy(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 31, from_line: 1, from_column: 30, upto_offset: 35, upto_line: 1,
upto_column: 34 }), Nat), All(None, Many, Name { inner: "#_" }, (App(None,
(App(None, (Rec(None), Var(None, Name { inner: "k" }, 0))), Var(None, Name {
inner: "A" }, 1))), Typ(None))))), All(None, Affi, Name { inner: "Vector.Nil" },
(App(None, (App(None, (Var(None, Name { inner: "P" }, 0), Lit(Some(Position {
input: Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 62, from_line: 1, from_column: 61, upto_offset: 63, upto_line: 1,
upto_column: 62 }), Nat(BigUint { data: [] })))), Dat(None, Lam(None, Name {
inner: "P" }, Lam(None, Name { inner: "Vector.Nil" }, Lam(None, Name { inner:
"Vector.Cons" }, Var(None, Name { inner: "Vector.Nil" }, 2))))))), All(None,
Affi, Name { inner: "Vector.Cons" }, (All(None, Many, Name { inner: "k" },
(LTy(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 74, from_line: 1, from_column: 73, upto_offset: 78, upto_line: 1,
upto_column: 77 }), Nat), All(None, Many, Name { inner: "x" },
(Var(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 84, from_line: 1, from_column: 83, upto_offset: 85, upto_line: 1,
upto_column: 84 }), Name { inner: "A" }, 4), All(None, Many, Name { inner: "xs"
}, (App(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 92, from_line: 1, from_column: 91, upto_offset: 102, upto_line: 1,
upto_column: 101 }), (App(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 92, from_line: 1, from_column: 91, upto_offset: 102, upto_line: 1,
upto_column: 101 }), (Rec(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 92, from_line: 1, from_column: 91, upto_offset: 98, upto_line: 1,
upto_column: 97 })), Var(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 99, from_line: 1, from_column: 98, upto_offset: 100, upto_line: 1,
upto_column: 99 }), Name { inner: "A" }, 5))), Var(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 101, from_line: 1, from_column: 100, upto_offset: 102, upto_line:
1, upto_column: 101 }), Name { inner: "k" }, 1))), App(None, (App(None,
(Var(None, Name { inner: "P" }, 4), App(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 115, from_line: 1, from_column: 114, upto_offset: 125, upto_line:
1, upto_column: 124 }), (Opr(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 115, from_line: 1, from_column: 114, upto_offset: 123, upto_line:
1, upto_column: 122 }), Nat(Suc)), Var(Some(Position { input:
Cid(bafk2bzaced56r2aurokhixoafmfxkczwqjwetlqzaef3ov7q3vwrbgdcr3i6o),
from_offset: 124, from_line: 1, from_column: 123, upto_offset: 125, upto_line:
1, upto_column: 124 }), Name { inner: "k" }, 2))))), Dat(None, Lam(None, Name {
inner: "P" }, Lam(None, Name { inner: "Vector.Nil" }, Lam(None, Name { inner:
"Vector.Cons" }, Lam(None, Name { inner: "k" }, Lam(None, Name { inner: "x" },
Lam(None, Name { inner: "xs" }, App(None, (App(None, (App(None, (Var(None, Name
{ inner: "Vector.Cons" }, 1), Var(None, Name { inner: "xs" }, 3))), Var(None,
Name { inner: "x" }, 4))), Var(None, Name { inner: "k" }, 5)))))))))))))))))),
App(None, (App(None, (Var(None, Name { inner: "P" }, 2), Var(None, Name { inner:
"k" }, 2))), Var(None, Name { inner: "Vector.self" }, 3))))))))))))

The Position information is nearly always useless when we print, and it makes it quite hard to see what the term actually is. We already have a custom PartialEq impl for Term which ignores Pos when comparing term equality. https://github.com/yatima-inc/yatima/blob/8661216b2371a75a78568f48ea696ae0be0da6c0/core/src/term.rs#L38-L70

With custom Debug impls for Term and perhaps Name, the above could be printed as:

Lam("A" , Lam("k" , Slf("Vector.self" , All("P" , (All(Many, "k" , (LTy(Nat),
All(Many, "#_" , (App((App((Rec(..), Var("k" , 0))), Var("A" , 1))),
Typ(..))))), All(Affi, "Vector.Nil" , (App((App((Var("P" , 0), Lit(Nat(BigUint {
data: [] })))), Dat(Lam("P" , Lam("Vector.Nil" , Lam("Vector.Cons" ,
Var("Vector.Nil" , 2))))))), All(Affi, "Vector.Cons" , (All(Many, "k" ,
(LTy(Nat), All(Many, "x" , (Var("A" , 4), All(Many, "xs" , (App((App((Rec(..),
Var("A" , 5))), Var("k" , 1))), App((App((Var("P" , 4), App((Opr(Nat(Suc)),
Var("k" , 2))))), Dat(Lam("P" , Lam("Vector.Nil" , Lam("Vector.Cons" , Lam("k" ,
Lam("x" , Lam("xs" , App((App((App((Var("Vector.Cons" , 1), Var("xs" , 3))),
Var("x" , 4))), Var("k" , 5)))))))))))))))))), App((App((Var("P" , 2), Var("k" ,
2))), Var("Vector.self" , 3))))))))))))

Still a big term, but now much easier to understand
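
A minimal sketch of the idea on a pared-down stand-in for the real Term (the actual enum has many more variants): the hand-written Debug simply skips the Pos field.

use std::fmt;

struct Pos; // stand-in for the real Position data

enum Term {
  Var(Option<Pos>, String, u64),
  Lam(Option<Pos>, String, Box<Term>),
  App(Option<Pos>, Box<Term>, Box<Term>),
}

impl fmt::Debug for Term {
  fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
    // Positions are dropped entirely; only names, indices and subterms print.
    match self {
      Term::Var(_, name, idx) => write!(f, "Var({:?}, {})", name, idx),
      Term::Lam(_, name, bod) => write!(f, "Lam({:?}, {:?})", name, bod),
      Term::App(_, fun, arg) => write!(f, "App({:?}, {:?})", fun, arg),
    }
  }
}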

Replace String with Rc<str> or other data structure for tracking names in parser and checker

Currently we do an egregious amount of cloning of String names, as in Lam/Slf/All/Let binders, Vars and Refs. But names are immutable: once we parse them, we should never need to change them. So a datatype like

struct Name(Rc<str>)

might significantly improve performance here. Doesn't have to be exactly Rc<str>, but that's probably a good place to start investigating.

Also relevant to argumentcomputer/yatima#38 since I believe String isn't in sp_std
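
A minimal sketch of such a Name type, where cloning only bumps a reference count instead of copying the string data (whether the real type should use Arc for sp_std compatibility, or add interning, is an open question):

use std::rc::Rc;

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub struct Name(Rc<str>);

impl Name {
  pub fn new(s: &str) -> Self {
    // Copies the bytes once at construction; every later clone is O(1).
    Name(Rc::from(s))
  }
}

impl core::ops::Deref for Name {
  type Target = str;
  fn deref(&self) -> &str {
    &self.0
  }
}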

Add explicit `dyn` to `sp_im` ConsList library

We should fix these warnings now, before a future Rust update breaks the library:

warning: trait objects without an explicit `dyn` are deprecated
   --> sp_im/src/conslist.rs:399:13
    |
399 |       cmp: &Fn(&A, &A) -> Ordering,
    |             ^^^^^^^^^^^^^^^^^^^^^^ help: use `dyn`: `dyn Fn(&A, &A) -> Ordering`
    |
    = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in the 2021 edition!
    = note: for more information, see issue #80165 <https://github.com/rust-lang/rust/issues/80165>
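
The fix itself is mechanical. An illustrative standalone version of the offending signature with the explicit dyn (the real code lives in sp_im/src/conslist.rs):

use std::cmp::Ordering;

// Before: `cmp: &Fn(&A, &A) -> Ordering`; after: an explicit `dyn` trait object.
fn sort_with<A>(items: &mut Vec<A>, cmp: &dyn Fn(&A, &A) -> Ordering) {
  items.sort_by(|a, b| cmp(a, b));
}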

Rec not printing correct name in some cases

Checking

package Empty where

def Empty : Type =
  @self ∀
  (0 P : ∀ Empty -> Type) ->
  P self

def absurd (A: Type) (x: Empty): A = (case x) (λ _ => A)

If we remove the (λ _ => A) from absurd and try checking:

def absurd (A: Type) (x: Empty): A = (case x)

Then

Checking package Empty at bafy2bzacecoefmntlllb2t6nfcc7yjb3yuck65appexyoy2s2bpf2whrvi6qo
Checking definitions:
✓ Empty: Type
✕ absurd: ∀ (A: Type) (x: Empty) -> A
                                           ▼
8 | def absurd (A: Type) (x: Empty): A = (case x)
                                                 ▲
Error: Type Mismatch from 8:39 to 8:45 in bafy2bzacebdxkcb4pengirlb2eaelsjm6hxsjfg4ucybe2v452ao4fea6jnak
• Context:
  - ω A: Type
  - ω x: Empty
• Expected: A
• Detected: ∀ (0 P: ∀ (_: #^) -> Type) -> P x

The #^ syntax is the internal display format of the Rec constructor and should never appear to the user. When we convert from DAG to Term to generate the CheckError::TypeMismatch:

https://github.com/yatima-inc/yatima/blob/ef9108be7e5d2adc282ccf9f429ab0f253320786/yatima_core/src/check.rs#L210

we pass a re_rec: Bool argument to to_term to indicate that we should not convert the DAGPtr::Var or DAGPtr::Ref nodes back to a Term::Rec. What we want is:

• Detected: ∀ (0 P: ∀ (_: Empty) -> Type) -> P x

where Empty is either a Var or a Ref in the detected Term, so that it prints correctly.

Remove recursion in the DAG functions

The Term<->DAG conversion functions, DAG clone and printing, free_dead_node, upcopy and clean_up are all recursive. These functions can overflow the stack when the DAG is large enough. To solve this, we need to remove the recursion and instead do a BFS or DFS traversal with an explicit stack, as sketched below.
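
For illustration, here is the general shape of that rewrite on a toy tree type (not the real DAG pointer graph, which also needs visited-node bookkeeping): the Vec is the explicit stack, so depth is bounded by the heap rather than the call stack.

struct Node {
  label: String,
  children: Vec<Node>,
}

// Iterative DFS with an explicit stack instead of recursion.
fn visit_all(root: &Node, mut visit: impl FnMut(&Node)) {
  let mut stack: Vec<&Node> = vec![root];
  while let Some(node) = stack.pop() {
    visit(node);
    for child in &node.children {
      stack.push(child);
    }
  }
}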

Problem with parsing Bool.ya

Parsing Bool.ya from the introit repository's master branch, with yatima built from master branch commit 484f198 and rustc 1.52.0-nightly, panics with the following error:

thread 'main' panicked at 'Error parsing file Bool.ya: Parsing Error: ParseError { input: LocatedSpan { offset: 73, line: 5, fragment: "> Type)\n (& true : P (data λ P t f => t))\n (& false : P (data λ P t f => f))\n -> P self\n\ndef true : Bool = data λ P t f => t\ndef false : Bool = data λ P t f => f\n\ndef and (a b: Bool): Bool = (case a) (\_ => Bool) ((case b) (\_ => Bool) true false) false\ndef or (a b: Bool): Bool = (case a) (\_ => Bool) true ((case b) (\_ => Bool) true false)\ndef not (a : Bool): Bool = (case a) (\_ => Bool) false true\ndef xor (a b: Bool): Bool = (case a) (\_ => Bool) (not b) b\n\n\ndef if (A : Type) (bool : Bool) (a1 a2 : A): A\n = (case bool) (\_ => A) a1 a2\n", extra: () }, expected: Some("base 10 digits"), errors: [Nom(TakeTill1)] }', src/parse/package.rs:262:7

Cargo check fails in github CI

With cargo-check.enable = true enabled in default.nix, the following error happens:

cargo-check..............................................................Failed

- hook id: cargo-check
- exit code: 101

    Updating crates.io index
warning: spurious network error (2 tries remaining): [6] Couldn't resolve host name; class=Net (12)
warning: spurious network error (1 tries remaining): [6] Couldn't resolve host name; class=Net (12)
error: failed to get `base-x` as a dependency of package `hashexpr v0.1.0 (/build/src/hashexpr)`

Caused by:
  failed to fetch `https://github.com/rust-lang/crates.io-index`

WASM runtime explorations

To experiment with running yatima as WASM I'm going to try to compile much of the library into a wasm binary.

Some ideas for steps/targets:

  • Create a wasm binary that can parse and run yatima.
  • Support connecting to an external hashspace server.
  • Embed serialized yatima code inside the wasm binary.
  • Create an HTML-based REPL
  • Explore more optimized runtimes.
  • Connect the REPL to IPFS to load packages.
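
A minimal sketch of the first step, assuming wasm-bindgen for the JS boundary; the body below is a placeholder rather than a call into the real yatima parser:

use wasm_bindgen::prelude::*;

// Entry point callable from JavaScript once the crate is built for
// wasm32-unknown-unknown (e.g. via wasm-pack).
#[wasm_bindgen]
pub fn parse_source(src: &str) -> String {
  // Placeholder: report the input size instead of a real parse result.
  format!("received {} bytes of Yatima source", src.len())
}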

Segfault when checking Is.is annotated

package Bug where

def Bool: Type = #Bool

def true: Bool = #Bool.true
def false: Bool = #Bool.false

def Equal (A: Type) (a: A) (b: A) : Type =
  @self ∀
  (0 P    : ∀ (b: A) (Equal A a b) -> Type)
  (& refl : P a (data λ P refl => refl))
  -> P b self

def refl (0 A: Type) (a : A) : Equal A a a = data λ P refl => refl

def Is (a: Bool): Type = Equal Bool true a
def is: Is true       = refl Bool true

def bad: Is true = (is :: Is true)

λ yatima check Bug.ya
Checking package Bug at bafy2bzacec6tnqo6n2naej2lrvwjbff5umsmjj4hfbaoddn7kcor4zhrrvoqq
Checking definitions:
✓ Bool: Type
✓ true: Bool
✓ false: Bool
✓ Equal: ∀ (A: Type) (a: A) (b: A) -> Type
✓ refl: ∀ (0 A: Type) (a: A) -> Equal A a a
✓ Is: ∀ (a: Bool) -> Type
✓ is: Is true
fish: Job 1, 'yatima check Bug.ya' terminated by signal SIGSEGV (Address boundary error)

Valgrind output here: https://gist.github.com/johnchandlerburnham/d3933bedfbed61a47b70a466529c60c6

Immutable sp_std ConsList to improve Parser performance

Recently we removed im::Vector as the context data structure in the parser for the sp_std effort, replacing it with:

https://github.com/yatima-inc/yatima/blob/6ef436b22c9d540fba495e09eb44ec161b46dbd5/core/src/parse/term.rs#L70

(We have to use Rc here rather than mut var because the parser is made up of Rust closures to mimic a parser combinator library.)

This new Ctx has the disadvantage of requiring a full copy whenever we bind a new variable, as in parse_lam, parse_all, etc. I have not benchmarked, but the parser feels significantly slower now, likely because im::Vector shared nodes on writes instead of copying (https://docs.rs/im/10.2.0/im/vector/struct.Vector.html).

While we can't use the im library as is, and porting im::Vector to sp_std is likely complicated, we can probably port the simpler im::ConsList (https://docs.rs/im/10.2.0/im/conslist/struct.ConsList.html) without as much effort since it's essentially just a List of Arc nodes:

pub struct ConsList<A>(Arc<ConsListNode<A>>);

pub enum ConsListNode<A> {
    Cons(usize, Arc<A>, ConsList<A>),
    Nil,
}

and Arc is in sp_std.
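
For illustration, a minimal Rc-backed cons list (the sp_im port would use Arc, as above) where binding a new variable is O(1) because the existing tail is shared rather than copied:

use std::rc::Rc;

#[derive(Clone)]
pub struct Ctx(Rc<CtxNode>);

pub enum CtxNode {
  Cons(String, Ctx),
  Nil,
}

impl Ctx {
  pub fn new() -> Self {
    Ctx(Rc::new(CtxNode::Nil))
  }

  // Push one node; everything underneath is shared with the old context.
  pub fn bind(&self, name: &str) -> Ctx {
    Ctx(Rc::new(CtxNode::Cons(name.to_string(), self.clone())))
  }
}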

yatima_core panics on particular input with unfree variable

I get a panic when I run yatima check map_test.ya, where map_test.ya is:

package map_test
  import map 
  import nat
  import ord
where 

def test_map: Map Nat Nat (Ord Nat) = 
  let O: Ord Nat = Ord.New Nat Nat.compare;
  let empty: Map Nat Nat O = Map.empty Nat Nat O;
  let map1: Map Nat Nat O = Map.insert Nat Nat O 1 2 empty;
  map1

Checking definitions:
thread 'main' panicked at 'Variable not free', core/src/dag.rs:1290:20

Rust panic message on parse error

with

 package Bug where

def bad: Type = x

λ yatima check Bug.ya
Parse Error in Bug.ya:
at line 4:18
4 | def bad: Type = x
                     ^
Reported errors:
- Undefined reference x

thread 'main' panicked at 'explicit panic', src/file/parse.rs:111:9
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Whereas on type errors with def bad: Type = 1

λ yatima check Bug.ya
Checking package Bug at bafy2bzacedt5r6zbxwanysdh54ppnhcj6udrcysemvrgha5td6c6nazcop2ew
Checking definitions:
✕ bad: Type
                     ▼
3 | def bad: Type = 1
                      ▲
Error: Type Mismatch from 3:17 to 3:18 in bafy2bzaceaux3htprpvupxmdz3o3sbc3vx3p3ifkkf5dcybs7luaenec5ujdy
• Expected: Type
• Detected: #Nat

there's no Rust panic warning
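
A minimal sketch of the intended behaviour (the real reporting lives in src/file/parse.rs; this helper is hypothetical): once the human-readable report has been printed, exit with a failure code instead of panicking.

use std::process;

fn exit_with_parse_error(report: &str) -> ! {
  eprintln!("{}", report);
  process::exit(1)
}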

Web REPL bugs and improvements

#72 Fixes some problems like pasting and interaction, but there are still several issues in the web REPL. These probably require structural changes and a better integration with the xterm.js api.

Problems:

  • Backspace does not work properly when not on the end of a line.
  • Delete does not work
  • Multiline edit does not work

Segfault when case matching on lambda-encoding containing primitives

package Test where

def State: Type =
  @self ∀
  (0 P: ∀ State -> Type)
  (& new: ∀ (pos: #Nat) (txt: #Text) 
    -> P (data λ P n => n pos txt)
  )
  -> P self

def new
  (pos: #Nat)
  (txt: #Text)
  : State
  = data λ P n => n pos txt

def ste: State = new 0 "foo"

def main: ∀
  (0 P: ∀ State -> Type)
  (& new: ∀ (pos: #Nat) (txt: #Text)
    -> P (data λ P n => n pos txt)
  )
  -> P ste
  = (case ste)

Checking and running:

λ yatima check Test.ya
Checking package Test at bafy2bzacecyxwnicyh5d2c5f5az4m66472o2i2wvlcrqfzfbhhiscuozofpra
Checking definitions:
✓ State: Type
✓ new: ∀ (pos: #Nat) (txt: #Text) -> State
✓ ste: State
✓ main: ∀ (0 P: ∀ (_: State) -> Type) (& new: ∀ (pos: #Nat) (txt: #Text) -> P (data λ P n => n pos txt)) -> P ste
~/introit jcb/parser*
λ yatima run Test.ya
fish: Job 1, 'yatima run Test.ya' terminated by signal SIGSEGV (Address boundary error)

Relation to the language Kind?

I noticed that some of the contributors here are in close contact with contributors to the language Kind.

Could you maybe elaborate on the differences and reasons for existence of both despite having the same core compiler/evaluator?

Negative index when case matching

In https://github.com/yatima-inc/introit/blob/fd6cf9b0f2c2d50b779b25c609160322a0429a8d/Parser.ya#L104

def State.compare (S E: Type) (x y: State S E): State S E
  = (case x) (λ _ => State S E) (λ x_pos _ _ _ =>
      (case y) (λ _ => State S E) (λ y_pos _ _ _ =>
        (case (Nat.lte x_pos y_pos)) (λ _ => State S E) x y
      )
    )

errors with

thread 'main' panicked at 'Negative index found.', core/src/dag.rs:843:13
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Assess safety of negative De Bruijn offset technique in `type` def elaborator

Copy-pasting a code comment for a different and likely faster technique for parsing the type defs:

https://github.com/yatima-inc/yatima/blob/a6499a3b3a84c911d57ddd6e96480525b8c01e3c/core/src/typedef.rs#L341

Currently we use two separate parsing passes to build separate Variant structs for the datatype type definitions versus the constructors. This is because the different contexts for both cases cause the De Bruijn
indices to differ significantly.

Nevertheless it is possible to use the self.typ_variants to compute the self.cons_variants and to avoid needing a second parsing pass.

for (i, v) in self.typ_variants.iter().enumerate() {
 let offset = -((self.typ_indices.len() + 2 + i) as i64);
 let v_params: Vec<Term> =
   v.params.clone().into_iter().map(|t| t.shift(offset, None)).collect();
 let v_indices: Vec<Term> = v
  .indices
  .clone()
  .into_iter()
  .map(|t| t.shift(offset, Some(v.bind.len() as u64)))
  .collect();
 let v_bind: Vec<(Uses, Name, Term)> = v
  .bind
  .clone()
  .into_iter()
  .enumerate()
  .map(|(i, (u, n, t))| (u, n, t.shift(offset, Some(i as u64))))
  .collect();
}

This technique is likely more performant, but is complex and presents a danger in its use of a potentially large negative offset. While integrating the above code does successfully pass the test-suite and typecheck introit, a subtle bug could easily create negative De Bruijn indices which can crash the language. Even just having an incorrect offset can cause typechecking or evaluation to behave in frustrating and unpredictable ways.


It would be good to learn whether

  • this actually is an optimization relative to parsing twice. And if so, how much of an optimization?
  • the negative offsets are safe or not. Current evidence strongly suggests they are safe, since the code passes the test-suite and typechecks introit, but that doesn't mean there isn't some more exotic typedef that might break it

Fork multibase and nom_locate to disable std default feature in Cargo.toml

In order to add no_std compatibility to the above yatima_core dependencies, the easiest approach is to fork them and change a few lines in their Cargo.toml. It seems like multibase and nom_locate were meant to be used in a no_std environment with cargo build --no-default-features, but this didn't work for me when using them as a dependency.

Relevant to #38.

Overflowing stack on large pure lambda encodings

Against https://github.com/yatima-inc/introit/tree/jcb/rust-version:

~/introit jcb/rust-version* 14s
λ yatima repl
⅄ :load Pure.Nat
Checking package Nat at bafy2bzacecb3cqd7zx6ofrlm3b4qzdqmjb2563mllclrsperda22dl7siib4g
...
⅄ from_prim 0
data λ P z s => z
⅄ from_prim 1
data λ P z s => s (data λ P z s => z)
⅄ from_prim 5
data λ P z s => s (data λ P z s => s (data λ P z s => s (data λ P z s => s (data λ P z s => s (data λ P z s => z)))))
⅄ from_prim 1000

thread 'main' has overflowed its stack
fatal runtime error: stack overflow
fish: Job 1, 'yatima repl' terminated by signal SIGABRT (Abort)

We should not be overflowing like this, since the DAG nodes are heap allocated. So something is blowing the stack, possibly in the printing or in the conversion from the DAG back to Term.

inverse primops

When checking

package Vector
  import Nat (Nat, zero, succ, pred)
  import Nat as Nat
where

def Vector (A: Type) (k: Nat): Type =
  @self ∀
  (0 P : ∀ (k: Nat) (Vector A k) -> Type)
  (& nil : P zero (data λ P n c => n))
  (& cons: ∀
    (0 k: Nat) (x: A) (xs: Vector A k) -> P (succ k) (data λ P n c => c k x xs))
  -> P k self

def nil (0 A: Type) : Vector A zero = data λ P n c => n

def cons (0 A: Type) (0 k: Nat) (x: A) (xs: Vector A k): Vector A (succ k)
  = data λ P n c => c k x xs

def head (0 A : Type) (0 size : Nat) (a : Vector A size): Maybe A
  = (case a) (λ _ _ => Maybe A)
        (none A)
        (λ _ head tail => just A head)

def tail (0 A : Type) (0 size : Nat) (a : Vector A size): Vector A (pred size)
  = (case a) (λ k _ => Vector A (pred k))
        (nil A)
        (λ _ head tail => tail)

We error with:

Checking definitions:
✓ Vector: ∀ (A: Type) (k: Nat) -> Type
✓ nil: ∀ (0 A: Type) -> Vector A zero
✓ cons: ∀ (0 A: Type) (0 k: Nat) (x: A) (xs: Vector A k) -> Vector A (succ k)
✓ head: ∀ (0 A: Type) (0 size: Nat) (a: Vector A size) -> Maybe A
✕ tail: ∀ (0 A: Type) (0 size: Nat) (a: Vector A size) -> Vector A (pred size)
                                ▼
29 |         (λ _ head tail => tail)
                                    ▲
Error: Type Mismatch from 29:27 to 29:31 in bafy2bzacedsu5jz7gw3jciug77qhae2iug64mivapsftzp65itrv3ohwdqofy
• Context:
  - 0 A: Type
  - 0 size: Nat
  - ω a: Vector A size
  - 0 k: Nat
  - ω x: A
  - ω xs: (#^) A k
• Expected: @self ∀ (0 P: ∀ (k: Nat) (_: (#^) A k) -> Type) (& nil: P zero (data λ P n c => n)) (& cons: ∀ (0 k: Nat) (x: A) (xs: (#^) A k) -> P (succ k) (data λ P n c => c k x xs)) -> P (#Nat.pre (#Nat.suc k)) self
• Detected: @self ∀ (0 P: ∀ (k: Nat) (_: (#^) A k) -> Type) (& nil: P zero (data λ P n c => n)) (& cons: ∀ (0 k: Nat) (x: A) (xs: (#^) A k) -> P (succ k) (data λ P n c => c k x xs)) -> P k self

The issue is that Yatima can't figure out that P (#Nat.pre (#Nat.suc k)) self is the same as P k self. We solved this in the Haskell prototype with:

https://github.com/yatima-inc/yatima-haskell-prototype/blob/3fb0d94f4fc2962e5d4c3f550cf4d55bd1bc9231/src/Yatima/Core/Prim.hs#L97

So that needs to get ported over.
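
For illustration, here is the shape of the needed rule on a toy term type (not yatima_core's DAG): rewrite #Nat.pre (#Nat.suc x) back to x so the two sides above compare equal. The real port would follow the Haskell prototype linked above.

#[derive(Clone, Debug, PartialEq)]
enum Prim {
  Var(String),
  Op(&'static str, Vec<Prim>),
}

// Apply the pre/suc inverse rule bottom-up.
fn simplify(t: Prim) -> Prim {
  match t {
    Prim::Op("Nat.pre", args) if args.len() == 1 => {
      match simplify(args[0].clone()) {
        Prim::Op("Nat.suc", inner) if inner.len() == 1 => inner[0].clone(),
        arg => Prim::Op("Nat.pre", vec![arg]),
      }
    }
    Prim::Op(name, args) => Prim::Op(name, args.into_iter().map(simplify).collect()),
    var => var,
  }
}

fn main() {
  // #Nat.pre (#Nat.suc k)  ~~>  k
  let t = Prim::Op("Nat.pre", vec![Prim::Op("Nat.suc", vec![Prim::Var("k".into())])]);
  assert_eq!(simplify(t), Prim::Var("k".into()));
}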

Nix build fails on macOS

On macOS Big Sur (11.4), building with and without flakes (nix run), I get the following error:

    = note: ld: file not found: /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
            clang-7: error: linker command failed with exit code 1 (use -v to see invocation)

Using niv update nixpkgs -b nixpkgs-unstable, things seem to work alright. I don't know exactly what the problem is, but here are a couple of related things I've found:

I'll check nixpkgs 21.05 and see if it works.
