
sneeuwballen / zipperposition


An automatic theorem prover in OCaml for typed higher-order logic with equality and datatypes, based on superposition+rewriting; and Logtk, a supporting library for manipulating terms, formulas, clauses, etc.

Home Page: https://sneeuwballen.github.io/zipperposition/

License: BSD 2-Clause "Simplified" License

Languages: OCaml 94.39%, Shell 3.64%, Python 0.99%, Prolog 0.52%, Vim Script 0.23%, Makefile 0.09%, C 0.06%, Common Lisp 0.05%, Dockerfile 0.02%, OpenEdge ABL 0.01%
Topics: cnf, computer-science-algorithms, experimental, induction, logic, ocaml, polymorphism, prototype, prover, rewriting, saturation, superposition, symbolic-computation

zipperposition's People

Contributors

0function, abentkamp, blanchette, c-cube, fourchaux, fpottier, gh-salt, nartannt, petarvukmirovic, pratherconid, tpmkranz


zipperposition's Issues

Type syntax in "zf" files

Hello,

Would it be possible to get a more compact notation for explicit types when several variables have the same type? For example:

val foo : pi (a b : type)...

rather than

val foo : pi (a : type) (b : type)...

I could use type inference but I rely on another tool (not zipperposition), which uses "zf" files but does not support type inference.

Thanks a lot,

David.

--timeout option not always respected

On the following example, Zipperposition does not respect the --timeout 2 option and instead consumes all available memory:

data list a := nil | cons a (list a).

def[infix "++"] append : pi a. list a -> list a -> list a where
  forall l. append nil l = l;
  forall hd tl l. append (cons hd tl) l = cons hd (append tl l).

def foreach : pi a. list a -> (a -> prop) -> prop where
  forall p. foreach nil p = true;
  forall hd tl p. foreach (cons hd tl) p = (p hd && foreach tl p).

goal forall a (l1 l2 : list a) p. foreach (append l1 l2) p = (foreach l1 p && foreach l2 p).

Called with zipperposition --timeout 2 bug.zf.
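
A wall-clock deadline checked between saturation iterations is the usual way such a timeout is enforced. The sketch below is hypothetical (not Zipperposition's actual loop) and illustrates why the option can be missed: if a single iteration keeps allocating without returning to the loop, the check between iterations never fires.

```ocaml
(* Hypothetical sketch of a deadline check between given-clause
   iterations. If one iteration never returns (e.g. it keeps allocating
   during rewriting), the check between iterations never runs, and the
   timeout is missed while memory grows. *)
let saturate ~timeout step =
  let deadline = Sys.time () +. timeout in
  let rec loop n =
    if Sys.time () > deadline then `Timeout n
    else match step n with
      | `Done -> `Proof n
      | `Continue -> loop (n + 1)
  in
  loop 0

let () =
  (* a "search" that never finishes: only the deadline stops it *)
  match saturate ~timeout:0.05 (fun _ -> `Continue) with
  | `Timeout n -> assert (n > 0); print_endline "timeout hit"
  | `Proof _ -> assert false
```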

enforce def-as-rewrite

The option is there, but in Cnf there must be a decision between rewrite rules and axioms.

tracking: proof checking

Summarize progress in proof checking:

  • update proof generation with renamings (or not), including rewrite
    steps and demod steps
  • test this (without checking) on all TPTP; look for quick errors; merge into dev
  • [ ] write LLTerm.t and basic functions (substitution/typing in particular)
  • write simple CC based tableau (if possible, somehow incremental)
  • conversion statement → formula
  • make inference steps something like
    intros [x,y,z]; apply C1 [g(y),x+1]; apply C2[z,z]; tableau
  • skip some steps based on metadata (esa/arith) for now
  • final summary on how many steps skipped/ok/fail
  • debug on pure FO
  • store result of checking inside proof steps
  • direct β reduction of llterms
    ite/bool/β reduction rules in tableau
  • make rewriting under λ terms pass proof checking
  • proof checking for arith: FM / omega(?)/cooper
  • proof check CNF steps, including tseitin
  • proof check avatar (#64)
  • lazy equality exchange (case split on all equalities between arith terms?)
  • proof checking for full HO (#52)
  • turn checking on by default
  • fast proof checker (#53)
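
For the "write LLTerm.t and basic functions (substitution/typing)" and "direct β reduction" items, the core basic functions are De Bruijn shifting and substitution. A minimal untyped sketch (hypothetical; the real LLTerm would carry types):

```ocaml
(* De Bruijn λ-terms with shifting and substitution, enough to
   β-reduce (λ.t) u at the root. Untyped, illustrative only. *)
type t =
  | DB of int          (* bound variable, as a De Bruijn index *)
  | Lam of t
  | App of t * t

(* shift free indices >= c by d *)
let rec shift d c = function
  | DB i -> if i >= c then DB (i + d) else DB i
  | Lam t -> Lam (shift d (c + 1) t)
  | App (a, b) -> App (shift d c a, shift d c b)

(* substitute u for index j *)
let rec subst j u = function
  | DB i -> if i = j then u else DB i
  | Lam t -> Lam (subst (j + 1) (shift 1 0 u) t)
  | App (a, b) -> App (subst j u a, subst j u b)

(* one step of β at the root: (λ.t) u → t[0 := u] *)
let beta = function
  | App (Lam t, u) -> shift (-1) 0 (subst 0 (shift 1 0 u) t)
  | t -> t

let () =
  (* (λx. x) y → y *)
  assert (beta (App (Lam (DB 0), DB 5)) = DB 5);
  (* (λx. λy. x) z → λy. z, with z shifted under the binder *)
  assert (beta (App (Lam (Lam (DB 1)), DB 0)) = Lam (DB 1));
  print_endline "beta ok"
```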

How to access Logtk

Sorry if this is not a proper issue, but I am quite confused about using logtk as a library.

Since logtk now ships with zipperposition, should logtk still be accessible as a separate library? I can't seem to use it even after installing zipperposition via opam.

Cheers,
Darren

optimize debug

  • only allow modifying levels at beginning; once a level is queried it should be immutable (lazy field)
  • have the level checking be super fast and inline
  • split debugf into an [@inline] fun that just checks level, and the actual part.
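
The proposed split could look like the following sketch (names hypothetical): a cheap inlinable level check guarding a cold formatting path, with the level frozen once it has been queried.

```ocaml
(* Sketch: the level becomes immutable (frozen) the first time it is
   queried, the guard is a plain integer comparison marked [@inline],
   and formatting only happens behind the guard. *)
let level = ref 0
let frozen = ref false

let get_level () = frozen := true; !level

let set_level l =
  if !frozen then invalid_arg "debug level is frozen"
  else level := l

(* fast path: just an integer comparison *)
let[@inline] enabled l = l <= get_level ()

(* cold path: format only when the level check passes *)
let debugf l fmt =
  if enabled l then Format.eprintf fmt
  else Format.ifprintf Format.err_formatter fmt

let () =
  set_level 1;
  assert (enabled 1);
  assert (not (enabled 2));
  (* level is frozen after the first query *)
  (try set_level 2; assert false with Invalid_argument _ -> ());
  debugf 1 "visible at level %d@." 1;
  print_endline "debug ok"
```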

use DB indices in term rewriting/demod

do the De Bruijn switch for rewriting/demod

  • convert l=r into De Bruijn indices (easy)
  • write matching env:term db_env -> pattern:term -> term -> term db_env
    (where only DB variables can be bound in the pattern)
  • write a small, simple, lightweight index for De Bruijn terms
  • re-write demod/rewriting to use this representation (carry a db_env
    along, as a kind of stack)

pros: would make rewriting easier (no more renaming needed)
cons: hard
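
The matching function from the second bullet could be sketched as follows (greatly simplified and hypothetical: no binders in terms, so the db_env degenerates to an association list from indices to terms, and only DB variables of the pattern may be bound):

```ocaml
(* First-order matching where pattern variables are De Bruijn indices.
   Returns an extended environment on success, None on mismatch. *)
type term =
  | DB of int                  (* rule variable *)
  | App of string * term list  (* function application *)

let rec matching env pattern t =
  match pattern, t with
  | DB i, _ ->
    (match List.assoc_opt i env with
     | None -> Some ((i, t) :: env)           (* bind the variable *)
     | Some u -> if u = t then Some env else None)
  | App (f, ps), App (g, ts)
    when f = g && List.length ps = List.length ts ->
    List.fold_left2
      (fun acc p u ->
         match acc with
         | None -> None
         | Some env -> matching env p u)
      (Some env) ps ts
  | _ -> None

let () =
  (* match append(cons(x0, x1), x2) against a ground term *)
  let pat = App ("append", [App ("cons", [DB 0; DB 1]); DB 2]) in
  let t =
    App ("append",
         [App ("cons", [App ("a", []); App ("nil", [])]); App ("l", [])])
  in
  (match matching [] pat t with
   | Some env -> assert (List.length env = 3)
   | None -> assert false);
  print_endline "matched"
```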

opam installation

I tried to install zipperposition using opam and got the following error

=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=  🐫 
[ERROR] The compilation of qbf failed at "./configure --enable-quantor
        --disable-depqbf --docdir=/Users/giles/.opam/system/doc
        --disable-random".
∗  installed gen.0.2.4
∗  installed msat.1.1
∗  installed ppx_deriving.3.0
∗  installed sequence.0.5.5
∗  installed containers.0.13
∗  installed logtk.0.8

#=== ERROR while installing qbf.0.1 ===========================================#
# opam-version 1.2.1
# os           darwin
# command      ./configure --enable-quantor --disable-depqbf --docdir=/Users/giles/.opam/system/doc --disable-random
# path         /Users/giles/.opam/system/build/qbf.0.1
# compiler     system (4.02.2)
# exit-code    1
# env-file     /Users/giles/.opam/system/build/qbf.0.1/qbf-38252-8c0dd3.env
# stdout-file  /Users/giles/.opam/system/build/qbf.0.1/qbf-38252-8c0dd3.out
# stderr-file  /Users/giles/.opam/system/build/qbf.0.1/qbf-38252-8c0dd3.err
### stdout ###
# -n cc ...
# [...]
# -n cflags ...
#   -W -Wall -fPIC -static -DNDEBUG -O3
# -n makefile ...
#  done
# rm -f config.h; ./mkconfig > config.h
# gcc -W -Wall -fPIC -static -DNDEBUG -O3 -c picosat.c
# gcc -W -Wall -fPIC -static -DNDEBUG -O3 -c app.c
# gcc -W -Wall -fPIC -static -DNDEBUG -O3 -c main.c
# gcc -W -Wall -fPIC -static -DNDEBUG -O3 -o picosat main.o app.o picosat.o
### stderr ###
# picosat.c:3748:3: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
# [...]
# picosat.c:3754:7: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
#        while (++i < rcount && *CLS2ACT (resolved[i]) < min_cls_activity)
#        ^
# ld: library not found for -lcrt0.o
# collect2: error: ld returned 1 exit status
# make[2]: *** [picosat] Error 1
# make[1]: *** [all] Error 2
# make: *** [deps] Error 2
# E: Failure("Command 'make deps' terminated with error code 2")

fast proof checker

possible investigation leads:

  • use a SMT instead of tableau architecture (sidekick?)
  • imperative CC
  • use external prover (vendored?)
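
The "imperative CC" lead amounts, at its core, to a mutable union-find; a minimal sketch for ground constants (a full congruence closure would additionally propagate f(a) = f(b) from a = b):

```ocaml
(* Union-find over strings with path compression, backed by a Hashtbl.
   Deciding a = c after asserting a = b and b = c is one find per side. *)
let rec find parent x =
  match Hashtbl.find_opt parent x with
  | None -> x
  | Some y ->
    let r = find parent y in
    Hashtbl.replace parent x r;  (* path compression *)
    r

let union parent x y =
  let rx = find parent x and ry = find parent y in
  if rx <> ry then Hashtbl.replace parent rx ry

let () =
  let p = Hashtbl.create 16 in
  union p "a" "b";
  union p "b" "c";
  assert (find p "a" = find p "c");
  assert (find p "a" <> find p "d");
  print_endline "cc ok"
```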

investigate GEG problems

investigate why GEG rational problems are much harder
(ordering? missing optim?)
→ self paramod with, say, transitivity of <

Installation issue with #dev

When trying to install the #dev version using opam, the following error arises:

#=== ERROR while installing zipperposition.1.0 ================================#
# opam-version 1.2.2
# os           linux
# command      make install
# path         /home/guigui/.opam/4.04.0+flambda/build/zipperposition.1.0
# compiler     4.04.0+flambda
# exit-code    2
# env-file     /home/guigui/.opam/4.04.0+flambda/build/zipperposition.1.0/zipperposition-3692-13102d.env
# stdout-file  /home/guigui/.opam/4.04.0+flambda/build/zipperposition.1.0/zipperposition-3692-13102d.out
# stderr-file  /home/guigui/.opam/4.04.0+flambda/build/zipperposition.1.0/zipperposition-3692-13102d.err
### stdout ###
# ./setup.exe -install 
### stderr ###
# [...]
# W: Cannot find source header for module library in Parse_tptp logtk_parsers
# W: Cannot find source header for module library in Lex_tptp logtk_parsers
# W: Cannot find source header for module library in Lex_ho logtk_parsers
# W: Cannot find source header for module library in Parse_ho logtk_parsers
# W: Cannot find source header for module library in Parse_zf logtk_parsers
# W: Cannot find source header for module library in Lex_zf logtk_parsers
# ocamlfind: Package logtk is already installed
#  - (file /home/guigui/.opam/4.04.0+flambda/lib/logtk/META already exists)
# E: Failure("Command ''/home/guigui/.opam/4.04.0+flambda/bin/ocamlfind' install logtk src/core/META _build/src/parsers/logtk_parsers.cmx _build/src/parsers/parse_tptp.annot _build/src/parsers/parse_tptp.cmt _build/src/parsers/parse_tptp.cmti _build/src/parsers/lex_tptp.annot _build/src/parsers/lex_tptp.cmt _build/src/parsers/ast_tptp.annot _build/src/parsers/ast_tptp.cmt _build/src/parsers/ast_tptp.cmti _build/src/parsers/util_tptp.annot _build/src/parsers/util_tptp.cmt _build/src/parsers/util_tptp.cmti _build/src/parsers/ast_ho.annot _build/src/parsers/ast_ho.cmt _build/src/parsers/ast_ho.cmti _build/src/parsers/lex_ho.annot _build/src/parsers/lex_ho.cmt _build/src/parsers/parse_ho.annot _build/src/parsers/parse_ho.cmt _build/src/parsers/parse_ho.cmti _build/src/parsers/trace_tstp.annot _build/src/parsers/trace_tstp.cmt _build/src/parsers/trace_tstp.cmti _build/src/parsers/parse_zf.annot _build/src/parsers/parse_zf.cmt _build/src/parsers/parse_zf.cmti _build/src/parsers/lex_zf.annot _build/src/parsers/lex_zf.cmt _build/src/parsers/util_zf.annot _build/src/parsers/util_zf.cmt _build/src/parsers/util_zf.cmti _build/src/parsers/util_tip.annot _build/src/parsers/util_tip.cmt _build/src/parsers/util_tip.cmti _build/src/parsers/parsing_utils.annot _build/src/parsers/parsing_utils.cmt _build/src/parsers/callProver.annot _build/src/parsers/callProver.cmt _build/src/parsers/callProver.cmti _build/src/parsers/logtk_parsers.cmxs _build/src/parsers/logtk_parsers.a _build/src/parsers/logtk_parsers.cmxa _build/src/parsers/logtk_parsers.cma _build/src/parsers/logtk_parsers.cmi _build/src/parsers/logtk_parsers.cmt src/parsers/callProver.mli src/parsers/parsing_utils.ml src/parsers/util_tip.mli src/parsers/util_zf.mli src/parsers/trace_tstp.mli src/parsers/ast_ho.mli src/parsers/util_tptp.mli src/parsers/ast_tptp.mli _build/src/core/logtk.cmx _build/src/core/InnerTerm.annot _build/src/core/InnerTerm.cmt _build/src/core/InnerTerm.cmti _build/src/core/FOTerm.annot 
_build/src/core/FOTerm.cmt _build/src/core/FOTerm.cmti _build/src/core/Type.annot _build/src/core/Type.cmt _build/src/core/Type.cmti _build/src/core/Util.annot _build/src/core/Util.cmt _build/src/core/Util.cmti _build/src/core/STerm.annot _build/src/core/STerm.cmt _build/src/core/STerm.cmti _build/src/core/Interfaces.annot _build/src/core/Interfaces.cmt _build/src/core/DBEnv.annot _build/src/core/DBEnv.cmt _build/src/core/DBEnv.cmti _build/src/core/Position.annot _build/src/core/Position.cmt _build/src/core/Position.cmti _build/src/core/Var.annot _build/src/core/Var.cmt _build/src/core/Var.cmti _build/src/core/HVar.annot _build/src/core/HVar.cmt _build/src/core/HVar.cmti _build/src/core/Subst.annot _build/src/core/Subst.cmt _build/src/core/Subst.cmti _build/src/core/Unif.annot _build/src/core/Unif.cmt _build/src/core/Unif.cmti _build/src/core/Signature.annot _build/src/core/Signature.cmt _build/src/core/Signature.cmti _build/src/core/Scoped.annot _build/src/core/Scoped.cmt _build/src/core/Scoped.cmti _build/src/core/Unif_intf.annot _build/src/core/Unif_intf.cmt _build/src/core/TypeInference.annot _build/src/core/TypeInference.cmt _build/src/core/TypeInference.cmti _build/src/core/Options.annot _build/src/core/Options.cmt _build/src/core/Options.cmti _build/src/core/Comparison.annot _build/src/core/Comparison.cmt _build/src/core/Comparison.cmti _build/src/core/Precedence.annot _build/src/core/Precedence.cmt _build/src/core/Precedence.cmti _build/src/core/Builtin.annot _build/src/core/Builtin.cmt _build/src/core/Builtin.cmti _build/src/core/Ordering.annot _build/src/core/Ordering.cmt _build/src/core/Ordering.cmti _build/src/core/Skolem.annot _build/src/core/Skolem.cmt _build/src/core/Skolem.cmti _build/src/core/Cnf.annot _build/src/core/Cnf.cmt _build/src/core/Cnf.cmti _build/src/core/ID.annot _build/src/core/ID.cmt _build/src/core/ID.cmti _build/src/core/IDOrBuiltin.annot _build/src/core/IDOrBuiltin.cmt _build/src/core/IDOrBuiltin.cmti _build/src/core/SLiteral.annot 
_build/src/core/SLiteral.cmt _build/src/core/SLiteral.cmti _build/src/core/Index.annot _build/src/core/Index.cmt _build/src/core/Index.cmti _build/src/core/Index_intf.annot _build/src/core/Index_intf.cmt _build/src/core/Dtree.annot _build/src/core/Dtree.cmt _build/src/core/Dtree.cmti _build/src/core/Fingerprint.annot _build/src/core/Fingerprint.cmt _build/src/core/Fingerprint.cmti _build/src/core/NPDtree.annot _build/src/core/NPDtree.cmt _build/src/core/NPDtree.cmti _build/src/core/Binder.annot _build/src/core/Binder.cmt _build/src/core/Binder.cmti _build/src/core/Congruence.annot _build/src/core/Congruence.cmt _build/src/core/Congruence.cmti _build/src/core/FeatureVector.annot _build/src/core/FeatureVector.cmt _build/src/core/FeatureVector.cmti _build/src/core/FV_tree.annot _build/src/core/FV_tree.cmt _build/src/core/FV_tree.cmti _build/src/core/UntypedAST.annot _build/src/core/UntypedAST.cmt _build/src/core/UntypedAST.cmti _build/src/core/Ind_ty.annot _build/src/core/Ind_ty.cmt _build/src/core/Ind_ty.cmti _build/src/core/TypedSTerm.annot _build/src/core/TypedSTerm.cmt _build/src/core/TypedSTerm.cmti _build/src/core/Statement.annot _build/src/core/Statement.cmt _build/src/core/Statement.cmti _build/src/core/Flex_state.annot _build/src/core/Flex_state.cmt _build/src/core/Flex_state.cmti _build/src/core/Compute_prec.annot _build/src/core/Compute_prec.cmt _build/src/core/Compute_prec.cmti _build/src/core/Ordinal.annot _build/src/core/Ordinal.cmt _build/src/core/Ordinal.cmti _build/src/core/Rewrite_term.annot _build/src/core/Rewrite_term.cmt _build/src/core/Rewrite_term.cmti _build/src/core/Test_prop.annot _build/src/core/Test_prop.cmt _build/src/core/Test_prop.cmti _build/src/core/lib/Hashcons.annot _build/src/core/lib/Hashcons.cmt _build/src/core/lib/Hashcons.cmti _build/src/core/lib/ParseLocation.annot _build/src/core/lib/ParseLocation.cmt _build/src/core/lib/ParseLocation.cmti _build/src/core/lib/Multiset.annot _build/src/core/lib/Multiset.cmt 
_build/src/core/lib/Multiset.cmti _build/src/core/lib/LazyList.annot _build/src/core/lib/LazyList.cmt _build/src/core/lib/LazyList.cmti _build/src/core/lib/Hash.annot _build/src/core/lib/Hash.cmt _build/src/core/lib/Hash.cmti _build/src/core/lib/IArray.annot _build/src/core/lib/IArray.cmt _build/src/core/lib/IArray.cmti _build/src/core/lib/AllocCache.annot _build/src/core/lib/AllocCache.cmt _build/src/core/lib/AllocCache.cmti _build/src/core/lib/Multiset_intf.annot _build/src/core/lib/Multiset_intf.cmt _build/src/core/lib/signal.annot _build/src/core/lib/signal.cmt _build/src/core/lib/signal.cmti _build/src/core/lib/UnionFind.annot _build/src/core/lib/UnionFind.cmt _build/src/core/lib/UnionFind.cmti _build/src/core/logtk.cmxs _build/src/core/logtk.a _build/src/core/logtk.cmxa _build/src/core/logtk.cma _build/src/core/logtk.cmi _build/src/core/logtk.cmt _build/src/core/dlllogtk_stubs.so _build/src/core/liblogtk_stubs.a src/core/lib/UnionFind.mli src/core/lib/Multiset_intf.ml src/core/lib/AllocCache.mli src/core/lib/IArray.mli src/core/lib/Hash.mli src/core/lib/LazyList.mli src/core/lib/Multiset.mli src/core/lib/ParseLocation.mli src/core/lib/Hashcons.mli src/core/Test_prop.mli src/core/Rewrite_term.mli src/core/Ordinal.mli src/core/Compute_prec.mli src/core/Flex_state.mli src/core/Statement.mli src/core/TypedSTerm.mli src/core/Ind_ty.mli src/core/UntypedAST.mli src/core/FV_tree.mli src/core/FeatureVector.mli src/core/Congruence.mli src/core/Binder.mli src/core/NPDtree.mli src/core/Fingerprint.mli src/core/Dtree.mli src/core/Index_intf.ml src/core/Index.mli src/core/SLiteral.mli src/core/IDOrBuiltin.mli src/core/ID.mli src/core/Cnf.mli src/core/Skolem.mli src/core/Ordering.mli src/core/Builtin.mli src/core/Precedence.mli src/core/Comparison.mli src/core/Options.mli src/core/TypeInference.mli src/core/Unif_intf.ml src/core/Scoped.mli src/core/Signature.mli src/core/Unif.mli src/core/Subst.mli src/core/HVar.mli src/core/Var.mli src/core/Position.mli src/core/DBEnv.mli 
src/core/Interfaces.ml src/core/STerm.mli src/core/Util.mli src/core/Type.mli src/core/FOTerm.mli src/core/InnerTerm.mli' terminated with error code 2")
# make: *** [Makefile:19: install] Error 1

It seems that zipperposition is trying to install logtk even though it is already installed. Maybe uninstalling zipperposition doesn't remove logtk?

opam install failed

Hello, I'm running Ubuntu 16.04, opam 1.2.2. I tried installing with opam and got the following error:

opam install zipperposition
The following actions will be performed:
  ∗  install logtk          0.8               [required by zipperposition]
  ∗  install zipperposition 0.6.1
===== ∗  2 =====
Do you want to continue ? [Y/n] y

=-=- Gathering sources =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Processing  2/2: [logtk: http] [zipperposition: http]
[logtk.0.8] https://github.com/c-cube/logtk/archive/0.8.0.1.tar.gz downloaded
[zipperposition.0.6.1] https://github.com/c-cube/zipperposition/archive/0.6.1.tar.gz downloaded

=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[ERROR] The compilation of logtk failed at "make all".
Processing  1/2: [logtk: ocamlfind remove]
#=== ERROR while installing logtk.0.8 =========================================#
# opam-version 1.2.2
# os           linux
# command      make all
# path         /home/kris/.opam/system/build/logtk.0.8
# compiler     system (4.02.3)
# exit-code    2
# env-file     /home/kris/.opam/system/build/logtk.0.8/logtk-13522-ad8886.env
# stdout-file  /home/kris/.opam/system/build/logtk.0.8/logtk-13522-ad8886.out
# stderr-file  /home/kris/.opam/system/build/logtk.0.8/logtk-13522-ad8886.err
### stdout ###
# [...]
# Error: The implementation src/base/logtkSymbol.ml
#        does not match the interface src/base/logtkSymbol.cmi:
#        Values do not match:
#          val hash_fun : t -> Hash.state -> Hash.state
#        is not included in
#          val hash_fun : t -> int64 -> int64
#        File "src/base/logtkSymbol.ml", line 81, characters 4-12:
#          Actual declaration
# Command exited with code 2.
# Makefile:16: recipe for target 'all' failed
### stderr ###
# W: Cannot find source file matching module 'logtk_solving' in library logtk_solving
# W: Cannot find source file matching module 'logtk_meta' in library logtk_meta
# W: Cannot find source file matching module 'logtk_parsers' in library logtk_parsers
# E: Failure("Command ''/usr/bin/ocamlbuild' src/base/liblogtk_stubs.a src/base/dlllogtk_stubs.so src/base/logtk.cma src/base/logtk.cmxa src/base/logtk.a src/base/logtk.cmxs src/parsers/logtk_parsers.cma src/parsers/logtk_parsers.cmxa src/parsers/logtk_parsers.a src/parsers/logtk_parsers.cmxs src/meta/logtk_meta.cma src/meta/logtk_meta.cmxa src/meta/logtk_meta.a src/meta/logtk_meta.cmxs src/solving/logtk_solving.cma src/solving/logtk_solving.cmxa src/solving/logtk_solving.a src/solving/logtk_solving.cmxs -use-ocamlfind -menhir 'menhir --dump --explain' -tag debug' terminated with error code 10")
# make: *** [all] Error 1

portfolio inside zipperposition

(longer term)

  • make 'a flex_state.key json serializable
  • have a set of configurations as json files, embedded into the binary by build system magic
  • have a portfolio phase (which loops through its internal phases) with a hardcoded flow between these configurations ("on timeout"/"on gaveup" on the flow's edges)
  • make the portfolio layout configurable via json, and add --mode <name|file.json> to specify a list of left-to-right updates (the last one overrides whichever fields it sets).
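
In the simplest hardcoded case, a portfolio with "on timeout"/"on gaveup" edges degenerates to falling through an ordered list of configurations; a hypothetical sketch:

```ocaml
(* Try configurations in order; a Proof stops the portfolio, while
   GaveUp and Timeout are the edges to the next configuration. *)
type outcome = Proof | GaveUp | Timeout

let run_portfolio configs run =
  let rec go = function
    | [] -> None
    | c :: rest ->
      (match run c with
       | Proof -> Some c
       | GaveUp | Timeout -> go rest)
  in
  go configs

let () =
  (* hypothetical runner: only the "induction" configuration succeeds *)
  let run name = if name = "induction" then Proof else Timeout in
  assert (run_portfolio ["default"; "ho"; "induction"] run = Some "induction");
  assert (run_portfolio ["default"; "ho"] run = None);
  print_endline "portfolio ok"
```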

sometimes kbo is really bad

see some problems that are easy with --ord=rpo6:

  • pelletier_problems/pb47.zf (which is normally easy)
  • examples/sledgehammer/prob_e_1.p

even when KBO finds a proof, it's weirdly long and convoluted

Assertion failure in file Precedence.ml

On the following bug.zf file, zipperposition raises File "src/core/Precedence.ml", line 271, characters 4-10: Assertion failed:

data list a := nil | cons a (list a).

def[infix "∈"] mem : pi a. a -> list a -> prop where
  forall x. mem x nil = false;
  forall x hd tl. mem x (cons hd tl) = (x = hd || mem x tl).

def count : pi a. list a -> (a -> prop) -> int where
   forall p. count nil p = 0;
   forall hd tl p. count (cons hd tl) p = count tl p + (if p hd then 1 else 0).

goal forall a (l : list a) p (x : a). count l p != 0 => (exists result. mem result l && p result).

Note the useless universal variable x in the goal; if it is removed, the problem is accepted.

problem with weak-head normal form

on branch unif_framework

Command: ./zipperposition.exe --mode=ho-pragmatic --max-inferences=-1 --ho-unif-max-depth=3 --ho-max-elims=0 --ho-max-app-projections=0 --ho-max-var-imitations=0 --boolean-reasoning=cases-inf --ext-decompose=2 --ho-prim-enum=full --ho-prim-max=1 --ho-neg-cong-fun --tptp-def-as-rewrite --rewrite-before-cnf=true --stats --kbo-weight-fun=modarity -q "2|prefer-sos|conjecture-relative-var(1,l,f)" -q "2|const|conjecture-relative-var(1,l,f)" -q "1|defer-sos|conjecture-relative-var(1,l,f)" -q "1|prefer-fo|conjecture-relative-var(1,s,f)" -q "1|prefer-ground|orient-lmax(2,1,2,1,1)" -q "2|prefer-easy-ho|explore" -q "1|prefer-processed|fifo" -q "2|prefer-ho-steps|default" -q "1|prefer-processed|conjecture-relative-var(1,s,f)" --ho-elim-leibniz=1 --select=ho-selection --avatar --timeout=10 /media/petar/4AF7-98BC/TPTP-v7.2.0/Problems/SEU/SEU948^5.p --ho-unif-level=pragmatic-framework

Error: Did not disassemble properly: [(λ (Y0:a). Y0) (F39483 (λ (Y0:a). Y0) Y-1)] [Y-1]

tracking: datatypes

  • inference for acyclicity (not just simplification):
    given C ∨ s = t, look for σ such that sσ = tσ is absurd by acyclicity.
    Then infer from that.
    E.g. s (f x) = s (s (f a)) would give σ={x→a}
    (do anti-unification with cstors only, then try to unify
    cstor-prefixed subterms on one side with the root on the other side)
  • remove some specialized rules (positive injectivity) and instead,
    generate rewrite rules during preprocessing
  • hierarchic superposition for datatypes (with defined functions being part
    of the background)
    • need corresponding TKBO with 2 levels
      (just replace KBO with it anyway, and build weight fun
      from constant classification)
    • with TKBO implemented, remove the code that forces rpo6 to be
      used when induction is enabled, as well as constraint disabling
    • narrowing with defined symbols would ± correspond to E-unification on pure
      background literals
    • add purification inference (read carefully!)
      → do we want weak abstraction? would need 2 kinds of vars then
    • add "case split" rule for t != u where they are of a datatype.
      use a table for caching split for a given ground t.
      split looks like t = cstor1(…) | … | t=cstor_k(…) where
      each is a list of fresh parameters (i.e. possibly inductive
      skolems).
      → avatar should fire on that!
      Do not do case split on α != β where both are parameters
      of a recursive datatype (always possible to pick distinct
      values). For non-recursive datatypes we need to do it.
      → check on examples/data/unit_… problems
    • need a theory solver (msat + small SMT?) that deals with parameters
      → parameters are the way of dealing with exhaustiveness
  • look into "superposition for fixed domains" more seriously
    (ask Weidenbach for more details?)
  • rule similar to fool_param for datatypes:
    C[t] where t:nat (strict subterm) is not a cstor term nor a variable
    would become C[S x] ∨ t ≠ S x and C[0] ∨ t ≠ 0
    • should be terminating (reduces the number of such strict subterms)
      but careful that with reduction you might find the same clause again,
      this must be an inference and not a simplification
    • is sound, and might be decreasing (check!).
      It does seem to work for fool.
    • enables more reductions…
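
The acyclicity inference in the first bullet rests on a simple check: s = t is absurd when one side occurs strictly below constructors on the other (e.g. n = s (s n) for Peano naturals). A hypothetical first-order sketch, with an illustrative constructor table:

```ocaml
(* Absurdity by acyclicity: a term cannot equal a constructor term
   that contains it as a strict subterm under constructors only. *)
type term = Var of string | App of string * term list

(* illustrative: a real implementation queries the datatype signature *)
let is_cstor f = List.mem f ["z"; "s"; "nil"; "cons"]

let rec occurs_under_cstors s t =
  match t with
  | App (f, args) when is_cstor f ->
    List.exists (fun u -> u = s || occurs_under_cstors s u) args
  | _ -> false

let absurd_by_acyclicity s t =
  occurs_under_cstors s t || occurs_under_cstors t s

let () =
  let n = Var "n" in
  (* n = s (s n) is absurd *)
  assert (absurd_by_acyclicity n (App ("s", [App ("s", [n])])));
  (* n = f n is fine: f is not a constructor *)
  assert (not (absurd_by_acyclicity n (App ("f", [n]))));
  print_endline "acyclicity ok"
```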

invalid_argument: Term.head

This error occurs when calling Zipperposition on the following file:

# An encoding of separation logic

val heap : type -> type.
val loc : type.

val emp : pi a. heap a.
val pointsto : pi a. loc -> a -> heap a.
val[AC] heap_merge : pi a. heap a -> heap a -> heap a.

rewrite forall a h. heap_merge h (emp a) = h.
rewrite forall a h. heap_merge (emp a) h = h.

# assoc h l v means that h contains at least once pointsto l v
def assoc : pi a. heap a -> loc -> a -> prop where
  forall a l v. ~ assoc (emp a) l v;
  forall a l1 l2 (v1 v2 : a).
    assoc (pointsto l1 v1) l2 v2 <=> l1 = l2 && v1 = v2;
  forall a h1 h2 l (v : a).
    assoc (heap_merge h1 h2) l v <=> (assoc h1 l v || assoc h2 l v).

def disjoint : pi a. heap a -> heap a -> prop where
  forall a (h1 h2 : heap a).
    disjoint h1 h2 <=> (forall l v. ~(assoc h1 l v && assoc h2 l v)).

# Probably useless but does not hurt
rewrite forall a h. disjoint h (emp a).
rewrite forall a h. disjoint (emp a) h.

def functional_heap : pi a. heap a -> prop where
  forall a (h : heap a).
    functional_heap h <=>
      (forall l v1 v2. assoc h l v1 && assoc h l v2 => v1 = v2).

rewrite forall a. functional_heap (emp a).

assert forall a (h1 h2 : heap a). functional_heap (heap_merge h1 h2) => functional_heap h1 && functional_heap h2.

val default_value : pi a. a.

data list a := nil | cons a (list a).

def head : pi a. list a -> a where
  head nil = default_value;
  forall hd tl. head (cons hd tl) = hd.

def tail : pi a. list a -> list a where
  tail nil = nil;
  forall hd tl. tail (cons hd tl) = tl.

def foreachp : pi a b. list a -> (a -> heap b -> prop) -> heap b -> prop where
  forall a b (l : list a) (p : a -> heap b -> prop) h.
     foreachp l p h =
       (if l = nil then
          h = emp
        else
          (exists h1 h2.
            h = heap_merge h1 h2 &&
            disjoint h1 h2 &&
            p (head l) h1 &&
            foreachp (tail l) p h2)).

tracking: induction

  • custom induction schema, with a toplevel command

    • structural induction on datatypes:
      inductive (n:nat) := { (zero, {}), (succ(n'), {n'}) }

    • induction on sets
      inductive (S:set a) := { (S=empty, {}), (∃x S'. (S = S' ∪ {x} ∧ x∉S'), {S'}) }
      corresponding to axioms:
      forall S. S = empty xor ∃x S'. (S = S' ∪ {x} ∧ x∉S')
      [ P empty && (forall x S S'. x ∈ S ∧ S = S' ∪ {x} ∧ x∉S' ∧ P(S') => P(S)) ] => ∀S. P S.
      NOTE: need to restrict its application, not needed for most use cases
      and will only slow things down
      → only when goal involves recursive function on sets?

    • inductive relations? if R transitive:
      inductive (R(x,y)) := { (x=y, {}), (x ≠ y, {∃z. R(x,y) ∧ R(y,z)}) }

  • discussion

    • it all becomes sound (assuming hidden induction principle): each cut is really the introduction of an instance of the induction principle
    • no nested induction anymore
    • allows to use smallcheck before trying a lemma
    • allows to refine a lemma by generalizing (Aubin 77) some specific
      subterms and some specific occurrences of variables, based on
      their position below defined symbols (in particular, for accumulator terms)
    • similar subgoals that would be distinct before (same goal, different
      skolems) are now the same lemma, thanks to the α-equiv checking
  • from a clause C with inductive skolems {a,b,c} we can generalize
    on these skolems without worrying,
    and try to prove ∀xyz. ¬C[x/a,y/b,z/c] (but only do induction
    on variables that occur in active positions)

  • remove trail literals for induction (and remove clause context,
    might have the higher-order induction principle for proof
    production though)

  • generate fresh coverset every time; new inductive skolem constants
    really become branch dependent
    (no need to prevent branches from interfering ever again!)

  • call small_check on candidate inductive formulas;
    try simple generalizations backed by small_check before starting.
    → will be useful after purification (approximation leads to too
    many inferences, some of which yield non-inductively true constraints,
    so we need to check constraints before solving them by induction)

  • only do induction on active positions
    → check that it fixes previous regressions on list10_easy.zf, nat2.zf…

  • might be a bug in candidate lemmas regarding α-equiv checking
    (see on nat2.zf, should have only one lemma?)

  • do induction on simplified formula (e.g. for HO functions)

  • notion of active position should also work for
    defined propositions (look into clause rules)
    + [x] factor the computation of positions out of rewrite_term
    and abstract it so it takes a list of LHS
    + [x] move this into Statement? propositions defined by clause rules
    with atom t as LHS should be as "defined" as term RW rules
    + [x] small check truly needs to use clause rules, too

  • check if two variables are interchangeable (i.e. {X→Y,Y→X}
    gives same form). In this case don't do induction on the second one.
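
The interchangeability test can be read as: applying the swap {X→Y, Y→X} yields the same clause. A hypothetical, much-simplified sketch on single equations, compared up to the symmetry of =:

```ocaml
(* Two variables are interchangeable in an equation if swapping them
   gives back the same equation (up to exchanging the two sides). *)
type term = Var of string | App of string * term list

let rec swap x y = function
  | Var v when v = x -> Var y
  | Var v when v = y -> Var x
  | Var v -> Var v
  | App (f, args) -> App (f, List.map (swap x y) args)

(* equations are unordered pairs *)
let eq_equation (l1, r1) (l2, r2) =
  (l1 = l2 && r1 = r2) || (l1 = r2 && r1 = l2)

let interchangeable x y (l, r) =
  eq_equation (swap x y l, swap x y r) (l, r)

let () =
  let plus a b = App ("+", [a; b]) in
  (* x + y = y + x: invariant under the swap, so one induction suffices *)
  assert (interchangeable "x" "y"
            (plus (Var "x") (Var "y"), plus (Var "y") (Var "x")));
  (* x + y = x: not invariant *)
  assert (not (interchangeable "x" "y"
                 (plus (Var "x") (Var "y"), Var "x")));
  print_endline "swap ok"
```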

  • do induction on multiple variables, iff they occur in active
    positions in the same subterm.
    + just combine the coversets for each variable
    + same as splitting, do union-find of x,y if there is an active subterm
    f …x…y… where both x and y are active
    + should subsume/replace the individual inductions (which are bound
    to fail since the subterms will not reduce because of one of
    their arg)
    + goes with generalization? If a non-var occurs in active position,
    it must be generalized? Maybe in 2 successive steps…
    + [ ] example: should prove transitivity of ≤
    ./zipperposition.native --print-lemmas --stats -o none -t 30 --dot /tmp/truc.dot examples/ind/nat21.zf
    → might be sufficient for many cases where we used to use nested ind.

  • generalize a bit notion of ind_cst (to ground terms made of skolems)

  • functional induction:

    • based on assumption that recursive functions terminate
      → ask for [wf] attribute?
      → maybe prove termination myself?
      → require that such functions are total! (not just warning)
    • build functional induction scheme(s) based on recursive def, might
      prove very useful for some problems.
    • applies for goals of the form P[f(x…)] with only variables in
      active/accumulator positions of f. In inductive hypothesis,
      use P[f(skolems…)]? or is it automatic with multi-var-induction?
  • use subsumption on candidate lemmas:

    • if a proved lemma subsumes the current candidate, then skip candidate
    • if an unknown lemma subsumes the current candidate, delay it; wait until the subsuming one is proved/disproved
    • OR,
      when a candidate lemma is subsumed by some active lemma,
      lock it and store it somewhere, waiting for one of the following
      conditions to happen:
      1. when a lemma is proved, delete candidate lemmas it subsumes
      2. when a lemma is disproved, unlock candidate lemmas that it
        subsumes and activate them (unless they are locked by other lemmas)
        → might even be in Avatar itself, as a generic mechanism!
    • when a lemma is proved, find other lemmas that are subsumed by it?
      or this might be automatic, but asserting [L2] if L1 proved and L1⇒L2
      might still help with the many clauses derived from L2
    • might need a signal on_{dis_,}proved_lemma for noticing
      that a lemma is now (dis,)proved by the SAT solver.
      → Hook into it to unlock/remove candidates subsumed by the lemma.
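The lock/unlock bookkeeping above could be sketched as follows. Names and types are hypothetical (`subsumes` is an assumed predicate, candidates are plain strings); this is not Zipperposition's Avatar code.

```ocaml
(* Toy sketch of the proposed mechanism: a candidate lemma subsumed by an
   unproven lemma is Locked by that lemma's id; proving the subsumer
   deletes the candidate, disproving it removes one lock. *)
type status =
  | Active
  | Locked of int list  (* ids of subsuming, still-unproven lemmas *)

(* when a lemma is proved, delete candidates it subsumes *)
let on_proved ~subsumes candidates proved =
  List.filter (fun c -> not (subsumes proved c)) candidates

(* when a lemma is disproved, drop it from every lock set;
   a candidate with an empty lock set becomes Active again *)
let on_disproved locks disproved =
  List.map
    (fun (c, st) ->
       match st with
       | Active -> (c, Active)
       | Locked ids ->
         let ids' = List.filter (fun id -> id <> disproved) ids in
         (c, if ids' = [] then Active else Locked ids'))
    locks
```

This matches conditions 1 and 2 above, and would naturally hang off the proposed `on_{dis_,}proved_lemma` signals.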
  • some clause constraints (e.g. a+s b ≠ s (a+b)) might deserve
    their own induction, because no other rule (not even E-unification)
    will be able to solve these.
    → Again, need very good and fast small_check

  • when generalizing f X a != f a X, which currently yields
    the lemma f X Y = f Y X, instead we could "skolemize" X
    with a HO variable, obtaining f (H Y) Y = f Y (H Y).
    see examples/ind/nat6.zf for a case where it can help
    (we need H because there already is a skolem out there).

  • strong induction?

    • use explicit <| subterm relation for the hypothesis
    • use special transitive rel saturation rules for <|
      (and nonstrict version ≤|)
    • also use custom rules for subterm (using constructors)
      including simplification of t <| cstor^N(t)
      and corresponding unification inference rule
      → acyclicity just becomes the axiom ¬ (x <| x) combined with above rules
    • no need for coverset anymore, just introduce skolems, but need
      (depth-limited) case-split on arbitrary constants/ground terms.
      → decouples case split and induction
    • when generalizing ¬P[a,b] into ∀xy. P[x,y], when doing induction
      on x, might instead prove: ∀xy. y ≤| b ⇒ P[x,y]? this way we
      can re-use hypothesis on y (and maybe x)?
    • multi-variable induction requires <| to work on tuples or multisets
      on both sides…
    • ./zipperposition.native --print-lemmas --stats -o none -t 30 --dot /tmp/truc.dot examples/ind/nat21.zf
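A minimal sketch of the proposed simplification rules for the strict subterm relation `<|`, on a toy term type (not Zipperposition's real `Term` module): `t <| t` simplifies to false (acyclicity), and `t <| cstor^N(t)` simplifies to true.

```ocaml
(* Toy term type for illustration *)
type term = Var of string | App of string * term list

(* decide t <| u syntactically: t is a proper subterm of u *)
let rec subterm t u =
  match u with
  | Var _ -> false
  | App (_, args) -> List.exists (fun a -> a = t || subterm t a) args

(* simplify [t <| u]: Some true / Some false when decidable syntactically,
   None when it must be left to the <| saturation rules *)
let simplify_lt t u =
  if t = u then Some false            (* acyclicity: ¬(x <| x) *)
  else if subterm t u then Some true  (* t <| cstor^N(t) *)
  else None
```

The `None` case is where the transitive-relation saturation rules for `<|` (and the non-strict `≤|`) would take over.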
  • lemma guessing in induction:

    • simple generalization of a variable with ≥ 2 occurrences in active pos,
      and ≥ 1 passive occurrences
    • generalization of subterms that are not variables nor constructors,
      occurring at least twice in active positions.
    • track variable dependencies for generalized subterms, to avoid
      losing the (often crucial) relation between them
      and other terms containing the same variables.
      → need to also prove t = t' for every generalized term t
      and other term t' that shared ≥1 var with t
    • purification of composite terms occurring in passive position
    • anti-unification in sub-goal solving
      (e.g. append a t1 != append a t2, where a is a skolem
      → try to prove t1!=t2 instead,
      if append is found to be left-injective by testing or lemma)
    • paramodulation of sub-goal with inductive hypothesis (try on list7.zf)?
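The second bullet above, generalizing subterms that are neither variables nor constructors and occur at least twice in active positions, might be sketched as follows (toy term type; `is_cstor` is a hardcoded assumption standing in for a real signature lookup):

```ocaml
(* Toy term type for illustration *)
type term = Var of string | App of string * term list

(* assumption: a fixed constructor table instead of a real signature *)
let is_cstor f = List.mem f ["zero"; "s"; "nil"; "cons"]

(* candidates for generalization: non-variable, non-constructor subterms
   occurring >= 2 times among the active-position subterms *)
let generalization_candidates (active_subterms : term list) : term list =
  let count t =
    List.length (List.filter (fun u -> u = t) active_subterms)
  in
  active_subterms
  |> List.filter (fun t ->
       match t with
       | Var _ -> false
       | App (f, _) -> not (is_cstor f) && count t >= 2)
  |> List.sort_uniq compare
```

Each candidate would then be replaced by a fresh variable, subject to the variable-dependency tracking and small_check filtering discussed above.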
  • make real inductive benchmarks
    (ok using tip-benchmarks)

    • add lemma statement to tip-parser
    • parse this in Zipperposition
    • use quickspec to generate lemmas on Isaplanner problems
    • run benchmarks (without induction, with induction, with quickspec lemmas)
  • lemma by generalization (if t occurs on both sides of ineq?)

    • see what isaplanner does
    • use "Aubin" paper (generalize exactly the subterms at reductive position),
      but this requires to have tighter control over rules/definitions first
  • generate all lemmas up to given depth

    • need powerful simplifications from the beginning (for smallchecking)

Otter loop?

An alternative to the DISCOUNT loop; could be interesting to explore.

  • for forward demod/simp_reflect (all rewrite rules are used, even passive ones)
  • maybe for forward subsumption, too
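The difference between the two loops can be sketched as follows. This is illustrative only: real forward simplification would demodulate against the clause sets, not merely test membership, and `clause` is a stand-in type.

```ocaml
(* Toy sketch: in a DISCOUNT loop only *active* clauses are used for
   forward simplification; in an Otter loop the *passive* set is used too,
   so all rewrite rules apply even before their clause is selected. *)
type clause = string

(* stand-in for demodulation / simp_reflect against [set] *)
let simplify_with (set : clause list) (c : clause) : clause =
  if List.mem c set then "$true" else c

let forward_simplify ~otter ~active ~passive c =
  let set = if otter then active @ passive else active in
  simplify_with set c
```

The trade-off is the usual one: the Otter loop simplifies more eagerly but pays for indexing and maintaining the passive set.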

Clause.proof_depth counts ESA-steps as 0

Clause.proof_depth / Proof.Step.inferences_perfomed considers only Inference or Simplification steps, but not Esa steps (e.g. Avatar). This might confuse the heuristics that use it.
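A sketch of the depth computation as described, with a hypothetical step type (not the actual `Proof.Step` API): Inference and Simplification steps add 1, Esa steps add 0.

```ocaml
(* Hypothetical proof-step type for illustration *)
type step =
  | Axiom
  | Inference of step list
  | Simplification of step list
  | Esa of step list  (* e.g. Avatar splitting *)

let rec proof_depth = function
  | Axiom -> 0
  | Inference ps | Simplification ps ->
    1 + List.fold_left (fun m p -> max m (proof_depth p)) 0 ps
  | Esa ps ->
    (* ESA steps count as 0: depth does not increase here,
       which is what can mislead depth-based heuristics *)
    List.fold_left (fun m p -> max m (proof_depth p)) 0 ps
```

So a clause derived through many Avatar splits can look "shallow" to a heuristic even though its derivation is long.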

error in Term.Pos: invalid position `1.ε` in term `(fun_2 X3)`

I get the error message in the title of this issue by calling zipperposition bug.zf on the following bug.zf file:

data list a := nil | cons a (list a).

def[infix "∈"] mem : pi a. a -> list a -> prop where
  forall x. mem x nil = false;
  forall x hd tl. mem x (cons hd tl) = (x = hd || mem x tl).

def count : pi a. list a -> (a -> prop) -> int where
   forall p. count nil p = 0;
   forall hd tl p. count (cons hd tl) p = count tl p + (if p hd then 1 else 0).

def remove : pi a. a -> list a -> list a where
  forall x. remove x nil = nil;
  forall x hd tl. remove x (cons hd tl) =
          (if hd = x then tl else cons hd (remove x tl)).

goal forall a (l : list a) p (x : a). mem x l = true => count (remove x l) p = count l p - (if p x then 1 else 0).

zipperposition 0.2 fails to compile on 4.01

A find function has been added to the Set.S signature in OCaml 4.01:

# Error: The implementation src/ptset.ml
#        does not match the interface src/ptset.cmi:
#        ...
#        In module Big:
#        The field `find' is required but not provided
# Command exited with code 2.

Found as part of OCamlPro/opam-repository#1029
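A sketch of the kind of fix 4.01 requires: any module claiming to match `Set.S` must now also provide `find : elt -> t -> elt`, which for a set can be derived from `mem` (the element found is the query itself). The mini-signature and functor below are illustrative, not the real `Ptset`.

```ocaml
(* Illustrative slice of Set.S, including the member added in OCaml 4.01 *)
module type MINI_SET = sig
  type t
  type elt = int
  val mem : elt -> t -> bool
  val find : elt -> t -> elt  (* new in 4.01 *)
end

(* derive [find] from an existing [mem] *)
module Fix (S : sig type t val mem : int -> t -> bool end)
  : MINI_SET with type t = S.t = struct
  type t = S.t
  type elt = int
  let mem = S.mem
  let find x s = if S.mem x s then x else raise Not_found
end
```

For `ptset.ml` the same one-liner (in both `Big` and the other submodules flagged by the compiler) should satisfy the new interface.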

proof checking for arith ℚ

./zipperposition.native --check -t 30 -o none --dot-llproof /tmp/truc.dot examples/GEG022=1_rat.p --debug.llproof 5

Dependency on containers > 1.0

The master branch depends on containers > 1.0, which is not released. Does this mean that zipperposition actually needs the dev version of containers, or could the dependency also admit containers 1.0?

Proof checking: Issue with lambda-expressions and sidekick

Sidekick currently does not support lambda-expressions or other binders.

It seems to me that adding support for them to sidekick is the only clean way to implement the proof checker.

The example that I was looking at when I decided that sidekick needs to be changed was this:

./zipperposition.exe  --timeout 60   --tptp-def-as-rewrite --rewrite-before-cnf=true    --boolean-reasoning=cases-simpl --ho-prune-arg=old-prune   --ho-neg-cong-fun --ho-neg-ext=true --simultaneous-sup=false --ho-prim-enum=none   -q "1|prefer-easy-ho|default"   -q "1|prefer-ho-steps|conjecture-relative-var(1.03,s,f)"   -q "1|prefer-sos|default"   -q "5|const|conjecture-relative-var(1.01,l,f)"   -q "1|prefer-processed|fifo"   -q "1|prefer-non-goals|conjecture-relative-var(1.05,l,f)"   -q "1|prefer-fo|conjecture-relative-var(1.1,s,f)"   --select=e-selection5 --recognize-injectivity=true --ho-choice-inst=true --ho-selection-restriction=none  --check --dot-llproof test.dot ../TPTP-v7.2.0/Problems/CSR/CSR132^1.p --debug.llproof 5

on the esa_proofs_sidekick branch. Unfortunately I don't remember the details of what is going on there, but this would be a good starting point if you want to look into this.

Rename `Unif.FO`, `Subst.FO`, etc

Our idea at the workshop was to rename Term into HTerm and Type into HType.
Then the FO submodules can also be renamed into HTerm.

This should be done after the big merge!

type inference for CSR

$TPTP/Axioms/CSR003+0.ax contains mixed integers/symbols and I think we're a bit too strict in assuming "integer numeral => type is $int" there. It makes some problems fail with an error (e.g. $TPTP/Problems/CSR/CSR099+1.p)

improve TSTP output

cc @rafoo

  • topological order
  • indicate substitutions in inferences
  • structured rules (as terms) to indicate variables in substitutions
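Topological ordering of proof steps (first bullet: parents printed before the steps that use them) can be sketched with a simple depth-first traversal. The step representation is hypothetical, not Zipperposition's `Proof` module.

```ocaml
(* Hypothetical proof-step representation: an id plus parent ids *)
type proof_step = { id : int; parents : int list }

(* return step ids in topological order: every parent before its children *)
let topo_order (steps : proof_step list) : int list =
  let tbl = Hashtbl.create 16 in
  List.iter (fun s -> Hashtbl.replace tbl s.id s) steps;
  let seen = Hashtbl.create 16 in
  let out = ref [] in
  let rec visit id =
    if not (Hashtbl.mem seen id) then begin
      Hashtbl.add seen id ();
      (match Hashtbl.find_opt tbl id with
       | Some s -> List.iter visit s.parents  (* parents first *)
       | None -> ());
      out := id :: !out
    end
  in
  List.iter (fun s -> visit s.id) steps;
  List.rev !out
```

Printing TSTP steps in this order guarantees that every `inference(...)` only refers to formulas already emitted.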

HOT ISSUE: ST.AppBuiltin

@c-cube: There is an extremely strange behavior with ST.AppBuiltin:

When I call ST.app_builtin ~ty:* b l with some Builtin b and list l, where b is not Builtin.Not, I get back a term that is severely simplified! For example, all (dis)equalities are replaced by (dis)equivalences, and all TRUEs and FALSEs are removed if they occur under Or or And.

Note, however, that this should not be done if & or | have only one argument.

Strangely, when I inspect the app_builtin function itself, it performs NO simplifications like that.
Is there any magic behavior I am not aware of?

This is a very hot issue, since I need to fix it before submitting the prover and I have only just found it.

Thanks.
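A plausible explanation, offered as an assumption rather than a statement about the actual code: `ST.app_builtin` may go through a "smart constructor" that simplifies eagerly at construction time, so pattern-matching on the resulting term (or reading the plain constructor) shows no trace of the simplification. A toy illustration of the pattern:

```ocaml
(* Toy formula type for illustration *)
type form = True | False | Or of form list | Atom of string

(* raw constructor: builds exactly what you asked for *)
let or_raw l = Or l

(* smart constructor: simplifies eagerly, like the behavior observed
   with ST.app_builtin — drops False arguments, collapses on True,
   and (problematically, per the note above) unwraps singletons *)
let or_smart l =
  let l = List.filter (fun f -> f <> False) l in
  if List.mem True l then True
  else match l with
    | [f] -> f
    | l -> Or l
```

If this is what happens, the fix would be either a raw construction path or guarding the singleton case in the smart constructor.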

tracking: rewriting

  • conditional rewriting
    • parse rewrite forall vars. ∧_i a_i => l = r

    • same for clausal rewriting

    • handling by inference rule that unifies (rewrite & narrowing are the same)

      ∧_i a_i => l = r        C[a]      lσ = a
      -----------------------------------------
             (∧_i a_iσ) => C[rσ]
      

      which is a form of superposition that is artificially restricted to
      rewriting l first

    • for each rule, compile fast pre-checks (e.g.
      matched term must have symbol f at arg position i) and use
      these before attempting call to matching

    • in proof, put set of rewrite rules used in simplification steps,
      at least in full (non-compressed) version
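The pre-check compilation above ("matched term must have symbol f at arg position i") could be sketched like this, on a toy term type; the real implementation would work on Zipperposition's hash-consed terms and precompute these checks per rule.

```ocaml
(* Toy term type for illustration *)
type term = Var of string | App of string * term list

let head = function App (f, _) -> Some f | Var _ -> None

(* compile a cheap pre-check from a rule's LHS: required head symbol,
   plus required head symbols at argument positions *)
let compile_precheck lhs =
  match lhs with
  | Var _ -> None  (* variable LHS matches anything: no useful pre-check *)
  | App (f, args) ->
    let arg_checks =
      List.concat
        (List.mapi
           (fun i a ->
              match head a with Some g -> [(i, g)] | None -> [])
           args)
    in
    Some (f, arg_checks)

(* run the compiled pre-check before attempting full matching *)
let precheck pc t =
  match pc, t with
  | None, _ -> true
  | Some _, Var _ -> false
  | Some (f, checks), App (g, args) ->
    f = g
    && List.for_all
         (fun (i, h) ->
            i < List.length args && head (List.nth args i) = Some h)
         checks
```

A failed pre-check is a constant-time rejection that avoids the (much more expensive) matching call.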
