
DL2: Training and Querying Neural Networks with Logic


DL2 is a framework that allows training neural networks with logical constraints over numerical values in the network (e.g. inputs, outputs, weights) and to query networks for inputs fulfilling a logical formula. An example query is shown below. For more details read training/README.md and querying/README.md.

This implementation of DL2 can be used as a library compatible with PyTorch and can be used to reproduce the results of the DL2 research paper.
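
At its core, DL2 compiles a logical constraint into a non-negative, differentiable loss that is zero exactly when the constraint holds, and adds that loss to the usual training objective. The sketch below illustrates this idea in plain PyTorch; it is not the dl2lib API, and both the constraint (the true-class probability should stay above 0.5 on a randomly perturbed input) and the translation rule for a <= b are simplified, illustrative assumptions.

import torch
import torch.nn.functional as F

def leq_loss(a, b):
    # DL2-style translation of the constraint a <= b:
    # non-negative, differentiable, and zero exactly when a <= b holds
    return torch.clamp(a - b, min=0.0)

def constraint_loss(model, x, y, eps=0.1):
    # illustrative constraint: on a perturbed copy of the input, the
    # probability of the true class should stay at least 0.5
    x_pert = x + eps * torch.randn_like(x)
    probs = F.softmax(model(x_pert), dim=1)
    p_true = probs.gather(1, y.unsqueeze(1)).squeeze(1)
    return leq_loss(0.5, p_true).mean()

# combined objective (the weight 0.2 is an arbitrary example):
# loss = F.cross_entropy(model(x), y) + 0.2 * constraint_loss(model, x, y)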

Example query

FIND i[100]
WHERE i[:] in [-1, 1],
      class(NN1(GEN(i)), 1),
      class(NN2(GEN(i)), 2),
RETURN GEN(i)

This example query spans three networks: a generator GEN and two classifiers NN1 and NN2. It looks for a noise input (a 100-dimensional vector with all values between -1 and 1) to the generator such that the generated input is classified as class 1 by NN1 and as class 2 by NN2. Finally, the generated input is returned.
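
Conceptually, such a query is answered by gradient-based search over the free variable i: the constraints are translated into a differentiable loss over i, which is then minimized until the constraints are satisfied. The snippet below is a rough, plain-PyTorch illustration of that search for the query above, not dl2lib's actual solver; GEN, NN1 and NN2 stand for arbitrary pretrained PyTorch modules, and cross-entropy is used as a simple stand-in for the class(...) constraints.

import torch
import torch.nn.functional as F

def find_noise(GEN, NN1, NN2, steps=500, lr=0.05):
    # free variable: a 100-dimensional noise vector, kept inside the box [-1, 1]
    i = torch.zeros(1, 100, requires_grad=True)
    opt = torch.optim.Adam([i], lr=lr)
    target1 = torch.tensor([1])  # class(NN1(GEN(i)), 1)
    target2 = torch.tensor([2])  # class(NN2(GEN(i)), 2)
    for _ in range(steps):
        x = GEN(torch.clamp(i, -1.0, 1.0))
        # decreases as NN1 assigns class 1 and NN2 assigns class 2
        loss = F.cross_entropy(NN1(x), target1) + F.cross_entropy(NN2(x), target2)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # RETURN GEN(i)
    return GEN(torch.clamp(i, -1.0, 1.0)).detach()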

Structure

.
├── README.md              - this file
├── dl2lib                 - DL2 Library
├── training               - the experiments for training networks
│   ├── README.md          - more details on training networks with DL2
│   ├── semisupservised
│   │   ├── main.py        - script to run the semi-supervised experiments
│   │   └── run.sh         - replicates the experiments from the paper
│   ├── supervised
│   │   ├── main.py        - script to run the supervised experiments
│   │   ├── results.py     - creates the tables and plots for the paper
│   │   └── run.sh         - replicates the experiments from the paper
│   └── unsupervised
│       ├── setup.sh       - installs prerequisite libraries
│       ├── run.sh         - replicates the experiments from the paper
│       └── train_DL2.py   - script to run the unsupervised experiments
├── querying               - the experiments for querying networks
│   ├── README.md          - more details on querying networks with DL2
│   ├── run.py             - runs the querying experiments from the paper
│   ├── run_additional.py  - runs the additional querying experiments from the appendix
│   └── train_models.sh    - downloads and trains the models required for the queries
└── requirements.txt       - pip requirements

Some files omitted.

Installation

DL2 was developed and tested with Python 3.6, but should also work with newer versions. All requirements can be installed via pip install -r requirements.txt. Afterwards, the folder dl2lib can be imported as a Python library (for details see the examples).
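
For example, with the repository root on the Python path (the path below is a placeholder for your checkout location), the library can be imported as follows:

import sys
sys.path.append('/path/to/dl2')  # placeholder: path to the cloned repository root

import dl2lib as dl2             # the DL2 library
import dl2lib.query as q         # querying front-end (used in the querying examples)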

Reproducing Results and Examples

For examples, see the training and querying folders, which implement the experiments from the paper. Each folder contains its own README.md with further instructions on how to use the library and how to reproduce the results.

Paper

website pdf

@inproceedings{fischer2019dl2,
  title={DL2: Training and Querying Neural Networks with Logic},
  author={Marc Fischer and Mislav Balunovic and Dana Drachsler-Cohen and Timon Gehr and Ce Zhang and Martin Vechev},
  booktitle={International Conference on Machine Learning},
  year={2019}
}

If you are using the library, please also use the above citation to reference this work.


dl2's Issues

[Unexpected behavior] Setting box constraints on some input i

Hello!
When I try to set box constraints on part of the input i using the DSL, the result is not what I expect.
The DSL query looks like the following:

FIND i[10]
S.T. i[5:] in [1, 2],
...
RETURN i

I expect the last 5 elements of i to be restricted to [1, 2], but the constraint does not seem to hold in the result.

A similar thing happens when I use a mask instead of the slice operation.
The DSL query looks like the following:

FIND i[10]
S.T. i[mask] in [1, 2],
...
RETURN i

The mask is 0000000001 (only the last entry is true), but in the result it seems that the first element of i is the one restricted by the box constraint.

Am I misunderstanding the effect of the DSL above? Any advice would be appreciated!
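
For reference, the expected semantics of the two constraints above can be checked directly on a returned vector. The snippet below is a plain-PyTorch sanity check; result is a hypothetical placeholder for the vector returned by the query.

import torch

result = torch.randn(10)                   # placeholder for the returned vector i
mask = torch.tensor([False] * 9 + [True])  # 0000000001: only the last entry selected

# expected effect of "i[5:] in [1, 2]": the last 5 elements lie in [1, 2]
slice_ok = ((result[5:] >= 1) & (result[5:] <= 2)).all()

# expected effect of "i[mask] in [1, 2]": only the masked (last) element lies in [1, 2]
mask_ok = ((result[mask] >= 1) & (result[mask] <= 2)).all()

print(slice_ok.item(), mask_ok.item())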

GTSRB problems with test images path

In querying/context.py, the part after "Load gtsrb variables" has the following problems:

  1. The test images for GTSRB (querying/data/GTSRB/Final_Test/Images) are not located in class folders; their class is only indicated by their filenames. These should be moved by running this script:
for t in *ppm; do mkdir -p "${t%.*}"; mv "$t" "${t%.*}"; done 
  2. The test set needs an absolute path built like this (see the note on curr_dir below), replacing
testset = datasets.ImageFolder('../data/GTSRB/Final_Test/Images', transform=transform_test)

with

testset = datasets.ImageFolder(os.path.join(curr_dir, 'data/GTSRB/Final_Test/Images'), transform=transform_test)
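
If curr_dir is not already defined at that point in querying/context.py (an assumption; adjust to the actual code), it can be derived from the file's own location so that the data path resolves regardless of the working directory. datasets and transform_test are as in the surrounding code:

import os

# directory containing context.py
curr_dir = os.path.dirname(os.path.abspath(__file__))
testset = datasets.ImageFolder(os.path.join(curr_dir, 'data/GTSRB/Final_Test/Images'),
                               transform=transform_test)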

Reproducing the semi-supervised learning task

I tried to reproduce the semi-supervised learning task reported in the paper, but failed.
The command, modified from run.sh, is the following:
python main.py --lr 0.001 --net_type vgg --constraint 'DL2' --epochs 1600 --num_labeled 100 --dataset cifar100 --exp_name dl2_06 --constraint-weight 0.6
The results are

| Test Result   Acc@1: 1.12%
| Test Result   CAcc: 99.71%
| Test Result   GroupAcc: 5.18%

Have I used the wrong experimental settings? What parameter settings should I use in order to reproduce the results?
Thanks for any reply.

Errors in example usage in querying/README.md

The line

success, r, t = q.solve(a < b, return_values=[i])

in querying/README.md raises the following error:
TypeError: solve() missing 1 required positional argument: 'args'

It would make sense to change the example to:

import dl2lib as dl2
import dl2lib.query as q
from configargparse import ArgumentParser

parser = ArgumentParser(description='DL2 Querying')
parser = dl2.add_default_parser_args(parser, query=True)
args = parser.parse_args() # additional arguments such as timeout etc.
NN = q.Model(my_pytorch_model)  # my_pytorch_model: any trained PyTorch model
i = q.Variable('i', (10,))
a = i[0] + i[3]
b = NN(i[4])
success, r, t = q.solve(a < b, return_values=[i], args=args)

TypeError: unsupported operand type(s) for +: 'NoneType' and 'Fn'

When executing the following code, adapted from the example given in querying/README.md,

import torch

import dl2lib as dl2
import dl2lib.query as q
from configargparse import ArgumentParser

from querying.models.mnist.main import Net1

my_pytorch_model = Net1()
my_pytorch_model.load_state_dict(torch.load('models/mnist/mnist_1.pt'))


def high_level_query():
    qtext = """FIND i[10]
    S.T. i[0] + i[3] < NN(i[4])"""  # actual query
    context = dict()  # a python dict that defines models and named constants
    context['NN'] = my_pytorch_model
    parser = ArgumentParser(description='DL2 Querying')
    parser = dl2.add_default_parser_args(parser, query=True)
    args = parser.parse_args()  # additional arguments such as timeout etc.
    success, result, time = q.Query(qtext, context=context, args=args).run()

if __name__ == '__main__':
    high_level_query()

the following error occurs:

Traceback (most recent call last):
  File "~/dl2-master/querying/example_query.py", line 36, in <module>
    variant1()
  File "~/dl2-master/querying/example_query.py", line 21, in variant1
    success, result, time = q.Query(qtext, context=context, args=args).run()
  File "~/dl2-master/dl2lib/query/query.py", line 11, in __init__
    self.constraint, self.return_values = Parser(query, self.context, self.args).parse()
  File "~/dl2-master/dl2lib/query/parser.py", line 42, in parse
    return self.generate_find()
  File "~/dl2-master/dl2lib/query/parser.py", line 50, in generate_find
    constraints = self.traverse_constraints(self.parse_tree.find.constraints.constraints)
  File "~/dl2-master/dl2lib/query/parser.py", line 153, in traverse_constraints
    c1 = self.traverse_constraint(constraint.c1)
  File "~/dl2-master/dl2lib/query/parser.py", line 137, in traverse_constraint
    lhs = self.traverse_expression(constraint.lhs)
  File "~/dl2-master/dl2lib/query/parser.py", line 127, in traverse_expression
    return eval(f"lhs {exp.op} rhs")
  File "<string>", line 1, in <module>
TypeError: unsupported operand type(s) for +: 'NoneType' and 'Fn'

The model mnist_1.pt was trained using the given training procedure train_models.sh from the querying directory.
