ibm / ai-descartes

Open source package for accelerated symbolic discovery of fundamental laws.

License: MIT License

Languages: Python 8.67%, Java 91.30%, Shell 0.02%

ai-descartes's Introduction


AI Descartes: Combining Data and Theory for Derivable Scientific Discovery

This repository contains the code and the data used for the experiments in the paper Combining data and theory for derivable scientific discovery with AI-Descartes.

Visit our website for a general overview, references, and some introductory videos: AI-Descartes website

[Figure: system overview]

Folder descriptions:

  • data: contains the 3 datasets used in the paper (Kepler’s third law of planetary motion, Einstein’s time-dilation formula, Langmuir’s adsorption equation), the data points for 81 FSRD problems and the corresponding background theories (see data/README.md).
  • reasoning: contains the code for the Reasoning module of AI-Descartes (see reasoning/README.md).
  • symbolic-regression: contains the code for the Symbolic Regression module of AI-Descartes (see symbolic-regression/README.md).
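As a minimal illustration of working with the provided datasets, the sketch below loads a two-column data file into parallel x and y lists. The whitespace-separated "x y per line" layout of input.dat is an assumption inferred from the (pressure, loading) pairs echoed in the issue log further down, and load_xy is a hypothetical helper, not part of the repository; see data/README.md for the authoritative format.

```python
# Hypothetical loader for a two-column dataset file such as
# datasets/langmuir/sun_et_al/input.dat. Format ASSUMED: one
# whitespace-separated "x y" pair per line; '#' starts a comment.
from pathlib import Path


def load_xy(path):
    """Return parallel lists of x and y values read from `path`."""
    xs, ys = [], []
    for line in Path(path).read_text().splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        x_str, y_str = line.split()[:2]  # ignore any extra columns
        xs.append(float(x_str))
        ys.append(float(y_str))
    return xs, ys


if __name__ == "__main__":
    # Tiny self-contained demo: write three rows, read them back.
    demo = Path("demo_input.dat")
    demo.write_text("0.07 0.695\n0.11 0.752\n0.2 0.797\n")
    xs, ys = load_xy(demo)
    demo.unlink()
    print(xs, ys)
```

If the real files carry more columns or a header, only the parsing line needs adjusting; the two-list return shape is the convenient form for the fitting example later in this page.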

How to cite

@article{AI_Descartes,
	title = {Combining data and theory for derivable scientific discovery with {AI}-{Descartes}},
	volume = {14},
	issn = {2041-1723},
	url = {https://doi.org/10.1038/s41467-023-37236-y},
	doi = {10.1038/s41467-023-37236-y},
	number = {1},
	journal = {Nature Communications},
	author = {Cornelio, Cristina and Dash, Sanjeeb and Austel, Vernon and Josephson, Tyler R. and Goncalves, Joao and Clarkson, Kenneth L. and Megiddo, Nimrod and El Khadir, Bachir and Horesh, Lior},
	month = apr,
	year = {2023},
	pages = {1777},
}

ai-descartes's People

Contributors

corneliocristina, ibm-open-source-bot, supermang


ai-descartes's Issues

Help needed to run AI-Descartes symbolic regression

I am having problems running the system with the provided datasets. In particular, I repeatedly get output saying that the jobs failed:

~/AI-Descartes/symbolic-regression/run_emf$ java -cp ../emf/src/:../emf/yamlbeans-1.0.jar  emf.Main opts-common.yaml opts-langmuir.yaml datasets/langmuir/sun_et_al/
Current relative path is: /home/gkronber/AI-Descartes/symbolic-regression/run_emf
YAML OPTS from dir: datasets/langmuir/sun_et_al/ [[outputDir: datasets/langmuir/sun_et_al//tmp], [infile: datasets/langmuir/sun_et_al//input.dat], [, opts-common.yaml], [, opts-langmuir.yaml], [, datasets/langmuir/sun_et_al//opts.yaml]]
YAML STRING infile: datasets/langmuir/sun_et_al//input.dat
YAML  {infile=datasets/langmuir/sun_et_al//input.dat}
YAML opts from string infile: datasets/langmuir/sun_et_al//input.dat: {infile=datasets/langmuir/sun_et_al//input.dat}
YAML opts from opts-common.yaml: infile=... multitask=false nonconst_plusminus=false baron_exec=... baronOpts=... inputSampleSize=50 kill_existing_baron_jobs=false max_num_jobs=45 LicName=... printBaronFilesUpfront=true
YAML opts from opts-langmuir.yaml: infile=... multitask=false nonconst_plusminus=false baron_exec=... baronOpts=... inputSampleSize=50 kill_existing_baron_jobs=false max_num_jobs=45 LicName=... printBaronFilesUpfront=true dimensionVars={x=[1, 0], y=[0, 3]} use_dimensional_analysis=false relative_squared_error=false prod_exp_limit=6 add_noise_to_input=false max_num_consts=1 dimensionUnits=[bar, mm] max_tree_depth=3
WARNING: yaml key max_num_consts is being overridden in file datasets/langmuir/sun_et_al//opts.yaml.
YAML opts from datasets/langmuir/sun_et_al//opts.yaml: infile=... multitask=false nonconst_plusminus=false baron_exec=... baronOpts=... inputSampleSize=50 kill_existing_baron_jobs=false max_num_jobs=45 LicName=... printBaronFilesUpfront=true dimensionVars={x=[1, 0], y=[0, 3]} use_dimensional_analysis=false relative_squared_error=false prod_exp_limit=6 add_noise_to_input=false max_num_consts=4 dimensionUnits=[bar, mm] max_tree_depth=3 operators=[*, /, +] max_var_exp=2
UNOPS [*, /, +] [] []
The following yaml items have non-default values:
  max_num_jobs: 45
  multitask: false
  max_num_consts: 4
  prod_exp_limit: 6
  printBaronFilesUpfront: true
  inputSampleSize: 50
 BOP
 [MaxTime: 60;, PrTimeFreq: 1;, EpsA: 1e-4;, EpsR: 1e-3;]

outputDir: datasets/langmuir/sun_et_al//tmp
Operators: [*, /, +]
 0: [0.07] -> 0.695
 1: [0.11] -> 0.752
 2: [0.2] -> 0.797
 3: [0.31] -> 0.825
 4: [0.56] -> 0.86
 5: [0.8] -> 0.882
 6: [1.07] -> 0.904
 7: [1.46] -> 0.923
 8: [3.51] -> 0.976
 9: [6.96] -> 1.212
10: [12.06] -> 1.371
11: [17.26] -> 1.469
12: [27.56] -> 1.535
13: [41.42] -> 1.577
14: [55.2] -> 1.602
15: [68.95] -> 1.619
16: [86.17] -> 1.632
NUM TREES 31 with max depth 3
        ArithNormTree                            Unnormalized (blank if same)
     0   3: P
     1   7: (P+P)
     2  10: (P+P+P)
     3  13: (P/(P+P))
     4  13: (P+P+P+P)
     5  16: (P/(P+P+P))
     6  16: (P+P+P+P+P)
     7  17: ((P+P)/(P+P))
     8  17: (P+(P/(P+P)))
     9  19: (P/(P+P+P+P))
    10  19: (P+P+P+P+P+P)
    11  20: ((P+P+P)/(P+P))
    12  20: ((P+P)/(P+P+P))
    13  20: (P+P+(P/(P+P)))
    14  21: (P+((P+P)/(P+P)))
    15  22: (P+P+P+P+P+P+P)
    16  23: ((P+P+P)/(P+P+P))
    17  23: ((P+P+P+P)/(P+P))
    18  23: ((P+P)/(P+P+P+P))
    19  23: (P+P+P+(P/(P+P)))
    20  24: (P+P+((P+P)/(P+P)))
    21  25: (P+P+P+P+P+P+P+P)
    22  26: ((P+P+P)/(P+P+P+P))
    23  26: ((P+P+P+P)/(P+P+P))
    24  26: (P+P+P+P+(P/(P+P)))
    25  27: (P+P+P+((P+P)/(P+P)))
    26  27: ((P/(P+P))+(P/(P+P)))
    27  29: ((P+P+P+P)/(P+P+P+P))
    28  30: (P+P+P+P+((P+P)/(P+P)))
    29  31: ((P/(P+P))+((P+P)/(P+P)))
    30  35: (((P+P)/(P+P))+((P+P)/(P+P)))

-----------------------

IGNORING inputSampleSize, since too large (50 >= 17).
Printing all jobs upfront - ubound is 0.0
THESE JOBS FAILED: [0:P, 1:(P+P), 2:(P+P+P), 3:(P/(P+P)), 4:(P+P+P+P), 5:(P/(P+P+P)), 6:(P+P+P+P+P), 7:((P+P)/(P+P)), 8:(P+(P/(P+P))), 9:(P/(P+P+P+P)), 10:(P+P+P+P+P+P), 11:((P+P+P)/(P+P)), 12:((P+P)/(P+P+P)), 13:(P+P+(P/(P+P))), 14:(P+((P+P)/(P+P))), 15:(P+P+P+P+P+P+P), 16:((P+P+P)/(P+P+P)), 17:((P+P+P+P)/(P+P)), 18:((P+P)/(P+P+P+P)), 19:(P+P+P+(P/(P+P))), 20:(P+P+((P+P)/(P+P))), 21:(P+P+P+P+P+P+P+P), 22:((P+P+P)/(P+P+P+P)), 23:((P+P+P+P)/(P+P+P)), 24:(P+P+P+P+(P/(P+P))), 25:(P+P+P+((P+P)/(P+P))), 26:((P/(P+P))+(P/(P+P))), 27:((P+P+P+P)/(P+P+P+P)), 28:(P+P+P+P+((P+P)/(P+P))), 29:((P/(P+P))+((P+P)/(P+P))), 30:(((P+P)/(P+P))+((P+P)/(P+P)))]
RERUNNING
.still running: []
THESE JOBS FAILED: [0:P, 1:(P+P), 2:(P+P+P), 3:(P/(P+P)), 4:(P+P+P+P), 5:(P/(P+P+P)), 6:(P+P+P+P+P), 7:((P+P)/(P+P)), 8:(P+(P/(P+P))), 9:(P/(P+P+P+P)), 10:(P+P+P+P+P+P), 11:((P+P+P)/(P+P)), 12:((P+P)/(P+P+P)), 13:(P+P+(P/(P+P))), 14:(P+((P+P)/(P+P))), 15:(P+P+P+P+P+P+P), 16:((P+P+P)/(P+P+P)), 17:((P+P+P+P)/(P+P)), 18:((P+P)/(P+P+P+P)), 19:(P+P+P+(P/(P+P))), 20:(P+P+((P+P)/(P+P))), 21:(P+P+P+P+P+P+P+P), 22:((P+P+P)/(P+P+P+P)), 23:((P+P+P+P)/(P+P+P)), 24:(P+P+P+P+(P/(P+P))), 25:(P+P+P+((P+P)/(P+P))), 26:((P/(P+P))+(P/(P+P))), 27:((P+P+P+P)/(P+P+P+P)), 28:(P+P+P+P+((P+P)/(P+P))), 29:((P/(P+P))+((P+P)/(P+P))), 30:(((P+P)/(P+P))+((P+P)/(P+P)))]
RERUNNING

... output repeats endlessly
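For context on what the search in this log is aiming at: Langmuir's adsorption equation is a rational function of pressure, and the 17 (pressure, loading) pairs printed above can be fit to that shape directly. The sketch below does a coarse grid-search least-squares fit of y ≈ c + a·x/(1 + b·x). This is only an illustration of the target functional form — AI-Descartes fits expression trees with the BARON global optimizer, not a brute-force grid — and the constant offset c is my own addition to absorb the nonzero loading at low pressure.

```python
# Coarse grid-search fit of a Langmuir-style isotherm
#     y ~= c + a*x / (1 + b*x)
# to the 17 (pressure, loading) points echoed in the log above.
# Illustrative only: NOT the BARON-based optimization the system uses.

xs = [0.07, 0.11, 0.2, 0.31, 0.56, 0.8, 1.07, 1.46, 3.51,
      6.96, 12.06, 17.26, 27.56, 41.42, 55.2, 68.95, 86.17]
ys = [0.695, 0.752, 0.797, 0.825, 0.86, 0.882, 0.904, 0.923, 0.976,
      1.212, 1.371, 1.469, 1.535, 1.577, 1.602, 1.619, 1.632]


def sse(a, b, c):
    """Sum of squared errors of the candidate model over all points."""
    return sum((c + a * x / (1 + b * x) - y) ** 2 for x, y in zip(xs, ys))


# Exhaustive search over a small, hand-picked parameter grid.
best = min(
    ((sse(a, b, c), a, b, c)
     for a in (i * 0.01 for i in range(1, 31))
     for b in (i * 0.01 for i in range(1, 31))
     for c in (0.5 + 0.05 * i for i in range(11))),
    key=lambda t: t[0],
)
print("best SSE %.4f at a=%.2f b=%.2f c=%.2f" % best)
```

A fit with SSE well below the ~2.0 achieved by a constant (mean-only) model confirms the rational-function shape; the symbolic-regression run above is effectively searching for this same structure among the enumerated P-trees.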

I had to compile the source files myself because the .jar file mentioned in the docs (https://github.com/IBM/AI-Descartes/blob/main/symbolic-regression/README.md) does not exist in the repository. I am running the BARON demo license on my system, because I do not want to spend over $200 a month on an academic license just to try out AI-Descartes.

Please let me know how I can run one of the examples on my system.
