talendar / nevopy
Neuroevolution framework for Python.

License: MIT

Languages: Python 99.90%, Makefile 0.10%
Topics: neuroevolution, python, neat, deep-learning, evolutionary-algorithms, genetic-algorithms, machine-learning, neural-network, parallel-computing, distributed

nevopy's Introduction

NEvoPy logo

Neuroevolution for Python


NEvoPy is an open-source neuroevolution framework for Python. It provides a simple and intuitive API for researchers and enthusiasts to quickly tackle machine learning problems with neuroevolutionary algorithms. NEvoPy is optimized for distributed computing and is compatible with TensorFlow.

Currently, the neuroevolutionary algorithms implemented by NEvoPy are:

  • NEAT (NeuroEvolution of Augmenting Topologies), a powerful method by Kenneth O. Stanley for evolving neural networks through complexification;
  • the standard fixed-topology approach to neuroevolution, with support for TensorFlow and deep neural networks.

Note, though, that there's much more to come!

In addition to providing high-performance implementations of powerful neuroevolutionary algorithms, such as NEAT, NEvoPy also provides tools to help you more easily implement your own algorithms.

Neuroevolution, a form of artificial intelligence that uses evolutionary algorithms to generate artificial neural networks (ANNs), is one of the most interesting and still largely unexplored fields of machine learning. It is a vast and expanding area of research that holds great promise for the future.

Installing

To install the current release, use the following command:

$ pip install nevopy

Getting started

To learn the basics of NEvoPy, the XOR example is a good place to start. More examples can be found in the examples folder of the project's GitHub repo.

You should also take a look at this quick overview of NEvoPy. The project's documentation is available on Read the Docs.
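
To give a feel for the workflow, here is a rough sketch of how the NEAT version of the XOR example is put together. The names used here (NeatPopulation, evolve(), fittest(), process(), reset()) are recalled from the project's documentation and may differ slightly from the current API, so refer to the XOR example in the repo for the exact code.

import nevopy as ne

# XOR truth table.
xor_inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
xor_outputs = [0, 1, 1, 0]

def fitness_function(genome):
    # Reward outputs that are close to the expected XOR values.
    error = 0.0
    for x, y in zip(xor_inputs, xor_outputs):
        genome.reset()                       # clear cached activations
        output = genome.process(x)[0]
        error += (output - y) ** 2
    return 4 - error

population = ne.neat.NeatPopulation(size=100, num_inputs=2, num_outputs=1)
population.evolve(generations=64, fitness_function=fitness_function)

best = population.fittest()
print([round(float(best.process(x)[0])) for x in xor_inputs])  # ideally [0, 1, 1, 0]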


Citing

If you use NEvoPy in your research and would like to cite the framework, here is a BibTeX entry you can use. It currently contains only the name of the original author, but more names might be added as more people contribute to the project. Also, feel free to contact me (Talendar/Gabriel) to show me your work; I'd love to see it.

@misc{nevopy,
  title={{NEvoPy}: A Neuroevolution Framework for Python},
  author={Gabriel Guedes Nogueira},
  howpublished={\url{https://github.com/Talendar/nevopy}},
}

nevopy's People

Contributors

talendar
nevopy's Issues

Unify the implementation of genetic algorithms

Currently, each neuroevolutionary algorithm implements its own genetic algorithm (through a class that inherits from Population). Since the behaviour of the genetic algorithms in most neuroevolutionary algorithms is basically the same (with minor differences), I think it would be better to unify the implementation of those genetic algorithms in one place, in order to avoid duplicate code.

An intuitive way to do that is to create a subclass of Population named GeneticPopulation (or something like that) that implements a general genetic algorithm. By inheriting from this class, each neuroevolutionary algorithm can customize the behaviour of the genetic algorithm without having to implement it from the ground up. This might require breaking the evolve() method into smaller pieces; a rough skeleton of the idea is sketched below.
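
A standalone sketch of what that skeleton could look like (all names below are hypothetical and not part of the current API):

class GeneticPopulation:
    """Generic genetic algorithm; subclasses override only the hooks they need."""

    def __init__(self, genomes):
        self.genomes = genomes

    def evolve(self, generations, fitness_function):
        # evolve() is just a thin driver; each step is an overridable hook.
        for _ in range(generations):
            fitness = self.evaluate(fitness_function)
            self.speciate(fitness)            # no-op for algorithms without species
            parents = self.select(fitness)
            offspring = self.reproduce(parents)
            self.genomes = self.mutate(offspring)

    def evaluate(self, fitness_function):
        return [fitness_function(g) for g in self.genomes]

    def speciate(self, fitness):
        pass

    def select(self, fitness):
        raise NotImplementedError

    def reproduce(self, parents):
        raise NotImplementedError

    def mutate(self, offspring):
        return offspring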

Configuring time and memory constraints when running NEvoPy programs

Hey @Talendar, this isn't exactly an issue, but I couldn't quite find your email and I didn't know where else to submit this question, so apologies if it shouldn't be written here. To get a sense of how NEvoPy scripts can be configured, I'd like to run one of the examples you've given (the XOR fixed-topology example) on a computing cluster I have access to. Can you give me a sense of how much memory and time I should request for the run, and whether the example could/should be run with GPUs? (I suspect it can, because the XOR fixed-topology example seems to use TensorFlow layers, or at least wrappers around TensorFlow layers, which I believe can use GPUs.) Also, for a more complex neural network than the one in the example, is there a good way of estimating how much memory and time the runs will take?

2D observation spaces in gym throw error with NEAT genomes

Below is a simple OpenAI Gym environment with a 2D observation space. If you try to run it in NEvoPy, the genome class throws ValueError: setting an array element with a sequence. You can reproduce this with any of the trading environments from gym-anytrading, or with the bare-bones environment below. I think this is related to the fact that reset() returns an n-dimensional observation, but that is just a guess.

import gym
from gym import spaces
import numpy as np

class xCartPoleEnv(gym.Env):
  metadata = {
    'render.modes': ['human', 'rgb_array'],
    'video.frames_per_second': 50
  }

  def __init__(self):
    self.action_space = spaces.Discrete(2)
    # ==== NOTE THE 10x2 observation space here
    self.observation_space = spaces.Box(low=-np.inf, high=np.inf, shape=(10,2), dtype=np.float32)

  def step(self, action):
    self.state = np.zeros((10, 2))
    reward = 1.0
    done = True
    return self.state, reward, done, {}

  def reset(self):
    self.state = np.zeros((10, 2)) 
    return self.state

  def render(self, mode='human'):
    print('render')

  def close(self):
    print('close')
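
A possible workaround, not part of the original report and assuming the genome expects a flat input vector, is to flatten the observations with a gym.ObservationWrapper before handing the environment to NEvoPy:

import gym
import numpy as np
from gym import spaces

class FlattenObservation(gym.ObservationWrapper):
    """Flattens an n-dimensional Box observation into a 1D vector."""

    def __init__(self, env):
        super().__init__(env)
        flat_dim = int(np.prod(env.observation_space.shape))
        self.observation_space = spaces.Box(
            low=-np.inf, high=np.inf, shape=(flat_dim,), dtype=np.float32)

    def observation(self, observation):
        return np.asarray(observation, dtype=np.float32).reshape(-1)

# env = FlattenObservation(xCartPoleEnv())
# env.reset().shape  # -> (20,) instead of (10, 2)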

Add a network processing scheduler

Feature:

A processing scheduler (a subclass of ProcessingScheduler) capable of assigning work to different machines on a network. The RayProcessingScheduler seems to be able to do that to some extent, but, although I could make it work on a local network, I was unable to use it to split the processing among computers over the Internet.

The network processing scheduler must be able to manage other processing schedulers (workers), each running on a different machine. A rough outline of the idea is sketched below.
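
A standalone outline of that structure (the run(items, func) interface is an assumption about ProcessingScheduler, and the remote call is just a placeholder; the real base class may differ):

from typing import Callable, List, Sequence

class RemoteWorker:
    """Stand-in for a worker scheduler running on another machine."""

    def run(self, items: Sequence, func: Callable) -> List:
        # Placeholder: a real implementation would forward this call (e.g. via
        # RPC) to a ProcessingScheduler hosted on the remote machine.
        return [func(item) for item in items]

class NetworkProcessingScheduler:
    """Splits a batch of items among remote workers and gathers the results."""

    def __init__(self, workers: List[RemoteWorker]):
        self.workers = workers

    def run(self, items: Sequence, func: Callable) -> List:
        n = len(self.workers)
        # Round-robin assignment: worker i gets items i, i + n, i + 2n, ...
        chunks = [list(items[i::n]) for i in range(n)]
        results = [None] * len(items)
        for i, (worker, chunk) in enumerate(zip(self.workers, chunks)):
            for idx, res in zip(range(i, len(items), n), worker.run(chunk, func)):
                results[idx] = res  # put results back in the original order
        return results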

Error during installation

This is the output of pip install nevopy:

Collecting nevopy
Using cached nevopy-0.2.3-py3-none-any.whl (117 kB)
Collecting Columnar~=1.3.1 (from nevopy)
Using cached Columnar-1.3.1-py3-none-any.whl (11 kB)
Collecting gym~=0.17.3 (from nevopy)
Using cached gym-0.17.3.tar.gz (1.6 MB)
Preparing metadata (setup.py) ... done
Collecting matplotlib~=3.3.3 (from nevopy)
Using cached matplotlib-3.3.4.tar.gz (37.9 MB)
Preparing metadata (setup.py) ... done
Collecting networkx~=2.5 (from nevopy)
Using cached networkx-2.8.8-py3-none-any.whl (2.0 MB)
Collecting numpy~=1.19.5 (from nevopy)
Using cached numpy-1.19.5.zip (7.3 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
INFO: pip is looking at multiple versions of nevopy to determine which version is compatible with other requirements. This could take a while.
Collecting nevopy
Using cached nevopy-0.2.2-py3-none-any.whl (117 kB)
Using cached nevopy-0.2.1-py3-none-any.whl (117 kB)
Using cached nevopy-0.2.0-py3-none-any.whl (117 kB)
Using cached nevopy-0.1.1-py3-none-any.whl (112 kB)
Using cached nevopy-0.1.0-py3-none-any.whl (111 kB)
Using cached nevopy-0.0.2-py3-none-any.whl (93 kB)
Collecting mypy~=0.790 (from nevopy)
Using cached mypy-0.991-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.1 MB)
Collecting nevopy
Using cached nevopy-0.0.1-py3-none-any.whl (93 kB)
INFO: pip is still looking at multiple versions of nevopy to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install nevopy==0.0.1, nevopy==0.0.2, nevopy==0.1.0, nevopy==0.1.1, nevopy==0.2.0, nevopy==0.2.1, nevopy==0.2.2 and nevopy==0.2.3 because these package versions have conflicting dependencies.

The conflict is caused by:
nevopy 0.2.3 depends on ray~=1.1.0
nevopy 0.2.2 depends on ray~=1.1.0
nevopy 0.2.1 depends on ray~=1.1.0
nevopy 0.2.0 depends on ray~=1.1.0
nevopy 0.1.1 depends on ray~=1.1.0
nevopy 0.1.0 depends on ray~=1.1.0
nevopy 0.0.2 depends on ray~=1.1.0
nevopy 0.0.1 depends on ray~=1.1.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

Implement neural network layers optimized for neuroevolution

Feature:

Although TensorFlow is an awesome, high-performance framework, it wasn't designed with neuroevolution in mind. Given that, perhaps NEvoPy should implement its own neural network layers, optimized for neuroevolution (without the usual deep learning machinery, like optimizers and back-propagation). The idea is not to remove TensorFlow support from NEvoPy, but to provide built-in neural layers that the user can opt to use.

Implementing a bug-free, GPU-compatible neural layer is no trivial task, however. Also, I don't know enough about TensorFlow's internals to judge whether these new implementations would be worth it. Some tests are needed to clarify this.
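
To illustrate the idea, here is a minimal, standalone sketch (not part of NEvoPy's current API) of a forward-only dense layer whose weights change by mutation instead of back-propagation:

import numpy as np

class EvoDenseLayer:
    """Dense layer for neuroevolution: forward pass plus a mutation operator."""

    def __init__(self, in_size: int, out_size: int):
        self.weights = np.random.randn(in_size, out_size) * 0.1
        self.bias = np.zeros(out_size)

    def forward(self, x: np.ndarray) -> np.ndarray:
        # Simple affine transform followed by a tanh activation.
        return np.tanh(x @ self.weights + self.bias)

    def mutate(self, scale: float = 0.05) -> None:
        # Evolution replaces gradient descent: perturb the weights with noise.
        self.weights += np.random.randn(*self.weights.shape) * scale
        self.bias += np.random.randn(*self.bias.shape) * scale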
