neoresearch / neocompiler-eco


NeoCompiler Eco: compile, deploy, test NEO smart contracts and tools for the decentralized web

Home Page: https://neocompiler.io

License: MIT License

Languages: JavaScript 33.74%, CSS 16.77%, Shell 1.97%, HTML 5.90%, C# 7.47%, Dockerfile 0.20%, Python 0.03%, Less 16.62%, SCSS 17.31%
Topics: neo, python, csharp, smart-contracts, online-ide, private-network, blockchain

neocompiler-eco's Introduction

NeoCompiler Eco 3+

Open in Gitpod

Open in GitHub Codespaces

This is an open-source initiative providing easy access to the NEO ecosystem.

In particular, we provide simple, interactive and didactic interfaces that allow online compiling of C#, Python, Go and Java.

Official Documentation on ReadTheDocs


Building Sphinx Documentation Locally

Just run ./make-docs.sh and open ./docs/build/html/index.html (the same content as on ReadTheDocs)

Suggestions

Browsers/Devices

  • Tested with Firefox Quantum - 84.0
The current front-end interface can be accessed from:
Compilers RPC API services are available at:
Other useful services

What does it currently do

  • Compile input C# code using reliable and safe backend (server-side) compilers;
  • Return NEF-compatible files: AVM, ABI and MANIFEST codes;
  • Deploy and invoke code on a private net, shared privatenet, testnet (unsafe) or mainnet (unsafe);
  • Save your history of activities for testing a given smart contract;
  • Test with different wallets, kept in sync and able to provide historic data about your activity;
  • Perform all up-to-date blockchain invocations and RPC calls (see the sketch right after this list);
  • Run RPC with all enabled plugins and up-to-date features;
  • Provide basic statistical data;
  • Use websockets to provide some useful information;
  • It can be used on TestNet, or even MainNet.
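For instance, a minimal sketch of such an RPC call against a local Eco node, assuming a JSON-RPC endpoint exposed on port 30333 (the default consensus/RPC port listed in the Eco Network section below):

# Ask a NeoCompiler Eco RPC node for its version information
curl -s -X POST http://localhost:30333 \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"getversion","params":[],"id":1}'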

Roadmap

  • Move towards client-based compiling (more secure, robust and much more scalable).
  • Ideas? Collaborations are welcome :) The goal is to be didactic and bring it closer to citizens and users: Smart Cities, Smart Governance and Smart Blockchain Technologies :P

Dependencies

Docker recommendations

Docker is essential for sandboxing all compilers in different environments (one for each language). Docker Compose is now integrated with Docker Desktop.

sudo apt-get install apt-transport-https ca-certificates curl gnupg-agent software-properties-common

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

sudo apt-key fingerprint 0EBFCD88

sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"

sudo apt update

sudo apt-get install docker-ce docker-ce-cli containerd.io

If necessary, add your user to the docker group: sudo usermod -a -G docker $USER
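As a quick sanity check (not part of the original instructions), you can verify that the Docker engine and the Compose plugin are working before building anything:

# Confirm that the daemon answers and that the Compose plugin is installed
docker run --rm hello-world
docker compose version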

Build everything

The single command required to create your own NeoCompiler Ecosystem, suitable for private or public blockchain projects.

This will call docker-compose with the NeoCompiler Private Net (Eco) + NeoScan (optional). Furthermore, it will set up all available compilers and start the frontend interface and backend server, respectively.

./build_everything.sh

Developers guidelines

Basically, two steps are required: A1 and A2. Both are described below.

A1) Building compilers

This script already builds the compilers and starts the server:

./docker-sock-express-compilers/docker-compilers/buildCompilers.sh

  • Edit ./docker-sock-express-compilers/docker-compilers/.env if you want to build only some compilers, for example BUILD_ALL_CSHARP=0 (a sketch of this edit is shown right after this list)
    • Build a list with different versions of a given compiler using a script such as: docker-sock-express-compilers/docker-compilers/compilers/docker-compiler-csharp/docker_build_list_of_compilers.sh
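A minimal sketch of that .env edit from the command line; BUILD_ALL_CSHARP is the only flag named in this README, so any other flag name would be an assumption about your checkout:

# Skip building the C# compiler images (flag name taken from the bullet above)
sed -i 's/^BUILD_ALL_CSHARP=.*/BUILD_ALL_CSHARP=0/' \
  ./docker-sock-express-compilers/docker-compilers/.env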

Building C# Neo Core Compiler entrypoint based image

The backend for C# is provided using the native github/neo-project tools; only two steps are necessary to build and tag the image:

cd /docker-sock-express-compilers/docker-compilers/compilers/docker-compiler-csharp

docker_build.sh

Running express node servers

Base images for express nodes and Docker

docker-sock-express-compilers/docker-ubuntu-docker-node-express

./docker_build-all.sh will build both:

  • the essential docker image for express and node: docker_build-express.sh
  • the aforementioned image with Docker included: docker_build-express-docker.sh

Http front-end: cd /docker-sock-express-compilers/docker-http-express

docker compose up

Compilers RPC API Backend: cd docker-sock-express-compilers/docker-compilers

docker compose up

Eco Services: cd docker-sock-express-compilers/docker-services

docker compose up
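For reference, a sketch that brings up the three groups above in detached mode, assuming it is run from the repository root:

# Start the HTTP front-end, the compilers backend and the eco services in the background
(cd docker-sock-express-compilers/docker-http-express && docker compose up -d)
(cd docker-sock-express-compilers/docker-compilers && docker compose up -d)
(cd docker-sock-express-compilers/docker-services && docker compose up -d)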

A2) Eco Network Functionalities

Docker Compose is the main tool used to create our micro-services. This script will start all necessary backend functionalities and neo-csharp nodes.

In particular, we currently have:

  • C# nodes use TCP at ports 2033x and RPC at ports 3033x; websockets are not being used (a quick check is sketched right after this list)
    • 4 C# consensus nodes, two of which also serve RPC by default at ports 30333 and 30334;
    • 1 pure C# RPC node at port 30337;
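A minimal sketch for checking that the RPC nodes answer on those ports (ports taken from the list above; the host is assumed to be localhost):

# Ask each RPC node for its current block height
for port in 30333 30334 30337; do
  echo "node on port $port:"
  curl -s -X POST "http://localhost:$port" \
    -H "Content-Type: application/json" \
    -d '{"jsonrpc":"2.0","method":"getblockcount","params":[],"id":1}'
  echo
done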

Dealing with docker compose swarm of containers

Start up the containers, checking the messages and following any warnings

Simply run runEco_network.sh (integrated with .env file)

or:

cd ./docker-compose-eco-network

docker compose up

Start up the containers in detached mode

docker compose up -d

Feel free to take it down

docker compose down

However, consider stopping and restarting

docker compose stop
docker compose start
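To inspect the swarm while it is running, the standard Compose commands work as usual (a sketch, run from ./docker-compose-eco-network):

# List the containers and follow their recent logs
docker compose ps
docker compose logs -f --tail=100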

NeoCompiler Eco useful commands and ideas

Other functionalities and integrations are possible and some are implemented

It is also possible to integrate the Eco Network with light wallets and explorers.

Other parameters

One could check docker-compose.yml, picking a combination of your choice from the docker-compose-eco-network folder. This can be done to locally modify some characteristics.

Run build_everything.sh with the additional parameter --no-build and your modified node zip of the private net will be used; name it neo-cli-built.zip.

Useful Commands

Open C# nodes

  • docker exec -it eco-neo-csharp-node1-running bash
  • screen -dr will show the screen
  • /opt/run.sh will start a new screen (if you killed the last process);
  • /opt/start_node.sh will start the node directly in the terminal (a loop over all four consensus node containers is sketched below).
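A sketch for peeking at all four consensus nodes at once; only eco-neo-csharp-node1-running is named in this README, so the eco-neo-csharp-nodeN-running pattern for the other containers is an assumption:

# List the screen sessions inside each consensus node container
for i in 1 2 3 4; do
  echo "== eco-neo-csharp-node$i-running =="
  docker exec "eco-neo-csharp-node$i-running" screen -ls
done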

Contributing

  • If you have ideas or issues, you can report them on GitHub or contact us directly

    1. Check the open issues and pull requests for existing discussions.
    2. Open an issue first, to discuss a new feature or enhancement.
    3. Write tests, and make sure the test suite passes locally and on CI.
    4. Open a pull request, and reference the relevant issue(s).
    5. After receiving feedback, squash your commits and add a great commit message.
    6. Run make push-tag after merging your pull request.
    7. Anyway, you are already part of the team... :P
  • Our team is currently formed by researchers/professors, so our time is very constrained... if you feel you can help us, don't hesitate!

  • We created a wallet specially for project donations. That can help us improve our servers and perhaps hire someone for improving graphical interfaces and developing many more interesting features. NEO wallet: AJX1jGfj3qPBbpAKjY527nPbnrnvSx9nCg

LICENSE MIT

This project is part of the NeoResearch initiative and is freely available at NeoCompiler.io. The website is rebooted periodically to keep resource usage low, so everyone is welcome to use it.

NeoCompiler Eco team @igormcoelho and @vncoelho

Copyleft 2017-2020

neocompiler-eco's People

Contributors

corollari, d7laungani, dependabot[bot], ftgibran, igormcoelho, joaooliversystem, omahs, ricardomartinlabs, rodoufu, salahzsh, slynrick, stevenjack, tomascarvalho, vncoelho


neocompiler-eco's Issues

Claim Fails Error, even though Claim is Successful

When I click the orange claim button here for the top address

(screenshot)

I receive the following error:

(screenshot)

However, if I wait a few moments it seems that the claim is successful, and the GAS is successfully transferred into the wallet.

Submission of raw .avm

It would be interesting to have a feature for submitting raw .avm files. That would simplify integration with VStudio and other desktop IDEs, so advanced programmers won't need to set up their own NEO Ecosystem (besides the local compiler, if they already have one).

Something wrong to generate gas

It usually happens during the "second" half of the day (at night in the Brazil time zone).
No gas, no gain. 😢 But I'm enjoying a new book meanwhile. ☺️

(screenshot)

Error "docker-compose up" in ./docker-compose-eco-network

I tried to run "docker-compose up" in docker-compose-eco-network on Windows 10. It gives me the following error:

pull access denied for eco-neo-csharp-node, repository does not exist or may require 'docker login'

I want to run the privnet, which is compiled with a 3-second block time, so we can test our NEO smart contracts faster.

Here is some more information on the Windows 10 system:

Client: 18.03.0-ce
Go version: go1.9.4
Git commit: Fri Mar 23 08:31:36 2018
OS/Arch: windows/amd64
Orchestrator: swarm

Server:
Engine: 18.06.1-ce
Go version: go1.10.3 (API minimum version 1.12)
Git commit: Tue Aug 21 17:28:38 2018
OS/Arch: amd64

docker-compose version 1.20.1, build 5d8c71b2
docker-py version: 3.1.4
CPython version: 3.6.4
OpenSSL version: OpenSSL 1.0.2k 26 Jan 2017

Thanks!

RPC stats polling at one second intervals on TestNet and MainNet

The node block height/version/mempool queries that populate the stats in the menu bar are each being polled once per second, which I understand is normal for the one-second-per-block neocompiler nodes, but this happens even when connected to a MainNet or TestNet node.

The code should be changed to be aware of the number of seconds per block on whichever network it is connected to and only poll at those intervals, in order to avoid overloading the public nodes with unnecessary queries.

Magic boxes for invoke

Folks, we need to implement a "magic box" feature, based on ABI information, so we can invoke contract functions directly from a listbox, filtering parameters correctly (strings, numbers, lists...). I believe this feature should also define a JSON format for passing invoke information (perhaps we propose a NEP for this?).

Add NEP-2 string wallet import feature

Add a feature to import NEP-2 wallets (client-only!!) and perform invoke operations on private/testnets. It is important to NEVER store or print this value anywhere (on the backend), and also to warn that it is very dangerous to use mainnet wallets here.

Need your help, wallets is unavailable sometimes

I found that the wallets on neocompiler.io have been unavailable sometimes recently.

I want to show our dapp demo on 4 August at the Game-Shared meeting organised by NEO officials. Would you help me make neocompiler.io stable on that day?
I would like to introduce neocompiler.io to the other developers at that meeting.

Thank you in advance.

The docker-neo/entrypoint.sh script PATH is wrong

issue 1: the neocompiler.io/docker-neo/entrypoint.sh path is wrong

original export:
export PATH=$PATH:/neo-compiler/neon/bin/Release/netcoreapp1.0/

right export:
export PATH=$PATH:/neo-compiler/neon/bin/Release/netcoreapp1.0/Neon

issue 2: the neocompiler.io/docker-neo/Dockerfile path is wrong

cp /usr/lib/mono/4.5/mscorlib.dll /buildtmp/NeoContract1/bin/Release

=>

cp /usr/lib/mono/4.5/mscorlib.dll /buildtmp/NeoContractTeste/NeoContract1/bin/Release

cd NeoContractTeste

otherwise it will report "hostpolicy.dll was not found".

Compute scripthash from avmhex

The scripthash field should be completed automatically after AVM submission or compiling. There's no need for ABI information at this point, because we already have this feature (scripthash/address calculation) implemented in Utils.
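For reference, the NEO script hash is RIPEMD160(SHA256(avm bytes)); a minimal shell sketch, assuming an OpenSSL build that still provides ripemd160 and a hypothetical local contract.avm file:

# scripthash = RIPEMD160(SHA256(avm)); NEO UIs usually display the bytes reversed
openssl dgst -sha256 -binary contract.avm | openssl dgst -ripemd160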

default Storage and Dynamic Invoke boxes

It's better to have Storage and Dynamic Invoke set by default... since gas is infinite here, it's better to pay for everything, even if not necessary, to avoid problems.

Feature: Allow legacy versions of each compiler

If we could recompile open-source contracts with legacy versions of the compiler, then we would be able to verify if it matches with the script hash on the blockchain. This would greatly improve trust and security for NEP-5 tokens (and other open-source smart contracts). Blockchain explorers would be able to display the actual source code (given that the source code is available) for each deployed smart contract.

[feature] Improve TXs responsive table

  • Status is currently not working and could be updated when the user clicks a button, such as in the Utils for checking balances
  • Create a box allowing users to insert TXs into the table by hand
  • Future: change and modify transaction order

bug losing my contract on compiler screen

Second time this happens: I write my contract on the C# main screen... then I go to the Conversors tab, open the "Disassembly" tool, and when I come back, my contract is gone! It becomes Hello World again.
Please let's try to fix this.

File Sharing problem still exists

Issue #25 still persists, now in the following code:
(This code must have been added after I opened #25, as I was not getting an error here before.)

Fix: Similarly to #25, this can be fixed by using the same local directory near the docker-compose.yml file.

volumes:
  - /var/log/neopython-rest-rpc/:/neo-python/logs

eco-neo-python-rpc-running:
  image: eco-neo-python:latest
  container_name: "eco-neo-python-rpc-running"
  ports:
    - "10332:10332"
  links:
    - eco-neo-csharp-nodes-running
    - eco-neo-python-first-multisig-transfer-running
  command: /bin/bash /opt/start_neopython_rpc.sh
  depends_on:
    - eco-neo-csharp-nodes-running
    - eco-neo-python-first-multisig-transfer-running
  volumes:
    - /var/log/neopython-rest-rpc/:/neo-python/logs

Make Return Types and Deploy Params More obvious

(screenshot)

Currently it shows the raw integer values, and it is not possible to know what they translate to (String, Int, etc.).

It would be nice to show the type in plain text instead of the hex code.

MapExample.cs is not returning the right (format) value

The notifications from StorageMap used to show the same values as in the comments. Now, it looks like the number format has changed for some reason.

Honestly, I don't remember what used to be shown in the RPC output. It is currently returning byte[] as the value format for the .AsBigInteger() method. I think the expected format should be Integer, shouldn't it?

As a last note, the first time I noticed this behaviour was on June 15th.

Support upload of arbitrary NEP-3 ABI files

Sometimes I'd like to be able to test a contract I didn't compile using NeoCompiler Eco, so having a way to manually populate the ABI data in the interface and invoke that contract would be helpful.

all addresses should receive some Neo

Our network was slow, so it made sense to accumulate on a single address. Now gas is generated fast, so it's nice to have easy-to-use addresses full of gas and neo from the beginning of the network (every restart).

Always response with status of "Fault, Break"

It's really a nice website, "https://neocompiler.io", and it saves me hours when testing smart contracts.
But when I deploy a simple smart contract directly on your website and test it with "postman", it always responds with a status of "Fault, Break".
The avm file works well on my local neopy prompt.
Any advice is appreciated.

File Sharing issues when installing NeoCompiler on MacOS

Issue: Running the ./build_everything.sh shell script on macOS will result in an error. Docker tries to use file sharing with a folder that does not exist on macOS, /var/logs/, raising an error and aborting the shell script. This can possibly be fixed by meddling with the Docker for Mac File Sharing properties, but I was not able to do it.

Possible Solution: Change the path in order to work on other OSes.

I got it working by creating a folder called dkr at my home directory and changing every /var/log/ occurrence in the project to ~/dkr/ (a sketch of that substitution follows the volume list below).

- /var/log/neocli-node1:/opt/node1/neo-cli/Logs
- /var/log/neocli-node2:/opt/node2/neo-cli/Logs
- /var/log/neocli-node3:/opt/node3/neo-cli/Logs
- /var/log/neocli-node4:/opt/node4/neo-cli/Logs
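A minimal sketch of that substitution on macOS (the dkr folder comes from the report above; limiting the rewrite to the docker-compose-eco-network folder is an assumption, and sed -i '' is the BSD/macOS form):

# Create the log folder and rewrite the /var/log/ bind mounts to ~/dkr/
mkdir -p ~/dkr
grep -rl '/var/log/' ./docker-compose-eco-network | \
  xargs sed -i '' "s|/var/log/|$HOME/dkr/|g"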

The invoke result really confuses me

This is my test smart contract (success):

using Neo.SmartContract.Framework.Services.Neo;
namespace Neo.SmartContract
{
    public class KeyValue : Framework.SmartContract
    {
        public static byte[] BoolToByteArray(bool input){
            //return new byte[1] { input ? (byte)0 : (byte)1 };
            if(input)
                return new byte[1] { 0x01 };
            else
                return new byte[1] { 0x00 };
        }
        
        public static byte[] Main(string op, byte[] key, byte[] value)
        {
            if (op == "write") {
                Storage.Put(Storage.CurrentContext, key, value);
                //return new byte[1]{(byte)0};
                return BoolToByteArray(true);
            }
            if (op == "reads") {
                return Storage.Get(Storage.CurrentContext, key);
            }
            else{
                //return new byte[1]{(byte)1};
                return BoolToByteArray(false);
            }
        }
    }
}

After deploying it on neocompiler.io, I can invoke the "write" method and the content will be written into storage, and it can be read from storage correctly.

But if I change the return value of the "BoolToByteArray" method...

This is my test smart contract (failed):

using Neo.SmartContract.Framework.Services.Neo;

namespace Neo.SmartContract
{
    public class KeyValue : Framework.SmartContract
    {
        public static byte[] BoolToByteArray(bool input){
            //return new byte[1] { input ? (byte)0 : (byte)1 };
            if(input)
                return new byte[1] { 0x00 };
            else
                return new byte[1] { 0x01 };
        }
        
        public static byte[] Main(string op, byte[] key, byte[] value)
        {
            if (op == "write") {
                Storage.Put(Storage.CurrentContext, key, value);
                //return new byte[1]{(byte)0};
                return BoolToByteArray(true);
            }
            if (op == "reads") {
                return Storage.Get(Storage.CurrentContext, key);
            }
            else{
                //return new byte[1]{(byte)1};
                return BoolToByteArray(false);
            }
        }
    }
}

Deploying it just like the first one, I got a status of "Fault" when invoking the "write" method, and nothing was written into storage.

Add F# compiler support

Let us support the F# compiler.
I will add a few simple F# examples here.
If the team needs any help with testing, let me know :)

Deploy Error: One of the policy filters failed

When attempting to compile and deploy the ICO contract template in C#, the contract compiles fine and generates the avm, but when I try to deploy it in the next step I get this error:

(screenshot)
