vivounicorn / xgboost

This project forked from dmlc/xgboost


eXtreme Gradient Boosting (GBDT or GBRT) Library for large-scale (distributed) machine learning

License: Other

Makefile 0.98% R 13.70% C++ 67.98% C 2.08% CSS 0.39% TeX 0.09% Shell 0.61% Python 10.68% Java 3.48%


XGBoost: eXtreme Gradient Boosting

An optimized, general-purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version. It implements machine learning algorithms under the gradient boosting framework, including the generalized linear model and gradient boosted regression trees (GBDT). XGBoost can also be distributed and scales to even larger data.
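
The core idea of gradient boosting mentioned above can be illustrated with a minimal numpy sketch (this is illustrative only, not xgboost's implementation): each round fits a weak learner to the negative gradient of the loss and adds it to the ensemble.

```python
# Minimal sketch of the gradient-boosting idea (not xgboost's implementation):
# each stage fits a learner to the negative gradient of squared error
# (the residuals) and adds it to the ensemble with a learning rate.
# To keep it tiny, the "weak learner" here is just a constant.
import numpy as np

def boost_constant_stumps(y, n_rounds=50, lr=0.5):
    pred = np.zeros_like(y, dtype=float)
    for _ in range(n_rounds):
        residual = y - pred           # negative gradient of 1/2*(y - pred)^2
        pred += lr * residual.mean()  # weakest possible learner: a constant
    return pred
```

With a constant learner the predictions converge to the mean of the targets; real boosting libraries replace the constant with regression trees fit to the residuals.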

Contributors: https://github.com/dmlc/xgboost/graphs/contributors

Documentation: Documentation of xgboost

Issues Tracker: https://github.com/dmlc/xgboost/issues

Please join XGBoost User Group to ask questions and share your experience on xgboost.

Distributed Version: Distributed XGBoost

Highlights of use cases: Highlight Links

What's New

Features

  • Sparse feature format:
    • The sparse feature format allows easy handling of missing values and improves computational efficiency.
  • Push the limit on single machine:
    • Efficient implementation that optimizes memory and computation.
  • Speed: XGBoost is very fast
    • In demo/higgs/speedtest.py, on the Kaggle Higgs data, it is faster (on our machine, 20 times faster using 4 threads) than sklearn.ensemble.GradientBoostingClassifier
  • The gradient boosting algorithm is laid out to support user-defined objectives
  • Distributed and portable
    • The distributed version of xgboost is highly portable and can be used on different platforms
    • It inherits all the optimizations made in single-machine mode, and maximally utilizes resources using both multi-threading and distributed computing.
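
The "user-defined objective" hook above works by supplying a function that returns the gradient and hessian of the loss with respect to the current predictions. A minimal sketch of a logistic-loss objective in plain numpy (in xgboost's Python API the function receives (preds, dtrain) and reads labels via dtrain.get_label(); here arrays are passed directly for clarity):

```python
import numpy as np

# Sketch of a custom logistic-loss objective (plain numpy, for illustration;
# xgboost would call this once per boosting round with the raw margin scores).
def logregobj(preds, labels):
    p = 1.0 / (1.0 + np.exp(-preds))  # sigmoid of the raw margin
    grad = p - labels                 # first derivative of the log loss
    hess = p * (1.0 - p)              # second derivative of the log loss
    return grad, hess
```

Returning per-example gradients and hessians is all the booster needs, which is why arbitrary twice-differentiable objectives plug in without touching the tree-growing code.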

Build

  • Run bash build.sh (you can also type make)

  • If you have a C++11 compiler, it is recommended to type make cxx11=1

    • C++11 is not used by default
  • If your compiler does not come with OpenMP support, it will print a warning telling you that the code will compile in single-thread mode, and you will get a single-thread xgboost

  • You may get an error: -lgomp is not found

    • You can type make no_omp=1; this will give you a single-thread xgboost
    • Alternatively, you can upgrade your compiler to compile the multi-thread version
  • Windows(VS 2010): see windows folder

    • In principle, add all the cpp files listed in the Makefile to the project, then build
  • OS X:

    • For users who want OpenMP support using Homebrew, run brew update (this ensures you install gcc-4.9 or above) and brew install gcc. Once it is installed, edit the Makefile by replacing:
    export CC  = gcc
    export CXX = g++
    

    with

    export CC  = gcc-4.9
    export CXX = g++-4.9
    

    Then run bash build.sh normally.

    Alternatively, if your gcc is installed under /usr/local/bin, replace:

    export CC  = gcc
    export CXX = g++
    

    with

    export CC  = /usr/local/bin/gcc
    export CXX = /usr/local/bin/g++
    

    Then run bash build.sh normally. This solution is given by Phil Culliton.
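
To decide up front whether you need make no_omp=1, you can probe whether your compiler accepts the OpenMP flag. A minimal sketch (a hypothetical helper, not part of xgboost, assuming gcc is on your PATH):

```python
# Hedged helper (not part of xgboost): check whether a compiler accepts
# -fopenmp, to decide between the default build and `make no_omp=1`.
import os
import shutil
import subprocess
import tempfile

def has_openmp(compiler="gcc"):
    if shutil.which(compiler) is None:
        return False  # compiler not installed at all
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "check.c")
        with open(src, "w") as f:
            f.write("int main(void){return 0;}\n")
        # Try to compile a trivial program with the OpenMP flag enabled.
        result = subprocess.run(
            [compiler, "-fopenmp", src, "-o", os.path.join(d, "check")],
            capture_output=True,
        )
        return result.returncode == 0
```

Note that on OS X the stock compiler behind the gcc name may reject -fopenmp, which is exactly the case the Homebrew instructions above address.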

Build with HDFS and S3 Support

  • To build xgboost for use with HDFS/S3 support and distributed learning, it is recommended to build with dmlc-core, following these steps:
    • git clone https://github.com/dmlc/dmlc-core
    • Follow the instructions in dmlc-core/make/config.mk to compile libdmlc.a
    • In root folder of xgboost, type make dmlc=dmlc-core
  • This will allow xgboost to directly load data and save models from/to HDFS and S3
    • Simply replace the filename with the prefix s3:// or hdfs://
  • This build of xgboost can be used for distributed learning
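
The prefix-based path handling above can be illustrated with a tiny sketch (illustrative only: the real routing happens inside the dmlc-core I/O layer, and pick_filesystem is a hypothetical name):

```python
# Illustrative sketch (not dmlc-core code): a dmlc-style I/O layer dispatches
# on the URI scheme, so the same filename argument works for local files,
# HDFS, and S3 once the library is built with the matching backends.
def pick_filesystem(path):
    if path.startswith("hdfs://"):
        return "hdfs"
    if path.startswith("s3://"):
        return "s3"
    return "local"
```

This is why "simply replace the filename with the prefix" is the entire user-facing change: input data and model paths are plain strings, and the backend is chosen per path.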

Version

  • This is version xgboost-0.3; the code has been refactored from 0.2x to be cleaner and more flexible
  • This version of xgboost is not compatible with 0.2x, due to the large number of changes in code structure
    • This means models and buffer files from previous versions cannot be loaded in xgboost-0.3
  • For legacy 0.2x code, refer to Here
  • Change log in CHANGES.md

XGBoost in Graphlab Create
