v0.60
This is a stable release of version 0.6.
tqchen released this on 29 Jul 18:25
Changes
Version 0.5 is skipped due to major improvements in the core
Major refactor of core library.
Goal: more flexible and modular code as a portable library.
Switch to use of c++11 standard code.
Random number generator defaults to std::mt19937.
Share the data loading pipeline and logging module from dmlc-core.
Enable registry pattern to allow optional plugin of objectives, metrics, tree constructors, and data loaders.
Future plugin modules can be put into xgboost/plugin and register back to the library.
Replace most raw pointers with smart pointers for RAII safety.
Add official option for the approximate algorithm to the tree_method parameter.
Change default behavior to prefer the faster algorithm.
User will get a message when the approximate algorithm is chosen.
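A minimal usage sketch from the Python package (the parameter dict is illustrative; the commented-out training call assumes a `dtrain` DMatrix you would construct separately):

```python
# Pinning the tree construction algorithm explicitly avoids the
# auto-switch message: "exact" forces the exact greedy algorithm,
# "approx" the approximate one.
params = {
    "objective": "binary:logistic",
    "max_depth": 6,
    "tree_method": "approx",  # or "exact" / "auto"
}
# bst = xgboost.train(params, dtrain)  # training call sketched, not run here
```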
Change library name to libxgboost.so
Backward compatibility
The binary buffer file is not backward compatible with previous version.
The model file is backward compatible on 64 bit platforms.
The model file is compatible between 64/32 bit platforms (not yet tested).
External memory version and other advanced features will be exposed to the R library as well on Linux.
Previously some of these features were blocked due to C++11 and threading limits.
The Windows version is still blocked because Rtools does not support std::thread.
rabit and dmlc-core are maintained through git submodule
Anyone can open PR to update these dependencies now.
Improvements
Rabit and xgboost libs are now thread-safe and use thread-local PRNGs.
This could fix some of the previous problems when running xgboost on multiple threads.
JVM Package
Enable xgboost4j for Java and Scala.
XGBoost distributed now runs on Flink and Spark.
Support model attribute listing for metadata.
Support callback API
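The general shape of a per-iteration callback hook can be sketched in plain Python (a toy training loop, not the actual xgboost callback signature):

```python
def train(num_rounds, callbacks=()):
    """Toy training loop illustrating a callback hook per boosting round."""
    history = []
    for i in range(num_rounds):
        # ... one boosting round would happen here ...
        for cb in callbacks:
            cb(iteration=i, history=history)
    return history

def record(iteration, history):
    """Example callback: remember which rounds ran."""
    history.append(iteration)

train(3, callbacks=[record])  # history collects [0, 1, 2]
```

Hooks like this let users implement logging, checkpointing, or early stopping without modifying the training loop itself.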
Support new booster DART (dropout in tree boosting).
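A hedged sketch of selecting DART from the Python package; the numeric values below are illustrative, not recommendations:

```python
# DART drops a random subset of trees at each boosting round, which
# acts as regularization against over-specialized early trees.
params = {
    "booster": "dart",         # select the DART booster
    "sample_type": "uniform",  # how dropped trees are sampled
    "normalize_type": "tree",  # weight normalization after dropout
    "rate_drop": 0.1,          # fraction of trees dropped each round
    "skip_drop": 0.5,          # probability of skipping dropout in a round
}
# bst = xgboost.train(params, dtrain)  # training call sketched, not run here
```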
Add CMake build system