Shark Random Forest vs Weka - slow and low accuracy

Time: 2013-12-16 09:56:20

Tags: c++ machine-learning weka random-forest shark-ml

I would like a faster Random Forest classifier than Weka's, so I just tried Shark (I cannot use a commercial classifier such as wiseRF). I know there is an alternative RF classifier for Weka, but I expected better results from this well-known fast C++ library. My first results are:

Training time: Weka > 170s VS Shark > 168s
Prediction results on the same test set: Weka > 90.1% correctly classified VS Shark > accuracy of 0.510824 (!!!!)

This sounds crazy, so I believe there must be plenty of room to improve these results.

I got the Shark code as recommended, like this:

svn co https://svn.code.sf.net/p/shark-project/code/trunk/Shark
sudo apt-get install cmake cmake-curses-gui libatlas-base-dev libboost-all-dev

I ran cmake successfully with the suggested options:

cmake -DOPT_ENABLE_ATLAS=ON -DOPT_ENABLE_OPENMP=ON

I used the basic example from the doc, "RFTutorial.cpp", and added "trainer.setNTrees(100);" in order to get the same forest size as in my Weka test.
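
For reference, here is a minimal sketch of what that test program looks like: essentially RFTutorial.cpp from the docs with the setNTrees(100) line added. The CSV file name, separator and split point are placeholders for my own data, not the values I actually use.

#include <shark/Data/Csv.h>
#include <shark/Algorithms/Trainers/RFTrainer.h>
#include <shark/ObjectiveFunctions/Loss/ZeroOneLoss.h>
#include <iostream>

using namespace shark;

int main()
{
    // Load a comma-separated file; the class label is in the last column.
    ClassificationDataset data;
    importCSV(data, "train.csv", LAST_COLUMN, ','); // "train.csv" is a placeholder

    // Hold out part of the data as a test set (split point is arbitrary here).
    ClassificationDataset dataTest = splitAtElement(data, 311);

    // Train a random forest with 100 trees, as in the Weka run.
    RFTrainer trainer;
    RFClassifier model;
    trainer.setNTrees(100);
    trainer.train(model, data);

    // Evaluate accuracy on the held-out test set.
    ZeroOneLoss<unsigned int, RealVector> loss;
    Data<RealVector> prediction = model(dataTest.inputs());
    std::cout << "Random Forest on test set accuracy: "
              << 1. - loss.eval(dataTest.labels(), prediction) << std::endl;
}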

I used the suggested basic Makefile and added:

SHARK_ROOT = /home/doxav/Shark
CPPFLAGS   =  -I${BOOST_ROOT} -I${SHARK_ROOT}/include -Wall
CPPFLAGS  += -DNDEBUG -DBOOST_UBLAS_NDEBUG  -O3
LDFLAGS   += -L${SHARK_ROOT}/lib/ -L${BOOST_ROOT}/lib
LDLIBS     = -lshark -lgomp
LDLIBS    += -lboost_serialization -lboost_system -lboost_filesystem -lboost_program_options
CC         =  g++

When running my makefile, I get the following g++ command line:

g++  -I -I/home/xavier/Shark/include -Wall -DNDEBUG -DBOOST_UBLAS_NDEBUG  -O3 -L/home/xavier/Shark/lib/ -L/lib  RFTest.cpp  -lshark -lgomp -lboost_serialization -lboost_system -lboost_filesystem -lboost_program_options -o RFTest

I had to adapt the CSV file I use in Weka to make it run with Shark, because it apparently does not accept strings:

Weka =>
225,#225,138.6,-648,225,0.410451,#2,0,0,0.0256,0.0256,0.15411,?,?,0.045524,0.006503,0.002223,0.782222,1.328889,?,1.017778,0.617778,0,-11,?,-6,-5,176,116,-1430,0,0,0.170455,0.170455,0.136174,?,?,0.041649,0.00595,0.001192,299,269,-659,0,0,0.006689,-0.143509,0.23395,?,?,0.015899,-0.005781,0.002956,?,?,?,?,?,?,?,?,?,?,?,?,?,229,139,-653,0,0,0.026201,0.026201,0.093029,?,?,0.047562,0.006795,0.000937,139,79,-13945,0,0,0,0,-0.094604,?,?,?,?,0.001049,#225

Shark (I removed "?", which I use for unknown values, and "#", which I use to force Weka to interpret some numeric values as nominal out of the box) =>
225,225,138.6,-648,225,0.410451,2,0,0,0.0256,0.0256,0.15411,,,0.045524,0.006503,0.002223,0.782222,1.328889,,1.017778,0.617778,0,-11,,-6,-5,176,116,-1430,0,0,0.170455,0.170455,0.136174,,,0.041649,0.00595,0.001192,299,269,-659,0,0,0.006689,-0.143509,0.23395,,,0.015899,-0.005781,0.002956,,,,,,,,,,,,,,229,139,-653,0,0,0.026201,0.026201,0.093029,,,0.047562,0.006795,0.000937,139,79,-13945,0,0,0,0,-0.094604,,,,,0.001049,225
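
If it helps, a quick way to double-check how Shark actually parses the converted file is to print a few dataset statistics right after import. This is only a sketch assuming the Shark 3.x Data API; the file name is a placeholder.

#include <shark/Data/Csv.h>
#include <iostream>

using namespace shark;

int main()
{
    // "train_shark.csv" is a placeholder for the converted file shown above.
    ClassificationDataset data;
    importCSV(data, "train_shark.csv", LAST_COLUMN, ',');

    // Print what Shark ended up with after parsing the comma-separated file.
    std::cout << "data points:     " << data.numberOfElements() << "\n"
              << "input dimension: " << inputDimension(data)    << "\n"
              << "classes:         " << numberOfClasses(data)   << "\n";
}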

0 Answers:

No answers yet.