Comments (13)
Unfortunately, the problem remains the same. I removed the two GradientBoostingForest nodes and that exception is no longer thrown, but the same exception is now thrown by RandomForest, and it continues running.
Do you need the training file for debugging? If you are fairly sure the problem is caused by the indices, I will use a script to check whether there are broken indices in my training file.
from stacknet.
Apologies for the late response. I am still working on these things. The error in your case was triggered because you have some elements with 'zero' values, like col_index:0.00000. StackNet does not expect zero values in the sparse format. However, with the newer version I will release, this will be taken care of. Apologies for coming back late again. I did not forget you; I am just buried with tasks these days.
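A possible workaround until that release (an assumption on my part, not something confirmed in this thread) is to drop the explicitly stored zeros before writing the sparse file; scipy's `eliminate_zeros()` does this in place on CSR/CSC matrices:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Build a small CSR matrix whose data array explicitly stores a zero,
# i.e. an entry like col_index:0.000000 would appear in the sparse file.
data = np.array([1.0, 0.0, 2.0])
indices = np.array([0, 1, 1])
indptr = np.array([0, 2, 3])
X = csr_matrix((data, indices, indptr), shape=(2, 2))
print(X.nnz)          # 3 -> the explicit zero counts as a stored entry
X.eliminate_zeros()   # remove explicitly stored zeros in place
print(X.nnz)          # 2 -> only the true non-zeros remain
```

After this call, writing the matrix out row by row will not emit any `index:0.000000` pairs.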
Thanks for the help.
Is that with sparse data?
I know why this happens. It is (probably) because the indices in your data are not sorted.
E.g. if you look at your training file, is it like this:
56:1.0 120:1.0 103212:1.0 241234123:1.0 // e.g. column indices increase from left to right
or
120:1.0 103212:1.0 56:1.0 241234123:1.0 // e.g. the order is not respected?
I will update this by raising an error. In the meantime (if that is the error), can you just sort the indices before creating the sparse file?
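For what it's worth, one way to sort the indices with scipy (a sketch, assuming the data is held in a CSR matrix) is the in-place `sort_indices()` method:

```python
import numpy as np
from scipy.sparse import csr_matrix

# One row whose column indices are out of order (120 appears before 56),
# mirroring the "order is not respected" case above.
data = np.array([1.0, 1.0, 1.0])
indices = np.array([120, 56, 300])
indptr = np.array([0, 3])
X = csr_matrix((data, indices, indptr), shape=(1, 301))
print(X.indices)   # [120  56 300] -> unsorted within the row
X.sort_indices()   # sorts column indices within each row, in place
print(X.indices)   # [ 56 120 300]
```

Writing the file from the sorted matrix then produces columns that increase from left to right on every row.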
Hi. I checked a few samples and it seems the column indices are right.
I used ssp.hstack to add new features horizontally.
Can you use this updated function to print your sparse data (with Python)?

```python
import numpy as np
from scipy.sparse import csr_matrix, csc_matrix

def fromsparsetofile(filename, array, deli1=" ", deli2=":", ytarget=None):
    # round-tripping through CSC and back to CSR sorts the column indices
    zsparse = csr_matrix(csc_matrix(array))
    indptr = zsparse.indptr
    indices = zsparse.indices
    data = zsparse.data
    print(" data length %d" % (len(data)))
    print(" indices length %d" % (len(indices)))
    print(" indptr length %d" % (len(indptr)))
    f = open(filename, "w")
    counter_row = 0
    for b in range(0, len(indptr) - 1):
        # if there is a target, print it; else print nothing
        if ytarget is not None:
            f.write(str(ytarget[b]) + deli1)
        for k in range(indptr[b], indptr[b + 1]):
            # NaN values are written as -1; the first entry of a row
            # omits the leading delimiter
            if k == indptr[b]:
                if np.isnan(data[k]):
                    f.write("%d%s%f" % (indices[k], deli2, -1))
                else:
                    f.write("%d%s%f" % (indices[k], deli2, data[k]))
            else:
                if np.isnan(data[k]):
                    f.write("%s%d%s%f" % (deli1, indices[k], deli2, -1))
                else:
                    f.write("%s%d%s%f" % (deli1, indices[k], deli2, data[k]))
        f.write("\n")
        counter_row += 1
        if counter_row % 10000 == 0:
            print(" row : %d " % (counter_row))
    f.close()
```

Example:

```python
fromsparsetofile(path + "file.sparse", my_sparse_array, deli1=" ", deli2=":", ytarget=target)
```
Thanks for your immediate response. I will have a quick check and update you later.
Hm. Are you using an "L1" model somewhere in your mix? These should be put at the end of each level, as they mess up the indices; I have just discovered that. Please send me a subset of the data that replicates this. Thank you.
I use paramsv1.txt, so the "L1" problem does not apply in my case, right?
Please use this link to download the subset.
OK, I have found the error... it is really stupid. I don't know why it happens, but rounding seems to be inconsistent. Can you please add rounding:20
to all tree-based models until I fix this?
EDIT: Even that won't fix it. I need to make a new release.
Thanks. I just tried it, but it does not seem to work either.
I will try the new release ASAP.
@kaz-Anova Hi. Is there anything I can do to help fix this bug?
I had the same problem, and I don't know why.