feature_importances deems gender unimportant

Hi all, I’m very new to all of this, but I just can’t get my head around this. It is clear that Sex has a big impact on prediction accuracy: simply predicting that all females survive gives about 0.76 accuracy. I’m using scikit-learn and have been experimenting with GradientBoostingClassifier. When I train on 80% of the training set and use the other 20% for testing, I usually get a score of around 0.81.

classifier = GradientBoostingClassifier(n_estimators=700)
classifier.fit(X_train, y_train)
print(classifier.feature_importances_)

What I don’t understand is that feature_importances_ shows me something like this:

[ 0.04224973  0.03318767  0.42367892  0.05388107  0.03522496  0.07889192
  0.01812447  0.11699406  0.0070662   0.01845042  0.01065627  0.11439321
  0.03316843  0.01403267]

These are the features I used for the above run (and no, I didn’t move any features…
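For context, a minimal, self-contained sketch of the workflow described above. The real Titanic feature matrix isn't shown in the post, so this uses a synthetic 14-feature dataset from make_classification as a stand-in; only the split ratio, n_estimators=700, and the use of feature_importances_ come from the post itself.

```python
# Sketch of the setup described in the post, on a synthetic dataset
# (the actual Titanic features are not reproduced here).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# 14 features to mirror the length of the importance array in the post.
X, y = make_classification(n_samples=400, n_features=14,
                           n_informative=3, random_state=0)

# 80/20 train/test split, as in the post.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

classifier = GradientBoostingClassifier(n_estimators=700)
classifier.fit(X_train, y_train)

print(classifier.score(X_test, y_test))    # hold-out accuracy
print(classifier.feature_importances_)     # one value per feature
```

Note that the importances are normalized: the 14 values always sum to 1, so each entry is a relative share of the total split gain, not an absolute measure of a feature's predictive power on its own.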

Link to Full Article: feature_importances deems gender unimportant
