Description
Hi there,
is there an established way to obtain SHAP feature importance using shapper?
Reading this: https://christophm.github.io/interpretable-ml-book/shap.html#shap-feature-importance
...I would guess that looping over shapper::individual_variable_effect and averaging the absolute attributions per vname would do the trick (see the sketch below).
Am I wrong?
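To make the question concrete, here is a minimal sketch of what I have in mind. It assumes a DALEX explainer built beforehand and that the result of individual_variable_effect contains `_vname_` and `_attribution_` columns (the latter shows up in the error message below); shap_importance is just my own helper name, not part of shapper.

```r
library(DALEX)
library(shapper)

# `explainer` is assumed to exist already, e.g.
# explainer <- DALEX::explain(model, data = trainX, y = trainY)

shap_importance <- function(explainer, new_observations) {
  # Call individual_variable_effect() one observation at a time
  # (passing several rows at once errors for me, see below).
  per_obs <- lapply(seq_len(nrow(new_observations)), function(i) {
    ive <- shapper::individual_variable_effect(
      explainer,
      new_observation = new_observations[i, , drop = FALSE]
    )
    data.frame(
      vname       = ive[["_vname_"]],
      attribution = ive[["_attribution_"]]
    )
  })
  all_attr <- do.call(rbind, per_obs)

  # SHAP feature importance: mean absolute attribution per variable
  imp <- aggregate(
    attribution ~ vname,
    data = all_attr,
    FUN  = function(a) mean(abs(a))
  )
  names(imp)[2] <- "mean_abs_attribution"
  imp[order(-imp$mean_abs_attribution), ]
}

# usage with my own (here hypothetical) objects:
# shap_importance(explainer, testX[1:5, ])
```

Sorting by the mean absolute attribution would then give the ranking described in the linked chapter.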
Is there any plan to integrate the original Python functions, such as summary_plot, to obtain SHAP feature importance?
By the way, when I pass multiple new observations to individual_variable_effect, e.g. new_observation = testX[1:5, ], I get an error:
Error in `$<-.data.frame`(`*tmp*`, "_attribution_", value = c(0, -0.365675633989662, : replacement has 140 rows, data has 70
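For completeness, this is roughly how I call it (explainer and testX are my own objects, shown here only as placeholders):

```r
# Multi-row call that triggers the error above for me
ive_multi <- shapper::individual_variable_effect(
  explainer,
  new_observation = testX[1:5, ]
)
```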