Posts
Interpreting Isolation Forest’s predictions — and not only
The problem: how to interpret Isolation Forest’s predictions. More specifically, how to tell which features are contributing more to the predictions. Since Isolation Forest is not a typical Decision Tree (see Isolation Forest characteristics here), after some research, I ended up with three possibl...
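As a rough illustration of the problem the post describes (attributing an Isolation Forest's anomaly scores to individual features), here is a minimal sketch using SHAP's TreeExplainer on a scikit-learn IsolationForest. This is only one possible approach, not necessarily one of the three the post settles on, and the toy data and variable names are assumptions for the example.

```python
# Minimal sketch: per-feature attribution of Isolation Forest scores via SHAP.
# Assumes scikit-learn and the shap package are installed; data is synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest
import shap

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))   # toy dataset with 4 features
X[:10] += 6                     # make the first 10 rows obvious outliers

model = IsolationForest(random_state=0).fit(X)

# TreeExplainer decomposes each sample's score into per-feature contributions
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)   # shape: (n_samples, n_features)

# Features with the largest mean |SHAP value| drive the predictions the most
print(np.abs(shap_values).mean(axis=0))
```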
Machine Learning Interpretability — Shapley Values with PySpark
Mar 20, 2021