
Clearly both are equally important: 100% necessary. This doesn't account for rarity, nor does it account for wages, agreeability, smell, or any of the other things it isn't trying to measure. You'll need a different metric for that, and if you want to take both into account, you should.





Shapley values try to measure the importance of contributions, and for that, bare necessity isn't a sufficient indicator. I think it comes down to probability. The task of the surgeon is, from a prior perspective, less likely to be fulfilled because it is harder to get hold of a surgeon.

Similarly: What was the main cause of the match getting lit? The match being struck? Or the atmosphere containing oxygen? Both are necessary in the sense that if either hadn't occurred the match wouldn't be lit. But it seems clear that the main cause was the match being struck, because a match being struck is relatively rare, and hence unlikely, while the atmosphere contains oxygen pretty much always.

So I think the contributions calculated for Shapley values should be weighted by the inverse of their prior probabilities. Though it is possible that such probabilities are not typically available in the machine learning context in which SHAP operates.
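To make that concrete, here is a minimal sketch of the idea: exact Shapley values for a toy two-player game (surgeon and nurse, where the operation only succeeds with both), followed by the rarity-weighting proposed above. The priors and the renormalisation are my own illustrative assumptions, not anything SHAP itself does.

    from itertools import combinations
    from math import factorial

    def shapley(players, v):
        """Exact Shapley values for a small cooperative game.

        players: list of player names
        v: function mapping a frozenset of players to a payoff
        """
        n = len(players)
        phi = {p: 0.0 for p in players}
        for p in players:
            others = [q for q in players if q != p]
            for k in range(n):
                for coalition in combinations(others, k):
                    S = frozenset(coalition)
                    weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                    phi[p] += weight * (v(S | {p}) - v(S))
        return phi

    # Toy game: the operation succeeds only if both the surgeon and the
    # nurse take part; any smaller coalition achieves nothing.
    players = ["surgeon", "nurse"]
    def v(coalition):
        return 1.0 if coalition == frozenset(players) else 0.0

    phi = shapley(players, v)
    print(phi)  # {'surgeon': 0.5, 'nurse': 0.5} -- equally important

    # Hypothetical rarity-weighted variant: scale each contribution by the
    # inverse of the prior probability that the player is available, then
    # renormalise. The priors here are made up for illustration.
    prior = {"surgeon": 0.01, "nurse": 0.2}
    raw = {p: phi[p] / prior[p] for p in players}
    total = sum(raw.values())
    weighted = {p: raw[p] / total for p in players}
    print(weighted)  # the surgeon dominates because surgeons are rarer

Plain Shapley values split the credit evenly; the weighted variant shifts most of it to the rarer contributor, which is the intuition about the match and the oxygen.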



