
A Non-Technical Guide to Interpreting SHAP Analyses


With interpretability becoming an increasingly important requirement for machine learning projects, there's a growing need for the complex outputs of techniques such as SHAP to be communicated to non-technical stakeholders.

As we shall see in the sections below, SHAP values reveal interesting insights into how the input variables are impacting the predictions of the machine learning model, both at the level of individual instances and across the population as a whole. [ full-size image]Force plots are useful for examining explanations for multiple instances of the data at once, as their compact construction allows for outputs to be stacked vertically for ease of comparison (Figure 6). [ full-size image] The goal of global interpretation methods is to describe the expected behaviour of a machine learning model with respect to the whole distribution of values for its input variables.
