public interface SHAP<T>
SHAP leverages local methods designed to explain a prediction f(x) based on a single input x. The local methods are defined as any interpretable approximation of the original model. In particular, SHAP employs additive feature attribution methods.
SHAP values attribute to each feature the change in the expected model prediction when conditioning on that feature. They explain how to get from the base value E[f(z)], which would be predicted if we did not know any features, to the current output f(x).
In game theory, the Shapley value is the average expected marginal contribution of one player after all possible combinations have been considered.
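As an illustration of this game-theoretic definition, the following is a minimal sketch (not part of the Smile API; all names are hypothetical) that computes exact Shapley values for a small cooperative game by averaging each player's marginal contribution over all orderings of the players:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.ToDoubleFunction;

public class ShapleyDemo {
    // Exact Shapley values: average each player's marginal contribution
    // over all n! orderings of the players. Feasible only for small n.
    public static double[] shapley(int n, ToDoubleFunction<Set<Integer>> value) {
        double[] phi = new double[n];
        List<int[]> orderings = new ArrayList<>();
        permute(new int[n], 0, new boolean[n], orderings);
        for (int[] order : orderings) {
            Set<Integer> coalition = new HashSet<>();
            for (int player : order) {
                double before = value.applyAsDouble(coalition);
                coalition.add(player);
                double after = value.applyAsDouble(coalition);
                phi[player] += after - before;  // marginal contribution
            }
        }
        for (int i = 0; i < n; i++) phi[i] /= orderings.size();
        return phi;
    }

    private static void permute(int[] cur, int depth, boolean[] used, List<int[]> out) {
        if (depth == cur.length) { out.add(cur.clone()); return; }
        for (int i = 0; i < cur.length; i++) {
            if (!used[i]) {
                used[i] = true;
                cur[depth] = i;
                permute(cur, depth + 1, used, out);
                used[i] = false;
            }
        }
    }

    public static void main(String[] args) {
        // A 2-player game: v({}) = 0, v({0}) = 1, v({1}) = 2, v({0,1}) = 4.
        double[] phi = shapley(2, s ->
                s.isEmpty() ? 0.0 : (s.size() == 2 ? 4.0 : (s.contains(0) ? 1.0 : 2.0)));
        // phi[0] = 1.5, phi[1] = 2.5; the values sum to v({0,1}) = 4.
        System.out.printf("%.1f %.1f%n", phi[0], phi[1]);
    }
}
```

Note the efficiency property visible in the example: the Shapley values sum to the value of the grand coalition, just as SHAP values sum to the difference between f(x) and the base value.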
Modifier and Type | Method and Description |
---|---|
`default double[]` | `shap(java.util.stream.Stream<T> data)` Returns the average of absolute SHAP values over a data set. |
`double[]` | `shap(T x)` Returns the SHAP values. |
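The relationship between the two methods can be sketched as follows. This is an illustrative re-implementation of the interface contract under the assumption that the default method simply averages the per-instance absolute SHAP values; it is not Smile's actual source, and `SimpleShap` is a hypothetical name:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch of the SHAP interface contract: shap(T) is the primitive that
// implementations provide; the default stream method aggregates it by
// averaging absolute SHAP values over a data set (assumed behavior).
interface SimpleShap<T> {
    double[] shap(T x);

    default double[] shap(Stream<T> data) {
        List<double[]> all = data.map(this::shap).collect(Collectors.toList());
        int p = all.get(0).length;
        double[] mean = new double[p];
        for (double[] v : all)
            for (int i = 0; i < p; i++)
                mean[i] += Math.abs(v[i]);
        for (int i = 0; i < p; i++) mean[i] /= all.size();
        return mean;
    }
}
```

Averaging absolute values gives a global importance score per feature: individual SHAP values are signed, so averaging them directly would let positive and negative attributions cancel.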
double[] shap(T x)

Returns the SHAP values. For classification, the returned array is of length p x k, where p is the number of features and k is the number of classes. The first k elements are the SHAP values of the first feature over the k classes, respectively; the remaining features follow accordingly.

x - an instance.

default double[] shap(java.util.stream.Stream<T> data)

Returns the average of absolute SHAP values over a data set.
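The flat p x k layout described above can be indexed with a small helper. `ShapIndex` and `shapOf` are hypothetical names for illustration, not part of the Smile API:

```java
// Hypothetical helper showing how to read the flattened SHAP array returned
// for classification: with k classes, the value for feature i and class j
// sits at index i * k + j (features are the outer dimension).
public class ShapIndex {
    public static double shapOf(double[] shapValues, int k, int feature, int clazz) {
        return shapValues[feature * k + clazz];
    }
}
```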