Model Interpretation

class ai4water.postprocessing.interpret.Interpret(model)[source]

Bases: ai4water.utils.visualizations.Plot

Interprets the ai4water Model.

__init__(model)[source]
Parameters

model – an instance of ai4water’s Model
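
A minimal usage sketch (the "XGBRegressor" model configuration and the training data are assumptions for illustration, not part of this API):

>>> from ai4water import Model
>>> from ai4water.postprocessing.interpret import Interpret
>>> model = Model(model="XGBRegressor")  # hypothetical model configuration
>>> model.fit(data=data)  # 'data' is assumed to be some tabular dataset
>>> interpreter = Interpret(model)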

compare_xgb_f_imp(calculation_method='all', rescale=True, figsize: Optional[tuple] = None, backend: str = 'matplotlib', show: bool = False, **kwargs)[source]

Compares the various feature importance calculation methods that are built into XGBoost.
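
A hedged sketch of calling this method, assuming interpreter wraps a trained XGBoost-based model as in the __init__ example above:

>>> interpreter.compare_xgb_f_imp(calculation_method='all', rescale=True, show=True)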

f_importances_svm(coef, names, save)[source]
feature_importance()[source]
get_enc_var_selection_weights(data='test')[source]

Returns the encoder variable selection weights of the TFT model.
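
A minimal sketch, assuming interpreter wraps a trained TFT model:

>>> enc_weights = interpreter.get_enc_var_selection_weights(data='test')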

interpret_example_tft(example_index, data='test', show=False)[source]

Interprets a single example using the TFT model.

Parameters
  • example_index – index of the example to be explained

  • data – the data containing the example to interpret.

  • show – whether to show the plot or not
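
A sketch of interpreting a single example, assuming interpreter wraps a trained TFT model; example_index=0 is an arbitrary illustration:

>>> interpreter.interpret_example_tft(example_index=0, data='test', show=True)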

interpret_tft(data='test')[source]

Global interpretation of the TFT model.

Parameters

data – the data to use to interpret the model
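
A hedged usage sketch for global interpretation, again assuming a trained TFT model:

>>> interpreter.interpret_tft(data='test')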

property model
plot_feature_importance(importance=None, save=True, show=False, use_xgb=False, max_num_features=20, figsize=None, **kwargs)[source]
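
A hedged sketch of plotting feature importances; use_xgb=True assumes the underlying model is an XGBoost model, and max_num_features=10 is an arbitrary choice:

>>> interpreter.plot_feature_importance(use_xgb=True, max_num_features=10, show=True)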
tft_attention_components(data='test')[source]

Gets attention components of the TFT layer from ai4water's Model.

Parameters

data – the data to use to calculate attention components

Returns

  • dict – dictionary containing attention components of the TFT as numpy arrays. The following four attention components are present in the dictionary:

    • decoder_self_attn: (attention_heads, ?, total_time_steps, 22)

    • static_variable_selection_weights:

    • encoder_variable_selection_weights: (?, encoder_steps, input_features)

    • decoder_variable_selection_weights: (?, decoder_steps, input_features)

  • str – a string indicating which data was used
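
A sketch of retrieving and inspecting the attention components, assuming the dict and str listed under Returns come back as a tuple:

>>> components, data_name = interpreter.tft_attention_components(data='test')
>>> components['encoder_variable_selection_weights'].shape  # (?, encoder_steps, input_features)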