Welcome to EXPAI

When you create a free account, you get access to a sample project containing a toy example model that you can use to create your first explanations without any further configuration.

Understanding the project

The example behind this project is fairly simple. Imagine we are a fintech company trying to classify our clients into 5 different groups. For this task, we have created an ML model that predicts, with high accuracy, which group each client belongs to.

Of course, the data and the model have been simplified to illustrate the potential of Explainable AI rather than a proper ML training process.

Now that we understand the use case we are trying to solve, it is time to open your account and the welcome project inside it. You will see that it contains:

  • The ML model

  • The dataset

  • 3 example explanations that we have created for you

This project is protected, so you will not be able to modify its contents. Remember that you can test out EXPAI using your own models and data if you create a project from scratch.

Motivating the need for Explainable AI

To motivate the need for Explainable AI and the explanations we will generate later, let me present four different profiles within the ML pipeline and their potential needs:

  • Data Scientists: probably need to verify that learning is optimal and that predictions follow the business rules they were asked to automate. If predictions do not, they must have arguments to support the findings of their ML modeling process.

  • Business Leaders: require non-technical insights to verify that the model they are going to acquire works as expected.

  • Decision Makers: if someone needs to make relevant decisions based on the outputs of the model, they may ask themselves why the predictions were made and which features are the most actionable to achieve their desired objective.

  • Final users: imagine predictions were used to determine whether a client pays fees for their credit card. What if they ask for the reasons behind the decision? What if they want to know what they need to change in order to get a fee-free credit card? We need Explainable AI to answer these questions and ensure transparency throughout the process.

Obtaining valuable insights with EXPAI

Recall that the project is ready to use and that you do not need any specific ML skills to start generating explanations. We will now motivate some of the insights that we can get from our ML model using Explainable AI. However, there is much more for you to explore and discover.

How to generate your first explanations

1. Visit the "Explanations" section within the project and click on "New explanation"

2. Select the model and your target class

The first step in the generation process is selecting the model by clicking on it. Then, we need to choose the group we want to explain the predictions with respect to. Depending on this selection, we will get different explanations. For instance, if we select "Group 0", we will see the most important variables for classifying a sample into this group; these will potentially differ across groups. You will be able to modify your selection later.

3. Click on the dataset

In this step, we select the data we want to use for the explanation. For this, we click on the dataset item and then on "Next".

Optional: We can filter our data using the table on the right to explain how our model behaves on subgroups represented in the data. For instance, we may generate explanations for men and women separately to spot potential differences.
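EXPAI performs this filtering for you in the interface. If it helps to see the idea in code, here is a minimal sketch of the same subgroup split using pandas on a hypothetical client table (the column names below are illustrative, not the sample project's actual schema):

```python
import pandas as pd

# Hypothetical client table; columns are illustrative only.
clients = pd.DataFrame({
    "Annual Income": [42000, 61000, 35000, 78000],
    "Gender": ["Male", "Female", "Female", "Male"],
    "Age": [34, 29, 51, 45],
})

# The same split the UI filter performs: one subset per subgroup,
# so each subset can be explained (and compared) on its own.
men = clients[clients["Gender"] == "Male"]
women = clients[clients["Gender"] == "Female"]
```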

4. Generate any explanation

In this last step, you get to choose which explanation you wish to generate for the selected model and data. You can see more details about each of the explanations in the "Our explanations" section of the left menu. In the following section we present some examples and their interpretation.

Note: you can change your output class at any moment using the button at the bottom.

Example Explanations

Finally, we present the explanations you will find as examples and give some intuition behind them to help you get started with Explainable AI.

Most important variables for Group 0

In this first explanation, we represent the most important features to determine whether a client belongs to Group 0 (for this, we selected Group 0 in step 2).
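EXPAI computes this ranking for you, and its exact method is not detailed in this guide. As a rough intuition for what such a ranking means, the sketch below uses permutation importance from scikit-learn on made-up data and a made-up model; it is an illustration, not EXPAI's implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Toy stand-in for the sample project: three made-up features and 5 groups.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                 # e.g. income, age, tenure (made up)
groups = rng.integers(0, 5, size=500)         # the 5 client groups
is_group_0 = (groups == 0).astype(int)        # one-vs-rest target for "Group 0"

# Rank features by how much shuffling each one hurts the "is Group 0" model.
model = RandomForestClassifier(random_state=0).fit(X, is_group_0)
result = permutation_importance(model, X, is_group_0, n_repeats=10, random_state=0)

for name, score in zip(["annual_income", "age", "tenure"], result.importances_mean):
    print(f"{name}: {score:.3f}")
```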

We can determine that Annual Income is the most relevant feature for Group 0. A question that naturally arises now is: how does it impact the prediction? For which values of Annual Income do clients belong to Group 0 according to our model? To answer this, we will generate the following explanation.

Impact of Annual Income for Group 0

With this new explanation we can fully understand what our model learnt about Group 0. The most relevant feature is having an Annual Income between $40k and $60k.

The y-axis represents the probability of belonging to Group 0 against all other classes for each value that Annual Income can take (x-axis).

Below it, we see a histogram showing how many data points we are considering for each value of Annual Income. We may expect results in regions with more data to be more meaningful.
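A curve like this can be read as a partial-dependence-style plot. Again, EXPAI's own computation is not detailed here; the following sketch on toy data simply illustrates the kind of curve being described, i.e. the predicted probability of one class as a single feature varies:

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import PartialDependenceDisplay

# Toy data: 5 fake groups driven mostly by annual income, so the curve for
# one class has an obvious "relevant range" of income values.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(20_000, 120_000, 500),   # annual income
    rng.uniform(18, 70, 500),            # age
])
y = (X[:, 0] // 20_000).astype(int) % 5  # fake group labels 0..4

clf = RandomForestClassifier(random_state=0).fit(X, y)

# Probability of the chosen class (target=0) as annual income varies,
# averaging over the other feature -- the same reading as the y-axis above.
PartialDependenceDisplay.from_estimator(
    clf, X, features=[0], feature_names=["annual_income", "age"],
    target=0, response_method="predict_proba",
)
plt.show()
```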

Why is a certain client classified as Group 0?

In the "Prediction Explanation", we get to choose for which sample we want to understand its prediction. We select the second entry in the dataset and visualize the output.

This explanation shows that the average probability of belonging to Group 0 is 0.39, while this sample got 0.94 (it is very likely that it comes from Group 0). The contribution of each feature is plotted below. In this case, we see that the explanation captures the interaction between variables: the impact they have when appearing together is larger than when we consider them separately.

The most relevant feature is having an Annual Income of $62k while being male. If we recall our previous insights, this value of Annual Income is within the range that our model considers very characteristic of this group.
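EXPAI does not state here which attribution technique it uses. For intuition, SHAP values (from the open-source shap library) produce the same kind of output: a base value equal to the average prediction plus per-feature contributions for the chosen sample. A minimal sketch on toy data, assuming shap is installed:

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in: a binary "belongs to Group 0" label on made-up features.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

model = RandomForestClassifier(random_state=0).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:2])   # contributions for two samples

# The base value plays the role of the average probability (0.39 above);
# the contributions explain why one sample's probability differs from it.
print("base value (average prediction):", explainer.expected_value)
print("per-feature contributions, shape:", np.shape(shap_values))
```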

What comes next?

Our team has been happy to guide you this far. However, now it is time to say goodbye and let you explore our tool. If you want some ideas to try next, here are a few we came up with:

  • Repeat the process for a different group. What differences do you spot?

  • What happens if you change the annual income of the second sample (the one we explained before) to be outside the relevant range for Group 0? You can try our What If tool for this; a rough sketch of the idea follows this list.

  • Is gender relevant for some predictions? Should it be?
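As a rough illustration of the what-if idea above, outside EXPAI and on a toy model, you can edit a single feature of a sample and compare the predicted probabilities before and after:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical what-if check on a toy model (same made-up data as before).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(20_000, 120_000, 500),   # annual income
    rng.uniform(18, 70, 500),            # age
])
y = (X[:, 0] // 20_000).astype(int) % 5  # fake group labels 0..4

model = RandomForestClassifier(random_state=0).fit(X, y)

sample = X[1].copy()                          # "the second sample"
before = model.predict_proba([sample])[0]
sample[0] = 55_000                            # move annual income to a new range
after = model.predict_proba([sample])[0]

print("P(group) before:", before.round(2))
print("P(group) after: ", after.round(2))
```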

Also, remember we have a tutorial to create a new project completely from scratch to try out our tool with your custom models and datasets.

If you are a developer, you may want to try our API directly to use EXPAI in production environments.
