This post explores the expected-value framework, using inference data to understand decision-making under uncertainty in ML contexts. More specifically, it demonstrates a business-oriented optimization of the classification threshold for a binary classification model used in an email marketing campaign.
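The core idea can be sketched in a few lines: assign a (hypothetical) profit to each cell of the confusion matrix and sweep the threshold to maximize expected profit. The payoff values, the toy scores, and the `expected_profit` helper below are all assumptions for illustration, not the post's actual data or model.

```python
import numpy as np

# Hypothetical payoffs for an email campaign (assumed values):
# emailing a converter earns revenue, emailing a non-converter costs money.
BENEFIT_TP = 10.0   # profit from emailing a customer who converts
COST_FP = -1.0      # cost of emailing a customer who ignores the message

def expected_profit(scores, labels, threshold):
    """Total profit if we email everyone scored at or above the threshold."""
    contacted = scores >= threshold
    tp = np.sum(contacted & (labels == 1))
    fp = np.sum(contacted & (labels == 0))
    return tp * BENEFIT_TP + fp * COST_FP

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)
# Toy model scores: correlated with the label, plus noise.
scores = np.clip(labels * 0.4 + rng.random(1000) * 0.6, 0, 1)

# Sweep thresholds and keep the profit-maximising one.
thresholds = np.linspace(0, 1, 101)
profits = [expected_profit(scores, labels, t) for t in thresholds]
best = thresholds[int(np.argmax(profits))]
print(f"profit-maximising threshold: {best:.2f}")
```

Because the false-positive cost here is small relative to the true-positive benefit, the profit-maximising threshold is typically well below the default 0.5.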
[Read More]
Understanding ML Models with Counterfactual Explanations
A Deep Dive into Interpretable AI
In the era of complex machine learning models, understanding why a model makes certain predictions has become increasingly important. While traditional approaches focus on global model interpretability, counterfactual explanations offer a unique perspective by answering the question: “What changes would be needed to achieve a different outcome?”
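That question can be made concrete with a toy search: given a model and a rejected input, nudge a feature until the prediction flips. The `simple_model` below is an invented stand-in (not any specific library or the post's model), and the greedy single-feature search is only a minimal sketch of the idea.

```python
import numpy as np

def simple_model(x):
    """Toy loan-approval rule (an assumption for illustration):
    approve when income + 2 * credit_score exceeds a cutoff."""
    income, credit = x
    return int(income + 2 * credit > 100)

def counterfactual(x, step=1.0, max_iter=500):
    """Greedy search: raise one feature step by step until the prediction flips."""
    x = np.asarray(x, dtype=float)
    original = simple_model(x)
    cand = x.copy()
    for _ in range(max_iter):
        if simple_model(cand) != original:
            return cand          # smallest change found that flips the outcome
        cand[1] += step          # nudge the higher-weighted feature (credit)
    return None

x = [40.0, 20.0]                 # a rejected applicant
cf = counterfactual(x)
print("counterfactual:", cf)     # the credit score needed for approval
```

The counterfactual reads directly as actionable advice: "raise your credit score to 31 and the application would be approved."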
[Read More]
An Introduction to Recommender Systems
Collaborative Filtering
The primary goal of a recommender system is to increase “product” sales: recommender systems are, after all, utilised by merchants to increase their profits. This goal, however, is often achieved in ways that are less obvious than...
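The collaborative-filtering idea from the subtitle can be sketched in a few lines: predict a user's missing rating from users with similar rating histories. The ratings matrix and the unweighted cosine similarity below are toy assumptions, not a production recipe.

```python
import numpy as np

# Toy user-item ratings (0 = unrated); all values are assumptions.
ratings = np.array([
    [5.0, 4.0, 0.0, 1.0],   # user 0: rating for item 2 is unknown
    [4.0, 5.0, 2.0, 1.0],   # user 1: similar tastes to user 0
    [1.0, 1.0, 5.0, 4.0],   # user 2: roughly opposite tastes
])

def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Predict user 0's rating for item 2 as a similarity-weighted average
# of the other users' ratings for that item.
target, item = 0, 2
others = [u for u in range(ratings.shape[0])
          if u != target and ratings[u, item] > 0]
sims = np.array([cosine(ratings[target], ratings[u]) for u in others])
pred = sims @ ratings[others, item] / sims.sum()
print(f"predicted rating: {pred:.2f}")
```

The prediction lands much closer to the similar user's rating (2) than to the dissimilar user's (5), which is exactly the behaviour collaborative filtering relies on.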
[Read More]
Central Limit Theorem - A Magic Wand for Inference
How the CLT turns any data into gold!
The central limit theorem (CLT) is a powerful tool that allows us to make inferences about populations, even if we don’t know the exact distribution of the population. Just wave your wand (i.e., use the CLT) over a sample of data, and the distribution of the sample means will be...
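A minimal simulation makes the claim tangible: draw many samples from a distinctly non-normal distribution and watch their means behave as the CLT predicts. The exponential distribution, sample size, and seed below are arbitrary choices for illustration.

```python
import numpy as np

# A minimal CLT sketch: averages of draws from a skewed distribution
# (exponential, mean 2.0) still concentrate around the true mean.
rng = np.random.default_rng(42)
sample_means = rng.exponential(scale=2.0, size=(10_000, 50)).mean(axis=1)

# The CLT predicts: mean of the sample means ~ 2.0,
# and their spread ~ 2.0 / sqrt(50) ~ 0.283.
print(sample_means.mean(), sample_means.std())
```

Plotting `sample_means` as a histogram would show the familiar bell shape, even though each individual draw came from a heavily skewed distribution.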
[Read More]
The Law of Large Numbers!
How to Make Sense of Randomness.
The law of large numbers is a statistical theorem which states that as the number of independent, identically distributed random variables increases, their sample mean approaches the theoretical mean (the expected value).
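The theorem is easy to watch in action with fair coin flips, whose theoretical mean is 0.5. The flip count and seed below are arbitrary choices for illustration.

```python
import numpy as np

# A minimal law-of-large-numbers sketch: the running mean of fair coin
# flips (expected value 0.5) drifts toward 0.5 as n grows.
rng = np.random.default_rng(7)
flips = rng.integers(0, 2, size=100_000)
running_mean = np.cumsum(flips) / np.arange(1, flips.size + 1)

# Deviation from 0.5 after 100 flips vs. after 100,000 flips.
print(abs(running_mean[99] - 0.5), abs(running_mean[-1] - 0.5))
```

After 100,000 flips the running mean sits within a fraction of a percent of 0.5, whereas short runs can wander noticeably.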
[Read More]