🔟 Micro-Segmentation

Discover patterns to optimize your KPI

With the Micro-Segmentation feature, you can discover actionable rules to optimize any KPI. The KPI here refers to the target variable of the analysis.

1. What is Micro-Segmentation?

Professionals are always looking for ways to optimize their work. Imagine a marketing executive who, while running a campaign, identifies rules that separate the customers likely to respond enthusiastically from those who won't. Such insights would let the company target the promising group efficiently.

Wouldn't it be wonderful if one could automatically identify ways to optimize desired goals?

HEARTCOUNT provides the Micro-Segmentation feature, which automatically discovers rules to optimize a KPI and recommends the conditions under which it performs best. The feature leverages the decision tree algorithm, a well-known machine learning technique for classifying two distinct groups.

What is a Decision Tree?

A decision tree algorithm builds models that classify distinct groups. Simply put, it partitions the data space so that similar records end up grouped together. Its strength is that it formulates effective classification rules using as few variables as possible, which matters because real-world datasets often contain a large number of variables.

  • Want to know more about Decision Trees? (Click)
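To make the idea concrete, here is a minimal sketch of a decision tree classifier built with scikit-learn. It illustrates the general technique, not HEARTCOUNT's internal implementation, and the column names ("Sales", "Discount", "is_top_group") and values are hypothetical.

```python
# Minimal sketch of a decision tree classifier on a toy dataset with numeric
# features and a binary column "is_top_group" (1 = record belongs to the group
# we want to isolate). All names and values are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "Sales":        [120.0, 30.0, 250.0, 15.0, 90.0, 400.0],
    "Discount":     [0.05, 0.40, 0.00, 0.50, 0.10, 0.02],
    "is_top_group": [1, 0, 1, 0, 1, 1],
})

# A shallow tree keeps each rule short, i.e. it uses as few variables as possible.
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(df[["Sales", "Discount"]], df["is_top_group"])

# Print the learned splits as human-readable if/else rules.
print(export_text(tree, feature_names=["Sales", "Discount"]))
```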

2. Utilizing Micro-Segmentation

The example below illustrates the process of automatically discovering rules to optimize profits in a discount mart. Let's explore how to find optimization rules for a particular KPI using Micro-Segmentation.

(1) Finding Optimal Rules

Begin by selecting the variable you wish to optimize (e.g., [Profit]). Clicking the 'analyze' button at the top right will automatically suggest rules that separate the top 20% from the bottom 20%, listed in the table.

💡 Micro-segmentation provides 'optimization rules' for numeric target variables by automatically splitting them into top/bottom groups. For categorical target variables, it offers classification rules instead; the target groups can be set as described below.

(Optional)

At its core, micro-segmentation aims to differentiate between two distinct target groups and establish a set of 'rules' for their classification. While we provide a default split of the top 20% and bottom 20% groups, users have the flexibility to define their own target groups. Simply use the cogwheel ⚙️ icon to adjust the range for the groups you're interested in. You can easily slide the range with your mouse or input specific values.
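As an illustration of what adjusting the range means conceptually, the sketch below labels custom top/bottom groups of a numeric KPI with pandas quantiles. The column name "Profit" and the 10%/30% cut-offs are example values, not product defaults.

```python
# Illustrative sketch: defining custom top/bottom groups of a numeric KPI
# with quantiles. "Profit" and the 10%/30% cut-offs are example values.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"Profit": rng.normal(100, 40, size=1000)})  # toy KPI values

upper = df["Profit"].quantile(0.90)   # custom top group: top 10%
lower = df["Profit"].quantile(0.30)   # custom bottom group: bottom 30%

top_group    = df[df["Profit"] >= upper]
bottom_group = df[df["Profit"] <= lower]
print(len(top_group), len(bottom_group))  # roughly 100 and 300 rows
```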

Dive Deep into the Table Details

Each entry in the table carries its own set of details. As a quick guide, entries are ranked in descending order of target ratio.

Let's break them down:

  1. Purity: The proportion of records within a segment that belong to the target group, indicating how accurate the classification rule is. For instance, it is the share of the subgroup 'discount rate less than 12% and in the top 20% of profit' within the broader segment 'discount rate less than 12%.'

  2. Target Ratio: The proportion of the target group that satisfies a particular classification rule. For example, it is the share of records with 'a discount rate less than 12%' within the larger group 'in the top 20% of profits.'

  3. Average KPI Difference: The gap between the segment's average KPI value and the overall average of the target variable. A sketch of how these three metrics can be computed follows this list.

  4. Smart Search (▽): Want more detailed insights? Click on the ▽ next to each row. This action will lead you to our 'Smart Search' feature, allowing you to dive deeper into individual data records for each rule. 👉 Intrigued by Smart Search? Give it a click!
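As a reference, here is a minimal sketch of how the three metrics above could be computed with pandas, assuming a DataFrame with a KPI column "Profit", a boolean "in_target" flag (row belongs to the target group, e.g. top-20% profit) and a boolean "in_segment" flag (row satisfies the rule). This is an illustration, not HEARTCOUNT's exact internals.

```python
# Sketch of the table metrics for a single rule, on a tiny toy dataset.
import pandas as pd

df = pd.DataFrame({
    "Profit":     [500, 480, 470, 90, 60, 30, 450, 20],
    "in_target":  [True, True, True, False, False, False, True, False],
    "in_segment": [True, True, True, True, False, False, False, False],
})

segment = df[df["in_segment"]]
target  = df[df["in_target"]]

purity       = segment["in_target"].sum() / len(segment)   # precision of the rule
target_ratio = segment["in_target"].sum() / len(target)    # coverage of the target group
avg_kpi_diff = segment["Profit"].mean() - df["Profit"].mean()

print(f"purity={purity:.2%}, target ratio={target_ratio:.2%}, "
      f"average KPI difference={avg_kpi_diff:.1f}")
```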

A Glimpse into the Predictive Model's Summary

For a concise overview, click on the downward arrow (∨) highlighted in a red box in the provided visual.

This summary gives you insights into either the default target groups or any custom ones you've crafted.

🚨 Note that the summary's target ratio differs from the target ratio of the individual segments described above. The overall target group stays the same, but each segment covers a different portion of it. In the summary, we combine all the derived rule segments and divide by the total size of the target group. To illustrate, if the top-20% profit group contains about 1.64k records and the combined rule segments cover about 1.53k of them, then roughly 93.6% of the top-20% profit group is captured by meaningful classification rules.
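Conceptually, the summary's target ratio can be computed as in the sketch below: take the union of all rule segments within the target group and divide by the target group's size. The numbers are toy values, not the figures from the screenshot.

```python
# Sketch of the summary calculation, assuming one boolean mask per derived rule
# over the rows of the target group (decision-tree segments are disjoint, so
# their union is a simple logical OR). Toy numbers for illustration only.
import numpy as np

target_size = 2_000                                 # rows in the top-20% profit group
rule_masks = [np.zeros(target_size, dtype=bool) for _ in range(3)]
rule_masks[0][:1_000] = True                        # rows covered by rule 1
rule_masks[1][1_000:1_600] = True                   # rows covered by rule 2
rule_masks[2][1_600:1_870] = True                   # rows covered by rule 3

covered = np.logical_or.reduce(rule_masks)          # union of all rule segments
print(f"summary target ratio = {covered.sum() / target_size:.1%}")  # 93.5%
```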

(2) Visualization Graphs

Users can understand the provided rules more intuitively through visualization graphs. The image below shows the visualization of the rule at the top of the table, based on the 'Profit Optimization' criterion. It highlights the rule where Sales is greater than or equal to 86.19, the discount rate is less than 0.12, and State is one of Alabama, Arkansas, and so on.

  1. Class: Records classified under the condition of State = [Alabama, Arkansas, ..., Wyoming], 86.19 <= Sales, Discount Rate < 0.12 predominantly belong to the Top 20% Profit group.

  2. Size: The number of records satisfying the condition (1,449 records).

  3. Purity: Among the segment size of 1,449, 1,427 (98.48%) belong to the Top 20% Profit group. A higher purity signifies greater reliability in the analysis outcome. In essence, it's a metric indicating the precision of the classification rule.

  4. Target Ratio: Out of the total records in the Top 20% Profit group (2,000), 1,427 (71.35%) fall under this segment (see the sketch after this list).

  5. Average KPI Difference: The difference between the average KPI of the records within this segment and the overall average KPI.

  6. Segment Classification Rule: Detailed representation of the classification rule presented in the first table in the image above.

  7. Visualization Graph: A graphical representation of the derived profit optimization condition, with the blue region highlighting the classification rule's domain.

  8. Decision Tree: Clicking the icon displays the classification rules in a tree structure.
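As a quick sanity check, the purity and target ratio of this example segment can be reproduced from the counts quoted above.

```python
# Recomputing the example segment's metrics from the quoted figures:
# 1,449 records in the segment, 1,427 of them in the top-20% profit group,
# which contains 2,000 records in total.
segment_size = 1_449
in_target    = 1_427
target_size  = 2_000

purity       = in_target / segment_size   # 0.9848 -> 98.48%
target_ratio = in_target / target_size    # 0.7135 -> 71.35%
print(f"purity={purity:.2%}, target ratio={target_ratio:.2%}")
```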

(3) Exporting as a File

  • Clicking on SQL will allow users to view the prediction model as a query.

  • Clicking the table icon will save the prediction model as a CSV or XLSX file.
