Information Gain Calculator
Calculate information gain, gain ratio, Gini gain, and feature importance for decision trees.
Inputs
- Parent Entropy (bits)
- Left Child Entropy (bits)
- Right Child Entropy (bits)
- Left Weight (proportion)
- Right Weight (proportion)
Results
- Information Gain: 0.38 bits
- Information Gain Ratio: 0.391
- Feature Importance: 0.38
- Gini Gain: 0.24
- Entropy Reduction: 38%
How to Use This Calculator
- Start by filling in the input fields below. Results update instantly as you type, so you can experiment with different values to see how they affect the outcome.
- Parent Entropy — entropy of the parent node. Accepts values from 0 to 10 bits (default: 1 bit).
- Left Child Entropy — entropy of the left child node. Accepts values from 0 to 10 bits (default: 0.5 bits).
- Right Child Entropy — entropy of the right child node. Accepts values from 0 to 10 bits (default: 0.8 bits).
- Left Weight — proportion of samples in the left child. Accepts values from 0 to 1 (default: 0.6).
- Right Weight — proportion of samples in the right child. Accepts values from 0 to 1 (default: 0.4).
- Once all inputs are set, review your results in the Results panel. Here's what each output means:
- Information Gain — shown in bits. The reduction in entropy achieved by the split: parent entropy minus the weighted average of the child entropies. This is the primary result of this calculator.
- Information Gain Ratio — shown as a numeric value. Information gain divided by the split information, which penalizes splits that scatter samples across uneven branches.
- Feature Importance — shown as a numeric value. The information gain attributed to this split, as used to rank features in a decision tree.
- Gini Gain — shown as a numeric value. The reduction in Gini impurity achieved by the split.
- Entropy Reduction — shown as a percentage. Information gain expressed as a fraction of the parent entropy.
- Explore the related calculators below if you need deeper analysis or want to approach this topic from a different angle.
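The calculator's outputs can be reproduced with a short sketch using the default inputs listed above. One assumption: the gain-ratio denominator is taken to be the standard C4.5 split information, −Σ wᵢ·log₂(wᵢ), which matches the displayed 0.391. Gini gain is omitted because it is computed from Gini impurities, which are not among the entropy inputs shown here.

```python
import math

def information_gain(h_parent, h_left, h_right, w_left, w_right):
    """Entropy of the parent minus the weighted entropy of the children."""
    return h_parent - (w_left * h_left + w_right * h_right)

def gain_ratio(ig, w_left, w_right):
    """Information gain divided by the split information (C4.5 definition)."""
    split_info = -(w_left * math.log2(w_left) + w_right * math.log2(w_right))
    return ig / split_info

# Default inputs from the usage guide above.
h_parent, h_left, h_right = 1.0, 0.5, 0.8
w_left, w_right = 0.6, 0.4

ig = information_gain(h_parent, h_left, h_right, w_left, w_right)
ratio = gain_ratio(ig, w_left, w_right)
reduction = ig / h_parent  # fraction of parent entropy removed by the split

print(f"Information gain:  {ig:.2f} bits")    # 0.38 bits
print(f"Gain ratio:        {ratio:.3f}")      # 0.391
print(f"Entropy reduction: {reduction:.0%}")  # 38%
```

With these defaults the weighted child entropy is 0.6 × 0.5 + 0.4 × 0.8 = 0.62 bits, so the gain is 1 − 0.62 = 0.38 bits, matching the Results panel.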
Formula
IG = H(Parent) − Σ(Weight × H(Child))
Related Calculators