Information Gain Calculator
Calculate information gain, gain ratio, Gini gain, and feature importance for decision trees.
Inputs
(Interactive calculator fields; entropy values are entered and reported in bits.)
Results
- Information Gain: 0.38 bits
- Information Gain Ratio: 0.391
- Feature Importance: 0.38
- Gini Gain: 0.24
- Entropy Reduction: 38%
How to Use This Calculator
- Enter the class distribution of the parent node (e.g., 40 positive, 60 negative).
- Input the class distribution for each child node after splitting on a feature.
- Review the calculated entropy of the parent, entropy of each child, and information gain.
- Select the feature with the highest information gain for the next decision tree split.
- Use Gini impurity as an alternative splitting criterion for faster computation on large datasets; both criteria are sketched in the example below.
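As a rough illustration of these steps, here is a minimal Python sketch (not the calculator's actual implementation) that computes entropy, information gain, gain ratio, and Gini gain from class-count lists. The function names and the child split in the example are assumptions chosen for demonstration; only the 40 positive / 60 negative parent distribution comes from the example above.

```python
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of a class-count list, e.g. [40, 60]."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini impurity of a class-count list."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def information_gain(parent, children):
    """IG = H(Parent) - sum(Weight_i * H(Child_i))."""
    n = sum(parent)
    weighted = sum(sum(child) / n * entropy(child) for child in children)
    return entropy(parent) - weighted

def gain_ratio(parent, children):
    """Information gain divided by the split information of the child sizes."""
    split_info = entropy([sum(child) for child in children])
    ig = information_gain(parent, children)
    return ig / split_info if split_info > 0 else 0.0

def gini_gain(parent, children):
    """Reduction in Gini impurity achieved by the split."""
    n = sum(parent)
    weighted = sum(sum(child) / n * gini(child) for child in children)
    return gini(parent) - weighted

if __name__ == "__main__":
    # Hypothetical example: parent node with 40 positive / 60 negative samples,
    # split by a feature into child nodes of (30, 10) and (10, 50).
    parent = [40, 60]
    children = [[30, 10], [10, 50]]
    print(f"Information gain: {information_gain(parent, children):.3f} bits")
    print(f"Gain ratio:       {gain_ratio(parent, children):.3f}")
    print(f"Gini gain:        {gini_gain(parent, children):.3f}")
```

In a decision tree, you would evaluate these quantities for every candidate feature and split on the one with the highest gain (or gain ratio, which penalizes splits into many small children).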
Formula
IG = H(Parent) - Σ (Weight × H(Child))
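As a worked instance of this formula, take the 40 positive / 60 negative parent node from the example above and a hypothetical split into child nodes of (30, 10) and (10, 50):

H(Parent) = -(0.4 × log2 0.4 + 0.6 × log2 0.6) ≈ 0.971 bits
H(Child 1) = -(0.75 × log2 0.75 + 0.25 × log2 0.25) ≈ 0.811 bits
H(Child 2) = -(1/6 × log2 1/6 + 5/6 × log2 5/6) ≈ 0.650 bits
IG ≈ 0.971 - (0.4 × 0.811 + 0.6 × 0.650) ≈ 0.256 bits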