Calcimator

Information Gain Calculator

Calculate information gain, gain ratio, Gini gain, and feature importance for decision trees.

Inputs

Class counts for the parent node and for each child node after the split (entropy values are reported in bits).
Results

Information Gain: 0.38 bits
Information Gain Ratio: 0.391
Feature Importance: 0.38
Gini Gain: 0.24
Entropy Reduction: 38%
How to Use This Calculator
  1. Enter the class distribution of the parent node (e.g., 40 positive, 60 negative).
  2. Input the class distribution for each child node after splitting on a feature.
  3. Review the calculated entropy of the parent, entropy of each child, and information gain.
  4. Select the feature with the highest information gain for the next decision tree split.
  5. Use Gini impurity as an alternative splitting criterion for faster computation in large datasets.
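The steps above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's own implementation; the 40/60 parent split and the two child partitions are hypothetical example data.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy in bits of a sequence of class labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(parent, children):
    """IG = H(parent) - sum over children of (weight * H(child))."""
    total = len(parent)
    weighted = sum(len(child) / total * entropy(child) for child in children)
    return entropy(parent) - weighted

def gain_ratio(parent, children):
    """Information gain divided by the split's intrinsic information,
    which penalizes splits with many small branches."""
    total = len(parent)
    split_info = -sum((len(c) / total) * log2(len(c) / total) for c in children if c)
    return information_gain(parent, children) / split_info if split_info else 0.0

# Hypothetical example: parent node with 40 positive / 60 negative labels,
# split by some feature into two child nodes.
parent = ["+"] * 40 + ["-"] * 60
children = [["+"] * 30 + ["-"] * 10, ["+"] * 10 + ["-"] * 50]
print(round(information_gain(parent, children), 3))  # -> 0.256
print(round(gain_ratio(parent, children), 3))        # -> 0.264
```

The feature whose split yields the highest information gain (or gain ratio, when branch counts differ) is chosen for the next split.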

Formula

IG = H(Parent) - Σ(Weight × H(Child))
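Gini gain, the alternative criterion mentioned in step 5, has the same weighted-average form as the formula above, with Gini impurity in place of entropy. A minimal sketch, again using hypothetical 40/60 example data:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    total = len(labels)
    return 1.0 - sum((c / total) ** 2 for c in Counter(labels).values())

def gini_gain(parent, children):
    """Gini(parent) - sum over children of (weight * Gini(child))."""
    total = len(parent)
    return gini(parent) - sum(len(c) / total * gini(c) for c in children)

parent = ["+"] * 40 + ["-"] * 60
children = [["+"] * 30 + ["-"] * 10, ["+"] * 10 + ["-"] * 50]
print(round(gini_gain(parent, children), 3))  # -> 0.163
```

Gini impurity avoids the logarithms in the entropy calculation, which is why it is often preferred for large datasets.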
