Free entropy calculator for discrete probability distributions. Compute Shannon entropy in bits, nats, or hartleys; normalized probabilities; perplexity; and efficiency relative to a uniform reference on the support.
Last updated: April 13, 2026
Separate with commas or spaces. If values do not sum to ~1, they are treated as nonnegative weights.
Entropy H: 2 bits (log₂)
Perplexity: 4
Support size: 4
Max H (uniform): 2 bits
Efficiency H/H_max: 1
Normalized pᵢ: 0.25, 0.25, 0.25, 0.25
Convention: 0 log 0 is treated as 0. Entropy is maximized (for a fixed support size) by the uniform distribution.
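All of the panel's quantities follow directly from the probability vector. As a minimal sketch (the entropy_report function is hypothetical, not the calculator's actual code), the same numbers can be recomputed in Python, honoring the 0 log 0 = 0 convention:

import math

def entropy_report(p, base=2):
    """Recompute the result panel for a probability vector p.
    Zero entries are skipped, so 0 log 0 contributes nothing."""
    support = [x for x in p if x > 0]
    n = len(support)                                  # support size
    h = -sum(x * math.log(x, base) for x in support)  # Shannon entropy H
    h_max = math.log(n, base)                         # max H: uniform on the support
    return {
        "H": h,
        "perplexity": base ** h,   # equivalent number of uniform choices
        "support_size": n,
        "H_max": h_max,
        "efficiency": h / h_max if h_max > 0 else 1.0,  # define as 1 when n = 1
    }

# Fair four-way outcome, base 2: H = 2, perplexity = 4, efficiency = 1
print(entropy_report([0.25, 0.25, 0.25, 0.25]))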
Standard information-theoretic entropy for finite alphabets, with your choice of logarithm base.
Perplexity interprets entropy as the equivalent number of equally likely outcomes in the same log base.
Efficiency compares your H to the entropy of a uniform distribution over the positive-probability outcomes.
Paste comma- or space-separated values; the tool detects whether normalization is needed.
Jump directly to the maximum-entropy distribution on n labels—handy for coding and games of chance.
Shows normalized probabilities so students can verify inputs after weight scaling.
Fair four-way outcome: 0.25 each, base 2
H = 2 bits
Values are parsed into nonnegative numbers. If they already sum to approximately 1, they are treated as probabilities (renormalized for tiny drift). Otherwise they are interpreted as weights and scaled to a probability vector. Entropy sums −p log p over positive p only. Browse more on our Math & Science calculators index.
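A sketch of that parsing step, assuming a drift tolerance of 1e-6 and a hypothetical parse_probabilities helper (the tool's internals are not published), might look like:

def parse_probabilities(text, tol=1e-6):
    """Parse comma- or space-separated values into a probability vector.

    Values summing to within tol of 1 are treated as probabilities
    (renormalized for tiny drift); anything else as nonnegative weights.
    Both cases reduce to dividing by the sum, so the flag only affects
    how the input is reported back to the user."""
    values = [float(tok) for tok in text.replace(",", " ").split()]
    if any(v < 0 for v in values):
        raise ValueError("values must be nonnegative")
    total = sum(values)
    if total <= 0:
        raise ValueError("at least one value must be positive")
    is_probability_input = abs(total - 1.0) <= tol
    return [v / total for v in values], is_probability_input

probs, was_prob = parse_probabilities("1 2 1")
print(probs)  # weights scaled: [0.25, 0.5, 0.25]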
H = −Σ pᵢ log(pᵢ)
Perplexity = b^H (b = 2, e, or 10)
Uniform on n symbols: H = log(n), perplexity = n
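One property worth noting: perplexity b^H is the same number in every base, because switching the log base rescales H and the exponential base together. A quick check with an arbitrarily chosen example distribution:

import math

# One distribution, three log bases: H rescales with the base,
# but the perplexity b**H comes out identical each time.
p = [0.5, 0.25, 0.25]
for b, unit in [(2, "bits"), (math.e, "nats"), (10, "hartleys")]:
    h = -sum(x * math.log(x, b) for x in p)
    print(f"{unit}: H = {h:.4f}, perplexity = {b ** h:.4f}")
# bits: H = 1.5000, perplexity = 2.8284 (and 2.8284 in nats and hartleys too)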