---
title: Entropy
id: 599d15309e88c813a40baf58
challengeType: 5
---

## Description
Task:

Calculate the Shannon entropy H of a given input string.

Given the discrete random variable $X$ that is a string of $N$ "symbols" (total characters) consisting of $n$ different characters ($n = 2$ for binary), the Shannon entropy of $X$ in bits/symbol is:

$H_2(X) = -\sum_{i=1}^n \frac{count_i}{N} \log_2 \left(\frac{count_i}{N}\right)$

where $count_i$ is the number of occurrences of the $i$-th distinct character.
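As a worked illustration of the formula, the sketch below tallies character frequencies and accumulates $-p \log_2 p$ for each distinct character (the helper name `shannonEntropy` is illustrative, not part of the challenge):

```js
// Illustrative sketch of H_2(X); not the challenge's required `entropy`.
function shannonEntropy(s) {
  // Count occurrences of each distinct character.
  const counts = {};
  for (const c of s) counts[c] = (counts[c] || 0) + 1;

  const N = s.length;
  // Sum -p * log2(p) over the distinct characters.
  return Object.values(counts).reduce(
    (h, count) => h - (count / N) * Math.log2(count / N),
    0
  );
}

console.log(shannonEntropy('1223334444')); // ≈ 1.8464393446710154
```

For `"1223334444"` the frequencies are 1, 2, 3, and 4 out of $N = 10$, giving $H_2 \approx 1.846$ bits/symbol, matching the expected test value.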

## Instructions
## Tests
```yml
tests:
  - text: entropy is a function.
    testString: assert(typeof entropy === 'function', 'entropy is a function.');
  - text: entropy("0") should return 0
    testString: assert.equal(entropy('0'), 0, 'entropy("0") should return 0');
  - text: entropy("01") should return 1
    testString: assert.equal(entropy('01'), 1, 'entropy("01") should return 1');
  - text: entropy("0123") should return 2
    testString: assert.equal(entropy('0123'), 2, 'entropy("0123") should return 2');
  - text: entropy("01234567") should return 3
    testString: assert.equal(entropy('01234567'), 3, 'entropy("01234567") should return 3');
  - text: entropy("0123456789abcdef") should return 4
    testString: assert.equal(entropy('0123456789abcdef'), 4, 'entropy("0123456789abcdef") should return 4');
  - text: entropy("1223334444") should return 1.8464393446710154
    testString: assert.equal(entropy('1223334444'), 1.8464393446710154, 'entropy("1223334444") should return 1.8464393446710154');
```
## Challenge Seed
```js
function entropy(s) {
  // Good luck!
}
```
## Solution
```js
function entropy(s) {
  // Build a dictionary of character frequencies, then invoke the
  // evaluator callback once per distinct character.
  function process(s, evaluator) {
    const h = Object.create(null);
    s.split('').forEach(c => {
      h[c] = (h[c] || 0) + 1;
    });
    if (evaluator) {
      for (const k in h) evaluator(k, h[k]);
    }
    return h;
  }

  // Measure the entropy of the string in bits per symbol:
  // sum -p * log2(p) over the distinct characters.
  let sum = 0;
  const len = s.length;
  process(s, (k, f) => {
    const p = f / len;
    sum -= p * Math.log(p) / Math.log(2);
  });
  return sum;
}
```