Calculate the Shannon entropy H of a given input string.
Given the discrete random variable $X$, which is a string of $N$ "symbols" (total characters) consisting of $n$ different characters (for binary, $n = 2$), the Shannon entropy of $X$ in bits/symbol is:
$H_2(X) = -\sum_{i=1}^{n} \frac{count_i}{N} \log_2 \left(\frac{count_i}{N}\right)$
where $count_i$ is the count of character $n_i$.
- text: entropy is a function.
  testString: 'assert(typeof entropy === "function", "entropy is a function.");'
- text: entropy("0") should return 0.
  testString: 'assert.equal(entropy("0"), 0, "entropy(\"0\") should return 0.");'
- text: entropy("01") should return 1.
  testString: 'assert.equal(entropy("01"), 1, "entropy(\"01\") should return 1.");'
- text: entropy("0123") should return 2.
  testString: 'assert.equal(entropy("0123"), 2, "entropy(\"0123\") should return 2.");'
- text: entropy("01234567") should return 3.
  testString: 'assert.equal(entropy("01234567"), 3, "entropy(\"01234567\") should return 3.");'
- text: entropy("0123456789abcdef") should return 4.
  testString: 'assert.equal(entropy("0123456789abcdef"), 4, "entropy(\"0123456789abcdef\") should return 4.");'
- text: entropy("1223334444") should return 1.8464393446710154.
  testString: 'assert.equal(entropy("1223334444"), 1.8464393446710154, "entropy(\"1223334444\") should return 1.8464393446710154.");'
```
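For reference, here is a minimal JavaScript sketch that would satisfy the tests above. Only the function name `entropy` comes from the tests; the internal variable names are illustrative:

```javascript
// Shannon entropy in bits/symbol:
// H = -sum over distinct characters of (count_i / N) * log2(count_i / N)
function entropy(str) {
  const N = str.length;

  // Count occurrences of each distinct character.
  const counts = {};
  for (const ch of str) {
    counts[ch] = (counts[ch] || 0) + 1;
  }

  // Accumulate -p * log2(p) for each distinct character.
  let H = 0;
  for (const ch in counts) {
    const p = counts[ch] / N;
    H -= p * Math.log2(p);
  }
  return H;
}

console.log(entropy("0123"));      // 2
console.log(entropy("1223334444")); // ~1.8464393446710154
```

A string whose characters are all distinct and equally frequent (like `"0123"`) maximizes entropy at log2(n); a single repeated character yields 0.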