# Entropy Bias Calculator

When you roll an eight-sided die, exactly 3 bits of entropy are generated.

A six-sided die generates about 2.585 bits of entropy per roll. How does a non-integer number of bits per roll affect the entropy?
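As a quick check, the entropy per roll of a fair die is simply log2 of the number of faces:

```javascript
// Entropy per roll, in bits, for a fair die with `faces` equal sides.
const entropyBits = (faces) => Math.log2(faces);

console.log(entropyBits(8)); // 3
console.log(entropyBits(6)); // ≈ 2.585
```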

## Test setup

Roll a six-sided die 62 times in a row. Repeat this to create samples.


## Theoretical results

Each sample of 62 events is a base-6 number totalling 160.26768 bits of entropy, i.e. 62 × log2(6).
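The total can be reproduced directly:

```javascript
// Total entropy of one sample: 62 rolls × log2(6) bits per roll.
const totalBits = 62 * Math.log2(6);
console.log(totalBits.toFixed(5)); // "160.26768"
```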

A 160-bit number would be expected 83.066% of the time, and a 161-bit number would be expected 16.934% of the time.

Why?

The largest number that can be generated is:

largest_possible_sample = base6(555 ... 62 digits)

In binary this is base2(100110100 ... 161 digits)
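Both claims can be verified with native BigInt in any modern JavaScript console (an alternative to the tool's BigNumber dependency):

```javascript
// Largest 62-digit base-6 number: 6^62 - 1, i.e. 62 fives in base 6.
const largest = 6n ** 62n - 1n;
const bits = largest.toString(2);
console.log(bits.length);       // 161
console.log(bits.slice(0, 9));  // "100110100"
```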

How many 161-bit samples could be generated?

total_161_bit_samples = largest_possible_sample - base2(111 ... 160 digits)

The portion of 161-bit samples is:

total_161_bit_samples / sample_space_size = 16.934%

In the JavaScript developer console for this tool, the portion can be calculated like this:

``````
let baseN = 6;
let base2 = 2;
// Largest 62-digit base-6 number: "555...5" (62 fives).
let largest_possible_sample = new BigNumber("".padStart(62, "5"), baseN);
// The sample space includes zero, so its size is one more than the largest value.
let sample_space_size = largest_possible_sample.plus(1);
// Largest 160-bit number: "111...1" (160 ones).
let max160 = new BigNumber("".padStart(160, "1"), base2);
let diff = largest_possible_sample.minus(max160);
let portion = diff.div(sample_space_size);
// BigNumber values need .times(), not the * operator.
console.log("Portion of 161 bit samples: " + portion.times(100).toFixed(3) + "%");
``````
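The same figure can be cross-checked with native BigInt and no library, by scaling up before the integer division to keep three decimal places:

```javascript
// Portion of samples needing a 161st bit, using native BigInt.
const space = 6n ** 62n;              // sample space size (includes zero)
const count161 = space - 2n ** 160n;  // samples at or above 2^160
// Scale before dividing so the truncating BigInt division keeps 3 decimals.
const pct = Number((count161 * 100000n) / space) / 1000;
console.log(pct + "%"); // 16.934%
```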

## Experimental data

This is a set of samples generated from random numbers, one sample per line. The largest number is converted to 0, e.g. on a six-sided die the number 6 becomes 0.
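A sketch of how one sample line could be generated (a hypothetical helper, not the tool's actual code):

```javascript
// Roll a six-sided die 62 times; record each roll as a base-6 digit,
// mapping the largest face (6) to 0.
function generateSample(rolls = 62) {
  let digits = "";
  for (let i = 0; i < rolls; i++) {
    const roll = 1 + Math.floor(Math.random() * 6); // 1..6
    digits += roll % 6;                             // 6 becomes 0
  }
  return digits;
}

console.log(generateSample()); // 62 base-6 digits
```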

For a familiar way of looking at it, these are the same samples in base 10.

This is the same sample set in binary. Leading zeros are added where required.
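The binary conversion can be sketched like this; the 161-bit padding width is an assumption of this sketch, chosen to match the longest possible sample:

```javascript
// Convert a base-6 sample string to binary, left-padded with zeros.
function toBinary(base6Sample, width = 161) {
  const value = [...base6Sample].reduce((acc, d) => acc * 6n + BigInt(d), 0n);
  return value.toString(2).padStart(width, "0");
}

console.log(toBinary("".padStart(62, "5"))); // the largest sample needs no padding
```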

## Experimental results

? samples were 160 bits, i.e. ?%

? samples were 161 bits, i.e. ?%
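These counts could be tallied by comparing each sample value against 2^160 (a hypothetical helper; `samples` is an array of base-6 strings):

```javascript
// Classify samples by bit length: values below 2^160 fit in 160 bits
// (or fewer), the rest need 161.
function tallyBitLengths(samples) {
  const limit = 2n ** 160n;
  let bits160 = 0, bits161 = 0;
  for (const s of samples) {
    const value = [...s].reduce((acc, d) => acc * 6n + BigInt(d), 0n);
    if (value < limit) bits160++; else bits161++;
  }
  return { bits160, bits161 };
}

// The all-fives sample needs 161 bits; the all-zeros sample does not.
console.log(tallyBitLengths(["".padStart(62, "5"), "".padStart(62, "0")]));
```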

| Bit # | 0s | 1s | % 0s |
|-------|----|----|------|
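A sketch of how the per-bit counts for such a table could be computed, assuming `binarySamples` is an array of equal-length bit strings:

```javascript
// For each bit position, count zeros and ones across all samples
// and report the percentage of zeros.
function bitStats(binarySamples) {
  const width = binarySamples[0].length;
  const rows = [];
  for (let bit = 0; bit < width; bit++) {
    let zeros = 0;
    for (const s of binarySamples) if (s[bit] === "0") zeros++;
    rows.push({ bit, zeros, ones: binarySamples.length - zeros,
                pctZeros: 100 * zeros / binarySamples.length });
  }
  return rows;
}

console.log(bitStats(["00", "01", "11"]));
```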