Info Theory Basics
Information (self-information) of an event x is I(x) = -log P(x).
- Independent events have additive information, since P(x, y) = P(x)P(y).
- Less likely events carry more information; a certain event (P = 1) carries 0 information.
- Knowing the outcome of an event with 50% probability gives 1 bit of information.
- Measured in bits (log base 2) or nats (natural log); logs of all bases are proportional, so the units differ only by a constant factor.
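A minimal sketch (not from the notes) of these properties, with bits and nats as hypothetical helper names:

```python
import math

def self_information_bits(p: float) -> float:
    """Information content in bits of an event with probability p."""
    return -math.log2(p)

def self_information_nats(p: float) -> float:
    """Same quantity in nats; differs from bits only by a factor of ln(2)."""
    return -math.log(p)

# A fair coin flip (p = 0.5) carries exactly 1 bit.
print(self_information_bits(0.5))   # 1.0

# A certain event (p = 1) carries 0 information.
print(self_information_bits(1.0))   # 0.0

# Independent events: information adds, since P(a, b) = P(a) * P(b).
p_a, p_b = 0.5, 0.25
assert math.isclose(
    self_information_bits(p_a * p_b),
    self_information_bits(p_a) + self_information_bits(p_b),
)
```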
Setup: a bitstream encodes a sequence of random variables. The prefix-free requirement means a codeword of length l consumes a fraction 2^-l of the available code space (Kraft inequality: the sum of 2^-l_i over all codewords is at most 1).
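A small sketch of that budget under the Kraft-inequality reading; the codeword lengths below are made up for illustration:

```python
def kraft_sum(lengths):
    """Fraction of the prefix-code budget used by codewords of the given lengths."""
    return sum(2.0 ** -l for l in lengths)

# Lengths of a valid prefix code, e.g. {0, 10, 110, 111}.
print(kraft_sum([1, 2, 3, 3]))   # 1.0 -> feasible (uses the whole budget)

# These lengths cannot come from any prefix-free code.
print(kraft_sum([1, 1, 2]))      # 1.25 -> infeasible
```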
Last Reviewed: 10/27/24
Reference Sheet #3, 3.1