Information:
What is it? How do we measure it? Shannon's formula / Shannon's theory of information.

What is information?
Information is NOT data.

Info @ Cornell:
Boring (low info): "I'm a student" (p=0.9), "Nice party" (p=0.8), "It's snowing" (p=0.99)
Not boring (high info): "I don't drink" (p=0.1), "I'm in ORIE" (p=0.2), "It was sunny today" (p=0.001)

It takes more information to describe events that are rare, i.e. have low probability.

Information is a function of probability, f(p), and is inversely related to p.

Shannon's definition: info(p) = log2(1/p)

Entropy = average information

Ex: coin flip
Pr(H)=Pr(T)=1/2
Info(H)=log2(1/(1/2))=log2(2)=1 bit
Info(T)=1 bit
If you flip a fair coin, either outcome gives you 1 bit. Since entropy is defined as the average information, the entropy of a fair coin flip is 1 bit.
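The coin-flip numbers above can be checked with a few lines of Python (a minimal sketch; the function name `info` is mine, not from the lecture):

```python
import math

def info(p):
    """Shannon information of an outcome with probability p, in bits: log2(1/p)."""
    return math.log2(1 / p)

# Fair coin: Pr(H) = Pr(T) = 1/2
print(info(0.5))                           # info of heads: 1.0 bit
print(0.5 * info(0.5) + 0.5 * info(0.5))   # entropy = average information: 1.0 bit
```

Each outcome carries exactly 1 bit, so their probability-weighted average (the entropy) is also 1 bit.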
Formal
States 1…n
Pr(i) = p_i
Info(i) = log2(1/p_i) = -log2(p_i)
Entropy: H = Σ_i p_i · log2(1/p_i) = -Σ_i p_i · log2(p_i)
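The formal definition translates directly into code (a sketch; the function name `entropy` is my own):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Terms with p_i = 0 are skipped, since they contribute 0 by convention.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
```

Passing the distribution over all n states returns the average information per observation.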

Flipping a coin twice:
Possible outcomes: HH, HT, TH, TT
n=4, pi=(1/2)(1/2)=1/4
Info(HH)=log2(1/(1/4))=log2(4)=2
H = -p1·log2(p1) - p2·log2(p2) - p3·log2(p3) - p4·log2(p4) = ¼(2) + ¼(2) + ¼(2) + ¼(2) = 2 bits

Example: 6-sided die
States: 1…6
pi=1/6
Info(i) = log2(6) ≈ 2.58
H ≈ 2.58

Base 2! Most calculators only offer log base 10 or ln, so use the change of base formula: log2(x) = log10(x) / log10(2) = ln(x) / ln(2)
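The die entropy can be computed using only base-10 logs via the change of base formula (a sketch; the variable names are mine):

```python
import math

# Entropy of a fair 6-sided die, using the change of base formula:
# log2(x) = log10(x) / log10(2).
p = 1 / 6
info_per_face = math.log10(1 / p) / math.log10(2)  # equals log2(6)
H = 6 * p * info_per_face                          # average over 6 equally likely faces

print(round(info_per_face, 2))  # 2.58
print(round(H, 2))              # 2.58
```

With equally likely states, every face carries the same information, so the entropy equals the per-face information.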

Source: "Information Technology Lecture 1." StudyMode.com, March 2009. http://www.studymode.com/essays/Information-Technology-Lecture-1-197486.html