
Slide 1

Principles of Communications

By: Vinh Dang Quang

Slide 2

Course information

 Lecturer: MSc. Dang Quang Vinh

 Mail: dg_vinh@yahoo.com

 Mobile: 0983692806

 Duration: 30 hrs

Slide 3

 Basic concepts

 Information

 Entropy

 Joint and Conditional Entropy

 Channel Representations

 Channel Capacity

Slide 4

Basic concepts: What is Information Theory?

 Information Theory: how much information

 … is contained in a signal?

 … can a system generate?

 … can a channel transmit?

 Used in many fields: Communications, Computer Science, Economics,…

 Example: Barcelona 0-3 SLNA (a highly unlikely result, so it carries a lot of information)

Slide 5

 Let xj be an event with probability p(xj)

 If xj occurred, we have log 1/p(xj) units of information

 The base of the logarithm determines the unit:

 10 → the unit of information is the hartley

 e → the unit of information is the nat

 2 → the unit of information is the bit

Example 10.1 (page 669)
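As an illustrative aside (not from the original slides), the self-information in the three bases above can be computed directly; the function name self_information is my own choice:

```python
import math

def self_information(p, base=2):
    """Information gained when an event of probability p occurs: log_base(1/p)."""
    return math.log(1.0 / p, base)

# An event with probability 1/8:
print(self_information(1/8, 2))        # 3.0 bits
print(self_information(1/8, math.e))   # ~2.079 nats
print(self_information(1/8, 10))       # ~0.903 hartleys
```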

Slide 6

 H(X) = - ∑ p(x) log p(x)

 Entropy = information = uncertainty

 If a signal is completely predictable, it has zero entropy and no information

 Entropy = average number of bits required to transmit the signal
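A minimal Python sketch of the formula above (illustrative, not part of the lecture; the helper name entropy is assumed):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum p(x) log2 p(x); zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0]))        # 0.0 -> a completely predictable signal carries no information
print(entropy([0.5, 0.5]))   # 1.0 -> a fair coin needs 1 bit on average
```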

Slide 7

Entropy example 1

 Random variable with uniform distribution over 32 outcomes

 H(X) = - ∑ 1/32 log 1/32 = log 32 = 5

 # bits required = log 32 = 5 bits!

 Therefore H(X) = number of bits required to represent a random event

 How many bits are needed for:

 Outcome of a coin toss

 “tomorrow is a Wednesday”

 “US tops Winter Olympics tally”
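The numbers on this slide can be checked directly (illustrative Python, base-2 logarithms assumed):

```python
import math

# Uniform distribution over 32 equally likely outcomes:
probs = [1 / 32] * 32
print(-sum(p * math.log2(p) for p in probs))  # 5.0 bits = log2(32)

# A fair coin toss carries 1 bit:
print(-sum(p * math.log2(p) for p in [0.5, 0.5]))  # 1.0 bit
```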

Slide 8

Entropy example 2

 Horse race with 8 horses, with winning probabilities

½, ¼, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64

 Entropy H(X) = 2 bits

 How many bits do we need?

 (a) Index each horse → log 8 = 3 bits

 (b) Assign shorter codes to horses with higher probability:

0, 10, 110, 1110, 111100, 111101, 111110, 111111

 average description length = 2 bits!
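The entropy and the average description length on this slide can be verified in a few lines (illustrative Python; probabilities and codewords are taken from the slide):

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]
codes = ["0", "10", "110", "1110", "111100", "111101", "111110", "111111"]

H = -sum(p * math.log2(p) for p in probs)
avg_len = sum(p * len(c) for p, c in zip(probs, codes))

print(H)        # 2.0 bits
print(avg_len)  # 2.0 bits -- the variable-length code meets the entropy bound
```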

Slide 9

 Need at least H(X) bits to represent X

 H(X) is a lower bound on the required descriptor length

 Entropy = uncertainty of a random variable

Slide 10

Joint and conditional entropy

 Joint entropy:

H(X,Y) = - ∑ x ∑ y p(x,y) log p(x,y)

 simple extension of entropy to 2 RVs

 Conditional Entropy:

H(Y|X) = ∑ x p(x) H(Y|X=x)

= - ∑ x ∑ y p(x,y) log p(y|x)

“What is the uncertainty of Y if X is known?”

 Easy to verify:

 If Y = X, then H(Y|X) = 0

H(Y|X) = the extra information needed to describe Y once X is known
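A minimal sketch of the two definitions above, applied to a small hypothetical joint distribution (the table values and helper names are assumptions for illustration):

```python
import math

def joint_entropy(pxy):
    """H(X,Y) = -sum over (x,y) of p(x,y) log2 p(x,y)."""
    return -sum(p * math.log2(p) for row in pxy for p in row if p > 0)

def conditional_entropy(pxy):
    """H(Y|X) = -sum over (x,y) of p(x,y) log2 p(y|x), with X indexing rows."""
    h = 0.0
    for row in pxy:
        px = sum(row)
        for p in row:
            if p > 0:
                h -= p * math.log2(p / px)
    return h

pxy = [[0.25, 0.25],
       [0.00, 0.50]]
print(joint_entropy(pxy))        # 1.5 bits
print(conditional_entropy(pxy))  # 0.5 bits of uncertainty about Y remain once X is known

# If Y = X (all mass on the diagonal), H(Y|X) = 0:
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))  # 0.0
```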

Slide 11

Mutual Information

 I(X;Y) = H(X) – H(X|Y)

= reduction of uncertainty due to another variable

 I(X;Y) = ∑x ∑y p(x,y) log p(x,y)/{p(x)p(y)}

 “How much information about Y is contained in X?”

 If X,Y independent, then I(X;Y) = 0

 If X,Y are same, then I(X;Y) = H(X) = H(Y)

 Symmetric and non-negative
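A minimal sketch of I(X;Y) computed from a joint distribution table, checking the two special cases on this slide (the helper name and example tables are illustrative assumptions):

```python
import math

def mutual_information(pxy):
    """I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ), with X rows and Y columns."""
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(pxy)
               for j, p in enumerate(row) if p > 0)

# X and Y independent -> I(X;Y) = 0:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# X and Y identical (a fair bit) -> I(X;Y) = H(X) = 1:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```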

Slide 12

Mutual Information

Relationship between entropy, joint and mutual information
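The relationship this slide refers to is the standard set of identities linking these quantities (restated here since the original diagram is not reproduced):

```latex
\begin{align}
I(X;Y) &= H(X) - H(X \mid Y) \\
       &= H(Y) - H(Y \mid X) \\
       &= H(X) + H(Y) - H(X,Y) \\
H(X,Y) &= H(X) + H(Y \mid X)
\end{align}
```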

Slide 13

Mutual Information

 I(X;Y) is a great measure of similarity between X and Y

 Widely used in image/signal processing

 Medical imaging example:

 MI based image registration

 Why? MI is insensitive to gain and bias
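As a rough sketch of the registration use case (not from the slides): MI between two grayscale images can be estimated from their joint intensity histogram, and an affine intensity change (gain and bias) leaves the estimate essentially unchanged. The function name, bin count, and test images are illustrative assumptions:

```python
import numpy as np

def image_mutual_information(a, b, bins=32):
    """Estimate MI (in bits) between two equally sized images via a joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

a = np.random.rand(64, 64)
print(image_mutual_information(a, a))           # MI of an image with itself
print(image_mutual_information(a, 2 * a + 10))  # gain/bias change: MI is essentially the same
```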
