
DOCUMENT INFORMATION

Basic information

Title: Markov Random Fields in Image Segmentation
Author: Zoltan Kato
Institution: University of Szeged
Department: Image Processing & Computer Graphics
Document type: thesis
Year of publication: 2008
City: Vienna
Number of pages: 49
File size: 0.91 MB


Contents


Page 1

Markov Random Fields in Image Segmentation

Zoltan Kato

Image Processing & Computer Graphics Dept.
University of Szeged, Hungary

Page 2

- Markov Random Field (MRF)
- Gibbs distribution & Energy function
- Simulated Annealing
- Markov Chain Monte Carlo (MCMC) sampling

Page 3

Segmentation as a Pixel Labelling Task

1. Extract features from the input image
   - Each pixel s in the image has a feature vector f_s.
   - For the whole image, we have f = {f_s : s ∈ S}.
2. Define the set of labels Λ
   - Each pixel s is assigned a label ω_s ∈ Λ.
   - For the whole image, we have ω = {ω_s : s ∈ S}.

For an N×M image there are |Λ|^(NM) possible labelings (even a 256×256 image with only two labels already admits 2^65536 of them). Which one is the right segmentation?

Page 4

Probabilistic Approach, MAP

- Define a probability measure on the set of all possible labelings and select the most likely one, given the observed features.
- Maximum a Posteriori (MAP) estimate:

  ω̂ = arg max_{ω ∈ Ω} P(ω | f)

Page 5

By Bayes' theorem, the posterior factorizes into a likelihood and a prior:

  P(ω | f) = P(f | ω) P(ω) / P(f)

where P(f | ω) is the likelihood and P(ω) is the prior (P(f) is constant with respect to ω).

Page 6

Why MRF Modelization?

- Neighboring pixels usually have similar properties (intensity, color, texture, …).
- MRF is a probabilistic model which captures such contextual constraints.
- Finding the best labeling over such an underlying structure is a hard combinatorial problem ⇒ Simulated Annealing.

Page 7

What is MRF?

To define Markov Random Fields, we need some basic building blocks:
- Observation Field and (hidden) Labeling Field
- Pixels and their Neighbors
- Cliques and Clique Potentials
- Energy function
- Gibbs Distribution

Page 8

Definition – Neighbors

- Each pixel has a set of surrounding pixels defined as its neighbors.
- (Figure: example neighborhood systems.)

Page 9

Definition – MRF

The labeling field X is a Markov Random Field (MRF) if
- P(X = ω) > 0 for every configuration ω (positivity), and
- P(ω_s | ω_r, r ≠ s) = P(ω_s | ω_r, r ∈ N_s) (Markovianity),

where N_s denotes the set of neighbors of pixel s.

Page 10

Hammersley-Clifford Theorem

- The Hammersley-Clifford Theorem states that a random field is a MRF if and only if it follows a Gibbs distribution:

  P(ω) = (1/Z) exp(−U(ω)) = (1/Z) exp(−Σ_{c∈C} V_c(ω))

  where Z = Σ_ω exp(−U(ω)) is a normalization constant (the partition function).
- This theorem provides us an easy way of defining MRF models via clique potentials.

Page 11

Definition – Clique

- A subset of pixels is a clique if every pair of pixels in this subset are neighbors.
- The set of cliques decomposes as C = C₁ ∪ C₂ ∪ …, where C₁ contains the singletons, C₂ the doubletons, and so on.

Page 12

Definition – Clique Potential

- For each clique c, a clique potential V_c(ω) is defined, where ω is the configuration of the labeling field.
- The sum of the clique potentials gives the energy of the configuration:

  U(ω) = Σ_{c∈C} V_c(ω) = Σ_{i∈C₁} V₁(ω_i) + Σ_{(i,j)∈C₂} V₂(ω_i, ω_j) + …

Page 13

MRF segmentation model

- Segmentation of grayscale images: regions are formed by spatial clusters of pixels with similar intensity.
- A simple MRF model + find the MAP estimate.

Page 14

MRF segmentation model

- Pixel labels (or classes) are represented by Gaussian distributions:

  P(f_s | ω_s) = (1 / √(2π σ_{ω_s}²)) exp(−(f_s − μ_{ω_s})² / (2σ_{ω_s}²))

- Clique potentials:
  - Singleton: proportional to the likelihood of features given ω: log(P(f | ω)).
  - Doubleton: favours similar labels at neighbouring pixels – smoothness prior:

    V_c(i, j) = βδ(ω_i, ω_j) = −β if ω_i = ω_j, +β if ω_i ≠ ω_j

  As β increases, the segmented regions become more homogeneous.
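To make the model concrete, here is a minimal NumPy sketch (an illustration, not code from the slides) of the resulting energy U(ω): the singleton term is the negative log of the Gaussian above and the doubleton term contributes ±β over a 4-neighbourhood. The array layout and names (`labels`, `image`, `mu`, `sigma`, `beta`) are assumptions made for the example.

```python
import numpy as np

def mrf_energy(labels, image, mu, sigma, beta):
    """Energy U(omega) of a labeling under the grayscale MRF model:
    singleton = -log Gaussian likelihood, doubleton = +/-beta on a
    4-neighbourhood. mu[k] and sigma[k] are the class parameters."""
    # Singleton term: -log P(f_s | omega_s) for every pixel
    m = mu[labels]
    s = sigma[labels]
    singleton = np.log(np.sqrt(2 * np.pi) * s) + (image - m) ** 2 / (2 * s ** 2)

    # Doubleton term: -beta for equal, +beta for different neighbouring labels
    def pair(a, b):
        return np.where(a == b, -beta, beta).sum()

    doubleton = pair(labels[:, :-1], labels[:, 1:]) + pair(labels[:-1, :], labels[1:, :])
    return singleton.sum() + doubleton
```

Lower energy means a more probable labeling under the Gibbs distribution, so segmentation amounts to minimizing this quantity.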

Page 15

Model parameters

- β: less dependent on the input ⇒ can be fixed a priori.
- Number of labels |Λ|: problem dependent ⇒ usually given by the user, or inferred from some higher level knowledge.
- Class parameters (mean, variance): estimated from the input image.

Page 16

Model parameters

- The class statistics (mean and variance) can be estimated via the empirical mean and variance:

  μ_λ = (1/|S_λ|) Σ_{s∈S_λ} f_s,    σ_λ² = (1/|S_λ|) Σ_{s∈S_λ} (f_s − μ_λ)²

  - where S_λ denotes the set of pixels in the training set of class λ
  - a training set consists of a representative region selected by the user
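A minimal sketch of this supervised estimation step, assuming the user-selected training regions are supplied as one boolean mask per class (function and argument names are illustrative, not from the slides):

```python
import numpy as np

def estimate_class_params(image, training_masks):
    """Empirical mean and standard deviation per class, computed from
    user-selected training regions (one boolean mask per class)."""
    mu, sigma = [], []
    for mask in training_masks:
        pixels = image[mask]          # feature values inside the training region
        mu.append(pixels.mean())
        sigma.append(pixels.std())    # square root of the empirical variance
    return np.array(mu), np.array(sigma)
```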

Page 17

With the Gaussian class models, the singleton potentials are negative log-likelihoods, so the energy of a labeling is

  U(ω, f) = Σ_s [ log(√(2π σ_{ω_s}²)) + (f_s − μ_{ω_s})² / (2σ_{ω_s}²) ] + Σ_{(i,j)} βδ(ω_i, ω_j)

and, since P(ω | f) ∝ (1/Z) exp(−U(ω)) = (1/Z) exp(−Σ_c V_c(ω)), the MAP estimate is obtained by energy minimization:

  ω̂ = arg max_ω P(ω | f) = arg min_ω U(ω, f)

Page 19

ICM (~Gradient descent) [Besag86]
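ICM greedily moves each pixel to the label with the lowest local energy and, like gradient descent, converges to a local minimum. A minimal sketch for the grayscale model above (my illustration; the 4-neighbourhood and argument names are assumptions):

```python
import numpy as np

def icm(labels, image, mu, sigma, beta, n_sweeps=10):
    """Iterated Conditional Modes: replace each pixel's label by the one
    minimizing its local (singleton + doubleton) energy."""
    H, W = labels.shape
    K = len(mu)
    for _ in range(n_sweeps):
        for y in range(H):
            for x in range(W):
                best, best_e = labels[y, x], np.inf
                for k in range(K):
                    # singleton: -log Gaussian likelihood of the pixel under class k
                    e = np.log(np.sqrt(2 * np.pi) * sigma[k]) \
                        + (image[y, x] - mu[k]) ** 2 / (2 * sigma[k] ** 2)
                    # doubleton: +/-beta with each 4-neighbour
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < H and 0 <= nx < W:
                            e += -beta if labels[ny, nx] == k else beta
                    if e < best_e:
                        best, best_e = k, e
                labels[y, x] = best
    return labels
```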

Page 20

Simulated Annealing

Page 21

Temperature Schedule

Page 22

Stopping criteria:
- Fixed number of iterations
- Energy change is less than a threshold
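The deck covers Simulated Annealing with a temperature schedule and the stopping criteria listed above. The following Metropolis-style sketch ties these together for the grayscale MRF energy; it is an illustration only, and the initial temperature, geometric cooling factor and threshold are arbitrary example values.

```python
import numpy as np

def simulated_annealing(labels, image, mu, sigma, beta,
                        T0=4.0, cooling=0.95, max_sweeps=200, eps=1e-3):
    """Metropolis-style simulated annealing: propose a random label at each
    pixel, accept with probability exp(-dU/T), cool T after every sweep."""
    rng = np.random.default_rng(0)
    H, W = labels.shape
    K = len(mu)

    def local_energy(y, x, k):
        # singleton: -log Gaussian likelihood of pixel (y, x) under class k
        e = np.log(np.sqrt(2 * np.pi) * sigma[k]) \
            + (image[y, x] - mu[k]) ** 2 / (2 * sigma[k] ** 2)
        # doubleton: -beta for agreeing, +beta for disagreeing 4-neighbours
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < H and 0 <= nx < W:
                e += -beta if labels[ny, nx] == k else beta
        return e

    T = T0
    for _ in range(max_sweeps):
        change = 0.0
        for y in range(H):
            for x in range(W):
                new = int(rng.integers(K))
                dU = local_energy(y, x, new) - local_energy(y, x, labels[y, x])
                if dU < 0 or rng.random() < np.exp(-dU / T):
                    labels[y, x] = new
                    change += abs(dU)
        T *= cooling                      # exponential temperature schedule
        if change / (H * W) < eps:        # stop when the energy barely changes
            break
    return labels
```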

Page 23

Download from: http://www.inf.u-szeged.hu/~kato/software/

Page 24

Optimization is just a tool, do not expect a good segmentation from a wrong model.

Page 25

What color features?

- RGB histogram
- CIE-L*u*v* histogram

Page 26

Extract Color Feature

- The CIE-L*u*v* color space is perceptually uniform:
  - color difference can be measured by the Euclidean distance of two color vectors.
- Convert the input image to CIE-L*u*v* space ⇒ we have 3 color feature images.
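For illustration, one way to obtain the three L*u*v* feature images is scikit-image's rgb2luv; the helper names below are mine and the input is assumed to be an RGB image:

```python
import numpy as np
from skimage import color

def luv_features(rgb_image):
    """Convert an RGB image to CIE-L*u*v* and return the three feature
    images (L*, u*, v*) as separate 2-D arrays."""
    luv = color.rgb2luv(rgb_image)        # shape (H, W, 3)
    return [luv[:, :, 0], luv[:, :, 1], luv[:, :, 2]]

def color_distance(p, q):
    """Perceptual color difference = Euclidean distance of two L*u*v* vectors."""
    return np.linalg.norm(np.asarray(p) - np.asarray(q))
```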

Page 27

Color MRF segmentation model

- Pixel labels (or classes) are represented by three-variate Gaussian distributions (mean vector μ_{ω_s}, covariance matrix Σ_{ω_s}):

  P(f_s | ω_s) = (1 / √((2π)^n |Σ_{ω_s}|)) exp(−½ (f_s − μ_{ω_s})ᵀ Σ_{ω_s}⁻¹ (f_s − μ_{ω_s})),  n = 3

- Clique potentials:
  - Singleton: proportional to the likelihood of features given ω: log(P(f | ω)).
  - Doubleton: favours similar labels at neighbouring pixels – smoothness prior:

    V_c(i, j) = βδ(ω_i, ω_j) = −β if ω_i = ω_j, +β if ω_i ≠ ω_j

  As β increases, the segmented regions become more homogeneous.
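A small sketch (an illustration with assumed argument layout, not from the slides) of the corresponding singleton energies using SciPy's multivariate normal, with pixels flattened to an (N, 3) array of L*u*v* vectors:

```python
import numpy as np
from scipy.stats import multivariate_normal

def color_singleton_energy(luv_pixels, means, covs):
    """Singleton energies -log P(f_s | class k) for every pixel and class,
    using trivariate Gaussians over the (L*, u*, v*) feature vectors.
    luv_pixels: (N, 3); means[k]: (3,); covs[k]: (3, 3)."""
    energies = np.stack(
        [-multivariate_normal(mean=m, cov=c).logpdf(luv_pixels)
         for m, c in zip(means, covs)],
        axis=1)                           # shape (N, K)
    return energies
```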

Page 28

- Optimization is just a tool, do not expect a good segmentation from a wrong model.
- Can we segment images without user interaction?
  - Yes, but you need to estimate model parameters automatically (EM algorithm).

Page 29

Incomplete data problem

- With complete data (e.g. somebody manually assigned labels to pixels), parameter estimation is straightforward.
- Without labels, we face an incomplete data problem; a classical solution is called Expectation-Maximization (EM):
  - it assigns labels and estimates parameters simultaneously
  - a chicken-and-egg problem

Page 30

EM principles: the two steps

- Parameters: P(pixel | label), P(label)
- Weighted labeling: P(label | pixel)
- E step: for each pixel, use the current parameters to compute the probability distribution over labels.
- M step: update the estimates of the parameters based on the weighted (or "soft") labeling.

Page 31

The basic idea of EM

- Each of the two subproblems is straightforward assuming the other one is solved:
  - If the labeling is known, we can estimate the parameters, similar to supervised learning (hard vs. soft labeling).
  - If the parameters are known, we can assign a label to each pixel by Maximum Likelihood – i.e. using the singleton energies only, without pairwise interactions (see the sketch below).
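A minimal sketch of that Maximum-Likelihood labeling step for the grayscale model (illustrative names, not the authors' code):

```python
import numpy as np

def ml_labeling(image, mu, sigma):
    """Maximum-likelihood labeling: each pixel gets the class with the
    smallest singleton energy, ignoring pairwise interactions."""
    # singleton energy of every pixel for every class, shape (H, W, K)
    e = (np.log(np.sqrt(2 * np.pi) * sigma)
         + (image[..., None] - mu) ** 2 / (2 * sigma ** 2))
    return e.argmin(axis=-1)
```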

Page 32

Parameter estimation via EM

- Basically, we will fit a mixture of Gaussians to the image histogram; the labels correspond to the mixture components.
- The soft label l_s^λ specifies the contribution of the pixel feature to each of the labels – i.e. a soft labeling.

Page 33

Parameter estimation via EM

- E step: recompute the soft labels l_s^λ at each pixel s:

  l_s^λ = P(λ | f_s) = P(f_s | λ) P(λ) / Σ_{λ'} P(f_s | λ') P(λ')

- M step: update the Gaussian parameters for each class λ from the soft labels:

  P(λ) = (1/|S|) Σ_{s∈S} P(λ | f_s),   μ_λ = Σ_s P(λ | f_s) f_s / Σ_s P(λ | f_s),   σ_λ² = Σ_s P(λ | f_s) (f_s − μ_λ)² / Σ_s P(λ | f_s)
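For concreteness, a compact NumPy sketch of these E/M updates for a one-dimensional (grayscale) feature; the initialisation and the small epsilon guards are my own choices rather than part of the slides:

```python
import numpy as np

def em_fit(pixels, K, n_iter=50):
    """Fit a K-component Gaussian mixture to the pixel features with EM.
    Returns class priors, means, standard deviations and the soft labels."""
    pixels = pixels.ravel().astype(float)
    # crude initialisation: spread the means over the intensity range
    mu = np.linspace(pixels.min(), pixels.max(), K)
    sigma = np.full(K, pixels.std() / K + 1e-6)
    prior = np.full(K, 1.0 / K)

    for _ in range(n_iter):
        # E step: soft labels l_s^k = P(k | f_s)
        lik = np.exp(-(pixels[:, None] - mu) ** 2 / (2 * sigma ** 2)) \
              / (np.sqrt(2 * np.pi) * sigma)
        resp = lik * prior
        resp /= resp.sum(axis=1, keepdims=True) + 1e-12

        # M step: update priors, means and variances from the soft labels
        nk = resp.sum(axis=0) + 1e-12
        prior = nk / len(pixels)
        mu = (resp * pixels[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6

    return prior, mu, sigma, resp
```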

Page 34

- Optimization is just a tool, do not expect a good segmentation from a wrong model.
- Extension to color is relatively straightforward.
- You need to estimate model parameters automatically (EM algorithm).
- Fully automatic segmentation requires
  - modeling of the parameters, AND
  - a more sophisticated sampling algorithm (Reversible Jump MCMC).

Page 35

MRF+RJMCMC vs JSEG

JSEG proceeds in two steps:
1. Color quantization: colors are quantized to several representative classes that can be used to differentiate regions in the image.
2. Spatial segmentation: a region growing method is then used to segment the image.

Page 36

Berkeley Segmentation Dataset

(Figure: segmentation results on Berkeley images, RJMCMC vs. JSEG.)

Page 37

- Design your model carefully.
  - Optimization is just a tool, do not expect a good segmentation from a wrong model.
- What about features other than graylevel?
  - Extension to color is relatively straightforward.
- Can we segment images without user interaction?
  - Yes, but you need to estimate model parameters automatically (EM algorithm).
- What if we do not know |Λ|?
  - Fully automatic segmentation requires
    - modeling of the parameters, AND
    - a more sophisticated sampling algorithm (Reversible Jump MCMC).
- Can we segment more complex images?
  - Yes, but you need a more complex MRF model.

Page 38

- Combine different segmentation cues:
  - Color & Texture [ICPR2002, ICIP2003]
  - Color & Motion [ACCV2006, ICIP2007]
- Multiple cues are perceived simultaneously and then they are integrated by the human visual system [Kersten et al., Annu. Rev. Psychol. 2004].
- Therefore different image features have to be handled in a parallel fashion ⇒ a multi-layer Markovian framework.

Page 39

- Clique potentials define the local interaction strength.
- MAP ⇔ energy minimization of U(ω).
- Hammersley-Clifford Theorem:

  P(ω) = (1/Z) exp(−U(ω)) = (1/Z) exp(−Σ_{c∈C} V_c(ω))

- Model ⇔ definition of the clique potentials.
- (Figure: model layers, including the texture layer.)

Page 40

Texture Layer: MRF model

- Gabor features are good at discriminating strong-ordered textures.
- MRSAR features are good at discriminating weak-ordered (or random) textures.
- The number of texture feature images depends on the size of the image and other parameters.
- Most of these do not contain useful information ⇒ select feature images with high discriminating power.

Page 41

Examples of Texture Features

(Figures: MRSAR feature images and Gabor feature images.)

Page 42

Combined Layer: Labels

- A label on the combined layer consists of a pair of color and texture/motion labels: η_s = <η_s^c, η_s^m>, where η_s^c ∈ L^c and η_s^m ∈ L^m.
- The number of possible classes is |L^c| × |L^m|.
- The combined layer selects the most likely ones.

Page 43

Combined Layer: Singleton potential

- Based on the percentage of labels belonging to each class at the corresponding feature layer.
- Weakly supported classes will be penalized and removed to get a lower energy.
- Mean value is a guess about the number of classes; variance is the confidence.

Page 44

Combined Layer: Doubleton potential

Three cases for a neighbouring pair (r, s) of combined-layer labels:
1. Similar color and motion/texture labels.
2. Different color and motion/texture labels.
3. Similar color (resp. motion/texture) and different motion/texture (resp. color) labels. These are contours visible only at one feature layer.

The potential takes a different value in each case:

  δ(η_r, η_s) = −α if η_r^c = η_s^c and η_r^m = η_s^m,
                0 if η_r^c ≠ η_s^c and η_r^m ≠ η_s^m,
                +α if η_r^c = η_s^c and η_r^m ≠ η_s^m, or η_r^c ≠ η_s^c and η_r^m = η_s^m.

Page 45

Inter-layer clique potential

- Inter-layer cliques connect sites of a feature layer and the combined layer.
- The potential is based on the difference of the singleton potentials at the corresponding feature layer.
- Prefers ω_s and η_s having the same label, since they represent the labeling of the same pixel.
- Prefers ω_s and η_r having the same label, since we expect the combined and feature layers to be homogeneous.

Page 46

Color Textured Segmentation

(Figure: example images with color-only, texture-only, and combined color & texture segmentations.)

Page 47

Color Textured Segmentation

(Figures: color segmentation vs. multi-cue segmentation, showing the texture layer, color layer, and combined layer results.)

Page 48

Color & Motion Segmentation

Page 49

http://www.inf.u-szeged.hu/~kato/

