Probabilistic Graphical Models: Summary

Why PGMs?
• PGMs are the marriage of statistics and computer science
— statistics: sound probabilistic foundations
— computer science: data structures and algorithms for exploiting them
Daphne Koller

Declarative Representation
[Figure: a declarative model sits at the center; it is constructed by a domain expert and/or by learning from data, and is then consumed by inference algorithms]

When PGMs?
• when we have noisy data and uncertainty
• when we have lots of prior knowledge
• when we wish to reason about multiple variables
• when we want to construct richly structured models from modular building blocks

Intertwined Design Choices
• representation
— affects cost of inference & learning
• inference algorithm
— used as a subroutine in learning
— some are only usable in certain types of models
• learning algorithm
— learnability imposes modeling constraints
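The point that inference is "used as a subroutine in learning" can be made concrete: the gradient of an MRF's log-likelihood contains a model expectation that only inference can supply. Below is a minimal sketch on a tiny chain MRF, using exact enumeration as the inference subroutine; the model, parameter, and the empirical statistic are invented for illustration and are not from the lecture.

```python
import itertools
import math

def log_partition(theta, n_vars=3):
    """Log Z for a binary chain MRF whose only feature is the
    number of agreeing neighbor pairs, weighted by theta."""
    total = 0.0
    for x in itertools.product([0, 1], repeat=n_vars):
        score = theta * sum(x[i] == x[i + 1] for i in range(n_vars - 1))
        total += math.exp(score)
    return math.log(total)

def edge_marginal(theta, n_vars=3):
    """Expected number of agreeing edges under the model --
    the inference result the learning gradient needs."""
    logZ = log_partition(theta, n_vars)
    expect = 0.0
    for x in itertools.product([0, 1], repeat=n_vars):
        agree = sum(x[i] == x[i + 1] for i in range(n_vars - 1))
        expect += agree * math.exp(theta * agree - logZ)
    return expect

# Gradient of the average log-likelihood w.r.t. theta:
# empirical feature count minus model expectation.
data_agreement = 1.5   # hypothetical empirical statistic from data
theta = 0.0
grad = data_agreement - edge_marginal(theta)
```

Exact enumeration is only viable for tiny models; in realistic models this inner inference step is what dominates the cost of learning, which is exactly why the two design choices are intertwined.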

Example: Image Segmentation
• BNs vs. MRFs vs. CRFs
— naturalness of model
— using rich features
— inference costs
— training cost
— learn with missing data
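The "using rich features" bullet is the key practical difference between generative MRFs/BNs and CRFs: because a CRF only models p(S | X), its node potentials may depend on arbitrary functions of the whole observed image. A hypothetical 1-D sketch (the pixel values, feature, and smoothing weight are all made up):

```python
import itertools

# A 1-D "image" of four pixels; we segment it into labels {0, 1}.
pixels = [0.2, 0.9, 0.8, 0.1]

def node_score(label, i):
    # Rich conditional feature: compare pixel i to the GLOBAL image mean.
    # Easy in a CRF (we condition on all of x); awkward in a generative model.
    mean = sum(pixels) / len(pixels)
    feat = pixels[i] - mean
    return feat if label == 1 else -feat

def crf_score(labels, smooth=0.5):
    # Unnormalized log-score: rich node features + pairwise smoothing.
    node = sum(node_score(s, i) for i, s in enumerate(labels))
    pair = smooth * sum(labels[i] == labels[i + 1]
                        for i in range(len(labels) - 1))
    return node + pair

# MAP labeling by exhaustive enumeration (feasible only for toy sizes).
best = max(itertools.product([0, 1], repeat=4), key=crf_score)
```

Here the bright middle pixels get label 1 and the dark border pixels label 0; with a larger smoothing weight the pairwise term would start overriding the node evidence, illustrating the trade-offs the slide lists.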

Mix & Match: Modeling
• mix directed & undirected edges
• e.g., image segmentation from unlabeled images
— undirected edges over labels S — no natural directionality
— directed for p(x_i | s_i) — easy learning (w/o inference)
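The hybrid on this slide can be written down directly. Here is a toy sketch of the assumed form: an undirected smoothness prior over the labels S multiplied by directed per-pixel terms p(x_i | s_i); the Gaussian means and the smoothing weight are invented for illustration.

```python
# Hybrid directed/undirected model (sketch):
#   joint score ∝ exp(smoothness(S)) * Π_i p(x_i | s_i)

def label_prior_score(labels, smooth=1.0):
    # Undirected part: reward agreeing neighbors; no edge direction
    # between labels is natural, so none is imposed.
    return smooth * sum(labels[i] == labels[i + 1]
                        for i in range(len(labels) - 1))

def obs_loglik(x, s):
    # Directed part: a per-pixel Gaussian log-likelihood p(x_i | s_i)
    # (unit variance, constants dropped). Given labeled pairs, its
    # parameters have closed-form estimates -- learning w/o inference.
    mean = 0.8 if s == 1 else 0.2
    return -0.5 * (x - mean) ** 2

def joint_log_score(labels, pixels):
    return label_prior_score(labels) + sum(
        obs_loglik(x, s) for x, s in zip(pixels, labels))
```

The modularity is the point: the directed pieces train locally and cheaply, while the undirected prior contributes structure at inference time.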

Mix & Match: Inference
• apply different inference algorithms to different parts of model
• e.g., combine approximate inference (BP or MCMC) with exact inference over subsets of variables

Mix & Match: Learning
• apply different learning algorithms to different parts of model
• e.g., combine a high-accuracy, easily trained model (e.g., SVM) for node potentials p(S_i | X) with CRF learning for higher-order potentials
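A minimal sketch of that recipe, with assumed interfaces: per-node label probabilities from some separately trained classifier (a stand-in for the SVM) are frozen as node potentials, and only the pairwise CRF weight is then fit, here by a coarse grid search on one tiny labeled sequence. All numbers are hypothetical.

```python
import itertools
import math

# Hypothetical p(s_i = 1 | X) from a separately trained classifier.
node_probs = [0.9, 0.6, 0.55, 0.2]
true_labels = (1, 1, 1, 0)

def crf_log_prob(labels, w):
    """Log-probability of a labeling under a chain CRF whose node
    potentials are the frozen classifier outputs and whose single
    learnable parameter w weights neighbor agreement."""
    def score(s):
        node = sum(math.log(p if si == 1 else 1 - p)
                   for p, si in zip(node_probs, s))
        pair = w * sum(s[i] == s[i + 1] for i in range(len(s) - 1))
        return node + pair
    logZ = math.log(sum(math.exp(score(s))
                        for s in itertools.product([0, 1], repeat=4)))
    return score(labels) - logZ

# Fit only w, by grid search over a few candidate values.
best_w = max([0.0, 0.5, 1.0, 1.5, 2.0],
             key=lambda w: crf_log_prob(true_labels, w))
```

The division of labor mirrors the slide: the node model is trained by whatever discriminative method is most accurate, and CRF learning is reserved for the structural parameters.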

D. Bohus & E. Horvitz. Situated Interaction project, Microsoft Research, 2010.

Summary
• integrated framework for reasoning and learning in complex, uncertain domains
— large bag of tools within a single framework
• used in a huge range of applications
• much work to be done, both on applications and on foundational methods


Professor Daphne Koller is offering a free online course on Probabilistic Graphical Models starting March 19, 2012. http://www.pgm-class.org/

Offered by Coursera: https://www.coursera.org/