CS 188: Artificial Intelligence
Bayes Nets: Sampling
Instructors: Dan Klein and Pieter Abbeel, University of California, Berkeley
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]

Bayes Net Representation
A directed, acyclic graph, one node per random variable.
A conditional probability table (CPT) for each node: a collection of distributions over X, one for each combination of the parents' values.
Bayes nets implicitly encode joint distributions as a product of local conditional distributions. To see what probability a BN gives to a full assignment, multiply all the relevant conditionals together:
  P(x1, x2, ..., xn) = prod_i P(xi | parents(Xi))

Variable Elimination
Interleave joining and marginalizing.
d^k entries are computed for a factor over k variables with domain sizes d.
The ordering of elimination of hidden variables can affect the size of the factors generated.
Worst case: running time exponential in the size of the Bayes net.

Approximate Inference: Sampling

Sampling
Sampling is a lot like repeated simulation: predicting the weather, basketball games, and so on.
Basic idea: draw N samples from a sampling distribution S, compute an approximate posterior probability, and show that this converges to the true probability P.
Why sample? Learning: get samples from a distribution you don't know. Inference: getting a sample is faster than computing the right answer (e.g. with variable elimination).

Sampling from a Given Distribution
Step 1: Get a sample u from the uniform distribution over [0, 1), e.g. random() in Python.
Step 2: Convert this sample u into an outcome for the given distribution by associating each target outcome with a sub-interval of [0, 1) whose size equals the probability of the outcome.

Example:
  C      P(C)
  red    0.6
  green  0.1
  blue   0.3

Here red owns [0, 0.6), green owns [0.6, 0.7), and blue owns [0.7, 1.0). If random() returns u = 0.83, then our sample is C = blue. E.g., after sampling 8 times, the empirical frequencies of red, green, and blue approximate P(C).
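A minimal Python sketch of these two steps (the function name and the dictionary encoding of P(C) are illustrative, not from the slides):

```python
import random

def sample_from_distribution(dist):
    """Draw one outcome from a discrete distribution given as {outcome: probability}.

    Each outcome owns a sub-interval of [0, 1) whose width equals its probability;
    the uniform draw u falls into exactly one of those sub-intervals.
    """
    u = random.random()           # Step 1: u ~ Uniform[0, 1)
    cumulative = 0.0
    for outcome, p in dist.items():
        cumulative += p
        if u < cumulative:        # Step 2: u landed in this outcome's sub-interval
            return outcome
    return outcome                # guard against floating-point round-off

P_C = {"red": 0.6, "green": 0.1, "blue": 0.3}
print([sample_from_distribution(P_C) for _ in range(8)])
```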

Sampling in Bayes Nets
Prior Sampling
Rejection Sampling
Likelihood Weighting
Gibbs Sampling

Prior Sampling
[Figure: the Cloudy -> Sprinkler, Cloudy -> Rain, {Sprinkler, Rain} -> WetGrass network, with CPTs:]
  P(C):      +c 0.5,  -c 0.5
  P(S | C):  +c: +s 0.1, -s 0.9;   -c: +s 0.5, -s 0.5
  P(R | C):  +c: +r 0.8, -r 0.2;   -c: +r 0.2, -r 0.8
  P(W | S, R):  +s, +r: +w 0.99, -w 0.01;   +s, -r: +w 0.90, -w 0.10
                -s, +r: +w 0.90, -w 0.10;   -s, -r: +w 0.01, -w 0.99
Samples: +c, -s, +r, +w and -c, +s, -r, +w

Prior Sampling
For i = 1, 2, ..., n:
  Sample xi from P(Xi | Parents(Xi))
Return (x1, x2, ..., xn)

Prior Sampling
This process generates samples with probability
  S_PS(x1, ..., xn) = prod_i P(xi | Parents(Xi)) = P(x1, ..., xn),
i.e. the BN's joint probability. Let N_PS(x1, ..., xn) be the number of samples of an event. Then
  lim_{N -> inf} N_PS(x1, ..., xn) / N = S_PS(x1, ..., xn) = P(x1, ..., xn),
i.e., the sampling procedure is consistent.
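A rough Python sketch of prior sampling on this network, using an illustrative dictionary encoding of the CPTs above:

```python
import random

# CPTs of the network above, keyed by the parents' values
# (this particular encoding is illustrative, not from the slides).
P_C = {"+c": 0.5, "-c": 0.5}
P_S = {"+c": {"+s": 0.1, "-s": 0.9}, "-c": {"+s": 0.5, "-s": 0.5}}
P_R = {"+c": {"+r": 0.8, "-r": 0.2}, "-c": {"+r": 0.2, "-r": 0.8}}
P_W = {("+s", "+r"): {"+w": 0.99, "-w": 0.01}, ("+s", "-r"): {"+w": 0.90, "-w": 0.10},
       ("-s", "+r"): {"+w": 0.90, "-w": 0.10}, ("-s", "-r"): {"+w": 0.01, "-w": 0.99}}

def sample(dist):
    """Draw a value from a {value: probability} dictionary via a uniform u in [0, 1)."""
    u, cumulative = random.random(), 0.0
    for value, p in dist.items():
        cumulative += p
        if u < cumulative:
            return value
    return value

def prior_sample():
    """Sample each variable in topological order from P(Xi | Parents(Xi))."""
    c = sample(P_C)
    s = sample(P_S[c])
    r = sample(P_R[c])
    w = sample(P_W[(s, r)])
    return c, s, r, w

# Estimate P(W) by counting samples, as in the example that follows.
N = 10000
n_wet = sum(1 for _ in range(N) if prior_sample()[3] == "+w")
print("P(+w) is approximately", n_wet / N)
```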

Example
We'll get a bunch of samples from the BN:
  +c, -s, +r, +w
  +c, +s, +r, +w
  -c, +s, +r, -w
  +c, -s, +r, +w
  -c, -s, -r, +w
If we want to know P(W), we have counts <+w: 4, -w: 1>. Normalize to get P(W) = <+w: 0.8, -w: 0.2>. This will get closer to the true distribution with more samples. We can estimate anything else, too: what about P(C | +w)? P(C | +r, +w)? P(C | -r, -w)? Fast: we can use fewer samples if there is less time (what's the drawback?).

Rejection Sampling
Let's say we want P(C). There is no point keeping all the samples around: just tally counts of C as we go.

Let's say we want P(C | +s). Same thing: tally C outcomes, but ignore (reject) samples which don't have S = +s. This is called rejection sampling. It is also consistent for conditional probabilities (i.e., correct in the limit).
  +c, -s, +r, +w   (rejected: -s)
  +c, +s, +r, +w
  -c, +s, +r, -w
  +c, -s, +r, +w   (rejected: -s)
  -c, -s, -r, +w   (rejected: -s)

Rejection Sampling
Input: evidence instantiation
For i = 1, 2, ..., n:
  Sample xi from P(Xi | Parents(Xi))
  If xi is not consistent with the evidence:
    Reject: return, and no sample is generated in this cycle
Return (x1, x2, ..., xn)
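A minimal sketch of rejection sampling for the query P(C | +s), restricted to the two CPTs the query needs (the encoding and function names are illustrative):

```python
import random

def sample(dist):
    """Draw a value from a {value: probability} dictionary."""
    u, cumulative = random.random(), 0.0
    for value, p in dist.items():
        cumulative += p
        if u < cumulative:
            return value
    return value

# Only P(C) and P(S | C) matter for estimating P(C | +s).
P_C = {"+c": 0.5, "-c": 0.5}
P_S = {"+c": {"+s": 0.1, "-s": 0.9}, "-c": {"+s": 0.5, "-s": 0.5}}

def rejection_sample_C_given_s(n_samples):
    """Sample (C, S) from the prior and reject any sample inconsistent with S = +s."""
    counts = {"+c": 0, "-c": 0}
    for _ in range(n_samples):
        c = sample(P_C)
        s = sample(P_S[c])
        if s != "+s":
            continue              # reject: not consistent with the evidence
        counts[c] += 1
    kept = sum(counts.values())
    return {c: k / kept for c, k in counts.items()}

# The estimate approaches P(+c | +s) = 0.5*0.1 / (0.5*0.1 + 0.5*0.5) = 1/6.
print(rejection_sample_C_given_s(100000))
```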

Likelihood Weighting
Problem with rejection sampling: if the evidence is unlikely, it rejects lots of samples, and the evidence is not exploited as you sample. Consider P(Shape | blue): sampling (Shape, Color) pairs such as (pyramid, green), (pyramid, red), (sphere, blue), (cube, red), (sphere, green) throws away everything except the blue ones.
Idea: fix the evidence variables and sample the rest, e.g. (pyramid, blue), (pyramid, blue), (sphere, blue), (cube, blue), (sphere, blue).
Problem: the sample distribution is not consistent!
Solution: weight each sample by the probability of the evidence given its parents.

Likelihood Weighting
[Figure: the same Cloudy/Sprinkler/Rain/WetGrass network and CPTs as above, with the evidence variables fixed and the remaining variables sampled.]
Samples: +c, +s, +r, +w

Likelihood Weighting
Input: evidence instantiation
w = 1.0
For i = 1, 2, ..., n:
  If Xi is an evidence variable:
    Xi = observation xi for Xi
    Set w = w * P(xi | Parents(Xi))
  Else:
    Sample xi from P(Xi | Parents(Xi))
Return (x1, x2, ..., xn), w
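A minimal sketch of likelihood weighting for the same query P(C | +s), with an illustrative CPT encoding; note that the evidence variable is fixed and its probability is folded into the weight instead of causing rejections:

```python
import random

def sample(dist):
    """Draw a value from a {value: probability} dictionary."""
    u, cumulative = random.random(), 0.0
    for value, p in dist.items():
        cumulative += p
        if u < cumulative:
            return value
    return value

P_C = {"+c": 0.5, "-c": 0.5}
P_S = {"+c": {"+s": 0.1, "-s": 0.9}, "-c": {"+s": 0.5, "-s": 0.5}}

def weighted_sample_C_given_s():
    """One likelihood-weighted sample for the query P(C | S = +s)."""
    w = 1.0
    c = sample(P_C)               # C is not evidence: sample it from P(C)
    w *= P_S[c]["+s"]             # S is evidence: fix S = +s and multiply in P(+s | c)
    return c, w

def lw_estimate(n_samples):
    totals = {"+c": 0.0, "-c": 0.0}
    for _ in range(n_samples):
        c, w = weighted_sample_C_given_s()
        totals[c] += w            # accumulate weights rather than raw counts
    z = sum(totals.values())
    return {c: t / z for c, t in totals.items()}

print(lw_estimate(100000))        # approaches P(+c | +s) = 1/6, P(-c | +s) = 5/6
```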

Likelihood Weighting
Sampling distribution if z is sampled and e is fixed evidence:
  S_WS(z, e) = prod_i P(zi | Parents(Zi))
Now, samples have weights:
  w(z, e) = prod_i P(ei | Parents(Ei))
Together, the weighted sampling distribution is consistent:
  S_WS(z, e) * w(z, e) = prod_i P(zi | Parents(Zi)) * prod_i P(ei | Parents(Ei)) = P(z, e)

Likelihood Weighting
Likelihood weighting is good: we have taken the evidence into account as we generate the sample. E.g. here, W's value will get picked based on the evidence values of S, R. More of our samples will reflect the state of the world suggested by the evidence.
Likelihood weighting doesn't solve all our problems: evidence influences the choice of downstream variables, but not upstream ones (C isn't more likely to get a value matching the evidence). We would like to consider evidence when we sample every variable; this leads to Gibbs sampling.

Gibbs Sampling
Procedure: keep track of a full instantiation x1, x2, ..., xn. Start with an arbitrary instantiation consistent with the evidence. Sample one variable at a time, conditioned on all the rest, but keep the evidence fixed. Keep repeating this for a long time.
Property: in the limit of repeating this infinitely many times, the resulting samples come from the correct distribution (i.e. conditioned on the evidence).
Rationale: both upstream and downstream variables condition on the evidence. In contrast, likelihood weighting only conditions on upstream evidence, and hence the weights obtained in likelihood weighting can sometimes be very small. The sum of weights over all samples is indicative of how many effective samples were obtained, so we want high weight.

Gibbs Sampling Example: P(S | +r)
Step 1: Fix the evidence, R = +r.
Step 2: Initialize the other variables randomly.

Step 3: Repeat: choose a non-evidence variable X and resample it from P(X | all other variables).
[Figure: a sequence of full instantiations of C, S, R, W with R fixed at +r, as one non-evidence variable is resampled at each step.]

Efficient Resampling of One Variable
Sample from P(S | +c, +r, -w):
  P(S | +c, +r, -w) = P(S, +c, +r, -w) / P(+c, +r, -w)
                    = P(+c) P(S | +c) P(+r | +c) P(-w | S, +r) / sum_s P(+c) P(s | +c) P(+r | +c) P(-w | s, +r),
which is proportional to P(S | +c) P(-w | S, +r).
Many things cancel out: only the CPTs with S remain! More generally, only the CPTs that contain the resampled variable need to be considered, and they are joined together.
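A rough sketch of Gibbs sampling for P(S | +r) on this network, using per-variable conditionals derived as above (the data structures, function names, and random-scan variable choice are illustrative):

```python
import random

# CPTs of the Cloudy/Sprinkler/Rain/WetGrass network, same numbers as the
# figure earlier in these slides (the encoding itself is illustrative).
P_C = {"+c": 0.5, "-c": 0.5}
P_S = {"+c": {"+s": 0.1, "-s": 0.9}, "-c": {"+s": 0.5, "-s": 0.5}}
P_R = {"+c": {"+r": 0.8, "-r": 0.2}, "-c": {"+r": 0.2, "-r": 0.8}}
P_W = {("+s", "+r"): {"+w": 0.99, "-w": 0.01}, ("+s", "-r"): {"+w": 0.90, "-w": 0.10},
       ("-s", "+r"): {"+w": 0.90, "-w": 0.10}, ("-s", "-r"): {"+w": 0.01, "-w": 0.99}}

def sample(dist):
    """Draw a value from a {value: probability} dictionary."""
    u, cumulative = random.random(), 0.0
    for value, p in dist.items():
        cumulative += p
        if u < cumulative:
            return value
    return value

def resample_S(c, r, w):
    """P(S | c, r, w) is proportional to P(S | c) * P(w | S, r): only CPTs with S remain."""
    weights = {s: P_S[c][s] * P_W[(s, r)][w] for s in ("+s", "-s")}
    z = sum(weights.values())
    return sample({s: wt / z for s, wt in weights.items()})

def resample_C(s, r):
    """P(C | s, r, w) is proportional to P(C) * P(s | C) * P(r | C); W's CPT has no C."""
    weights = {c: P_C[c] * P_S[c][s] * P_R[c][r] for c in ("+c", "-c")}
    z = sum(weights.values())
    return sample({c: wt / z for c, wt in weights.items()})

def resample_W(s, r):
    """The only CPT mentioning W is P(W | S, R)."""
    return sample(P_W[(s, r)])

def gibbs_estimate_S_given_r(n_steps, burn_in=1000):
    """Estimate P(S | R = +r): keep +r fixed, resample one non-evidence variable per step."""
    c, s, r, w = "+c", "-s", "+r", "+w"      # arbitrary start consistent with the evidence
    counts = {"+s": 0, "-s": 0}
    for t in range(n_steps):
        var = random.choice(["C", "S", "W"]) # choose a non-evidence variable to resample
        if var == "C":
            c = resample_C(s, r)
        elif var == "S":
            s = resample_S(c, r, w)
        else:
            w = resample_W(s, r)
        if t >= burn_in:
            counts[s] += 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

print(gibbs_estimate_S_given_r(50000))       # approaches P(+s | +r) = 0.18
```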

Bayes Net Sampling Summary
Prior Sampling: P(Q)
Rejection Sampling: P(Q | e)
Likelihood Weighting: P(Q | e)
Gibbs Sampling: P(Q | e)

Further Reading on Gibbs Sampling*
Gibbs sampling produces a sample from the query distribution P(Q | e) in the limit of re-sampling infinitely often.
Gibbs sampling is a special case of more general methods called Markov chain Monte Carlo (MCMC) methods.
Metropolis-Hastings is one of the more famous MCMC methods (in fact, Gibbs sampling is a special case of Metropolis-Hastings).
You may read about Monte Carlo methods; they're just sampling.
