
Commit 40c907d

Adds enumeration ask
1 parent b24966f commit 40c907d

File tree

1 file changed

+205
-10
lines changed


notebooks/ProbabilisticReasoning.ipynb

Lines changed: 205 additions & 10 deletions
@@ -30,13 +30,13 @@
3030
},
3131
{
3232
"cell_type": "code",
33-
"execution_count": 2,
33+
"execution_count": 3,
3434
"metadata": {},
3535
"outputs": [
3636
{
3737
"data": {
3838
"application/vnd.jupyter.widget-view+json": {
39-
"model_id": "d2cf7a35-5540-4e74-ac94-07cb97cbaf68",
39+
"model_id": "05dc6698-2418-49a5-89a2-606bc131197f",
4040
"version_major": 2,
4141
"version_minor": 0
4242
},
@@ -167,26 +167,24 @@
167167
},
168168
{
169169
"cell_type": "code",
170-
"execution_count": 12,
170+
"execution_count": 4,
171171
"metadata": {},
172172
"outputs": [
173173
{
174174
"name": "stdout",
175175
"output_type": "stream",
176176
"text": [
177177
"Random Variables = [Cavity, Toothache, Catch]\n",
178-
"The cavity Node: Cavity\n",
179-
"The toothache Node: Toothache\n",
180-
"The catch Node: Catch\n"
178+
"The cavity Node: Cavity\n"
181179
]
182180
},
183181
{
184182
"data": {
185183
"text/plain": [
186-
"aima.core.probability.bayes.impl.BayesNet@5a94841e"
184+
"aima.core.probability.bayes.impl.BayesNet@3e549709"
187185
]
188186
},
189-
"execution_count": 12,
187+
"execution_count": 4,
190188
"metadata": {},
191189
"output_type": "execute_result"
192190
}
@@ -299,7 +297,7 @@
299297
},
300298
{
301299
"cell_type": "code",
302-
"execution_count": 18,
300+
"execution_count": 5,
303301
"metadata": {},
304302
"outputs": [
305303
{
@@ -320,7 +318,7 @@
320318
"null"
321319
]
322320
},
323-
"execution_count": 18,
321+
"execution_count": 5,
324322
"metadata": {},
325323
"output_type": "execute_result"
326324
}
@@ -359,6 +357,203 @@
359357
" + model.posterior(cavity,toothache));"
360358
]
361359
},
360+
{
361+
"cell_type": "markdown",
362+
"metadata": {},
363+
"source": [
364+
"## Exact Inference in Bayesian Networks\n",
365+
"The basic task for any probabilistic inference system is to compute the posterior probability\n",
366+
"distribution for a set of query variables, given some observed event—that is, some assignment of values to a set of evidence variables.We will use the notation from the previous notebook: X denotes the query variable; **E**\n",
367+
"denotes the set of evidence variables $E_1 , . . . , E_m$ , and **e** is a particular observed event; Y will\n",
368+
"denote the nonevidence, nonquery variables $Y_1 , . . . , Y_l$ (called the hidden variables). Thus,\n",
369+
"the complete set of variables is $X = \\{X\\} \\cup E \\cup Y$. A typical query asks for the posterior\n",
370+
"probability distribution $P(X | \\textbf{e})$."
371+
]
372+
},
373+
{
374+
"cell_type": "markdown",
375+
"metadata": {},
376+
"source": [
377+
"### Inference by enumeration\n",
378+
"We proved in the previous notebook that any conditional probability can be computed by using the full joint distribution. Mathematically:\n",
379+
"$$ \\textbf{P}(X|\\textbf{e}) = \\alpha \\textbf{P}(X,\\textbf{e}) = \\alpha \\sum_{y}\\textbf{P}(X|\\textbf{e},\\textbf{y})$$"
380+
]
381+
},
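For concreteness, here is the same computation carried out by hand for the query **P**(Cavity | toothache) on the Toothache-Cavity-Catch network used later in this notebook. The CPT values assumed here (P(cavity) = 0.2, P(toothache|cavity) = 0.6, P(toothache|¬cavity) = 0.1) are consistent with the distributions printed by the model further down. With Catch as the only hidden variable:

$$\begin{aligned} \textbf{P}(Cavity \mid toothache) &= \alpha \sum_{catch}\textbf{P}(Cavity)\,P(toothache \mid Cavity)\,P(catch \mid Cavity)\\ &= \alpha\,\textbf{P}(Cavity)\,P(toothache \mid Cavity)\underbrace{\sum_{catch}P(catch \mid Cavity)}_{=\,1}\\ &= \alpha\,\langle 0.2 \times 0.6,\; 0.8 \times 0.1\rangle = \alpha\,\langle 0.12,\; 0.08\rangle = \langle 0.6,\; 0.4\rangle \end{aligned}$$

which agrees with the posterior of 0.6 computed by the enumeration-based model later in the notebook.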
382+
{
383+
"cell_type": "markdown",
384+
"metadata": {},
385+
"source": [
386+
"Also, we know that a Bayesian Network can give a complete representation of a full joint distribution. Hence, the above sum is calculable from a Bayesian Network. Now, let's have a look at the entire process. Let b, j and m be particular values of random variables *B*, *J* and *M*. Let *E* and *A* be the hidden variables. Then by using the sum and product rules of probability we get:\n",
387+
"$$P(b|j,m) = \\alpha P(b) \\sum_{e}P(e)\\sum_{a}P(a|b,e)P(j|a)P(m|a)$$"
388+
]
389+
},
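The intermediate step behind "the sum and product rules" is worth spelling out. The chain rule of the Bayesian network writes the joint as a product of CPT entries, and the hidden variables are summed out:

$$P(b \mid j, m) = \alpha \sum_{e}\sum_{a} P(b, j, m, e, a) = \alpha \sum_{e}\sum_{a} P(b)\,P(e)\,P(a \mid b, e)\,P(j \mid a)\,P(m \mid a)$$

Since $P(b)$ is constant with respect to both summations and $P(e)$ is constant with respect to the inner one, they can be moved outside the sums, which gives the nested form above and avoids recomputing those factors inside the sums.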
390+
{
391+
"cell_type": "markdown",
392+
"metadata": {},
393+
"source": [
394+
"Now, the above calculation can be represented in the form of a calculation tree shown below. The order of calculation brings with itself an intuition of a depth first tree. In fact, the enumeration ask algorithm employs a depth first approach to solve the inference problem."
395+
]
396+
},
397+
{
398+
"cell_type": "code",
399+
"execution_count": 6,
400+
"metadata": {},
401+
"outputs": [
402+
{
403+
"data": {
404+
"text/markdown": [
405+
"### AIMA3e\n",
406+
"__function__ ENUMERATION-ASK(_X_, __e__, _bn_) __returns__ a distribution over _X_ \n",
407+
" __inputs__: _X_, the query variable \n",
408+
"     __e__, observed values for variables __E__ \n",
409+
"     _bn_, a Bayes net with variables \\{_X_\\} ⋃ __E__ ⋃ __Y__ /\\* __Y__ = hidden variables \\*/ \n",
410+
"\n",
411+
" __Q__(_X_) ← a distribution over _X_, initially empty \n",
412+
"&emsp;__for each__ value _x<sub>i</sub>_ of _X_ __do__ \n",
413+
"&emsp;&emsp;&emsp;__Q__(_x<sub>i</sub>_) &larr; ENUMERATE\\-ALL(_bn_.VARS, __e__<sub>_x_<sub>_i_</sub></sub>) \n",
414+
"&emsp;&emsp;&emsp;&emsp;&emsp;where __e__<sub>_x_<sub>_i_</sub></sub> is __e__ extended with _X_ = _x<sub>i</sub>_ \n",
415+
"&emsp;__return__ NORMALIZE(__Q__(_X_)) \n",
416+
"\n",
417+
"---\n",
418+
"__function__ ENUMERATE\\-ALL(_vars_, __e__) __returns__ a real number \n",
419+
"&emsp;__if__ EMPTY?(_vars_) __then return__ 1.0 \n",
420+
"&emsp;_Y_ &larr; FIRST(_vars_) \n",
421+
"&emsp;__if__ _Y_ has value _y_ in __e__ \n",
422+
"&emsp;&emsp;&emsp;__then return__ _P_(_y_ &vert; _parents_(_Y_)) &times; ENUMERATE\\-ALL(REST(_vars_), __e__) \n",
423+
"&emsp;&emsp;&emsp;__else return__ &sum;<sub>_y_</sub> _P_(_y_ &vert; _parents_(_Y_)) &times; ENUMERATE\\-ALL(REST(_vars_), __e__<sub>_y_</sub>) \n",
424+
"&emsp;&emsp;&emsp;&emsp;&emsp;where __e__<sub>_y_</sub> is __e__ extended with _Y_ = _y_ \n",
425+
"\n",
426+
"---\n",
427+
"__Figure__ ?? The enumeration algorithm for answering queries on Bayesian networks."
428+
],
429+
"text/plain": [
430+
"<IPython.core.display.Markdown object>"
431+
]
432+
},
433+
"execution_count": 2,
434+
"metadata": {},
435+
"output_type": "execute_result"
436+
}
437+
],
438+
"source": [
439+
"%%python\n",
440+
"from notebookUtils import *\n",
441+
"pseudocode('Enumeration Ask')"
442+
]
443+
},
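To make the pseudocode concrete, here is a minimal, self-contained Java sketch of ENUMERATION-ASK and ENUMERATE-ALL for the Toothache-Cavity-Catch network. It is an illustrative toy, not the aima-core implementation: the network, its topological variable ordering and its CPT entries (P(cavity) = 0.2, P(toothache|cavity) = 0.6, P(toothache|¬cavity) = 0.1, P(catch|cavity) = 0.9, P(catch|¬cavity) = 0.2, consistent with the distributions printed below) are hard-coded.

```java
import java.util.*;

public class EnumerationSketch {
    // Variables of the toy network in topological order (parents before children).
    static final List<String> VARS = Arrays.asList("Cavity", "Toothache", "Catch");

    // P(var = value | parents(var)); the only parent in this network is Cavity.
    static double prob(String var, boolean value, Map<String, Boolean> e) {
        double pTrue;
        switch (var) {
            case "Cavity":    pTrue = 0.2; break;
            case "Toothache": pTrue = e.get("Cavity") ? 0.6 : 0.1; break;
            case "Catch":     pTrue = e.get("Cavity") ? 0.9 : 0.2; break;
            default: throw new IllegalArgumentException("Unknown variable " + var);
        }
        return value ? pTrue : 1.0 - pTrue;
    }

    // ENUMERATE-ALL(vars, e): depth-first recursion over the remaining variables.
    static double enumerateAll(List<String> vars, Map<String, Boolean> e) {
        if (vars.isEmpty()) return 1.0;
        String y = vars.get(0);
        List<String> rest = vars.subList(1, vars.size());
        if (e.containsKey(y)) {          // Y has a value in e: multiply and recurse
            return prob(y, e.get(y), e) * enumerateAll(rest, e);
        }
        double sum = 0.0;                // otherwise sum over both values of Y
        for (boolean v : new boolean[] { true, false }) {
            Map<String, Boolean> ey = new HashMap<>(e);
            ey.put(y, v);                // e extended with Y = y
            sum += prob(y, v, ey) * enumerateAll(rest, ey);
        }
        return sum;
    }

    // ENUMERATION-ASK(X, e): unnormalized scores for X = true/false, then normalize.
    static double[] enumerationAsk(String X, Map<String, Boolean> e) {
        double[] q = new double[2];
        boolean[] values = { true, false };
        for (int i = 0; i < values.length; i++) {
            Map<String, Boolean> ex = new HashMap<>(e);
            ex.put(X, values[i]);        // e extended with X = x_i
            q[i] = enumerateAll(VARS, ex);
        }
        double norm = q[0] + q[1];       // NORMALIZE(Q(X))
        return new double[] { q[0] / norm, q[1] / norm };
    }

    public static void main(String[] args) {
        Map<String, Boolean> evidence = new HashMap<>();
        evidence.put("Toothache", true);
        double[] dist = enumerationAsk("Cavity", evidence);
        // Expected <0.6, 0.4>, matching the hand calculation and the model output below.
        System.out.printf("P(Cavity | toothache) = <%.4f, %.4f>%n", dist[0], dist[1]);
    }
}
```

The same depth-first structure, generalized to arbitrary finite networks and driven by each network's own variable ordering, is what the `EnumerationAsk` class described below implements.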
444+
{
445+
"cell_type": "markdown",
446+
"metadata": {},
447+
"source": [
448+
"The above algorithm calculates the desired conditional distribution. It is implemented in the [`EnumerationAsk`](/aima-core/src/main/java/aima/core/probability/bayes/exact/EnumerationAsk.java) class in the repository. The algorithm takes as input a query variable, a few evidence variables and a Bayesian Network. However, to use the algorithm we will not directly call the algorithm. Instead, we will pass the `EnumerationAsk` as a parameter in the form of an Inference Procedure to our `BayesNetModel`. The cell below shows the steps."
449+
]
450+
},
451+
{
452+
"cell_type": "code",
453+
"execution_count": 20,
454+
"metadata": {},
455+
"outputs": [
456+
{
457+
"name": "stdout",
458+
"output_type": "stream",
459+
"text": [
460+
"The prior distribution for toothache is <0.2, 0.8>\n",
461+
"The prior distribution for cavity is <0.2, 0.8>\n",
462+
"The prior distribution for catch is <0.34, 0.66>\n",
463+
"The posterior distribution for toothache given cavity is \n",
464+
" \t <0.6, 0.10000000000000002, 0.4000000000000001, 0.9>\n",
465+
"The posterior distribution for catch given cavity is \n",
466+
" \t <0.9, 0.19999999999999998, 0.09999999999999999, 0.7999999999999999>\n",
467+
"The prior probability of having a cavity is 0.2\n",
468+
"The posterior probability of having a cavity given a toothache is 0.6\n",
469+
"The prior probability of having a cavity or a toothache is 0.28\n",
470+
"The posterior probability of not having a cavity given a toothache is 0.4000000000000001\n",
471+
"The prior probability of not having a cavity but having a toothache is 0.08000000000000002\n",
472+
"The prior probability of having a cavity or a toothache is 0.28\n",
473+
"The posterior probability of having a cavity given that the patient has a cavity or a toothache is 0.7142857142857143\n"
474+
]
475+
},
476+
{
477+
"data": {
478+
"text/plain": [
479+
"null"
480+
]
481+
},
482+
"execution_count": 20,
483+
"metadata": {},
484+
"output_type": "execute_result"
485+
}
486+
],
487+
"source": [
488+
"package aima.notebooks.probabilisticreasoning;\n",
489+
"\n",
490+
"import aima.core.probability.example.*;\n",
491+
"import aima.core.probability.bayes.*;\n",
492+
"import aima.core.probability.bayes.exact.*;\n",
493+
"import aima.core.probability.bayes.impl.*;\n",
495+
"import aima.core.probability.bayes.model.*;\n",
496+
"import aima.core.probability.proposition.*;\n",
497+
"\n",
498+
"// Load the network from the network factory.\n",
499+
"BayesianNetwork cavityNet = BayesNetExampleFactory.constructToothacheCavityCatchNetwork();\n",
500+
"// Construct the BayesModel from the BayesNet\n",
501+
"// We will pass EnumerationAsk as the new inference procedure\n",
502+
"FiniteBayesModel model = new FiniteBayesModel(cavityNet, new EnumerationAsk());\n",
503+
"\n",
504+
"// Now we will fully exhaust this model to extract as much information as we can\n",
505+
"\n",
506+
"// First let us define some assignment propositions\n",
507+
"AssignmentProposition atoothache = new AssignmentProposition(\n",
508+
"\t\t\t\tExampleRV.TOOTHACHE_RV, true);\n",
509+
"\t\tAssignmentProposition anottoothache = new AssignmentProposition(\n",
510+
"\t\t\t\tExampleRV.TOOTHACHE_RV, false);\n",
511+
"\t\tAssignmentProposition acavity = new AssignmentProposition(\n",
512+
"\t\t\t\tExampleRV.CAVITY_RV, true);\n",
513+
"\t\tAssignmentProposition anotcavity = new AssignmentProposition(\n",
514+
"\t\t\t\tExampleRV.CAVITY_RV, false);\n",
515+
"\t\tAssignmentProposition acatch = new AssignmentProposition(\n",
516+
"\t\t\t\tExampleRV.CATCH_RV, true);\n",
517+
"\t\tAssignmentProposition anotcatch = new AssignmentProposition(\n",
518+
"\t\t\t\tExampleRV.CATCH_RV, false);\n",
519+
"\n",
520+
"// Now let us define some propositions which are conjunctions and/or disjunctions of the above propositions\n",
521+
"ConjunctiveProposition toothacheAndNotCavity = new ConjunctiveProposition(\n",
522+
"\t\t\t\tatoothache, anotcavity);\n",
523+
"DisjunctiveProposition cavityOrToothache = new DisjunctiveProposition(\n",
524+
"\t\t\t\tacavity, atoothache);\n",
525+
"\n",
526+
"// First let us calculate the prior probabilities of our random variables\n",
527+
"// The probabilities in the distribution are returned in the order <True, False>\n",
528+
"System.out.println(\"The prior distribution for toothache is \"+ model.priorDistribution(ExampleRV.TOOTHACHE_RV));\n",
529+
"System.out.println(\"The prior distribution for cavity is \"+ model.priorDistribution(ExampleRV.CAVITY_RV));\n",
530+
"System.out.println(\"The prior distribution for catch is \"+ model.priorDistribution(ExampleRV.CATCH_RV));\n",
531+
"// Now let us calculate the posterior distribution is\n",
532+
"// Posterior distribution first exhausts all the possibilities of the evidence variables\n",
533+
"System.out.println(\"The posterior distribution for toothache given cavity is \\n \\t \"+ model.posteriorDistribution(ExampleRV.TOOTHACHE_RV,\n",
534+
" ExampleRV.CAVITY_RV).toString());\n",
535+
"\n",
536+
"System.out.println(\"The posterior distribution for catch given cavity is \\n \\t \"+ model.posteriorDistribution(ExampleRV.CATCH_RV,\n",
537+
" ExampleRV.CAVITY_RV).toString());\n",
538+
"\n",
539+
"// Now let us have a look at some individual probabilities\n",
540+
"System.out.println(\"The prior probability of having a cavity is \"+model.prior(acavity));\n",
541+
"System.out.println(\"The posterior probability of having a cavity given a toothache is \"+ model.posterior(acavity, atoothache));\n",
542+
"System.out.println(\"The prior probability of having a cavity or a toothache is \"+model.prior(cavityOrToothache));\n",
543+
"System.out.println(\"The posterior probability of not having a cavity given a toothache is \"+model.posterior(anotcavity, atoothache));\n",
544+
"System.out.println(\"The prior probability of not having a cavity but having a toothache is \"+model.prior(toothacheAndNotCavity));\n",
545+
"System.out.println(\"The prior probability of having a cavity or a toothache is \"+model.prior(cavityOrToothache));\n",
546+
"System.out.println(\"The posterior probability of having a cavity given that the patient has a cavity or a toothache is \"+\n",
547+
" model.posterior(acavity,cavityOrToothache));\n"
548+
]
549+
},
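The same model-based pattern could be applied to the burglary query P(b | j, m) derived earlier. The sketch below assumes that `BayesNetExampleFactory` also provides a `constructBurglaryAlarmNetwork()` method and that `ExampleRV` defines `BURGLARY_RV`, `JOHN_CALLS_RV` and `MARY_CALLS_RV`; if those names differ in your version of aima-core, adjust accordingly.

```java
package aima.notebooks.probabilisticreasoning;

import aima.core.probability.bayes.BayesianNetwork;
import aima.core.probability.bayes.exact.EnumerationAsk;
import aima.core.probability.bayes.model.FiniteBayesModel;
import aima.core.probability.example.BayesNetExampleFactory;
import aima.core.probability.example.ExampleRV;
import aima.core.probability.proposition.AssignmentProposition;

// Assumed factory method for the burglary network from AIMA.
BayesianNetwork burglaryNet = BayesNetExampleFactory.constructBurglaryAlarmNetwork();
// Again use EnumerationAsk as the inference procedure behind the model.
FiniteBayesModel burglaryModel = new FiniteBayesModel(burglaryNet, new EnumerationAsk());

// Evidence: both John and Mary call.
AssignmentProposition ajohncalls = new AssignmentProposition(ExampleRV.JOHN_CALLS_RV, true);
AssignmentProposition amarycalls = new AssignmentProposition(ExampleRV.MARY_CALLS_RV, true);

// P(Burglary | john, mary); AIMA works this query out to approximately <0.284, 0.716>.
System.out.println("P(Burglary | john, mary) = "
        + burglaryModel.posteriorDistribution(ExampleRV.BURGLARY_RV, ajohncalls, amarycalls));
```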
550+
{
551+
"cell_type": "markdown",
552+
"metadata": {},
553+
"source": [
554+
"There are a large number of inferences that can be derived from a probability model. For the sake of conciseness, we will focus only on a few prior and posterior distributions in the upcoming examples."
555+
]
556+
},
362557
{
363558
"cell_type": "code",
364559
"execution_count": null,
