
Commit b24966f

committed
Adds Bayes Model
1 parent 1e83927 commit b24966f

File tree

1 file changed: +88 −1 lines changed


notebooks/ProbabilisticReasoning.ipynb

Lines changed: 88 additions & 1 deletion
@@ -268,8 +268,95 @@
 "\n",
 "Then we repeat the process, reducing each conjunctive probability to a conditional probability\n",
 "and a smaller conjunction. We end up with one big product:\n",
+"$$P(x_1, \\ldots, x_n) = P(x_n | x_{n-1}, \\ldots, x_1)P(x_{n-1} | x_{n-2}, \\ldots, x_1) \\cdots P(x_2 | x_1)P(x_1)$$\n",
+"$$= \\prod_{i=1}^{n} P(x_i | x_{i-1}, \\ldots, x_1)$$"
+]
+},
275+
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"This identity is called the chain rule. It holds for any set of random variables. We can conclude that the specification of the joint distribution is equivalent to the\n",
+"general assertion that, for every variable $X_i$ in the network,\n",
+"\n",
+"$$\\textbf{P}(X_i | X_{i-1}, \\ldots, X_1) = \\textbf{P}(X_i | Parents(X_i))$$"
+]
+},
285+
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"provided that $Parents(X_i) \\subseteq \\{X_{i-1}, \\ldots, X_1\\}$. This last condition is satisfied by numbering\n",
+"the nodes in a way that is consistent with the partial order implicit in the graph structure."
+]
+},
293+
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Since a Bayesian network can represent the full joint distribution of a particular system, we can treat Bayesian networks as probability models and use them for inference in the same way as we used the `FullJointProbabilityModel`. Such a model can be created using the `FiniteBayesModel` class from the repository. This class asks for a Bayesian network and an inference procedure. If no inference procedure is specified, the `EnumerationAsk` algorithm is used as the default. From now on, I will be using the `BayesNetExampleFactory` to create the ToothacheCavityCatch example. Let's have a look at how we can use the `FiniteBayesModel` to perform inference with uncertain knowledge."
+]
+},
300+
+{
+"cell_type": "code",
+"execution_count": 18,
+"metadata": {},
+"outputs": [
+{
+"name": "stdout",
+"output_type": "stream",
+"text": [
+"The random variables in the model = [Cavity, Toothache, Catch]\n",
+"The prior probability of having a toothache = 0.2\n",
+"The prior probability of having a cavity = 0.2\n",
+"The probability of having a cavity and toothache simultaneously is = 0.12\n",
+"The probability of having a toothache given that the person has a cavity(causal direction) is = 0.6\n",
+"The probability of having a cavity given that the person is experiencing toothache(diagnostic direction) is = 0.6\n"
+]
+},
+{
+"data": {
+"text/plain": [
+"null"
+]
+},
+"execution_count": 18,
+"metadata": {},
+"output_type": "execute_result"
+}
+],
328+
+"source": [
+"package aima.notebooks.probabilisticreasoning;\n",
+"\n",
+"import aima.core.probability.example.*;\n",
+"import aima.core.probability.bayes.*;\n",
+"import aima.core.probability.bayes.impl.*;\n",
+"import aima.core.probability.bayes.model.*;\n",
+"import aima.core.probability.proposition.*;\n",
+"\n",
+"// Load the network from the network factory.\n",
+"BayesianNetwork cavityNet = BayesNetExampleFactory.constructToothacheCavityCatchNetwork();\n",
+"// Construct the BayesModel from the BayesNet.\n",
+"// We have not passed any inference procedure, so the default will be used.\n",
+"FiniteBayesModel model = new FiniteBayesModel(cavityNet);\n",
+"// Now we are ready to answer all sorts of questions.\n",
 "\n",
-"$$P (x 1 , . . . , x n ) = P (x n | x n−1 , . . . , x 1 )P (x n−1 | x n−2 , . . . , x 1 ) · · · P (x 2 | x_1 )P (x_1 )$$"
+"// Let's define a few assignments.\n",
+"AssignmentProposition toothache = new AssignmentProposition(ExampleRV.TOOTHACHE_RV, true);\n",
+"AssignmentProposition cavity = new AssignmentProposition(ExampleRV.CAVITY_RV, true);\n",
+"// Now let's have a look at what we can do with the model.\n",
+"// Print the random variables in the model.\n",
+"System.out.println(\"The random variables in the model = \" + model.getRepresentation());\n",
+"// We can calculate the prior probabilities of a variety of combinations of random variables.\n",
+"System.out.println(\"The prior probability of having a toothache = \" + model.prior(toothache));\n",
+"System.out.println(\"The prior probability of having a cavity = \" + model.prior(cavity));\n",
+"System.out.println(\"The probability of having a cavity and toothache simultaneously is = \" + model.prior(toothache, cavity));\n",
+"// We can also calculate a variety of posterior probabilities from the model as follows.\n",
+"System.out.println(\"The probability of having a toothache given that the person has a cavity(causal direction) is = \" +\n",
+"    model.posterior(toothache, cavity));\n",
+"System.out.println(\"The probability of having a cavity given that the person is experiencing toothache(diagnostic direction) is = \"\n",
+"    + model.posterior(cavity, toothache));"
 ]
 },
 {
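The numbers in the notebook's output can be sanity-checked without aima-core. The network added here factors the joint as $P(cavity, toothache, catch) = P(cavity)\,P(toothache | cavity)\,P(catch | cavity)$ by the chain rule. The sketch below is a dependency-free check, assuming the standard AIMA CPT values for this network ($P(cavity)=0.2$, $P(toothache|cavity)=0.6$, $P(toothache|\neg cavity)=0.1$, $P(catch|cavity)=0.9$, $P(catch|\neg cavity)=0.2$); the class name `CavityCheck` is mine, not part of the repository.

```java
// Sanity check of the notebook's output, independent of aima-core.
// Chain rule + network structure:
//   P(cavity, toothache, catch) = P(cavity) P(toothache | cavity) P(catch | cavity)
// CPT values are the standard AIMA ToothacheCavityCatch numbers (assumed here).
public class CavityCheck {
    static double pCavity(boolean c) { return c ? 0.2 : 0.8; }

    static double pToothache(boolean t, boolean c) {
        double p = c ? 0.6 : 0.1;      // P(toothache = true | cavity = c)
        return t ? p : 1.0 - p;
    }

    static double pCatch(boolean k, boolean c) {
        double p = c ? 0.9 : 0.2;      // P(catch = true | cavity = c)
        return k ? p : 1.0 - p;
    }

    // One joint entry via the chain-rule factorization.
    static double joint(boolean c, boolean t, boolean k) {
        return pCavity(c) * pToothache(t, c) * pCatch(k, c);
    }

    // Marginal P(toothache = t), summing out cavity and catch.
    static double pToothacheMarginal(boolean t) {
        double sum = 0.0;
        for (boolean c : new boolean[] { true, false })
            for (boolean k : new boolean[] { true, false })
                sum += joint(c, t, k);
        return sum;
    }

    public static void main(String[] args) {
        double pT = pToothacheMarginal(true);                                // ≈ 0.2
        double pC = pCavity(true);                                           // 0.2
        double pTC = joint(true, true, true) + joint(true, true, false);     // ≈ 0.12
        System.out.println("P(toothache) = " + pT);
        System.out.println("P(cavity) = " + pC);
        System.out.println("P(toothache, cavity) = " + pTC);
        System.out.println("P(toothache | cavity) = " + (pTC / pC));  // causal, ≈ 0.6
        System.out.println("P(cavity | toothache) = " + (pTC / pT));  // diagnostic, ≈ 0.6
    }
}
```

All five quantities agree with what `FiniteBayesModel` prints above, which is a useful check that the default `EnumerationAsk` inference is just summing the factored joint.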
