Experiment No. 1: Implement Inferencing with a Bayesian Network in Python (with solution)
What is a Bayesian network? A Bayesian belief network is a key technology for dealing with probabilistic events and for solving problems that involve uncertainty. We can define a Bayesian network as: "A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."
It is also called a Bayes network, belief network, decision network, or Bayesian model. Bayesian networks are probabilistic because they are built from a probability distribution and use probability theory for prediction and anomaly detection. Real-world applications are probabilistic in nature, and to represent the relationships between multiple events we need a Bayesian network. It can be used in various tasks, including prediction, anomaly detection, diagnostics, automated insight, reasoning, time-series prediction, and decision making under uncertainty.
What is a Directed Acyclic Graph (DAG)? In graph theory and computer science, a directed acyclic graph (DAG) is a directed graph with no directed cycles. In other words, it is made up of vertices and edges (also called arcs), with each edge pointing from one vertex to another in such a way that following those directions never leads back around in a closed loop.
A directed acyclic graph is one in which all edge directions are consistent, so the vertices can be topologically arranged in a linear order. DAGs have various scientific and computing applications, including evolutionary biology, family trees, epidemiology, and sociology.
Let's quickly look at the fundamental mathematics behind Bayesian networks.
The maths behind the Bayesian network
An acyclic directed graph is used to create a Bayesian network, which is a probability model. The model is factored by using a single conditional probability distribution for each variable, with each distribution conditioned on that variable's parents in the graph. Simple principles of probability underpin Bayesian models, so let us first define conditional probability and the joint probability distribution.
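As a concrete illustration of this factorisation (not part of the original experiment), the sketch below assembles a joint distribution for a hypothetical three-variable chain A → B → C from one conditional probability table per variable; all the numbers are invented for illustration:

```python
# A minimal sketch of the factorisation a Bayesian network relies on:
# by the chain rule, P(A, B, C) = P(A) * P(B | A) * P(C | B) for the
# chain A -> B -> C, because C is conditionally independent of A given B.
# The tables below are hypothetical CPTs for binary variables (0/1).
p_a = {0: 0.7, 1: 0.3}
p_b_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}    # p_b_given_a[a][b]
p_c_given_b = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.25, 1: 0.75}}  # p_c_given_b[b][c]

# Joint distribution assembled from the per-variable factors.
joint = {(a, b, c): p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

# A valid distribution: the eight entries sum to 1 (up to float rounding).
print(sum(joint.values()))
```

Storing one small table per variable instead of the full joint is exactly the space saving a Bayesian network provides.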
Conditional probability: Conditional probability is a measure of the likelihood of an event occurring given that another event has already occurred (through assumption, supposition, statement, or evidence).
If A is the event of interest and B is known or assumed to have occurred, the conditional probability of A given B is generally written P(A|B) or, less frequently, P_B(A). It can be computed as the likelihood of B occurring together with A, relative to the likelihood of B: P(A|B) = P(A and B) / P(B). Joint probability: The chance of two (or more) events occurring together is known as the joint probability.
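The defining formula can be checked on a small sample space; the two-dice events below are our own hypothetical illustration, not part of the original text:

```python
# A minimal sketch of conditional probability, P(A|B) = P(A and B) / P(B),
# using two fair dice as a hypothetical sample space.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 rolls of two dice

# Event A: the sum is 8; event B: the first die shows an even number.
A = {o for o in outcomes if sum(o) == 8}
B = {o for o in outcomes if o[0] % 2 == 0}

p_B = len(B) / len(outcomes)              # 18/36 = 0.5
p_A_and_B = len(A & B) / len(outcomes)    # 3/36
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 3/18 = 1/6 ~ 0.1667
```

Knowing the first die is even narrows the sample space to 18 outcomes, of which 3 sum to 8, which is exactly what the ratio computes.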
The joint probability distribution of two or more random variables assigns a probability to each combination of their values.
For example, the joint probability of events A and B is expressed formally as P(A and B), where:
• The letter P denotes probability.
• The upside-down capital "U" operator (∩) or, in some situations, a comma "," represents the "and", or conjunction.
• P(A ∩ B)
• P(A, B)
For independent events, the joint probability of A and B is calculated by multiplying the probability of event A by the probability of event B.
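The product rule for independent events can be verified by counting; the coin-and-die events below are a hypothetical illustration of our own:

```python
# A small sketch of joint probability for independent events,
# P(A, B) = P(A) * P(B). Hypothetical events: A = a fair coin lands
# heads, B = a fair die shows a 6.
from itertools import product

p_A = 1 / 2
p_B = 1 / 6
p_A_and_B = p_A * p_B
print(p_A_and_B)  # 1/12 ~ 0.0833

# Cross-check by counting the 12 equally likely (coin, die) outcomes.
outcomes = list(product(['H', 'T'], range(1, 7)))
count = sum(1 for coin, die in outcomes if coin == 'H' and die == 6)
print(count / len(outcomes))  # same value: 1 outcome out of 12
```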
Posterior probability: In Bayesian statistics, the posterior probability of a random event or an uncertain proposition is its conditional probability given the relevant evidence or background. "Posterior" here means "after taking into account the evidence pertinent to the particular case under consideration." The probability distribution of an unknown quantity, interpreted as a random variable and conditioned on data from an experiment or survey, is known as the posterior probability distribution.
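A short numeric sketch of a posterior computed via Bayes' rule; all the numbers below (the prior and the test accuracies) are invented for illustration and do not come from the text:

```python
# A hedged numeric sketch of posterior probability via Bayes' rule:
#   P(H | E) = P(E | H) * P(H) / P(E)
# Hypothetical scenario: H = "has the disease", E = "test is positive".
p_disease = 0.01            # prior P(H), an assumed base rate
p_pos_given_disease = 0.95  # likelihood P(E | H), assumed sensitivity
p_pos_given_healthy = 0.05  # assumed false-positive rate P(E | not H)

# Total probability of a positive test, P(E), by summing over both cases.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = p_pos_given_disease * p_disease / p_pos
print(round(posterior, 4))  # 0.161 -- the 0.01 prior rises after the evidence
```

The prior is updated by the evidence: a positive test raises the probability from 1% to about 16%, which is the "after taking the evidence into account" reading of "posterior".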
Inferencing with Bayesian Network in Python
In this demonstration, we'll use a Bayesian network to solve the well-known Monty Hall problem. For those of you who are unfamiliar with it: a contestant in a game show must choose one of three doors, one of which conceals a prize. After the contestant has chosen, the show's host (Monty) opens an empty door and asks the contestant whether he wants to switch to the other closed door. The decision is whether to keep the current door or swap it for the remaining one. Switching is preferable, because the other door is more likely to hide the prize. To resolve this ambiguity, let's model the problem with a Bayesian network.
For this demonstration, we are using a Python-based package, pgmpy. pgmpy is a Bayesian network library written entirely in Python with a focus on modularity and flexibility. Structure learning, parameter estimation, approximate (sampling-based) and exact inference, and causal inference are all available as implementations.
Alarm Problem
A person has installed a new alarm system that can be triggered by a burglary or an earthquake. This person also has two neighbors (John and Mary) who are asked to call if they hear the alarm. This problem is modeled as a Bayesian network with probabilities attached to each edge: Alarm has Burglary and Earthquake as parents, JohnCalls has Alarm as its parent, and MaryCalls has Alarm as its parent. We can then ask the network: what is the probability of a burglary if both John and Mary call?
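This query can be answered with inference by enumeration. A minimal pure-Python sketch follows; the CPT numbers are the illustrative values commonly used for this classic textbook example, since the text above does not specify them:

```python
# The alarm network, with inference by enumeration. All CPT values are
# the standard illustrative numbers for this textbook example.
from itertools import product

P_B = {True: 0.001, False: 0.999}                  # P(Burglary)
P_E = {True: 0.002, False: 0.998}                  # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,    # P(Alarm=T | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                    # P(JohnCalls=T | Alarm)
P_M = {True: 0.70, False: 0.01}                    # P(MaryCalls=T | Alarm)

def joint(b, e, a, j, m):
    """Chain-rule factorisation of the full joint distribution."""
    p = P_B[b] * P_E[e]
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# P(Burglary | JohnCalls=T, MaryCalls=T): sum out the hidden variables
# (Earthquake, Alarm), then normalise over the query variable.
unnorm = {b: sum(joint(b, e, a, True, True)
                 for e, a in product([True, False], repeat=2))
          for b in (True, False)}
p_burglary = unnorm[True] / (unnorm[True] + unnorm[False])
print(round(p_burglary, 3))  # 0.284
```

Even with both neighbors calling, the posterior probability of a burglary is only about 28%, because the burglary prior is so low and earthquakes (or false calls) also explain the evidence.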
Conclusion: Bayesian networks (BNs) are a widespread tool for modeling uncertainty and reasoning about it. A BN represents conditional independence relations between random variables. It consists of a graph encoding the variable dependencies and of conditional probability tables (CPTs). Given a variable order, the BN is small if every variable depends on only a few of its predecessors. Probabilistic inference requires computing the probability distribution of a set of query variables, given a set of evidence variables whose values we know. The remaining variables are hidden. Inference by enumeration takes a BN as input, then applies normalization and marginalization, the chain rule, and conditional independence. This can be viewed as a tree search that branches over all values of the hidden variables. Variable elimination avoids unnecessary recomputation. Thus, we have successfully implemented inferencing with a Bayesian network in Python.