pyAgrum on notebooks

# Potentials¶

In pyAgrum, a Potential is a multi-dimensional array with a (discrete) random variable attached to each dimension. This mathematical object supports tensorial operators w.r.t. the attached variables.
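
As a rough analogy (not pyAgrum's actual implementation), multiplying two potentials amounts to an element-wise product over the union of their variables, with each table broadcast along the axes it does not carry. A plain NumPy sketch, with made-up 2x2 tables standing in for potentials over $(a,b)$ and $(b,c)$:

```python
import numpy as np

# Hypothetical tables: p_ab over variables (a, b), p_bc over (b, c).
p_ab = np.array([[0.1, 0.2],
                 [0.3, 0.4]])          # axes: (a, b)
p_bc = np.array([[0.5, 0.5],
                 [0.2, 0.8]])          # axes: (b, c)

# Product over the union of variables (a, b, c): align the shared
# axis b and broadcast the others, as a tensorial operator would.
p_abc = p_ab[:, :, np.newaxis] * p_bc[np.newaxis, :, :]
print(p_abc.shape)  # (2, 2, 2)
```

pyAgrum performs this axis alignment automatically, from the variables attached to each Potential.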

In [ ]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb

a,b,c=[gum.LabelizedVariable(s,s,2) for s in "abc"]


# potential algebra¶

In [ ]:
p1=gum.Potential().add(a).add(b).fillWith([1,2,3,4]).normalize()
p2=gum.Potential().add(b).add(c).fillWith([4,5,2,3]).normalize()

In [ ]:
gnb.sideBySide(p1,p2,p1+p2,
captions=['p1','p2','p1+p2'])

In [ ]:
p3=p1+p2
gnb.showPotential(p3/p3.margSumOut(["b"]))

In [ ]:
p4=gum.Potential()+p3
gnb.sideBySide(p3,p4,
captions=['p3','p4'])


# Bayes formula¶

In [ ]:
bn=gum.fastBN("a->c;b->c",3)
bn


In such a small Bayesian network, we can directly manipulate $P(a,b,c)$. For instance: $$P(b|c)=\frac{\sum_{a} P(a,b,c)}{\sum_{a,b} P(a,b,c)}$$

In [ ]:
pABC=bn.cpt("a")*bn.cpt("b")*bn.cpt("c")
pBgivenC=(pABC.margSumOut(["a"])/pABC.margSumOut(["a","b"]))

pBgivenC.putFirst("b") # in order to have b horizontally in the table
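
The same formula can be checked independently of pyAgrum with a plain NumPy sketch on an arbitrary (made-up) joint table: for each value of $c$, the resulting column of $P(b|c)$ sums to 1.

```python
import numpy as np

# A made-up joint P(a, b, c), axes (a, b, c), normalized to sum to 1.
rng = np.random.default_rng(0)
p_abc = rng.random((3, 3, 3))
p_abc /= p_abc.sum()

p_bc = p_abc.sum(axis=0)           # sum over a     -> P(b, c)
p_c = p_abc.sum(axis=(0, 1))       # sum over a, b  -> P(c)
p_b_given_c = p_bc / p_c           # P(b | c), broadcasting over b

# For every value of c, the distribution over b sums to 1.
print(p_b_given_c.sum(axis=0))     # -> [1. 1. 1.]
```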


# Joint, marginal probability, likelihood¶

Let's compute the joint probability $P(A,C)$ from $P(A,B,C)$ by summing out $b$.

In [ ]:
pAC=pABC.margSumOut(["b"])
print("pAC really is a probability: it sums to {}".format(pAC.sum()))
pAC


## Computing $P(A)$¶

In [ ]:
pAC.margSumOut(["c"])


## Computing $P(A|C=1)$¶

It is easy to compute $P(A, C=1)$

In [ ]:
pAC.extract({"c":1})


Moreover, we know that $P(C=1)=\sum_A P(A,C=1)$

In [ ]:
pAC.extract({"c":1}).sum()


Now we can compute $P(A|C=1)=\frac{P(A,C=1)}{P(C=1)}$

In [ ]:
pAC.extract({"c":1}).normalize()
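
In plain NumPy terms (on a made-up $P(A,C)$ table), normalizing the extracted slice is exactly division by its sum, i.e. division by $P(C=1)$:

```python
import numpy as np

# Made-up joint P(A, C), axes (a, c).
p_ac = np.array([[0.10, 0.25],
                 [0.30, 0.35]])

p_a_c1 = p_ac[:, 1]                  # slice P(A, C=1)
p_c1 = p_a_c1.sum()                  # P(C=1)
p_a_given_c1 = p_a_c1 / p_c1         # P(A | C=1), a proper distribution

print(p_a_given_c1, p_a_given_c1.sum())   # sums to 1
```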


## Computing $P(A|C)$¶

$P(A|C)$ is represented by a matrix that verifies $P(A|C)=\frac{P(A,C)}{P(C)}$

In [ ]:
pAgivenC=(pAC/pAC.margSumIn("c")).putFirst("a")
# putFirst("a") : to display a cpt correctly, the first variable has to be the conditioned one
gnb.sideBySide(pAgivenC,pAgivenC.extract({'c':1}),
captions=["$P(A|C)$","$P(A|C=1)$"])


## Likelihood $P(A=2|C)$¶

A likelihood can also be found in this matrix.

In [ ]:
pAgivenC.extract({'a':2})


A likelihood does not have to sum to 1, so normalizing it is generally meaningless.

In [ ]:
pAgivenC.margSumIn(["a"])
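
A quick NumPy sketch of the same point, on a made-up conditional table $P(A|C)$: each column (a distribution over $A$) sums to 1, but a likelihood row $P(A=2|C)$, read as a function of $C$, need not.

```python
import numpy as np

# Made-up P(A | C): rows indexed by a, columns by c;
# each column is a distribution over a.
p_a_given_c = np.array([[0.25, 0.50, 0.125],
                        [0.25, 0.25, 0.125],
                        [0.50, 0.25, 0.750]])

print(p_a_given_c.sum(axis=0))   # columns each sum to 1
likelihood = p_a_given_c[2, :]   # P(A=2 | C), the likelihood of C
print(likelihood.sum())          # 1.5 -- not a distribution over C
```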


# entropy of potential¶

In [ ]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np

In [ ]:
p1=gum.Potential().add(a)
x = np.linspace(0, 1, 100)
plt.plot(x,[p1.fillWith([p,1-p]).entropy() for p in x])
plt.show()
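
As a sanity check on the curve above, the binary entropy $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is maximal at $p = 0.5$, where it equals 1 bit (assuming base-2 logarithms, which the peak at 1 in the plot suggests pyAgrum's `entropy()` uses):

```python
import numpy as np

def binary_entropy(p):
    """H(p) in bits, clipping p to avoid 0 * log(0)."""
    q = np.clip(p, 1e-12, 1 - 1e-12)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

xs = np.linspace(0, 1, 101)
hs = binary_entropy(xs)
print(xs[np.argmax(hs)], hs.max())   # maximum of 1 bit at p = 0.5
```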

In [ ]:
t=gum.LabelizedVariable('t','t',3)
p1=gum.Potential().add(t)

def entrop(bc):
    """
    bc is a list [a,b,c] close to a distribution
    (normalized just to be sure)
    """
    return p1.fillWith(bc).normalize().entropy()

import matplotlib.tri as tri

corners = np.array([[0, 0], [1, 0], [0.5, 0.75**0.5]])
triangle = tri.Triangulation(corners[:, 0], corners[:, 1])

# Mid-points of triangle sides opposite of each corner
midpoints = [(corners[(i + 1) % 3] + corners[(i + 2) % 3]) / 2.0
             for i in range(3)]

def xy2bc(xy, tol=1.e-3):
    """
    From 2D Cartesian coordinates to barycentric.
    """
    s = [(corners[i] - midpoints[i]).dot(xy - midpoints[i]) / 0.75
         for i in range(3)]
    return np.clip(s, tol, 1.0 - tol)

def draw_entropy(nlevels=200, subdiv=6, **kwargs):
    refiner = tri.UniformTriRefiner(triangle)
    trimesh = refiner.refine_triangulation(subdiv=subdiv)
    pvals = [entrop(xy2bc(xy)) for xy in zip(trimesh.x, trimesh.y)]

    plt.tricontourf(trimesh, pvals, nlevels, **kwargs)
    plt.axis('equal')
    plt.xlim(0, 1)
    plt.ylim(0, 0.75**0.5)
    plt.axis('off')

draw_entropy()
