
This pyAgrum notebook is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Approximate inference in aGrUM (pyAgrum)

There are several approximate inference algorithms for Bayesian networks in aGrUM (pyAgrum). They share the same API as exact inference (a minimal sketch of this shared API follows the list below).

  • Loopy Belief Propagation: LBP is an approximate inference scheme that applies the exact message-passing algorithm designed for trees even when the BN is not a tree. LBP is a peculiar kind of approximation: the algorithm may not converge and, even when it does, it may converge to something other than the exact posterior. However, LBP is fast and usually gives reasonably good results.
  • Sampling inference: these algorithms use sampling to compute the posterior. Sampling may be (very) slow, but these algorithms converge to the exact distribution. aGrUM implements:
    • Monte Carlo sampling,
    • Weighted sampling,
    • Importance sampling,
    • Gibbs sampling.
  • Finally, aGrUM proposes the so-called 'loopy versions' of the sampling algorithms: the idea is to use LBP as a Dirichlet prior for the sampling algorithm. A loopy version of each sampling algorithm is provided.
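As a quick illustration of this shared API, here is a minimal sketch. Only LazyPropagation and GibbsSampling appear explicitly in this notebook; the other class names (LoopyBeliefPropagation, ImportanceSampling, LoopyGibbsSampling) are assumptions about the pyAgrum 0.16 API and may differ in your version.

import os
import pyAgrum as gum

bn = gum.loadBN(os.path.join("res", "alarm.dsl"))

# Every engine is built from the BN and queried in exactly the same way.
# Class names other than LazyPropagation and GibbsSampling are assumptions
# about the pyAgrum API, not taken from this notebook.
engines = [gum.LazyPropagation(bn),         # exact inference, used as reference
           gum.LoopyBeliefPropagation(bn),  # LBP
           gum.ImportanceSampling(bn),      # one of the sampling algorithms
           gum.LoopyGibbsSampling(bn)]      # 'loopy' Gibbs: LBP used as a prior

for engine in engines:
    engine.makeInference()
    print(type(engine).__name__, engine.posterior("KINKEDTUBE"))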
In [1]:
import os

%matplotlib inline
from pylab import *
import matplotlib.pyplot as plt

def unsharpen(bn):
  """
  Force the parameters of the BN to be a bit farther from 0 and 1 (less extreme)
  """
  for nod in bn.nodes():
    bn.cpt(nod).translate(bn.maxParam() / 10).normalizeAsCPT()

def compareInference(ie,ie2,ax=None):
    """
    Compare two inference engines by plotting, for every node, the points (posterior(ie), posterior(ie2))
    """
    exact=[]
    appro=[]
    errmax=0
    for node in bn.nodes():
        # potentials as list
        exact+=ie.posterior(node).tolist()
        appro+=ie2.posterior(node).tolist()
        errmax=max(errmax,
                   (ie.posterior(node)-ie2.posterior(node)).abs().max())
    
    if errmax<1e-10: errmax=0
    if ax is None:
        ax=plt.gca() # default axis for plt
           
    ax.plot(exact,appro,'ro')
    ax.set_title("{} vs {}\n {}\nMax error {:2.4} in {:2.4} seconds".format(
        str(type(ie)).split(".")[2].split("_")[0][0:-2], # name of first inference
        str(type(ie2)).split(".")[2].split("_")[0][0:-2], # name of second inference
        ie2.messageApproximationScheme(),
        errmax,
        ie2.currentTime())
                )
In [2]:
import pyAgrum as gum
import pyAgrum.lib.notebook as gnb
bn=gum.loadBN(os.path.join("res","alarm.dsl"))
unsharpen(bn)

ie=gum.LazyPropagation(bn)
ie.makeInference()
In [3]:
gnb.showBN(bn,size='8')
[Figure: the DAG of the 'alarm' Bayesian network]

First, an exact inference.

In [4]:
gnb.showInference(bn,size="18") # using LazyPropagation by default
print(ie.posterior("KINKEDTUBE"))
[Figure: posterior distributions displayed on the 'alarm' BN (LazyPropagation, inference in 4.49ms)]
<KINKEDTUBE:TRUE> :: 0.116667 /<KINKEDTUBE:FALSE> :: 0.883333

Gibbs Inference

Gibbs inference with default parameters

Gibbs inference iterations can be stopped (the corresponding setters are sketched just after this list):

  • by the value of error (epsilon)
  • by the rate of change of epsilon (MinEpsilonRate)
  • by the number of iterations (MaxIteration)
  • by the duration of the algorithm (MaxTime)
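As a rough sketch, these criteria correspond to the setters of the underlying approximation scheme. Only setEpsilon is used in this notebook; setMinEpsilonRate, setMaxIter and setMaxTime are assumed from the pyAgrum ApproximationScheme API (check help(gum.GibbsSampling) for your version).

g = gum.GibbsSampling(bn)
# Stopping criteria of the approximation scheme (setter names other than
# setEpsilon are assumptions about the pyAgrum API):
g.setEpsilon(1e-2)         # stop when the estimated error drops below epsilon
g.setMinEpsilonRate(1e-5)  # ... or when epsilon decreases too slowly
g.setMaxIter(1000000)      # ... or after a maximum number of iterations
g.setMaxTime(10)           # ... or after a maximum duration (in seconds)
g.makeInference()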
In [5]:
ie2=gum.GibbsSampling(bn)
ie2.setEpsilon(1e-2)
gnb.showInference(bn,engine=ie2,size="18")
print(ie2.posterior("KINKEDTUBE"))
print(ie2.messageApproximationScheme())
compareInference(ie,ie2)
[Figure: posterior distributions displayed on the 'alarm' BN (GibbsSampling, inference in 2105.35ms)]