## Efficient approximation of probability distributions with k-order decomposable models

##### Date

2016-01-01

##### Abstract

During the last few decades, several learning algorithms have been proposed to learn probability distributions based on decomposable models. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, $k$. Unfortunately, the problem of learning a maximum likelihood decomposable model with a given maximum clique size is NP-hard for $k > 2$. In this work, we propose the fractal tree family of algorithms, which approximates this problem with a worst-case computational complexity of $\mathcal{O}(k \cdot n^2 \log n)$, where $n$ is the number of random variables involved and $N$ is the size of the training set. The fractal tree algorithms construct a sequence of maximal $i$-order decomposable graphs, for $i = 2, \ldots, k$, in $k - 1$ steps. At each step, the algorithms follow a divide-and-conquer strategy that decomposes the problem into a set of independent subproblems, each of which is solved efficiently using the generalized Chow-Liu algorithm. Fractal trees can be considered a natural extension of the Chow-Liu algorithm from $k = 2$ to arbitrary values of $k$, and they have shown competitive performance on the maximum likelihood problem. Due to this competitive performance, their low computational complexity, and their modularity, which permits different parallelization strategies, the proposed procedures are especially well suited for modeling high-dimensional domains.
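The abstract presents the fractal tree algorithms as an extension of the Chow-Liu algorithm, which solves the $k = 2$ case exactly by finding a maximum-weight spanning tree over empirical pairwise mutual information. As context for that base case, here is a minimal sketch of the classical Chow-Liu procedure on discrete data; the function names and the Kruskal-style union-find implementation are illustrative choices, not taken from the paper:

```python
import math
from collections import Counter
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical mutual information (in nats) between columns i and j of discrete data."""
    n = len(data)
    cij = Counter((row[i], row[j]) for row in data)
    ci = Counter(row[i] for row in data)
    cj = Counter(row[j] for row in data)
    mi = 0.0
    for (a, b), c in cij.items():
        # p(a,b) * log( p(a,b) / (p(a) p(b)) ), rewritten in terms of counts
        mi += (c / n) * math.log(c * n / (ci[a] * cj[b]))
    return mi

def chow_liu_tree(data, n_vars):
    """Maximum-weight spanning tree over mutual-information edge weights (Kruskal)."""
    edges = sorted(
        ((mutual_information(data, i, j), i, j)
         for i, j in combinations(range(n_vars), 2)),
        reverse=True,
    )
    parent = list(range(n_vars))  # union-find forest for cycle detection

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # adding (i, j) does not create a cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy example: X0 and X1 are perfectly correlated, X2 is independent of both,
# so the learned tree must contain the edge (0, 1).
data = [(0, 0, 0), (1, 1, 0), (0, 0, 1), (1, 1, 1)]
tree = chow_liu_tree(data, 3)
```

The fractal tree algorithms reuse this spanning-tree machinery on each of the independent subproblems produced by the divide-and-conquer decomposition, which is what keeps the overall cost low and makes the subproblems easy to parallelize.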