@InProceedings{UAI96*211,
  author    = "Rina Dechter",
  title     = "Bucket elimination: {A} unifying framework for probabilistic inference",
  booktitle = "Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence ({UAI}-96)",
  editor    = "Eric Horvitz and Finn Jensen",
  pages     = "211--219",
  month     = aug # "~1--4",
  year      = "1996",
  publisher = "Morgan Kaufmann Publishers",
  address   = "San Francisco",
  ISBN      = "1-55860-412-X",
}
Bucket elimination is a unifying algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning activities, including directional resolution for propositional satisfiability (Davis and Putnam, 1960), adaptive consistency for constraint satisfaction (Dechter and Pearl, 1987), Fourier and Gaussian elimination for linear inequalities and equalities, and dynamic programming for combinatorial optimization (Bertele and Brioschi, 1972).
Directional resolution (DR)
Directional resolution is the core of the well-known Davis-Putnam algorithm for satisfiability. It systematically eliminates atoms from the set of clauses, resolving over each atom in turn so that no further resolution over that atom is needed.
The algorithm, outlined in figure 1 (page 2), is described using buckets that partition the set of clauses in the theory Phi. We call its output theory, E_{d}(Phi), the directional extension of Phi. Given an ordering d = Q_{1}, ..., Q_{n}, all the clauses containing Q_{i} that do not contain any symbol higher in the ordering are placed in the bucket of Q_{i}, denoted bucket_{i}. The algorithm processes the buckets in reverse order of d. When processing bucket_{i}, it resolves over Q_{i} all possible pairs of clauses in the bucket and inserts the resolvents into the appropriate lower buckets.
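The bucket partitioning and reverse-order processing can be sketched in Python. This is an illustrative encoding, not the paper's pseudocode: clauses are frozensets of signed integers, where k stands for atom Q_k and -k for its negation, and the ordering is taken to be Q_1, ..., Q_n.

```python
from itertools import product

def directional_resolution(clauses, n):
    """Directional resolution (sketch). Clauses are frozensets of nonzero
    ints: k means atom Q_k, -k its negation. The ordering d is Q_1..Q_n.
    Returns the buckets of the directional extension E_d, or None if the
    empty clause is derived (the theory is unsatisfiable)."""
    buckets = [[] for _ in range(n + 1)]
    for c in clauses:
        # A clause goes into the bucket of its highest atom in the ordering.
        buckets[max(abs(l) for l in c)].append(c)
    for i in range(n, 0, -1):                    # process buckets in reverse order of d
        pos = [c for c in buckets[i] if i in c]
        neg = [c for c in buckets[i] if -i in c]
        for cp, cn in product(pos, neg):
            resolvent = (cp - {i}) | (cn - {-i})  # resolve the pair over Q_i
            if any(-l in resolvent for l in resolvent):
                continue                          # tautology; discard
            if not resolvent:
                return None                       # empty clause derived
            # Insert the resolvent into the appropriate lower bucket.
            buckets[max(abs(l) for l in resolvent)].append(resolvent)
    return buckets
```

For instance, the four clauses over Q_1, Q_2 that force a contradiction yield the empty clause, while a satisfiable theory returns its (non-empty) directional extension.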
It was shown that if the algorithm returns a non-empty theory (i.e., the empty clause is never derived), then the theory is satisfiable, and a satisfying truth assignment can be generated in time linear in the size of the resulting theory.
Theorem 1.1 (model generation)
Let Phi be a CNF formula, d = Q_{1}, ..., Q_{n} an ordering, and E_{d}(Phi) its directional extension. Then, if the extension is not empty, any model of Phi can be generated in a backtrack-free manner, consulting E_{d}(Phi) in the order d as follows: assign to Q_{1} a truth value that is consistent with the clauses in bucket_{1} (if the bucket is empty, assign Q_{1} an arbitrary value); after assigning values to Q_{1}, ..., Q_{i-1}, assign a value to Q_{i} so that, together with the previous assignments, it satisfies all clauses in bucket_{i}.
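The backtrack-free generation procedure of Theorem 1.1 can be sketched as follows, assuming the same clause encoding as above and a bucket list indexed 1..n (such as the output of a directional-resolution pass):

```python
def generate_model(buckets, n):
    """Backtrack-free model generation (Theorem 1.1 sketch): walk the
    buckets of a non-empty directional extension in the order d, choosing
    for each Q_i a value consistent with bucket_i and the assignment so far."""
    assign = {}                                  # atom index -> bool
    for i in range(1, n + 1):
        for value in (True, False):
            assign[i] = value
            # Every clause in bucket_i mentions only Q_1..Q_i, all assigned,
            # so it is satisfied iff some literal is true under `assign`.
            # An empty bucket accepts any value (all() over no clauses is True).
            if all(any(assign[abs(l)] == (l > 0) for l in c)
                   for c in buckets[i]):
                break                            # consistent value found; no backtracking
        else:
            # By Theorem 1.1 this cannot happen for a directional extension.
            raise ValueError("input is not a directional extension")
    return assign
```

On the extension of the theory {Q_1 v Q_2, ~Q_2 v Q_3, ~Q_1 v Q_3}, for example, the walk assigns each atom True and never backtracks.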
Tasks
The following four belief network algorithms have much in common with the resolution procedure above. They all compile a theory into a backtrack-free (i.e., greedily solvable) one, and their complexity depends on the same graph parameter, the induced width.
Given an evidence set e (an instantiated subset of variables), the tasks are: belief assessment, finding the most probable explanation (MPE), finding the maximum a posteriori hypothesis (MAP), and finding the maximum expected utility (MEU).
A serious concern with elimination algorithms is efficiency: they normally take exponential time and space, since considerable memory is required to record the intermediate functions. Conditioning is an important method for reducing the space complexity, and it can be incorporated on top of elimination.
Conditioning is a generic name for algorithms that search the space of partial value assignments, or partial conditionings. Conditioning splits a problem into subproblems by fixing the values of a chosen subset of variables. The time complexity of conditioning algorithms is exponential in the size of the conditioning set, but their space complexity is only linear.
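A generic conditioning loop can be sketched as follows, again using the signed-integer clause encoding. The `solve` callback is a hypothetical stand-in for any elimination routine applied to the residual subproblem; enumerating the 2^|C| assignments to the conditioning set C costs exponential time but only linear space.

```python
from itertools import product

def conditioning_search(clauses, cond_vars, solve):
    """Conditioning sketch: enumerate all assignments to the conditioning
    set (exponential time in |cond_vars|, linear space), simplify the
    clauses under each assignment, and pass the residual subproblem to
    `solve` (a hypothetical callback, e.g. an elimination procedure)."""
    for values in product([True, False], repeat=len(cond_vars)):
        assign = dict(zip(cond_vars, values))
        simplified, conflict = [], False
        for c in clauses:
            if any(assign.get(abs(l)) == (l > 0) for l in c):
                continue                        # clause satisfied; drop it
            residue = frozenset(l for l in c if abs(l) not in assign)
            if not residue:
                conflict = True                 # clause falsified by this conditioning
                break
            simplified.append(residue)
        if not conflict and solve(simplified):
            return assign                       # a satisfiable branch was found
    return None
```

When the conditioning set covers all variables, `solve` degenerates to checking that no clauses remain, and the loop is a plain (space-linear) exhaustive search.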