Bessiere, Christian [Author];
Carbonnel, Clément [Author];
Cooper, Martin C. [Author];
Hebrard, Emmanuel [Author]
Complexity of Minimum-Size Arc-Inconsistency Explanations
Notes:
This data source also contains holdings records that do not lead to a full text.
Description:
Explaining the outcome of programs has become one of the main concerns in AI research. In constraint programming, a user may want the system to explain why a given variable assignment is not feasible or how it came to the conclusion that the problem does not have any solution. One solution to the latter is to return to the user a sequence of simple reasoning steps that lead to inconsistency. Arc consistency is a well-known form of reasoning that can be understood by a human. We consider explanations as sequences of propagation steps of a constraint on a variable (i.e. the ubiquitous revise function in arc consistency algorithms) that lead to inconsistency. We characterize, on binary CSPs, cases for which providing a shortest such explanation is easy: when domains are Boolean or when variables have maximum degree two. However, these polynomial cases are tight. Providing a shortest explanation is NP-hard if the maximum degree is three, even if the number of variables is bounded, or if domain size is bounded by three. It remains NP-hard on trees, despite the fact that arc consistency is a decision procedure on trees. Finally, the problem is not FPT-approximable unless the Gap-ETH is false.
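For readers unfamiliar with the revise operation mentioned in the abstract, the following Python sketch (an illustration under assumed names such as revise, domains and neq, not code from the paper) shows a standard revise step on a binary CSP and how such a step can empty a domain, which is what the explanations studied here record as a sequence.

```python
def revise(domains, constraint, x, y):
    """Prune values of x with no support in y; return True if any value was removed."""
    removed = False
    for a in list(domains[x]):
        # a value of x is supported if some value of y satisfies the constraint with it
        if not any(constraint(a, b) for b in domains[y]):
            domains[x].discard(a)
            removed = True
    return removed

# Hypothetical toy instance: constraint x != y with both domains already reduced to {0}.
domains = {"x": {0}, "y": {0}}
neq = lambda a, b: a != b

revise(domains, neq, "x", "y")   # removes 0 from x's domain
print(domains["x"])              # set(): an empty domain, i.e. arc inconsistency detected
```

In this toy example a single revise step already empties a domain, so the explanation sequence has length one; in general, the paper asks how hard it is to find a shortest such sequence of revise steps leading to an empty domain.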