For each rule, we introduce a hidden cause C_i and add the following rules to the MLN:

1. P_i1 ∧ P_i2 ∧ ⋯ ∧ P_ik_i ⇔ C_i, ∀ i, ( 1 ≤ i ≤ n ) (soft)
2. C_i ⇒ Q, ∀ i, ( 1 ≤ i ≤ n ) (forward implication, hard)
3. Q ⇒ C_1 ∨ C_2 ∨ ⋯ ∨ C_n (reverse implication, hard)
4. True ⇒ C_i, ∀ i, ( 1 ≤ i ≤ n ) (negatively weighted, soft)

The first set of rules are soft clauses with high positive weights. This allows the antecedents to sometimes fail to cause the consequent (and vice versa). The next two sets of rules are hard clauses that implement a deterministic-or function between the consequent and the hidden causes. The last one is a soft rule and implements a low prior (by having a negative MLN weight) on the hidden causes. These low priors discourage inferring multiple hidden causes for the same consequent (“explaining away”), and the strength of the prior determines the degree to which multiple explanations are allowed.

Different sets of weights on the biconditional in the first set of rules implement different ways of combining multiple explanations. For example, a noisy-or model can be implemented by modeling the implication from the antecedents to the hidden cause as a soft constraint and the reverse direction as a hard constraint. The weight w_i for the soft constraint is set to log((1 − p_f_i) / p_f_i), where p_f_i is the failure probability for cause i. This formulation is related to previous work done by Natarajan et al. on combining functions in Markov logic; however, we focus on the use of such combining functions in the context of abductive reasoning.
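The hidden-cause construction above can be sketched in code. The following is a minimal illustration, not the paper's implementation: the function names, the schematic clause syntax, and the placeholder weight values (`soft_weight=10.0`, `prior_weight=-0.5`) are assumptions chosen for readability, and `noisy_or_weight` assumes the weight formula w_i = log((1 − p_f_i) / p_f_i) described in the text.

```python
import math

def noisy_or_weight(p_fail):
    """Soft-constraint weight for a noisy-or combining function,
    assuming w_i = log((1 - p_f_i) / p_f_i), where p_f_i is the
    failure probability for cause i (0 < p_fail < 1)."""
    return math.log((1.0 - p_fail) / p_fail)

def hidden_cause_mln(rules, consequent, soft_weight=10.0, prior_weight=-0.5):
    """Given abductive rules that share one consequent Q (each rule is a
    list of antecedent literal names), emit the four rule sets of the
    hidden-cause construction as (formula, weight) pairs.
    A weight of None marks a hard clause."""
    causes = [f"C{i + 1}" for i in range(len(rules))]
    clauses = []
    # 1. Biconditional between antecedents and hidden cause (soft, high weight):
    #    lets the antecedents sometimes fail to cause the consequent.
    for ants, c in zip(rules, causes):
        clauses.append((f"{' ^ '.join(ants)} <=> {c}", soft_weight))
    # 2. Forward implication: each hidden cause implies the consequent (hard).
    for c in causes:
        clauses.append((f"{c} => {consequent}", None))
    # 3. Reverse implication: the consequent implies some hidden cause (hard).
    clauses.append((f"{consequent} => {' v '.join(causes)}", None))
    # 4. Negatively weighted unit clause: a low prior on each hidden cause,
    #    which discourages explaining the same consequent multiple ways.
    for c in causes:
        clauses.append((c, prior_weight))
    return clauses
```

For two rules P11 ∧ P12 ⇒ Q and P21 ⇒ Q, `hidden_cause_mln([["P11", "P12"], ["P21"]], "Q")` yields two soft biconditionals, two hard forward implications, one hard reverse implication, and two negatively weighted priors, mirroring rule sets 1 through 4 above.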