Unconditional (prior) joint probabilities for independent events A and B (e.g., two separate coin tosses):
  P(A ^ B) = P(A) * P(B)
For mutually exclusive events (e.g., a single toss resulting in head or tail):
  P(A V B) = P(A) + P(B)

Conditional joint probabilities for events A and B that influence each other (e.g., disease and ache):
  P(A ^ B) = P(A|B) * P(B) = P(B|A) * P(A)
  P(A V B) = P(A) + P(B) - P(A ^ B)

Any query may be answered using the full joint probability table. If there are v variables in the KB, each with n domain elements, the joint probability table has n x n x n x ... (v times) = n^v entries, of which n^v - 1 are independent, since all entries must sum to 1.

Note: P(A, D | b, c) = P(A | b, c) * P(D | b, c) only if A and D are conditionally independent given b and c.
However: P(A | b, c) is NOT P(A | b) * P(A | c).

The global semantics of probabilistic reasoning is the exhaustive joint probability distribution over all variable domains. Different choices of the initial ordering of variables may produce different Bayesian networks.

In a Bayesian network, a node n with p Boolean parents has a conditional probability table with 2^p rows. One column for n=T (or n=F) will suffice; the other column may be computed, since the two must sum to 1. If n is categorical with d domain values, how many columns do you need? (d - 1 suffice.)

OTHER TYPES OF REASONING:

Dempster-Shafer: P(A) + P(~A) <= 1; the remainder 1 - P(A) - P(~A) is "unknown" (unassigned belief).

Fuzzy Logic: "Nate is Tall" may be difficult to formulate in terms of probabilistic logic. Tall(Nate) is a degree of truth, not a probability, although it lies in [0, 1].
  T(A and B) = min(T(A), T(B))   (Compare: what is P(A and B)?)
  T(A or B) = max(T(A), T(B))
  T(~A) = 1 - T(A)
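The idea that any query can be answered from the full joint table can be sketched as follows. This is a minimal illustration with two Boolean variables (Disease and Ache, as in the example above); the probability values are hypothetical, chosen only so that all 2^2 = 4 entries sum to 1.

```python
# Full joint distribution over (Disease, Ache); values are hypothetical.
joint = {
    (True,  True):  0.04,   # P(Disease=T, Ache=T)
    (True,  False): 0.06,   # P(Disease=T, Ache=F)
    (False, True):  0.10,   # P(Disease=F, Ache=T)
    (False, False): 0.80,   # P(Disease=F, Ache=F)
}

def marginal(var_index, value):
    """Marginalize: sum all joint entries where variable var_index == value."""
    return sum(p for assign, p in joint.items() if assign[var_index] == value)

def conditional(d_value, a_value):
    """P(Disease=d_value | Ache=a_value), by normalizing over the joint table."""
    return joint[(d_value, a_value)] / marginal(1, a_value)

print(marginal(0, True))        # P(Disease) = 0.04 + 0.06 = 0.10
print(conditional(True, True))  # P(Disease | Ache) = 0.04 / 0.14, about 0.286
```

Any other query (a disjunction, a joint of several assignments) reduces to the same pattern: select the matching rows of the table, sum, and normalize if the query is conditional.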
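The fuzzy-logic truth operators above can be sketched directly; the degrees of truth for Nate are hypothetical values picked for illustration.

```python
# Fuzzy truth operators: T(x) is a degree of truth in [0, 1], not a probability.
def t_and(a, b):
    return min(a, b)

def t_or(a, b):
    return max(a, b)

def t_not(a):
    return 1.0 - a

# Hypothetical degrees of truth: Tall(Nate) = 0.7, Strong(Nate) = 0.4
tall, strong = 0.7, 0.4
print(t_and(tall, strong))  # min(0.7, 0.4) = 0.4
print(t_or(tall, strong))   # max(0.7, 0.4) = 0.7
print(t_not(tall))          # 1 - 0.7 = 0.3 (up to floating-point rounding)
```

Note the contrast with probability: for independent events P(A ^ B) = P(A) * P(B), whereas fuzzy conjunction takes the minimum and carries no independence assumption.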