Edit ‘decision_theory/xor_blackmail’
parent 371a4f9a3f
commit d1476ee87b
7
decision_theory/xor_blackmail.myco
Normal file
@@ -0,0 +1,7 @@
The XOR blackmail problem shows that [[EDT|Decision Theory]] agents can be [[money-pumped]].
Say [[Omega]], a perfectly honest and perfect predictor of your actions, sends you the following message: "I have sent you this message [[iff]] your house has been infested with [[termites]] [[xor]] you will pay me $1000." Suppose further that a termite infestation would cost you $1,000,000.
A CDT agent reasons that whether they pay Omega has no causal influence on whether their house is infested with termites, and so elects not to pay. Omega predicts this, and so only sends the letter to a CDT agent when the house really is infested; the agent loses $1,000,000, but would have lost it regardless. An EDT agent, by contrast, treats paying as evidence: conditional on receiving the letter, paying implies the house is not infested, so the agent pays $1000 every time the letter arrives, even though paying has no effect on the termites.
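The contrast between the two decision theories can be sketched numerically. This is an illustrative model, not part of the original article: the function and variable names (`letter_sent`, `edt_expected_cost`, `cdt_cost`) are my own, and infestation is treated as a single binary variable.

```python
# Omega's rule: the letter is sent iff (infested XOR you pay).
PAY_COST = 1_000
TERMITE_COST = 1_000_000

def letter_sent(infested: bool, pays: bool) -> bool:
    return infested != pays  # XOR

def edt_expected_cost(pays: bool) -> int:
    """EDT conditions on having received the letter: among worlds
    consistent with the letter arriving, what does each policy cost?"""
    worlds = [(infested, pays) for infested in (False, True)
              if letter_sent(infested, pays)]
    # With one binary infestation variable, exactly one world remains.
    (infested, _), = worlds
    return TERMITE_COST * infested + PAY_COST * pays

def cdt_cost(infested: bool, pays: bool) -> int:
    """CDT holds the infestation fixed: paying never changes it."""
    return TERMITE_COST * infested + PAY_COST * pays

# To EDT, paying looks $999,000 cheaper conditional on the letter...
assert edt_expected_cost(pays=True) == 1_000
assert edt_expected_cost(pays=False) == 1_000_000
# ...but causally, paying is exactly $1,000 worse in every world.
for infested in (False, True):
    assert cdt_cost(infested, True) - cdt_cost(infested, False) == PAY_COST
```

The asserts make the trap explicit: conditioning on the letter makes paying look like it averts the $1,000,000 loss, while the causal comparison shows paying only ever adds $1,000.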
A helpful expository video from our research teams has been attached.