Will Yudkowsky claim that he is more than 90% sure that AI will kill everyone no later than 1 year after the claim?
Prediction market on Manifold.

Resolution criteria:
- Claim of probability: Eliezer Yudkowsky must make a public claim that he is more than 90% sure that AI will kill everyone no later than one year after the claim is made. The claim must be made in all seriousness, not as a joke or sarcasm.
- Qualifying statements: Beyond explicit quantitative claims, statements such as "certain", "almost sure", "pretty certain", "almost inevitable", or similar expressions indicating a probability higher than 90% will qualify.
- Verification of ambiguity: Where a claim is ambiguous, efforts will be made to ask Yudkowsky explicitly whether his statement means the probability is higher than 90%.
- If the resolution criteria are not met, the market will resolve NO on January 1, 2036.

A related market (the same question but with 50% instead of 90%): https://manifold.markets/IhorKendiukhov/will-yudkowsky-claim-that-he-is-mor
Liquidity: $10,000. Resolves: January 1, 2036.