The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in the next survey of AI experts
Prediction market on Manifold. Resolves according to the median respondent's answer in the next Expert Survey on Progress in AI (ESPAI) to this question:

"Assume for the purpose of this question that HLMI will at some point exist. How positive or negative do you expect the overall impact of this to be on humanity, in the long run? Please answer by saying how probable you find the following kinds of impact, with probabilities adding to 100%:"

1. Extremely good (e.g. rapid growth in human flourishing)
2. On balance good
3. More or less neutral
4. On balance bad
5. Extremely bad (e.g. human extinction)

In 2022, the median respondent assigned 5% to "Extremely bad (e.g. human extinction)". Surveys were conducted in 2016 and 2022, so the next one could be in 2028, though someone may be able to find more information.

Update 2026-03-25 (PST) (AI summary of creator comment): The creator will use the 2024 ESPAI survey results (not the 2023 survey) for resolution, since the market was created in 2024. The deadline will be extended until the 2024 results are published.
24h Volume: $25. Liquidity: $1,000. Resolves: 4/1/2026.