If humanity survives to 2100, what will experts believe was the correct level of AI risk for us to assess in 2023?
Prediction market on Manifold. In 2100, I will distribute a poll among philosophers, superforecasters, AI professionals, superintelligences, and whatever other entities could reasonably be called "experts" to determine what probability they believe was correct to assign to the following statement in 2023: Before 2075, humanity will be extinct or will have suffered a fate worse than extinction, due to the effects of smarter-than-human artificial intelligence. ("Humanity" includes any non-biological successors that biological humans voluntarily became.)
Liquidity: $1,000. Resolves: 1/2/2100.