
Will LLM hallucinations be a fixed problem by the end of 2028?

Prediction market on Manifold. https://fortune.com/2023/08/01/can-ai-chatgpt-hallucinations-be-fixed-experts-doubt-altman-openai/

“This isn’t fixable,” said Emily Bender, a linguistics professor and director of the University of Washington’s Computational Linguistics Laboratory. “It’s inherent in the mismatch between the technology and the proposed use cases.”

How true will this end up being? At the end of 2028 I will evaluate whether the hallucination problem for LLMs has been fixed or still exists. If hallucinations have been solved, this market resolves YES. If the hallucination problem still exists, this market resolves NO.

This is a follow-up to a related market: https://manifold.markets/embed/SneakySly/will-llm-hallucinations-be-a-fixed?r=U25lYWt5U2x5

Update 2025-05-05 (PST) (AI summary of creator comment): The evaluation will focus on the full AI product available to users (e.g., a future version of ChatGPT such as GPT-6), rather than just the raw underlying Large Language Model (LLM) in isolation. This means that systems which integrate the LLM with auxiliary components such as:

- Web search
- Knowledge bases
- Citation mechanisms

to reduce or eliminate hallucinations will be considered when determining whether the problem is "fixed".

Liquidity: $3,870. Resolves: 1/1/2029.
