Is attention all you need? (transformers SOTA in 2027)
Prediction market on Manifold. This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp). Details can be found at https://www.isattentionallyouneed.com/

Proposition: On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.

Other markets on the same question: (https://manifold.markets/embed/HanchiSun/is-attention-all-you-need?r=Zmlyc3R1c2VyaGVyZQ) (https://manifold.markets/embed/ICRainbow/on-january-1-2027-a-transformerlike?r=Zmlyc3R1c2VyaGVyZQ)
Liquidity: $3,625.25. Resolves: 1/31/2027.