
Part 2/5:

This raises several important points. First, it's clear that language models are still improving, but at a slower rate. As one OpenAI investor noted, "We're increasing the number of GPUs used to train AI, but we're not getting the intelligence improvements out of it at all." This suggests the scaling paradigm may be reaching its limits, a possibility Dario Amodei hinted at when he said that training the next generation of models could soon cost hundreds of billions of dollars.

However, the story is not one of pure pessimism. The author delves into the potential for progress to come from a different angle: data efficiency. The FrontierMath benchmark, which tests language models on challenging mathematical problems, highlights the importance of access to relevant training data. As one researcher noted, "These problems typically demand hours or even days for specialist mathematicians to solve," and even the renowned Terence Tao struggled with many of them.

[...]