
Ok, I'm getting there now. I didn't realise the number of tokens made such a difference!

Yeah. Tokens are how we measure data, and it is all about data: the more there is for the models to train on, the better. Llama 3 was trained on roughly 15 trillion tokens, so you can see how much is needed.
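To give a rough feel for what a token is, here is a minimal sketch of counting them. A crude regex split stands in for a real BPE tokenizer, so the numbers are only illustrative:

```python
import re

def rough_token_count(text: str) -> int:
    # Split on word characters and punctuation as a stand-in for a
    # real BPE tokenizer; actual token counts depend on the model's
    # vocabulary, so treat this as a rough estimate only.
    return len(re.findall(r"\w+|[^\w\s]", text))

post = "Llama 3 was trained on roughly 15 trillion tokens of text."
print(rough_token_count(post))  # about a dozen "tokens" for one sentence
```

One sentence is only a handful of tokens, which is why it takes an enormous amount of text to reach the trillions.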

I would say Meta's next model tops 50 trillion tokens in training.

We need to feed LeoAI more, while also getting as much as we can onto the blockchain. The latter makes the data available to everyone.

Ok, but I guess it’s not only about volume, but also quality?

Well, quality is not what you might think. We judge quality in terms of well-written posts, but that is not how computers see it. For a knowledge graph, for example, a table of data is higher quality because it is already structured in a way the computer understands.
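Here is a small sketch of why a table maps onto a knowledge graph so cleanly (illustrative account data, not LeoAI's actual schema):

```python
# Each table row maps directly to subject-predicate-object triples,
# with no parsing guesswork. Free-form prose would need NLP to
# extract the same facts, which is where the "quality" gap comes from.
rows = [
    {"account": "alice", "posts": 120, "joined": "2021-03"},
    {"account": "bob", "posts": 45, "joined": "2022-07"},
]

triples = []
for row in rows:
    subject = row["account"]
    for predicate, value in row.items():
        if predicate != "account":
            triples.append((subject, predicate, value))

for triple in triples:
    print(triple)  # e.g. ('alice', 'posts', 120)
```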

Ok, I get it. I'll keep posting long-form content anyway, because I like it. Every little bit counts!

It all counts, but we have to keep increasing it. The token count keeps growing, yet so does the need. Each thread is fed in along with the posts, as in the sketch below. We just have to keep adding a lot more data (tokens).
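A tiny sketch of that thread-to-training-data idea (hypothetical thread; a crude whitespace split stands in for a real tokenizer):

```python
# A thread (original post plus replies) becomes one long training
# document, and its token count is what adds to the total.
thread = {
    "post": "Tokens are how models measure training data.",
    "replies": [
        "So more posts means more tokens?",
        "Yes, every thread adds to the total count.",
    ],
}

document = thread["post"] + "\n" + "\n".join(thread["replies"])
print("rough token count:", len(document.split()))
```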