Now there are models training other models. I am running a local DeepSeek R1 distilled version of Llama 70B, which means Llama has been trained to think and reason like DeepSeek. These kinds of things are changing the game already.
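For anyone curious about running the same thing, here is a minimal sketch using Hugging Face transformers; the repo ID and generation settings are assumptions, and in practice a 70B model usually needs quantization or a runtime like Ollama or llama.cpp to fit on consumer hardware.

```python
# Minimal sketch: loading a DeepSeek-R1 distilled Llama 70B locally.
# The repo ID below is an assumption; adjust to whatever checkpoint you use.
# A full-precision 70B model needs a lot of GPU memory, so most people run
# a quantized build instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Llama-70B"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The distilled model writes out its reasoning before the final answer.
prompt = "Explain why model distillation compounds progress across labs."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```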

That is what most people overlook: there is a convergence of technologies that produces exponential effects.

The advancement of Llama is compounded by DeepSeek, which is advancing on its own as well.

Yep, and people relying on the corporate frontier models are the ones who will get left behind in the end. If you aren't building your own infrastructure, you will be controlled in the future, so it's best to get ahead of it now.

Exactly, those billions fuel the innovation we all tap into for free. Smaller models like Rafiki democratize the gains, letting us focus on real mindset shifts instead of the tech grind.