RE: LeoThread 2025-10-18 14-48

in LeoFinance, 2 months ago

Part 9/15:

Karpathy emphasizes that most of the real "cognitive work" resides in efficient representations and algorithms, not merely raw memory stored in parameters. This has profound implications: smaller, distilled models built on better data and architectures may suffice, potentially yielding truly capable "cognitive cores."
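The distillation idea referenced here is commonly implemented as a soft-label matching objective: a small student model is trained to reproduce the softened output distribution of a larger teacher. A minimal sketch (illustrative only, not code from the talk; function names are ours) in NumPy:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; T > 1 flattens the distribution
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The softened targets expose the teacher's relative confidence in
    near-miss classes, which carries more signal than hard labels.
    The T**2 factor keeps gradient magnitudes comparable across
    temperatures (as in the classic distillation formulation).
    """
    p = softmax(teacher_logits, T)
    log_q = np.log(softmax(student_logits, T))
    kl = (p * (np.log(p) - log_q)).sum(axis=-1)
    return kl.mean() * T ** 2
```

In practice this term is mixed with an ordinary cross-entropy loss on ground-truth labels, and minimizing it drives the student toward the teacher's behavior with far fewer parameters.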


The Future of Large Models and Their Economics

Karpathy foresees a steady evolution of AI architectures: not necessarily bigger models, but smarter ones. Progress will likely come from more efficient attention mechanisms, sparse structures, and improved algorithms rather than explosive growth in parameter count.