Hmm, it runs on CPU and only uses a 3GB model. It would be cool if they took that GUI, allowed it to run on the best available hardware in the system, and used some larger models.
I have ChatGPT-4 and use it for work. But I also like to experiment and adapt things myself, which is why I like to play with these tools and try to learn more. Six months ago, this would have been a miracle. Now it's fun to play.
It is certainly fun to play. There are a bunch of chatbots out there; a lot of them are trash when run locally, and a lot are just "circle simulators" that take in what you said, modify it, and answer a question with a question.
I've not played with ChatGPT-4 (I don't feel as though I have an application where the "plebeian" version won't answer the question I have), but when I do, I'm sure I'll find something useful to do with it.
I expected nothing and was surprised. I had previously tried Alpaca Electron, and that was a horror: slow, and the output was best forgotten. Besides the 3.5 GB model there are also 7.5 GB models, but even the 3.5 GB one has been fun.