If you got used to this, you would get rusty at creating journal entries when not connected to the internet, correct? (because using ChatGPT requires internet connectivity)
A powerful enough personal computer could run LLaMA locally. I don’t think the raw model is optimized for chat, but with a suitable prompt, you might be able to get it into chat mode long enough to do this kind of thing. It also wouldn’t surprise me to learn that there are more specialized derivatives now that would be suitable. I’ve certainly heard of people working on it.
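The "suitable prompt" trick mentioned above usually amounts to framing the conversation as a transcript that a base (non-chat) model will naturally continue. A minimal sketch of that prompt construction, with the function name, roles, and journal framing all illustrative (the resulting string would be fed to a local runner such as llama.cpp):

```python
def build_chat_prompt(history, user_message):
    """Wrap a conversation as a labeled transcript so a base completion
    model continues it in a chat-like style. Format is illustrative."""
    lines = [
        "The following is a conversation between a user keeping a "
        "personal journal and a helpful assistant.",
        "",
    ]
    # Replay prior turns as "Role: text" lines.
    for role, text in history:
        lines.append(f"{role}: {text}")
    lines.append(f"User: {user_message}")
    # Leave the assistant's turn open so the model completes it.
    lines.append("Assistant:")
    return "\n".join(lines)

prompt = build_chat_prompt(
    [("User", "Help me start today's entry."),
     ("Assistant", "Sure. What happened today?")],
    "I went hiking and saw a hawk.",
)
```

Because the prompt ends mid-turn at "Assistant:", the model's most likely continuation is the assistant's reply, which is enough to get chat-like behavior out of a raw completion model.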
Thanks; have six points of upvotedness.