I’m interested! I’d probably mostly be comparing it to unaugmented Claude for things like explaining ML topics and turning my post ideas into drafts (I don’t expect it to be great at this latter but I’m curious whether having some relevant posts in the context window will elicit higher quality). I also think the low-friction integration might make it useful for clarifying math- or programming-heavy posts, though I’m not sure I’ll want this often.
You now have access to the LW LLM Chat prototype!
That’s actually one of my favorite use-cases.