Thanks for the natural language stochastic compiler explanation, it makes a lot of sense. I broadly get what you mean by “context window”, since people have been mentioning it a lot when talking about GPT-3. As for whether it makes sense to write docstrings for trivial things: I think that only applies to the Codex demo examples, where people write docstrings and get results. But where it gets really interesting for most of my use cases is when it auto-completes 1) while I’m writing, 2) when I’m done writing and it guesses the next line, or 3) when I start a line with “return ” or “x = ” and wait for its auto-completion. In those cases I would have no idea how to formulate a docstring; I just generally trust its ability to follow the logic of the code that precedes it (and I find it useful most of the time).
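To make use case 3 concrete, here is a sketch (hypothetical function and data, not an actual Codex transcript) of the kind of spot where I'd type “return ” and pause: the preceding logic is enough context for the model to guess the intended value without any docstring.

```python
def count_valid_emails(addresses):
    # Keep only addresses that contain an "@" sign.
    valid = [a for a in addresses if "@" in a]
    # After typing "return " on the next line and waiting, a Codex-style
    # completion can plausibly infer the intent from the code above,
    # e.g. suggesting the count of the filtered list:
    return len(valid)

print(count_valid_emails(["a@b.com", "nope", "c@d.org"]))  # prints 2
```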