I didn’t explain it, but from playing with it I had the impression that it did understand what “temperature” was reasonably well. For example, gpt-4-0613, which is the checkpoint I tested, answers the question “What is "temperature", in context of large language models?” with “In the context of large language models like GPT-3, "temperature" refers to a parameter that controls the randomness of the model's responses. A higher temperature (e.g., 0.8) would make the output more random, whereas a lower temperature (e.g., 0.2) makes the output more focused and deterministic. [...]”
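For concreteness, here is a minimal sketch of what the temperature parameter does at sampling time, assuming standard temperature-scaled softmax sampling (the function name and the NumPy implementation are illustrative, not anything specific to the OpenAI API or to my experiment):

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from raw logits after temperature scaling.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more random). Temperature must be > 0.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()            # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Example: with temperature 0.2 nearly all probability mass sits on the
# highest logit; with temperature 2.0 the choice is much more random.
logits = [2.0, 1.0, 0.5]
print(sample_with_temperature(logits, temperature=0.2))
print(sample_with_temperature(logits, temperature=2.0))
```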
Another thing I wanted to do was compare GPT-4's performance to people's performance on this task, but I never got around to doing it.