Oh interesting, I’m out at the moment and don’t recall having this issue, but if you override the default number of threads for the repo to 1, does that fix it for you?
https://github.com/openai/evals/blob/main/evals/eval.py#L211
(There are two places in this file where `threads =` appears; I would change 10 to 1 in each.)
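For context, the lines in question follow roughly the pattern sketched below. This is only an illustration, not the exact contents of evals/eval.py, and it assumes the default of 10 is the fallback used when the EVALS_THREADS environment variable is unset:

```python
import os

# Illustrative sketch of the thread-count lines in evals/eval.py
# (the exact code in the repo may differ). The runner falls back to
# 10 worker threads when EVALS_THREADS is not set.
threads = int(os.environ.get("EVALS_THREADS", "10"))

# Changing the fallback from "10" to "1" in both places forces a
# single-threaded run, which can help when debugging concurrency issues.
# threads = int(os.environ.get("EVALS_THREADS", "1"))
```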
Don’t really want to touch the packages, but just setting the EVALS_THREADS environment variable worked.
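(For anyone else hitting this, a minimal sketch of the same override done from Python, assuming it is set before the evals runner reads its thread count; exporting EVALS_THREADS=1 in the shell before launching the run works equivalently:)

```python
import os

# Force single-threaded evaluation without editing the installed package.
# Must be set before the evals runner reads EVALS_THREADS; setting
# EVALS_THREADS=1 in the shell before starting the run has the same effect.
os.environ["EVALS_THREADS"] = "1"
```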
Great! Appreciate you letting me know & helping debug for others