Rohin, thank you for the especially long and informative newsletter.
When there are more samples, we get a lower validation loss [...]
I guess you meant a higher validation loss?
No, I think lower is correct?
More samples --> more data to fit to --> less chance to overfit to noise in the training data --> better performance on held-out validation data --> lower validation loss.
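That chain of reasoning can be sketched with a toy regression experiment (my own illustration, not from the newsletter): fit a deliberately flexible polynomial to a small vs. a large noisy training set and compare mean squared error on held-out data. The sine target, noise level, and degree-9 model are all arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Noisy samples of a sine curve (toy stand-in for any regression task).
    x = rng.uniform(0, 1, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)
    return x, y

def val_loss(n_train):
    x_tr, y_tr = make_data(n_train)
    x_val, y_val = make_data(1000)          # held-out validation set
    coeffs = np.polyfit(x_tr, y_tr, deg=9)  # flexible model, prone to overfitting
    pred = np.polyval(coeffs, x_val)
    return np.mean((pred - y_val) ** 2)     # validation MSE

small = val_loss(15)    # few samples: the fit chases noise
large = val_loss(500)   # many samples: noise averages out

print(f"val loss, n=15:  {small:.3f}")
print(f"val loss, n=500: {large:.3f}")
```

With the larger training set, the validation loss comes out lower, matching the "lower is correct" reading.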