You should make the choice that brings the highest utility. While truths are in general more helpful than falsehoods, that need not hold in every case, even for a truly rational agent. The best falsehood will, in all probability, be better than the worst truth. Even if you exclusively value truth, there will most likely be some lie that leaves you with a more accurate overall model, while the worst possible non-misleading truth will have negligible effect. As such, you should choose the black box.
I don’t see why this would be puzzling.