I mean even if Lanrian’s corrections are based on perfect recall, none of them would make any of my notes “pretty false”. He hedged more here, that warning was more specific, the AGI definition was more like “transformative AGI”—these things don’t even go beyond the imprecision in Sam Altman’s answers.
The only point where I think I should have been more precise is about the different “loss function”. That was my interpretation in the moment, but it now seems much more uncertain to me whether that was actually what he meant.
I don’t care about the frontpage, but if this post is seen by some as “false gossip and rumors about someone’s views” I’d rather take it down.