Another solution would be to insert those texts by JavaScript, which means that users with JavaScript disabled would not see them.
They’re already inserted by JavaScript. E.g., the ‘recent comments’ section works by fetching http://lesswrong.com/api/side_comments and inserting its contents directly into the page.
Editing robots.txt might exclude those parts from the Google index, but I’m not sure.
I think robots.txt would work.
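A minimal robots.txt sketch of what that could look like, using the endpoint mentioned above (the exact path and whether Google honors it for this purpose are assumptions):

```
# Hypothetical robots.txt entry: ask all crawlers not to fetch the
# endpoint whose contents get inserted into the page via JavaScript.
User-agent: *
Disallow: /api/side_comments
```

Note that robots.txt blocks crawling of the listed URL itself; content already rendered into other indexed pages may need a different mechanism (e.g., a noindex directive) to be kept out of the index.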