I think you describe current SEO well. Good content, and links to that content, are still the state of the art.
I got a lot out of thinking about the computational / human-bandwidth asymmetry of Google vs content creators.
But have you considered how the fear of being Sandboxed plays into things?
My first thought was that it improved the value of the proxy somewhat, by making people who know the proxy will change over time less cavalier. Most people engaging in serious SEO have lucrative websites, and you have to be very risk-seeking to chase those small marginal gains at the risk of losing all of your cash flow permanently. There aren’t so many large players that the game gets driven to a Nash equilibrium faster than Google’s algorithms can change.
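To make that incentive concrete, here is a minimal back-of-the-envelope sketch. Every figure in it (revenue, traffic lift, penalty probability, penalty duration) is an illustrative assumption chosen only to show the shape of the trade-off, not a number from the post:

```python
# Illustrative expected-value comparison for a risky SEO tactic
# on an established, lucrative site. All numbers are made-up assumptions.

monthly_revenue = 50_000   # assumed cash flow of the established site
marginal_gain = 0.05       # assumed 5% revenue lift if the tactic works
p_penalty = 0.10           # assumed yearly chance of being sandboxed
penalty_months = 12        # assumed months of lost revenue if caught

expected_gain = (1 - p_penalty) * marginal_gain * monthly_revenue * 12
expected_loss = p_penalty * monthly_revenue * penalty_months

print(f"expected annual upside:   ${expected_gain:>9,.0f}")
print(f"expected annual downside: ${expected_loss:>9,.0f}")
# Under these assumptions the downside dwarfs the upside, which is why
# only very risk-seeking players chase the marginal gains.
```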
But the more I think about it, the fear of being penalized also tends to make legitimate content producers even more convinced that doing ANY SEO is bad. That may make things doubly bad.
It’s impossible not to do SEO. Every site is optimized for something.
For instance, lesswrong.com is optimized for:
vote
points
permalink
children
password
Think about that next time you lament that lesswrong is overwhelmingly less popular than other sites with clearly inferior content.
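If you’re curious how one arrives at a list like that, a crude approximation of what keyword-density tools report is to count the most frequent words on the page. Here is a minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the exact output will of course depend on the page’s current markup:

```python
# Minimal sketch: approximate a keyword-density report by counting the
# most frequent words in a page's visible text.
from collections import Counter

import requests
from bs4 import BeautifulSoup

STOPWORDS = {"the", "and", "that", "this", "with", "for", "are", "you"}

html = requests.get("https://lesswrong.com").text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
words = [w.lower() for w in text.split()
         if w.isalpha() and len(w) > 2 and w.lower() not in STOPWORDS]

# Interface chrome ("vote", "points", "permalink", ...) is repeated once
# per comment, so it tends to crowd out the actual content words.
for word, count in Counter(words).most_common(15):
    print(word, count)
```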
Thanks for the feedback. I hadn’t factored sandboxing into my thinking at all. But as you say it’s a double-edged sword.
I assume the way SEO practitioners get around this is to initially ‘interrogate’ the algorithm through throwaway, high-risk websites, and then any robust techniques discovered slowly make their way up the ladder to established, highly conservative websites. Of course, at every stage you risk getting caught (and it depends on how repulsive the techniques are as well), but that’s always a risk when you’re building on someone else’s platform; and if a website depends on its standing in Google’s search results, you can fairly say it’s building on Google’s platform.
Also, an interesting point about LessWrong’s optimization. I guess now that we know we have two Search Engine Optimizers in our midst, the Powers That Be can get in touch with you guys…