This is really good; however, I would love some additional discussion of the way the current optimization changes the user.
Keep in mind that when Facebook optimizes for “clicks” or “scrolls”, it does so by altering user behavior, and thus altering the user’s internal S1 model of what is important. This can easily distort reality, beliefs, and self-esteem. There have been many articles and studies correlating Facebook usage with poor mental health, but simply understanding how the optimization works is enough reason to expect that this is happening.
While a lot of these issues get pushed under the same umbrella of “digital addiction,” I think Facebook is a much more serious problem than, say, video games. Video games do not, as a rule, act through the very social channels that help reduce mental illness. Facebook does.
Another problem is Facebook’s internal culture, which as of 4 years ago was strongly marked by Kool-Aid that promised unbelievable power (1 billion users, hooray) without necessarily caring about responsibility (“all we want to do is make the world open and connected, why is everyone mad at us?”).
This problem is compounded by the fact that Facebook gets a lot of shitty critiques (like the critique that they run A/B tests at all) and has thus learned to ignore legitimate questions of value learning.
Full disclosure: I used to work at FB.
It’s just baffling to me that this happened, because it seems obvious on its face that “outrageous” or intentionally politically inflammatory material would be an undesirable attractor in interest-space.
My Facebook feed thinks that I’m most interested in the stupidest and most inflammatory individuals and ideas because that’s where my eyes linger for reasons that I don’t reflectively approve of. I wonder how quickly it would “learn” otherwise if I made an effort to break this pattern.
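The attractor dynamic described above can be sketched as a toy simulation (purely illustrative; the category names, dwell times, and epsilon-greedy selection rule are all my own invented assumptions, not anything we know about Facebook's actual ranking system): a feed that greedily shows whichever category has the highest observed average dwell time will drift toward inflammatory content whenever that content holds the eye slightly longer, regardless of what the user reflectively endorses.

```python
import random

random.seed(0)

# Assumed (invented) average dwell times in seconds: inflammatory content
# holds the eye slightly longer, per the comment above.
TRUE_DWELL = {"informative": 3.0, "funny": 4.0, "inflammatory": 5.0}

def user_dwell(category):
    """Simulated user: noisy dwell time around the category's true mean."""
    return max(0.0, random.gauss(TRUE_DWELL[category], 1.0))

def run_feed(steps=1000, explore=0.1):
    """Epsilon-greedy feed: mostly show the category with the best
    observed average dwell time, occasionally explore at random."""
    totals = {c: 0.0 for c in TRUE_DWELL}
    counts = {c: 1 for c in TRUE_DWELL}  # start at 1 to avoid div-by-zero
    shown = {c: 0 for c in TRUE_DWELL}
    for _ in range(steps):
        if random.random() < explore:
            choice = random.choice(list(TRUE_DWELL))
        else:
            choice = max(TRUE_DWELL, key=lambda c: totals[c] / counts[c])
        totals[choice] += user_dwell(choice)
        counts[choice] += 1
        shown[choice] += 1
    return shown

shown = run_feed()
print(shown)
```

The point of the sketch: nothing in the loop asks whether the user *approves* of the attention it measures. A small edge in dwell time is enough for the "inflammatory" bucket to end up dominating what gets shown, which is exactly the undesirable attractor in interest-space.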
Why do you no longer work at FB?
(It seems like more people who care about these things should try working at FB, particularly if there were any learnable path to gaining some degree of power over the algorithms or the values of the company, but maybe this is just hopelessly naive.)