Imagine a system that, when you land on a Wikipedia page, translates it into a version optimized for you at that moment. The examples change to things from your own life, and any concepts you find difficult get explained in detail. It would be like a highly cognitively empathetic personal teacher.
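To make the thought experiment concrete, here is a minimal sketch of what such a system might look like, assuming an LLM-style rewriting backend. Every name in it (ReaderProfile, rewrite_for_reader, call_language_model) is hypothetical; this is an illustration of the shape of the idea, not a real API.

```python
# Hypothetical sketch of the "personal translator" idea. Nothing here
# is a real API; call_language_model is a stub standing in for whatever
# model would actually do the rewriting.
from dataclasses import dataclass, field


@dataclass
class ReaderProfile:
    """What the system knows about this reader right now."""
    familiar_topics: list[str] = field(default_factory=list)
    interests: list[str] = field(default_factory=list)
    reading_level: str = "general audience"


def build_prompt(article_text: str, profile: ReaderProfile) -> str:
    """Turn an article plus a reader profile into a rewriting instruction."""
    return (
        "Rewrite the article below for one specific reader.\n"
        f"- Swap generic examples for ones drawn from: {', '.join(profile.interests)}.\n"
        f"- Explain in detail any concept not covered by: {', '.join(profile.familiar_topics)}.\n"
        f"- Target reading level: {profile.reading_level}.\n\n"
        f"ARTICLE:\n{article_text}"
    )


def call_language_model(prompt: str) -> str:
    # Stub: a real system would send the prompt to some model endpoint.
    return "[personalized rewrite would appear here]"


def rewrite_for_reader(article_text: str, profile: ReaderProfile) -> str:
    # The design questions discussed below (an "off switch" to read the
    # original, tuning against filter bubbles) live around this call,
    # not inside it.
    return call_language_model(build_prompt(article_text, profile))


if __name__ == "__main__":
    reader = ReaderProfile(
        familiar_topics=["basic probability"],
        interests=["baseball", "cooking"],
        reading_level="curious layperson",
    )
    print(rewrite_for_reader("Bayes' theorem describes...", reader))
```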
Hmm, something about this bothers me, but I’m not entirely sure what. At first I thought it was something about filter bubbles, but of course that can be fixed: just tune the algorithm so that it frames things in a way that sits optimally outside your intellectual filter bubble/comfort zone.
Now I think it’s something more like this: it can be valuable to have read the same thing as other people. If everyone gets their own personalized version of Shakespeare, people lose some of the connection they could have had with others over reading Shakespeare, since they didn’t really read the same thing. There is also a second reason it can be valuable for different people to read the same text: different people may interpret it in different ways, which can generate new insights. If everyone gets their own personalized version, we lose out on some of the insights people might have had by bouncing their minds off of the original text.
I guess this isn’t really a knockdown argument against making this sort of “personal translator” technology, since there’s no reason people couldn’t turn it off sometimes and read the originals. Nevertheless, we don’t have a great track record of using technology like this wisely and not overusing it (I’m thinking of social media here).
Thanks for the comment!

Regarding being able to read “the same thing” as other people: I agree this is one benefit of the current system. Any novel system will have downsides, and this is certainly one of them. Still, I think the upsides far outweigh this particular downside. Generally we don’t mind tutors or educational YouTube courses that are designed to be particularly useful for small groups of people, even though these things do decrease standardization.
“we don’t have a great track record of using technology like this wisely and not overusing it”
Agreed. With great power comes great responsibility, and we often don’t handle that responsibility well. But two things:

1) The upsides are really significant. If “being really good at teaching people generic information” is powerful enough to count as scary, that doesn’t leave us much hope for other tech advancements.

2) Even if it turns out to be net-negative, further investigation could still be useful (for instance, to establish whether it actually is net-negative).