How important is it to promote AI alignment in Japan? I ask this not to troll, but seriously. I’ve not heard of a lot of rapid progress towards transformative AI coming from Japan. Current progress seems to be coming out of the US. Are there a lot of folks in Japan working on things that could become AGI and don’t engage with the existing AI alignment content enough to warrant a specific Japanese focus?
I’ve wondered the same about how important it is to spread certain ideas to other cultures/languages, not because I think it isn’t a nice thing to do, but because, given limited resources, it’s unclear to me how much it will matter to the project of mitigating AI x-risks. Since bridging each culture gap takes a lot of effort, it seems worth having a sense of how likely we think it matters for Japanese, Russian, Chinese, etc. audiences, so we can choose how to deploy people to such projects.
I think it could be valuable if academics in Japan turned out to be less allergic to alignment than those in the West. Then perhaps we could reimport alignment ideas back into the US, since people are generally more open to listening to strange ideas when they come from another culture. In any case, it sounds like the OP is in Japan, so they have more opportunity to promote alignment there than elsewhere.