First, great news on founding an alignment organization on your own. While I give this work a low chance of making progress, if you succeed the benefits would be vast.
I’ll pre-register a prediction: you will fail with 90% probability, though the failure may well be a useful one. My reasons are as follows:
Inner alignment issues have a good chance of wrecking your plans, specifically instrumental convergence causing deception and power-seeking by default. I notice an implicit assumption that inner alignment is either not a problem or easy enough to solve by default that it’s not worth worrying about. This may hold, but I suspect it is more likely than not to fail.
I suspect that cultural AI is only relevant in the below-human and roughly human-level regimes. Once AI moves above the human regime, there are fairly massive incentives to simply not care about human culture, in the same way that humans don’t really care about less powerful animals. Actually bettering less powerful beings’ lives is very hard.
> First, great news on founding an alignment organization on your own.
Actually, I founded it with my cofounder, Nick Hay!
https://www.encultured.ai/#team