>Software development is a poor metaphor for AI alignment.
I think I disagree, but let’s set aside the metaphor aspect and focus on the model. The same causal model can also be communicated using science & engineering as a metaphor. If you want to know which scientific insights to work towards to create some breakthrough technology, it’s valuable to periodically put on your engineer hat. Without it, you’ll do basic research that could lead anywhere. In search terms, the engineer hat offers a better heuristic: if your scientist hat lets you forward chain, your engineer hat lets you backward chain.
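The forward/backward distinction here mirrors the classic inference strategies. A minimal sketch over hypothetical Horn-clause rules (the specific rules and facts are made up for illustration): forward chaining derives everything reachable from known facts, while backward chaining starts from a goal and recurses on what would produce it.

```python
# Hypothetical rule base: (set of premises, conclusion).
RULES = [
    ({"a", "b"}, "c"),
    ({"c"}, "d"),
    ({"d", "e"}, "f"),
]

def forward_chain(facts, rules):
    """Scientist hat: start from what's known, derive all consequences."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts, rules, seen=frozenset()):
    """Engineer hat: start from the goal, recurse on rules that yield it."""
    if goal in facts:
        return True
    if goal in seen:  # guard against cyclic rule chains
        return False
    return any(
        all(backward_chain(p, facts, rules, seen | {goal}) for p in premises)
        for premises, conclusion in rules
        if conclusion == goal
    )
```

Forward chaining from `{"a", "b"}` derives `c` and then `d` but wanders past goals it was never aimed at; backward chaining on `"d"` only explores the premises that could actually produce `"d"`, which is the improved heuristic the engineer hat provides.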
I’d argue the engineer hat is critical for effective differential technological development.