I would love to see (and contribute to, if you want to collaborate) a post on “what are roles and processes” in terms of human organizations, and how the concept might apply to agent alignment topics. I spend a lot of my time and energy at work (Principal Engineer at a very large company; roughly comparable to CTO of a 150-person division) formalizing processes and encouraging people and teams to adopt them, and to understand the roles they need to embrace in order to have the (positive) impact we all want.
There’s an interesting mix in this work—some of it is identifying goals we share and looking for ways to measure and improve our progress toward them. But some of it is normalizing the goals themselves—not exactly “alignment”, but “finding and formalizing mutually-beneficial utility trades”. These are visible, causal trades—nothing fancy, except that they’re rarely encoded as actual written agreements—they exist as informal beliefs in employees’ heads, grounded in implicit relationships between teams or with customers.