AE Studio is a team of 160+ programmers, product designers, and data scientists focused on increasing human agency through neglected high-impact approaches. Having first found success in BCI development and consulting, we’re now applying our expertise to AI alignment research, believing that the space of plausible alignment solutions is vast and under-explored.
Our alignment work includes prosociality research on self-modeling in neural systems (with a particular focus on attention schema theory), self-other overlap mechanisms, and various neglected technical and policy approaches. We maintain a profitable consulting business that allows us to fund and pursue promising but overlooked research directions without pressure to expedite AGI development.
Learn more about us and our mission here:
https://ae.studio/ai-alignment
Thanks Lucius, yes, this was tongue-in-cheek and we actually decided to remove it shortly thereafter once we realized it might not come across in the right way. Totally grant the point, and thanks for calling it out.