This is the sort of thing I find appealing to believe, but I feel at least somewhat skeptical of. I notice a strong emotional pull to want this to be true (as well as an interesting counterbalancing emotional pull for it to not be true).
I don’t think I’ve seen output from people aspiring in this direction who aren’t visibly quite smart that made me think “okay, yeah, it seems like it’s on track in some sense.”
I’d be interested in hearing more explicit cruxes from you about it.
I do think it’s plausible that “smart enough, creative enough, strong epistemics, independent, willing to spend years without legible output, exceptionally driven, and so on” is sufficient (if you’re at least moderately-but-not-exceptionally smart). Those are rare enough qualities that it doesn’t necessarily feel like I’m getting a free lunch, if they turn out to be sufficient for groundbreaking pre-paradigmatic research. I agree the x-risk pipeline hasn’t tried very hard to filter for and/or generate people with these qualities.
(well, okay, “smart enough” is doing a lot of work there, I assume from context you mean “pretty smart but not like genius smart”)
But, I’ve only really seen you note positive examples, and this seems like the sort of thing that’d have a lot of survivorship bias. There can be tons of people obsessed, but not necessarily on the right things, and if you’re not naturally the right cluster of obsessed + smart-in-the-right-way, I don’t know whether trying to cultivate the obsession on purpose will really work.
I do nonetheless overall probably prefer people who have all your listed qualities, and who can also either:
a) self-fund to pursue the research without having to make it legible to others
b) somehow figure out a way to make it legible along the way
I probably prefer those people to tackle “the hard parts of alignment” over many other things they could be doing, but not overwhelmingly obviously so (and I think it should come with a background awareness that they are making a gamble; if they aren’t the sort of person who must make that gamble due to their personality makeup, they should be prepared for the (mainline) outcome that it just doesn’t work out).