Thank you for responding! I am being very critical, both in foundational and nitpicky ways. This can be annoying and make people want to circle the wagons. But you and the other organizers are engaging constructively, which is great.
The distinction between Solstice representing a single coherent worldview vs. a series of reflections also came up in comments on a draft. In particular, the Spinozism of Songs Stay Sung feels a lot weirder if it is taken as the response to the darkness (as I initially took it) rather than as one response to the darkness.
Nevertheless, including something in Solstice solidly establishes it as a normal / acceptable belief for rationalists: within the local Overton Window. You might not be explicitly telling people that they ought to believe something, but you are telling them that it is acceptable for high-status people in their community to believe it. I am concerned that some of these beliefs are treated as acceptable at all.
Take Great Transhumanist Future. It has “a coder” dismantling the sun “in another twenty years with some big old computer.” This is a call to accelerate AI development, and use it for extremely transformative actions. Some of the organizers believe that this is the sort of thing that will literally kill everyone. Even if it goes well, it would make life as it currently exists on the surface of the Earth impossible. Life could still continue in other ways, but some of us might want to still live here in 20 years.[1] I don’t think that reckless AI accelerationism should be treated as locally acceptable.
The line in Brighter Than Today points in the same direction. It’s not only anti-religious. It is also disparaging towards people who warn about the destructive potential of a new technology. Is that an attitude we want to establish as normal?
If the main problem with changing the songs is in making them scan and rhyme, then I can probably just pay that cost. This isn’t a thing I’m particularly skilled at, but there are people who are, adjacent to the community. I’m happy to ask them to rewrite a few lines, if the new versions will plausibly be used.
If the main problem with changing the songs is that many people in this community want to sing about AI accelerationism and want the songs to be anti-religious, then I stand by my criticisms.
Is this action unilateral? Unclear. There might be a global consensus-building phase, or a period of reflection. They aren’t mentioned in the song. These processes can’t take very long given the timelines.
Take Great Transhumanist Future. It has “a coder” dismantling the sun “in another twenty years with some big old computer.” This is a call to accelerate AI development, and use it for extremely transformative actions.
Super disagree with this! Neither I nor (I have not checked but am pretty certain) the author of the text wants to advocate that! (Indeed I somewhat actively tried to avoid having stuff in my program encourage this! You could argue that even though I tried to do this I did not succeed, but I think the fact that you seem to be reading ~motivations into authors’ choices that aren’t actually there is a sign that something in your analysis is off.) I think it’s pretty standard that having a fictional character espouse an idea does not mean the author espouses it.
In the case of this song I did actually consider changing “you and I will flourish in the great transhumanist future” to “you and I MAY flourish in the great transhumanist future” to highlight the uncertainty, but I didn’t want to make changes against the author’s will, and Alicorn preferred to keep the “will” there because the rest of the song is written in the indicative mood. And, as I said before, Solstice is a crowdsourced endeavor and I am not willing to only include works where I do not have the slightest disagreement.
If the main problem with changing the songs is that many people in this community want to sing about AI accelerationism and want the songs to be anti-religious, then I stand by my criticisms
hmm, I want to be able to sing songs that express an important thing even if one can possibly read them in a way that also implies some things I disagree with
If the main problem with changing the songs is in making them scan and rhyme, then I can probably just pay that cost.
you are extremely welcome to suggest new versions of things!
but a lot of the cost is distributed and/or necessarily borne by the organizers. changing lines in a song that’s sung at Solstice every year is a Big Deal and it is simply not possible to do this in a way that does not cause discourse and strife
(I guess arguably we managed the “threats and trials” line in TWTR without much discourse or strife but I think the framing did a lot there and I explicitly didn’t frame it as a permanent change to the song, and also it was a pretty minor change)
I think you should ask the author of the song if it’s referring to someone using powerful AI to do something transformative to the sun.
This is extremely obvious to me. The song is opposed to how the sun currently is, calling it “wasteful” and “distasteful”—the second word is a quote from a fictional character, but the first is not. It later talks about when “the sun’s a battery,” so something about the sun is going to change. I really don’t know what “some big old computer” could be referring to if not powerful AI.
oh yeah my dispute isn’t “the character in the song isn’t talking about building AI” but “the song is not a call to accelerate building AI”