The first time I tried to load this page it took >10 seconds before erroring out in a way that made me need to close the website and open it again.
Also, it seems like the site is spamming me to make dialogues. To get to this post in “recent comments” I had to scroll down past some people the site was suggesting I dialogue with. I clicked the “x” next to each of those entries to get them to go away. Then, when I reloaded the page, two more suggested dialogue partners had spawned. This was after I turned off the notifications that had been pinging me whenever anyone wanted to dialogue with me.
There’s a bunch of interesting AI alignment content, more than I feel like I have the bandwidth or inclination to read. I also like that there’s a trickle of new interesting users, e.g. it’s cool that Maxwell Tabarrok is on my front page.
As is often said, I’d be interested in more “classic rationality” content relative to AI stuff. Like, I don’t think we’re by any means perfect on that axis, or past some point of diminishing returns. Since it’s apparently easier to write posts about AI, maybe someone should write up this paper and turn it into life advice. Alternatively, I think looking at how ancient people thought about logic could be cool (see e.g. the white horse paradox or Ibn Sina’s development of a modal logic system or whatever).
I have the impression that lots of people find LW too conflict-y, but I think we avoid the worst excesses of being a forum where everyone just agrees about how great we all are, and more disagreement would actually make the site better, as long as it comes with gentleness and respect, as they say.
Oh also, the pattern of which things of mine get upvoted vs. downvoted feels pretty weird. E.g. I thought my post on a mistake I think people make in discussions about open-source AI was a good contribution, if perhaps poorly written, but it got fewer upvotes than a post that was literally SEO. I guess the latter introduced people to a cool thing that wasn’t culture-war-y and explained it a bit, but I think the explanation I gave was pretty bad, since, as mentioned, I really just wanted it to be SEO.
Oh, also I think the site wants me to care about the review or believe that it’s important/valuable, but I don’t really.
You should click the settings gear in the “Dialogues” section to hide suggested partners from you.