If it were me, I’d also try to increase RoI by asking people to add commentary to existing books, rather than having people write from scratch.
This thought occurred to me—specifically, there’s likely quite a bit of interactive fiction out there with a suitable format which could be post-hoc thought annotated (might also be interesting to include a few different branches).
However, I don’t think it gives us the same thing: presumably we’d want the thoughts to be those that occur at the time and contribute to the writing of the later narrative. Doing post-hoc annotation by trying to infer what a writer might plausibly have thought seems like quite a different process. Perhaps that wouldn’t matter for some purposes, but I imagine it would for others (??).
While it’d be possible to check that post-hoc annotations passed a [human reader can’t tell the difference] test, this wouldn’t eliminate the difference—it’d only tell us it’s non-obvious to humans.
I initially tried doing post-hoc annotation and found it much more difficult than thinking my own actual thoughts, putting them down, and writing the prompt that resulted. Most of the work is in writing the thoughts, not the prompts, so adding pregenerated prompts at the expense of making the thoughts more difficult is a loss.
Agree that it’s the ‘crafting’ part that matters, and I don’t think we can say a writer/DM is going to be explicitly thinking about all the details of the story at each turn. From the examples... well, a side effect of doing AI research is that you can’t help but read the text of the story and see that the “thoughts” about it are just picking out details in a way that even SOTA ML systems have a problem with. They don’t read as actual notes about the process. Perhaps there needs to be a request for samples, with a 30k-word limit (so no one invests too much time in something that might not be used), and a focus on capturing the process of writing a story as the plot unfolds.