I think it probably makes sense to use less artificial intelligence to put things into context, and to focus instead on providing high-quality information rather than high-quantity information. I guess you could use AI to provide high-quality information, but I also guess that by default you end up providing a large quantity, since that's a lot easier to do.
If I want a fire-hose of information, I can just use ChatGPT myself (or google, or wikipedia, or read the comments on the prediction market websites, etc.). And I anticipate I’ll just be kinda annoyed if you provide me with the same type of info, but less useful since I wasn’t the one who wrote the ChatGPT prompt.
Or maybe you spend a bunch of time on prompt engineering and I don't get annoyed. In that case, go right ahead and use AI, and it will probably turn out fine.
Eventually I would like to commission more high-quality content from, e.g., superforecasters.
I'm also working to improve the AI summaries… e.g. wouldn't it be cool if I could get AI to find historical data to contextualise each news story?