At my last job we had some data scientists build a tool that converted Boolean search terms, entered through a relatively intuitive interface, into the specific query formats required by the APIs of various data sources. We used it for patents, papers, that kind of thing. Sometimes the search strings ended up having to be 5-10 lines long. That was mostly when a term could be used in lots of ways and we only needed one of them, or when there were combinatorially many ways of combining sets of terms to mean the same overall thing. So, I do think there can be a lot of value in prompt engineering for searches in targeted contexts.
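For a rough sense of where those long queries come from, here's a minimal Python sketch (hypothetical function and made-up example terms, not the actual tool or any real data source's API): each group of interchangeable terms gets OR'd together and the groups get AND'ed, which is exactly how a handful of synonym sets blows up into a 5-10 line search string.

    def build_boolean_query(term_groups):
        # Each inner list is a set of interchangeable terms; OR them together,
        # then AND the groups. Quoting multi-word terms keeps phrases intact.
        clauses = []
        for group in term_groups:
            quoted = ['"%s"' % t if " " in t else t for t in group]
            clauses.append("(" + " OR ".join(quoted) + ")")
        return " AND ".join(clauses)

    groups = [
        ["solar cell", "photovoltaic", "PV module"],  # made-up example terms
        ["perovskite"],
        ["stability", "degradation", "lifetime"],
    ]
    print(build_boolean_query(groups))
    # ("solar cell" OR photovoltaic OR "PV module") AND (perovskite) AND (stability OR degradation OR lifetime)

A real converter would then also have to map that onto each data source's own query syntax, which is the part the sketch skips.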
Do you think LLMs will get to a point of being able to do this relatively well with the right prompts?
I didn’t know Google had that limit, thanks.