(Also, if you want, you can share your report URL here, others will be able to take a look.)
My wife does that for her day job (in the U.K. national healthcare system) and the systematic reviews have to be super well documented and even pre-registered on a system called PROSPERO. The published papers always have the full search strategy at the end.
However, because I'm writing a methods-focused review -- we only look at RCTs meeting certain (pretty minimal) criteria relating to statistical power and measurement validity -- what I'm doing is closer to a combination of a review of previous reviews (there have been dozens in my field) and a snowball search (searching the bibliographies of relevant papers). I also consulted with experts in the field. Finding bachelor's theses has been challenging, though, and many of them are actually relevant, so Undermind was helpful there.
I gave it a version of my question, it asked me reasonable follow-ups, and we refined the search to:
> I want to find randomized controlled trials published by December 2023, investigating interventions to reduce consumption of meat and animal products with control groups receiving no treatment, measuring direct consumption (self-reported outcomes are acceptable), with at least 25 subjects in treatment and control groups (or at least 10 clusters for cluster-assigned studies), and with outcomes measured at least one day after treatment begins.
I just got the results back: https://www.undermind.ai/query_app/display_one_search/e5d964....
It certainly didn't find everything in my dataset, but:
* The first result is in the dataset.
* The second one is a study I excluded for something buried deep in the text.
* The third is in our dataset.
* The fourth is excluded for something the machine should have caught -- with only 32 subjects in total, it can't have had at least 25 in each arm -- but perhaps I needed to clarify that I meant at least 25 subjects in each of the treatment and control groups (see the sketch after this list).
* The fifth result is a protocol for the study in result 3, so a more sophisticated search would have identified that these were related.
* The sixth study was entirely new to me, and though it didn't qualify because of the way the control group received some aspect of treatment, it's still something that my existing search processes missed, so right away I see real value.
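To make the point about result four concrete, my eligibility criteria amount to a screening rule roughly like the one below (a minimal sketch in Python; the field names are purely illustrative, not anything Undermind actually exposes):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Study:
    # Illustrative fields for one candidate RCT -- all names are made up.
    published: date
    no_treatment_control: bool          # control group receives no treatment
    measures_direct_consumption: bool   # self-reported outcomes are acceptable
    n_treatment: int
    n_control: int
    cluster_assigned: bool
    n_clusters: Optional[int]
    followup_days: float                # days from treatment start to outcome measurement

def eligible(s: Study) -> bool:
    if s.published > date(2023, 12, 31):
        return False
    if not (s.no_treatment_control and s.measures_direct_consumption):
        return False
    # Power criterion: at least 25 subjects per arm, or at least 10 clusters
    # for cluster-assigned studies.
    if s.cluster_assigned:
        if (s.n_clusters or 0) < 10:
            return False
    elif s.n_treatment < 25 or s.n_control < 25:
        return False
    # Outcomes must be measured at least one day after treatment begins.
    return s.followup_days >= 1
```

A trial with 32 participants in total can't have at least 25 in each of two arms (that would require 50), which is why I'd hoped the search would flag result four on its own.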
So, overall, I am impressed, and I can easily imagine my lab paying for this. It would have to advance substantially before it was my only search method for a meta-analysis -- it seems to have missed a lot of the gray literature, particularly those studies published on animal advocacy websites -- but that's a much higher bar than I need for it to be part of my research toolkit.