Do you have an unmaintained Solr search sitting somewhere in your tech stack? Do you want to revamp it with shiny new GenAI features without trekking through the desert of "ground work": tuning tokenization, synonyms, or query reformulation? This talk has you covered!
I took an (almost) unmaintained Solr search engine from a German publisher and used off-the-shelf LLMs to tackle recall, precision, ranking, and even diversity challenges. LLMs helped improve recall and precision by pre-processing articles: by searching only on the extracted information, SEO optimizations are stripped out of the articles and very precise search results are returned.
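The pre-processing step can be pictured as a small indexing pipeline: an LLM extracts clean fields from each raw article, and only those fields are sent to Solr. The prompt wording, the field names (`title_t`, `summary_t`, `topics_ss`), and the stubbed `call_llm` function below are illustrative assumptions, not the talk's actual implementation; in practice the stub would be replaced by a real request to an off-the-shelf LLM.

```python
import json

# Hypothetical extraction prompt; the real prompt is not part of this abstract.
EXTRACTION_PROMPT = """Extract these fields from the article as JSON:
title, summary (one sentence), topics (list of keywords).
Ignore SEO boilerplate and repeated keyword padding.
Article:
{article}"""

def call_llm(prompt):
    # Stand-in for an off-the-shelf LLM call (e.g. an HTTP API request).
    # Returns a canned JSON answer so this sketch runs offline.
    return json.dumps({
        "title": "New rail line opens",
        "summary": "The city opened a new rail line on Monday.",
        "topics": ["transport", "infrastructure"],
    })

def preprocess_article(article_id, article_text):
    """Build a Solr document from LLM-extracted fields only, so SEO
    padding in the raw article text never reaches the search index."""
    fields = json.loads(call_llm(EXTRACTION_PROMPT.format(article=article_text)))
    return {
        "id": article_id,
        "title_t": fields["title"],
        "summary_t": fields["summary"],
        "topics_ss": fields["topics"],  # multi-valued field, usable for faceting
    }

doc = preprocess_article("a1", "New rail line opens ... keyword keyword keyword ...")
print(doc["topics_ss"])  # → ['transport', 'infrastructure']
```

Because only the extracted fields are indexed, queries match the article's substance rather than its SEO filler.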
To reveal diversity in search results, LLMs cluster the results using retrieved facets. This leads to a compact, clustered search result page that highlights different aspects of a topic. This talk will guide you through the easy steps to apply GenAI with off-the-shelf LLMs to your own search system.
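One simplified reading of the clustering step: group result documents by the facet values attached during pre-processing, so the results page shows one compact cluster per aspect. This sketch does the grouping mechanically; the facet field name and sample documents are assumptions, and in the approach described above an LLM would additionally be used to label or merge these facet-based clusters.

```python
from collections import defaultdict

def cluster_by_facet(results, facet_field="topics_ss"):
    """Group result docs by a facet value (e.g. LLM-extracted topics),
    one cluster per aspect of the searched topic."""
    clusters = defaultdict(list)
    for doc in results:
        # A doc with several facet values appears in several clusters.
        for value in doc.get(facet_field, ["(other)"]):
            clusters[value].append(doc)
    return dict(clusters)

# Hypothetical sample results, as a Solr client might return them.
results = [
    {"id": "a1", "topics_ss": ["transport"]},
    {"id": "a2", "topics_ss": ["transport", "budget"]},
    {"id": "a3", "topics_ss": ["budget"]},
]
clusters = cluster_by_facet(results)
print(sorted(clusters))  # → ['budget', 'transport']
```

Each cluster can then be rendered as its own section of the search result page, surfacing aspects a flat ranked list would bury.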