Adithya Parthasarathy
Title of the Talk:
The Great Hallucination Tradeoff: Why Better Retrieval Still Isn’t Enough
Abstract of Talk:
“Just add RAG” became the default answer to hallucinations: retrieve documents, stuff them into context, and the model will stay grounded. In practice, teams discover a frustrating truth: retrieval improves factuality, but it also introduces new ways for systems to fail. Better retrieval can increase latency and cost, amplify spurious correlations, leak irrelevant or unsafe content into context, and create a false sense of confidence when the model cites sources it didn’t truly use. The result is a fundamental tradeoff: reducing hallucinations can simultaneously increase relevance and faithfulness errors and breed overconfidence.
This keynote unpacks that tradeoff through a production lens. We’ll map the major failure modes (missing evidence, wrong evidence, contradictory evidence, poisoned evidence, and context overload) and show why each requires a different fix than “retrieve more.” We’ll discuss mitigation strategies that combine ranking, uncertainty estimation, selective generation, abstention, and citation-aware verification. We’ll also cover the evaluation gap that causes teams to ship misleadingly “grounded” systems: why offline QA accuracy doesn’t predict real-user trust, how to measure attribution and faithfulness, and how to design feedback loops that don’t quietly train your system to be confidently wrong.
The takeaway is a pragmatic framework for building LLM products that remain helpful even when the world is messy: knowing when to answer, when to ask a clarifying question, when to refuse, and when to say “I don’t know” — and being able to prove those decisions with metrics.
Bio:
Adithya Parthasarathy is an AI researcher and IEEE Senior Member specializing in LLMs, RecSys, and search ranking. Bringing nearly a decade of experience building solutions for leading global tech platforms, he is dedicated to advancing the field through active research, technical reviews, and strategic collaborations.
