Hallucina-Gen
Spot where your LLM might make mistakes on documents
Featured
10 votes
Description
Using LLMs to summarize or answer questions from documents? We auto-analyze your PDFs and prompts, and produce test inputs likely to trigger hallucinations. Built for AI developers to validate outputs, test prompts, and squash hallucinations early.
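The core idea of such hallucination-bait test inputs can be sketched in a few lines: alongside questions grounded in the document, generate "trap" questions about topics the document never mentions, where a faithful model should answer "not stated" rather than invent an answer. This is a minimal illustration only, not the product's actual pipeline; the function and labels are hypothetical.

```python
import re

def make_trap_questions(doc_text, topics):
    """Build test questions for a document QA system.

    Topics found in the document yield 'grounded' questions; topics
    absent from it yield 'trap' questions that tempt a model to
    hallucinate an answer the document does not support.
    """
    present = [t for t in topics if re.search(re.escape(t), doc_text, re.I)]
    absent = [t for t in topics if t not in present]
    questions = []
    for t in present:
        questions.append((f"What does the document say about {t}?", "grounded"))
    for t in absent:
        questions.append((f"What does the document say about {t}?", "trap"))
    return questions
```

A developer would run the model on both kinds of questions and flag any confident answer to a "trap" question as a likely hallucination.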