Fact-checking and source requests
Day 8 of 30 · 30 Days of AI
Reduce hallucinations with verification prompts
Learning goal
- Write prompts that explicitly request sources and uncertainty flags.
- Apply a quick fact-check routine.
Why it matters
- Models can hallucinate; you must verify claims.
- Sources and uncertainty flags guide your review.
Explanation
- Add: “Cite sources or say if unknown.”
- Ask for “3 risks or missing info.”
- Cross-check numbers/dates with a trusted source.
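The instructions above can be baked into a reusable prompt template. A minimal sketch — `verification_prompt`, the topic, and the word limit are illustrative names, not part of any API:

```python
# Sketch of a prompt builder that bakes in the source, uncertainty,
# and risk requests described above. Adapt the wording to your task.

def verification_prompt(topic: str, word_limit: int = 120) -> str:
    """Build a summary prompt that asks for sources, unknowns, and risks."""
    return (
        f"Summarize {topic} in at most {word_limit} words.\n"
        "For every factual claim, cite a source or write 'no source'.\n"
        "If something is unknown or uncertain, say so explicitly.\n"
        "Finally, list 3 risks or pieces of missing information."
    )

prompt = verification_prompt("the history of the metric system")
print(prompt)
```

Keeping the verification wording in one function means every prompt you send gets the same "cite or admit" instruction, instead of remembering to retype it.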
Examples
- Prompt: “Summarize X in 120 words. List sources or say ‘no source’. Add 2 risks.”
- QA: “Check dates vs official site; flag mismatches.”
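The QA step — checking claimed dates against a trusted reference and flagging mismatches — can be sketched as a small routine. The data below is illustrative (the second claimed date is deliberately wrong to show a flag firing):

```python
# Minimal cross-check routine: compare model-claimed facts against a
# trusted reference and flag mismatches. Example data is illustrative;
# the second claimed date is intentionally off by one day.

trusted = {
    "Apollo 11 landing": "1969-07-20",
    "Wright Flyer first flight": "1903-12-17",
}
claimed = {
    "Apollo 11 landing": "1969-07-20",
    "Wright Flyer first flight": "1903-12-18",  # wrong on purpose
}

def flag_mismatches(claimed: dict, trusted: dict) -> list:
    """Return human-readable flags for claims that differ from the trusted source."""
    flags = []
    for claim, value in claimed.items():
        expected = trusted.get(claim)
        if expected is None:
            flags.append(f"{claim}: no trusted source found -> verify manually")
        elif value != expected:
            flags.append(f"{claim}: claimed {value}, trusted source says {expected}")
    return flags

for flag in flag_mismatches(claimed, trusted):
    print(flag)
```

In practice "trusted" means an official site or primary source you looked up yourself, not another model response.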
Guided exercise (10–15 min)
- Pick a factual topic; ask for summary + sources + risks.
- Manually verify 2 claims; note discrepancies.
Independent exercise (5–10 min)
- Rewrite your QA checklist to include “sources/uncertainty” for future prompts.
Self-check
- Sources requested in the prompt.
- Uncertainties/risks listed.
- At least 2 claims verified manually.
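The first two self-check items can even be linted automatically: scan your prompt text for the verification wording before you send it. A rough sketch — the keyword lists are illustrative, not exhaustive:

```python
# Sketch of a self-check lint for your own prompts: does the prompt
# actually request sources and uncertainties/risks? Keyword lists
# here are illustrative assumptions, not a standard.

CHECKS = {
    "sources requested": ("source", "cite"),
    "uncertainties/risks requested": ("risk", "unknown", "uncertain"),
}

def self_check(prompt: str) -> dict:
    """Map each checklist item to True if the prompt appears to cover it."""
    text = prompt.lower()
    return {
        name: any(keyword in text for keyword in keywords)
        for name, keywords in CHECKS.items()
    }

result = self_check("Summarize X. Cite sources or say 'no source'. List 2 risks.")
print(result)
```

The third item (manually verifying at least 2 claims) cannot be automated away: the lint only confirms you asked, not that you checked.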
Optional deepening
- Safety best practices: https://platform.openai.com/docs/guides/safety-best-practices