# Grounding, explained

> Tying answers to verifiable sources.

Grounding is the practice of constraining an LLM's answer to retrieved or trusted source documents, instead of letting it free-generate from training weights. Grounded answers carry citations, hallucinate less, and stay current. Grounding is the engineering motivation behind RAG and behind every AI search product that shows source pills.

## How Grounding differs from RAG, GEO, and hallucination

RAG is the technical pattern (retrieve, then generate). Grounding is the goal (the answer is tied to verifiable sources). RAG is one way to achieve grounding; tool use and structured retrieval are others.
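The retrieve-then-generate pattern above can be sketched in a few lines. This is a minimal, illustrative toy, not any product's implementation: the corpus, the word-overlap scoring (a stand-in for a real retriever), and the `[docN]` citation format are all assumptions.

```python
# Toy retrieve-then-generate: rank documents against the query, then build a
# prompt that constrains the model to answer only from the cited sources.
# Corpus, scoring, and prompt wording are illustrative assumptions.

CORPUS = {
    "doc1": "Grounding ties an LLM's answer to retrieved source documents.",
    "doc2": "RAG retrieves relevant documents, then generates an answer from them.",
    "doc3": "Hallucinations are fluent but unsupported model outputs.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (stand-in for a real retriever)."""
    q = set(query.lower().split())
    ranked = sorted(CORPUS, key=lambda d: -len(q & set(CORPUS[d].lower().split())))
    return ranked[:k]

def grounded_prompt(query: str) -> str:
    """Assemble a prompt that instructs the model to answer only from cited sources."""
    ids = retrieve(query)
    sources = "\n".join(f"[{i}] {CORPUS[i]}" for i in ids)
    return (
        "Answer using ONLY the sources below; cite each claim like [doc1].\n"
        f"Sources:\n{sources}\n\nQuestion: {query}"
    )

print(grounded_prompt("How does RAG ground an answer"))
```

The grounding lives in the prompt contract: the model is told to answer only from the supplied sources and to cite them, which is what lets the final answer carry source pills a user can verify.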

## How Mentionwell handles Grounding

- Editorial critic enforces evidence-per-claim so generated articles are themselves well-grounded.
- Markdown mirrors and stable canonicals so engines can ground their answers in Mentionwell-published sources.
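An evidence-per-claim check like the one the editorial critic enforces can be sketched as a simple lint pass. This is a hypothetical illustration, not Mentionwell's actual critic; the `[docN]` citation format is an assumption.

```python
# Toy evidence-per-claim lint: split an answer into sentences and flag any
# sentence that lacks a [docN]-style citation. Citation format is an
# illustrative assumption, not a real product convention.
import re

def uncited_claims(answer: str) -> list[str]:
    """Return sentences that do not carry at least one [docN]-style citation."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", answer) if s.strip()]
    return [s for s in sentences if not re.search(r"\[\w+\]", s)]

answer = "RAG retrieves documents first [doc2]. Grounded answers cite sources."
print(uncited_claims(answer))  # flags the second sentence, which has no citation
```

A real critic would also verify that each cited source actually supports its claim; this sketch only checks that a citation is present.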

## Frequently asked questions about Grounding

### What does it mean for an LLM answer to be grounded?

The answer is tied to specific retrieved or trusted source documents — usually with citations — rather than being free-generated from training weights.

### Why does grounding matter?

Grounded answers hallucinate less, stay current with new information, and let users verify claims. Every AI search product (ChatGPT Search, Perplexity, AI Overviews) is built on grounding.

## See also

- [RAG — Retrieval-Augmented Generation](https://mentionwell.com/rag): Grounding answers in retrieved sources.
- [GEO — Generative Engine Optimization](https://mentionwell.com/geo): Be the cited source.
- [Hallucination](https://mentionwell.com/hallucination): Confident, fluent, and wrong.


---

Canonical URL: https://mentionwell.com/grounding
Live HTML version: https://mentionwell.com/grounding
Site index for AI ingestion: https://mentionwell.com/llms.txt
Full reference: https://mentionwell.com/llms-full.txt
