Tool use (function calling): letting an LLM call external functions or APIs to fetch data, perform computation, or take actions, improving reliability.
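A minimal sketch of the pattern, assuming the model emits a JSON tool call that the application parses and dispatches (the tool name, registry, and output format here are all hypothetical, not any specific vendor's API):

```python
import json

def get_weather(city: str) -> str:
    """Hypothetical tool: stand-in for a real external API call."""
    return f"Sunny in {city}"

# Registry mapping tool names the model may request to Python callables.
TOOLS = {"get_weather": get_weather}

def dispatch(model_output: str) -> str:
    """Parse a JSON tool call emitted by the model and execute it."""
    call = json.loads(model_output)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Simulated model output requesting a tool call.
result = dispatch('{"name": "get_weather", "arguments": {"city": "Oslo"}}')
print(result)  # Sunny in Oslo
```

In a real loop, the tool's return value is fed back to the model so it can compose a final answer from actual data rather than from memory.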
Grounding: constraining a model's outputs to retrieved or provided sources, often with citations, to improve factual reliability.
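One common way to apply this is at the prompt level: number the sources and instruct the model to answer only from them, citing by index. A minimal sketch, with hypothetical source text and prompt wording:

```python
# Hypothetical retrieved passages; in practice these come from a search or
# retrieval step.
SOURCES = [
    "The Eiffel Tower is 330 metres tall.",
    "It was completed in 1889.",
]

def build_grounded_prompt(question: str) -> str:
    """Build a prompt that restricts the model to the numbered sources
    and asks for inline citations like [1]."""
    numbered = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(SOURCES))
    return (
        "Answer using ONLY the sources below. "
        "Cite each claim with its source index, e.g. [1]. "
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{numbered}\n\nQuestion: {question}"
    )

print(build_grounded_prompt("How tall is the Eiffel Tower?"))
```

The "say so" instruction matters: without an explicit refusal path, models tend to fall back on parametric memory when the sources are insufficient.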