Every Way to Get Structured Output from LLMs
A survey of every framework for extracting structured output from LLMs, and how they compare.
- Error-tolerant parsing: ✅ Yes, using a new Rust-based error-tolerant parser (e.g. it can parse {"foo": "bar})
- Prompt templating: Jinja templates
- IDE support: ✅ Yes, a VSCode extension
- Model providers: ✅ OpenAI, ✅ Azure OpenAI, ✅ Anthropic, ✅ Ollama

*: Honorable mention to Microsoft's AICI, which is working on creating a shim for cooperative constraints implemented in Python/JS using a WASM runtime.

LLMs make a lot of the same mistakes that humans do when producing JSON (e.g. a } in the wrong place or a missing comma), so it's important that the framework can help you handle these errors.
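As a minimal sketch of what error-tolerant parsing means in practice, the hypothetical repair_json helper below patches a few of these common mistakes (an unterminated string, a trailing comma, unclosed braces or brackets) before handing the text to a standard JSON parser. It is purely illustrative and is not the Rust-based parser mentioned above, which handles far more failure modes.

```python
import json
from typing import Any

def repair_json(raw: str) -> Any:
    """Best-effort repair of common LLM JSON mistakes before parsing.

    Handles unterminated strings, trailing commas, and unclosed
    braces/brackets. Illustrative only.
    """
    stack = []          # open '{' / '[' characters, innermost last
    in_string = False
    escaped = False
    for ch in raw:
        if in_string:
            if escaped:
                escaped = False
            elif ch == "\\":
                escaped = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
        elif ch in "{[":
            stack.append(ch)
        elif ch in "}]" and stack:
            stack.pop()

    fixed = raw.rstrip()
    if in_string:
        fixed += '"'                   # close a dangling string
    fixed = fixed.rstrip(",")          # drop a trailing comma
    for opener in reversed(stack):     # close containers inside-out
        fixed += "}" if opener == "{" else "]"
    return json.loads(fixed)

# repair_json('{"foo": "bar')  -> {'foo': 'bar'}
# repair_json('{"a": [1, 2,')  -> {'a': [1, 2]}
```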