I'd like to query an LLM and have the response in a specific JSON format. Is that possible? Or do I just need to ask this as part of the query?
— Steve
If you specify in your prompt that the response should be in JSON format, the LLM should respond that way.
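To illustrate the prompt-only approach: a minimal Python sketch (not the component's API; the field names and template are hypothetical) that asks for JSON in the prompt and then parses the reply defensively, since models sometimes wrap the JSON in code fences or add surrounding prose.

```python
import json
import re

def json_prompt(question: str) -> str:
    """Wrap a question with an explicit instruction to answer in JSON."""
    return (
        f"{question}\n\n"
        "Respond ONLY with a JSON object of the form "
        '{"answer": "<text>"} and no other text.'
    )

def extract_json(reply: str) -> dict:
    """Tolerant parse: grab the first {...} span even if the model added
    code fences or explanatory text around it."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if not match:
        raise ValueError("no JSON object found in reply")
    return json.loads(match.group(0))

# Example: a reply wrapped in a code fence still parses cleanly.
reply = '```json\n{"answer": "42"}\n```'
print(extract_json(reply)["answer"])  # -> 42
```

The regex-based extraction is a pragmatic fallback; with prompt-only instructions there is no hard guarantee the model complies, which is why the provider-level options discussed below exist.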
Thanks Bruno. Some LLMs have the option to force a JSON response. Can I request that this be added in a future version?
— Steve
It can typically be forced by specifying it in the system prompt. Did you try this?
You mean setting the SystemRole to the JSON template? I haven't tried it, but I will.
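For reference, the system-prompt approach maps onto a plain chat request roughly like this Python sketch (an assumption about the underlying API shape, not the component itself; the model name and JSON template are placeholders). OpenAI-style endpoints can additionally enforce valid JSON via a `response_format` parameter:

```python
import json

# Hypothetical JSON template the model must follow.
template = '{"city": "<string>", "population": <integer>}'

payload = {
    "model": "gpt-4o-mini",
    "messages": [
        # The system role carries the format contract.
        {"role": "system",
         "content": f"Always answer with a JSON object matching: {template}"},
        {"role": "user", "content": "What is the largest city in Belgium?"},
    ],
    # OpenAI-style enforcement of syntactically valid JSON output.
    "response_format": {"type": "json_object"},
}

print(json.dumps(payload, indent=2))
```

Note that `response_format` only guarantees syntactically valid JSON; matching the exact template still depends on the instructions in the system message.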
Also, when retrieving the model's response, the JSON is somewhat different for different providers, e.g. Claude and OpenAI. Is this something the component can standardize?
We'll go over all supported LLMs and see whether there is common ground among all services for automatically extracting the data in the requested format.