Fix JSON parsing from model #85
Conversation
See the cold start problem discussed in https://rwilinski.ai/posts/benchmarking-llms-for-structured-json-generation/:
For a simple to moderately complex schema, it is reasonable to go non-strict, based on the success rates for complex schemas (below).
We should expose a strict flag, in line with the author's suggestion.
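A minimal sketch of how such a flag could be threaded through to the request. The function name and wiring are hypothetical, not the actual curator code; only the `response_format` payload follows OpenAI's structured-outputs API.

```python
# Illustrative sketch: the function name and wiring are hypothetical,
# but the response_format payload follows OpenAI's structured-outputs API.
from openai import OpenAI

client = OpenAI()

def request_json(messages: list[dict], json_schema: dict, strict: bool = False) -> str:
    """Request JSON output; strict=True asks OpenAI to constrain decoding to the schema."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        response_format={
            "type": "json_schema",
            "json_schema": {
                "name": "response",
                "schema": json_schema,
                # strict=True guarantees schema conformance but pays a one-time
                # cold-start cost per schema; strict=False avoids that cost at
                # the risk of non-conforming output.
                "strict": strict,
            },
        },
    )
    return response.choices[0].message.content
```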
Small change to the error message:
Just say that the model successfully responded with a string that is valid JSON but does not match the expected schema.
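A rough sketch of the suggested wording; the surrounding parsing code and names are illustrative stand-ins, not the actual implementation in base_request_processor.py.

```python
# Illustrative only: names are hypothetical stand-ins for the real code.
import json

from pydantic import BaseModel, ValidationError

def parse_structured_response(content: str, response_format: type[BaseModel]) -> BaseModel:
    data = json.loads(content)  # at this point the string is known to be JSON
    try:
        return response_format.model_validate(data)
    except ValidationError as e:
        # Suggested message: the model did respond with valid JSON,
        # it just does not match the expected schema.
        raise ValueError(
            "Model responded with a string that is valid JSON, "
            f"but it does not match the expected schema: {e}"
        ) from e
```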
src/bespokelabs/curator/request_processor/base_request_processor.py
LGTM!
Since we are not using strict mode, it is possible for OpenAI to return invalid output (either not JSON at all, or valid JSON that does not conform to the JSON schema). In that case we catch the invalid response and skip it.
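Roughly what that handling looks like, as a sketch that assumes responses are validated with pydantic; the names are illustrative, not the literal curator code.

```python
# Sketch of the non-strict handling described above; names are illustrative.
import json

from pydantic import BaseModel, ValidationError

def parse_or_skip(raw_responses: list[str], response_format: type[BaseModel]) -> list[BaseModel]:
    """Keep only responses that are valid JSON and conform to the schema."""
    parsed = []
    for raw in raw_responses:
        try:
            parsed.append(response_format.model_validate(json.loads(raw)))
        except (json.JSONDecodeError, ValidationError):
            # Not JSON at all, or JSON that does not conform to the schema:
            # skip this response instead of failing the whole batch.
            continue
    return parsed
```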