
Add OpenAIBatch backend and refactor RequestProcessor to be compatible #28

Merged
merged 19 commits into from
Nov 8, 2024

Conversation

RyanMarten

You can now use batch mode via:

curator.Prompter(
    prompt_func=prompt_func, 
    parse_func=parse_func, 
    model_name="gpt-4o-mini", 
    batch=True
)

A new example in examples/distill.py demonstrates this for a large reannotation job.

This also includes a small change to the interface. Previously, prompt_func returned a dictionary, either

{"user_prompt": content}

or

{"user_prompt": content, "system_prompt": content}

Now prompt_func returns either a string of user content or a list of standard-form messages. A string is converted to

[
 {"role": "user", "content": content}
]

while a list of messages is used as-is, e.g.

[
 {"role": "system", "content": content},
 {"role": "user", "content": content},
]
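The string-or-messages conversion described above can be sketched as follows. This is a minimal illustration, not the actual implementation; normalize_messages is a hypothetical helper name:

```python
def normalize_messages(prompt):
    """Convert prompt_func output into OpenAI-style chat messages.

    A bare string becomes a single user message; a list of
    standard-form messages is passed through unchanged.
    """
    if isinstance(prompt, str):
        return [{"role": "user", "content": prompt}]
    if isinstance(prompt, list):
        return prompt
    raise TypeError("prompt_func must return a str or a list of messages")
```
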

@RyanMarten RyanMarten requested a review from vutrung96 November 8, 2024 05:43

@vutrung96 vutrung96 left a comment


LGTM

@vutrung96 vutrung96 merged commit 10d58d3 into main Nov 8, 2024
@vutrung96 vutrung96 deleted the ryanm/OpenAIBatch branch November 8, 2024 07:26