v1.0.0 Migration Guide #742
-
Is the highlighted line a typo? (image removed because it was huge)
-
A comma is needed after the api_version line when defining the client:

```python
client = AzureOpenAI(
    # https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#rest-api-versioning
    api_version="2023-07-01-preview"  # <- missing comma here
    # https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal#create-a-resource
    azure_endpoint="https://example-endpoint.openai.azure.com",
)
```
-
Incomprehensible. Who writes this stuff?
-
You can run
-
What happened to the functions array that was in chatCompletions? It seems to no longer work in chat.completions. Sample code?
-
I'm busy right now; I'll explain later.

On Thu, 9 Nov 2023, 15:39, Robert Craigie wrote:
> It's still there, what's not working for you?
> https://github.com/openai/openai-python/blob/8e4e5d490629a5abfbdf4b6a2ce20c7eee848a2e/src/openai/resources/chat/completions.py#L63
-
It would be great if you could update ChatGPT's knowledge as well. Currently, it provides OpenAI-related code for the old version, even when I provide examples and ask it to update to the new version.
-
Was there a heads-up on this? We haven't deployed our tools that use the OpenAI API into production yet, and it's a good thing we didn't. We had no idea that there would be a new version. Maybe next time there could be extended support for the legacy version?
-
@rattrayalex @RobertCraigie - hey there - Keiji from Azure OpenAI here! We have written a more extensive Azure OpenAI migration guide in our documentation: https://aka.ms/oai/v1-python-migration
Would it be possible to add it to the main body of this guide?
-
I don't want to use WSL.

On Thu, 9 Nov 2023 at 18:09, Alex Rattray wrote:
> You need to use Windows Subsystem for Linux (WSL) for this.
> https://learn.microsoft.com/en-us/windows/wsl/install
-
Still don't understand how to use grit.
-
Did openai.Timeout get changed to openai.APITimeoutError? I don't see that in the list of name changes.
-
I am stuck with a bunch of bots that no longer work.
-
Hi there. While I use

```python
openai.proxies = {
    'http': 'http://127.0.0.1:7890',
    'https': 'http://127.0.0.1:7890',
}
```

Should
-
You need to make additional changes by hand, like:

```diff
-output = completion['choices'][0]['message']['content']
-prom = completion['usage']['prompt_tokens']
-comp = completion['usage']['completion_tokens']
+output = completion.choices[0].message.content
+prom = completion.usage.prompt_tokens
+comp = completion.usage.completion_tokens
```
-
I'm encountering issues when trying to use the OpenAI API with AWS Lambda following an update that involves pydantic. It reports that there is no module named pydantic_core. I am using Python 3.9 for both the Lambda function code and the OpenAI API layer. This is the error message from the CloudWatch error log:

```
[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': No module named 'pydantic_core._pydantic_core'
Traceback (most recent call last):
```
-
Hello, and thanks. I have a question and would be really thankful if you could help me out. I have installed openai (1.7.1), my Python version is 3.11.7, and in the first step of my code, which is
-
What can I do about 'OpenAIError'? Because I need the exception:
-
After following the guide for Windows, I get the same result. What could be the issue?
-
Hello
-
How do I enter these 3 lines in Ubuntu and then run them? It accepts only one at a time.
-
Maybe I am doing something very stupid, but nothing happens:
-
Can anyone help me with this code? I'm trying to run this function, but this error appears:

```
File "c:\Users", line 254, in distances_from_embeddings
```
-
Hello, I am a little bit confused by the steps:
I tried the above both while in a randomly chosen directory and in ~/.grit/bin/grit. Nothing works. Please help! Thanks!
-
Hi @rattrayalex,
-
The docs on the OpenAI website are outdated.
-
Terrible update. It messes up all my scripts. The APIs are not consistent, e.g. base_url for OpenAI() but azure_endpoint for AzureOpenAI().
-
Thanks to migration from
-
We have released a new major version of our SDK, and we recommend upgrading promptly.
It's a total rewrite of the library, so many things have changed, but we've made upgrading easy with a code migration script and detailed docs below. It was extensively beta tested prior to release.
Getting started
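Assuming you install from PyPI with pip, upgrading to the new major version looks like:

```bash
pip install --upgrade openai
```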
What's changed
Migration guide
For Azure OpenAI users, see Microsoft's Azure-specific migration guide.
Automatic migration with grit
You can automatically migrate your codebase using grit, either online or with the following CLI command on Mac or Linux:
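As a sketch, matching the commands used in the Windows/WSL steps later in this guide:

```bash
curl -fsSL https://docs.grit.io/install | bash
grit install
grit apply openai
```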
The grit binary executes entirely locally with AST-based transforms.
Be sure to audit its changes: we suggest ensuring you have a clean working tree beforehand, and running git add --patch afterwards. Note that grit.io also offers opt-in automatic fixes powered by AI.
Automatic migration with grit on Windows
To use grit to migrate your code on Windows, you will need to use Windows Subsystem for Linux (WSL). Installing WSL is quick and easy, and you do not need to keep using Linux once the command is done.
Here's a step-by-step guide for setting up and using WSL for this purpose:
1. Install WSL: wsl --install
2. cd into the appropriate directory (e.g., cd /mnt/c/Users/Myself/my/code/) and then run the following commands:

```bash
curl -fsSL https://docs.grit.io/install | bash
grit install
grit apply openai
```
Then, you can close WSL and go back to using Windows.
Automatic migration with grit in Jupyter Notebooks
If your Jupyter notebooks are not in source control, they will be more difficult to migrate. You may want to copy each cell into grit's web interface, and paste the output back in.
If you need to migrate in a way that preserves use of the module-level client instead of instantiated clients, you can use the openai_global grit migration instead.
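Presumably the openai_global migration is applied with the same grit invocation as above, just with the pattern name swapped (an unverified sketch):

```bash
grit apply openai_global
```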
Initialization
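A minimal sketch of the new client-based initialization (the key value is a placeholder; by default the client reads OPENAI_API_KEY from the environment):

```python
from openai import OpenAI

# Before (v0.x):
#   import openai
#   openai.api_key = "sk-..."

# After (v1.x): create a client instance. If api_key is omitted, it is
# read from the OPENAI_API_KEY environment variable.
client = OpenAI(api_key="sk-...")
```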
Responses
Response objects are now pydantic models and no longer conform to the dictionary shape. However, you can easily convert them to a dictionary with model.model_dump().
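For example, a short sketch of attribute access on a chat completion (model name and prompt are placeholders):

```python
from openai import OpenAI

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)

# Attribute access instead of dictionary indexing:
content = completion.choices[0].message.content
prompt_tokens = completion.usage.prompt_tokens

# Convert back to a plain dict when needed (e.g. for json.dumps or legacy code):
as_dict = completion.model_dump()
```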
Async client
We do not support calling asynchronous methods on the module-level client; instead, you will have to instantiate an async client.
The rest of the API is exactly the same as the synchronous client.
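A minimal sketch of the async client (assuming OPENAI_API_KEY is set in the environment):

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()


async def main() -> None:
    completion = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(completion.choices[0].message.content)


asyncio.run(main())
```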
Module client
Important
We highly recommend instantiating client instances instead of relying on the global client.
We also expose a global client instance that is accessible in a similar fashion to versions prior to v1.
The API is exactly the same as the standard client-instance-based API.
This is intended to be used within REPLs or notebooks for faster iteration, not in application code.
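For example, a rough sketch of global-client usage in a REPL or notebook (the key is a placeholder):

```python
import openai

# Configure the global client, similar to pre-v1 usage.
openai.api_key = "sk-..."

completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
```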
We recommend that you always instantiate a client (e.g., with client = OpenAI()) in application code because:
Pagination
All list() methods that support pagination in the API now support automatic iteration; for example, see the sketch below. Previously you would have to explicitly call a .auto_paging_iter() method instead.
See the README for more details.
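As a sketch of automatic iteration (the fine-tuning jobs list is just one example of a paginated method):

```python
from openai import OpenAI

client = OpenAI()

# Iterating over the returned page object fetches additional pages
# from the API automatically as needed.
for job in client.fine_tuning.jobs.list(limit=20):
    print(job.id)
```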
Azure OpenAI
To use this library with Azure OpenAI, use the AzureOpenAI class instead of the OpenAI class.
A more comprehensive Azure-specific migration guide is available on the Microsoft website.
Important
The Azure API shape differs from the core API shape, which means that the static types for responses / params won't always be correct.
In addition to the options provided in the base OpenAI client, the following options are provided:
- azure_endpoint
- azure_deployment
- api_version
- azure_ad_token
- azure_ad_token_provider
An example of using the client with Azure Active Directory can be found here.
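A minimal sketch, reusing the endpoint and API version shown elsewhere in this thread; "my-deployment-name" is a placeholder, and the API key is assumed to come from the AZURE_OPENAI_API_KEY environment variable:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://example-endpoint.openai.azure.com",
)

completion = client.chat.completions.create(
    model="my-deployment-name",  # an Azure deployment name, not a base model name
    messages=[{"role": "user", "content": "Hello"}],
)
```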
All name changes
- openai.api_base -> openai.base_url
- openai.proxy -> openai.proxies (docs)
- openai.InvalidRequestError -> openai.BadRequestError
- openai.Audio.transcribe() -> client.audio.transcriptions.create()
- openai.Audio.translate() -> client.audio.translations.create()
- openai.ChatCompletion.create() -> client.chat.completions.create()
- openai.Completion.create() -> client.completions.create()
- openai.Edit.create() -> client.edits.create()
- openai.Embedding.create() -> client.embeddings.create()
- openai.File.create() -> client.files.create()
- openai.File.list() -> client.files.list()
- openai.File.retrieve() -> client.files.retrieve()
- openai.File.download() -> client.files.retrieve_content()
- openai.FineTune.cancel() -> client.fine_tunes.cancel()
- openai.FineTune.list() -> client.fine_tunes.list()
- openai.FineTune.list_events() -> client.fine_tunes.list_events()
- openai.FineTune.stream_events() -> client.fine_tunes.list_events(stream=True)
- openai.FineTune.retrieve() -> client.fine_tunes.retrieve()
- openai.FineTune.delete() -> client.fine_tunes.delete()
- openai.FineTune.create() -> client.fine_tunes.create()
- openai.FineTuningJob.create() -> client.fine_tuning.jobs.create()
- openai.FineTuningJob.cancel() -> client.fine_tuning.jobs.cancel()
- openai.FineTuningJob.delete() -> client.fine_tuning.jobs.create()
- openai.FineTuningJob.retrieve() -> client.fine_tuning.jobs.retrieve()
- openai.FineTuningJob.list() -> client.fine_tuning.jobs.list()
- openai.FineTuningJob.list_events() -> client.fine_tuning.jobs.list_events()
- openai.Image.create() -> client.images.generate()
- openai.Image.create_variation() -> client.images.create_variation()
- openai.Image.create_edit() -> client.images.edit()
- openai.Model.list() -> client.models.list()
- openai.Model.delete() -> client.models.delete()
- openai.Model.retrieve() -> client.models.retrieve()
- openai.Moderation.create() -> client.moderations.create()
- openai.api_resources -> openai.resources
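As one concrete instance of the renames above, catching a bad request now looks roughly like this (the empty request is only illustrative):

```python
import openai
from openai import OpenAI

client = OpenAI()

try:
    client.chat.completions.create(model="gpt-3.5-turbo", messages=[])
except openai.BadRequestError as err:  # was openai.InvalidRequestError in v0.x
    print(err)
```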
Removed
- openai.api_key_path
- openai.app_info
- openai.debug
- openai.log
- openai.OpenAIError
- openai.Audio.transcribe_raw()
- openai.Audio.translate_raw()
- openai.ErrorObject
- openai.Customer
- openai.api_version
- openai.verify_ssl_certs
- openai.api_type
- openai.enable_telemetry
- openai.ca_bundle_path
- openai.requestssession (we now use httpx)
- openai.aiosession (we now use httpx)
- openai.Deployment (only used for Azure) – please use the azure-mgmt-cognitiveservices client library instead (here's how to list deployments, for example)
- openai.Engine
- openai.File.find_matching_files()
- openai.embeddings_utils (now in the cookbook)