Description
Confirm this is an issue with the Python library and not an underlying OpenAI API
- This is an issue with the Python library
Describe the bug
As soon as the package openai v1.108.0 was released, it broke compatibility with older pydantic v2 versions such as v2.3.0. Importing the library now raises a NameError caused by the use of __pydantic_extra__ introduced by recent changes to the module openai.types.evals.runs: https://github.com/openai/openai-python/tree/main/src/openai/types/evals/runs
Previous versions, such as openai v1.107.3, worked fine.
/usr/local/lib/python3.10/site-packages/llama_index/__init__.py:13: in <module>
from llama_index.callbacks.global_handlers import set_global_handler
/usr/local/lib/python3.10/site-packages/llama_index/callbacks/__init__.py:7: in <module>
from .token_counting import TokenCountingHandler
/usr/local/lib/python3.10/site-packages/llama_index/callbacks/token_counting.py:6: in <module>
from llama_index.utilities.token_counting import TokenCounter
/usr/local/lib/python3.10/site-packages/llama_index/utilities/token_counting.py:6: in <module>
from llama_index.llms import ChatMessage, MessageRole
/usr/local/lib/python3.10/site-packages/llama_index/llms/__init__.py:26: in <module>
from llama_index.llms.litellm import LiteLLM
/usr/local/lib/python3.10/site-packages/llama_index/llms/litellm.py:27: in <module>
from llama_index.llms.litellm_utils import (
/usr/local/lib/python3.10/site-packages/llama_index/llms/litellm_utils.py:4: in <module>
from openai.resources import Completions
/usr/local/lib/python3.10/site-packages/openai/resources/__init__.py:27: in <module>
from .evals import (
/usr/local/lib/python3.10/site-packages/openai/resources/evals/__init__.py:3: in <module>
from .runs import (
/usr/local/lib/python3.10/site-packages/openai/resources/evals/runs/__init__.py:3: in <module>
from .runs import (
/usr/local/lib/python3.10/site-packages/openai/resources/evals/runs/runs.py:16: in <module>
from .output_items import (
/usr/local/lib/python3.10/site-packages/openai/resources/evals/runs/output_items.py:17: in <module>
from ....types.evals.runs import output_item_list_params
/usr/local/lib/python3.10/site-packages/openai/types/evals/runs/__init__.py:6: in <module>
from .output_item_list_response import OutputItemListResponse as OutputItemListResponse
/usr/local/lib/python3.10/site-packages/openai/types/evals/runs/output_item_list_response.py:14: in <module>
class Result(BaseModel):
/usr/local/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py:98: in __new__
private_attributes = inspect_namespace(
/usr/local/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py:327: in inspect_namespace
raise NameError(
E NameError: Fields must not use names with leading underscores; e.g., use 'pydantic_extra__' instead of '__pydantic_extra__'.
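For context, the failure can be reproduced with pydantic 2.3.0 alone: its inspect_namespace() check rejects any FieldInfo assigned to a name with leading underscores, and the regenerated models in openai.types.evals.runs appear to declare __pydantic_extra__ in exactly that way. The sketch below only approximates the generated openai code; the Result class name and the Dict[str, object] type are taken from the traceback, the rest is an assumption:

```python
# Minimal sketch approximating the pattern in the regenerated openai models
# (NOT the exact openai source). On pydantic 2.3.0 the class definition itself
# raises the NameError shown above, because inspect_namespace() rejects a
# FieldInfo bound to a leading-underscore name. Newer pydantic releases handle
# __pydantic_extra__ annotations, which is presumably why openai 1.108.0
# imports cleanly with them.
from typing import Dict

from pydantic import BaseModel, ConfigDict, Field


class Result(BaseModel):
    model_config = ConfigDict(extra="allow")

    # On pydantic 2.3.0 this line triggers:
    # NameError: Fields must not use names with leading underscores; ...
    __pydantic_extra__: Dict[str, object] = Field()
```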
To Reproduce
- Install an older pydantic v2 release such as pydantic==2.3.0 together with openai==1.108.0.
- Create a mytest.py file containing from openai.resources import Completions.
- Run mytest.py and observe the NameError.
- Then install openai==1.107.3 and run mytest.py again; the NameError is gone.
Code snippets
```shell
pip install pydantic==2.3.0 openai==1.108.0
```
```python
from openai.resources import Completions
```
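For convenience, the reproduction steps can be combined into a single check script. The mytest.py below is only an illustrative helper; the version reporting and the try/except are additions for clarity, not part of the original report:

```python
# mytest.py -- illustrative check script, not part of the openai library.
from importlib.metadata import version

# Report the installed versions without importing the packages themselves.
print("openai", version("openai"), "/ pydantic", version("pydantic"))

try:
    from openai.resources import Completions  # noqa: F401
except NameError as exc:
    # openai==1.108.0 together with pydantic==2.3.0 ends up here.
    print("import failed:", exc)
else:
    # openai==1.107.3 (or a newer pydantic) imports without the error.
    print("import succeeded")
```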
OS
Windows
Python version
v3.10.12
Library version
openai v1.108.0