
Crash "Unknown field for Schema: title" when using langchain_google_genai.ChatGoogleGenerativeAI #28568

Open
majorgilles opened this issue Dec 6, 2024 · 3 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


majorgilles commented Dec 6, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

class PictureCategorizationOutputState(BaseModel):
    """Picture categorization output state model.

    Attributes:
        picture_type: The picture type.
        image_texts: The image texts.
        operational_error: The operational error, if you encounter any issue during your task.
    """

    picture_type: AIPictureType = Field(description="The type of the picture.")
    image_texts: list[ImageText] = Field([], description="The texts found in the image.")
    dimensions: Dimensions = Field(description="The dimensions of the image in pixels (height and width).")
    operational_error: str = Field(
        "", description="The operational error, if you encounter any issue during your task."
    )
    highlighted_image_b64: str = Field(description="The b64 encoded image with highlighted text.")


def question_vehicle_picture_category_node(state: PictureCategorizationSharedState) -> PictureCategorizationOutputState:
    """Prompt AI for vehicle picture type (INTERIOR / EXTERIOR).

    Args:
        state: Picture categorization shared state.

    Returns:
        PictureCategorizationOutputState: The picture categorization output state.
    """
    llm = ChatGoogleGenerativeAI(
        model="gemini-1.5-pro",
        temperature=0,
        project="some_project"
    ).with_structured_output(PictureCategorizationOutputState)   #!!!!!! THIS CRASHES !!!!!!!!!
    prompt = get_picture_categorization_prompt_template(str(state.picture_url))
    return cast(PictureCategorizationOutputState, (prompt | llm).invoke({}))

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
File "C:\Program Files\JetBrains\PyCharm 2024.2.4\plugins\python-ce\helpers\pydev\pydevd.py", line 1570, in exec
pydev_imports.execfile(file, globals, locals) # execute the script
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\JetBrains\PyCharm 2024.2.4\plugins\python-ce\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
exec(compile(contents+"\n", file, 'exec'), glob, loc)
File "C:\dev\lizy-ai\projects\picture_categorization\graph_v1.py", line 49, in <module>
compiled_graph.invoke({"picture_url": "https://media.gq.com/photos/6508829d305ef4e0229049b3/master/pass/plane.jpg"})
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\pregel\__init__.py", line 1929, in invoke
for chunk in self.stream(
^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\pregel\__init__.py", line 1649, in stream
for _ in runner.tick(
^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\pregel\runner.py", line 105, in tick
run_with_retry(t, retry_policy, writer=writer)
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\pregel\retry.py", line 44, in run_with_retry
task.proc.invoke(task.input, config)
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\utils\runnable.py", line 410, in invoke
input = context.run(step.invoke, input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langgraph\utils\runnable.py", line 184, in invoke
ret = context.run(self.func, input, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\dev\lizy-ai\projects\picture_categorization\nodes.py", line 28, in question_vehicle_picture_category_node
).with_structured_output(PictureCategorizationOutputState)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langchain_google_genai\chat_models.py", line 1239, in with_structured_output
tool_choice = _get_tool_name(schema) if self._supports_tool_choice else None
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langchain_google_genai\chat_models.py", line 1383, in _get_tool_name
genai_tool = tool_to_dict(convert_to_genai_function_declarations([tool]))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langchain_google_genai\_function_utils.py", line 173, in convert_to_genai_function_declarations
fd = _format_to_gapic_function_declaration(tool)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langchain_google_genai\_function_utils.py", line 197, in _format_to_gapic_function_declaration
return _convert_pydantic_to_genai_function(tool)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\langchain_google_genai\_function_utils.py", line 270, in _convert_pydantic_to_genai_function
function_declaration = gapic.FunctionDeclaration(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\message.py", line 728, in __init__
pb_value = marshal.to_proto(pb_type, value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\marshal.py", line 235, in to_proto
pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\rules\message.py", line 45, in to_proto
return self._wrapper(value)._pb
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\message.py", line 728, in __init__
pb_value = marshal.to_proto(pb_type, value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\marshal.py", line 233, in to_proto
return {k: self.to_proto(recursive_type, v) for k, v in value.items()}
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\marshal.py", line 235, in to_proto
pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\rules\message.py", line 45, in to_proto
return self._wrapper(value)._pb
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\message.py", line 728, in __init__
pb_value = marshal.to_proto(pb_type, value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\marshal.py", line 235, in to_proto
pb_value = self.get_rule(proto_type=proto_type).to_proto(value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\marshal\rules\message.py", line 45, in to_proto
return self._wrapper(value)._pb
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\giloz\AppData\Local\pypoetry\Cache\virtualenvs\lizy-ai-8ngQPScW-py3.12\Lib\site-packages\proto\message.py", line 724, in __init__
raise ValueError(
ValueError: Unknown field for Schema: title

Description

It seems the "title" property is causing an issue during marshalling. The same basic structure works with the ChatOpenAI class.
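The stack trace points at the gapic `Schema` proto rejecting a `title` field. One plausible reading: Pydantic v2's `model_json_schema()` attaches a "title" key to the model and to each field, while Google's `Schema` message has no such field, so marshalling fails. The sketch below is stdlib-only; the `strip_titles` helper is hypothetical (not part of LangChain), and the sample dict is hand-written to resemble typical Pydantic output:

```python
# Hypothetical helper (not part of langchain): recursively drop "title" keys
# from a Pydantic-style JSON schema, since Gemini's Schema proto has no
# "title" field and raises ValueError when one is present.

def strip_titles(node):
    """Return a copy of a JSON-schema structure with all 'title' keys removed."""
    if isinstance(node, dict):
        return {k: strip_titles(v) for k, v in node.items() if k != "title"}
    if isinstance(node, list):
        return [strip_titles(v) for v in node]
    return node

# Hand-written dict shaped like Pydantic v2's model_json_schema() output:
raw = {
    "title": "Person",
    "type": "object",
    "properties": {
        "name": {"title": "Name", "type": "string"},
        "tags": {"title": "Tags", "type": "array", "items": {"type": "string"}},
    },
    "required": ["name"],
}

clean = strip_titles(raw)
print(clean)  # no "title" keys remain at any level
```

Whether the cleaned dict can then be passed back to the integration depends on the langchain-google-genai version; this only illustrates where the offending keys come from.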

System Info

System Information

OS: Windows
OS Version: 10.0.22631
Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]

Package Information

langchain_core: 0.3.21
langchain: 0.3.9
langchain_community: 0.3.9
langsmith: 0.1.147
langchain_anthropic: 0.2.4
langchain_chroma: 0.1.4
langchain_fireworks: 0.2.5
langchain_google_genai: 2.0.6
langchain_google_vertexai: 2.0.8
langchain_openai: 0.2.11
langchain_pinecone: 0.2.0
langchain_text_splitters: 0.3.2
langgraph_sdk: 0.1.43

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.9.5
anthropic: 0.40.0
anthropic[vertexai]: Installed. No version info available.
async-timeout: Installed. No version info available.
chromadb: 0.5.21
dataclasses-json: 0.6.7
defusedxml: 0.7.1
fastapi: 0.115.6
filetype: 1.2.0
fireworks-ai: 0.15.9
google-cloud-aiplatform: 1.74.0
google-cloud-storage: 2.19.0
google-generativeai: 0.8.3
httpx: 0.27.2
httpx-sse: 0.4.0
jsonpatch: 1.33
langchain-mistralai: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
numpy: 1.26.4
openai: 1.57.0
orjson: 3.10.12
packaging: 24.2
pinecone-client: 5.0.1
pydantic: 2.9.2
pydantic-settings: 2.6.1
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.36
tenacity: 8.2.3
tiktoken: 0.8.0
typing-extensions: 4.12.2

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Dec 6, 2024

salah55s commented Dec 10, 2024

Same here. It's a LangChain bug; I think we should use another release, but which one?


databaaz commented Dec 10, 2024

I encountered the same issue. The ChatGoogleGenerativeAI class throws this error with a Pydantic schema.

What worked for me is the following: instead of (prompt | llm.with_structured_output(Schema)).invoke({}), do (prompt | llm.bind_tools([Schema])).invoke({}), where llm is an instance of ChatGoogleGenerativeAI.
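One caveat worth noting about this workaround: with_structured_output returns a parsed object, while bind_tools returns an AIMessage whose tool_calls you unpack yourself (in langchain-core 0.3, each entry is a dict with "name", "args", and "id" keys). A minimal sketch; the payload below is hand-written sample data standing in for a real model response:

```python
# Hand-written dict shaped like one entry of AIMessage.tool_calls, standing in
# for what a real ChatGoogleGenerativeAI response would carry after bind_tools.
tool_call = {
    "name": "Persons",
    "args": {"persons": [{"name": "Alice", "age": 30, "sex": "female"}]},
    "id": "call_0",
}

# With bind_tools, the structured arguments must be extracted manually;
# they arrive as plain dicts, not validated Pydantic instances.
args = tool_call["args"]
print(args["persons"][0]["name"])  # prints Alice
```

If validated objects are needed, the extracted args dict could then be fed through the Pydantic model's model_validate.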


Hamza5 commented Dec 16, 2024

I have a simpler example that causes the same bug, which is a variation of the official extraction tutorial:

from typing import Optional, List
from pydantic import BaseModel, Field
from dotenv import load_dotenv
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.prompts import ChatPromptTemplate


load_dotenv()


class Person(BaseModel):
    """Represents a person's characteristics."""
    name: Optional[str] = Field(description="The name of the person.", default=None)
    age: Optional[int] = Field(description="The age of the person in years.", default=None)
    sex: Optional[str] = Field(description="The sex of the person (male or female). Try to guess if possible. Otherwise, return None.", default=None)


class Persons(BaseModel):
    """Represents a list of persons' characteristics."""
    persons: List[Person] = Field(description="A list of persons' characteristics.", default_factory=list)


prompt_template = ChatPromptTemplate.from_messages([
    ("system", "You are a parser that extracts mentioned persons with their characteristics from a text. If a characteristic is not found, return None for the corresponding field."),
    ("human", "{text}")
])

model = ChatGoogleGenerativeAI(model='gemini-1.5-flash').with_structured_output(Persons)

runnable = prompt_template | model

text = input("Enter a text: ")
if text:
    output = runnable.invoke({"text": text})
    print(output.model_dump_json(indent=2))

For now, I am using @databaaz's workaround to make this work:

model = ChatGoogleGenerativeAI(model='gemini-1.5-flash').bind_tools([Persons])

Please note that the problem occurs only when one of the fields is a List. In my code, when using Person instead of Persons, everything works perfectly (but I need to use List).
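A possible explanation for the List connection: with a nested list model, Pydantic v2 moves the item model into "$defs", and each definition carries its own "title", so "title" keys appear at several nesting levels rather than only at the top. The dict below is hand-written to mirror what model_json_schema() typically emits for Persons, and the counting helper is purely illustrative:

```python
# Hand-written dict mirroring Pydantic v2's model_json_schema() for a model
# with a List field: the item model lands under "$defs" with its own "title".
persons_schema = {
    "title": "Persons",
    "type": "object",
    "properties": {
        "persons": {
            "title": "Persons",
            "type": "array",
            "items": {"$ref": "#/$defs/Person"},
        },
    },
    "$defs": {
        "Person": {
            "title": "Person",
            "type": "object",
            "properties": {"name": {"title": "Name", "type": "string"}},
        },
    },
}

def count_titles(node):
    """Count 'title' keys anywhere in a nested JSON-schema structure."""
    if isinstance(node, dict):
        return ("title" in node) + sum(count_titles(v) for v in node.values())
    if isinstance(node, list):
        return sum(count_titles(v) for v in node)
    return 0

print(count_titles(persons_schema))  # prints 4
```

So a converter that only removes the top-level title would still crash on the nested ones, which may be why the flat Person model passes while the list-bearing Persons model does not.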
