What happened?
```
> Task :sdks:python:test-suites:direct:py38:tensorflowInferenceTest FAILED
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "/runner/_work/beam/beam/build/gradleenv/1398941892/lib/python3.8/site-packages/_pytest/assertion/rewrite.py", line 186, in exec_module
    exec(co, module.__dict__)
  File "/runner/_work/beam/beam/build/gradleenv/1398941892/lib/python3.8/site-packages/typeguard/__init__.py", line 21, in <module>
    from ._importhook import ImportHookManager as ImportHookManager
  File "/runner/_work/beam/beam/build/gradleenv/1398941892/lib/python3.8/site-packages/_pytest/assertion/rewrite.py", line 186, in exec_module
    exec(co, module.__dict__)
  File "/runner/_work/beam/beam/build/gradleenv/1398941892/lib/python3.8/site-packages/typeguard/_importhook.py", line 22, in <module>
    from typing_extensions import Buffer
ImportError: cannot import name 'Buffer' from 'typing_extensions' (/runner/_work/beam/beam/build/gradleenv/1398941892/lib/python3.8/site-packages/typing_extensions.py)
```
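The direct cause is that `typing_extensions.Buffer` only exists in typing-extensions >= 4.6.0, while the environment ends up with 4.5.0 (downgraded by the tensorflow 2.13.1 install shown in the pip output below). A minimal sketch reproducing the failure, assuming that environment:

```python
# Minimal reproduction, assuming typing-extensions 4.5.0 is installed
# (as in the pip output below):
from importlib.metadata import version

print(version("typing_extensions"))  # "4.5.0" in the failing environment

# Buffer was added to typing_extensions in 4.6.0, so on 4.5.0 this raises
# the same ImportError that typeguard._importhook hits while pytest
# rewrites assertions at collection time:
from typing_extensions import Buffer  # ImportError on typing-extensions < 4.6.0
```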
Judging from:
```
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
Successfully installed Pillow-10.3.0 absl-py-2.1.0 astunparse-1.6.3 flatbuffers-24.3.25 gast-0.4.0 google-auth-oauthlib-1.0.0 google-pasta-0.2.0 h5py-3.11.0 keras-2.13.1 libclang-18.1.1 markdown-3.6 numpy-1.24.3 oauthlib-3.2.2 opt-einsum-3.3.0 requests-oauthlib-2.0.0 tensorboard-2.13.0 tensorboard-data-server-0.7.2 tensorflow-2.13.1 tensorflow-estimator-2.13.0 tensorflow-io-gcs-filesystem-0.34.0 tensorflow_hub-0.16.1 termcolor-2.4.0 tf-keras-2.15.0 typing-extensions-4.5.0 werkzeug-3.0.3 wheel-0.43.0
apache-beam 2.57.0.dev0 requires protobuf!=4.0.*,!=4.21.*,!=4.22.0,!=4.23.*,!=4.24.*,<4.26.0,>=3.20.3, but you have protobuf 4.23.3 which is incompatible.
azure-core 1.30.1 requires typing-extensions>=4.6.0, but you have typing-extensions 4.5.0 which is incompatible.
azure-storage-blob 12.20.0 requires typing-extensions>=4.6.0, but you have typing-extensions 4.5.0 which is incompatible.
fastapi 0.111.0 requires typing-extensions>=4.8.0, but you have typing-extensions 4.5.0 which is incompatible.
pydantic 2.7.1 requires typing-extensions>=4.6.1, but you have typing-extensions 4.5.0 which is incompatible.
pydantic-core 2.18.2 requires typing-extensions!=4.7.0,>=4.6.0, but you have typing-extensions 4.5.0 which is incompatible.
typeguard 4.2.1 requires typing-extensions>=4.10.0; python_version < "3.13", but you have typing-extensions 4.5.0 which is incompatible.
```
we might need to tweak beam/sdks/python/apache_beam/ml/inference/tensorflow_tests_requirements.txt, or add a special 'extra' for the TF dependency and use that to install the requirements for the test (a sketch of the extras approach is below).
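One possible shape of the 'extra' approach, as a sketch only: the extra name 'tf-test' and the pins below are illustrative assumptions, not Beam's actual setup.py contents.

```python
# sdks/python/setup.py (hypothetical excerpt): declaring the TF test
# dependencies as an extra lets pip resolve them together with
# apache-beam's own pins in a single pass, instead of installing a
# standalone requirements file on top of an existing environment.
setuptools.setup(
    # ... existing arguments ...
    extras_require={
        # ... existing extras ...
        'tf-test': [
            # Illustrative floor; the real fix would pick a TF release whose
            # typing_extensions requirement is compatible with typeguard 4.x.
            'tensorflow>=2.12',
            'tensorflow_hub>=0.14.0',
            'Pillow>=10.0.0',
        ],
    },
)
```

The test task could then install the suite's dependencies with something like `pip install 'apache-beam[tf-test]'`, so the resolver surfaces a typing-extensions conflict up front rather than silently downgrading it after the fact.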
Issue Failure
Failure: Test is continually failing
Issue Priority
Priority: 2 (backlog / disabled test but we think the product is healthy)
Issue Components
Component: Python SDK