
[Bug]: Iceberg Write failing to upgrade due to "IllegalArgumentException: unable to serialize SchemaCoder" #32795

Closed
2 of 17 tasks
chamikaramj opened this issue Oct 16, 2024 · 12 comments


@chamikaramj
Contributor

What happened?

I'm getting the following error (via the ExpansionService) when upgrading the Iceberg Write transform.

2024/10/16 02:58:02 E1016 02:58:02.041566      11 managed_transforms_worker_main.cc:138] Failed to upgrade using the expansion service manager: INTERNAL: Expansion request failed: java.lang.IllegalArgumentException: unable to serialize SchemaCoder<Schema: Fields:
2024/10/16 02:58:02 Field{name=tableIdentifierString, description=, type=STRING NOT NULL, options={{}}}
2024/10/16 02:58:02 Field{name=serializableDataFile, description=, type=ROW<path STRING NOT NULL, fileFormat STRING NOT NULL, recordCount INT64 NOT NULL, fileSizeInBytes INT64 NOT NULL, partitionPath STRING NOT NULL, partitionSpecId INT32 NOT NULL, keyMetadata BYTES, splitOffsets ARRAY<INT64 NOT NULL>, columnSizes MAP<INT32 NOT NULL, INT64 NOT NULL>, valueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nullValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, nanValueCounts MAP<INT32 NOT NULL, INT64 NOT NULL>, lowerBounds MAP<INT32 NOT NULL, BYTES NOT NULL>, upperBounds MAP<INT32 NOT NULL, BYTES NOT NULL>> NOT NULL, options={{}}}
2024/10/16 02:58:02 Encoding positions:
2024/10/16 02:58:02 {tableIdentifierString=0, serializableDataFile=1}
2024/10/16 02:58:02 Options:{{}}UUID: 1373ba11-1080-4271-b79a-985f2ff03727  UUID: 1373ba11-1080-4271-b79a-985f2ff03727 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$X4azj9mR@4a19cae6
2024/10/16 02:58:02     at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.CoderTranslation.toCustomCoder(CoderTranslation.java:158)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.CoderTranslation.toProto(CoderTranslation.java:118)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.SdkComponents.registerCoder(SdkComponents.java:284)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.PCollectionTranslation.toProto(PCollectionTranslation.java:35)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.SdkComponents.registerPCollection(SdkComponents.java:239)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.PTransformTranslation.translateAppliedPTransform(PTransformTranslation.java:610)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.ParDoTranslation$ParDoTranslator.translate(ParDoTranslation.java:184)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.PTransformTranslation.toProto(PTransformTranslation.java:277)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.SdkComponents.registerPTransform(SdkComponents.java:183)
2024/10/16 02:58:02     at org.apache.beam.sdk.util.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:96)
2024/10/16 02:58:02     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
2024/10/16 02:58:02     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)

It seems the coder for FileWriteResult is failing during translation. I'm not sure why the SchemaCoder didn't resolve properly for SerializableDataFile (it ended up falling back to SerializableCoder).
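
For reference, a minimal sketch of how the coder resolution and serialization could be checked outside the expansion service. This is not an exact reproduction of the expansion path; since SerializableDataFile is package-private, the check would have to live in org.apache.beam.sdk.io.iceberg, and it assumes the class resolves through the default schema registry (class name and placement here are for illustration only):

```java
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.schemas.NoSuchSchemaException;
import org.apache.beam.sdk.schemas.SchemaRegistry;
import org.apache.beam.sdk.util.SerializableUtils;

public class CoderResolutionCheck {
  public static void main(String[] args) throws NoSuchSchemaException {
    // Resolve the schema-based coder the same way pipeline construction would.
    SchemaRegistry registry = SchemaRegistry.createDefault();
    Coder<SerializableDataFile> coder =
        registry.getSchemaCoder(SerializableDataFile.class);

    // Round-trip the coder through Java serialization. CoderTranslation.toCustomCoder
    // does the same thing during translation, which is where the
    // IllegalArgumentException above is thrown.
    SerializableUtils.ensureSerializable(coder);
    System.out.println("coder serialized OK: " + coder);
  }
}
```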

Issue Priority

Priority: 1 (data loss / total loss of function)

Issue Components

  • Component: Python SDK
  • Component: Java SDK
  • Component: Go SDK
  • Component: Typescript SDK
  • Component: IO connector
  • Component: Beam YAML
  • Component: Beam examples
  • Component: Beam playground
  • Component: Beam katas
  • Component: Website
  • Component: Infrastructure
  • Component: Spark Runner
  • Component: Flink Runner
  • Component: Samza Runner
  • Component: Twister2 Runner
  • Component: Hazelcast Jet Runner
  • Component: Google Cloud Dataflow Runner
@ahmedabu98
Contributor

Can you check if this works? #32796

@chamikaramj
Contributor Author

Getting the same error with that PR.

@ahmedabu98 ahmedabu98 added this to the 2.60.0 Release milestone Oct 16, 2024
@ahmedabu98
Contributor

ahmedabu98 commented Oct 16, 2024

I'm seeing that Iceberg unit and integration tests started failing ~10 hours ago.

@chamikaramj can you try it with the 2.60.0 release branch? I suspect something got merged recently and only HEAD is affected but we should check if this is affecting the release too.

@ahmedabu98
Contributor

I think #32757 may be the culprit PR. The tests pass when that change is reverted: #32802

@ahmedabu98
Contributor

CC @reuvenlax

@ahmedabu98
Contributor

Full stack trace:

        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
        at org.apache.beam.sdk.util.construction.CoderTranslation.toCustomCoder(CoderTranslation.java:158)
        at org.apache.beam.sdk.util.construction.CoderTranslation.toProto(CoderTranslation.java:118)
        at org.apache.beam.sdk.util.construction.SdkComponents.registerCoder(SdkComponents.java:284)
        at org.apache.beam.sdk.util.construction.PCollectionTranslation.toProto(PCollectionTranslation.java:35)
        at org.apache.beam.sdk.util.construction.SdkComponents.registerPCollection(SdkComponents.java:239)
        at org.apache.beam.sdk.util.construction.PTransformTranslation.translateAppliedPTransform(PTransformTranslation.java:610)
        at org.apache.beam.sdk.util.construction.ParDoTranslation$ParDoTranslator.translate(ParDoTranslation.java:184)
        at org.apache.beam.sdk.util.construction.PTransformTranslation.toProto(PTransformTranslation.java:277)
        at org.apache.beam.sdk.util.construction.SdkComponents.registerPTransform(SdkComponents.java:183)
        at org.apache.beam.sdk.util.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:96)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:477)
        at org.apache.beam.sdk.util.construction.PipelineTranslation.toProto(PipelineTranslation.java:68)
        at org.apache.beam.sdk.util.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1213)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:203)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:325)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:310)
        at org.apache.beam.examples.multilanguage.Temp.runJob(Temp.java:442)
        at org.apache.beam.examples.multilanguage.Temp.main(Temp.java:289)
Caused by: java.io.NotSerializableException: org.apache.beam.sdk.schemas.utils.ByteBuddyUtils$TransformingMap
        at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1200)
        at java.base/java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1585)
        at java.base/java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1542)
        at java.base/java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1451)
        at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194)
        at java.base/java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1585)
        at java.base/java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1542)
        at java.base/java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1451)
        at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194)
        at java.base/java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1585)
        at java.base/java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1542)
        at java.base/java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1451)
        at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194)
        at java.base/java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1585)
        at java.base/java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1542)
        at java.base/java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1451)
        at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194)
        at java.base/java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358)
        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
        ... 30 more
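
The `Caused by` line is the interesting part: somewhere in the generated delegate coder's object graph there is a ByteBuddyUtils$TransformingMap, which does not implement Serializable, so Java serialization of the enclosing coder fails. A minimal self-contained sketch of that general failure mode (the classes below are stand-ins for illustration, not the actual Beam internals):

```java
import java.io.Serializable;
import org.apache.beam.sdk.util.SerializableUtils;

public class NonSerializableFieldDemo {
  // Stand-in for ByteBuddyUtils$TransformingMap: a type that does not
  // implement Serializable.
  static class TransformingMapStandIn {}

  // Stand-in for the generated delegate coder: Serializable itself, but
  // holding a non-serializable field, which poisons the whole object graph.
  static class DelegateCoderStandIn implements Serializable {
    private final TransformingMapStandIn map = new TransformingMapStandIn();
  }

  public static void main(String[] args) {
    // Throws IllegalArgumentException caused by java.io.NotSerializableException,
    // matching the shape of the stack trace above.
    SerializableUtils.serializeToByteArray(new DelegateCoderStandIn());
  }
}
```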

@Abacn
Contributor

Abacn commented Oct 16, 2024

#32757 isn't in 2.60.0; should the milestone be 2.61.0?

@reuvenlax
Contributor

I'm taking a look.

@ahmedabu98
Contributor

@Abacn yes sorry, removing now

@ahmedabu98 ahmedabu98 removed this from the 2.60.0 Release milestone Oct 16, 2024
@reuvenlax
Contributor

I'm trying to repro. When I run these tests, I see the following failure:

unable to serialize org.apache.beam.sdk.io.iceberg.ScanSource@490d9c41

However, ScanSource is a BoundedSource, not an element. Does this have anything to do with schemas?

@reuvenlax
Contributor

@chamikaramj can you see if this works? #32810

@chamikaramj
Contributor Author

Yeah, #32810 works. Thank you!

@github-actions github-actions bot added this to the 2.61.0 Release milestone Oct 17, 2024