runStream returning incomplete and out-of-bounds for non-existing values #1959
Comments
@surbhigarg92 Can we get any update on this? Spanner runStream is causing issues with swapped column values.
Hi @saranyasengo, sorry for the delay. I haven't been able to look into this issue. Can you please share the below information to help me reproduce it?
Hey @surbhigarg92, I believe the issue has existed for a long time, but we only caught it recently because it threw an out-of-bounds error. The actual problem is that column values are swapped. We recently upgraded to @google-cloud/spanner version 7.0.0 and still see values getting swapped. The table schema is safe to share. We have created a Google support ticket for this, and I will share the table schema, sample data, and how the spanner runStream returns the data in the support ticket.
🤖 I have created a release *beep* *boop*

---

## [7.4.0](https://togithub.com/googleapis/nodejs-spanner/compare/v7.3.0...v7.4.0) (2024-02-23)

### Features

* **spanner:** Add PG.OID support ([#1948](https://togithub.com/googleapis/nodejs-spanner/issues/1948)) ([cf9df7a](https://togithub.com/googleapis/nodejs-spanner/commit/cf9df7a54c21ac995bbea9ad82c3544e4aff41b6))
* Untyped param types ([#1869](https://togithub.com/googleapis/nodejs-spanner/issues/1869)) ([6ef44c3](https://togithub.com/googleapis/nodejs-spanner/commit/6ef44c383a90bf6ae95de531c83e21d2d58da159))
* Update TransactionOptions to include new option exclude_txn_from_change_streams ([#1998](https://togithub.com/googleapis/nodejs-spanner/issues/1998)) ([937a7a1](https://togithub.com/googleapis/nodejs-spanner/commit/937a7a13f8c7660e21d34ebbaecad426b2bacd99))

### Bug Fixes

* **deps:** Update dependency google-gax to v4.3.1 ([#1995](https://togithub.com/googleapis/nodejs-spanner/issues/1995)) ([bed4832](https://togithub.com/googleapis/nodejs-spanner/commit/bed4832445e72c7116fe5495c79d989664220b38))
* Only reset pending value with resume token ([#2000](https://togithub.com/googleapis/nodejs-spanner/issues/2000)) ([f337089](https://togithub.com/googleapis/nodejs-spanner/commit/f337089567d7d92c9467e311be7d72b0a7dc8047)), closes [#1959](https://togithub.com/googleapis/nodejs-spanner/issues/1959)

---

This PR was generated with [Release Please](https://togithub.com/googleapis/release-please). See [documentation](https://togithub.com/googleapis/release-please#release-please).
@saranyasengo Can you please try using the latest release, https://github.com/googleapis/nodejs-spanner/releases/tag/v7.4.0, and let us know if this resolves your issue?
@surbhigarg92 I will update to the latest version. Since the issue is not reproducible locally and only occurs about once a week, I will update, monitor for a couple of weeks, and report back. Can you please share the reproduction steps so we can confirm before moving to the latest release?
We found that the error was happening in a particular case where the pending value was getting cleared when a PartialResultSet had a resume token. Please refer to #2000 for the fix and the test cases added.
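To make that failure mode concrete, here is a minimal sketch of the chunk-merging idea, under the assumption that a value split across PartialResultSet chunks is held back as a "pending" tail and merged with the first value of the next chunk. The class and field names below are illustrative only, not the library's actual implementation; see partial-result-stream.js and #2000 for the real code.

```js
// Simplified sketch of how a PartialResultSet stream stitches values together.
// Names are illustrative only; see partial-result-stream.js and PR #2000 for
// the real implementation in @google-cloud/spanner.
class ChunkMerger {
  constructor() {
    this.pending = undefined; // tail value awaiting its continuation
  }

  // chunk: { values: [...], chunkedValue: boolean, resumeToken?: Buffer }
  addChunk(chunk) {
    const values = [...chunk.values];

    // If the previous chunk ended mid-value, merge its tail with the first
    // value of this chunk instead of treating them as two separate columns.
    if (this.pending !== undefined && values.length > 0) {
      values[0] = this.pending + values[0];
      this.pending = undefined;
    }

    // Hold back the last value if it is itself incomplete.
    if (chunk.chunkedValue) {
      this.pending = values.pop();
    }

    // Bug fixed in #2000: the pending value must NOT be dropped just because
    // this chunk carries a resumeToken; clearing it there shifts every later
    // value by one column, which is why column values appeared swapped.
    return values;
  }
}
```

Dropping the pending tail on a resume token is also what could surface as an out-of-bounds error: a partial INT64 string gets treated as a complete value for the wrong column.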
@surbhigarg92 I have upgraded to 7.4.0 and the original issue has not appeared for the past 2 days, but we are now facing a different type of issue, so we rolled back to the previous version. The specific error we are getting is:
@saranyasengo The error looks like it is caused by a recently launched feature, #1869. I am working on fixing this. Sorry for the inconvenience.
@saranyasengo As communicated earlier, the error was happening because of a recent feature launch. We have reverted that feature for the time being. Can you please try the 7.5.0 release?
@surbhigarg92 We will use 7.5.0 and monitor for some days to confirm the fix.
@saranyasengo Let us know if this worked for you |
@surbhigarg92 It works; so far we have not faced this issue.
@saranyasengo We are closing this ticket. Please feel free to reopen if you face any issues. |
Environment details

@google-cloud/spanner version: 5.18.0

Steps to reproduce
This error happens inconsistently; it is similar to issues #197 and #180.
The code that gives the error:
```js
const rowsStream = spannerDb.runStream({
  sql: `SELECT * FROM ${TABLE_NAME} ORDER BY storeId, started DESC`,
  json: true,
});
```
The error thrown is below:
```
GoogleError: Serializing column "started" encountered an error: Integer 211959131678161148 is out of bounds. Call row.toJSON({ wrapNumbers: true }) to receive a custom type.
    at Int.valueOf (/opt/app/node_modules/@google-cloud/spanner/build/src/codec.js:121:19)
    at convertValueToJson (/opt/app/node_modules/@google-cloud/spanner/build/src/codec.js:263:22)
    at Object.convertFieldsToJson (/opt/app/node_modules/@google-cloud/spanner/build/src/codec.js:239:31)
    at Array.value (/opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:206:38)
    at PartialResultStream._addValue (/opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:187:34)
    at /opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:163:24
    at Array.forEach (<anonymous>)
    at PartialResultStream._addChunk (/opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:162:16)
    at PartialResultStream._transform (/opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:85:24)
    at Transform._write (node:internal/streams/transform:175:8)
    at obj.<computed> [as _write] (/opt/app/node_modules/stubs/index.js:26:26)
    at doWrite (node:internal/streams/writable:411:12)
    at clearBuffer (node:internal/streams/writable:572:7)
    at onwrite (node:internal/streams/writable:464:7)
    at node:internal/streams/transform:190:7
    at PartialResultStream._tryResume (/opt/app/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:108:13)
```
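The error message itself points at a possible workaround for the out-of-bounds part. Below is a sketch, assuming rows are consumed as row objects (without `json: true`) and converted manually with the `row.toJSON({ wrapNumbers: true })` call named in the error, so INT64 values larger than `Number.MAX_SAFE_INTEGER` come back wrapped instead of throwing. `spannerDb` and `TABLE_NAME` are taken from the snippet above.

```js
// Sketch of the workaround suggested by the error message: consume rows as
// row objects and convert them with wrapNumbers so out-of-range INT64 values
// are returned as wrapped objects instead of throwing.
spannerDb
  .runStream({
    sql: `SELECT * FROM ${TABLE_NAME} ORDER BY storeId, started DESC`,
  })
  .on('data', row => {
    // wrapNumbers: true returns out-of-range integers as wrapped values
    // rather than throwing "Integer ... is out of bounds".
    const json = row.toJSON({ wrapNumbers: true });
    console.log(json.started);
  })
  .on('error', err => {
    console.error(err);
  })
  .on('end', () => {
    console.log('done');
  });
```

Note this only avoids the serialization error; it does not address the underlying swapped-column behavior described above.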
Thanks!