Bug description

At present it is possible to create a BigQuery datasource which uses an explicit key by creating an `import_datasources.yaml` file that contains something similar to the block below:
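For illustration, a minimal sketch of such a file with placeholder values (the exact field set and the `credentials_info` layout are assumptions and vary by Superset version and BigQuery driver):

```yaml
database_name: bigquery_example
sqlalchemy_uri: bigquery://example-project
expose_in_sqllab: true
# Inline service-account key; placeholder values throughout.
encrypted_extra: |
  {
    "credentials_info": {
      "type": "service_account",
      "project_id": "example-project",
      "private_key_id": "0000000000",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "client_email": "superset@example-project.iam.gserviceaccount.com"
    }
  }
uuid: 00000000-0000-0000-0000-000000000000
version: 1.0.0
```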
This can be imported using `superset import_datasources -p filename.yaml` from a running instance.
However, if I wish to import the same data structure using the REST API `/api/v1/database/import`, it will fail.

It seems that the REST version applies additional schema checks on the bundled zipfile and fails because `encrypted_extra` is not part of `ImportV1DatabaseSchema`.
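For context, the REST endpoint takes a zip bundle (a `metadata.yaml` plus `databases/*.yaml`) rather than the bare YAML file. A rough sketch of the call, assuming a local instance and DB-auth login (host, credentials, and bundle path are placeholders):

```python
import requests

base = "http://localhost:8088"
session = requests.Session()

# Obtain a JWT access token (assumes the "db" auth provider).
token = session.post(
    f"{base}/api/v1/security/login",
    json={"username": "admin", "password": "admin", "provider": "db", "refresh": True},
).json()["access_token"]

# The import endpoint expects the zip bundle as a multipart upload
# under the "formData" key. Depending on configuration, a CSRF token
# may also be required.
with open("database_export.zip", "rb") as f:
    resp = session.post(
        f"{base}/api/v1/database/import/",
        headers={"Authorization": f"Bearer {token}"},
        files={"formData": ("database_export.zip", f, "application/zip")},
    )
print(resp.status_code, resp.text)
```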
Screenshots/recordings
No response
Superset version
master / latest-dev
Python version
3.11
Node version
16
Browser
Chrome
Additional context
No response
Checklist
I have searched Superset docs and Slack and didn't find a solution to my problem.
I have searched the GitHub issue tracker and didn't find a similar bug report.
I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.
Hey @withnale! I'm here to help you with your Superset issue. I'm a bot designed to assist with bugs, questions, and contributions. Let's tackle this together!
A similar issue was previously reported and closed. The issue was about importing a Snowflake database with the encrypted_extra field, which was resolved by a pull request that allowed the encrypted_extra field to be imported [1].
In your case, the failure when importing a BigQuery datasource with `encrypted_extra` parameters via the REST API `/api/v1/database/import` is likely due to schema validation that expects `encrypted_extra` to be renamed to `masked_encrypted_extra`. The `superset import_datasources -p filename.yaml` command handles this renaming automatically, but the REST API does not, so the import fails unless `encrypted_extra` is manually renamed to `masked_encrypted_extra` in the payload [2].
To comment on the bot notes above: I've seen the `masked_encrypted_extra` field and also tried that. It is another entry that isn't present in `ImportV1DatabaseSchema`.
I think the reasoning is that when we export the database we strip passwords and encrypted extra, so the import doesn't expect it. But people can add it manually, and the import should accept it and use it if present.
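A rough sketch of what accepting the field could look like in `ImportV1DatabaseSchema` (hypothetical and abbreviated, based on the Marshmallow schemas in `superset/databases/schemas.py`; not the actual change):

```python
from marshmallow import Schema, fields

class ImportV1DatabaseSchema(Schema):
    # Existing fields abbreviated.
    database_name = fields.String(required=True)
    sqlalchemy_uri = fields.String(required=True)
    # Hypothetical addition: accept a manually supplied key.
    # Optional, because exports strip secrets by default.
    masked_encrypted_extra = fields.String(allow_none=True)
```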
Does it make sense to ensure the same validations take place on the CLI and REST invocations? It seems strange that an object with `encrypted_extra` can sneak through the schema validation on the CLI version.