
[Epic] Increase SQL Test Coverage #374

Open · dehume opened this issue Nov 20, 2023 · 5 comments

dehume (Contributor) commented Nov 20, 2023

The Terraform provider has to maintain its own client to generate and execute SQL DDL. As part of maintaining the provider, we need to ensure that the client has as close to full coverage as possible. Within the provider, there are several places where we can test SQL generation:

  • pkg/materialize - This package defines all the base builders we use to generate SQL
  • pkg/resources - This package defines all the Terraform resources and how the attributes defined by the resources are passed to the builders
  • pkg/provider - Contains the acceptance tests that execute small snippets of Terraform code against the latest Materialize image (Create, Destroy, Update)
  • integration - Maintains a functional Terraform project executed against the latest Materialize image (Create, Plan, Destroy)
| Package | Test Type | GitHub Action Time | Notes |
| --- | --- | --- | --- |
| pkg/materialize | Unit | < 1 sec | |
| pkg/resources | Unit | < 1 sec | |
| pkg/provider | Integration | 40 mins | Requires additional services (Docker Compose) |
| integration | Integration | 10 mins | Requires additional services (Docker Compose) |

In trying to expand test coverage, the easiest additional tests would be in materialize and resources, where we can add any number of permutations. Currently we try to write tests based on our docs, but if there is a place in testing with a more exhaustive list, we can use that. And while it is slightly redundant, we will need to include tests for both materialize and resources to ensure the correct attributes are passed to and parsed by the builders.
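To illustrate the permutation-driven unit tests this enables, here is a minimal, self-contained sketch. `clusterBuilder` and its option syntax are simplified stand-ins for the real builders in pkg/materialize, not the provider's actual API:

```go
package main

import (
	"fmt"
	"strings"
)

// clusterBuilder is a hypothetical, simplified stand-in for the
// SQL builders in pkg/materialize; the real API may differ.
type clusterBuilder struct {
	name              string
	size              string
	replicationFactor int
}

// create renders the CREATE CLUSTER statement for the configured options.
func (b *clusterBuilder) create() string {
	var q strings.Builder
	fmt.Fprintf(&q, "CREATE CLUSTER %q", b.name)
	var opts []string
	if b.size != "" {
		opts = append(opts, fmt.Sprintf("SIZE = '%s'", b.size))
	}
	if b.replicationFactor > 0 {
		opts = append(opts, fmt.Sprintf("REPLICATION FACTOR = %d", b.replicationFactor))
	}
	if len(opts) > 0 {
		q.WriteString(" (" + strings.Join(opts, ", ") + ")")
	}
	q.WriteString(";")
	return q.String()
}

func main() {
	// Each attribute permutation becomes one cheap table-driven test case.
	cases := []struct {
		builder clusterBuilder
		want    string
	}{
		{clusterBuilder{name: "c"}, `CREATE CLUSTER "c";`},
		{clusterBuilder{name: "c", size: "3xsmall"}, `CREATE CLUSTER "c" (SIZE = '3xsmall');`},
		{clusterBuilder{name: "c", size: "3xsmall", replicationFactor: 2},
			`CREATE CLUSTER "c" (SIZE = '3xsmall', REPLICATION FACTOR = 2);`},
	}
	for _, c := range cases {
		if got := c.builder.create(); got != c.want {
			panic(fmt.Sprintf("got %q, want %q", got, c.want))
		}
	}
	fmt.Println("all builder cases passed")
}
```

The same table-driven shape would be repeated in pkg/resources to check that Terraform attributes reach the builder with the right values.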

The integration tests give us the most confidence but require the most additional work to set up, especially when testing all the various configuration permutations that depend on additional infrastructure (such as PrivateLink).

dehume (Contributor, Author) commented Nov 21, 2023

Discussing with QA how we can improve this.

benesch (Member) commented Nov 21, 2023

> Short term - include all tests from https://github.com/MaterializeInc/materialize/tree/main/test/sqllogictest

Wait, sorry, can you say more about this idea? I'm not sure how we'd plug in SLT to Terraform.

> Longer term - include something related to Terraform as part of the Nightly

I don't know that we're seeing a lot of bugs resulting from backwards incompatible changes in Materialize breaking the Terraform provider?

Most of the bugs seem to be a result of folks using the Terraform provider in ways that we don't test for. E.g., combinations of options that we've not tested, or, more often, issuing plan updates in a way that updates or deletes existing resources.

I'm not sure there's much of a solution here besides building out a massive suite of acceptance tests in this repository that covers more ground.

dehume (Contributor, Author) commented Nov 21, 2023

For sqllogictest, I mostly meant we can use it to ensure we cover all the SQL use cases within the Terraform provider as unit tests (in the materialize and resources packages). I'll feel a lot more comfortable knowing that both repos are using the same use cases. As you said, this starts by greatly expanding the test coverage.

benesch (Member) commented Nov 22, 2023

Sorry, still not sure I follow! sqllogictest is largely about testing the logic of SELECT statements—something which the Terraform provider, funnily enough, doesn't care about at all. The testdrive suite has a lot more tests that do DDL (e.g., CREATE SOURCE, CREATE SINK), but that test suite similarly cares a lot more about Materialize's behavior—e.g., if I create this sink, do I get the records I expect in Kafka, etc.

The Terraform provider doesn't care about the Materialize-side behavior of what it creates at all, as long as it creates it successfully! E.g., for a Kafka sink, the test suite here doesn't need to verify that the sink writes anything in particular to Kafka—just needs to verify that the Terraform materialize_sink_kafka resource is correctly translated into a sink in Materialize with the state you'd expect in Terraform and in the mz_sinks table in Materialize.
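A check along those lines can be a simple catalog query after the apply. For instance (the sink name here is illustrative):

```sql
-- Verify the sink landed in Materialize's catalog with the expected shape.
SELECT name, type
FROM mz_sinks
WHERE name = 'my_kafka_sink';
```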

So, I'm not sure that you need to go spelunking through the sqllogictest/testdrive suites to determine what to test here! I'd advocate for a coverage-focused approach. As a north star, the ideal would be that for every resource in the Terraform provider, we have coverage of setting and changing every property to a wide variety of values, both valid and invalid. This is no small feat, and of course you can't test every possible combination of properties, so there's an art to structuring the tests to cover the important cases, without massive duplication of tests or exploding the runtime of the test suite.

One thing we should explore here is code coverage: https://go.dev/blog/integration-test-coverage. It's proven valuable in the database repo. Ideally we'd have 100% code coverage of the Terraform provider. That certainly doesn't guarantee you've tested all the important cases (e.g., adding and removing grants over multiple terraform applys), but it's a good proximate objective.
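The workflow from that blog post (Go 1.20+) could look roughly like this; the `make testacc` target is an assumption about this repo's Makefile, and the acceptance suite must run the instrumented provider binary with `GOCOVERDIR` set for counters to be written:

```shell
# Build the provider binary with coverage instrumentation (Go 1.20+).
go build -cover -o terraform-provider-materialize .

# Collect coverage counters from the instrumented binary while the
# acceptance suite runs ("make testacc" is an assumed target).
mkdir -p coverage
GOCOVERDIR=$PWD/coverage make testacc

# Summarize the collected counters per package.
go tool covdata percent -i=coverage

# Or convert to the classic profile format for per-function reporting.
go tool covdata textfmt -i=coverage -o=coverage.out
go tool cover -func=coverage.out
```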

By the way, I generally find that unit tests don't provide much value. Not zero value, but not much! They're easy to write and they feel protective, but in my experience they don't tend to surface many bugs. You mostly just lock in the bugs that you have. E.g., the provider is generating SQL that Materialize won't accept, and the test just verifies that the provider produces that incorrect SQL. The bugs tend to be in the integration between Materialize and the provider, and the only way to find those bugs is to test the integration itself.

I'd advocate for plowing as much possible bandwidth into the acceptance tests rather than into unit tests—especially acceptance tests that simulate more real world use cases, like evolving sources and grants over multiple terraform applys, manually dropping resources on the Materialize side and ensuring that the provider doesn't choke on the next plan—that sort of thing!
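A multi-apply scenario like that boils down to applying one config, editing it, and applying again against the same state. As a sketch (resource and attribute names below are illustrative assumptions, not the provider's confirmed schema):

```terraform
# Apply 1: create the resource.
resource "materialize_cluster" "example" {
  name = "example"
}
```

```terraform
# Apply 2: rename the cluster and layer on a grant. The provider should
# update or replace cleanly, not error or orphan state.
# (materialize_cluster_grant is a hypothetical resource name.)
resource "materialize_cluster" "example" {
  name = "example_renamed"
}

resource "materialize_cluster_grant" "reader" {
  role_name    = "reader"
  privilege    = "USAGE"
  cluster_name = materialize_cluster.example.name
}
```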

The places where you need to rely on unit testing are hopefully relatively few and far between. You'll need it for testing Kafka sources that use PrivateLink connections, as there's no way to write an acceptance test for that without hitting real AWS, and that's probably not worth the trouble. But we can help you pull in some of the mzcompose workflows from the database repo that stand up Kafka and an SSH bastion, for example, and then the Terraform provider can have real acceptance tests that use SSH bastions (if you don't have them already—I didn't check!).

dehume (Contributor, Author) commented Nov 23, 2023

Resource Test Coverage

| Resource | Assigned | Notes | PRs |
| --- | --- | --- | --- |
| Cluster Replicas | @dehume | | #387 |
| Clusters | @dehume | | #387 |
| Connection - AWS Privatelink | @bobbyiliev | | #387 |
| Connection - Confluent Schema Registry | @bobbyiliev | | #388 |
| Connection - Kafka | @bobbyiliev | | #388 |
| Connection - Postgres | @bobbyiliev | | #388 |
| Connection - SSH | @bobbyiliev | | #388 |
| Databases | @dehume | | #387 |
| Grants | @dehume | | #387 |
| Grants Default | @dehume | | #387 |
| Indexes | @dehume | | #387 |
| Materialized Views | @dehume | | #387 |
| Roles | @dehume | | #387 |
| Schemas | @dehume | | #387 |
| Secrets | @dehume | | #387 |
| Sinks - Kafka | @dehume | | #387 |
| Sources - Kafka | @dehume | | #387 |
| Sources - Load Generator | @dehume | | #387 |
| Sources - Postgres | @dehume | | #387 |
| Sources - Webhook | @dehume | | #387 |
| Tables | @dehume | | #387 |
| Types | @dehume | | #387 |
| Views | @dehume | | #387 |

@dehume removed their assignment Dec 13, 2023