
Add extra marks to VertexAI IT tests #28534

Merged: 5 commits, Sep 19, 2023

Conversation

jrmccluskey
Contributor

Mark the Vertex AI IT tests as using tensorflow and add an extra check to ensure the dependency has been installed before running the test suite.
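The change described above boils down to a small pattern: a guarded import plus a skip condition. A minimal stdlib-only sketch (class and test names here are illustrative, not the actual Beam test module):

```python
import unittest

# Guarded import, as the PR describes: the module must still import when
# tensorflow is absent so the suite can skip instead of erroring out.
try:
    import tensorflow as tf
except ImportError:
    tf = None


class VertexAIInferenceSketch(unittest.TestCase):
    # The real suite additionally carries pytest markers such as
    # @pytest.mark.uses_tf; they are omitted here to keep this sketch
    # runnable with the standard library alone.
    @unittest.skipIf(
        tf is None, 'Missing dependencies. '
        'Test depends on tensorflow')
    def test_depends_on_tf(self):
        self.assertIsNotNone(tf)
```

With tensorflow installed the test runs normally; without it, the case is reported as skipped rather than as an error.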


Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

  • Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
  • Update CHANGES.md with noteworthy changes.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

See the Contributor Guide for more tips on how to make the review process smoother.

To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md

GitHub Actions Tests Status (on master branch)

Build python source distribution and wheels
Python tests
Java tests
Go tests

See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.

@jrmccluskey
Contributor Author

Run Python 3.8 PostCommit

@codecov

codecov bot commented Sep 19, 2023

Codecov Report

Merging #28534 (3bea45a) into master (8f60924) will increase coverage by 0.01%.
Report is 77 commits behind head on master.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master   #28534      +/-   ##
==========================================
+ Coverage   72.33%   72.34%   +0.01%     
==========================================
  Files         683      683              
  Lines      100709   100814     +105     
==========================================
+ Hits        72843    72938      +95     
- Misses      26288    26298      +10     
  Partials     1578     1578              
Flag     Coverage Δ
python   82.85% <ø> (+0.01%) ⬆️

Flags with carried forward coverage won't be shown. Click here to find out more.

see 18 files with indirect coverage changes


@jrmccluskey
Contributor Author

Run Python 3.8 PostCommit

@github-actions
Contributor

Assigning reviewers. If you would like to opt out of this review, comment assign to next reviewer:

R: @damccorm for label python.

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

The PR bot will only process comments in the main thread (not review comments).

Contributor

@damccorm left a comment


LGTM, let's make sure that the postcommit passes before merging though

@@ -45,6 +47,10 @@
 class VertexAIInference(unittest.TestCase):
   @unittest.skipIf(
       tf is None, 'Missing dependencies. '
Contributor


Given that we already have @pytest.mark.uses_vertex_ai, I wouldn't do this skip. We might accidentally silently disable the test and won't know it's not running.
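The silent-disable risk is easy to reproduce. A hypothetical stdlib-only demonstration (not Beam code; names are made up for illustration):

```python
import unittest

tf = None  # simulate tensorflow missing from the environment


class SilentSkipDemo(unittest.TestCase):
    @unittest.skipIf(tf is None, 'tensorflow not installed')
    def test_inference(self):
        raise AssertionError('never reached when tf is missing')


result = unittest.TestResult()
unittest.TestLoader().loadTestsFromTestCase(SilentSkipDemo).run(result)
# One test "ran", one skip, zero failures: the suite looks green even
# though the dependency (and therefore the actual check) is gone.
print(result.testsRun, len(result.skipped), len(result.failures))
```

A marker-selected suite that expects this test to execute would never surface the missing dependency.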

Contributor


do we need to remove @pytest.mark.it_postcommit if this test is accidentally included in a suite where it is not supposed to run?

Contributor

@tvalentyn Sep 19, 2023


actually, this is almost the same as the statements in line 32. I guess I have a general question: how do we make sure this test runs in the suite it is supposed to run in? It sounds like this test was passing before and is now failing because TF is no longer installed in Beam containers, and the spirit of this change is to skip the test.

sounds like something might be misconfigured.

  • If it was included in a suite that picks up all it_postcommit tests, not a Vertex AI-specific one, that's not expected. Is it because we incorrectly included the it_postcommit annotation? Then we should remove that.
  • If the test was included in a special Vertex AI suite, which we run as part of postcommits, and now starts to fail, I'd expect to see changes adding tensorflow to a requirements file, to make the test pass again.
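For reference, marker-driven suite selection along these lines is wired through pytest configuration; a hypothetical pytest.ini fragment (marker names taken from this thread, descriptions assumed):

```ini
# hypothetical pytest.ini fragment; marker names as used in this thread
[pytest]
markers =
    it_postcommit: integration tests collected by the Python postcommit suite
    uses_vertex_ai: tests that call Vertex AI, collected by the Vertex AI suite
    uses_tf: tests that depend on tensorflow
```

Each suite then collects with a marker expression such as `pytest -m uses_vertex_ai`, which is why a test carrying two collection markers can end up running in two suites.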

Contributor


looks like we have [snippet elided], so we probably need to remove the it_postcommit annotation from this test to not include it twice.

Contributor Author


All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.

Contributor Author


Disregard that, it's WAI; the limited log output just obscured the rest of the suites listed.

Contributor


> All of our other IT tests (tensorflow model handler IT test included) also have the it_postcommit annotation.

So, if we look at tests that have uses_tf, we have two kinds of tests: unit tests and postcommit IT tests (the latter run a whole pipeline).
Since for Vertex AI you don't have unit tests and only have postcommit tests, I think you can simplify the markers in your suite to the single marker uses_vertex_ai, and collect that test in the postcommit suite.

Contributor

@tvalentyn Sep 19, 2023


> What's weird is that we're getting the gradle task invoked separately and not as part of the postcommit

postCommitIT is one test suite out of many that run in the postcommit Jenkins / GHA job. The VertexAI suite is another suite running in parallel.

tasks.register("python38PostCommit") {

Contributor


to clarify: the vertex ai suite is a part of the inferencePostCommitIT suite:

project.tasks.register("inferencePostCommitIT") {

Contributor Author


[screenshot]

Tensorflow is getting installed and the test isn't getting skipped for lacking tensorflow, but the test is running on Dataflow where the worker doesn't have tensorflow - https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-19_10_21_28-17008252103869768778?project=apache-beam-testing
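Shipping the missing dependency to Dataflow workers is done with a requirements file. A hypothetical sketch of the pipeline arguments (flag names follow Beam's documented pipeline options; the file path and project values are illustrative):

```python
# Hypothetical args for the IT pipeline; '--requirements_file' tells the
# runner to stage the listed packages so Dataflow workers install them
# at startup (e.g. a requirements.txt containing "tensorflow").
pipeline_args = [
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--requirements_file=requirements.txt',
]
# In the test these would feed PipelineOptions(pipeline_args) before
# running the Vertex AI inference pipeline.
```

Without this, the test process imports tensorflow fine (so skipIf never fires) while the remote workers fail at pipeline execution time.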

@unittest.skipIf(
    tf is None, 'Missing dependencies. '
    'Test depends on tensorflow')
@pytest.mark.uses_tf
Contributor


is this necessary?

Contributor


it's ok to keep. Looks like we already defined this marker and use it in other tests.

Contributor Author


Perhaps not, but considering the postcommits went red without a clear reason, I'd rather cover all of our bases here. I would have thought that missing TF would have gotten caught in the try-except-skip import block around the pipeline code, but it didn't.

Contributor


actually, I take it back. I think this decorator is serving a different purpose, see

/bin/sh -c 'pytest -o junit_suite_name={envname} --junitxml=pytest_{envname}.xml -n 6 -m uses_tf {posargs}; ret=$?; [ $ret = 5 ] && exit 0 || exit $ret'

Contributor


adding this annotation would make tox pick up this test in some other unit test suites, and then skip it because the vertex ai deps are not installed.

@jrmccluskey
Contributor Author

Run Python 3.8 PostCommit

@jrmccluskey
Contributor Author

Specifying the requirements file for the IT test seems to have resolved the issue - https://ci-beam.apache.org/job/beam_PostCommit_Python38_PR/777/

[screenshot]

@jrmccluskey
Contributor Author

Run Python_Coverage PreCommit

@tvalentyn tvalentyn merged commit a1ec055 into apache:master Sep 19, 2023
77 checks passed
@tvalentyn
Contributor

somehow postcommits are still failing https://ci-beam.apache.org/job/beam_PostCommit_Python38/4612/

@tvalentyn
Contributor

sent #28571

3 participants