test: mark test-fs-rmdir-recursive flaky on win #41533
Conversation
Refs: nodejs#41201

From recent reliability reports this is now the most common failure by far in CI runs. Mark the test as flaky until the issue is resolved.

Signed-off-by: Michael Dawson <[email protected]>
The issue might be fixed by #41545, so I would wait before landing this.
@bcoe any chance you could have a quick look at parallel/test-fs-rm?
@mhdawson I did notice a couple missing 6b9d2ae#diff-596c3dc1f2c2b0c3dc8f6c087b9e0f1910188ff28eaee22800367acc0e9a4f43L188. I'm not 100% sure they're the cause of the flakes, but I could imagine them causing some weirdness depending on timing. It seemed like the step that fails is actually the

Should we keep this open for a little bit and see if the flakes have gone away? How often were we seeing the failures?
That was my plan. I'll leave it open for a week or so to see how the flakes look on new PRs. If things are green (keeping my fingers crossed) then I'll go ahead and close this.
@bcoe I'll also say that today things look better in the CI, so I'm hopeful. Thanks for your help on this one.
@mhdawson if I'm reading the daily reports correctly, I'm still seeing quite a few flakes in

One thought I had was serializing the tests, so that the promise API and callback API are not being exercised at the same time; my hunch continues to be that parallel operations cause issues on Windows due to contention on files. At the very least, perhaps we will get a clearer picture of which test is failing to clean up.
There have been a few failures within the last few days, but they might be from PRs that were not rebased? I figure we should wait another week and then take a look at the reliability report again.
@bcoe unfortunately, per the latest reliability report (nodejs/reliability#185), it still looks like failures are occurring with parallel/test-fs-rmdir-recursive. The latest failure was just yesterday, and the PR being tested was only opened yesterday as well.

@lpinca I'm going to propose we land this PR. I'll agree to keep an eye on the CI, and if we no longer see the test failing in the next week or so I'll back out the change; otherwise we can do that as part of whatever future fixes/updates there are to make the test more reliable on Windows.
@mhdawson I agree, go ahead.
Refs: #41201

From recent reliability reports this is now the most common failure by far in CI runs. Mark the test as flaky until the issue is resolved.

Signed-off-by: Michael Dawson <[email protected]>
PR-URL: #41533
Reviewed-By: Ben Coe <[email protected]>
Reviewed-By: James M Snell <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Landed in 7faf763
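For context, Node.js core marks a test as flaky via an entry in the test suite's `.status` files rather than by editing the test itself. A sketch of what such an entry might look like for this case follows; the exact file contents are an assumption, not the literal diff from this PR:

```
# test/parallel/parallel.status (illustrative; not the literal diff)
[$system==win32]
# https://github.com/nodejs/node/issues/41201
test-fs-rmdir-recursive: PASS,FLAKY
```

With a `PASS,FLAKY` entry, the CI still runs the test but a failure no longer turns the run red, which is why the change is easy to back out once the underlying issue is fixed.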