[#30083] Add synthetic processing time to prism. #30492
Conversation
Codecov Report

@@ Coverage Diff @@
##           master   #30492      +/-   ##
==========================================
+ Coverage   38.52%   38.55%   +0.02%
==========================================
  Files         698      699        +1
  Lines      102374   102439       +65
==========================================
+ Hits        39442    39497       +55
- Misses      61302    61307        +5
- Partials     1630     1635        +5

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Force-pushed from 313bc5b to f434d46.
Codecov Report

@@ Coverage Diff @@
##           master   #30492       +/-   ##
=============================================
- Coverage   71.44%   38.55%   -32.90%
=============================================
  Files         906      699      -207
  Lines      113271   102439    -10832
  Branches     1076        0     -1076
=============================================
- Hits        80931    39497    -41434
- Misses      30327    61307    +30980
+ Partials     2013     1635      -378

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Force-pushed from f434d46 to ae36257.
Force-pushed from bc9b26d to c14e517.
Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control.
LGTM pending the latest check re-run passing; nothing really stood out for me except for a couple of comments.
{"Greedy", false, false}, | ||
{"AllElementsPerKey", false, true}, | ||
{"OneElementPerKey", true, false}, | ||
// {"OneElementPerBundle", true, true}, // Reveals flaky behavior |
Not PR blocking, and I'm not sure whether this might become problematic for the AllElementsPerKey and OneElementPerKey cases in the future. I commented out this flaky OneElementPerBundle case and inserted log statements after https://github.com/lostluck/beam/blob/beam30083ProcessingTimme/sdks/go/test/integration/primitives/timers.go#L197 and after https://github.com/lostluck/beam/blob/beam30083ProcessingTimme/sdks/go/test/integration/primitives/timers.go#L210. I observed that the key associated with the panic here: https://github.com/lostluck/beam/blob/beam30083ProcessingTimme/sdks/go/test/integration/primitives/timers.go#L213 never appeared in the aforementioned logged steps. I haven't yet figured out why, but wanted to relay my findings.
I had not observed that specific behavior (the key not being in the logs in an earlier part of the same call). Very interesting. The latter log is designed to catch that specific flake a bit more clearly than the downstream handling. We'll get it! I believe in us.
envID                string
stateful             bool
hasTimers            []string
processingTimeTimers map[string]bool
Is https://github.com/lostluck/beam/blob/beam30083ProcessingTimme/sdks/go/pkg/beam/runners/prism/internal/engine/elementmanager.go#L843 the reason why we only have a map of processing time timers and not event time timers?
There are only two time domains (event time and processing time), and since event time was implemented first, it is the default, while processing time needs to be called out specifically.
There are certainly improvements we can make to how these are handled, but ultimately we need a mapping from timer names to their domains in the element manager.
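To make that mapping concrete, here is a minimal sketch of the idea, not Prism's actual code: the type and method names below are illustrative assumptions, with event time as the zero-value default so that only processing-time families need to be flagged, mirroring the processingTimeTimers map[string]bool field quoted above.

```go
package main

import "fmt"

// timeDomain distinguishes Beam's two timer domains.
type timeDomain int

const (
	eventTime timeDomain = iota // the default domain
	processingTime
)

// stageTimers is a hypothetical per-stage record of declared timer families.
type stageTimers struct {
	families map[string]timeDomain
}

// declare records a timer family; only processing-time families need to be
// called out explicitly, since event time is the zero-value default.
func (st *stageTimers) declare(family string, isProcessingTime bool) {
	if st.families == nil {
		st.families = map[string]timeDomain{}
	}
	if isProcessingTime {
		st.families[family] = processingTime
		return
	}
	st.families[family] = eventTime
}

// domain looks up a family's domain; missing families report eventTime.
func (st *stageTimers) domain(family string) timeDomain {
	return st.families[family]
}

func main() {
	var st stageTimers
	st.declare("outputCallback", true) // hypothetical processing-time timer family
	st.declare("gc", false)            // hypothetical event-time timer family
	fmt.Println(st.domain("outputCallback") == processingTime) // true
	fmt.Println(st.domain("gc") == eventTime)                  // true
}
```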
// If there are no watermark refreshes available, we wait until there are.
-	for len(em.watermarkRefreshes) == 0 {
+	for len(em.watermarkRefreshes)+len(ptRefreshed) == 0 { // TODO Add processing time event condition instead of piggybacking on watermarks?
Not PR blocking, but I'm curious what "processing time event condition" means, especially with respect to the "instead of" in the TODO statement.
In this case, it's a leftover comment. The "processing time event condition" is basically that list of stages that have been triggered by the processing-time queue. I originally kept everything in the watermark refreshes instead, but that didn't work as cleanly as I hoped.
Old comment removed, other comment clarified.
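For illustration only, a rough sketch of that shape, using the em.watermarkRefreshes and ptRefreshed identifiers from the quoted snippet but otherwise assumed names: the loop simply blocks until either set has a stage to advance, then drains both.

```go
package main

import "fmt"

// refreshSet is a hypothetical set of stage IDs that are ready to advance.
type refreshSet map[string]struct{}

// readyStages blocks (via wait) until either the watermark refreshes or the
// processing-time refreshes contain at least one stage, then drains both,
// mirroring the combined condition in the quoted loop.
func readyStages(watermarkRefreshes, ptRefreshed refreshSet, wait func()) []string {
	for len(watermarkRefreshes)+len(ptRefreshed) == 0 {
		wait() // in Prism this would block on the element manager's condition variable
	}
	var stages []string
	for id := range watermarkRefreshes {
		stages = append(stages, id)
		delete(watermarkRefreshes, id)
	}
	for id := range ptRefreshed {
		stages = append(stages, id)
		delete(ptRefreshed, id)
	}
	return stages
}

func main() {
	wm := refreshSet{}
	pt := refreshSet{"stage2": {}} // stage woken by the processing-time queue
	fmt.Println(readyStages(wm, pt, func() {})) // [stage2]
}
```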
The only failure was the Dataflow run of the ProcessingTime_Bounded test, which I had not filtered out. Dataflow only properly handles processing time in streaming pipelines, and that test pipeline executes as "batch".
Adds a ProcessingTime queue to Prism's element manager to handle ProcessingTime timers, which can be appropriately controlled by the TestStream notion of time.
The basic design is that the queue in the ElementManager doesn't contain the elements themselves; instead it manages which stage needs to be notified, and when, to inject queued elements into processing, and in particular it keeps this in sync with existing watermark processing. A rough sketch of the idea follows below.
This allows each stage to continue maintaining all of its own state, instead of splitting it between the element manager and the stages.
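As a purely illustrative sketch of that design (the real ElementManager types and names differ), the queue can be a time-ordered heap that maps a firing time to the stage ID to notify, while the elements themselves remain in the stage's own state. Advancing the synthetic clock (for example via TestStream) pops the stages that are now due.

```go
package main

import (
	"container/heap"
	"fmt"
	"time"
)

// ptEvent says "notify this stage at this (synthetic) time"; the elements
// themselves stay in the stage's own state.
type ptEvent struct {
	fireAt  time.Time
	stageID string
}

// ptQueue is a hypothetical min-heap of pending processing-time notifications.
type ptQueue []ptEvent

func (q ptQueue) Len() int           { return len(q) }
func (q ptQueue) Less(i, j int) bool { return q[i].fireAt.Before(q[j].fireAt) }
func (q ptQueue) Swap(i, j int)      { q[i], q[j] = q[j], q[i] }
func (q *ptQueue) Push(x any)        { *q = append(*q, x.(ptEvent)) }
func (q *ptQueue) Pop() any {
	old := *q
	n := len(old)
	e := old[n-1]
	*q = old[:n-1]
	return e
}

// schedule registers a stage to be woken at t.
func schedule(q *ptQueue, stageID string, t time.Time) {
	heap.Push(q, ptEvent{fireAt: t, stageID: stageID})
}

// advanceTo returns the stages whose timers are due at or before the current
// synthetic clock reading.
func advanceTo(q *ptQueue, now time.Time) []string {
	var due []string
	for q.Len() > 0 && !(*q)[0].fireAt.After(now) {
		due = append(due, heap.Pop(q).(ptEvent).stageID)
	}
	return due
}

func main() {
	q := &ptQueue{}
	heap.Init(q)
	base := time.Unix(0, 0)
	schedule(q, "stageA", base.Add(10*time.Second))
	schedule(q, "stageB", base.Add(5*time.Second))
	fmt.Println(advanceTo(q, base.Add(7*time.Second))) // [stageB]
}
```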
This doesn't yet complete #30083, which requires adding the hooks to handle real processing time via a clock. However, this should unblock simple test usage. Real-time settings would require pipeline plumbing to enable or disable real-time behavior, and support for injecting ProcessContinuation elements into the queue as well.
As the PR was getting very large, we stuck simply to the timer case.