
checking in changes in verify-loader script for log loss simulation #17

Open
wants to merge 5 commits into base: main

Conversation

atrimandal

Script changes for simulating log loss

verify-loader Outdated
if not line.startswith("loader seq - "):
# Find the log header "loader seq - ". For container logs the line will not
# start with the header; there is a timestamp prefix instead. If the header
# is not present, ignore this line.
if "loader seq - " not in line:
Member

If the log lines read by verify-loader have a fixed prefix added to them, then something else should strip that prefix before the log lines are sent to verify-loader.

I don't think we should try to endow verify-loader with an understanding of how to pull out the log lines.
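The separate tool the reviewer describes could be as small as a stdin-to-stdout filter. As a sketch (this filter is hypothetical, not part of the PR; the header string is taken from the quoted diff):

```python
HEADER = "loader seq - "

def strip_prefix(line):
    """Return the line starting at the loader header, or None if absent."""
    idx = line.find(HEADER)
    if idx == -1:
        return None  # no header at all: not a loader line, drop it
    return line[idx:]

def filter_stream(lines):
    """Yield only lines that carry the header, with any prefix removed."""
    for raw in lines:
        stripped = strip_prefix(raw)
        if stripped is not None:
            yield stripped
```

Piping container logs through such a filter would let verify-loader keep its original `line.startswith("loader seq - ")` check unchanged.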

verify-loader Outdated
try:
_, invocid, seqval, payload = line.split('-', 4)
indx = line.find('loader seq -')
Member

Same as above, we don't want to add support for prefixed data of the log lines generated by the loader. If there is a prefix, let's have another tool strip it before sending the data to verify-loader.

verify-loader Outdated
if prev is not None:
    # Bad record encountered, flag it
    print("%s: %d %d <-" % (invocid, seq, prev))
    loss_count += (seq - prev)
Member

So if seq is less than prev this will be a negative value. I don't think we want to account for loss that way.

Instead we might want to consider two conditions: seq > prev + 1 and seq <= prev.

Adding the distance between seq and prev makes sense for the first condition. For the second condition we'll want to keep track of contiguous ranges: expanding a range as the sequence grows, creating a new range when a gap is encountered, and looking for duplicates by checking whether the new seq falls inside any known range.
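The two conditions above could be sketched like this (an assumed implementation of the reviewer's suggestion, not code from the PR; it assumes sequence numbers normally increase by 1):

```python
class SeqTracker:
    """Track contiguous sequence ranges to separate gaps from duplicates."""

    def __init__(self):
        self.ranges = []  # list of [lo, hi] contiguous ranges seen so far
        self.loss = 0     # lines presumed lost (gaps)
        self.dups = 0     # duplicate deliveries

    def observe(self, seq):
        if not self.ranges:
            self.ranges.append([seq, seq])
            return
        lo, hi = self.ranges[-1]
        if seq == hi + 1:
            self.ranges[-1][1] = seq        # condition: contiguous, extend range
        elif seq > hi + 1:
            self.loss += seq - hi - 1       # condition seq > prev + 1: count the gap
            self.ranges.append([seq, seq])  # start a new range after the gap
        else:
            # condition seq <= prev: duplicate or late out-of-order arrival
            if any(r[0] <= seq <= r[1] for r in self.ranges):
                self.dups += 1              # already seen: duplicate
            else:
                self.loss -= 1              # a "lost" line arrived late
                self.ranges.append([seq, seq])
```

For example, the sequence 1, 2, 5, 3, 3 would first count the gap (3 and 4) as two lost lines, then reduce the loss to one when 3 arrives late, and flag the second 3 as a duplicate.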

verify-loader Outdated
@@ -192,6 +201,7 @@ def verify(input_gen, report_interval=REPORT_INTERVAL):
(total_count / (now - start)),
(ignored_bytes / MB) / (now - start),
(ignored_count / (now - start))))
print("interval stats:: total bytes: %d, total lines: %d, ignored: %d" % (report_bytes, report_count, report_ignored_count))
Member

Why not add this data to the previous print statement?

verify-loader Outdated
@@ -226,6 +236,8 @@ def verify(input_gen, report_interval=REPORT_INTERVAL):
(total_count / (now - start)),
(ignored_bytes / MB) / (now - start),
(ignored_count / (now - start))))
print("total bytes: %d, total lines: %d, ignored lines: %d, lost (out-of-seq) lines: %d" % (total_bytes, total_count, ignored_count, loss_count))
print("overall loss percentage = %.3f" % (loss_count * 100.0 / total_count))
Member

Why not add this data to the previous print statement?
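Folding the new totals into the existing summary might look like this (a hypothetical helper for illustration; it also guards the division the PR's print would perform when total_count is zero):

```python
def summary_line(total_bytes, total_count, ignored_count, loss_count):
    """Build one combined summary string instead of two separate prints."""
    pct = (loss_count * 100.0 / total_count) if total_count else 0.0
    return ("total bytes: %d, total lines: %d, ignored lines: %d, "
            "lost (out-of-seq) lines: %d, overall loss: %.3f%%"
            % (total_bytes, total_count, ignored_count, loss_count, pct))
```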

@openshift-ci

openshift-ci bot commented Mar 12, 2022

@atrimandal: PR needs rebase.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
