I noticed some other code reading raw files and claiming that the PIPERBLK field had to be used to find missing blocks. However, rawspec doesn't seem to use this field at all. It looks like the logic to detect missing blocks calculates PIPERBLK (calling it dpktidx) by subtracting the first two PKTIDX values it sees:
PIPERBLK is "Packet Index PER BLocK". It is the expected increment in PKTIDX ("PacKeT InDeX") from one block to the next. Not all GUPPI RAW files have PIPERBLK in the header. PIPERBLK is used to detect "missing" blocks, which need to be processed as blocks of zeros to keep the integrations evenly spaced. When PIPERBLK is missing, the best we can do is derive it by taking the difference between the PKTIDX values of the first two blocks (and hope that there are no missing blocks in between). If there is only one block in a single RAW file, then there can be no missing blocks (other than the infinite "missing" blocks after the sequence of RAW files). I suppose there might be a corner case where there is a sequence of one-block files. That corner case may not be properly handled, but to my knowledge nobody creates a sequence of one-block files.
That makes sense. My point is just that rawspec isn't actually using the PIPERBLK header. It always does the "derive it by hoping there are no missing blocks in between the first two" thing.
rawspec/rawspec.c, line 911 (commit 15790b1)
But that doesn't seem correct: I think it will cause rawspec to fail if the second block is ever missing.