Define all dependencies in requirements files #894
Conversation
Requirements are defined at one place('requirements*.txt' files) and are read or linked from there.
Thanks again for this! You're making me feel bad with all these contributions. :-) I've added a few comments of minor things I think we need to address.
Also removes all hardcoded requirements from the tox script.
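A minimal sketch of how a tox environment can pull its dependencies from the requirements files instead of hardcoding them (the section name and commands here are illustrative, not taken from this repository's actual tox config):

```ini
[testenv:unit]
deps =
    -r {toxinidir}/requirements.txt
    -r {toxinidir}/requirements-dev.txt
commands =
    pytest {posargs} tests/
```

With this layout, the requirements files stay the single source of truth, and tox simply references them with `-r`.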
Thanks again! Just a couple of very minor changes left -- comments added.
LGTM, thanks!
This looks fine, but I'm curious as to the motivation. I'm not against pinning versions, but it's uncommon practice in Python, even if we've all probably been bitten many times by "whoops, major version change happened overnight and now my CI explodes". Was this a request, due to a specific issue, or just a change in practice?
Yeah, it's to tighten things up a bit so that CI (or our users' CI) doesn't break so easily. :-) I had run into this once or twice with ops dev dependencies already. Is it uncommon practice in Python? I've seen it a ton. Obviously more in applications, because there you want to deploy exact versions, but in libraries I've definitely seen it done to pin major or minor (note that we're only pinning major versions here, except for our dev dependencies).
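As an illustration of the policy described above (runtime deps pinned to a major version, dev deps pinned exactly), the requirements files might look something like this; the package names and versions are examples, not the actual contents of this PR:

```
# requirements.txt -- runtime deps, pinned to a major version
PyYAML==6.*
websocket-client==1.*

# requirements-dev.txt -- dev deps, pinned exactly
pyright==1.1.291
```

Pinning only the major version lets bugfix and feature releases through while blocking the breaking changes that a major bump signals (under semantic versioning, at least).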
Yeah, keeping requirements pinned to specific versions has been my experience across the projects I've been involved in as well. It obviously helps improve stability in a CI or containerised environment.
It's not a bad idea, but it's far from common practice. The community is pretty split on it, and the phenomenal amount of breakage in very well-backed projects around a Jinja/MarkupSafe release last year is evidence both that pinning is sometimes a good idea, and that the Python community is still so torn on it that ~100 projects you've probably heard of had fiery CI implosions when it happened. Could we? Sure. Is it easier? Sure, most of the time. But there's a distinct tradeoff when it comes to automated CVE scanners. Most of these findings, admittedly, end up being along the lines of "if a user submits a request to a REST endpoint in some part of the library you probably aren't using anyway, with incredibly carefully crafted headers, they might be able to peer at something trivial: CVSS 9.9/10", but that's still a ding on the project for customers running naive scans.
I'm not very familiar with automated CVE scanning tools, but for any security concerns that may arise from pinning dependencies to specific versions, would it make sense to include a Dependabot workflow in the project? https://github.blog/2020-06-01-keep-all-your-packages-up-to-date-with-dependabot
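For reference, a minimal Dependabot configuration for pip-style requirements files looks roughly like this (a sketch of the standard `.github/dependabot.yml` format, not something included in this PR):

```yaml
version: 2
updates:
  - package-ecosystem: "pip"   # also picks up requirements*.txt files
    directory: "/"             # where the requirements files live
    schedule:
      interval: "weekly"       # how often to check for updates
```

With this in place, Dependabot opens PRs bumping pinned versions, so pins stay fresh without manual checking.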
The honest answer is "probably not". Automated CVE scanners aren't worth the time more often than not, especially compared to the hassle introduced by Dependabot.

It's particularly obvious in any container built from, say, Ubuntu running as a non-privileged user with a Golang binary as the entrypoint (rather than using …). But it's Go: it's probably not parsing XML anyway, and if it is, there's almost no chance it's CGO and using expat at all, so the "vulnerability" is completely meaningless. Or you pulled in some Prometheus module without quite enough granularity, and its dep tree yanked in API integrations for every endpoint known to man (even if the code paths aren't reachable from your code), and... the scanner flags it anyway. Eventually, you get a ticket to go update the image, with no measurable positive outcome and no assurance that it won't just happen again tomorrow.

Partly this is CVSS being a terrible system. This one, for example, is "Negligible" in our rating, isn't meaningfully exploitable in containers anyway, and possibly never happened in Ubuntu at all (the report is from SuSE Tumbleweed years ago). It's still rated as a 9.8, and was filed last year. It's part of the base package set, and is almost certain to come up as a "vulnerability" in any given scan.

If that's a 9.8/10, what's Shellshock? 100/10?

It's a bad system, but CVSS is the "default" scoring, and not pinning deps is a tradeoff of "CI may break and I'll have to investigate it" versus "constant CVE reports will be filed and burn development time for nothing". It's a matter of picking your poison, and there's no right/good way, but it's a large part of the reason that projects don't pin.
Overall I'm +1 on this.
To a limited extent, I do agree with @rbarry82 here, and perhaps pinning the dev dependencies is less of a priority, but pinning the runtime deps is a must in my opinion, given that this is a core library for many projects now.
I would personally square this off with a nicely tuned bot (Renovate, or similar), but we could also write a very simple automation that just checks the pinned versions every week, or even every month, and opens an issue or PR to make sure they're kept fresh.
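A rough sketch of one piece of that "very simple automation": a check that every entry in a requirements file actually carries a version pin. The file layout and pin policy here are assumptions for illustration, not taken from this PR:

```python
import re

# Matches lines like "PyYAML==6.0" or "websocket-client~=1.2".
# A bare name like "requests" carries no specifier and should be flagged.
PINNED = re.compile(r"^\s*[A-Za-z0-9._-]+\s*(==|~=|>=|<=|<|>)\s*[\w.*]+")


def unpinned(requirements_text: str) -> list:
    """Return requirement lines that carry no version specifier."""
    missing = []
    for line in requirements_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line or line.startswith("-"):  # skip blanks and options like -r
            continue
        if not PINNED.match(line):
            missing.append(line)
    return missing


sample = """
# runtime deps
PyYAML==6.0
websocket-client~=1.2
requests
"""
print(unpinned(sample))  # → ['requests']
```

A weekly CI job could run a check like this (plus a query against PyPI for newer releases) and open an issue when something drifts.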
I'm broadly in favor of trying to do the major version pinning, and see if this gives an overall better experience than having PyYAML release a major version 7, and charms that haven't been doing pinning just falling over.
ops/pebble.py (outdated)

```diff
-        self._buf = bytearray()
-        self._pos = 0  # current position in buf
+        self._buf: bytearray = bytearray()
+        self._pos: int = 0  # current position in buf
```
Why do we need these, but not `: bool` and `: str` on the next few? Is it just that `bytearray` doesn't have a type and `0` could be int or float?
Yeah, I couldn't figure this out either. I just spent a bunch of time on it creating a cut-down reproducer ... and it turns out it's a bug in Pyright being confused about Unknown types in some cases. It was a weird one to reproduce because I'd take out a code block or if statement that was seemingly unrelated, and Pyright would suddenly be OK. But it was fixed with Pyright 1.1.290, so I upgraded Pyright to the latest (1.1.291) and removed those type annotations now.
It was likely this Pyright fix which has to do with caching of Unknown types or something.
Anyway, thanks for nerd-sniping me. ;-)
I guess this shows why we pin dev dependencies! This was seemingly fixed in 1.1.290, likely this fix to do with caching of Unknown types: microsoft/pyright@bd010d1

Prior to this Pyright fix, the following cut-down snippet showed the bug:

```python
import typing


class Foo:
    def __init__(self):
        self._buf = bytearray()
        self._done = False

    def bar(self, data: bytes):
        self._buf.extend(data)
        while True:
            i, self._done = thing()
            if i == -1 or self._done:
                return
            self._buf = self._buf[1:]


def thing() -> typing.Tuple[int, bool]:
    return 0, False
```
Defines all project dependencies in one place (`requirements.txt` and `requirements-dev.txt`) and pins them to a version.

Initial review: tinvaan#1
QA steps

Run the `tox` script and real Pebble tests to verify sanity.