It will break the web #23
A Promise polyfill should be able to use es6-shim's technique of having a slightly more convoluted inheritance chain, so that …
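The technique alluded to here can be sketched as follows. This is a hypothetical, simplified illustration (not es6-shim's actual code, and `PatchedPromise`/`NativePromise` are invented names): instead of replacing the global with an unrelated constructor, the polyfill subclasses the native `Promise`, so its instances still sit on the native prototype chain.

```javascript
// Hypothetical sketch of the "convoluted inheritance chain" approach:
// patch by subclassing the native Promise rather than replacing it.
const NativePromise = Promise;

class PatchedPromise extends NativePromise {
  // Example patched method (illustrative only; a real polyfill would
  // patch whatever behavior its feature detection found to be broken).
  finally(onFinally) {
    return this.then(
      value => NativePromise.resolve(onFinally()).then(() => value),
      reason => NativePromise.resolve(onFinally()).then(() => { throw reason; })
    );
  }
}

// Instances keep the native prototype in their chain, so checks like
// `p instanceof NativePromise` still succeed.
const p = new PatchedPromise(resolve => resolve(42));
```

The benefit is that instances still pass `instanceof` checks against the native constructor, avoiding the class of breakage discussed later in this thread.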
It doesn't really matter if it's a bug in the polyfill or not. The question is whether the change proposed in this repo would cause websites to stop working from the user's perspective.
Fair enough; the wording of the OP wasn't clear as to whether core-js actually has that bug or not, but I suppose I should assume it does given the context.
@ljharb one more time - it's supported in current versions of …
Thanks for clarifying.
@ljharb I checked the current version of …
Thanks, looks like I need to fix that bug too then, if Promise is replaced in an environment where …
@codehag @syg so, any feedback? This proposal is a blocker for many other proposals, for example, tc39/proposal-collection-methods#45 (comment) or tc39/proposal-change-array-by-copy#33
It isn't quite clear to me what the exact issue here is, so I will try to answer, but I may be misunderstanding your concern.

**Question 1**

There might be a performance impact, but taking a look at core-js, I don't expect it to be as significant as you are saying here. Performance issues can result in breakage, but that doesn't appear to be the case here. I ran an experiment with `Array.prototype.map` and 10000 iterations. Just a small example:

```js
for (var i = 0; i < 10000; i++) {
  [0, 1, 2, 3, 4, 5, 6, 7, 8, 9].map(x => x + 1);
}
```

On Firefox, to give a ballpark: the polyfill was about 20% slower. Same for Safari. What about V8? Well, you special-case V8 to avoid performance problems, and Chromium-based browsers cumulatively make up around 80% of the market: https://github.com/zloirock/core-js/blob/ce52fdc735c5c809c9e85b2072f92d41b5a3885a/packages/core-js/internals/array-method-has-species-support.js#L8-L10. For the other 20%: comparing browsers, the same piece of code can differ by up to 169% between them. While the polyfill is slower, it is not so slow that it breaks the browsing experience. We are looking at, potentially, a 20% slowdown within a browser -- this is a less alarmist way to look at it. To say that methods "will work tens and hundreds of times slower" is simply not true. It isn't great, but this is not a web compatibility issue in the way you have framed it.

That 20% could also be made up for by removing species: we expect a performance boost from not needing to do the species lookup every time we run a method, and in addition, we will end up using less memory. The goal of this proposal is to investigate if (only if) we can have a faster, more secure JavaScript. We can work together with you and make things faster for all users. There isn't any reason to do the version check for only V8 -- Firefox does too. If we address this, your library will have a better overall experience. For your second question.

**Question 2**

```js
fetch(url) instanceof Promise; // => false
```

It isn't clear to me what you are asking here. What do you mean by "Promise will be replaced"? Are you concerned that the following pattern will fail?

```js
class MyArray extends Array {}
MyArray instanceof Array
```

If this is the case, then we generally don't think we will get rid of subclass II -- unless you meant something else and I misunderstood.
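The species lookup being benchmarked above can be observed directly. A small sketch (the class names `Typed` and `Plain` are invented for illustration) of what `Array.prototype.map` consults `Symbol.species` for, and what removing it would change:

```javascript
// Each Array method that creates a new array (map, filter, slice, ...)
// looks up this.constructor[Symbol.species] to decide what to construct.
class Typed extends Array {}

class Plain extends Array {
  // Opting out: direct the built-in methods to produce plain Arrays.
  static get [Symbol.species]() { return Array; }
}

const a = Typed.from([1, 2, 3]).map(x => x + 1); // constructed via species: a Typed
const b = Plain.from([1, 2, 3]).map(x => x + 1); // species overridden: a plain Array
```

Removing species would make every such method behave like `Plain` does here: `map` on any subclass would always produce a plain `Array`.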
**Answer 1**

It should rely on …

I would be happy to work together with you and make things faster for all users, but it seems you don't understand the problem. The special case for V8 was added because the slowdown after some feature detection, even without polyfilling, was catastrophic in V8 - see, for example, zloirock/core-js#677 (one more case of a 100x slowdown in V8, not so critical since it's in a rare use case); the problem is not so noticeable in other browsers. It's much better to use proper feature detection where possible instead of pinning engine versions. If something works fine and in a new browser version starts to work 100x slower, it could cause, for example, freezing of the page, and that's definitely breaking the web. In weekly downloads I see that 7.5% of users use …

**Answer 2**

That just means that … All …

Another case is APIs which expect only native promises; I showed this example: zloirock/core-js#815 - one more example of issues that will occur in modern engines. One more kind of issue is …
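The feature detection being contrasted here with engine-version pinning works roughly like this. This is a simplified, hypothetical sketch of the approach taken in core-js's `array-method-has-species-support` internal (not its exact code): plant a fake `@@species` constructor and check whether the built-in method honors it.

```javascript
// If the engine's built-in honors @@species, the object returned by our
// fake species constructor becomes the method's result; if not, the
// polyfill would need to replace the method with its own implementation.
function methodSupportsSpecies(methodName) {
  const array = [1];
  array.constructor = {
    [Symbol.species]: function () {
      // A constructor returning an object overrides the `new` result.
      return { marker: true };
    }
  };
  return Boolean(array[methodName](Boolean).marker);
}
```

In a compliant modern engine this returns `true` for methods like `filter`, `map`, and `slice`; in an engine without species support it would return `false`, triggering replacement.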
Re answer 1: if it's so critically slow that it's unusable, why polyfill it at all in the first place? If it's still usable, just slow, then why is it a critical problem? A 100x slowdown isn't "catastrophic" unless it's in a hot path, and if it is, then it wouldn't be able to pragmatically support an engine where it wasn't native.
@ljharb Because it causes a critical slowdown, it was polyfilled only once it was supported in all modern major browsers. A slowdown in obsolete browsers is not critical. But this proposal will cause it in all modern major browsers, so it's inapplicable. A constant 100x slowdown after feature detection, even without polyfilling, is catastrophic anyway.
The point of polyfilling is that older browsers are just as important as modern ones - an unacceptable slowdown in any is an unacceptable slowdown in all. If it's too slow for an old browser, then the old browser shouldn't be supported at all. Put another way, if the performance is acceptable in any supported browser, then it is automatically acceptable in every supported browser.
...however, your …
Its performance (which is not O(n) in practice) is quite acceptable in every possible browser - including every modern one - or else it wouldn't be polyfilled at all.
It IS …
I think we outlined this in the exit criteria in the proposal itself. I have been trying to understand why there was such an alarmist tone to this issue, but it might be unclear from the proposal how the investigation is going to go. We have a segment on it; please read it (as it seems you were focused on core-js and not on the method for testing). If it is still unclear, let us know, in which case we should clarify the readme.

It would be impossible for us to move forward with this without testing the compatibility of removal -- Firefox, and I believe Chrome, have test implementations. On Firefox we are planning a test to determine the breakage on top sites as well as manual testing -- if there are serious issues like the ones you are concerned about, we will catch them. At present this isn't high priority, so I can't say when this test will be run. Perhaps something we should add to the readme to make this clear is the method by which we evaluate that criteria. This was a hard stop for us always, and the testing method was obvious to us.

As for your answers: I've tested the RegExp functions on Firefox with your library, and the difference is still around 20%. I wasn't too careful, so I may have missed something, but I am not going to spend more time investigating. Benchmarks like these are pretty meaningless, and the test we have planned as part of our exit criteria should cover web breakages of this sort. Once that happens we will update the committee.

For promises -- I tested your library. Completely disabling species doesn't cause any failures on modern browsers, even when evaluating promises produced by the `async` keyword against the forced promise from core-js in the absence of the species symbol. It only fails when I force the polyfill to overwrite the Promise itself. You mentioned your library has issues logged regarding this failure, like the one you linked, but it doesn't seem to apply in the case of removing species support.

I am not an expert on your library, so, again, I likely missed something. But the full browser implementation test will uncover any failures. This is, again, something that would be covered as part of the exit criteria for stage 1, and something we want to verify experimentally. As mentioned in our readme exit criteria -- if there are web breakages, then depending on the type and severity, we will change approach or abandon the proposal.

edit: for wording, understanding, and tried to make it shorter :/
I had initially read it. What can I say... since the method for testing found usage of …

I'm concentrated on …
One more time - it makes no sense to even try to advance this proposal since it will break the web, and I showed where it will be broken. Do you want to do a useless job?
This proposal is a blocker for some other proposals; in this case, it would be better to tell them to use current ways of subclassing.
I've tested the …
I don't know how to say it; I already wrote it too many times: THE POLYFILL WILL OVERWRITE …

Let's take a look at the reasons for the appearance of polyfills of these kinds of subclassing in … 6 years ago, @cscott asked for forced replacement of …

Take a look at …

Even this case should be enough to stop thinking about removing those ways of subclassing. Sorry, I'm fucking tired of explaining obvious things to you; if you wanna break the web - feel free, I wash my hands.
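The `@@replace` hook at issue (type IV in the proposal's classification) is easy to demonstrate from user code: `String.prototype.replace` delegates to the search value's `Symbol.replace` method, which is exactly the extension point transpiled and polyfilled RegExp features rely on. A minimal example with an invented matcher object:

```javascript
// Any object with a Symbol.replace method can be passed to
// String.prototype.replace; the string method delegates to it.
const vowelMatcher = {
  [Symbol.replace](str, replacement) {
    // hypothetical custom matcher: swap every vowel for the replacement
    return str.replace(/[aeiou]/g, replacement);
  }
};

const result = 'hello world'.replace(vowelMatcher, '*'); // 'h*ll* w*rld'
```

Removing this delegation would make the call ignore `vowelMatcher[Symbol.replace]` entirely, which is the shape of the babel-transform breakage discussed in this thread.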
The old species subclassing was detected by a crawler running on the top 10000 sites, not by the actual web compat test which will be run in browsers. The test, as I mentioned before, has not yet been run, which is why we are still at stage 1. As mentioned in the readme, all of your concerns are covered (since long before this was published), and the proposal will not be pursued if it indeed breaks the web. I have said that a couple of times now.

The example from babel as a potential breakage is useful. Class IV is orthogonal to the other classes, and we can investigate further. Examples and concrete points are much more useful for evaluation than "it will break the web". The examples you initially gave from core-js were not concrete. They were vague and difficult to evaluate, leading to a long and unproductive discussion. It is up to you how you want to communicate, but this absolutely did not need to be this frustrating for either of us. Thanks for your input.
I've now tested the regex example from babel; here is the output on the Firefox build with class IV (the class that would cause breakage) disabled: … I also verified that we never run the babel function that is implemented via `Symbol.replace`. The wrapper does run. You wrote the following: …
Well, examples like this are why. You cannot, purely from reading code and making a conjecture, know for certain that the code will break. We cannot know the behaviour of all libraries and their interactions with a given engine. The most thorough way to test is to actually check.
Did you see the link above? How …
Do you really think that I didn't run the tests? If I wrote the code, I know how it works and why it could stop working.
Can you tone it down a bit? The aggression is absolutely not useful. Also, what are you trying to achieve? That we abandon the proposal? It is stage 1; this is an incredibly weak signal from the committee. It is literally an investigation -- nothing more. The broader question of how we handle subclassing is what the other proposals are waiting on. That is partially related to this proposal, but it is also a question of direction. Even if this proposal fails, we may choose not to do subclassing for future proposals.
I only asked how … will be called.

I'm just showing cases that will be broken by this proposal - and I showed some examples for each of the II-IV types of subclassing - and that's just 4 libraries: …
When you say "something like that can be accepted", I guess you mean into stage 1? This is an unusual proposal. We are considering a new process for investigating the removal of features -- we never had one before. For now, we are using the pre-existing proposal system. I can see that this is causing concern.

What stage 1 means is that the committee has acknowledged that we are investigating. The staging process is not linear -- many proposals enter stage 1 and are retracted. Some reach stage 3 and are rolled back to stage 1. That something is stage 1 is not a signal that "it will be implemented"; it is a signal that "we are investigating". Stage 1 is also highly malleable. Your contribution here can shape the way this goes. Complete abandonment of the proposal just gets us back to square one, whereas investigation and collecting information might actually be productive and help us rethink the spec in a web compatible way.

For why this moved to stage 1: this proposal has been on the mind of many implementers for a long, long time, but we never sat down and thought about how to fix it. Stage 1 seemed like a good way to say "hey, we are thinking about this, and how we might do it". In the end, it is likely that the shape of this will change. We might instead come up with a recommendation for the future, based on how subclassing works today and which classes are highly used. Or we may be able to restrict subclassing in such a way that it maintains web compatibility while improving what engines can do to optimize. Ultimately that is the goal. We might not achieve it, but we are giving it a shot and doing so in public. In part, it is to invite library authors such as yourself to comment and give advice. Ideally, it would take a more productive form than this discussion has been, by having concrete examples that we can look at and consider. Keep in mind, much like a language might not have all of its parts used, a library might not either.

Or, alternatively, as we have done in the past when introducing new methods on builtins, we can reach out to individual sites if the breakage is small. It isn't productive to say "just abandon this", because the investigation may uncover other paths forward. I would be very, very thankful for clear examples from each of the classes. I think a good way to do this is to have dedicated issues like "potential class II breakage" for each of the classes. This way we can collect them and make sure they are not missed when we run the actual test. It can also help verify that our implementation is correct. Does this bring us to the same page?
I mean that nothing that will break the web can be accepted, and it can't be fixed in the scope of this proposal - even for stage 1. Since none of the type II-IV ways of subclassing can be removed from current APIs, it should be withdrawn or changed to something that rethinks subclassing only for new APIs - as you wrote. The existence and unresolved state of this proposal is a blocker for other proposals, see the links above.
We had …
Yes.
Reflect.enumerate was a normative PR. I did not say this was the first backwards-breaking change; it isn't, and neither was Reflect.enumerate. I was referring to a new process that is designed to ensure we fully vet removals, which we do not have.

The unresolved state of this proposal is not in and of itself a blocker to those other proposals. The blocker is the question "what do we do about subclassing" -- if this proposal did not exist, this would still potentially be a blocker, since it is something implementations are concerned about. We have similar discussions about other "exists but we really don't like it, so we won't allow it into new language features" features, but they are not recorded. It is being investigated in depth here, which is why people are pointing here. As a way forward: I can update the proposal readme to be more explicit about this and reword things a bit to make the fluidity clear. Stage 1 is not about the solution -- it is about the problem.

As for why exec isn't being called -- that is a good question. I have other work to do, so I didn't read the code -- I just ran it. But now that you mention it: I threw in a few logs, and it looks like we are saving the state of the object somewhere, so it is failing on every other run. For some reason we are not calling this consistently: https://searchfox.org/mozilla-central/rev/85a74c965bd4bcab091b5767c13c44f84671d9ce/js/src/builtin/RegExp.js#9-48 and it is leading to false positives, which is what I saw. So the breakage is legit; looks like we have a bug. Thanks for pointing it out.
Ok, so the list of problems from this issue:
Fantastic, thank you!
To be clear, we can always accept something that will break the web into stage 1, because stage 1 is solely about exploring a problem space - only a solution could break the web, and no solution exists until at least stage 2.
Dear TC39! Seems you forgot about polyfills.

The readme mentions that obsolete `core-js` versions from the pre-ES6 era unconditionally set `@@species` on some built-ins. But you forgot about current `core-js` versions and feature detection.

Detection of `@@species` support and delegation to `.exec`, `@@match`, `@@replace`, `@@search`, `@@split` - all kinds of subclassing mentioned in this proposal - exists in most of the related `core-js` polyfills. Removing those kinds of subclassing means that... all of those features will be polyfilled. At least all of the `Array`, `%TypedArray%`, and `RegExp` methods which now should work with those kinds of subclassing, and the `Promise` constructor, will be replaced.

What does it mean?

`Array`, `%TypedArray%`, `RegExp` methods will work tens and hundreds of times slower, since polyfilled methods can't be as optimized as natives. Because of this, feature detection of subclassing was added to those methods only once it was supported in most modern engines. However, it's just a slowdown.

`Promise` will be replaced. That means that for all native APIs that should return a promise, the result will not be an instance of the global `Promise`. For most cases that should return native promises (like the example above), workarounds were added in the most recent `core-js` versions, but there are cases that could not be fixed: old `core-js` versions, APIs that accept only native promises like zloirock/core-js#815, and other polyfills.

These are only the most obvious cases; I'm sure there are deeper cases as well. `core-js` is used on most websites - even naive detection shows that `core-js` is used on ~59% of the top 1000 websites, and you could add some more percent for sites uncovered by this detection.
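The `Promise`-replacement breakage described above can be reduced to a few lines. This is a toy stand-in for a full polyfill (invented names, and only the identity/prototype wiring of a real polyfill), assuming it swaps the global `Promise` the way old polyfills did:

```javascript
// Toy polyfill that replaces the global Promise with its own constructor.
const NativePromise = Promise;

function PolyfilledPromise(executor) {
  // Build on the native machinery but brand instances with our prototype.
  return Reflect.construct(NativePromise, [executor], PolyfilledPromise);
}
PolyfilledPromise.prototype = Object.create(NativePromise.prototype);
PolyfilledPromise.resolve = v => new PolyfilledPromise(resolve => resolve(v));

globalThis.Promise = PolyfilledPromise;

// Native machinery (async functions, fetch, ...) still produces promises
// from the internal, original constructor, not the visible global:
async function nativeProducer() { return 1; }
const fromNative = nativeProducer();
const fromPolyfill = Promise.resolve(1);
```

After the swap, `fromNative instanceof Promise` is false even though `fromNative` is a perfectly good native promise, which is exactly the `fetch(url) instanceof Promise; // => false` complaint from earlier in the thread.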