Investigate speed regressions #7110

Closed

rogeliog opened this issue Oct 6, 2018 · 11 comments

Comments

@rogeliog
Contributor

rogeliog commented Oct 6, 2018

💥 Regression Report

Speed regression in jest@23

Last working version

Worked up to version:

[email protected]

Stopped working in version:

A big chunk of the regression was introduced with this PR #5932

To Reproduce

Steps to reproduce the behavior:

I've been using a small repo, https://github.com/rogeliog/jest-benchmark (I can give you access if you need it), to run simple benchmarks across Jest versions.
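
A benchmark of this kind only needs a trivial suite; a hypothetical minimal test file (the actual repo may differ) could look like the following, so that the measured time is dominated by Jest's startup, config resolution, and transform work rather than by the tests themselves:

    // sum.test.js -- deliberately tiny, so `time jest` mostly measures
    // Jest's own overhead rather than test execution.
    const sum = (a, b) => a + b;

    test('adds two numbers', () => {
      expect(sum(1, 2)).toBe(3);
    });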

Expected behavior

Tests should run faster.

Here is the progress I've made on the speed regression investigation.

  1. A big chunk of the speed regression was introduced here Updates babel-jest resolution #5932
  2. I've been testing with https://github.com/rogeliog/jest-benchmark and the differences that I get between jest@22 and jest@23 are the following:
    ------------------- Jest@22 -------------------------
    0m2.338s
    0m2.317s
    ------------------- Jest@23 -------------------------
    0m6.068s
    0m6.048s
    

Initial findings

  1. In jest@22, https://github.com/facebook/jest/blob/v22.4.4/packages/jest-config/src/normalize.js#L128 returns null, which causes https://github.com/facebook/jest/blob/v22.4.4/packages/jest-config/src/normalize.js#L130-L132 not to be executed.

  2. In jest@23, https://github.com/facebook/jest/blob/master/packages/jest-config/src/normalize.js#L143 returns a path, so https://github.com/facebook/jest/blob/master/packages/jest-config/src/normalize.js#L144-L146 gets executed (see the sketch after this list).

  3. If I comment out https://github.com/facebook/jest/blob/master/packages/jest-config/src/normalize.js#L144-L146 then I get the following stats.

    ------------------- Jest@22 -------------------------
    0m2.537s
    0m2.331s
    ------------------- Jest@dev -------------------------
    0m3.658s
    0m3.633s
    
  4. I'm not too familiar with the reasoning behind it, but there were comments on the original PR about these lookups returning different values: Updates babel-jest resolution #5932 (comment).
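
To make findings 1-3 concrete, here is a paraphrased sketch of the shared control flow (simplified, hypothetical names, not the actual normalize.js code; findBabelJest and setupBabelJest stand in for the resolution call and the setup block that changed between versions):

    // The same guard exists in both majors; what changed is what the
    // lookup returns.
    function maybeSetupBabelJest(options, findBabelJest, setupBabelJest) {
      const babelJest = findBabelJest(options.rootDir);
      if (babelJest != null) {
        // jest@22: not reached when the project doesn't depend on
        // babel-jest (the lookup returns null), so this cost is avoided.
        // jest@23: always reached, because the lookup also finds the
        // babel-jest bundled with jest itself.
        setupBabelJest(options, babelJest);
      }
      return options;
    }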

cc: @cpojer, @arcanis

@arcanis
Contributor

arcanis commented Oct 7, 2018

It seems to me the problem is that babel-jest is now always used to run the tests (is that correct?), whereas before it was only used if it was listed as a dependency in the root package.json.

I'm not sure which behavior Jest wants to keep, but a possible solution might be to use the paths option of require.resolve (Node 8+, iirc) to make the lookup relative to options.rootDir.
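
For reference, a minimal sketch of that suggestion (assuming the require.resolve paths option and Jest's options.rootDir config value; this is not the actual Jest code):

    // Restrict the lookup to the project's rootDir so that jest's own
    // bundled copy of babel-jest is not picked up.
    function resolveBabelJestFromRoot(options) {
      try {
        return require.resolve('babel-jest', {paths: [options.rootDir]});
      } catch (error) {
        // babel-jest is not installed in the project itself.
        return null;
      }
    }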

@SimenB
Member

SimenB commented Oct 7, 2018

We used babel-jest all the time previously as well, see this example with jest 22: https://github.com/SimenB/jest-22-babel

@jkillian

jkillian commented Nov 27, 2018

We've found that switching from Jest 23.5 -> 23.6, upgrading babel-jest, and switching from Babel 6 -> Babel 7 have somehow caused our testing time to increase by about 30% when using --no-cache. It doesn't appear Babel is the culprit, because webpack builds and other tooling that use Babel are actually slightly faster now.

I'm still trying to narrow down what's causing the perf regression; I haven't been able to figure out which jest (or other) package is causing the issue.

Any advice on tracking down the cause? I can try to build out a minimal repro if it would be useful.

@rickhanlonii
Member

@jkillian thanks for the info. Typically the way to track this down is to run with a local copy of Jest (there are instructions in CONTRIBUTING.md) and do a git bisect to find the PR responsible for the regression.

@jkillian

For what it's worth, it turns out I misdiagnosed the cause of the problem. It was actually an upgrade from ts-jest 23.1.4 -> 23.10.5 that was causing the large slowdown and memory usage increase (Babel and Jest proper were red herrings). We found that switching our jest config from:

    globals: {
      'ts-jest': {
        diagnostics: false,
      },
    },

to

    globals: {
      'ts-jest': {
        diagnostics: false,
        isolatedModules: true,
      },
    },

resolved the performance issues for us (presumably because isolatedModules makes ts-jest compile each file in isolation and skip type-checking)! Just posting the info here to absolve Jest of the blame I put on it 😁 and to help anyone else who has similar issues with ts-jest.

@rickhanlonii
Member

Oh great find, thanks for following up @jkillian!

@nasreddineskandrani
Contributor

nasreddineskandrani commented Dec 10, 2018

please check my comment here:
problem: #6783 (comment)
investigation: #6783 (comment)
cause: #6783 (comment)
solution: #7518

I am 100% sure that the TestScheduler is creating the regression for the case I reported in the issue. I don't know yet if solving this will help with the run on all files; if you think I should isolate this in a specific issue, let me know.

@massimonewsuk

Is the advice to stick to older versions of Jest for now?

@github-actions

This issue is stale because it has been open for 1 year with no activity. Remove stale label or comment or this will be closed in 14 days.

@github-actions github-actions bot added the Stale label Feb 25, 2022
@github-actions

This issue was closed because it has been stalled for 7 days with no activity. Please open a new issue if the issue is still relevant, linking to this one.

@github-actions

github-actions bot commented May 2, 2022

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 2, 2022