🚀 Feature: Run last failed tests #4108
Any thoughts on this, @juergba or anybody else? I would happily work on this, as it is a real pain-point for me 😒
Yeah, #4183 would definitely be beneficial for this idea, as I don't like the "huge …". I am not yet familiar with how mocha internally identifies each test case, but the general outline could be like this: for every test run …

**Last failed / first failed mocha run**

**Cache**

I would propose to follow the direction of projects like … The cache can be a simple JSON object with the schema:

```
{
  "lastRun": <<timestamp of last run>>,
  "results": {
    "<<ID of testcase>>": <<result>>
  }
}
```

where …

**Issues**

**Reordering**

I assume filtering out test cases should be fairly straightforward, especially with #4183, yet I am not sure how feasible reordering of the test cases is. But since there is no need for very complex reordering, this could be solved, for example, by two internal test runs: the first would consist of the filtered failed test cases, and the second of the remaining test cases. @juergba your thoughts?
@AuHau thank you. First I need to know what exactly the author of #4183 wants.
There is one additional problem, see #1955.
@AuHau I think a watch mode that reruns only failed tests would be even more efficient, since you would only need to run a single command instead of an initial run and then a separate one.
@jedwards1211 hmm, while I understand your workflow and its value, I don't think it should be "one feature" as you propose, but rather separate ones that are interoperable, so you can do something like … I can see use cases and workflows that do not need or want the watch mode to be part of this (for example, IDEs could support it natively, and there the "watch mode" does not really make sense).
I see. Well, rerunning failed tests without watch mode would be a more complicated change, since Mocha would have to write failed tests to some file, meaning people would have to edit their `.gitignore`, etc. With watch mode it could just remember them in memory.
Well, that is true, but IMHO it is not such a big deal, as widely used projects already contain solutions that worked around the git-ignoring and other problems, along the lines of the approach I described earlier using the …
Oh yeah, that's true, that would work |
Duplicate of #1690. |
**Is your feature request related to a problem or a nice-to-have? Please describe.**

During development of new features, there is usually only a small subset of tests failing. With a big test suite, it can be a lengthy process to run the whole test suite.

Coming from a Python background, one of the testing frameworks I was using (`pytest`) supported the ability to rerun only failed tests, or even better, to run failed tests prior to the rest of the test suite. See its documentation here: Cache: working with cross-testrun state. This is a feature that I am very much missing in `mocha`.

**Describe the solution you'd like**

The same ability as `pytest` supports, with both "last failed" and "failed first" options: cross-run state of failed tests that is stored on every `mocha` run. When `mocha` is run with the proper parameter (for example `--last-failed` or `--failed-first`), it will use this stored state to limit the list of executed tests to those that failed before.

**Describe alternatives you've considered**

The only current solution to this problem known to me is using `grep` or `fgrep`, yet this is not flexible enough to completely cover this use case. There is a package that does this on top of `mocha`, called `mocha-broken`, yet I would prefer to have this natively supported by mocha.

If this feature were accepted, I would be happy to discuss its implementation more deeply and then provide a patch for it.