Migrate tests to JUnit #142
Conversation
Note many of the images showed small differences from old tests due to fonts. I added a new tag that is purely informative, marking these tests as font dependent (the images I'm adding pass on my machine; there might be issues on different machines).
New tag RequiresContent for tests that are relying on the content/setup.xml, I still need to find a way to make those tests not require manual setup.
Skipping the setup.xml validation allows additional tests that don't rely on those values
Last modified is difficult to have consistent across machines (and even on the same machine).
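The tag-based gating described in these notes might be wired up roughly like the Maven Surefire sketch below. This is an assumption, not the project's actual pom configuration: RequiresContent is the tag named above, while ImageComparison is a hypothetical name for the font-dependent tag.

```xml
<!-- Assumed sketch: exclude tagged JUnit 5 tests by default so a clean
     checkout can run "mvn test" without manual content/setup.xml setup.
     Tag names here mirror the discussion; ImageComparison is hypothetical. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludedGroups>RequiresContent,ImageComparison</excludedGroups>
  </configuration>
</plugin>
```

With a setup like this, the excluded tags can be re-enabled on a machine that has the manual setup done, e.g. with -Dsurefire.excludedGroups= on the command line.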
…hat. Also migrate EDDGridFromDap to JUnit. Sadly most tests are reliant on datasets from THREDDS, so they still aren't running. The plan to fix that is one of: mock THREDDS responses, local files for those datasets, or different datasets.
Planning to transition to downloading the test resource files.
…e large files. Also moves some files to a new scripts folder, for code designed to be run manually that is not part of the main server.
Also update all image comparisons to use resource paths.
…files. This is to make the folders small enough that they can be zipped under 2 GB and served from a Git release.
…ugin now). Also adding flaky annotations for tests that failed (several of them, I believe, failed due to things like file last-modified timestamps differing).
Add the flaky annotation to several more tests.
Also fix a couple hardcoded dataset ids to use suggested ids.
…rif for tests. Tests are now using SansSerif because it is always available in java distributions and so it eliminates one more step needed for developer setup.
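The point about SansSerif can be illustrated with a small sketch: "SansSerif" is one of Java's logical font names, which every JVM maps to some platform font, so constructing it never depends on developer-installed fonts. The class and method names below are illustrative, not from the ERDDAP test suite.

```java
import java.awt.Font;

// Sketch: the "SansSerif" logical font name is guaranteed by the Java
// platform (unlike physical fonts), so tests that render text can rely
// on it without extra developer setup.
public class FontCheck {
    static Font testFont(int size) {
        // Logical font names always resolve, even in headless environments.
        return new Font(Font.SANS_SERIF, Font.PLAIN, size);
    }

    public static void main(String[] args) {
        System.setProperty("java.awt.headless", "true");
        Font f = testFont(12);
        System.out.println(f.getFamily() + " " + f.getSize());
    }
}
```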
Is any setup besides getting netcdfAll-5.5.3.jar needed before running the tests?
Haven't dug in at all yet, but many of the errors seem to be around tests taking a bit longer than the specified threshold, or the …
All of the tests that are currently enabled are passing on my machine. There are several hundred more tests that are disabled with various Tags. I tried to have Maven handle getting everything (including netcdf), so there should be minimal setup.

My guess with that number of errors is that the src/test/resources directory didn't get set up properly. Maven should be downloading several large zip files from here: https://github.com/ERDDAP/erddapTest/releases/tag/rc-test The other possibility (because you mentioned errors about erddapContentDirectory) is that the download of content.zip (and the unzipping of it) might have failed. If it's neither of those, the earliest error about failing to initialize a class is generally the most helpful, so could you share that?

What OS are you on? Did any Maven steps fail before running the tests?
I think that's actually hashing the contents of the file at that path. (The path comes from …) It does also look like the resource path didn't get updated when I moved it to resources/data/LICENSE.txt. I am shocked that didn't cause an error for me.
Ah! Ok, that makes sense. The previous
Ok great, I'll work on a PR fixing the case sensitivity issues and we'll see where we are...
Well, it looks like the case of the test file
Simply updating the
Spent quite a bit of time over the weekend reviewing test failures on Linux. A PR with fixes for many of them is here:

Most of the remaining failing tests (~45 in total) fall into two categories:

1. Performance tests running slower than expected. This has been discussed, but the performance specifics in the tests are non-portable and should be able to be disabled and/or adjusted during testing. The tests should be runnable on a minimal CI virtual machine, etc.
2. Differences in generated images due to font rendering. Haven't dug much into this one yet. It's possible I don't have my Linux environment set up completely correctly with regard to the desired fonts, but I believe I do. It's also definitely possible that text rendering in images is just a little different between Windows and Linux.
The good news is that all of the image checking is done in a single utility method. If we add an option to disable it, I think it should only disable the image comparison part and still run the image generation code, as that's still very valuable to exercise. One example set of expected/observed/difference images:

Other than that, there are ~10 other test failures which seem due to various differences. I'll try to chase those down later, but I probably won't be able to give them much attention next week.

Not sure what the goal is for Linux compatibility on these tests at this point. I don't think it needs to block merging; we can definitely fix tests in separate PRs later.

Oh, and one other note that's already been discussed but I want to emphasize: the default values in …
I believe there are around 40 image comparison tests currently, so it's quite possible that's the vast majority of the differences left. For anybody making changes that might impact the generated images, I'd like them to be able to run the tests with a before/after comparison. I actually set up the code to automatically create the expected files from the observed ones (if the expected files are missing). We could have the expected images be a separate optional pack and just have them be locally generated, with instructions to generate the images before making changes to confirm what impact the changes have. If you want to test this locally, delete src/test/resources/data/images and target/test-classes/data/images.

I do want the tests to be runnable on Linux, both to lower barriers to others contributing and because many people run ERDDAP on Linux servers, so it is good to be confident in that environment as well. I don't have a Linux machine available to me currently though, so it's something I'll need assistance with.

As for the setup.xml issues: Micah's pull #147 should help with that. It provides a setup.xml intended to work for local development/tests. I'll make sure it works well with all the tests once it is merged.
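The compare-or-create pattern described here can be sketched as below. This is a hypothetical illustration of the idea, not ERDDAP's actual utility method; all names are made up.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.imageio.ImageIO;

// Hypothetical sketch of an image-comparison helper that adopts the
// observed image as the expected baseline when no baseline exists yet.
public class ImageCompare {
    // Returns true when the images have identical dimensions and pixels.
    static boolean imagesMatch(BufferedImage a, BufferedImage b) {
        if (a.getWidth() != b.getWidth() || a.getHeight() != b.getHeight()) return false;
        for (int y = 0; y < a.getHeight(); y++)
            for (int x = 0; x < a.getWidth(); x++)
                if (a.getRGB(x, y) != b.getRGB(x, y)) return false;
        return true;
    }

    // If the expected file is missing, copy the observed image into place
    // as the new baseline instead of failing the test.
    static boolean compareOrCreate(File expected, File observed) throws IOException {
        if (!expected.exists()) {
            Files.copy(observed.toPath(), expected.toPath());
            return true;
        }
        return imagesMatch(ImageIO.read(expected), ImageIO.read(observed));
    }

    // Small test helper: a w x h image filled with one RGB color.
    static BufferedImage solid(int w, int h, int rgb) {
        BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                img.setRGB(x, y, rgb);
        return img;
    }

    public static void main(String[] args) {
        System.out.println(imagesMatch(solid(2, 2, 0xFF0000), solid(2, 2, 0xFF0000))); // prints true
    }
}
```

Deleting the expected files (as suggested above) would then cause the next run to regenerate the baselines locally.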
* Fix some JUnit tests on Linux. Fix many path-related and other issues with JUnit tests on Linux. Note that these changes don't yet make every test pass; there are still issues with font differences in image generation, non-portable performance checks, and a handful of other differences.
* Fix watch directory tests on Windows. While Linux may not send the modify events on deletion or the directory event, Windows does.
* Disable the display in browser; it causes tests to pause.
* Have the path regex work for both Windows and Linux.
* Fix the forwardSlashDir utility function. This is only used in one spot, so it should be safe to modify. Also, it was previously checking if the string started with a / and, if not, adding a slash to the end of it.
* Fix some of the EDDGridFromNcFilesTests on Linux. The wording of FileNotFoundException differs, so just make sure it's that kind of exception. Mark testNcml flaky (for now), and fix the capitalization of testGridNThreads for systems that are sensitive to that.
* Disable the performance part of persistentTableTests for now.
* Change to using a temp dir for eddtablefromhttpgettests.

Co-authored-by: Chris John <[email protected]>
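A guess at what a fixed forwardSlashDir-style helper might look like: normalize separators to '/' and guarantee a trailing slash. The real ERDDAP implementation may differ in details; this is only a sketch of the idea.

```java
// Assumed sketch of a path-normalizing helper; not ERDDAP's actual code.
public class PathUtil {
    static String forwardSlashDir(String dir) {
        String s = dir.replace('\\', '/');      // Windows separators -> forward slashes
        return s.endsWith("/") ? s : s + "/";   // ensure a trailing slash
    }

    public static void main(String[] args) {
        System.out.println(forwardSlashDir("C:\\data\\erddap")); // prints C:/data/erddap/
    }
}
```

A helper like this keeps test path handling identical on Windows and Linux, which is one way to avoid the case- and separator-sensitivity issues discussed earlier in the thread.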
Add some explanation for image comparison tests to the readme.
…ddap into test_migrations
…move the performance checks in a couple spots.
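One possible pattern for making the non-portable timing assertions skippable, rather than deleting them, is gating them behind a system property. This is an assumption for illustration, not ERDDAP's actual mechanism; the property name and class are made up.

```java
// Hypothetical sketch: only enforce a timing budget when a system
// property opts in, so slow CI machines can still run the tests.
public class PerfGuard {
    static boolean withinBudget(long elapsedMs, long budgetMs) {
        // When performance checks are disabled (the default), always pass.
        if (!Boolean.getBoolean("erddap.test.perf")) return true;
        return elapsedMs <= budgetMs;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        // ... work under test ...
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(withinBudget(elapsedMs, 500)); // true when perf checks are disabled (default)
    }
}
```

Developers who care about performance could then run with -Derddap.test.perf=true, while a minimal CI virtual machine runs the same tests without brittle timing failures.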
@srstsavage I've uploaded a new version of data.zip. The normal process would be to have a new version of the data, which the download/cache system should update automatically. For right now, though, you'll need to delete your download_cache/data.zip, src/test/resources/data, and target/test-classes/data files.

ERDDAP assumes Linux machines have csh installed. If you do not, you likely have a few errors due to that being missing. And I disabled a couple of tests for now due to different behavior on Linux/Windows, which I'll need to investigate more. This gets all of the tests passing in my virtual Linux machine (I talked to IT and got VirtualBox installed).
Recent pom changes mean the Java files aren't copied over to the build outputs.
… there are discrepancies in how OSes handle that field; different files can be selected for the attributes of a dataset. Update some tests that were running into this by pulling some attributes from the loaded data instead of fully hardcoding them.
…ddap into test_migrations
@srstsavage I marked this as ready for review.

While debugging some of the differences I had between my Windows and Linux (VirtualBox) machines, I realized that because of how many of the tests are constructed, there are quite possibly errors for others. It comes down to a lot of the tests being based on comparing the dataset's attributes to a hardcoded set of values. However, the attributes for a dataset are in many cases based on which file has the first or last (depending on settings) file modification timestamp. On a Windows machine, a copied file (or a file extracted from a zip) will retain its original modification timestamp (but have a creation timestamp from when it was copied/extracted). On Linux (possibly distribution dependent), when Java asks for the last modified timestamp of a file, the OS-provided value will not be earlier than the file creation. What this means is that which file is used for the metadata (and therefore the attributes the test is comparing against) can be effectively random, because it's now based on the order the files were extracted from the zip.

It would likely be good to move away from tests that can break if files are extracted from the zip in a different order, but that is outside the scope of this pull. I'm also open to changes to how dataset metadata is extracted from files if that's appropriate (but also outside the scope of this change).

So that's a long way of saying: I think this pull is good enough. If there are tests that are failing on a clean build on other machines, I'll fix them, but I don't currently know of any.
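The timestamps in question can be inspected with java.nio.file, as in the sketch below. The class name and temp-file prefix are illustrative; note also that filesystems without creation-time support typically report the last-modified time for creationTime.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.BasicFileAttributes;

// Illustrative sketch of reading the two timestamps whose interaction is
// described above: Windows can keep a copied file's older modified time,
// while Linux won't report a modified time earlier than creation.
public class TimestampDemo {
    static long modifiedMinusCreatedMillis() {
        try {
            Path p = Files.createTempFile("erddap-test", ".nc");
            BasicFileAttributes attrs = Files.readAttributes(p, BasicFileAttributes.class);
            long diff = attrs.lastModifiedTime().toMillis() - attrs.creationTime().toMillis();
            Files.delete(p);
            return diff;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // For a freshly created file the two timestamps are nearly equal;
        // the platform differences only show up for copied/extracted files.
        System.out.println("modified - created (ms): " + modifiedMinusCreatedMillis());
    }
}
```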
Ah thanks, that makes sense about the zip extraction. I'll give this another full test run and review tonight.
Added another small PR here to re-add the maven-antrun-plugin step. I still have ~10 failing tests, but I believe they're mostly due to the indeterminate zip extraction issue you mentioned. I agree that the few remaining failures don't need to get sorted out here and can be addressed in more targeted PRs.
…-portable performance tests (#8) * re-add maven-antrun-plugin creation of jetty test data directory * Comment out additional non-portable performance tests
Description
This migrates tests to JUnit along with updates to file paths to make them runnable with minimal setup for a new developer.
This also loads test resources through Maven by downloading from a GitHub release rather than including the files in the repo. Some files that were previously included have been removed. The reasoning is that a lot of the test data files are >10 MB, and in total the test data is several GB. While downloading the files isn't great, I think it's a better solution than including them in the main repo.
About 400 tests will now run with mvn test. On my computer(s) the first run takes a while, but once things are cached appropriately, future runs are around 30 minutes.