Revision of TestFiles #171

Open
Jean1995 opened this issue May 18, 2021 · 0 comments
Now that we have successfully moved from Travis to GitHub Actions (#155), another important step will be to revise the UnitTests themselves.

One main issue is that some UnitTests are not checked by the CI at all due to the long runtime of the current tests (these are the main CrossSection tests: EPair, Brems, Photonuclear, and MuPair). I recently tried to run them manually and found that one of them (Brems) is actually failing on the current master. Furthermore, I noticed that the python scripts that generate the files for the reproducibility tests (/tests/gen_testfiles_scripts) are not fully working yet.

Rather than applying some quick and dirty fixes, I think it would be more appropriate to take some time to properly revise the UnitTests (maybe a "focus week" in the foreseeable future).

I would like to gather some ideas about what could and should be improved about the UnitTests. My first thoughts:

  • Not all CrossSection tests run automatically in the CI (see above). At the very least, we should run a limited subset in the CI to avoid such situations. One idea: run all integration tests in the CI (they only take a few minutes compared to the interpolation tests) and at least one interpolation test for the default cross sections.
  • For the CrossSection UnitTests, we conduct reproducibility tests (which is generally ok). However, the interpolation tests are currently also reproducibility tests, i.e. we compare the interpolated results to interpolated results from previous versions saved in txt files. These could easily be turned into genuine UnitTests by comparing the interpolated values to the integrated values. This would also let us check the interpolation over a much larger phase space to find possible interpolation issues.
  • Split between unit and regression tests (#88).
  • In addition to the reproducibility tests, there should be many more physics tests. Ideas include sanity checks (e.g. "does the LPM effect actually suppress the differential cross sections?") or comparisons against theoretical predictions (e.g. dEdx values).
  • Add codacy / codecov to PROPOSAL (#50). This way we can also see what still needs to be tested.
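To illustrate the "compare interpolated values to integrated values" idea, here is a minimal sketch in plain numpy/scipy. The function `dsigma_dv` is a toy stand-in, not a PROPOSAL parametrization; in the real tests the "integrated" and "interpolated" values would come from PROPOSAL's own CrossSection objects instead.

```python
import numpy as np
from scipy import integrate, interpolate

# Toy stand-in for a differential cross section dSigma/dv
# (the real check would query PROPOSAL's cross sections).
def dsigma_dv(energy, v):
    return np.exp(-v * energy / 1e3) / v

# "Integrated" value: direct numerical integration over v.
def sigma_integrated(energy, v_min=1e-3, v_max=1.0):
    result, _ = integrate.quad(lambda v: dsigma_dv(energy, v), v_min, v_max)
    return result

# "Interpolated" value: a spline built on a coarse grid in log10(E).
log_grid = np.linspace(2.0, 6.0, 50)
spline = interpolate.CubicSpline(
    log_grid, [sigma_integrated(10.0**x) for x in log_grid]
)

# Compare interpolation against integration at many off-grid energies,
# covering a much larger phase space than a few saved reference values.
log_queries = np.linspace(2.1, 5.9, 200)
exact = np.array([sigma_integrated(10.0**x) for x in log_queries])
rel_error = np.abs(spline(log_queries) / exact - 1.0)
assert rel_error.max() < 1e-3
```

The point of the sketch is the structure of the test: the tolerance is checked against a freshly integrated value rather than against a txt file from a previous release, so it keeps working when parametrizations legitimately change.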

Please add your comments and further ideas.
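As a sketch of what a physical sanity check could look like, the snippet below asserts that a suppression factor never increases the differential cross section and weakens with decreasing energy. Both the `1/v` shape and the `lpm_suppression` formula are toy assumptions for illustration only, not PROPOSAL's actual LPM implementation.

```python
import numpy as np

# Toy differential cross section without suppression.
def dsigma_dv_no_lpm(v):
    return 1.0 / v

# Crude stand-in for an LPM-like suppression factor (assumption, not
# PROPOSAL's model): stronger suppression at high energy and small v.
def lpm_suppression(energy, v, e_lpm=1e5):
    ratio = v * e_lpm / energy
    return np.sqrt(ratio) if ratio < 1.0 else 1.0

energies = np.logspace(6, 9, 10)   # toy energies well above e_lpm
v_grid = np.logspace(-6, -2, 50)   # small fractional energy losses

for energy in energies:
    for v in v_grid:
        suppressed = dsigma_dv_no_lpm(v) * lpm_suppression(energy, v)
        # Sanity check 1: suppression never enhances the cross section.
        assert suppressed <= dsigma_dv_no_lpm(v)
        # Sanity check 2: suppression is stronger at higher energy.
        assert lpm_suppression(10.0 * energy, v) <= lpm_suppression(energy, v)
```

Checks of this kind are cheap, independent of any saved reference files, and would have a chance of catching sign or ordering bugs that pure reproducibility tests reproduce silently.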
