
Add GridSearchColorPlot tests + minor gridsearch change, part of #308 #1097

Open · wants to merge 5 commits into develop
Conversation

@tktran (Contributor) commented Aug 29, 2020

This PR partially addresses #308.

Changes:

  1. Set up tests/test_gridsearch with basic tests based on examples/pbs929/gridsearch.ipynb.
  2. Made one change to the gridsearch code: the quick method now returns the visualizer instead of None, consistent with the project's other quick methods and so that it works with assert_images_similar().
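The quick-method change follows the usual pattern: build, fit, and finalize the visualizer in one call, then return it so callers and tests can assert on the result. A minimal, self-contained sketch of that pattern (the stand-in class and its signature are illustrative, not the real Yellowbrick API):

```python
class GridSearchColorPlot:
    """Stand-in for the Yellowbrick visualizer; the real API differs."""

    def __init__(self, model=None, **kwargs):
        self.model = model

    def fit(self, X, y=None):
        return self

    def finalize(self):
        pass


def gridsearch_color_plot(model, X, y=None, **kwargs):
    """Quick method: one call builds, fits, and finalizes the visualizer."""
    visualizer = GridSearchColorPlot(model, **kwargs)
    visualizer.fit(X, y)
    visualizer.finalize()
    return visualizer  # previously returned None; returning it enables assertions
```

Returning the visualizer is what lets the new tests pass it straight into the image-similarity assertion.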

Still to do:

  • Documentation, as requested in the issue

CHECKLIST

  • Is the commit message formatted correctly?
  • Have you noted the new functionality/bugfix in the release notes of the next release?
  • Do all of your functions and methods have docstrings?
  • Have you added/updated unit tests where appropriate?
  • Have you updated the baseline images if necessary?
  • Have you run the unit tests using pytest?
  • Is your code style correct (are you using PEP8, pyflakes)?
  • Have you documented your new feature/functionality in the docs?

@tktran (Contributor, Author) commented Oct 11, 2020

The Travis CI builds that use a regular Python distribution pass, but those using Miniconda fail due to failures in the image similarity tests I wrote. I'm not sure why, since the baseline images I committed appear to be in good order. I'd need some help from the core contributors on this.

@bbengfort (Member) commented Oct 12, 2020

@tktran I'd be happy to help with the baseline images issue: it turns out that matplotlib environments on Miniconda and on Windows may render images with minor variations, such as slightly different fonts and antialiasing. When we test images we compute the root mean square (RMS) difference between the generated image and the baseline; if the RMS is 0 the images are exactly alike, and the larger the RMS, the more different the images are. Luckily, we can treat the RMS as a tolerance, giving us some flexibility to test across these slight variations.
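To make the mechanism concrete, here is a minimal sketch of an RMS check with a tolerance (not Yellowbrick's actual helper; the names and signature are illustrative):

```python
import numpy as np


def image_rms(actual, baseline):
    """Root mean square pixel difference: 0.0 means the images are identical."""
    a = np.asarray(actual, dtype=float)
    b = np.asarray(baseline, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))


def check_images_similar(actual, baseline, tol=0.01):
    """Fail only when the RMS difference exceeds the allowed tolerance."""
    rms = image_rms(actual, baseline)
    assert rms <= tol, f"images differ: RMS {rms:.3f} > tol {tol}"


# Identical arrays give RMS 0.0; a small tolerance absorbs minor rendering noise.
check_images_similar(np.zeros((4, 4)), np.zeros((4, 4)), tol=0.5)
```

Raising the tolerance slightly above the observed RMS lets a test accept platform-specific rendering differences while still rejecting genuinely wrong plots.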

In your tests, you can modify the tolerance as follows:

# RMS on Miniconda of 0.987
self.assert_images_similar(viz, tol=0.5, windows_tol=1.25)

Note that you can specify two tolerances, the default tolerance tol and a windows_tol that checks if the OS is windows and if so, uses that tolerance instead. Since your tests are failing on Travis (Linux), you'll want to increase the tol parameter.

Below are the tests, the RMS on miniconda, and my recommended tolerance value. Please do include a comment as in the above example, but feel free to omit windows_tol and just specify the tol.

  • test_gridsearchcolorplot: failing with RMS 0.685; recommend tol=0.95
  • test_quick_method: failing with RMS 0.690; recommend tol=0.95
  • test_pandas_integration: failing with RMS 0.902; recommend tol=1.05
  • test_numpy_integration: failing with RMS 0.902; recommend tol=1.05

I hope that helps get the tests sorted, let me know if you have any questions! Thank you so much for contributing to Yellowbrick!

Also, when the tests pass and you're ready for us to review - please ping us to let us know! I've marked this PR as a draft in the meantime, just for our own organizational purposes.

@bbengfort bbengfort marked this pull request as draft October 12, 2020 15:38
@tktran tktran marked this pull request as ready for review November 1, 2020 06:00
@tktran (Contributor, Author) commented Nov 1, 2020

@bbengfort Hello, thanks for your help; the time you invested explaining this aspect of testing to me won't go to waste. The CI checks have now passed, so I'd say this PR is ready for merging.

@bbengfort (Member) commented
@tktran excellent, thank you so much for your hard work on this! I'll take a look as soon as I can!
