
Increase CPython test coverage #1671

Open
palaviv opened this issue Jan 10, 2020 · 44 comments
Labels
E-help-wanted Extra attention is needed

Comments

@palaviv
Contributor

palaviv commented Jan 10, 2020

We are currently running part of the CPython test suite. This is a good place for people new to the project to contribute. Contributions here can come in two ways:

  • Fix failing tests - Look for tests marked as skipped or expectedFailure and fix RustPython so that the test passes.
  • Add more test suites - Copy a test suite from CPython and mark failing tests with expectedFailure or skip.

To run the test suite, use cargo run -- -m test -v.
All tests are in Lib/test.

Guide: https://rustpython.github.io/blog/2020/04/04/how-to-contribute-by-cpython-unittest.html

@palaviv palaviv added E-help-wanted Extra attention is needed good first issue Good for newcomers labels Jan 10, 2020
@abitrolly

@palaviv is it possible to generate an online dashboard that shows which tests pass and which need more help?

@palaviv
Contributor Author

palaviv commented Jan 25, 2020

I am sure it is possible... Would you like to create such a dashboard? I think you can use testgrid for that.
My recommendation would be to run the test suite as described above and pick any of the expectedFailure or skipped tests marked as a RustPython change. I am not sure a dashboard would be effective here.
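One way to pick a test to work on is to search the test sources for the marker comment. This hypothetical helper (not part of the repo) shows the idea on an inline sample instead of real files:

```python
# Hypothetical helper: find tests already marked with the
# "TODO: RUSTPYTHON" convention so you can pick one to fix.
MARKER = "TODO: RUSTPYTHON"

def find_marked_tests(source: str):
    """Return (line_number, line) pairs for lines mentioning the marker."""
    return [(i, line.strip())
            for i, line in enumerate(source.splitlines(), start=1)
            if MARKER in line]

sample = '''\
class TestInt(unittest.TestCase):
    # TODO: RUSTPYTHON
    @unittest.expectedFailure
    def test_underscores(self):
        ...
'''
print(find_marked_tests(sample))  # [(2, '# TODO: RUSTPYTHON')]
```

In practice a plain `grep -rn "TODO: RUSTPYTHON" Lib/test` does the same job.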

@abitrolly

@palaviv is it possible to get the report in a file in machine readable format, such as https://testanything.org ?

@palaviv
Contributor Author

palaviv commented Jan 26, 2020

The tests use test/libregrtest and run the underlying unittest tests. I am not sure there is built-in support in either for the format you want...

@mireille-raad
Member

Would it help to have the dashboard somewhere on the current website?

If you are looking for a dashboard, we can do two pages:

  • one for tests passed/failed status
  • another for the "what is left" script.

The goal would be for people to know what they can help on in one place.

I can probably get the output of both scripts pushed to a GitHub repo and then use that as part of a Jekyll page, or just as something people can read.

@coolreader18
Member

I think that would be cool, and it would probably require getting the test results in a machine readable format like TAP. It seems like tappy works perfectly fine on RustPython, the only issue is getting regrtest to run a custom unittest test runner.
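For readers unfamiliar with TAP, its text form is simple enough to sketch in a few lines. The result list below is invented for illustration; it is not what regrtest or tappy actually emits:

```python
# Minimal sketch of the TAP (Test Anything Protocol) text format:
# a plan line "1..N" followed by one "ok"/"not ok" line per test.
def to_tap(results):
    """results: iterable of (test_name, passed) pairs -> TAP text."""
    results = list(results)
    lines = [f"1..{len(results)}"]
    for i, (name, passed) in enumerate(results, start=1):
        status = "ok" if passed else "not ok"
        lines.append(f"{status} {i} - {name}")
    return "\n".join(lines)

print(to_tap([("test_add", True), ("test_sub", False)]))
# 1..2
# ok 1 - test_add
# not ok 2 - test_sub
```

Because the format is line-oriented plain text, generic TAP consumers can render dashboards from it without knowing anything about the producer.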

@abitrolly

@mireille-raad would it be possible to have a single dashboard for both passed/failed and "what is left" by color-coding them green, red, and yellow?

@coolreader18
Member

I don't know if it makes sense to have them in the same dashboard, since the regrtests are tests that get run and are well structured, while the whats_left script just shows standard library items that aren't present in RustPython; they don't really show the same kind of data.

@abitrolly

@coolreader18 regrtests and standard library items are different test sets?

@coolreader18
Member

Well yeah, sort of; all whats_left.sh does is diff the list of standard library modules and the dir() outputs of them, and see what CPython has that RustPython doesn't. It doesn't actually run any code; you could "fix" something to remove it from the whats_left list by just doing def missingthing(): raise NotImplementedError.
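The diff described here can be sketched with set operations. The name lists below are stand-ins for real dir() output, invented for illustration:

```python
# Rough sketch of the comparison whats_left.sh performs: diff the names a
# CPython module exposes against what another implementation exposes.
cpython_names = {"sin", "cos", "tan", "tau"}   # pretend dir(math) on CPython
rustpython_names = {"sin", "cos"}              # pretend dir(math) on RustPython

missing = sorted(cpython_names - rustpython_names)
print(missing)  # ['tan', 'tau']
```

This is why a stub that merely raises NotImplementedError "fixes" an entry: the name exists, so the set difference no longer reports it, even though no behavior was implemented.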

@abitrolly

@coolreader18 but do test names match between what is available in RustPython and upstream? If they do, then it might be possible to get the test results from both test servers in TAP format and diff them.

@coolreader18
Member

That's not really necessary, since every test that's in CPython is passing (or should be, at least), so we can assume that any test that fails on RustPython should be passing, no diffing required.

@coolreader18
Member

whats_left.sh isn't for regrtests, it's only for the standard library.

@abitrolly

@coolreader18 there are likely platform-dependent tests that are skipped in CPython. Therefore the system configuration of the test environments for CPython and RustPython should be the same.

@coolreader18
Member

Ah, I see what you mean. My idea for that was to check if the skip message contains the string "RUSTPYTHON", since so far (just by convention) all the tests that we've marked as skip do.
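That filtering idea can be sketched as a tiny function. The (test name, skip reason) pairs are invented for illustration:

```python
# Sketch: given (test_name, skip_reason) pairs, keep only the skips that
# were added for RustPython, identified by the "RUSTPYTHON" marker string.
def rustpython_skips(skipped):
    return [name for name, reason in skipped if "RUSTPYTHON" in reason]

skipped = [
    ("test_fork", "not supported on this platform"),  # genuine platform skip
    ("test_format", "TODO: RUSTPYTHON; crashes"),     # our marker
]
print(rustpython_skips(skipped))  # ['test_format']
```

Relying on a string convention is fragile, but as long as every RustPython-specific skip includes the marker it cleanly separates them from genuine platform skips.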

@mireille-raad
Member

mireille-raad commented Mar 27, 2020

@abitrolly do you think you can figure out how to generate one or a couple of machine-readable files for the test results? (my preference would be json, but i can manage with xml or just plain text)... it looks like you have a specific idea of what needs to be done. i can then put them together on a page.

@coolreader18
Member

I've created a PR for JSON output for CPython tests, #1834; you just need to run tests/jsontests.py with rustpython and it'll output all the results in a JSON file.
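The actual schema produced by jsontests.py isn't shown in this thread, so the structure below is invented purely to illustrate consuming such a file for a dashboard:

```python
# Hypothetical shape for per-suite results; the real file from #1834
# may differ. This just shows aggregating counts from JSON.
import json

raw = ('{"test_builtin": {"passed": 12, "failed": 1, "skipped": 3},'
       ' "test_math": {"passed": 40, "failed": 0, "skipped": 2}}')
results = json.loads(raw)

totals = {"passed": 0, "failed": 0, "skipped": 0}
for counts in results.values():
    for key in totals:
        totals[key] += counts[key]
print(totals)  # {'passed': 52, 'failed': 1, 'skipped': 5}
```

A static-site generator like Jekyll can read such a file from _data/ and render the counts without any server-side code.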

@mireille-raad
Member

yay... i will take a look at the output, do a sketch. share here. get your feedback on gitter. if things look ok, we should be ready to roll soon!

@abitrolly

I am excited to see what the JSON output looks like as well. I need to figure out how to provide it as part of the CI build results.

@mireille-raad
Member

@coolreader18 can you automate running the tests and putting the .json file in the other repo https://github.com/RustPython/rustpython.github.io? under the _data/ directory?
https://github.com/RustPython/rustpython.github.io/tree/master/_data

@coolreader18
Member

Yeah, I think that would be possible, maybe running it once a week would be good?

@coolreader18
Member

coolreader18 commented Apr 13, 2020

@mireille-raad I've set up a cron job in #1866, and already tested it a few times, so the results file is already pushed to the git repo.

@mireille-raad
Member

thank you! I saw it yesterday night and already created a draft page... I'll use the other repo for discussion on design, page layout and such.

@abitrolly

@coolreader18 very nice. ) It would be easier to review changes in test coverage if the produced cpython_tests_results.json were indented.

@mireille-raad
Member

@abitrolly for readability, you can always format the document in your code editor

or, soon enough, go to the web page on rustpython.github.io ... adding indentation and spacing will make the file even bigger. usually you compress json files, not inflate them.
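The size trade-off being discussed is easy to demonstrate with the standard json module (the sample data is made up):

```python
# Pretty-printing adds whitespace on every line; compact separators
# strip even the default spaces after ',' and ':'.
import json

data = {"tests": [{"name": "test_a", "ok": True},
                  {"name": "test_b", "ok": False}]}

compact = json.dumps(data, separators=(",", ":"))
pretty = json.dumps(data, indent=2)

print(len(compact) < len(pretty))  # True
```

A middle ground for reviewable diffs is to keep the committed file indented and serve a compacted copy, since gzip on the web server removes most of the whitespace overhead anyway.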

@abitrolly

Maybe https://rustpython.github.io/pages/regression-tests-results.html output can be improved to include not-implemented tests as a separate color.

@jirheee
Contributor

jirheee commented Jul 22, 2022

Thanks! I will look at the guides and try to fix some :)

dvermd added a commit to dvermd/RustPython that referenced this issue Oct 17, 2022
dvermd added a commit to dvermd/RustPython that referenced this issue Oct 18, 2022
dvermd added a commit to dvermd/RustPython that referenced this issue Oct 24, 2022
youknowone pushed a commit to dvermd/RustPython that referenced this issue Oct 26, 2022
dvermd added a commit to dvermd/RustPython that referenced this issue Oct 26, 2022
@youknowone youknowone removed the good first issue Good for newcomers label Feb 24, 2023
@youknowone
Member

I removed the good first issue label because it is not as trivially easy as it was back in 2020.
But it still has many easy issues!

@youknowone youknowone pinned this issue Feb 24, 2023
@DimitrisJim
Member

@youknowone can't this be closed in favor of #4564? They overlap, but #4564 is more organized and up to date.

@youknowone
Member

I removed adding tests from the issue body but kept fixing errors. Does that make sense? We could open a new, more organized one using the dashboard.


10 participants