Conversation
crusaderky left a comment:
Shouldn't this be matched by a bunch of lines disappearing from torch-xfails.txt? And if not, does it mean this change is untested?
|
Oh yeah, running the tests locally with torch 2.5.1 generates 78 new failures (below the fold); torch 2.6 has 73. Some are in the test suite, and some are torch limitations. So it is tested, if indirectly; just not on CI. Which means we might want to revert gh-244 for the next release and lump it together with this one and the necessary fixes to the test suite. Just adding ~70 xfails is a bit extreme, at least without a deeper investigation.
|
Please don't revert #244; array-api-extra needs it for its tests to pass. #244 shows that basic functionality for these dtypes does work. I suspect scipy also benefits from it (see scipy/scipy#22584). This definitely highlights the current state of affairs for uint types in torch. Without this PR I was under the false impression that everything worked now.
This reverts commit c787bea. PyTorch support for unsigned ints is patchy, as of pytorch 2.6 at least. See data-apis#253 (comment) for a discussion.
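For a quick sense of what "patchy" means in practice, here is a hypothetical probe (not from this PR or the commit) that tries a few basic ops on each unsigned dtype and reports which ones raise. The ops chosen are illustrative; the exact failure set depends on the torch version.

```python
# Hypothetical probe, not from this PR: try a few basic ops on each
# unsigned integer dtype and report which ones raise.
# Requires torch >= 2.3 for the uint16/uint32/uint64 dtypes to exist;
# results vary by torch version.
import torch

for dtype in (torch.uint8, torch.uint16, torch.uint32, torch.uint64):
    ops = {
        "tensor": lambda dt: torch.tensor([1, 2, 3], dtype=dt),
        "add": lambda dt: torch.tensor([1], dtype=dt) + torch.tensor([2], dtype=dt),
        "sum": lambda dt: torch.tensor([1, 2], dtype=dt).sum(),
        "sort": lambda dt: torch.sort(torch.tensor([2, 1], dtype=dt)),
    }
    for name, op in ops.items():
        try:
            op(dtype)
            status = "ok"
        except Exception as exc:
            status = f"raises {type(exc).__name__}"
        print(f"{dtype} {name}: {status}")
```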
|
Ah, I see that these tests run on hypothesis, which means that if you xfail them for uint you are suppressing any failures on signed ints too, which is much worse. I would suggest leaving the env variable where it is and adding a comment to the xfails file to point out the elephant in the room. With that I'm happy to merge this PR.
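As a hypothetical sketch of the problem (not a test from the suite): hypothesis draws the dtype inside the test body, so a single xfail mark covers every drawn dtype at once, signed ints included.

```python
# Hypothetical sketch: the dtype is drawn *inside* the test, so
# xfailing the test because of uint16 also silences genuine
# failures on the signed dtypes.
import pytest
import torch
from hypothesis import given
from hypothesis import strategies as st

DTYPES = [torch.int8, torch.int16, torch.uint8, torch.uint16]

@pytest.mark.xfail(reason="uint16 support in torch is patchy")  # too coarse!
@given(dtype=st.sampled_from(DTYPES))
def test_add(dtype):
    x = torch.tensor([1, 2, 3], dtype=dtype)
    assert (x + x).dtype == dtype
```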
|
Okay, the fact that the scipy tests pass after scipy/scipy#22584 is reassuring.
That makes it two of us under that impression. To make it more visible, this PR now runs the uint tests on CI: https://github.com/data-apis/array-api-compat/actions/runs/13540575384/job/37840526473?pr=253
It's not only that it's too much. The problem AFAICS is that it's currently not easy to xfail a test with, say, uint16 while keeping it running for the other dtypes.
It would require moving the dtype out of hypothesis and into pytest.mark.parametrize, which to me feels like a lot of work if it's just for pytorch.
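A hedged sketch of what that rework could look like, with illustrative names: once the dtype is a pytest parameter rather than a hypothesis draw, each dtype can carry its own xfail mark while the others keep running.

```python
# Sketch only: per-dtype xfail marks become possible once the dtype
# is a pytest parameter instead of a hypothesis draw.
import pytest
import torch

def maybe_xfail(dt):
    # Hypothetical helper: mark the patchy unsigned dtypes as xfail.
    if dt in (torch.uint16, torch.uint32, torch.uint64):
        return pytest.param(dt, marks=pytest.mark.xfail(reason="patchy torch uint support"))
    return pytest.param(dt)

@pytest.mark.parametrize(
    "dtype",
    [maybe_xfail(dt) for dt in (torch.int16, torch.uint8, torch.uint16, torch.uint32)],
)
def test_add(dtype):
    x = torch.tensor([1, 2, 3], dtype=dtype)
    assert (x + x).dtype == dtype
```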
|
For completeness, cross-ref the PyTorch issue: pytorch/pytorch#58734
Keeping gh-244 and keeping this skip in CI seems like an acceptable, low-effort way to go at first sight.
|
Superseded by #298; you may close this PR
|
agreed, we don't need two unmergeable PRs :-).
EDIT: this PR is not mergeable currently. Running the torch tests with uints results in ~70 failures as of pytorch 2.6.0; see the CI logs below (and/or the fold-out in #253 (comment)). We have to wait for pytorch itself before proceeding with the array-api-compat.torch wrappers.