Conversation
Codecov Report: ✅ All modified and coverable lines are covered by tests.
```
@@            Coverage Diff             @@
##             main    #1016       +/-   ##
===========================================
- Coverage   66.63%   24.20%   -42.44%
===========================================
  Files          40       41        +1
  Lines        6060     6053        -7
  Branches     1015     1012        -3
===========================================
- Hits         4038     1465     -2573
- Misses       1662     4571     +2909
+ Partials      360       17      -343
```
No changes in benchmarks. Comparison: https://github.com/scverse/squidpy/compare/662165070dcc38b424c9feb077c22fce69599c51..98b3deca6c60fc8df93b987345da8750ab7cb275 More details: https://github.com/scverse/squidpy/pull/1016/checks?check_run_id=49604095778
OK, silly me, it was RIGHT THERE in the “Usage” part of the readme: https://github.com/scverse/benchmark#usage I improved that bit a little, but we should all have seen it immediately.
force-pushed from 4cb1e34 to 1771711
force-pushed from 1771711 to 9c1b472
I tried a lot of asv environment creation backends, but in the end, I just went with … One issue with …; that being said, I think you can use:

```json
{
    ...,
    "install_command": ["python -m uv pip install {r_numpy} {wheel_file}"],
    "matrix": {
        "env": {
            "R_NUMPY": ["numpy<2", "numpy>=2"]
        }
    }
}
```
I made an issue about this: airspeed-velocity/asv#1542
This PR adds benchmarks similar to the ones in https://github.com/scverse/scanpy/blob/main/benchmarks/benchmarks.
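For readers unfamiliar with the referenced scanpy benchmarks: asv discovers plain Python classes whose methods start with prefixes like `time_` or `peakmem_`, runs `setup()` before each, and sweeps over `params`. The sketch below uses a hypothetical `PairwiseSuite` with made-up workload names; it is not the actual benchmark code from this PR, just a minimal illustration of the structure asv expects.

```python
import numpy as np


class PairwiseSuite:
    """Hypothetical asv benchmark suite (illustrative names only)."""

    # asv runs each benchmark once per entry in `params`
    params = [100, 1000]
    param_names = ["n_obs"]

    def setup(self, n_obs):
        # untimed setup: build a random coordinate matrix
        rng = np.random.default_rng(0)
        self.coords = rng.random((n_obs, 2))

    def time_pairwise_distances(self, n_obs):
        # timed body: naive all-pairs Euclidean distances
        diff = self.coords[:, None, :] - self.coords[None, :, :]
        np.sqrt((diff ** 2).sum(axis=-1))

    def peakmem_pairwise_distances(self, n_obs):
        # same body, but asv records peak memory instead of wall time
        diff = self.coords[:, None, :] - self.coords[None, :, :]
        np.sqrt((diff ** 2).sum(axis=-1))
```

With a file like this under `benchmarks/`, `asv run` (or `asv dev` for a quick check) picks the suite up automatically; no registration step is needed.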