Conversation
**Codecov Report**

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           master     #471      +/-   ##
==========================================
+ Coverage   63.85%   64.09%   +0.24%
==========================================
  Files          40       40
  Lines        3591     3724     +133
  Branches      774      790      +16
==========================================
+ Hits         2293     2387      +94
- Misses        772      800      +28
- Partials      526      537      +11
```

☔ View full report in Codecov by Sentry.
Branch force-pushed from 61f4aa8 to 5ec5337
Branch force-pushed from 9265999 to 99fa170
Branch force-pushed from f649e7f to d0e1548
Branch force-pushed from 9b797e7 to bc6e897
`workflow_dispatch` allows CodSpeed to trigger backtest performance analysis in order to generate initial data. See also: https://docs.codspeed.io/ci/github-actions#2-create-the-benchmarks-workflow
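A minimal benchmarks workflow in that shape might look like the sketch below. This follows the pattern in the CodSpeed docs, but the action version, Python version, and install command are assumptions, not taken from this PR:

```yaml
name: Benchmarks

on:
  push:
    branches: [master]
  pull_request:
  # workflow_dispatch lets CodSpeed trigger backtest runs to generate initial data.
  workflow_dispatch:

jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install . pytest pytest-codspeed
      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          run: pytest tests/ --codspeed
          token: ${{ secrets.CODSPEED_TOKEN }}
```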
```
tests/sync/test_git.py:11: error: Skipping analyzing "pytest_codspeed.plugin": module is installed, but missing library stubs or py.typed marker  [import-untyped]
tests/sync/test_git.py:11: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
```
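One common way to silence this class of error (an assumption about this repo's setup, not necessarily the fix used here) is a per-module mypy override in `pyproject.toml`:

```toml
# Suppress "missing library stubs or py.typed marker" for pytest_codspeed only,
# rather than disabling import checking globally.
[[tool.mypy.overrides]]
module = "pytest_codspeed.*"
ignore_missing_imports = true
```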
tony added a commit that referenced this pull request on Oct 12, 2024
# Problem

Git, Mercurial, and Subversion repositories are unnecessarily reinitialized for each test.

- We're not utilizing session-based scoping.
- A single initial repo could be created, then copied to [`tmp_path`](https://docs.pytest.org/en/8.3.x/how-to/tmp_path.html#the-tmp-path-fixture) using [`shutil.copytree`](https://docs.python.org/3/library/shutil.html#shutil.copytree) ([source](https://github.com/python/cpython/blob/v3.13.0/Lib/shutil.py#L550-L605)).

Issue #471 highlighted this inefficiency, where benchmarks showed tens of thousands of redundant function calls.

# Improvement

```
❯ hyperfine -L branch master,pytest-plugin-fixture-caching 'git checkout {branch} && py.test'
Benchmark 1: git checkout master && py.test
  Time (mean ± σ):     32.062 s ±  0.869 s    [User: 41.391 s, System: 9.931 s]
  Range (min … max):   30.878 s … 33.583 s    10 runs

Benchmark 2: git checkout pytest-plugin-fixture-caching && py.test
  Time (mean ± σ):     14.659 s ±  0.495 s    [User: 16.351 s, System: 4.433 s]
  Range (min … max):   13.990 s … 15.423 s    10 runs

Summary
  git checkout pytest-plugin-fixture-caching && py.test ran
    2.19 ± 0.09 times faster than git checkout master && py.test
```

# Changes

## Pytest fixtures overhaul

1. Create a base VCS repo.
2. For subsequent tests, copy and modify from this template.
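The two steps above could be sketched as fixtures along these lines. Names such as `git_repo_template` and the `copy_repo` helper are illustrative, not the PR's actual code:

```python
import pathlib
import shutil

import pytest


def copy_repo(template: pathlib.Path, dest: pathlib.Path) -> pathlib.Path:
    """Copy a pre-built repository template instead of re-initializing one."""
    shutil.copytree(template, dest)
    return dest


@pytest.fixture(scope="session")
def git_repo_template(tmp_path_factory: pytest.TempPathFactory) -> pathlib.Path:
    # Built once per session; the expensive `git init` + initial commits
    # would happen here (omitted in this sketch).
    template = tmp_path_factory.mktemp("git-template")
    return template


@pytest.fixture
def git_repo(git_repo_template: pathlib.Path, tmp_path: pathlib.Path) -> pathlib.Path:
    # Each test receives a cheap, isolated copy under its own tmp_path,
    # so tests can mutate the repo without touching the shared template.
    return copy_repo(git_repo_template, tmp_path / "repo")
```

Because `copytree` is a plain filesystem copy, it avoids re-running VCS initialization per test while preserving test isolation.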
Branch force-pushed from 7f69eab to c480bb4
Member, Author:

@sourcery-ai review
**Reviewer's Guide by Sourcery**

This pull request introduces performance benchmarks using CodSpeed. It sets up the CodSpeed configuration and adds the benchmarks workflow.

Sequence diagram for benchmark execution flow:

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant CI as CI Pipeline
    participant CS as CodSpeed
    Dev->>CI: Push code changes
    activate CI
    CI->>CI: Run tests with pytest
    CI->>CI: Execute benchmarks
    CI->>CS: Send benchmark results
    activate CS
    CS->>CS: Analyze performance
    CS-->>Dev: Report performance changes
    deactivate CS
    deactivate CI
```
Hey @tony - I've reviewed your changes - here's some feedback:
Overall Comments:
- Remember to add the `benchmark` annotation/fixture as noted in your TODO. This is needed for proper benchmark function identification.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
Note: `benchmark` (`BenchmarkFixture`)

# Problem

It's difficult to catch performance degradation or improvements over time, in a PR, etc.

# Changes

Add performance benchmarks:

- TBD
- Setup CodSpeed: configure on website, set secret, etc.
- py(deps[test]): Add `pytest-codspeed`

See also:

# Summary by Sourcery

Add performance benchmarks using CodSpeed. Integrate CodSpeed into the CI workflow to automatically run performance tests and report results.

- CI:
- Tests: Add `pytest-codspeed` to enable performance testing.
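As a sketch of what a measured test might look like with `pytest-codspeed` (the `fibonacci` example is illustrative, not from this repo; `pytest-codspeed` also accepts a pytest-benchmark-style `benchmark` fixture):

```python
import pytest


def fibonacci(n: int) -> int:
    """Naive recursive Fibonacci; deliberately slow so there is work to measure."""
    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)


@pytest.mark.benchmark  # picked up by pytest-codspeed when run with --codspeed
def test_fibonacci() -> None:
    assert fibonacci(10) == 55
```

Without the `--codspeed` flag, the test runs as an ordinary assertion; with it, the marked function is instrumented and its timing reported to CodSpeed.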