Right now QA-Board focuses on algorithm engineering. Another big area is software performance.
How do people track software performance?
Unit tests are not enough to judge software performance. Some organizations:
- track their test suite runtime over time. It gives a trend, but comparisons are hard because the tests keep changing.
- use acceptance tests that check runtime/memory thresholds, and monitor regressions (a minimal sketch follows this list).
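For illustration, here is a minimal sketch of such threshold acceptance tests, written with pytest (an assumption; any test runner works). The function under test and both budgets are hypothetical placeholders:

```python
# Threshold acceptance tests: fail the build on runtime/memory regressions.
# `process` and both budgets are hypothetical; substitute your own entrypoint.
import time
import tracemalloc


def process(data):
    return sorted(data)  # stand-in for the code under test


def test_runtime_under_budget():
    data = list(range(100_000, 0, -1))
    start = time.perf_counter()
    process(data)
    elapsed = time.perf_counter() - start
    # Fails the build if the wall time regresses past the budget.
    assert elapsed < 0.5, f"runtime regression: {elapsed:.3f}s > 0.5s budget"


def test_peak_memory_under_budget():
    data = list(range(100_000))
    tracemalloc.start()
    process(data)
    _, peak = tracemalloc.get_traced_memory()  # (current, peak) in bytes
    tracemalloc.stop()
    # Peak allocation budget (illustrative).
    assert peak < 16 * 1024 * 1024, f"memory regression: {peak} bytes peak"
```

The thresholds are brittle across machines, which is exactly why tracking the raw numbers per commit, rather than a pass/fail bit, is attractive.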
On the ops side, if we're talking about applications/services:
- there are many great products: monitoring like Datadog/New Relic, crash analytics like Sentry...
- smart monitoring solutions correlate anomalies with commits and feature flags.
- the "future" is likely tooling based on canary deploys to identify perf regressions on real workflows.
For libraries or products used as dependencies by others, it's not possible to set up those tools. Could QA-Board help "shift left" and identify issues before releases?
Development workflows for performance engineering
- Engineers doing optimization have a hard time keeping track of all their versions and microbenchmarks. The tooling is focused on the live experience (debugger-like tools, inspecting the assembly) and investigates one version at a time.
- To keep track, the best tool I've seen for identifying issues ahead of time and helping during coding is https://perf.rust-lang.org
Software engineers have the same need for "run tracking" as algorithm engineers.
Features needed
- Examples of integrations with tools such as `perf`.
- Visualizations:
  - Brendan Gregg's flame graphs, and the diff version
  - Charts from a benchmark tool (like `hyperfine`); see the sketch after this list
  - Comparisons of generated code, like Godbolt
- Examples of visualizations of metrics like binary size, IPC, time, page faults, gas...
- We could add anomaly detection on top to warn about regressions early.
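As a concrete starting point, a sketch of what a `hyperfine` + `perf` integration could look like: run the benchmark, collect wall time, hardware counters, and binary size, and write them to a `metrics.json` for the run tracker to ingest. The binary path, event list, and output layout are assumptions for illustration, not QA-Board's actual API:

```python
# Sketch: collect per-run software-performance metrics and emit metrics.json.
# BINARY and OUTPUT_DIR are hypothetical; adapt to your own run wrapper.
import json
import os
import subprocess

BINARY = "./my_binary"   # hypothetical benchmark target
OUTPUT_DIR = "output"    # wherever this run's artifacts go


def hyperfine_timings(cmd):
    """Run hyperfine and return (mean, stddev) wall time in seconds."""
    subprocess.run(
        ["hyperfine", "--warmup", "3", "--export-json", "hyperfine.json", cmd],
        check=True,
    )
    with open("hyperfine.json") as f:
        result = json.load(f)["results"][0]
    return result["mean"], result["stddev"]


def perf_counters(cmd):
    """Run `perf stat` in CSV mode (-x,) and return an {event: value} dict."""
    subprocess.run(
        ["perf", "stat", "-x,", "-o", "perf.csv",
         "-e", "instructions,cycles,page-faults", "--"] + cmd.split(),
        check=True,
    )
    counters = {}
    with open("perf.csv") as f:
        for line in f:
            fields = line.strip().split(",")
            # Skip comments, blanks, and "<not counted>" entries; note that
            # event names may carry modifiers (e.g. "instructions:u").
            if len(fields) >= 3 and fields[0].replace(".", "").isdigit():
                counters[fields[2]] = float(fields[0])
    return counters


def main():
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    mean, stddev = hyperfine_timings(BINARY)
    counters = perf_counters(BINARY)
    metrics = {
        "wall_time_mean_s": mean,
        "wall_time_stddev_s": stddev,
        "binary_size_bytes": os.path.getsize(BINARY),
        "page_faults": counters.get("page-faults"),
        # IPC derived from the raw counters.
        "ipc": counters.get("instructions", 0) / counters.get("cycles", 1),
    }
    with open(os.path.join(OUTPUT_DIR, "metrics.json"), "w") as f:
        json.dump(metrics, f, indent=2)


if __name__ == "__main__":
    main()
```

Stored per commit, these numbers are exactly the time series that anomaly detection and perf.rust-lang.org-style dashboards need.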
Reference: perf/profiling tools
- gperftools
- pprof
- flame graph viewers: flamescope, speedscope, and flamebearer
- List of performance analysis tools
- servo benchmarking (screenshot)