---
name: writing-python-sdk-spector-mock-api-tests
description: Writes TypeSpec http-client-python generator mock API tests (azure/unbranded/generic) from a Spector case. Use when given a Spector case link or a PR link that modifies Spector cases under http-specs/azure-http-specs.
---

# Writing Python SDK tests from a Spector case

## Inputs

You may receive either:

- **Spector case link** (preferred): link to a specific scenario/case under:
  - `https://github.com/microsoft/typespec/tree/main/packages/http-specs/specs/...`
  - `https://github.com/Azure/typespec-azure/tree/main/packages/azure-http-specs/specs/...`
- **PR link**: a GitHub PR that changes one or more Spector cases.

Spector cases define the **expected request + response**. The goal is to add or extend Python tests that validate that the generated SDK behaves accordingly.

## Output

A Python pytest test (sync) added to one of:

- `packages/http-client-python/generator/test/azure/mock_api_tests`
- `packages/http-client-python/generator/test/unbranded/mock_api_tests`
- `packages/http-client-python/generator/test/generic_mock_api_tests`

And a corresponding async pytest test added under the matching `asynctests/` folder:

- `packages/http-client-python/generator/test/azure/mock_api_tests/asynctests`
- `packages/http-client-python/generator/test/unbranded/mock_api_tests/asynctests`
- `packages/http-client-python/generator/test/generic_mock_api_tests/asynctests`

## Workflow (copy as checklist)

Test-writing progress:

- [ ] Identify the Spector case link (directly, or extracted from the PR)
- [ ] Decide the destination folder(s): azure vs unbranded vs generic
- [ ] Find an existing test file to extend (or create a new one)
- [ ] Implement sync + async test(s) that match the case’s request/response expectations
- [ ] Update test requirements only if a new dependency is introduced
- [ ] Format changed Python files with Black (`python -m black <paths> -l 120`)

## Step 1 — Identify the Spector case link

### If input is a Spector case link

Use it directly.

### If input is a PR link

1. List the changed files.
2. From the changed files, pick the ones under (a small filtering sketch follows this list):
   - `packages/http-specs/specs/` (microsoft/typespec)
   - `packages/azure-http-specs/specs/` (Azure/typespec-azure)
3. Extract the specific case/scenario path(s) to target.
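
As a rough illustration of step 2, the check is a simple prefix filter. A minimal sketch (the helper name and the `changed_files` input are hypothetical placeholders for whatever list of changed paths the PR reports):

```python
# Illustrative only: keep the PR's changed files that live under the Spector spec roots.
SPEC_ROOTS = ("packages/http-specs/specs/", "packages/azure-http-specs/specs/")


def spector_case_paths(changed_files: list[str]) -> list[str]:
    """Return only the changed paths that point at Spector cases (forward-slash paths)."""
    return [path for path in changed_files if path.startswith(SPEC_ROOTS)]
```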

## Step 2 — Choose where to put the test

### Rule A: Spector case in Azure/typespec-azure

Write the Python test in:

- `packages/http-client-python/generator/test/azure/mock_api_tests`

### Rule B: Spector case in microsoft/typespec

You may need either:

- **Option (a) generic only**: `.../generic_mock_api_tests`
- **Option (b) both flavors**: `.../azure/mock_api_tests` AND `.../unbranded/mock_api_tests`

Decide with this concrete check:

1. Locate the generated Python package/module for the scenario in BOTH:
   - `packages/http-client-python/generator/test/azure/generated`
   - `packages/http-client-python/generator/test/unbranded/generated`
2. If the import root and the client/model API surface you need are the same (same module path + same client entrypoint), write ONE shared test in `generic_mock_api_tests`.
3. If the import paths differ (or one flavor lacks the needed client), write separate tests under both `azure/mock_api_tests` and `unbranded/mock_api_tests`.

Why: both the azure and unbranded tox runs include `../generic_mock_api_tests`, so shared tests are preferred when they can import the same generated package.
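
As a hedged illustration of the check (every module and client name below is hypothetical; verify the real paths against the two `generated` folders):

```python
# Case 2 - both flavors expose the same import root: write ONE shared test in
# generic_mock_api_tests that imports the common package directly.
from shared.feature import SharedFeatureClient, models  # hypothetical shared module path

# Case 3 - the flavors expose different import roots (or only one flavor has the client):
# write separate tests under azure/mock_api_tests and unbranded/mock_api_tests.
from specs.some.feature import models as azure_models  # hypothetical azure-only path
from some.feature import models as unbranded_models    # hypothetical unbranded-only path
```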

## Step 3 — Find existing test file (or create one)

1. Search in the chosen folder for an existing test covering the same feature area.
   - Prefer extending an existing `test_*.py` when it already imports the same generated module.
2. In parallel, find the matching async test in `asynctests/`.
   - If you extend `mock_api_tests/test_<area>.py`, also extend `mock_api_tests/asynctests/test_<area>_async.py` (or create it if missing).
   - If you extend `generic_mock_api_tests/test_<area>.py`, also extend `generic_mock_api_tests/asynctests/test_<area>_async.py`.
3. If no sync test exists, create a new `test_<area>.py` and also create the async counterpart under `asynctests/`.

Conventions to match (a minimal skeleton follows this list):

- Use `pytest`.
- Use a `client()` fixture that constructs the generated client and yields it via a context manager.
- For async tests: use an `async def client()` fixture with `async with ...`, and mark tests with `@pytest.mark.asyncio`.
- Follow the existing assertion style (direct equality for models; `list(...)` for paged results).
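
A minimal sketch of that shape (the `some.feature` package, client, model, and operation names are hypothetical placeholders; the endpoint argument assumes the local Spector mock server, so copy whatever the existing fixtures in the chosen folder actually do):

```python
# test_<area>.py: sync test with hypothetical generated package "some.feature"
import pytest
from some.feature import SomeFeatureClient, models


@pytest.fixture
def client():
    # Construct the generated client and yield it via a context manager.
    with SomeFeatureClient(endpoint="http://localhost:3000") as client:
        yield client


def test_get(client):
    # Direct equality against a generated model instance.
    assert client.get() == models.Thing(name="test")
```

```python
# asynctests/test_<area>_async.py: async counterpart of the sketch above
import pytest
from some.feature import models
from some.feature.aio import SomeFeatureClient


@pytest.fixture
async def client():
    async with SomeFeatureClient(endpoint="http://localhost:3000") as client:
        yield client


@pytest.mark.asyncio
async def test_get(client):
    assert await client.get() == models.Thing(name="test")
```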

## Step 4 — Implement the test from the Spector expectations

1. Read the Spector case to identify:
   - operation name / route
   - HTTP method
   - parameter locations (path/query/header/body)
   - request body shape + media type
   - response status code + headers + body
2. Translate into SDK calls:
   - construct the client
   - call the method with the specified inputs
   - assert the returned value matches the expected response
   - if the scenario requires sending data, call the corresponding PUT/POST/PATCH and assert no unexpected error

Practical guidance (a short sketch follows this list):

- Prefer comparing with generated model instances (e.g., `models.Foo(...)`) when the SDK returns models.
- If the response is a stream or iterator, materialize it (`list(...)`) before asserting.
- Async iterator pattern (seen in existing tests): `result = [item async for item in client.list(...)]`.
- If the scenario is about serialization (e.g., XML), assert the round-trip via GET/PUT (pattern: `assert client.foo.get() == model; client.foo.put(model)`).
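
Reusing the `client()` fixture sketched under Step 3, the round-trip and paged patterns look roughly like this (operation and model names are hypothetical):

```python
from some.feature import models  # hypothetical package from the Step 3 sketch


def test_round_trip(client):  # `client` is the fixture sketched under Step 3
    expected = models.Foo(name="foo", age=1)
    # GET must return the expected model; PUT then sends the same payload back.
    assert client.foo.get() == expected
    client.foo.put(expected)  # no exception means the mock server accepted the request


def test_paged(client):
    # Materialize the iterable result before asserting direct equality.
    assert list(client.list()) == [models.Foo(name="foo", age=1)]
```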

Async client import patterns (match the folder you’re writing to; see the sketch after this list):

- Azure: import the `aio` submodule alongside models, e.g. `from specs.<...> import models, aio`, then `async with aio.<Client>()` and `await client.<op>(...)`.
- Generic/unbranded generated clients often expose `.aio` modules, e.g. `from <pkg>.aio import <Client>`.
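
A hedged Azure-flavor async sketch combining these patterns (the `specs.some.feature` module path, client, and operation names are hypothetical; copy the real import from the matching sync test or the `azure/generated` folder):

```python
import pytest
from specs.some.feature import models, aio  # hypothetical Azure-flavor module path


@pytest.mark.asyncio
async def test_list_async():
    async with aio.SomeFeatureClient(endpoint="http://localhost:3000") as client:
        # Async iterator pattern from the existing tests: materialize before asserting.
        result = [item async for item in client.list()]
        assert result == [models.Foo(name="foo", age=1)]
```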

## Step 5 — Dependencies (only when needed)

Default: do NOT add new dependencies.

Only if your new or extended test imports a package that is not already available:

- Add it to the appropriate requirements file:
  - `packages/http-client-python/generator/test/azure/requirements.txt`
  - `packages/http-client-python/generator/test/unbranded/requirements.txt`

Avoid adding dependencies unless strictly required by the test.

## Step 6 — Format changed files

Format any Python files you changed with Black, using a 120-character line length:

- `python -m black <paths> -l 120`

Replace `<paths>` with the specific files and/or folders you modified.

## Notes

- Keep the skill concise: prefer adding a single focused test per scenario.
- Don’t duplicate existing coverage: extend an existing file when reasonable.
- Use forward-slash paths only.