Improve code coverage #120
Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
📝 Walkthrough
Narrowed test/coverage filters to AI-specific packages, updated codecov paths/ignores/thresholds, removed the DispatcherService implementation, and added several new Ballerina test modules that start mock HTTP services to exercise AI-related providers and an HTTP toolkit.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (inconclusive)
Codecov Report
✅ All modified and coverable lines are covered by tests.
❌ Your project check has failed because the head coverage (77.96%) is below the target coverage (80.00%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##               main     #120      +/- ##
=============================================
+ Coverage     56.11%   77.96%   +21.85%
  Complexity      238      238
=============================================
  Files            80       62       -18
  Lines          5587     4493     -1094
  Branches       1098     1032       -66
=============================================
+ Hits           3135     3503      +368
+ Misses         2305      843     -1462
  Partials        147      147
=============================================
```

☔ View the full report in Codecov by Sentry.
🧹 Nitpick comments (2)
ballerina/tests/http-service-toolkit-execution-test.bal (2)
Lines 20-21: Consider using the port constant in the URL to avoid duplication. The port `9095` is hardcoded in both `MOCK_HTTP_SVC_PORT` and `MOCK_HTTP_SVC_URL`. Using string interpolation would ensure consistency if the port changes.

♻️ Suggested improvement:

```diff
 const int MOCK_HTTP_SVC_PORT = 9095;
-const MOCK_HTTP_SVC_URL = "http://localhost:9095/api";
+const string MOCK_HTTP_SVC_URL = string `http://localhost:${MOCK_HTTP_SVC_PORT}/api`;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/http-service-toolkit-execution-test.bal` around lines 20 - 21, MOCK_HTTP_SVC_URL duplicates the port number from MOCK_HTTP_SVC_PORT; update MOCK_HTTP_SVC_URL to build the URL using the MOCK_HTTP_SVC_PORT constant (string interpolation or concatenation) so the port is defined in one place and changing MOCK_HTTP_SVC_PORT automatically updates MOCK_HTTP_SVC_URL; adjust the declaration of MOCK_HTTP_SVC_URL to reference MOCK_HTTP_SVC_PORT instead of hardcoding 9095.
Lines 187-210: Custom headers are not validated by the mock service. The test verifies that `HttpServiceToolKit` accepts custom headers without error, but the mock service doesn't echo or validate that the headers were actually received. Consider enhancing the mock service to return received headers for verification.

♻️ Suggested improvement

Add a resource that echoes headers back (note the return type must include `error` for `check` to be usable):

```ballerina
resource function get echo\-headers(http:Request req) returns json|error {
    return {
        "X-Custom-Header": check req.getHeader("X-Custom-Header"),
        "X-Request-Id": check req.getHeader("X-Request-Id")
    };
}
```

Then update the test to use this endpoint and verify the echoed header values.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/http-service-toolkit-execution-test.bal` around lines 187 - 210, The test testHttpSvcToolKitWithCustomHeaders currently sets customHeaders on HttpServiceToolKit but the mock service never verifies them; add a resource function on the mock HTTP service (e.g., resource function get echo-headers(http:Request req) returns json) that reads and returns the incoming X-Custom-Header and X-Request-Id values, then change the test's request path to hit that endpoint (e.g., "/echo-headers" or similar) and assert the returned JSON contains the expected header values, keeping references to HttpServiceToolKit, toolkit, ToolStore, and output to locate and update the test and mock service.
Force-pushed from ea96416 to 42910f0.
🧹 Nitpick comments (3)
ballerina/tests/wso2-embedding-provider-test.bal (1)

Lines 27-52: Mock service implementation looks correct. The service properly handles both single input and array input cases, returning appropriate embedding structures. One observation: the authorization validation on lines 31-33 returns an `error` type, but this code path is never exercised by the tests, since all tests use the valid `"test-token"`. Consider adding a negative test case for invalid authorization if you want to verify that error path, or simplify the mock by removing the unused validation.
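A negative test along these lines might look like the following sketch. The provider type name, constructor arguments, `generate` call, and error handling are assumptions inferred from the review comments (`MOCK_EMBED_PORT`, the `Bearer test-token` check), not the actual test code:

```ballerina
import ballerina/test;

@test:Config {
    groups: ["wso2-embedding-provider"]
}
function testWso2EmbeddingProviderWithInvalidToken() returns error? {
    // Hypothetical: point the provider at the same mock listener, but with a
    // token the mock's Authorization check will reject.
    Wso2EmbeddingProvider provider = check new (MOCK_EMBED_URL, "wrong-token");
    Embedding|Error result = provider->generate("hello");
    test:assertTrue(result is Error, "expected an authorization error for an invalid token");
}
```

This exercises the otherwise-dead validation branch in the mock instead of deleting it.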
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/wso2-embedding-provider-test.bal` around lines 27 - 52, Add a negative test to exercise the authorization failure path: update the test suite to call the mock service's resource function post embeddings with an invalid Authorization header (not "Bearer test-token") and assert it returns an error with message "invalid authorization token"; locate the mock service implementation (resource function post embeddings) and ensure the test sends requests to the same listener (MOCK_EMBED_PORT) and verifies the error response instead of removing the Authorization check so the existing validation is covered by tests.

ballerina/tests/wso2-model-provider-test.bal (1)
Lines 104-112: Temperature parameter is not actually verified. The test `testWso2ModelProviderChatWithCustomTemperature` passes a custom temperature (`0.2d`) to the provider but only verifies that a response is returned. The mock service doesn't validate or echo back the temperature value, so this test doesn't confirm the parameter is correctly propagated.

If the intent is to verify temperature handling, consider having the mock service echo the received temperature in the response and asserting on it. Otherwise, rename the test to clarify that it only tests provider instantiation with optional parameters.
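The echo approach could be sketched as below. The request payload shape (a top-level `temperature` field), the listener port, and the `content` response field are assumptions about the WSO2 chat API used by the mock, not the actual service definition:

```ballerina
import ballerina/http;

// Hypothetical mock chat endpoint that echoes the received temperature back
// in its response content, so the test can assert the value was propagated.
service /chat on new http:Listener(9096) {
    resource function post .(map<json> payload) returns json {
        json temperature = payload["temperature"] ?: ();
        return {
            content: string `temperature=${temperature.toJsonString()}`
        };
    }
}
```

The test would then assert that `response.content` contains `temperature=0.2` after calling `provider->chat(...)`.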
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/wso2-model-provider-test.bal` around lines 104 - 112, The test testWso2ModelProviderChatWithCustomTemperature sets temperature = 0.2d on Wso2ModelProvider but never verifies it; either update the mock to echo back the received temperature and assert the echoed value in the test, or rename the test to reflect it only verifies provider instantiation with optional params. Locate the test function testWso2ModelProviderChatWithCustomTemperature and the mock chat endpoint (MOCK_CHAT_URL) used by provider->chat; if choosing the echo approach, modify the mock to include the temperature in its response and add an assertion checking that response.content (or a new response field) contains 0.2, otherwise rename the test to something like testWso2ModelProviderChatWithOptionalTemperature to avoid implying temperature behavior is validated.

ballerina/tests/http-service-toolkit-execution-test.bal (1)
Lines 187-210: Custom headers test does not verify header propagation. The test `testHttpSvcToolKitWithCustomHeaders` creates a toolkit with custom headers but only verifies that a 200 response is returned. The mock service doesn't echo or validate that `X-Custom-Header` and `X-Request-Id` were received.

To fully validate header propagation, consider modifying the mock service to echo received headers or return them in the response body for assertion.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/http-service-toolkit-execution-test.bal` around lines 187 - 210, The test testHttpSvcToolKitWithCustomHeaders creates an HttpServiceToolKit with customHeaders but never asserts that those headers were sent; update the mock HTTP service used by MOCK_HTTP_SVC_URL (or its handler in the test harness) to echo back incoming request headers (at least X-Custom-Header and X-Request-Id) in the response body or response headers, then extend the test after obtaining ToolOutput output (and casting to HttpOutput) to assert the echoed values match customHeaders (e.g., check value.body or value.headers contain "X-Custom-Header":"test-value" and "X-Request-Id":"12345"), keeping references to HttpServiceToolKit, customHeaders, ToolStore, execute, and HttpOutput so the assertions validate header propagation.
🧹 Nitpick comments (1)
ballerina/tests/to_json_schema_test.bal (1)
Lines 245-283: No coverage for `generateJsonSchemaForTypedesc` with `nilableType = true` on scalar types. Every scalar test passes `false` for `nilableType`. It's unclear whether the function generates a `oneOf` schema (analogous to what the nilable-array tests expect) or returns an error when `nilableType = true` is combined with a simple type. Consider adding at least one case, e.g.:

```ballerina
@test:Config {
    groups: ["json-schema", "generateJsonSchemaForTypedesc"]
}
function testGenerateJsonSchemaForNilableStringType() {
    JsonSchema|JsonArraySchema|map<json>|Error result = generateJsonSchemaForTypedesc(string, true);
    // assert the expected shape, e.g. oneOf or error
}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ballerina/tests/to_json_schema_test.bal` around lines 245 - 283, Add a test that calls generateJsonSchemaForTypedesc with nilableType = true for a scalar (e.g., string) to cover the missing branch; create a new test function (e.g., testGenerateJsonSchemaForNilableStringType) alongside the existing tests (like testGenerateJsonSchemaForStringType) and assert the expected output shape — either a oneOf schema containing the scalar type and null (e.g., oneOf with {"type":"string"} and {"type":"null"}) or an Error if that’s the intended behavior; ensure the test uses the same return type annotation (JsonSchema|JsonArraySchema|map<json>|Error) and test:assertEquals/assertTrue to validate the result.
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
ballerina/tests/to_json_schema_test.bal
Force-pushed from c39ea42 to d28b431.
Purpose
Fixes:
Examples
Checklist
This pull request updates coverage and build configurations, adds targeted tests for AI providers and HTTP tooling, and removes an unused dispatcher service file to align repository structure and coverage with the AI module.
Key changes
No public API additions aside from the new test modules; the DispatcherService public API was removed. Commit history includes a test-focused commit ("Add test for to_json_schema").