Handle client errors properly in np module #36

Closed
SasinduDilshara wants to merge 3 commits into ballerina-platform:main from SasinduDilshara:handle-client-errors-properly

Conversation

@SasinduDilshara
Contributor

Handle client errors properly in the np module when parsing errors occur

Fixes ballerina-platform/ballerina-library#7695

@codecov

codecov bot commented Mar 16, 2025

Codecov Report

Attention: Patch coverage is 30.76923% with 9 lines in your changes missing coverage. Please review.

Project coverage is 61.41%. Comparing base (ad112f5) to head (6a41779).

Files with missing lines Patch % Lines
ballerina/utils.bal 0.00% 4 Missing ⚠️
ballerina/llm_client_default.bal 0.00% 3 Missing ⚠️
ballerina/llm_client_azure_open_ai.bal 66.66% 1 Missing ⚠️
ballerina/llm_client_open_ai.bal 66.66% 1 Missing ⚠️

❌ Your project status has failed because the head coverage (61.41%) is below the target coverage (80.00%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files
@@            Coverage Diff             @@
##             main      #36      +/-   ##
==========================================
- Coverage   63.79%   61.41%   -2.39%     
==========================================
  Files           6        7       +1     
  Lines         174      184      +10     
  Branches       50       51       +1     
==========================================
+ Hits          111      113       +2     
- Misses         63       71       +8     


// under the License.

const JSON_CONVERSION_ERROR = "FromJsonStringError";
const ERROR_MESSAGE = "Error occurred while converting the LLM response to the given type. Please refined your prompt to get a better result.";
Member
This error isn't correct.

  1. This parsing can fail due to a number of reasons - the LLM didn't respond in the specified format but with correct info, the LLM responds with something random, etc. If it is the first, this error doesn't make sense. Even in the second case, it may or may not be a prompt issue. I would go with a generic error and include the detail message from the original error in this error.
  2. Will go away, but Refined -> Refine
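The change the reviewer suggests — a generic error that carries the detail message (and cause) of the original conversion error, rather than assuming a prompt issue — could look roughly like this in Ballerina. This is a minimal sketch: `parseResponse` and the message text are hypothetical illustrations, not the module's actual API.

```ballerina
// Hypothetical helper sketching the suggested error handling:
// report a generic conversion failure and include the original
// error's message, passing the original error along as the cause.
function parseResponse(string llmResponse, typedesc<anydata> expectedType)
        returns anydata|error {
    anydata|error result = llmResponse.fromJsonStringWithType(expectedType);
    if result is error {
        return error(string `Failed to convert the LLM response to the expected type: ${result.message()}`, result);
    }
    return result;
}
```

Keeping the original error as the cause preserves the full failure context for logging, while the top-level message stays neutral about whether the prompt or the LLM output was at fault.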

Contributor Author

will do that


Development

Successfully merging this pull request may close these issues.

Unclear error message when the LLM response is not compatible with the expected type