
Fix possible bug leading to wrong error message when instantiating LLM#334

Merged
Noravee merged 5 commits into main from fix-possible-bug-leading-to-wrong-error-message
Jul 25, 2025
Conversation

Noravee (Collaborator) commented Jul 25, 2025

This PR improves error handling during LLM instantiation when factory-based resolution fails.

Problem:

When all llm_factories fail to create an LLM, we fall back to using the "class" value in the config to attempt instantiation via a user-defined class. However, "class" is always set, even for factory-backed models like "openai" or "bedrock".

If we blindly attempt to resolve these default classes again via create_base_chat_model_from_user_class, it can raise a secondary error that masks the original factory error, making debugging much harder.

Fix:

Add a check to skip fallback resolution if config["class"] is one of the default classes.

This ensures we preserve and raise the original, meaningful exception instead of a misleading one from reprocessing default class paths.
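The guarded fallback described above can be sketched as follows. This is an illustrative, self-contained sketch, not the actual neuro_san code: `create_from_factories` and `create_base_chat_model_from_user_class` are hypothetical stubs standing in for the real factory and user-class resolution paths.

```python
# Minimal sketch of the guard described in the PR. The two helpers below are
# hypothetical stubs; only the guard logic mirrors the actual fix.
from typing import Any, Dict, Optional, Set

DEFAULT_LLM_CLASSES: Set[str] = {
    "anthropic", "azure-openai", "bedrock", "gemini", "ollama", "openai", "nvidia"
}


def create_from_factories(config: Dict[str, Any]) -> Any:
    # Stub: pretend every llm_factory failed, as in the scenario from the PR.
    raise ValueError(f"No factory could create an LLM for {config.get('model_name')}")


def create_base_chat_model_from_user_class(class_path: str, config: Dict[str, Any]) -> Any:
    # Stub: a genuine user-defined class resolves successfully.
    return f"user-llm:{class_path}"


def resolve_llm(config: Dict[str, Any]) -> Any:
    llm: Optional[Any] = None
    found_exception: Optional[Exception] = None

    try:
        llm = create_from_factories(config)
    except ValueError as exc:
        found_exception = exc

    class_path: Optional[str] = config.get("class")
    if llm is None and found_exception is not None \
            and class_path and class_path not in DEFAULT_LLM_CLASSES:
        # "class" points at a real user-defined class: try the fallback path.
        llm = create_base_chat_model_from_user_class(class_path, config)
    elif llm is None and found_exception is not None:
        # "class" is a factory-backed default: re-raise the original,
        # meaningful factory error instead of masking it.
        raise found_exception
    return llm
```

With this guard, a config whose "class" is "openai" surfaces the factory error directly, while a config naming a user-defined class still takes the fallback path.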

from typing import Set

from neuro_san.internals.run_context.langchain.util.argument_validator import ArgumentValidator
from neuro_san.internals.utils.resolver_util import ResolverUtil

DEFAULT_LLM_CLASSES: Set[str] = {"anthropic", "azure-openai", "bedrock", "gemini", "ollama", "openai", "nvidia"}
Noravee (Collaborator, Author):
Default classes

d1donlydfink (Collaborator):
Why not get these from the keys listed in default_llm_info.hocon "classes" dictionary?
Then when we add more we won't have 2 places to update.

if (
    llm is None
    and found_exception is not None
    and class_path not in DEFAULT_LLM_CLASSES
):
Noravee (Collaborator, Author):
Check that class is not in the default classes.

@Noravee Noravee requested review from d1donlydfink and ofrancon July 25, 2025 08:04
ofrancon (Contributor) previously approved these changes Jul 25, 2025:
lgtm

d1donlydfink (Collaborator) left a comment:
Why not get the default classes from the keys listed in default_llm_info.hocon "classes" dictionary?
Then when we add more we won't have 2 places to update.
Single point of truth is waaaay better.


@Noravee Noravee requested a review from d1donlydfink July 25, 2025 16:40
# Fall back only if a "class" is specified
# and factory resolution failed.
class_path: str = config.get("class")
if llm is None and found_exception is not None and class_path:
    default_llm_classes: Set[str] = set(self.llm_infos.get("classes"))
Noravee (Collaborator, Author):
Get the default llm classes from the default llm info file.

Thank you @d1donlydfink for the suggestion.
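The merged version derives the defaults from the llm_infos dictionary rather than a hard-coded set. A minimal sketch of that lookup, with a plain dict standing in for the parsed `default_llm_info.hocon` (the entries shown are illustrative, not the file's real contents):

```python
# Illustrative only: llm_infos stands in for the parsed default_llm_info.hocon;
# a plain dict is used here so the lookup is easy to run.
from typing import Any, Dict, Set

llm_infos: Dict[str, Any] = {
    "classes": {
        "openai": "langchain_openai.ChatOpenAI",
        "anthropic": "langchain_anthropic.ChatAnthropic",
    }
}

# set() over a dict keeps only its keys, so adding a new entry to the
# "classes" dictionary automatically extends the default set.
default_llm_classes: Set[str] = set(llm_infos.get("classes"))
```

Because the guard only needs the keys, new factory-backed models added to the hocon file are picked up with no second place to update, which is the single point of truth the reviewers asked for.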

@Noravee Noravee merged commit b77d651 into main Jul 25, 2025
4 checks passed
@Noravee Noravee deleted the fix-possible-bug-leading-to-wrong-error-message branch July 25, 2025 17:37