
feat: add support for system messages to RunnableRails #1106

Merged
Pouyanpi merged 2 commits into NVIDIA-NeMo:develop from smruthi33:support-system-message
May 5, 2025

Conversation

@smruthi33 (Contributor) commented Apr 10, 2025

Description

The current implementation of RunnableRails does not support inputs that contain a system message:

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

from nemoguardrails import RailsConfig
from nemoguardrails.integrations.langchain.runnable_rails import RunnableRails

guardrails_config = RailsConfig.from_path(<Path>)
guardrails = RunnableRails(guardrails_config)
client = ChatOpenAI(
    base_url=GEN_AI_ENDPOINT,
    api_key=GEN_AI_ENDPOINT_AUTHORIZATION,
    model=MODEL_NAME,
    temperature=0,
)
messages = [
    ("system", system),
    ("human", prompt),
]
prompt_template = ChatPromptTemplate.from_messages(messages)

chain_with_guardrails = prompt_template | (guardrails | client)

In this setup, RunnableRails does not handle messages containing system roles. Attempting to pass such a structure as a dictionary causes conflicts with the format expected by the Runnable interface from LangChain, which assumes a flat input structure and does not support multiple message types out of the box. This change addresses that limitation.
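For context, NeMo Guardrails' generation API works on plain role/content dicts, so supporting system messages largely comes down to mapping LangChain's message shorthand onto that shape. A minimal sketch of such a mapping (the helper name `to_guardrails_messages` is illustrative, not part of the library):

```python
def to_guardrails_messages(chat_messages):
    """Map LangChain-style (role, content) tuples onto the plain
    {"role", "content"} dicts that NeMo Guardrails' generation API accepts.
    LangChain's "human"/"ai" shorthand becomes "user"/"assistant"."""
    role_map = {"system": "system", "human": "user", "ai": "assistant"}
    return [
        {"role": role_map.get(role, role), "content": content}
        for role, content in chat_messages
    ]

messages = to_guardrails_messages(
    [("system", "You are a terse assistant."), ("human", "Hi!")]
)
# messages[0] is {"role": "system", ...}; messages[1] is {"role": "user", ...}
```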

Related Issue(s)

Checklist

  • I've read the CONTRIBUTING guidelines.
  • I've updated the documentation if applicable.
  • I've added tests if applicable.
  • @mentions of the person or team responsible for reviewing proposed changes.

@Pouyanpi (Collaborator)

Thank you @smruthi33 for opening this PR 🚀

It would be great if you could add a test to test_runnable_rails.py. Thank you!

@Pouyanpi Pouyanpi changed the title Adding support for system message feat: add support for system messages in RunnableRails Apr 10, 2025
@Pouyanpi Pouyanpi changed the title feat: add support for system messages in RunnableRails feat: add support for system messages to RunnableRails Apr 10, 2025
@Pouyanpi Pouyanpi self-requested a review April 11, 2025 10:05
@Pouyanpi Pouyanpi added this to the v0.14.0 milestone May 1, 2025
@Pouyanpi Pouyanpi force-pushed the support-system-message branch from ac471ce to d0d38ef Compare May 1, 2025 14:52
@Pouyanpi (Collaborator) commented May 1, 2025

@tgasser-nv I added some tests. I think we are good to merge after your review:

I am not going to include the following minimal integration test, but it works:

import pytest
import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.prompt_values import ChatPromptValue

from nemoguardrails import RailsConfig
from nemoguardrails.integrations.langchain.runnable_rails import RunnableRails

requires_openai = pytest.mark.skipif(
    os.environ.get("OPENAI_API_KEY") is None,
    reason="Requires OPENAI_API_KEY environment variable",
)


@requires_openai
def test_system_message_with_openai():
    """Tests SystemMessage handling with a real OpenAI model."""
    try:
        llm = ChatOpenAI(model="gpt-4o", temperature=0)
    except ImportError:
        pytest.skip("langchain-openai not installed")
    except Exception as e:
        pytest.skip(f"Could not initialize ChatOpenAI: {e}")
        
    config = RailsConfig.from_content(config={"models": []})
    model_with_rails = RunnableRails(config)
    chain = model_with_rails | llm
    input_messages = [
        SystemMessage(
            content="You are an assistant that always replies starting with 'LOLOLOLO!'"
        ),
        HumanMessage(content="what can you do?"),
    ]
    result = chain.invoke(input=ChatPromptValue(messages=input_messages))
    assert isinstance(result, AIMessage)
    assert result.content.lower().startswith("lolololo!")
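As a usage note, RunnableRails can also be invoked with a dict input whose "input" key holds a list of role/content messages. The sketch below packages the same system+human conversation in that form (the `build_chain_input` helper is hypothetical, and the exact dict conventions may vary across versions):

```python
def build_chain_input(system_text, user_text):
    # Hypothetical helper: packages a system and a user message into the
    # {"input": [...]} dict form that RunnableRails accepts as chain input.
    return {
        "input": [
            {"role": "system", "content": system_text},
            {"role": "user", "content": user_text},
        ]
    }

payload = build_chain_input(
    "You are an assistant that always replies starting with 'LOLOLOLO!'",
    "what can you do?",
)
# chain.invoke(payload) would then route both messages through the rails.
```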

@Pouyanpi Pouyanpi requested a review from tgasser-nv May 1, 2025 14:59
@Pouyanpi Pouyanpi force-pushed the support-system-message branch from d0d38ef to c7707ab Compare May 2, 2025 06:59
@codecov-commenter

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 68.00%. Comparing base (67b8b7a) to head (c7707ab).

Additional details and impacted files
@@           Coverage Diff            @@
##           develop    #1106   +/-   ##
========================================
  Coverage    68.00%   68.00%           
========================================
  Files          161      161           
  Lines        15793    15795    +2     
========================================
+ Hits         10740    10742    +2     
  Misses        5053     5053           
Flag     Coverage Δ
python   68.00% <100.00%> (+<0.01%) ⬆️



@Pouyanpi Pouyanpi merged commit a44566e into NVIDIA-NeMo:develop May 5, 2025
7 checks passed
3 participants