
Does .chat() support setting context length? #173

Description

@buggyfound

Hello, I'm trying to set the context length when running this code. I've tried placing the configuration value in different places, but none of them have worked.

I am able to load the model with a manually set context length in the GUI; the terminal program then runs fine using model = lms.llm(). I'm following the multi-turn chat example from the docs:

https://lmstudio.ai/docs/python/llm-prediction/chat-completion#example-multi-turn-chat
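
For reference, this is roughly the working setup on my end (a minimal sketch; the system prompt and user message are placeholders, and the model was already loaded in the GUI with the context length set there):

import lmstudio as lms

# The model was loaded through the GUI with the context length set manually,
# so lms.llm() just attaches to the already-loaded instance.
model = lms.llm()

chat = lms.Chat("You are a helpful assistant.")  # placeholder system prompt
chat.add_user_message("Hello!")                  # placeholder user message
result = model.respond(chat)
print(result)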

If I pass the config value when loading the model, the setting isn't applied when opening .chat():
model = lms.llm("OpenAI-20B-NEO-CODEPlus-Uncensored-IQ4_NL.gguf", config={ "contextLength": 8192 })
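
As a sanity check, this is a sketch of what I've been trying (I'm assuming model.get_context_length() is the right way to read back the effective value; if there's a better way to verify the load config, let me know):

import lmstudio as lms

# Attempt to set the context length at load time (doesn't seem to take effect).
model = lms.llm(
    "OpenAI-20B-NEO-CODEPlus-Uncensored-IQ4_NL.gguf",
    config={"contextLength": 8192},
)

# Read back the context length the loaded model actually reports
# (assuming get_context_length() reflects the effective load setting).
print(model.get_context_length())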

If I set it on the chat instead, I get an error. Example code:
chat = lms.Chat(systemprompt, config={ "contextLength": 8192 })
error:
line 59, in <module>
    chat = lms.Chat(systemprompt, config={ "contextLength": 8192 })
TypeError: Chat.__init__() got an unexpected keyword argument 'config'
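
For comparison, constructing the chat from the system prompt alone works fine (a minimal sketch based on the error above), but it leaves me with nowhere to put contextLength:

import lmstudio as lms

systemprompt = "You are a helpful assistant."  # placeholder

# Chat() accepts the initial system prompt, but rejects a config kwarg
# (per the TypeError above), so the context length can't be set here.
chat = lms.Chat(systemprompt)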

Help with this issue would be appreciated.
