I just noticed that Discussions was enabled here only a minute ago; I'm sorry, I didn't mean to ignore these messages. Your issue was due to an earlier release and the default LLM URL, which has to be configured in the Settings dialog (click the Settings button on the main chat UI). The LLM URL must match the host, port and API path of your local or remote LLM server. There are common examples in the README and in the work-in-progress PIDoc.
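If you want to sanity-check the endpoint outside the script first, a minimal sketch like the one below can help. The URL is taken from your log, and the OpenAI-style chat payload is only an assumption for illustration; adjust both to whatever your server actually expects.

```python
# Minimal connectivity check for the configured LLM URL.
# The URL and the OpenAI-style payload below are assumptions for illustration;
# substitute whatever your local or remote LLM server actually expects.
import json
import urllib.request
import urllib.error

LLM_URL = "http://127.0.0.1:8010/llm/chat"   # must match host, port and API path

payload = json.dumps({
    "messages": [{"role": "user", "content": "ping"}]
}).encode("utf-8")

req = urllib.request.Request(
    LLM_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print("Server reachable, HTTP status:", resp.status)
        print(resp.read()[:200])
except urllib.error.URLError as exc:
    # A refused connection here corresponds to the "HTTP Status: 0" failure in the log.
    print("Could not reach the LLM server:", exc)
```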
I am not doing something right.
Please help me set this up.
Thanks, Steve
Sending data to LLM at: http://127.0.0.1:8010/llm/chat
NetworkTransfer POST failed. HTTP Status: 0
Error Info: NetworkTransfer::POST(): Failed to connect to 127.0.0.1 port 8010 after 2034 ms: Could not connect to server: http://127.0.0.1:8010/llm/chat
--- Chat session and history reset by user. ---
--- Extracting image profile for view: Image01 ---
Performing raw FITS keyword extraction from file: F:/Squid Nebula/StarAlignment_D1/PLANSHORT_L_OBJNAME_Squid Neblua C2 AskC1_EXP_300s_GAIN_G101_ISOBIN_Bin1x1_EXIF_30F_IMGID_1847_c_d_r.xisf
Found 75 raw FITS keywords.
--- Image profile extraction complete. ---
Sending data to LLM at: http://127.0.0.1:8010/llm/chat
NetworkTransfer POST failed. HTTP Status: 0
Error Info: NetworkTransfer::POST(): Failed to connect to 127.0.0.1 port 8010 after 2025 ms: Could not connect to server: http://127.0.0.1:8010/llm/chat
Resetting LLM Assistant settings to default.
--- LLM Assistant Closed ---
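For context, "HTTP Status: 0" in the log above means the connection itself failed before any HTTP response came back: nothing accepted the connection on 127.0.0.1:8010, which usually means the LLM server is not running there or the URL in the Settings dialog points at the wrong host or port. A quick standalone check (a sketch, not part of the tool; change the host and port if your settings differ):

```python
# Quick check: is anything listening on the host/port from the log above?
# 127.0.0.1:8010 is taken from the error messages; change it if your
# Settings dialog points somewhere else.
import socket

HOST, PORT = "127.0.0.1", 8010

try:
    with socket.create_connection((HOST, PORT), timeout=3):
        print(f"A server is listening on {HOST}:{PORT}.")
except OSError as exc:
    # Roughly the same failure the log above reports as "HTTP Status: 0".
    print(f"Nothing reachable on {HOST}:{PORT}: {exc}")
```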