Hello,
EDIT:
Found the cause: I started Open Notebook first and only launched Ollama much later. That was the problem.
Maybe a clearer error message would be helpful, something like:
"Ollama isn't running. Please stop Open Notebook, launch Ollama, and restart Open Notebook."
(Or something along those lines.)
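That suggestion could be implemented as a tiny helper that builds the user-facing message from a reachability flag. This is just a sketch; the function name `ollama_startup_message` is hypothetical and not an existing Open Notebook API:

```python
def ollama_startup_message(base_url: str, reachable: bool) -> str:
    """Build a clear startup message depending on whether Ollama answered."""
    if reachable:
        return f"Ollama detected at {base_url}."
    # The friendlier error suggested above, instead of a raw connection error.
    return (
        f"Ollama isn't running at {base_url}. "
        "Please stop Open Notebook, launch Ollama, and restart Open Notebook."
    )
```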
I really think there is huge potential in this project, and I'd like to help as far as I can 😊
I just found out that Ollama wasn't started... 🤦 (so the report below is out of date; scroll to the bottom to find the new error I got).
If I choose to upload a file, I get this error:
RuntimeError: API request failed: 500 - {"detail":"Error creating source: All connection attempts failed"}
File "/app/pages/stream_app/source.py", line 115, in add_source
sources_service.create_source(
File "/app/api/sources_service.py", line 128, in create_source
source_data = api_client.create_source(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/client.py", line 311, in create_source
return self._make_request("POST", "/api/sources", json=data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/client.py", line 49, in _make_request
raise RuntimeError(
What's strange about it: if I close the notebook and reopen it, the file is there.
I installed Open Notebook like this:
docker-compose down
docker-compose pull
docker-compose up -d
docker-compose.yaml:
services:
  surrealdb:
    image: surrealdb/surrealdb:v2
    ports:
      - "8000:8000"
    volumes:
      - surreal_data:/mydata
    command: start --user root --pass root rocksdb:/mydata/mydatabase.db
    pull_policy: always
    user: root
  open_notebook:
    image: lfnovo/open_notebook:latest
    ports:
      - "8080:8502"
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
      - OLLAMA_API_BASE=http://host.docker.internal:11434
      - SURREAL_ADDRESS=surrealdb
      - SURREAL_PORT=8000
      - SURREAL_USER=root
      - SURREAL_PASS=root
      - SURREAL_NAMESPACE=open_notebook
      - SURREAL_DATABASE=open_notebook
    depends_on:
      - surrealdb
    pull_policy: always
    volumes:
      - notebook_data:/app/data
volumes:
  surreal_data:
  notebook_data:
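One thing worth checking with this compose file: on Linux, `host.docker.internal` does not resolve inside containers by default, so `OLLAMA_BASE_URL` would fail even with Ollama running on the host. A common fix (this is my assumption about the setup, using Docker's `host-gateway` alias, available since Docker 20.10) is to add:

```yaml
  open_notebook:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

On Docker Desktop (macOS/Windows) this alias already works out of the box, so the fragment is only needed on native Linux.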
If I then want to chat with it, I get this error:
httpx.ConnectError: [Errno 111] Connection refused
File "/app/.venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 128, in exec_func_with_error_handling
result = func()
^^^^^^
File "/app/.venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 667, in code_to_exec
_mpa_v1(self._main_script_path)
File "/app/.venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 165, in _mpa_v1
page.run()
File "/app/.venv/lib/python3.12/site-packages/streamlit/navigation/page.py", line 300, in run
exec(code, module.__dict__) # noqa: S102
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/pages/2_📒_Notebooks.py", line 118, in <module>
notebook_page(current_notebook)
File "/app/pages/2_📒_Notebooks.py", line 94, in notebook_page
chat_sidebar(current_notebook=current_notebook, current_session=current_session)
File "/app/pages/stream_app/chat.py", line 216, in chat_sidebar
response = execute_chat(
^^^^^^^^^^^^^
File "/app/pages/stream_app/chat.py", line 61, in execute_chat
result = chat_graph.invoke(
^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2844, in invoke
for chunk in self.stream(
^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2534, in stream
for _ in runner.tick(
^^^^^^^^^^^^
File "/app/open_notebook/graphs/chat.py", line 36, in call_model_with_messages
ai_message = model.invoke(payload)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 378, in invoke
self.generate_prompt(
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 963, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 782, in generate
self._generate_with_cache(
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1028, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 800, in _generate
final_chunk = self._chat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 735, in _chat_stream_with_aggregation
for chunk in self._iterate_over_stream(messages, stop, **kwargs):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 822, in _iterate_over_stream
for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 721, in _create_chat_stream
yield from self._client.chat(**chat_params)
File "/app/.venv/lib/python3.12/site-packages/ollama/_client.py", line 165, in inner
with self._client.stream(*args, **kwargs) as r:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/contextlib.py", line 137, in __enter__
return next(self.gen)
^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 868, in stream
response = self.send(
^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1014, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 249, in handle_request
with map_httpcore_exceptions():
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
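The `Connection refused` here is raised before any Ollama logic runs: from the container's point of view, nothing is accepting connections on port 11434. A quick way to confirm that is a plain TCP probe, sketched below using only the standard library (the host and port are the ones from the compose file above):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds,
    i.e. something is actually listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError, timeouts, and DNS failures.
        return False
```

Running `can_connect("host.docker.internal", 11434)` from inside the open_notebook container should come back `False` in the failing state described here, which pins the problem on Ollama (or the hostname) rather than on Open Notebook itself.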
New error I got while processing a PNG:
ConnectionError: Failed to connect to API: timed out
File "/app/pages/stream_app/source.py", line 115, in add_source
sources_service.create_source(
File "/app/api/sources_service.py", line 128, in create_source
source_data = api_client.create_source(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/client.py", line 311, in create_source
return self._make_request("POST", "/api/sources", json=data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/client.py", line 44, in _make_request
raise ConnectionError(f"Failed to connect to API: {str(e)}")
I don't know why I can't connect to the API and get a timeout 🤔
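For the timeout itself: a PNG that goes through heavier processing can exceed the client's request timeout even when the API is healthy. One thing the client side could do is retry transient connection failures with a short backoff. A minimal sketch, where the `send` callable stands in for the actual HTTP call in `_make_request` (which I'm not reproducing here):

```python
import time

def request_with_retry(send, attempts: int = 3, delay: float = 1.0):
    """Call send() and retry on transient connection/timeout errors,
    sleeping `delay` seconds between attempts."""
    last_exc = None
    for i in range(attempts):
        try:
            return send()
        except (ConnectionError, TimeoutError) as exc:
            last_exc = exc
            if i < attempts - 1:
                time.sleep(delay)
    # Mirror the error style seen in the traceback above.
    raise ConnectionError(
        f"Failed to connect to API after {attempts} attempts: {last_exc}"
    )
```

Whether retrying is the right fix (versus simply raising the timeout for large uploads) depends on why the API is slow, but either way a clearer message than a bare timeout would help.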