Commit 25eb0d4
Fix Azure OpenAI integration for v1 API compatibility (getzep#1192)
* Fix MCP server documentation to reference correct entry point
Update all documentation references from graphiti_mcp_server.py to main.py.
The old filename was causing "No such file or directory" errors when users
tried to run the commands as documented. The actual entry point is main.py
in the mcp_server directory.
Changes:
- Update 7 command examples in README.md
- Update example configuration file with correct path
Co-Authored-By: Claude (us.anthropic.claude-sonnet-4-5-20250929-v1:0) <noreply@anthropic.com>
* @andreibogdan has signed the CLA in getzep#1179
* Add extracted edge facts to entity summaries (getzep#1182)
* Add extracted edge facts to entity summaries
Update _extract_entity_summary to include facts from edges connected to
each node. Edge facts are appended to the existing summary, and LLM
summarization is only triggered if the combined content exceeds the
character limit.
- Add edges parameter to extract_attributes_from_nodes and related functions
- Filter edges per node before passing to attribute extraction
- Append edge facts (newline-separated) to node summary
- Skip LLM call when combined summary is within length limits
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Remove unused reflexion prompts and invalidate_edges v1
- Remove reflexion prompts from extract_nodes.py and extract_edges.py
- Remove extract_nodes_reflexion function from node_operations.py
- Remove unused v1 function from invalidate_edges.py
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Filter out None/empty edge facts when building summary
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Remove unused MissedEntities import
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Optimize edge filtering with pre-built lookup dictionary
Replace O(N * E) per-node edge filtering with O(E + N) pre-built
dictionary lookup. Edges are now indexed by node UUID once before
the gather operation.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
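The lookup-dictionary optimization described above can be sketched as follows. This is an illustrative stand-in, not the actual graphiti-core code: the edge shape and the function name `index_edges_by_node` are assumptions, with dicts standing in for graphiti-core's edge objects.

```python
from collections import defaultdict

def index_edges_by_node(edges):
    """Build a node_uuid -> [edges] map in a single O(E) pass,
    so each node's edges are then found in O(1) instead of
    re-scanning all E edges per node (O(N * E) total)."""
    edges_by_node = defaultdict(list)
    for edge in edges:
        # An edge is relevant to both its source and target nodes
        edges_by_node[edge['source_node_uuid']].append(edge)
        edges_by_node[edge['target_node_uuid']].append(edge)
    return edges_by_node
```

Per the commit, this index is built once before the gather operation, and each per-node task does `edges_by_node.get(node_uuid, [])` rather than filtering the full edge list.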
* Handle empty summary edge case
Return early if summary_with_edges is empty after stripping,
avoiding storing empty summaries when node.summary and all
edge facts are empty.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
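Taken together, the summary-building commits above amount to logic along these lines. This is a hedged sketch: `build_summary_with_edges` and `SUMMARY_CHAR_LIMIT` are illustrative names and values, not the actual graphiti-core identifiers.

```python
SUMMARY_CHAR_LIMIT = 500  # assumed limit; the real threshold may differ

def build_summary_with_edges(node_summary, edge_facts):
    """Return (combined_summary, needs_llm_summarization)."""
    # Filter out None/empty edge facts before joining
    facts = [f for f in edge_facts if f]
    parts = [p for p in [node_summary or ''] + facts if p]
    # Append edge facts to the existing summary, newline-separated
    combined = '\n'.join(parts).strip()
    if not combined:
        # Early return: avoid storing an empty summary when
        # node.summary and all edge facts are empty
        return '', False
    # Only trigger LLM summarization when the combined text is too long
    return combined, len(combined) > SUMMARY_CHAR_LIMIT
```

This matches the tested behavior: short combined summaries are kept as-is with no LLM call, and the LLM is invoked only when edge facts push the summary over the character limit.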
* Update tests to reflect summary optimization behavior
Tests now expect that short summaries are kept as-is without LLM calls.
Added new test to verify LLM is called when summary exceeds character
limit due to edge facts.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* format
* Bump version to 0.27.0
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* lock
* change version
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
* Fix dependabot security vulnerabilities (getzep#1184)
Fix dependabot security vulnerabilities in dependencies
Update lock files to address multiple security alerts:
- pyasn1: 0.6.1 → 0.6.2 (CVE-2026-23490)
- langchain-core: 0.3.74 → 0.3.83 (CVE-2025-68664)
- mcp: 1.9.4 → 1.26.0 (DNS rebinding, DoS)
- azure-core: 1.34.0 → 1.38.0 (deserialization)
- starlette: 0.46.2/0.47.1 → 0.50.0/0.52.1 (DoS vulnerabilities)
- python-multipart: 0.0.20 → 0.0.22 (arbitrary file write)
- fastapi: 0.115.14 → 0.128.0 (for starlette compatibility)
- nbconvert: 7.16.6 → 7.17.0
- orjson: 3.11.5 → 3.11.6
- protobuf: 6.33.4 → 6.33.5
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
* Revert "Fix dependabot security vulnerabilities" (getzep#1185)
Revert "Fix dependabot security vulnerabilities (getzep#1184)"
This reverts commit 30cd907.
* Pin mcp_server to graphiti-core 0.26.3 (getzep#1186)
* Fix dependabot security vulnerabilities in dependencies
Update lock files to address multiple security alerts:
- pyasn1: 0.6.1 → 0.6.2 (CVE-2026-23490)
- langchain-core: 0.3.74 → 0.3.83 (CVE-2025-68664)
- mcp: 1.9.4 → 1.26.0 (DNS rebinding, DoS)
- azure-core: 1.34.0 → 1.38.0 (deserialization)
- starlette: 0.46.2/0.47.1 → 0.50.0/0.52.1 (DoS vulnerabilities)
- python-multipart: 0.0.20 → 0.0.22 (arbitrary file write)
- fastapi: 0.115.14 → 0.128.0 (for starlette compatibility)
- nbconvert: 7.16.6 → 7.17.0
- orjson: 3.11.5 → 3.11.6
- protobuf: 6.33.4 → 6.33.5
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Pin mcp_server to graphiti-core 0.26.3 from PyPI
- Change dependency from >=0.23.1 to ==0.26.3
- Remove editable source override to use published package
- Addresses code review feedback about RC version usage
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
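In pyproject.toml terms, the pin described above roughly corresponds to the fragment below (a sketch; the surrounding fields are illustrative, not copied from the repo):

```toml
[project]
dependencies = [
    # was: "graphiti-core>=0.23.1", plus a local editable source override
    "graphiti-core==0.26.3",  # pinned to the published PyPI release
]
```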
* Fix remaining security vulnerabilities in mcp_server
Update vulnerable transitive dependencies:
- aiohttp: 3.12.15 → 3.13.3 (High: zip bomb, DoS)
- urllib3: 2.5.0 → 2.6.3 (High: decompression bomb bypass)
- filelock: 3.19.1 → 3.20.3 (Medium: TOCTOU symlink)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
* Update manual code review workflow to use claude-opus-4-5-20251101 (getzep#1189)
* Update manual code review workflow to use claude-opus-4-5-20251101
Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
* Update auto code review workflow to use claude-opus-4-5-20251101
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
---------
Co-authored-by: Claude Haiku 4.5 <noreply@anthropic.com>
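As a rough sketch, the workflow bump likely amounts to updating a model field such as the one below. The job name, action reference, and input names here are assumptions, not taken from the getzep workflow files:

```yaml
jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - uses: anthropics/claude-code-action@v1  # assumed action
        with:
          model: claude-opus-4-5-20251101  # updated model pin
```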
* Fix Azure OpenAI integration for v1 API compatibility
This commit addresses several issues with Azure OpenAI integration:
1. Azure OpenAI Client (graphiti_core/llm_client/azure_openai_client.py):
- Use AsyncOpenAI with v1 endpoint instead of AsyncAzureOpenAI
- Implement separate handling for reasoning models (responses.parse) vs
non-reasoning models (beta.chat.completions.parse)
- Add custom response handler to parse both response formats correctly
- Fix RefusalError import path from llm_client.errors
2. MCP Server Factories (mcp_server/src/services/factories.py):
- Update Azure OpenAI factory to use v1 compatibility endpoint
- Use same deployment for both main and small models in Azure
- Add support for custom embedder endpoints (Ollama compatibility)
- Add support for custom embedding dimensions
- Remove unused Azure AD authentication code (TODO for future)
- Add reasoning model detection for OpenAI provider
3. MCP Server Configuration (mcp_server/pyproject.toml):
- Add local graphiti-core source dependency for development
4. Tests (tests/llm_client/test_azure_openai_client.py):
- Update test mocks to support beta.chat.completions.parse
- Update test expectations for non-reasoning model path
These changes enable Azure OpenAI to work correctly with both reasoning
and non-reasoning models, support custom embedder endpoints like Ollama,
and maintain compatibility with the OpenAI v1 API specification.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
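The endpoint and model-dispatch changes can be sketched as two pure helpers. These are hypothetical names and a simplified heuristic; the real wiring in azure_openai_client.py constructs an `openai.AsyncOpenAI` client and may detect reasoning models differently:

```python
def azure_v1_base_url(resource_name: str) -> str:
    """Azure's OpenAI v1 compatibility endpoint.

    Passing this as base_url to openai.AsyncOpenAI lets the standard
    client talk to Azure deployments, replacing AsyncAzureOpenAI.
    """
    return f'https://{resource_name}.openai.azure.com/openai/v1/'

def pick_parse_path(model: str) -> str:
    """Reasoning models go through responses.parse; non-reasoning
    models use beta.chat.completions.parse, an Azure v1 compatibility
    limitation noted in this PR."""
    # Assumed detection heuristic: OpenAI's o-series reasoning families
    reasoning_prefixes = ('o1', 'o3', 'o4')
    if model.startswith(reasoning_prefixes):
        return 'responses.parse'
    return 'beta.chat.completions.parse'
```

The custom response handler mentioned above would then branch on `pick_parse_path`, since the two endpoints return differently shaped structured outputs.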
* fix: address code review feedback
- Remove local development dependency from mcp_server/pyproject.toml
that would break PyPI installations
- Move json import to top of azure_openai_client.py
- Add comments explaining why non-reasoning models use
beta.chat.completions.parse instead of responses.parse
(Azure v1 compatibility limitation)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* fix: address minor code review issues
- Add noqa comment for unused response_model parameter (inherited from
abstract method interface)
- Fix misleading comment in factories.py that referenced Azure OpenAI
in the regular OpenAI case
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
---------
Co-authored-by: Claude (us.anthropic.claude-sonnet-4-5-20250929-v1:0) <noreply@anthropic.com>
Co-authored-by: Daniel Chalef <131175+danielchalef@users.noreply.github.com>
Co-authored-by: Preston Rasmussen <109292228+prasmussen15@users.noreply.github.com>
Co-authored-by: prestonrasmussen <prasmuss15@gmail.com>
1 parent 5d5a3f3 · commit 25eb0d4
File tree
6 files changed, +260 −681 lines changed
- graphiti_core/llm_client
- mcp_server
  - config
  - src/services
- tests/llm_client