Description
Search before asking
- I have searched the Inference issues and found no similar open bug report.
Bug
The fastapi dependency in requirements/_requirements.txt is pinned to `>=0.100,<0.116`:

```
fastapi>=0.100,<0.116  # be careful with upper pin - fastapi might remove support for on_event
```
FastAPI 0.116.0 was released on 2025-07-07 — over 7 months ago. The latest FastAPI is 0.128.0 (2025-12-27). This means inference is incompatible with any project that requires a modern FastAPI version.
Prior art
This was partially addressed in #1398 / PR #1407 (merged 2025-07-10), which bumped the upper bound from <0.111 to <0.116. However, FastAPI has since moved significantly further (0.116 → 0.128), and the same conflict pattern has re-emerged.
Impact
Any project that depends on both inference and a library requiring FastAPI ≥0.116 hits an unresolvable dependency conflict. For example, frameworks and shared libraries that track modern FastAPI versions force users to apply `[tool.uv] override-dependencies` workarounds to bypass the constraint — running inference against an untested FastAPI version anyway, but without upstream CI coverage.
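For reference, the workaround looks roughly like this in a downstream `pyproject.toml` (a sketch; the exact FastAPI floor depends on the conflicting library):

```toml
[project]
name = "my-app"
version = "0.1.0"
dependencies = ["inference", "fastapi>=0.116"]

[tool.uv]
# Forces the resolver to ignore inference's fastapi<0.116 pin.
# This runs inference against a FastAPI version it was never tested with.
override-dependencies = ["fastapi>=0.116"]
```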
The on_event concern
The pinned comment says:

> be careful with upper pin - fastapi might remove support for on_event
As of FastAPI 0.128.0, `on_event` is deprecated but still functional (replaced by lifespan handlers). It has not been removed. The concern that motivated the pin has not materialized in 12+ minor releases beyond the current upper bound.
Suggested fix
Bump the upper bound to cover current FastAPI releases, e.g.:

```
fastapi>=0.100,<0.129
```
Or, ideally, migrate the on_event usage to lifespan handlers and remove the upper bound entirely.
Environment
- inference: 0.51.9 (also affects latest 0.64.6)
- FastAPI latest: 0.128.0
- Python: 3.13
Minimal Reproducible Example
```shell
uv init --bare --python 3.13
uv add inference "fastapi>=0.116"
```

Produces an unresolvable conflict:

```
Because inference>=0.51.9 depends on fastapi<0.116, and you require fastapi>=0.116:
inference>=0.51.9 and fastapi>=0.116 are incompatible.
```
Are you willing to submit a PR?
- Yes I'd like to help by submitting a PR!