
Commit 10f9e03

Update iris fastapi and mms examples
1 parent 5852144 commit 10f9e03

File tree

8 files changed (+21 -203 lines)

Lines changed: 7 additions & 27 deletions

````diff
@@ -1,46 +1,26 @@
-# getting-started-examples
-Examples to get started with using TrueFoundry
+# Deploy Scikit-Learn Iris flower classification model with FastAPI
 
-Deployment
 ---
-This example runs a simple iris app for inferring using a iris classifier.
-Mainly this example shows how to deploy to TrueFoundry using a Pythonfile and TrueFoundry Python SDK.
 
-## Run Locally
+### Install requirements
 
 1. Install requirements
 
 ```shell
 python -m pip install -r requirements.txt
 ```
 
-2. Start the iris app
+### Start the server
 
 ```shell
 export MODEL_DIR="$(pwd)"
 gunicorn -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000 server:app
 ```
 
-## Deploy with TrueFoundry
-
-1. Install `truefoundry`
-
-```shell
-python -m pip install -U "truefoundry>=0.10.0,<0.11.0"
-```
-
-2. Login
-
-```shell
-tfy login --host "<Host name of TrueFoundry UI. e.g. https://company.truefoundry.cloud>"
-```
-
-3. Deploy!
-
-> Please refer to following docs
-> - [Getting workspace FQN](https://docs.truefoundry.com/docs/key-concepts#get-workspace-fqn)
-> - [Get host and path for deploying applications](https://docs.truefoundry.com/docs/define-ports-and-domains#identifying-available-domains)
+### Example Inference Call
 
 ```shell
-python deploy.py --name iris --workspace-fqn <Workspace FQN> --host <Ingress Host for the cluster> --path <optional path>
+curl -X 'POST' \
+  'http://0.0.0.0:8000/predict?sepal_length=1&sepal_width=1&petal_length=1&petal_width=1' \
+  -H 'accept: application/json'
 ```
````
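The updated README starts `server:app` under gunicorn and sends a POST to `/predict` with four query parameters, but `server.py` itself is not part of this diff. A minimal sketch of what a compatible server might look like, assuming a hard-coded nearest-centroid stand-in for the real scikit-learn model; every name here except the `/predict` route and its four query parameters is an assumption, not taken from the commit:

```python
# Per-class mean feature vectors (sepal_length, sepal_width, petal_length,
# petal_width). A stand-in for the scikit-learn model the example loads
# from MODEL_DIR, so this sketch stays dependency-light.
CENTROIDS = {
    "setosa": (5.0, 3.4, 1.5, 0.2),
    "versicolor": (5.9, 2.8, 4.3, 1.3),
    "virginica": (6.6, 3.0, 5.6, 2.0),
}

def predict_species(sepal_length: float, sepal_width: float,
                    petal_length: float, petal_width: float) -> str:
    """Return the class whose centroid is nearest to the input (stand-in model)."""
    point = (sepal_length, sepal_width, petal_length, petal_width)
    return min(
        CENTROIDS,
        key=lambda name: sum((a - b) ** 2 for a, b in zip(point, CENTROIDS[name])),
    )

try:
    # FastAPI wiring matching `gunicorn ... server:app` and the curl call:
    # scalar parameters on a POST route are read from the query string.
    from fastapi import FastAPI

    app = FastAPI()

    @app.post("/predict")
    def predict(sepal_length: float, sepal_width: float,
                petal_length: float, petal_width: float):
        return {"prediction": predict_species(
            sepal_length, sepal_width, petal_length, petal_width)}
except ImportError:  # keep the sketch importable without fastapi installed
    pass
```

With this wiring, the curl call above would return JSON such as `{"prediction": "setosa"}`.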

deploy-model-with-fastapi/deploy.py

Lines changed: 0 additions & 90 deletions
This file was deleted.

```diff
@@ -1,2 +1,3 @@
 model/
 logs/
+model_store/
```

deploy-model-with-mms/.gitignore

Lines changed: 1 addition & 0 deletions
```diff
@@ -0,0 +1 @@
+logs/
```

deploy-model-with-mms/Dockerfile

Lines changed: 0 additions & 4 deletions
```diff
@@ -33,10 +33,6 @@ WORKDIR /home/model-server
 COPY requirements.txt requirements.txt
 RUN python -m pip install --no-cache-dir -r requirements.txt
 
-# Here we are copying the models to the image. But for large models it is recommended to cache the models in a volume.
-# See: https://docs.truefoundry.com/docs/download-and-cache-models#download-models-and-artifacts
-COPY model_store/ model_store/
-
 COPY config.properties config.properties
 
 ENTRYPOINT ["multi-model-server"]
```
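With `COPY model_store/ model_store/` removed (along with the comment recommending a volume cache for large models), the image no longer bundles the archived models, so they have to be supplied at container run time. A hedged sketch of one way to do that with a bind mount; the image tag `mnist-mms` is illustrative, the mount target follows the Dockerfile's `WORKDIR /home/model-server`, and the flags mirror the updated README's `multi-model-server` invocation:

```shell
# Build the image (tag is illustrative, not from the commit).
docker build -t mnist-mms .

# Mount the locally packaged model_store/ instead of baking it into the
# image, and pass the store location as a CLI flag, since the
# model_store= line was dropped from config.properties.
docker run --rm -p 8080:8080 -p 8081:8081 \
  -v "$(pwd)/model_store:/home/model-server/model_store" \
  mnist-mms \
  --foreground --model-store /home/model-server/model_store \
  --start --mms-config config.properties
```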

deploy-model-with-mms/README.md

Lines changed: 12 additions & 4 deletions
````diff
@@ -1,20 +1,28 @@
 # Deploy MNIST Model with MMS
+
 ---
 
+### Install requirements
+
+```bash
+python -m pip install -r requirements.txt
+```
+
 ### Package the model
 
 ```bash
 model-archiver --model-name mnist --model-path model/ --handler mnist_handler.py:handle --export-path model_store/ --runtime python --force
 ```
 
-### Deploy
+### Start the server
 
-```shell
-python deploy.py --workspace-fqn ... --host ... --path ...
+```bash
+export MODEL_DIR="$(pwd)/model_store"
+multi-model-server --foreground --model-store $MODEL_DIR --start --mms-config config.properties
 ```
 
 ### Example Inference Call
 
 ```bash
-curl -X POST -H "Content-Type: application/json" https://<endpoint>/predictions/mnist -T 0.png
+curl -X POST -H "Content-Type: application/json" http://0.0.0.0:8080/predictions/mnist -T 0.png
 ```
````
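The packaging step points MMS at `mnist_handler.py:handle`, which is not shown in this diff. A sketch of the handler shape that flag implies, assuming the usual MMS calling convention of `handle(data, context)` invoked once per batch with one prediction returned per request; the payload keys and the stand-in "model" are assumptions, not taken from the commit:

```python
def handle(data, context):
    # MMS passes None on the initialization call; return nothing.
    if data is None:
        return None
    predictions = []
    for request in data:
        # Each request carries the raw bytes sent by the client
        # (e.g. the PNG uploaded via `curl -T 0.png`).
        payload = request.get("body") or request.get("data") or b""
        # Stand-in "model": report the payload size instead of a digit;
        # the real example runs an MNIST network here.
        predictions.append({"input_bytes": len(payload), "digit": 0})
    # One prediction per request, serialized by MMS as the HTTP response.
    return predictions
```

The `--model-path model/` directory would hold this handler alongside the model artifacts, and `model-archiver` bundles them into `model_store/mnist.mar`.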

deploy-model-with-mms/config.properties

Lines changed: 0 additions & 1 deletion
```diff
@@ -11,5 +11,4 @@ load_models=ALL
 management_address=http://0.0.0.0:8081
 max_request_size=6553500
 max_response_size=6553500
-model_store=/home/model-server/model_store
 number_of_netty_threads=8
```

deploy-model-with-mms/deploy.py

Lines changed: 0 additions & 77 deletions
This file was deleted.

0 commit comments
