Replies: 1 comment
Sorry about that, found the Batched example! For anyone looking for the fix, the key is this:

```python
py_run_args = ezkl.PyRunArgs()
py_run_args.variables = [("batch_size", 2)]
res = ezkl.gen_settings(model=model_path, output=settings_path, py_run_args=py_run_args)
```

My guess is that it links that variable name to the dynamic axis used while exporting:

```python
torch.onnx.export(model,                   # model being run
                  x_tensor,                # model input (or a tuple for multiple inputs) - ensure input is a tensor
                  model_path,              # where to save the model (can be a file or file-like object)
                  export_params=True,      # store the trained parameter weights inside the model file
                  opset_version=10,        # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names=['input'],     # the model's input names
                  output_names=['output'],   # the model's output names
                  dynamic_axes={'input': {0: 'batch_size'},    # variable-length axes
                                'output': {0: 'batch_size'}})
```
My Model and Export Code
I defined a simple MLP for MNIST and exported it to ONNX:
EZKL setup
Generate Proof
The output of this is the following:
I feel like I am missing something: as you can see, the witness output is just 10 elements, while the output of the normal model is 80 (which makes sense for 8 images). Is this the expected behavior, or am I setting something wrong? Did I forget a step?
I also ran the example https://github.com/zkonduit/ezkl/blob/main/examples/notebooks/mnist_classifier.ipynb and it appears to have the same issue... Is this expected then? If it is, how should I approach processing a batch of images in an inference service?
Disclaimer: I am new, sorry if I have missed something; I am still trying to get my head around this.