Memory usage #768
mammadmaheri7 started this conversation in General
This is a pretty big model in terms of the number of constraints, so I think the memory usage makes sense. I'd note that the number of parameters is not always a good proxy for the number of constraints (different architectures can induce very big differences).
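As a rough way to see this, you can inspect the calibrated settings to check how large the circuit actually is. This is only a sketch: the field names (`run_args.logrows`, `num_rows`, `total_assignments`) are assumptions about the settings.json layout and may differ between ezkl versions.

```sh
# Hedged sketch: check the circuit size recorded in the calibrated settings.
# The field names below are assumptions about the settings.json layout.
jq '.run_args.logrows, .num_rows, .total_assignments' settings.json

# Memory during setup grows with the number of circuit rows (roughly 2^logrows
# per column), not with the raw parameter count of the ONNX model.
```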
I encountered an issue while attempting to verify the tiny_vit model, which contains approximately 5 million parameters. During the setup step it consumes a surprisingly large amount of RAM, exceeding even 1000 GB (the exact amount is unknown). This leads me to wonder whether there might be a bug in the process, or whether adjusting certain settings variables, such as num_cols, logrows, tolerance, etc., could resolve the problem. The commands I executed are as follows (the ONNX file, calibration data, and post-calibration settings are attached in the zip file):
```sh
ezkl gen-settings -M output.onnx -O settings.json --input-visibility public --param-visibility fixed --output-visibility public
ezkl calibrate-settings -D cal_data.json -M output.onnx -O settings.json --target resource --lookup-safety-margin 1 --scales 0,5,10,15 --only-range-check-rebase
ezkl compile-circuit -M output.onnx -S settings.json --compiled-circuit network.ezkl
ezkl get-srs -S settings.json --srs-path kzg.srs
ezkl setup -M network.ezkl --srs-path=kzg.srs --vk-path=vk.key --pk-path=pk.key
```

Additionally, I would like to note that I built the ezkl tool from the following commit:
Link to Commit
tiny_vit.zip
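For reference, this is a rough sketch of the kind of settings adjustment I have in mind, reusing only the flags from the commands above; the jq field names are an assumption about the settings.json layout and may not match every ezkl version.

```sh
# Hedged sketch: check how many rows the circuit needs after calibration.
# The field names (run_args.logrows, num_rows) are assumptions about the
# settings.json layout.
jq '.run_args.logrows, .num_rows' settings.json

# If the row count is the bottleneck, recalibrate restricted to the smallest
# scales (flags reused from the commands above), then recompile and re-run setup.
ezkl calibrate-settings -D cal_data.json -M output.onnx -O settings.json \
  --target resource --lookup-safety-margin 1 --scales 0,5 --only-range-check-rebase
ezkl compile-circuit -M output.onnx -S settings.json --compiled-circuit network.ezkl
ezkl setup -M network.ezkl --srs-path=kzg.srs --vk-path=vk.key --pk-path=pk.key
```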