Conversion to ONNX
#14 · opened by imandel
After finetuning according to the suggested Colab notebook, how do I convert to ONNX for running with WebGPU?
That model works great! I was more curious what scripts are used to convert to ONNX. LFM2 does not seem to be currently supported by Hugging Face Optimum. If I'm starting with the current LiquidAI/LFM2-350M model, or a finetune with a LoRA adapter, I then need to convert it to ONNX myself rather than using the existing model you linked.
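For the LoRA case, a first step that works today is merging the adapter back into the base weights with PEFT, so you end up with a plain transformers checkpoint that any ONNX exporter can consume once LFM2 support lands. A minimal sketch, assuming a trained adapter saved at `./lfm2-lora-adapter` (a hypothetical path):

```python
# Sketch: merge a LoRA adapter into LiquidAI/LFM2-350M before ONNX export.
# The adapter path is hypothetical; substitute your own finetune output dir.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("LiquidAI/LFM2-350M")
model = PeftModel.from_pretrained(base, "./lfm2-lora-adapter")  # hypothetical path
model = model.merge_and_unload()  # fold the LoRA deltas into the base weights

# Save a standalone checkpoint that no longer depends on PEFT at load time
model.save_pretrained("./lfm2-350m-merged")
AutoTokenizer.from_pretrained("LiquidAI/LFM2-350M").save_pretrained("./lfm2-350m-merged")
```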
@imandel Optimum support is coming soon. The model was also merged into ExecuTorch: https://github.com/pytorch/executorch/pull/13805
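Once that Optimum support lands, the export should presumably follow Optimum's existing API for other causal LMs. This is a hypothetical sketch, not something that works for LFM2 today, using the merged checkpoint from the snippet above:

```python
# Hypothetical, pending LFM2 support in Optimum: export a transformers
# checkpoint to ONNX via Optimum's ONNX Runtime integration.
from optimum.onnxruntime import ORTModelForCausalLM

ort_model = ORTModelForCausalLM.from_pretrained(
    "./lfm2-350m-merged",  # merged checkpoint from the previous step
    export=True,           # run the transformers -> ONNX conversion
)
ort_model.save_pretrained("./lfm2-350m-onnx")
```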