Conversion to ONNX

#14
by imandel - opened

After finetuning according to the suggested Colab notebook, how do I convert to ONNX for running with WebGPU?


That model works great! I was more curious which scripts are used to convert to ONNX. LFM2 does not seem to be currently supported by Hugging Face Optimum. If I'm starting from the current LiquidAI/LFM2-350M model, or a finetune with a LoRA adapter, I then need to convert it to ONNX myself rather than using the existing model you linked.

Liquid AI org

@imandel Optimum support is coming soon; the model was merged into ExecuTorch: https://github.com/pytorch/executorch/pull/13805
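In the meantime, a common workaround for the LoRA case is to merge the adapter into the base weights first, so the export step only has to handle a plain checkpoint. A minimal sketch, assuming a local adapter directory (the `path/to/lora-adapter` path is a placeholder, and the final `optimum-cli` step assumes Optimum has shipped LFM2 support):

```python
# Sketch: merge a LoRA adapter into the LFM2 base model with PEFT,
# then export the merged checkpoint once Optimum supports LFM2.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model and attach the finetuned adapter.
base = AutoModelForCausalLM.from_pretrained("LiquidAI/LFM2-350M")
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")  # placeholder path

# Fold the adapter weights into the base weights and drop the PEFT wrappers.
merged = model.merge_and_unload()

# Save a plain checkpoint that export tools can consume directly.
merged.save_pretrained("lfm2-merged")
AutoTokenizer.from_pretrained("LiquidAI/LFM2-350M").save_pretrained("lfm2-merged")

# Then, assuming LFM2 support has landed in Optimum:
#   optimum-cli export onnx --model lfm2-merged lfm2-onnx/
```

The resulting ONNX files in `lfm2-onnx/` could then be served in the browser via ONNX Runtime Web's WebGPU backend.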
