# Model Card for MetaCLIP 2 (worldwide) ONNX
MetaCLIP 2 (worldwide) was presented in the paper *MetaCLIP 2: A Worldwide Scaling Recipe*.

This repository contains an ONNX export of the original checkpoint.
## Install
First, install the `optimum-onnx` library (from source for now):

```bash
pip install -q git+https://github.com/huggingface/optimum-onnx.git
```
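Depending on your environment, ONNX Runtime may or may not be pulled in as a dependency (this is an assumption, not something the card states). A minimal sketch to verify it is installed and see which execution providers are available:

```python
import onnxruntime as ort

# List the execution providers ONNX Runtime can use on this machine
# (e.g. CPUExecutionProvider, CUDAExecutionProvider).
print(ort.get_available_providers())
```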
## Usage
You can then use the model as follows:

```python
import requests
from PIL import Image
from transformers import AutoProcessor
from optimum.onnxruntime.modeling import ORTModelForZeroShotImageClassification

# Load the ONNX model and its processor
model = ORTModelForZeroShotImageClassification.from_pretrained(
    "onnx-community/metaclip-2-worldwide-huge-378-ONNX", subfolder="onnx"
)
processor = AutoProcessor.from_pretrained("onnx-community/metaclip-2-worldwide-huge-378-ONNX")

# Download an example image
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Candidate labels for zero-shot classification
labels = ["a photo of a cat", "a photo of a dog", "a photo of a car"]

# Preprocess the image-text pairs and run inference
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image  # image-text similarity scores
```
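To turn the similarity scores into per-label probabilities, you can apply a softmax over the logits. A minimal sketch continuing the snippet above, assuming the outputs are PyTorch tensors (as is typical when `return_tensors="pt"` is used):

```python
# Normalize the similarity scores across the candidate labels
probs = logits_per_image.softmax(dim=-1)

# Print each label with its probability and the top prediction
print({label: round(p, 4) for label, p in zip(labels, probs[0].tolist())})
print("Predicted label:", labels[probs.argmax(dim=-1).item()])
```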
## Acknowledgements