
This model is similar to Unbabel/wmt22-cometkiwi-da but is built on XLM-R XL. As a result, it has 3.5 billion parameters and requires a minimum of 15GB of GPU memory.

Paper

Scaling up CometKiwi: Unbabel-IST 2023 Submission for the Quality Estimation Shared Task

License:

cc-by-nc-sa-4.0

Usage (unbabel-comet)

This model is best used with unbabel-comet (>=2.1.0), which can be installed with:

pip install --upgrade pip  # ensures that pip is current 
pip install "unbabel-comet>=2.1.0"

Then you can use it through comet CLI:

comet-score -s {source-input}.txt -t {translation-output}.txt --model Unbabel/wmt23-cometkiwi-da-xl
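The source and translation files are plain text with one segment per line, aligned line by line (the file names in the command above are placeholders). A minimal sketch of preparing such a pair, using made-up example sentences:

```python
# Write line-aligned input files for comet-score: line i of the source
# file pairs with line i of the translation file. Contents and file
# names here are illustrative, not real data.
source_segments = [
    "The output signal provides constant sync.",
    "Hello world.",
]
translation_segments = [
    "Das Ausgangssignal bietet konstante Synchronisation.",
    "Hallo Welt.",
]

with open("source-input.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(source_segments) + "\n")
with open("translation-output.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(translation_segments) + "\n")

# Both files must contain the same number of lines.
with open("source-input.txt", encoding="utf-8") as f:
    n_src = len(f.readlines())
with open("translation-output.txt", encoding="utf-8") as f:
    n_mt = len(f.readlines())
assert n_src == n_mt
```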

Or using Python:

from comet import download_model, load_from_checkpoint

model_path = download_model("Unbabel/wmt23-cometkiwi-da-xl")
model = load_from_checkpoint(model_path)
data = [
    {
        "src": "The output signal provides constant sync so the display never glitches.",
        "mt": "Das Ausgangssignal bietet eine konstante Synchronisation, so dass die Anzeige nie stört."
    },
    {
        "src": "Kroužek ilustrace je určen všem milovníkům umění ve věku od 10 do 15 let.",
        "mt": "Кільце ілюстрації призначене для всіх любителів мистецтва у віці від 10 до 15 років."
    },
    {
        "src": "Mandela then became South Africa's first black president after his African National Congress party won the 1994 election.",
        "mt": "その後、1994年の選挙でアフリカ国民会議派が勝利し、南アフリカ初の黒人大統領となった。"
    }
]
model_output = model.predict(data, batch_size=8, gpus=1)
print(model_output)
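The prediction bundles per-segment scores alongside a corpus-level system score (the average over segments). A minimal post-processing sketch, using made-up score values in place of real model output:

```python
# Hypothetical per-segment quality-estimation scores, standing in for
# the per-segment values a COMET prediction would contain.
scores = [0.83, 0.41, 0.77]

# The system-level score is the mean of the segment-level scores.
system_score = sum(scores) / len(scores)

# Flag segments below a chosen cutoff for human review. The 0.5
# threshold is an arbitrary example, not an official recommendation.
THRESHOLD = 0.5
flagged = [i for i, s in enumerate(scores) if s < THRESHOLD]

print(round(system_score, 2))
print(flagged)
```

Segment-level scores make it easy to route only the weakest translations to post-editing while accepting the rest automatically.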

Intended uses

Our model is intended to be used for reference-free MT evaluation.

Given a source text and its translation, the model outputs a single score between 0 and 1, where 1 represents a perfect translation and 0 a random one.

Languages Covered:

This model builds on top of XLM-R XL, which covers the following languages:

Afrikaans, Albanian, Amharic, Arabic, Armenian, Assamese, Azerbaijani, Basque, Belarusian, Bengali, Bengali Romanized, Bosnian, Breton, Bulgarian, Burmese, Catalan, Chinese (Simplified), Chinese (Traditional), Croatian, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Hausa, Hebrew, Hindi, Hindi Romanized, Hungarian, Icelandic, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish (Kurmanji), Kyrgyz, Lao, Latin, Latvian, Lithuanian, Macedonian, Malagasy, Malay, Malayalam, Marathi, Mongolian, Nepali, Norwegian, Oriya, Oromo, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Sanskrit, Scottish Gaelic, Serbian, Sindhi, Sinhala, Slovak, Slovenian, Somali, Spanish, Sundanese, Swahili, Swedish, Tamil, Tamil Romanized, Telugu, Telugu Romanized, Thai, Turkish, Ukrainian, Urdu, Urdu Romanized, Uyghur, Uzbek, Vietnamese, Welsh, Western Frisian, Xhosa, Yiddish.

Thus, results for language pairs containing uncovered languages are unreliable!

