Update README.md
README.md (CHANGED)
@@ -80,9 +80,9 @@ The benchmark result in [MTVQA](https://github.com/bytedance/MTVQA/tree/main)

 | Models | Open-Source | Vietnamese Score |
 |:----------------------------------:|:-------------:|:------------------:|
-| Qwen2-VL 72B (
-| GPT-4o (
-| **Vintern-1B-V2** (
+| Qwen2-VL 72B (Top 1) | ✗ | 41.6 |
+| GPT-4o (Top 2) | ✗ | 34.2 |
+| **Vintern-1B-V2** (Top 3) | ✓ | **31.7** |
 | Qwen2-VL 7B | ✓ | 30.0 |
 | Claude3 Opus | ✗ | 29.1 |
 | GPT-4o mini | ✗ | 29.1 |