Update README.md
README.md CHANGED

@@ -174,24 +174,37 @@ You can finetune this model on your own dataset.

**Before:**

### Retrieval

Evaluated with [MIRACL-ja](https://huggingface.co/datasets/miracl/miracl), [JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA), [JaCWIR](https://huggingface.co/datasets/hotchpotch/JaCWIR) and [MLDR-ja](https://huggingface.co/datasets/Shitao/MLDR).

(The previous retrieval results table was removed in this commit; its rows are not legible in this view.)

### JMTEB

Evaluated with [JMTEB](https://github.com/sbintuitions/JMTEB).

* The time-consuming datasets ['amazon_review_classification', 'mrtydi', 'jaqket', 'esci'] were excluded, and the evaluation was conducted on the other 12 datasets.
* The average is a macro-average per task.

| model | size | Class. | Ret. | STS. | Clus. | Pair. | Avg. |
|:--:|:--:|:--:|:--:|:----:|:-------:|:-------:|:------:|
| [mE5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 0.3B | 75.1 | 80.6 | 80.5 | **52.6** | 62.4 | 70.2 |
| [GLuCoSE](https://huggingface.co/pkshatech/GLuCoSE-base-ja) | 0.1B | **82.6** | 69.8 | 78.2 | 51.5 | **66.2** | 69.7 |
| GLuCoSE v2 | 0.1B | 80.5 | **82.8** | **83.0** | 49.8 | 62.4 | **71.7** |

## Authors

Chihiro Yano, Mocho Go, Hideyuki Tachibana, Hiroto Takegawa, Yotaro Watanabe

**After:**

### Retrieval

Evaluated with [MIRACL-ja](https://huggingface.co/datasets/miracl/miracl), [JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA), [JaCWIR](https://huggingface.co/datasets/hotchpotch/JaCWIR) and [MLDR-ja](https://huggingface.co/datasets/Shitao/MLDR).

| Model | Size | MIRACL<br>Recall@5 | JQaRA<br>nDCG@10 | JaCWIR<br>MAP@10 | MLDR<br>nDCG@10 |
|:--|:--|:--|:--|:--|:----|
| OpenAI/text-embedding-3-small | - | processing... | 38.8 | 81.6 | processing... |
| OpenAI/text-embedding-3-large | - | processing... | processing... | processing... | processing... |
| | | | | | |
| [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 0.6B | 89.2 | 55.4 | **87.6** | 29.8 |
| [cl-nagoya/ruri-large](https://huggingface.co/cl-nagoya/ruri-large) | 0.3B | 78.7 | 62.4 | 85.0 | **37.5** |
| | | | | | |
| [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 0.3B | 84.2 | 47.2 | **85.3** | 25.4 |
| [cl-nagoya/ruri-base](https://huggingface.co/cl-nagoya/ruri-base) | 0.1B | 74.3 | 58.1 | 84.6 | **35.3** |
| [pkshatech/GLuCoSE-base-ja](https://huggingface.co/pkshatech/GLuCoSE-base-ja) | 0.1B | 53.3 | 30.8 | 68.6 | 25.2 |
| **GLuCoSE v2** | 0.1B | **85.5** | **60.6** | **85.3** | 33.8 |

Note: The OpenAI text-embedding-3-small results for JQaRA and JaCWIR are quoted from the [JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA) and [JaCWIR](https://huggingface.co/datasets/hotchpotch/JaCWIR) dataset pages.

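To make the metric columns above concrete, here is a minimal Recall@5 sketch using the sentence-transformers interface. It is an illustration only, not the evaluation harness behind these scores: the model id is assumed from this card, the queries, documents, and relevance labels are made up, and any task-specific query/document prefixes the model card may specify are omitted.

```python
# Minimal Recall@k sketch with made-up data; the MIRACL/JQaRA/JaCWIR/MLDR numbers
# above come from the full datasets and their own evaluation protocols.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("pkshatech/GLuCoSE-base-ja-v2")  # model id assumed from this card

queries = ["日本で一番高い山は?", "瀬戸内海に面した県はどこ?"]  # hypothetical queries
docs = [
    "富士山は日本最高峰の山である。",  # hypothetical corpus
    "広島県は瀬戸内海に面している。",
    "東京は日本の首都である。",
]
relevant = [0, 1]  # index of the single relevant document for each query (made up)

q_emb = model.encode(queries, normalize_embeddings=True)
d_emb = model.encode(docs, normalize_embeddings=True)
scores = util.cos_sim(q_emb, d_emb)  # cosine similarities, shape (num_queries, num_docs)

k = 5
hits = sum(
    rel in scores[i].argsort(descending=True)[:k].tolist()
    for i, rel in enumerate(relevant)
)
print(f"Recall@{k}: {hits / len(queries):.2f}")
```
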
### JMTEB

Evaluated with [JMTEB](https://github.com/sbintuitions/JMTEB).

| Model | Size | Avg. | Retrieval | STS | Classification | Reranking | Clustering | PairClassification |
|:--|:--|:--|:--|:--|:--|:--|:--|:--|
| OpenAI/text-embedding-3-small | - | 70.86 | 66.39 | 79.46 | 73.06 | 92.92 | 51.06 | 62.27 |
| OpenAI/text-embedding-3-large | - | 73.97 | 74.48 | 82.52 | 77.58 | 93.58 | 53.32 | 62.35 |
| | | | | | | | | |
| [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 0.6B | 71.65 | 70.98 | 79.70 | 72.89 | 92.96 | 51.24 | 62.15 |
| [cl-nagoya/ruri-large](https://huggingface.co/cl-nagoya/ruri-large) | 0.3B | 73.31 | 73.02 | 83.13 | 77.43 | 92.99 | 51.82 | 62.29 |
| | | | | | | | | |
| [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 0.3B | 70.12 | 68.21 | 79.84 | 69.30 | **92.85** | 48.26 | 62.26 |
| [cl-nagoya/ruri-base](https://huggingface.co/cl-nagoya/ruri-base) | 0.1B | 71.91 | 69.82 | 82.87 | 75.58 | 92.91 | **54.16** | 62.38 |
| [pkshatech/GLuCoSE-base-ja](https://huggingface.co/pkshatech/GLuCoSE-base-ja) | 0.1B | 70.44 | 59.02 | 78.71 | **76.82** | 91.90 | 49.78 | **66.39** |
| **GLuCoSE v2** | 0.1B | **72.22** | **73.36** | **82.96** | 74.21 | 93.01 | 48.65 | 62.37 |

Note: Results for the OpenAI embeddings and the multilingual-e5 models are quoted from the [JMTEB leaderboard](https://github.com/sbintuitions/JMTEB/blob/main/leaderboard.md). Results for the ruri models are quoted from the [cl-nagoya/ruri-base model card](https://huggingface.co/cl-nagoya/ruri-base/blob/main/README.md).

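For orientation on what the JMTEB task columns measure, the STS score is essentially a rank correlation between model similarities and human ratings. The sketch below is illustrative only: the model id is assumed from this card, the sentence pairs and gold scores are made up, and JMTEB itself handles dataset loading and the exact scoring protocol.

```python
# Illustrative STS scoring: Spearman correlation between cosine similarities and
# (hypothetical) human ratings. JMTEB runs this on its own STS datasets.
from scipy.stats import spearmanr
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("pkshatech/GLuCoSE-base-ja-v2")  # model id assumed from this card

pairs = [
    ("猫がソファで寝ている。", "ソファの上で猫が眠っている。"),
    ("彼は毎朝コーヒーを飲む。", "彼はコーヒーをよく飲む。"),
    ("株価が大きく下落した。", "今日は良い天気だ。"),
]
gold = [4.8, 3.5, 0.4]  # made-up similarity ratings on a 0-5 scale

emb1 = model.encode([a for a, _ in pairs], normalize_embeddings=True)
emb2 = model.encode([b for _, b in pairs], normalize_embeddings=True)
pred = util.cos_sim(emb1, emb2).diagonal().tolist()  # pairwise cosine similarities

rho, _ = spearmanr(gold, pred)
print(f"Spearman correlation: {rho:.3f}")
```
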
## Authors

Chihiro Yano, Mocho Go, Hideyuki Tachibana, Hiroto Takegawa, Yotaro Watanabe