Update README.md
    	
README.md CHANGED
@@ -5382,10 +5382,7 @@ license: mit
 
 ## Multilingual-E5-large-instruct
 
-[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
-Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
-
-[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/pdf/2402.05672).
 Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
 
 This model has 24 layers and the embedding size is 1024.

@@ -5518,11 +5515,11 @@ so this should not be an issue.
 If you find our paper or models helpful, please consider cite as follows:
 
 ```
-@article{wang2022text,
-  title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
-  author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
-  journal={arXiv preprint arXiv:2212.03533},
-  year={2022}
+@article{wang2024multilingual,
+  title={Multilingual E5 Text Embeddings: A Technical Report},
+  author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Yang, Linjun and Majumder, Rangan and Wei, Furu},
+  journal={arXiv preprint arXiv:2402.05672},
+  year={2024}
 }
 ```
 
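For readers who want to check the architecture line quoted in the diff context ("This model has 24 layers and the embedding size is 1024"), here is a minimal sketch that loads only the model configuration and prints those two values. It assumes the card's checkpoint is published on the Hugging Face Hub as `intfloat/multilingual-e5-large-instruct`; that repo id is an assumption, not something stated in this diff.

```python
# Minimal sketch: confirm the "24 layers / embedding size 1024" line from the README.
# The repo id below is an assumption; substitute the actual checkpoint if it differs.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("intfloat/multilingual-e5-large-instruct")

print(config.num_hidden_layers)  # expected: 24
print(config.hidden_size)        # expected: 1024 (the embedding dimension)
```

Fetching `AutoConfig` alone avoids downloading the full weights, which is enough for a quick sanity check of the two numbers in the card.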