add sentence transformers configs (#4)
- sbert.net (67a456de1d77c6159603f4bad4a2fe6b12d8bcda)
- update readme (0f45e2944ec95914b340972ad1b9bd6d66ef6c70)
- add python example (a68bac4365867eaf42a3ac2e2530cd379479c056)
- 1_Pooling/config.json +7 -0
- README.md +11 -1
- modules.json +20 -0
- sentence_bert_config.json +4 -0
    	
1_Pooling/config.json
ADDED (+7 -0)

```json
{
    "word_embedding_dimension": 384,
    "pooling_mode_cls_token": false,
    "pooling_mode_mean_tokens": true,
    "pooling_mode_max_tokens": false,
    "pooling_mode_mean_sqrt_len_tokens": false
}
```
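With `pooling_mode_mean_tokens` enabled and the other modes disabled, this config selects plain mean pooling over non-padding tokens. As a rough sketch (not part of this commit; names and the tiny embedding dimension are illustrative, the real `word_embedding_dimension` is 384):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, dim) array of per-token vectors.
    attention_mask:   (seq_len,) array of 1s (real tokens) and 0s (padding).
    """
    mask = attention_mask[:, None].astype(token_embeddings.dtype)  # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)                 # sum over real tokens only
    count = mask.sum()                                             # number of real tokens
    return summed / count

# Toy example: 3 token positions, 2 real tokens, dim shrunk to 4 for brevity.
emb = np.array([[1.0, 2.0, 3.0, 4.0],
                [3.0, 4.0, 5.0, 6.0],
                [9.0, 9.0, 9.0, 9.0]])   # padding row; must not affect the result
mask = np.array([1, 1, 0])
print(mean_pool(emb, mask))  # → [2. 3. 4. 5.]
```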
    	
README.md
CHANGED (+11 -1)

```diff
@@ -2671,4 +2671,14 @@ If you find our paper or models helpful, please consider cite as follows:
 
 ## Limitations
 
-This model only works for English texts. Long texts will be truncated to at most 512 tokens.
+This model only works for English texts. Long texts will be truncated to at most 512 tokens.
+
+## Sentence Transformers
+
+Below is an example for usage with sentence_transformers. `pip install sentence_transformers~=2.2.2`
+This is community contributed, and results may vary up to numerical precision.
+```python
+from sentence_transformers import SentenceTransformer
+model = SentenceTransformer('intfloat/e5-small-v2')
+embeddings = model.encode(input_texts, normalize_embeddings=True)
+```
```
    	
modules.json
ADDED (+20 -0)

```json
[
    {
      "idx": 0,
      "name": "0",
      "path": "",
      "type": "sentence_transformers.models.Transformer"
    },
    {
      "idx": 1,
      "name": "1",
      "path": "1_Pooling",
      "type": "sentence_transformers.models.Pooling"
    },
    {
      "idx": 2,
      "name": "2",
      "path": "2_Normalize",
      "type": "sentence_transformers.models.Normalize"
    }
  ]
```
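`modules.json` defines a three-stage pipeline: the Transformer produces per-token embeddings, `1_Pooling` averages them, and `2_Normalize` rescales the result to unit length. A conceptual sketch of stages 1 and 2 (not the library's implementation; stage 0 is assumed to have already produced the token embeddings):

```python
import numpy as np

def l2_normalize(v: np.ndarray) -> np.ndarray:
    # The Normalize stage divides each embedding by its Euclidean norm.
    return v / np.linalg.norm(v)

def encode(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Stages mirror 1_Pooling (mean over real tokens) and 2_Normalize.
    mask = attention_mask[:, None].astype(float)
    pooled = (token_embeddings * mask).sum(axis=0) / mask.sum()  # mean pooling
    return l2_normalize(pooled)                                  # unit-length output

# Two real tokens, dim 2 for brevity: mean([3,0],[0,4]) = [1.5, 2.0], norm 2.5.
emb = np.array([[3.0, 0.0], [0.0, 4.0]])
vec = encode(emb, np.array([1, 1]))
print(vec)                 # → [0.6 0.8]
print(np.linalg.norm(vec)) # → 1.0
```

Because the output is unit-normalized, cosine similarity between embeddings reduces to a plain dot product.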
    	
sentence_bert_config.json
ADDED (+4 -0)

```json
{
    "max_seq_length": 512,
    "do_lower_case": false
}
```
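`max_seq_length: 512` is the truncation limit the README's Limitations section refers to, and `do_lower_case: false` means input text is tokenized without lowercasing. A minimal sketch of the truncation behavior (the function name is illustrative, not a library API):

```python
MAX_SEQ_LENGTH = 512  # from sentence_bert_config.json

def truncate(token_ids: list, max_seq_length: int = MAX_SEQ_LENGTH) -> list:
    # Token sequences longer than max_seq_length are cut; shorter ones pass through.
    return token_ids[:max_seq_length]

print(len(truncate(list(range(600)))))  # → 512
print(truncate([1, 2, 3]))              # → [1, 2, 3]
```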
