Merge branch 'main' of https://huggingface.co/EleutherAI/pile-t5-large into main
README.md CHANGED
@@ -129,7 +129,7 @@ with the span-corruption objective.
 
 Intermediate checkpoints for Pile-T5 are accessible within this repository.
 There are in total 200 checkpoints that are spaced 10,000 steps. For T5x-native
-checkpoints that can be used for finetuning with the T5x library, refer to [here](https://huggingface.co/lintang/pile-t5-
+checkpoints that can be used for finetuning with the T5x library, refer to [here](https://huggingface.co/lintang/pile-t5-large-t5x)
 
 The training loss (in tfevent format) and validation perplexity (in jsonl) can be found [here](https://huggingface.co/EleutherAI/pile-t5-large/blob/main/large.zip).
 
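
Since the README passage above points readers at the intermediate checkpoints in this repository, a minimal sketch of loading one with the `transformers` library is shown below. It assumes the intermediate checkpoints are published as revisions (branches) of `EleutherAI/pile-t5-large`; the revision name `step10000` is hypothetical and should be checked against the repository's branch list.

```python
# Minimal sketch: load an intermediate Pile-T5 checkpoint from the Hub.
# Assumption: intermediate checkpoints are exposed as revisions of
# EleutherAI/pile-t5-large; "step10000" is a hypothetical revision name.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "EleutherAI/pile-t5-large"
revision = "step10000"  # hypothetical; check the repo's branch list for actual names

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id, revision=revision)

# Sanity check: report the parameter count of the loaded checkpoint.
num_params = sum(p.numel() for p in model.parameters())
print(f"Loaded {repo_id}@{revision} with {num_params:,} parameters")
```

For finetuning with the T5x library itself, use the T5x-native checkpoints linked in the diff above rather than these Hugging Face-format weights.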

